StreamSets Adds New Features to DataOps Platform
September 11, 2018

StreamSets announced innovations that help companies efficiently build and continuously operate dataflows that span their data center and leading cloud platforms — AWS, Microsoft Azure and Google Cloud Platform.

New capabilities include data drift handling for cloud data stores for improved pipeline resiliency, continuous integration and delivery (CI/CD) automation that brings DevOps-style agility to dataflow pipelines, and the ability to centrally manage in-stream data protection policies for security and compliance.

These features build on StreamSets DataOps Platform’s rich catalog of cloud connectors, its cloud-native architecture for easy cross-platform deployment, and its ability to elastically scale dataflows via Kubernetes.

Features such as data drift handling and in-stream data protection are powered by StreamSets’ unique Intelligent Pipelines capability, which inspects and analyzes data in-flow, overcoming the lack of visibility common in traditional data integration and big data ingestion approaches.

A majority of StreamSets customers already use the StreamSets DataOps Platform for cloud dataflows, executing both “lift and shift” cloud migration projects that require peak throughput, and continuous real-time streaming of data.

“As our customers embark on their hybrid cloud journey, we see first-hand their struggle to orchestrate end-to-end management of data movement across a growing range of on-premises and cloud platforms,” said Arvind Prabhakar, CTO, StreamSets. “Our DataOps platform was architected as cloud-native from the start, allowing us to easily evolve with the market. Cloud drift-handling and CI/CD for dataflows are unique enhancements that help our customers on their journey from traditional to modern data integration based on DataOps.”

The expansion of data architectures into the cloud creates challenges for enterprises that still rely on traditional data integration software or single-purpose big data ingestion tools. With these methods, pipelines take too long to build and deploy, and often depend on scarce, specialized developers. They are opaque, offering no end-to-end visibility into pipeline performance for preventing failures or detecting sensitive personal data in the dataflow. Finally, they are rigid, breaking whenever data drift occurs, such as when fields are added or changed or a data platform is upgraded.
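
That kind of breakage is easy to illustrate. The sketch below is plain Python, not StreamSets code, and the field names are hypothetical; it shows how an ingestion step with a hard-coded field mapping fails the moment the source schema drifts:

```python
# Illustrative only: a rigid ingestion step with a hard-coded field mapping.
EXPECTED_FIELDS = {"order_id", "customer_id", "amount"}

def load_record(record: dict) -> tuple:
    """Reject any record whose schema does not match the expected mapping."""
    unexpected = set(record) - EXPECTED_FIELDS
    if unexpected:
        raise ValueError(f"schema drift: unexpected fields {sorted(unexpected)}")
    return (record["order_id"], record["customer_id"], record["amount"])

# Works against today's source schema...
print(load_record({"order_id": 1, "customer_id": 42, "amount": 9.99}))

# ...but breaks as soon as the source system adds a field upstream.
try:
    load_record({"order_id": 2, "customer_id": 7, "amount": 5.00, "currency": "USD"})
except ValueError as err:
    print(err)  # schema drift: unexpected fields ['currency']
```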

With these new features, which began rolling out in late August, StreamSets DataOps Platform now offers:

- Development automation through a full-featured dataflow designer that includes “easy button” connectors for Amazon S3, Elastic MapReduce (EMR) and Redshift; Azure Data Lake Storage, HDInsight and Azure Databricks; Google Dataproc; and Snowflake

- Elastic scaling of cloud, multi-cloud and reverse hybrid cloud dataflows via Kubernetes

- New data drift handling, which automatically reflects source schema updates in the Amazon Athena, Azure SQL and Google BigQuery cloud data services (the general idea is sketched after this list)

- A new CI/CD framework for automating frequent changes to dataflows through iterative design, test, validate and deployment steps

- New central governance of StreamSets Data Protector policies that detect and deal with sensitive data such as PII and PHI
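
To make the drift-handling item above concrete, here is a generic sketch of the technique, with plain Python and SQLite standing in for a cloud data store; the table and field names are hypothetical, and this is not the StreamSets implementation. When a record arrives with fields the target table does not yet have, the loader adds matching columns instead of failing:

```python
# Illustrative only: propagate new source fields to the target schema on the fly.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")

def target_columns(conn, table):
    # PRAGMA table_info yields one row per column; index 1 is the column name.
    return {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}

def load_with_drift_handling(conn, table, record):
    # Evolve the target schema instead of rejecting unexpected fields.
    # (A real loader would validate identifiers before interpolating them into SQL.)
    for field in sorted(set(record) - target_columns(conn, table)):
        conn.execute(f"ALTER TABLE {table} ADD COLUMN {field} TEXT")
    cols = ", ".join(record)
    placeholders = ", ".join("?" for _ in record)
    conn.execute(f"INSERT INTO {table} ({cols}) VALUES ({placeholders})",
                 list(record.values()))

load_with_drift_handling(conn, "orders", {"order_id": 1, "customer_id": 42, "amount": 9.99})
# The source schema drifts: a new "currency" field appears and is absorbed automatically.
load_with_drift_handling(conn, "orders", {"order_id": 2, "customer_id": 7, "amount": 5.0, "currency": "USD"})
print(list(conn.execute("SELECT * FROM orders")))  # earlier rows get NULL for the new column
```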
