Attunity Launches Streaming Data Pipeline Solution for Data Lakes on AWS

A new press release reports, “Attunity Ltd., a leading provider of data integration and big data management software solutions, announced a new solution today, Attunity for Data Lakes on Amazon Web Services (AWS), designed to automate streaming data pipelines on AWS. The offering is designed to support streaming real-time data from major enterprise databases, mainframes, and applications such as SAP, to accelerate near real-time analytics, machine learning (ML) and artificial intelligence (AI) initiatives. These new capabilities are being demonstrated live this week in Attunity booth 630 at AWS re:Invent 2018.”

The release goes on, “Enterprises are moving to cloud data lakes as they provide greater agility and elasticity but continue to be challenged to efficiently create analytics-ready data sets from heterogeneous data sources. Such integration can be a manually intensive and complex endeavor, challenging to assemble and often resulting in outdated data when it’s finally ready for business consumption. Attunity helps overcome these challenges with a solution leveraging Apache Spark technology to further accelerate and automate data pipelines – from the generation of source system data streams right through to the creation of analytics-ready data sets.”

It adds, “Attunity’s support for data lakes on AWS helps enterprises to: (1) Improve operational efficiency and increase ROI – Using Apache Spark as a high-performance engine, the solution is designed to automate the generation of transformations and allow for analytics-ready data sets. This means that data engineers can expect to quickly create reusable, automated data pipelines that streamline the delivery of analytics-ready data sets to end users, lessening the need for manual coding or expensive development resources. (2) Provide transactional data for analytics efficiently and in near real-time – Continuously streaming data and metadata updates powered by Attunity’s change data capture (CDC) technology means that data sets are accurate and current, and also support a broad range of data sources from on-premises and cloud databases, to data warehouses, SAP and mainframe systems.”
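Attunity’s CDC implementation is proprietary, but the core idea the release describes — continuously applying a stream of change events so a target data set stays current — can be sketched conceptually. The function and data names below are illustrative, not part of any Attunity API:

```python
# Conceptual sketch of change data capture (CDC): a stream of
# insert/update/delete events is applied to a target table so it
# tracks the source in near real time.

def apply_cdc_events(target, events):
    """Apply change events to a target table, modeled as a dict keyed by row id."""
    for event in events:
        op, key = event["op"], event["key"]
        if op in ("insert", "update"):
            target[key] = event["row"]    # upsert the new row image
        elif op == "delete":
            target.pop(key, None)         # drop the deleted row
    return target

# Hypothetical snapshot of a source table, then a stream of changes
orders = {1: {"status": "new"}, 2: {"status": "new"}}
changes = [
    {"op": "update", "key": 1, "row": {"status": "shipped"}},
    {"op": "insert", "key": 3, "row": {"status": "new"}},
    {"op": "delete", "key": 2},
]
apply_cdc_events(orders, changes)
```

In a real deployment the event stream would come from database transaction logs rather than an in-memory list, which is what lets CDC deliver updates continuously without re-extracting full tables.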

Read more at PR Newswire.

Image used under license from Shutterstock.com
