Back in 2015, a new term was introduced to the market: DataOps. "DataOps," wrote Andy Palmer,
"is a Data Management method that emphasizes communication, collaboration, integration, automation, and measurement of cooperation between data engineers, data scientists, and other data professionals."
The practice was born out of the democratization of analytics and the implementation of built-for-purpose database engines. The goal of DataOps is to help organizations rapidly deliver data that accelerates analytics and enables previously impossible analytics, Palmer said.
"DataOps has spawned a robust ecosystem of vendors," a DataKitchen article observed. "To date, over $50 million has been invested in companies that market a wide array of DataOps products and services."
Those products break down into data pipeline orchestration, automated testing and production quality and alerts, deployment automation and development sandbox creation, and Data Science model deployment.
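To make the "automated testing and production quality and alerts" category concrete, here is a minimal sketch of a declarative data-quality gate that a batch must pass before moving downstream. The check names, field names, and rules are hypothetical illustrations, not taken from any vendor's product:

```python
# Minimal sketch of automated data-quality checks: run declarative rules
# against each record in a batch and collect failures for alerting.
# All field names and rules below are hypothetical examples.

def run_quality_checks(records, checks):
    """Apply each named predicate to every record; return (row, check) failures."""
    failures = []
    for i, record in enumerate(records):
        for name, predicate in checks.items():
            if not predicate(record):
                failures.append((i, name))
    return failures

# Example rules: every row needs an id, and amounts must be non-negative.
checks = {
    "has_id": lambda r: r.get("id") is not None,
    "non_negative_amount": lambda r: r.get("amount", 0) >= 0,
}

batch = [
    {"id": 1, "amount": 25.0},
    {"id": None, "amount": 10.0},
    {"id": 3, "amount": -5.0},
]

for row, check in run_quality_checks(batch, checks):
    print(f"row {row} failed check '{check}'")
```

A real orchestration tool would wire failures like these into alerting and block the pipeline stage, but the core idea is the same: tests run against the data itself, not just the code.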
The DataKitchen article noted that Delphix, with its software platform that enables teams to virtualize, secure, and manage data, has a critical role in supporting DataOps functions. Eric Schrock, CTO at Delphix and a Forbes Councils Member, wrote an article last year for Forbes on the topic himself. He predicted that DataOps principles and approaches will evolve over the next few months before starting to coalesce as proven practices emerge in the industry.
"DataOps," Schrock stated, "promises to accelerate innovation by providing everyone ready access to quality data where they need it, while maintaining appropriate security and privacy controls."
Practice Hones DataOps Expertise
Delphix has created a data modernization practice headed by VP and global practice director Sanjeev Sharma. "Every company is a data company," Sharma says. Indeed, 90 percent of respondents to a report from 451 Research and Delphix said they will significantly increase their investment in DataOps technologies this year, reflecting a common need to harness, control, and secure data.
Customers tell Delphix that data delivery is causing serious delays in the development process, disrupting otherwise highly automated CI/CD pipelines. DataOps helps address the data bottleneck that slows down the application release cycles critical to a strong DevOps strategy.
“As developers adopt CI/CD, more and more app builds get delivered, and testers and QA practitioners need to run more tests,” Sharma says. “Think about how many more releases, updates, and bug fixes teams could put out with self-service secure access to the most up-to-date data.”
The new solutions, processes, and cultural practices at the core of DataOps allow companies to eliminate the manual processes associated with data delivery, making it more agile and secure, Sharma says. Those processes include data provisioning, versioning, and aligning database code with application code. DataOps enables fast provisioning of masked copies of current production data while keeping database schemas in sync.
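The masked-provisioning idea can be sketched as follows: replace sensitive fields with deterministic surrogate tokens so the schema, data types, and join keys survive intact in the test copy. The field names and masking rule here are illustrative assumptions; a real masking tool applies policy-driven transformations per column:

```python
# Hedged sketch of provisioning masked test data: sensitive fields get
# deterministic, irreversible surrogates; other columns pass through so
# the schema stays identical to production. Field names are illustrative.

import hashlib

SENSITIVE_FIELDS = {"name", "email", "ssn"}

def mask_value(value):
    """Deterministic surrogate: the same input always yields the same token."""
    digest = hashlib.sha256(str(value).encode("utf-8")).hexdigest()
    return f"masked_{digest[:10]}"

def mask_record(record):
    """Return a copy with sensitive fields masked and everything else unchanged."""
    return {
        key: mask_value(val) if key in SENSITIVE_FIELDS else val
        for key, val in record.items()
    }

prod_row = {"id": 42, "name": "Ada Lovelace", "email": "ada@example.com", "amount": 9.5}
test_row = mask_record(prod_row)
print(test_row)  # id and amount survive; name and email are tokens
```

Determinism matters in practice: because the same production value always masks to the same token, referential integrity across tables is preserved even after masking.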
“DataOps also allows for integrated test environments that use high-fidelity, masked data, and data structures to provide developers with more autonomy, less dependence on infrastructure teams, and faster releases,” Sharma says.
A New Culture for DataOps
Sharma emphasizes that transforming data from a liability into an asset via DataOps requires commitment from every level of the company – from top executives to developers to quality assurance teams. Adopting it means having the dedication, resources, time, and talent necessary for data-driven digital transformation.
“You need to bring all the practitioners who manage, govern, and secure data into the fold. Data analysts, DBAs, and security admins must be part of the adoption initiatives in order to better balance their mission to protect and secure data with DevOps team goals to access that data,” Sharma insists. “They need to think in terms of addressing data friction to achieve flow in the delivery pipeline, but to do this they’ll also need the technology that enables modern Data Management, governance, and data security and compliance practices in their arsenal.”
The process for sweeping cultural change will be different for every enterprise. With that in mind, Sharma advises taking stock of your company’s existing infrastructure, applications, and processes to formulate a strategy that will scale and enhance the agility and adaptability of your own IT systems while managing ongoing business demands.
The Data Virtualization Fit with DataOps
Data virtualization is among the technologies that inform the DataOps process. Delphix’s approach is to use highly compressed virtualized data copies with minimal footprints from any data source, increasing storage efficiency and lowering costs.
Virtualized data environments allow quick and easy distribution to users. “With data virtualization, data consumers have access to an advanced range of capabilities, allowing them to consume data in the manner desired,” Sharma explains.
Users can refresh data as needed from source systems as well as create bookmarks and branches of data versions within their environments. They can share this with other users to drive collaboration between development and quality assurance teams to increase efficiency, Sharma says.
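The bookmark-and-branch workflow Sharma describes can be modeled in a few lines. This is an illustrative data structure, not Delphix's actual API: a bookmark names an immutable snapshot of a dataset, and a branch starts a new, independent line of versions from that snapshot:

```python
# Illustrative model (not a vendor API) of bookmarks and branches over
# data versions: bookmarks name immutable snapshots; branches fork new
# working copies from a bookmark without touching the original.

import copy

class VersionedDataset:
    def __init__(self, data):
        self.versions = [copy.deepcopy(data)]   # version 0 is the source state
        self.bookmarks = {}                     # name -> version index

    def commit(self, data):
        """Record a new immutable version and return its index."""
        self.versions.append(copy.deepcopy(data))
        return len(self.versions) - 1

    def bookmark(self, name, version=None):
        """Give a human-readable name to a version (default: the latest)."""
        self.bookmarks[name] = len(self.versions) - 1 if version is None else version

    def branch(self, bookmark_name):
        """Fork a new dataset whose version 0 is the bookmarked snapshot."""
        snapshot = self.versions[self.bookmarks[bookmark_name]]
        return VersionedDataset(snapshot)

ds = VersionedDataset([{"id": 1}])
ds.commit([{"id": 1}, {"id": 2}])
ds.bookmark("before-qa-run")

qa_copy = ds.branch("before-qa-run")   # QA works on its own branch
qa_copy.commit([{"id": 1}])            # QA's changes never touch the original
```

The design choice mirrored here is the key to the collaboration Sharma mentions: because branches are cheap, independent forks of a named snapshot, development and QA can share a bookmark and diverge safely from the same data state.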
DataOps in Practice
Delphix says that one-third of the Fortune 100 companies are its customers. They span verticals from finance to healthcare, from banking to retail.
One Fortune 100 customer is Dentegra, the largest dental benefits company in the United States. Dentegra began using Delphix for moving data to the cloud.
"With Delphix," Sharma says, "the organization reduced the time it takes to move data to cloud environments from more than eight weeks to mere hours, and decreased storage requirements in AWS by leveraging virtual instead of physical data copies."
The company also uses Delphix to mask hyper-sensitive PII and PHI data before its replication to AWS.
“Additionally, Dentegra can determine requirements for a new application project on one day, then gather the necessary data and compute resources to execute against those requirements in under 24 hours,” Sharma says.
Delphix is also working on new solutions to help developers treat data as code for faster, more flexible software development. One of the products enabling this is Delphix for Healthcare, announced this year. Healthcare companies can use its Dynamic Data Platform and "Data Pods" technology for rapid data provisioning, secure data masking, and self-service data access while staying compliant with HIPAA and other data privacy regulations.
Delphix is also working to integrate with new partners, allowing enterprises to accelerate their adoption of a multi-cloud strategy, Sharma says.