
How Data Quality Changed in 2017 and What it Means for the Year Ahead

By Kevin W. McCarthy  /  March 8, 2018

In 2017, we saw many changes across the United States and the world, and the Data Management space was no different. Last year brought some good and some not-so-good changes to our industry, but more important than what happened last year is what this year will bring. Three of the biggest shifts I recognized last year were the placement of vendors in the Gartner Magic Quadrant for Data Quality Tools report; the shift toward self-service models for Data Management; and the buzz around Big Data, Machine Learning and Artificial Intelligence (AI). Each of these changes will continue to shape the direction of Data Management and Data Quality in 2018, and each will influence how the top Data Management providers innovate and respond to shifts in the market.

The Gartner Magic Quadrant for Data Quality Tools report is regarded as the trusted source for vendor-agnostic information in the Data Quality industry. Each year, Gartner releases the report to show the market how the various vendors stack up. While it’s common for vendors to move around the quadrant from year to year, what happened last year was different: in 2017, almost every vendor dropped or moved backward in its quadrant placement. I wasn’t terribly surprised, because many of the top vendors in the Data Quality space are primarily enterprise players.

The thing about being an enterprise-level vendor is that Data Quality tends to be just one piece of a much larger puzzle. With Data Quality being another addition to a larger suite of software tools, there isn’t necessarily a focus on it. For the largest players in the space, solutions often come as part of an acquisition of smaller, pure-play Data Quality companies, allowing them to check the Data Quality box and move on to other things. This means that innovation frequently lags on the Data Quality piece as it is not a top priority.

So, what should we expect in the year to come? I think last year’s Magic Quadrant was a wake-up call for vendors still prioritizing Data Quality, and we’ll see more providers focusing on innovation and improving their Data Quality offerings. As mentioned in the market definition for the Magic Quadrant in 2017:

“As a result of a significant shift in the market over the past few years from traditional IT-driven Data Quality tools to modern business-driven ones, Gartner has redesigned this Magic Quadrant to reflect changing market dynamics and the necessity for innovation.”[i]

Given the new criteria, we are likely to see solutions providers respond by emphasizing their business-centric approaches to Data Quality. This means greater self-service capabilities and user-friendly, easy-to-use functionality. I believe this is good news because when products are designed to require less technical knowledge, data is more easily accessible to users across the business.

In addition to the innovation around business-friendly Data Quality tools, I expect there will be a shift toward designing and tailoring products for the midmarket. As mentioned above, many top vendors for Data Quality design Data Management tools for enterprise users. The issue here, of course, is that while the midmarket demands powerful Data Management solutions, the tools are simply not designed to scale down. Enterprise-level tools tend to be far more complex, take longer to install, require a great deal of technical expertise to operate and maintain, and have add-on capabilities that are unnecessary for most companies at the midmarket but drive up the price point of the product.

Therefore, in addition to greater self-service and user-friendly design, I expect to see increased scalability for products, improved infrastructure for Software-as-a-Service (SaaS) solutions, and mix-and-match options to tailor Data Management products to the organization’s need and size.

Providers that design scalable products can easily serve organizations of any size. Because midmarket and small businesses lack the IT resources that enterprise companies have, products that don’t require a whole IT staff are far better suited to their needs. With the shift to SaaS, self-service and a business-centric approach to Data Management, solutions must provide a simpler experience while retaining the functionality of more technical offerings. In 2018, I anticipate that we’ll see more vendors offering customizable suites of products. This way, organizations can select the solutions they need to drive business results, without having to pay for enterprise-level Data Management capabilities they may not necessarily need. Through these customizable packages, organizations at the midmarket will gain the opportunity to take greater control of their Data Management.

Throughout 2017, the buzz around Big Data, Machine Learning and AI continued, though most solutions providers are still figuring out how to incorporate these capabilities into their offerings effectively, and most prospects are still working out how best to monetize this kind of technology. Some providers have made greater headway in some of these areas, but it’s still far from the norm for most Data Management vendors. Years ago, we saw the emphasis shift from the data warehouse to so-called “data marts,” as the vision of a central data repository gave way to the reality of actionable data sets.

Now, we are seeing the emphasis turn from Big Data to Data Lakes. Since actual value can only be attained through productive application, I believe in 2018 it won’t be enough to simply include these capabilities. Value will be determined by how useful these capabilities are in practice. As the year plays out, I anticipate that more vendors will start taking this more pragmatic approach to Big Data, Machine Learning and AI by incorporating these capabilities in a practical way that will ultimately help make data clean, accurate and fit for purpose.

I expect many top Data Quality vendors to improve and evolve throughout 2018, and I look forward to the positive advancements that will be made in the space this year by providers that continue to prioritize Data Quality as a crucial piece of any overall Data Management program. Vendors that innovate and develop their products to meet the new market demands identified by Gartner will be particularly interesting to watch. Providers that demonstrate this commitment will adapt to market shifts and be best prepared to fulfill the needs of most organizations. By applying the lessons of 2017 and anticipating how the market will continue to evolve throughout 2018, Data Quality vendors can develop solutions that deliver the value and results customers expect. The new year is always a time of change, and I’m excited to see what this year has in store for the Data Quality industry.

[i]Selvage, Mei Yang, Saul Judah and Ankush Jain, Magic Quadrant for Data Quality Tools, Gartner, October 24, 2017.

About the author

Kevin W. McCarthy is a Director of Product Marketing at Experian Data Quality. He leverages his vast experience with data quality technology and customer implementations to help clients achieve their data management objectives. Kevin has spent more than 20 years in the data management space, first as a professional services consultant and later developing product management and product development functions responsible for development, documentation, and customer support. As part of Experian’s global product management team, Kevin works to bring practical data quality solutions to market by understanding customer expectations and changing market dynamics. He is a frequent blogger and expert panelist. Kevin graduated from Merrimack College with degrees in Computer Science and Psychology. He resides on the North Shore with his wife and two boys, and spends his spare time as a recreational runner. Keep up with Kevin through his LinkedIn profile and read his Experian blogs.

About Experian Data Quality

Experian Data Quality enables organizations to unlock the power of data. We focus on the quality of our clients’ information so they can explore the meaningful ways they can use it. Whether optimizing data for better customer experiences or preparing data for improved business intelligence, we empower our clients to manage their data with confidence. We have the data, expertise, and proven technology to help our customers quickly turn information into insight. We’re investing in new, innovative solutions to power opportunities for our people, clients, and communities. To learn more, visit www.edq.com.

Follow Kevin and Experian Data Quality at: Twitter, LinkedIn, Facebook
