
Six Trends in the Current State of Data Quality

By Farnaz Erfan

Businesses today have more data at their fingertips than at any time in history. But an abundance of data does not automatically translate into the insights needed to drive successful business strategies. In fact, according to The State of Data Quality in the Enterprise, 2018 study, many companies struggle with a variety of Data Quality and Data Preparation challenges in their quest to turn data into valuable business insights.

Based on a survey of 290 executives and IT professionals at enterprises with $100 million or more in annual revenue, the report spotlights the need for organizations to reach more comprehensive levels of Data Quality. It also provides benchmarks companies can use to gauge how their own Data Quality compares.

The survey unveiled some interesting insights:

  • Data Quality Maturity: Companies that have invested in the maturity of their Data Quality report higher levels of satisfaction. However, only 15 percent of businesses have actually deployed a mature Data Quality model, and just 40 percent have developed one.
  • Data Preparation Activities: Mature Data Quality segments perform more high-value Data Preparation activities such as monitoring the quality of data, data profiling and investigation, and predicting future quality issues so they can apply corrective actions (a brief sketch of what profiling and de-duplication can look like in practice follows this list). They also spend less time on low-value activities such as simply ingesting or collecting data from various sources.
  • Overall Data Quality Responsibility: According to the survey, IT professionals are most likely to take primary responsibility for Data Quality. However, because Data Quality is an issue touching all parts of an organization, many survey respondents felt that the responsibility should be shared more equally with line of business managers, business domain experts, and even CIOs and CDOs.
  • Excel and SQL Usage: Across the board, even in organizations with mature Data Quality, Excel, custom coding, and SQL are still used extensively, with 68 percent of all organizations using Excel. This may indicate that deployed Data Quality solutions fall short on functionality and user-friendliness; adding self-service data preparation and interactive data profiling to these solutions would likely remedy the situation.
  • Data Lake/Cloud Storage: While the usage of Data Lakes and the public Cloud for storage is increasing, overall Data Quality satisfaction levels are relatively modest (3.4 on a 1-5 scale). This suggests that while data access and storage issues have largely been solved, the same is not yet true of Data Quality.
  • Data Quality Tool Purchase Drivers: According to the study, the top purchase drivers, criteria, and functionality that organizations look for when investing in more sophisticated Data Quality tools are performance and scalability, visualizations, de-duplication, usability, and live interaction with data.
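
To make a couple of these practices concrete, here is a minimal sketch, not taken from the study, of what basic data profiling and de-duplication can look like in Python with pandas. The table and its column names (customer_id, email, revenue) are assumptions made purely for illustration, not a reference to any particular tool mentioned in the report.

    import pandas as pd

    # Hypothetical customer records; the column names are assumptions for this example.
    customers = pd.DataFrame({
        "customer_id": [101, 102, 102, 103, 104],
        "email": ["a@example.com", "b@example.com", "b@example.com", None, "d@example.com"],
        "revenue": [1200.0, 450.0, 450.0, 980.0, None],
    })

    # Basic profiling: row volume, completeness (null counts), and summary statistics per column.
    print("Row count:", len(customers))
    print("Null counts:\n", customers.isna().sum())
    print("Summary statistics:\n", customers.describe(include="all"))

    # De-duplication: drop rows that repeat the same business key.
    deduplicated = customers.drop_duplicates(subset=["customer_id", "email"])
    print("Rows removed as duplicates:", len(customers) - len(deduplicated))

Even a lightweight check like this surfaces missing values and repeated records before the data is used downstream, which is the kind of monitoring and investigation the more mature segments in the survey report doing routinely.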

Clearly, organizations face numerous challenges as they strive to turn data into valuable business insights. Data complexity and variety are growing as companies continue to ingest data from first-, second-, and third-party sources, creating an increasingly complex data environment. While Data Lake and public Cloud usage is growing and helping organizations meet their data storage challenges, companies continue to wrestle with Data Preparation issues. Today, business leaders, the office of the CDO, and technical teams all share the responsibility for improving Data Quality, and they are looking for comprehensive capabilities with strong usability and visualization to solve these challenges.
