
Advances in Data Quality Management


Data Quality Management (DQM) is always advancing, not necessarily through technological leaps, but through broader adoption driven by shifting business patterns. As organizations shift to a digital format, DQM is being used more and more.

Other reasons for the increased use of DQM range from falling prices for sensors on manufacturing equipment to supply chain members being pressured to become more efficient.

Data Quality Management has the goal of providing accurate information. The process starts with the acquisition of data and ends with the effective distribution of accurate data.

DQM also includes oversight of an organization’s data. Accurate, consistent information is a necessity for any reliable data analysis. Data Quality is essential for good communication and for drawing accurate insights from analysis.
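To make “quality” concrete, here is a minimal sketch of the kind of rule-based checks a DQM pipeline might run on incoming records. The field names and rules are illustrative assumptions, not a standard:

```python
# Minimal sketch of rule-based data quality checks (illustrative only).
# Field names and rules are hypothetical examples, not a standard.

records = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": None,            "age": 29},   # completeness failure
    {"id": 3, "email": "not-an-email",  "age": -5},   # validity failures
]

def check_record(rec):
    """Return a list of human-readable quality issues for one record."""
    issues = []
    if rec.get("email") is None:
        issues.append("missing email")                 # completeness
    elif "@" not in rec["email"]:
        issues.append("malformed email")               # validity
    if not (0 <= rec.get("age", -1) <= 130):
        issues.append("age out of plausible range")    # validity
    return issues

for rec in records:
    problems = check_record(rec)
    if problems:
        print(f"record {rec['id']}: {', '.join(problems)}")
```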

Digital transformation will increasingly change how organizations operate and behave.

To maximize the effects of the transformation, it’s necessary to embrace automation, artificial intelligence, and the cloud. The efficient use of these technologies will mean a dramatic shift in business models, mindsets, and work cultures.

Lower-Priced Sensors Promote Quality 4.0

Manufacturers are just starting to embrace the philosophy of Quality 4.0 (quality management for the Industry 4.0 era). Those that have are seeing significant improvements in their Data Quality Management.

With the prices of sensors and actuators dropping, mid-sized manufacturers are realizing that the visibility these devices offer can be enhanced by embracing the Quality 4.0 philosophy. They have discovered that monitoring equipment and products with technology, and promoting a philosophy of quality throughout the workplace, promotes efficiency, productivity, and excellence.

Quality 4.0 merges digital technologies with traditional methods to improve quality in manufacturing. It includes adapting the work culture (particularly management) to maximize the impact of the technological changes.

Quality 4.0 doesn’t replace traditional methods, but builds on and improves them. The benefits of using Quality 4.0 include:

  • Increased productivity
  • Increased profits
  • Improved operations through real-time monitoring, data gathering, and predictive analytics
  • Preemptive identification and resolution of quality issues
  • Easier regulatory compliance
  • An improved brand reputation

Becoming a “factory of the future” is no longer limited to extremely large manufacturers, but can now be achieved by mid-sized manufacturers. Quality 4.0 becomes the icing on the cake, technologically speaking.

After installing sensors, an Industrial Internet of Things (IIoT) system, connections to the cloud, and the use of AI, developing a Quality 4.0 philosophy is the logical next step.
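As a simplified illustration of the real-time monitoring those sensors enable, the sketch below flags readings that drift outside rolling control limits. The readings, window size, and limits are invented for demonstration; a real deployment would consume a live IIoT feed (MQTT, OPC UA, or similar) rather than a Python list:

```python
# Simplified sketch of real-time quality monitoring on a sensor stream.
# Readings and control limits are invented for illustration.
from collections import deque
from statistics import mean, stdev

WINDOW = 20          # rolling window of recent readings
SIGMA_LIMIT = 3.0    # flag readings beyond 3 standard deviations

def monitor(readings):
    window = deque(maxlen=WINDOW)
    for t, value in enumerate(readings):
        if len(window) >= 5:  # need a few points before judging
            mu, sigma = mean(window), stdev(window)
            if sigma > 0 and abs(value - mu) > SIGMA_LIMIT * sigma:
                print(f"t={t}: reading {value:.2f} outside control limits "
                      f"(mean {mu:.2f}, sigma {sigma:.2f})")
        window.append(value)

# Simulated temperature stream with one obvious fault injected.
stream = [70.1, 70.3, 69.8, 70.0, 70.2, 70.1, 69.9, 95.0, 70.0, 70.2]
monitor(stream)
```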

The Cloud-First Strategy 

Moving to the cloud provides easy access to cutting-edge DQM tools, which in turn improves Data Quality. The “cloud-first” strategy promotes use of the cloud and, as a consequence, superior Data Quality Management.

The cloud-first strategy is based on a conscious decision to treat the cloud as the first choice for new information technology solutions.

As more and more software shifts from a downloadable format to a software-as-a-service format, relying on the internet for access to software has become the norm. Examples include Microsoft 365, Dropbox, WordPress, Salesforce, and Zoom. Overall, using cloud services is often more efficient, more productive, and less expensive.

With comfort levels around the cloud rising and cloud services becoming demonstrably more cost-effective, the decision to embrace a cloud-first strategy is a simple one.

Data Quality Management in Banking: Artificial Intelligence and Machine Learning

Banks are adopting the philosophy that automated services make far fewer mistakes than humans, and fewer mistakes mean more accurate data.

When the data delivered to artificial intelligence algorithms is unclean, the output invariably leads to bad decision-making. The banking industry has noticed this and has taken steps to use AI (artificial intelligence) and ML (machine learning) to collect its data.

By eliminating the human factor, they are improving the quality of their data and their Data Quality Management.
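As one hedged sketch of how this might look in practice, the example below uses scikit-learn’s IsolationForest to flag anomalous transaction records before they pollute downstream systems. The features and data are invented; a production banking pipeline would use far richer signals:

```python
# Hedged sketch: using an unsupervised model (IsolationForest) to flag
# suspect records before they enter downstream systems. The features and
# data here are invented for illustration.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Columns: transaction amount and hour of day (hypothetical features).
normal = np.column_stack([rng.normal(50, 10, 500), rng.integers(8, 18, 500)])
odd = np.array([[9_000.0, 3], [-40.0, 2]])        # clearly anomalous rows
data = np.vstack([normal, odd])

model = IsolationForest(contamination=0.01, random_state=0).fit(data)
labels = model.predict(data)                      # -1 = anomaly, 1 = normal

for row in data[labels == -1]:
    print(f"flag for review: amount={row[0]:.2f}, hour={int(row[1])}")
```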

McKinsey’s Global AI Survey found that nearly 60% of financial-services respondents use at least one AI capability. The most common AI technologies in use were chatbots and virtual assistants, along with RPA (robotic process automation) for predictable, structured tasks.

Technology designed to ensure good Data Quality has become an important and significant investment in the finance and banking sectors.

Augmented Data Quality

Augmented Data Quality applies machine learning technologies to Data Quality work, offering organizations the ability to enhance automation and shorten completion times.

Augmented Data Quality reduces the amount of manual work needed to process data and supports enhanced Data Quality. This machine learning strategy uses natural language processing and delivers high levels of Metadata Management, Data Governance, and Data Quality.

Augmented Data Quality allows many routine Data Quality tasks to be automated, such as:

  • Merging
  • Cleansing
  • Monitoring
  • Profiling
  • Troubleshooting poor quality warnings or anomalies
  • Data matching (see the sketch after this list)
  • Automatic linking between entities
  • Automatic alignment between IT control rules and business rules
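To make one of these tasks concrete, here is a minimal data matching sketch using only the Python standard library. The record names and similarity threshold are assumptions for illustration; real augmented Data Quality tools use trained models and much richer features:

```python
# Illustrative sketch of fuzzy record matching with the standard library.
# Names and threshold are hypothetical examples, not production values.
from difflib import SequenceMatcher

def similarity(a, b):
    """Case-insensitive string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

source_a = ["Acme Corporation", "Globex Inc.", "Initech LLC"]
source_b = ["ACME Corp", "Globex Incorporated", "Umbrella Group"]

THRESHOLD = 0.6   # tune per dataset; 0.6 is an arbitrary starting point

for name_a in source_a:
    best = max(source_b, key=lambda name_b: similarity(name_a, name_b))
    score = similarity(name_a, best)
    if score >= THRESHOLD:
        print(f"probable match: {name_a!r} ~ {best!r} (score {score:.2f})")
    else:
        print(f"no confident match for {name_a!r}")
```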

Data Fabric and Data Quality Management

Use of a data fabric has the potential to significantly improve how efficiently data is used. A data fabric is a form of Data Management architecture that helps businesses manage data spread across many systems.

It provides a consistent, real-time user experience and supports access to data by any appropriate member of the business.

Gartner predicts that, “by 2024, data fabric deployments will quadruple efficiency in data utilization while cutting human-driven data management tasks in half.” Data fabric performs or enhances many of the tasks normally handled by Data Quality Management.

Data fabric automates the process of data discovery and Data Governance, delivering clean, high-quality data for analytics and artificial intelligence. The ultimate goal of a data fabric is to simplify Data Management and the process of gaining insights from data. A data fabric supports the following features:

  • Storage of both structured and unstructured data
  • Connections to multiple data sources
  • Support for Data Governance
  • Data ingestion capabilities, including data transformation
  • Storage of metadata and data catalog source information
  • Data integration across clouds
  • A graph engine that links data to reveal complex relationships

While many of the features listed above are part of modern Data Management platforms, one important feature that is unique to data fabric platforms is data virtualization.

Data virtualization creates a data abstraction layer by connecting to data silos and gathering and transforming their data to support real-time and near-real-time insights. It gives direct access to transactional and operational systems in real time, whether on-premises or in the cloud.
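The abstraction-layer idea can be sketched in a few lines. In this hypothetical example, callers issue one query while the layer fans out to registered sources; a real data fabric would federate live databases, APIs, and cloud stores rather than in-memory stand-ins:

```python
# Minimal sketch of the abstraction-layer idea behind data virtualization:
# callers query one interface while the layer fans out to live sources.
# The sources here are in-memory stand-ins for illustration only.
from typing import Callable

class VirtualLayer:
    def __init__(self):
        self._sources: dict[str, Callable[[], list[dict]]] = {}

    def register(self, name: str, fetch: Callable[[], list[dict]]):
        """Register a source by a fetch function that returns live rows."""
        self._sources[name] = fetch

    def query(self, predicate: Callable[[dict], bool]) -> list[dict]:
        """Apply one predicate across all sources, tagging each row's origin."""
        results = []
        for name, fetch in self._sources.items():
            for row in fetch():
                if predicate(row):
                    results.append({**row, "_source": name})
        return results

layer = VirtualLayer()
layer.register("crm", lambda: [{"customer": "Acme", "region": "EU"}])
layer.register("erp", lambda: [{"customer": "Acme", "open_orders": 3}])

# One query spans both systems with no data copied into a warehouse first.
print(layer.query(lambda row: row.get("customer") == "Acme"))
```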

The Cargo Container Shipping Industry and Data Quality (or Supply Chain Chaos)

Recent delays in the delivery of cargo containers have resulted in an examination of the primitive ways the global supply chain is run.

Expect to see a dramatic increase in the use of Data Quality Management in the cargo container shipping industry. This is primarily because customers (and various governments around the world, including the U.S. government) are demanding greater speed and efficiency in the delivery of their products.

According to Gartner’s Supply Chain Digital Transformation, cargo ships and shipping ports do not communicate very well. Many overseas shipping companies are using primitive systems on their ships that do not include the internet, particularly after the ship has left dock. As a result, the world’s supply chains have been run chaotically and inefficiently. But that is changing.

The simplest way to improve the global supply chain’s efficiency involves internet access for communications, real-time tracking of ships and their containers as part of the IoT, and the coordination of quality data for deliveries. (This is an area ripe for innovation, including the use of AI and ML.)
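As a hedged sketch of what Data Quality checks on tracking data might look like, the example below validates a stream of container position pings for out-of-order timestamps and physically implausible speeds. The ping format, limits, and track are assumptions for illustration:

```python
# Hedged sketch: basic quality checks on container tracking pings.
# The ping format, limits, and data are assumptions for illustration.
from math import radians, sin, cos, asin, sqrt

MAX_SPEED_KNOTS = 30.0   # faster than any cargo ship; flags GPS glitches

def haversine_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * asin(sqrt(a)) * 3440.065   # Earth radius in nautical miles

def check_pings(pings):
    """pings: list of (hours_since_start, lat, lon), assumed time-ordered."""
    for (t0, la0, lo0), (t1, la1, lo1) in zip(pings, pings[1:]):
        if t1 <= t0:
            print(f"out-of-order timestamp at t={t1}")
            continue
        speed = haversine_nm(la0, lo0, la1, lo1) / (t1 - t0)
        if speed > MAX_SPEED_KNOTS:
            print(f"implausible speed {speed:.0f} kn between t={t0} and t={t1}")

# Simulated track with one corrupted position fix.
track = [(0, 1.25, 103.8), (1, 1.30, 104.1), (2, 45.0, -70.0), (3, 1.40, 104.7)]
check_pings(track)
```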

Historically, for many ships, scheduled arrival times were generally a little vague. (The ships got there when they got there.)

Increasing efficiency also involves communication between cargo ships and shipping ports, to confirm the ports are not overloaded and can accept a delivery with minimal wait time.

While the Industrial Internet of Things is still in its infancy, we can expect dramatic improvements in the supply chain as cargo ships and shipping ports modernize their technology and use Data Quality Management to streamline deliveries and keep track of cargo.

