
Justify MDM Investments with an Alternative Approach


How far can Data Quality fall before real pain sets in? Not far.

In an operational environment, the answer is around 1.5 percent, according to research conducted by Data Quality and Master Data Management vendor Innovative Systems. Once quality slips below that threshold, pain surfaces in the shape of employees no longer trusting the data they use and clients becoming frustrated by the presentation of incorrect information. Analytics can become a basket case.

Addressing that pain requires validating, matching, and merging existing data so that you can drive out duplicates and enrich records as new data becomes available. In other words, it requires Master Data Management (MDM), which helps companies share data definitions and standards across the enterprise and avoid confusion, errors, overlaps, and redundancies across multiple business units.
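To make the validate-match-merge idea concrete, here is a minimal sketch of deduplication by normalized match keys. The field names and rules are illustrative assumptions, not any vendor's actual matching logic.

```python
# Illustrative match-and-merge: normalize a couple of customer fields,
# group records that collide on the normalized key, and merge each group
# into a single "golden" record.

from collections import defaultdict

def normalize(record):
    """Build a crude match key from name and postal code."""
    name = " ".join(record.get("name", "").lower().replace(".", " ").split())
    postcode = record.get("postcode", "").replace(" ", "").upper()
    return (name, postcode)

def merge(group):
    """Merge duplicates, keeping the most complete value seen for each field."""
    golden = {}
    for record in group:
        for field, value in record.items():
            if value and len(str(value)) > len(str(golden.get(field, ""))):
                golden[field] = value
    return golden

def match_and_merge(records):
    buckets = defaultdict(list)          # match key -> candidate duplicates
    for record in records:
        buckets[normalize(record)].append(record)
    return [merge(group) for group in buckets.values()]

customers = [
    {"name": "J. Smith", "postcode": "90210", "email": ""},
    {"name": "j smith",  "postcode": "90210", "email": "jsmith@example.com"},
]
print(match_and_merge(customers))        # one merged record instead of two
```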

And yet there may still be some hesitancy, either about beginning an MDM initiative or about re-invigorating one in organizations that have invested millions of dollars in it over the years but still haven't realized the level of Data Quality they envisioned. It's no small task to cost-effectively integrate multiple systems, each hosting its own master data, with perhaps a few million customer records spread across them.

Stakeholders need to buy into the vision if the business is to achieve a single view of customer, vendor, product, or other entities. That's tricky, since MDM technology typically isn't widely used among business users, remaining instead in the domain of data stewards and governance executives. Business leaders don't always see what's in it for them. They're thinking more about how time-consuming it can be (potentially years) to cleanse and integrate their full data sets before those sets become available. And how personnel-intensive the effort will be. And how risky, too, in terms of time and cost overruns and the chance of still falling short on quality.

“Not being able to justify MDM is the leading reason that organizations have difficulty in starting one or re-starting one or merging them together,” Michael Ott, senior VP at Innovative, told an audience at the DATAVERSITY® Enterprise Data World Conference.

What is in it for them, though, is strengthened business capabilities and the ability to realize strategic initiatives. Consultancy First San Francisco Partners has identified a number of concrete benefits that MDM can bring. These include:

• Enabling data integrity for regulatory reporting compliance

• Maximizing product and services revenue by offering integrated solutions across business units

• Realizing operational efficiencies and on-time delivery by eliminating manually intensive activities and reducing error-prone data integration processes

• Enabling greater responsiveness to new business opportunities

The MDM Alternative

The typical way of approaching MDM may jeopardize those benefits, though, Ott argued. Both Virtual Master Data Management (VMDM) and traditional MDM have their advantages, and their drawbacks too.

Traditional models often just link master data to associated components or products. VMDM, by contrast, manages master data in a distributed fashion on the source systems: the data remains fragmented across those systems and is tied together by a central indexing service. That makes it faster, less disruptive, and less costly than traditional MDM for eliminating data inconsistency and for addressing the challenges of continuously changing data in a dynamic database environment. But it can produce inconsistent results and lower Data Quality for some business use cases.

“There is no reconciliation. There is no ability to review data and capture the results of that review of exceptions. And that’s a key reason why it makes it difficult to apply that to achieve operational results that you typically need in an MDM,” Ott said.
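The central indexing idea can be pictured with a short sketch: master records stay in their source systems, and the index only records which local records it has linked under a shared master ID, assembling a 360-degree view on demand. The classes, IDs, and source systems below are illustrative assumptions, not a description of any specific product.

```python
# Illustrative virtual-MDM-style central index: the data itself stays in the
# source systems; the index only tracks which source records are believed to
# represent the same real-world entity, and views are assembled at read time.

from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class VirtualIndex:
    # master_id -> list of (source_system, local_record_id)
    links: Dict[str, List[Tuple[str, str]]] = field(default_factory=dict)

    def link(self, master_id: str, source: str, local_id: str) -> None:
        self.links.setdefault(master_id, []).append((source, local_id))

    def resolve(self, master_id: str, fetchers) -> List[dict]:
        """Fetch the linked records from each source system on demand."""
        return [fetchers[source](local_id)
                for source, local_id in self.links.get(master_id, [])]

# Hypothetical source systems, each still owning its own copy of the data.
crm     = {"c-17": {"name": "Jane Smith", "phone": "555-0101"}}
billing = {"b-42": {"name": "J. Smith",   "balance": 120.50}}

index = VirtualIndex()
index.link("master-001", "crm", "c-17")
index.link("master-001", "billing", "b-42")

view = index.resolve("master-001", {"crm": crm.get, "billing": billing.get})
print(view)   # a view assembled at read time, with no reconciliation step
```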

A different methodology, championed by Innovative, is a cognitive or knowledge-based approach. In Innovative's case, its large knowledge base was crowd-sourced from thousands of clients over decades to build up proper and improper words and phrases, Ott noted. For the customer domain, for instance, the knowledge base holds both the proper spellings of name, information, account, and address terminology and the misspellings, “to the tune of about two-and-a-half million entries,” he said. And matching considers the context of the differences between a pair of records, not just the number of differences.
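As a rough illustration of how such a knowledge base might be applied, the sketch below standardizes known variants and misspellings before comparing two records. The tiny lookup table and field names are invented for the example and stand in for a knowledge base with millions of entries.

```python
# Illustrative knowledge-based matching: map known variants and misspellings
# to canonical terms first, then compare the standardized values. A known
# variant is then not treated as a difference at all.

KNOWLEDGE_BASE = {
    "st": "street", "str": "street", "steet": "street",   # known misspelling
    "wm": "william", "bill": "william",
    "rd": "road",
}

def standardize(text: str) -> str:
    tokens = text.lower().replace(".", " ").split()
    return " ".join(KNOWLEDGE_BASE.get(tok, tok) for tok in tokens)

def same_entity(record_a: dict, record_b: dict, fields=("name", "address")) -> bool:
    """Match only if every standardized field agrees."""
    return all(standardize(record_a[f]) == standardize(record_b[f]) for f in fields)

a = {"name": "Wm. Jones", "address": "12 Oak Steet"}
b = {"name": "William Jones", "address": "12 Oak Street"}
print(same_entity(a, b))   # True: the differences are known variants, not conflicts
```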

“An alternative approach for managing a single view of data allows the creation of 360-degree views having 99.5 percent quality,” he said, crushing the pain point, whether deployed in the cloud or on-premises:

“You bypass the traditional methods and allow a single view of customer or client, patient, provider—whatever your domain or domains are—to meet the specific business needs that you’re interested in—customer experience, digital transformation, GDPR, etc.”

Organizations then have the ability to put their data to use in a very short period of time, which, going back to the needs of the business user, is what it's all about.

Exception review is done to reach 98 percent or higher quality, he said, noting that this step is all but ignored in typical implementations:

“Once you get even to 98 percent accuracy, for example, a two percent level of duplication, it starts to be noticed,” Ott said. “And once it starts to be noticed, the data starts to be less than trusted, less than used, more abandoned, etc.”

This should make it clearer to executives that making better choices in investing in MDM is worth it. And maybe they're starting to realize that: a new study says the MDM market will grow from $2.89 billion this year to $6.88 billion by 2024.

Check out Enterprise Data World at www.enterprisedataworld.com

Here is the video of the Enterprise Data World Presentation:

