MDM 2.0: Comprehensive MDM – Part One of Five

by Mehmet Orun

My first Master Data Management (MDM) project, according to some old project notes I dug up recently, was in 1999.  Except we did not call it MDM then; it was simply how we were going to handle Customer data across multiple Operational Support Systems.  We used a custom development approach through code, at a time when data cleansing tools were limited.  I would call these the MDM v0 days.

Then, five or six years ago, strong Customer Data Integration (CDI) engine vendors emerged.  For us as an organization, this was a much more attractive approach, since we could focus on match rules instead of writing and maintaining code for matching and ID management.  We started with a robust Customer data model for our Customer-centric solution architecture and focused on automation rules, but at the end of the day we realized this architecture alone did not meet our needs.  While there is definite value to these solutions, which are an essential part of organizational master data management, they are but a part of the big picture.  This is the solution I refer to as MDM 1.0.

Over the years, we learned many lessons about what works, what does not, and what we were missing, and we defined a broader solution architecture to manage our master data.  While this means a broader, more comprehensive program, it should also yield the greater benefits that managed master data can provide.  I refer to this more comprehensive solution as MDM 2.0: managed master data beyond technology architecture.

MDM 2.0 is a comprehensive framework that includes:

  • Data Governance and Stewardship processes
  • Guidelines for MDM in Enterprise Applications and Technology Context
  • Master and Reference Data Management Services

while building upon the tools, techniques, and data architecture patterns of the early implementations.

This article seeks to introduce the context, processes, and components of MDM 2.0, which will be further detailed in future articles.

MDM 1.0 vs. 2.0

Before detailing the MDM 2.0 framework’s components, let’s look at a high-level comparison of the two implementation approaches:

  • Typical driver.  Traditional MDM (1.0): data warehousing/360 reporting, as well as CRM or M&A efforts.  MDM 2.0: a true 360 view across applications and reports, and SOA.
  • Typical architecture.  1.0: a single system of record or a hub-and-spoke CDI-type database, often in front of a DW.  2.0: hybrid; single or multiple systems of record; data lookup, quality, and publication services; an MDM hub integrated with reporting and integration, not just the DW.
  • Typical processes.  1.0: emphasis on data cleansing and match-merge/unmerge.  2.0: data stewardship, match-link, match-merge, address lookup, address standardization, address verification, data change management, and more.
  • Data security.  1.0: focused on who can read what information.  2.0: focuses on data creation, update, and read rights.
  • Business involvement.  1.0: executive sponsorship for the effort, involvement in match rule definition.  2.0: executive sponsorship as well as organizational change management and long-term data stewardship.
  • Change management need.  1.0: minimal, typically geared toward reporting users.  2.0: can be significant, based on the breadth of business groups and systems included in the MDM effort.

MDM 2.0 in Enterprise Architecture

If your organization is interested in implementing Service-Oriented Architecture (SOA), you cannot succeed without effective management of master data.

In an SOA environment, in order for:

  • Portlets to provide data-source-agnostic views that can operate on each other,
  • Rapid data integration technologies such as EII to deliver their time-to-value promise,
  • Hosted (Software as a Service) as well as on-site applications to have a consistent view of the information,

master data discrepancies across data sources must be resolved and relationships maintained.
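To make that dependency concrete, here is a minimal sketch (in Python) of a master data cross-reference lookup, the kind of service an MDM hub exposes to SOA consumers.  All names here (SOURCE_XREF, resolve_master_id, the sample IDs) are illustrative assumptions, not features of any specific product: each source system keeps its own local ID, and a hub-maintained mapping resolves them to a single master ID.

```python
from typing import Optional

# Hub-maintained cross-reference: (source system, local ID) -> master ID.
# Two different local IDs can map to the same real-world customer.
SOURCE_XREF = {
    ("CRM", "C-1001"): "MDM-42",
    ("ERP", "888213"): "MDM-42",   # same customer as the CRM record above
    ("WEB", "u-77"):   "MDM-43",
}

def resolve_master_id(source_system: str, local_id: str) -> Optional[str]:
    """Return the enterprise master ID for a source-specific record, if known."""
    return SOURCE_XREF.get((source_system, local_id))
```

With such a lookup in place, a portlet reading from CRM and an EII query reading from ERP can join on the same master ID rather than on conflicting local keys, which is what makes data-source-agnostic views possible.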

MDM 2.0 in Solution Processes

In order to be effective in the enterprise context, MDM 2.0 requires that master data be managed across layers:

  • New Business Processes are needed to manage overall and individual record data quality.  Existing business processes that defined previous data capture or sharing rules must also be examined for change management.
  • Application Processes that incorporate search-before-create, data quality verification, and merge/unmerge handling enable applications to maintain higher quality information, especially when a single system of record is not available.
  • BI Processes track business-user-specific views of the master data while basing them on the same consistent, granular “real world” views.
  • Integration Processes differentiate Updated data from New, ensure data quality verifications are part of Pub-Sub (Publish and Subscribe) processing, and provide support for enterprise as well as B2B (Business to Business) information integration.
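The integration-layer behavior above can be sketched in a few lines of Python.  This is a hedged illustration, not a real integration product: the store, the quality rule, and the function names are all invented for the example.  It shows the two key steps, gating publication on a quality check and classifying each record as New or Updated before notifying subscribers.

```python
known_records = {}  # the hub's view of already-published master records

def quality_check(record: dict) -> bool:
    """Minimal quality gate: required fields must be present and non-empty."""
    return bool(record.get("id")) and bool(record.get("name"))

def publish(record: dict, subscribers: list) -> str:
    """Verify quality, classify the change, then notify all subscribers."""
    if not quality_check(record):
        return "rejected"  # held for data steward review, never published
    event = "updated" if record["id"] in known_records else "new"
    known_records[record["id"]] = record
    for notify in subscribers:
        notify(event, record)
    return event
```

A first publish of a record yields a "new" event, a second publish of the same ID yields "updated", and a record failing the quality gate is rejected before any subscriber sees it.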

Understanding User Context

Another important aspect of managing master data is that different data services can be applied based on the source of the information.  If the information provider is an:

  • Original source of the information, such as the customer themselves for Customer data, then this should be treated as the reliable source (in most cases), with only address standardization and edit-check type rules applied
  • Internal user, then the data quality expectations of different user groups need to be understood
  • External source, then limits on enforcing data quality must be understood, and feasible data quality checks should be incorporated into the B2B architecture, along with quality clauses in data acquisition contracts
  • External data service provider (e.g. D&B, IMS, …), then the provider’s data management processes must be reconciled with the organization’s own, and especially with the above-mentioned sources of information
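One way to operationalize this distinction is a simple mapping from provider type to the data services applied.  The sketch below is illustrative only; the service names are placeholders standing in for whatever standardization, quality, and reconciliation services an organization actually runs.

```python
# Provider type -> data services applied, following the distinctions above.
# All service names are hypothetical labels, not real APIs.
SERVICES_BY_SOURCE = {
    "original_source":   ["address_standardization", "edit_checks"],
    "internal_user":     ["address_standardization", "edit_checks",
                          "per_group_quality_rules"],
    "external_source":   ["feasible_b2b_quality_checks",
                          "contract_quality_clauses"],
    "external_provider": ["process_reconciliation",
                          "source_precedence_rules"],
}

def services_for(source_type: str) -> list:
    """Return the data services to apply for a given information provider."""
    return SERVICES_BY_SOURCE.get(source_type, [])
```

Keeping this mapping explicit and central, rather than buried in each application, is what lets the same record receive consistent treatment no matter which channel it arrives through.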

In addition to differentiating sources of information, how information is consumed also needs to be understood.  As a result, you will find three key types of information:

  • Information that is broadly of interest to many different groups, which is also captured by many different groups
  • Information that is predominantly captured/managed by one group, but of interest to many groups
  • Information that is of interest to an individual group or user and no-one else

Once user context is defined, you can classify your master data using a series of stewardship types, which will enable you to handle your information, conflict resolution, and change management processes consistently across your organization:

  • Type 0: Information provided by the source itself.  Consider it authoritative and do not restrict creation or updates.  (Information sharing may have gates to ensure effective versioning or updates.)
  • Type 1: Commonly captured or used information.  As such, it is managed through a central process, and changes are restricted and handled through a change management process.
  • Type 1A: No changes are allowed in the Applications without external verification or updates from the data steward.  E.g. state codes.
  • Type 1B: Changes are allowed in the Application so that transactions can take place, but they are not made available to any other group without external data steward verification and approval.  E.g. creating a new account, or changing an account’s address only for the purpose of a transaction, without allowing unverified changes by just one user group.
  • Type 2: Information captured and stewarded by multiple groups.  This data is readable by many but only updatable by the group(s) responsible for it.  Clear update rules are required for Type 2 data; otherwise, consider Type 1.  In practice, this means the most recent update may often win.  E.g. internal credit rating of a customer.
  • Type 2A: The internal function’s update always wins.  For Type 2A, a customer’s internal credit rating will be unchanged regardless of the external rating (which would be available as a reference).
  • Type 2B: External data source updates are allowed through a managed process.  For Type 2B, if a customer’s external credit rating changes, the internal rating will be adjusted.
  • Type 3: Information captured and used by a specific user or group.  Due to limited sharing, this type requires the least master data stewardship and change management.  E.g. a person’s nickname.  Warning: any data proposed as Type 3 should still be periodically reviewed by the Domain Data Steward to ensure it is not duplicated or shared by other groups.

Once you have defined your data stewardship type for your data elements, then you can associate this with edit rules across your applications and as part of your data security framework.
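As a concrete illustration of that association, the Python sketch below drives a minimal edit-rule check from stewardship types.  The element names, actor labels, and rule logic are invented examples that follow the type definitions above; a real data security framework would be far richer.

```python
# Stewardship type per data element; names are illustrative examples.
STEWARDSHIP = {
    "state_code":      "1A",  # no changes without data steward verification
    "account_address": "1B",  # local changes allowed; sharing gated on approval
    "credit_rating":   "2A",  # internal function's update always wins
    "nickname":        "3",   # individual use; minimal stewardship
}

def can_update(element: str, actor: str) -> bool:
    """Minimal edit-rule check driven by the element's stewardship type."""
    stype = STEWARDSHIP.get(element)
    if stype == "1A":
        return actor == "data_steward"   # central process only
    if stype == "1B":
        return True                      # allowed locally; publication gated elsewhere
    if stype == "2A":
        return actor == "owning_group"   # only the responsible group
    return True                          # Type 3: unrestricted in this sketch
```

The point of the exercise is that once the type is recorded per element, the same small rule table can be enforced consistently across applications instead of each one inventing its own edit restrictions.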

Part 1 Conclusion:

Management of Master Data requires much more than a technology implementation.  It requires strong business support, a broader solution plan to meet the intended business need, and a focus on how the information is created, used, and managed.  Future articles will explore these topics in greater depth.

What’s Next:

Part 2.

Lessons of MDM 1.0 and Elevator speeches for your next initiative

Part 3.

MDM 2.0 Architecture in Enterprise Context

Part 4.

Data Management Data Services in SOA

Part 5.

Information Providers, Consumers, and Data Stewardship Types

Acknowledgments:

This work would not have been possible without months of discussion and brainstorming activities with my many colleagues at Genentech as well as Christine Carpentier, who originally suggested the idea of typing data elements to support Data Stewardship.

ABOUT THE AUTHOR

Mehmet Orun

Mehmet Orun is responsible for data architecture, data quality, data management, integration services, and associated technology roadmaps and methodologies.  Mehmet’s interests and expertise include data- and metadata-driven solutions to improve data quality, maintainability, and utility while bridging the gap across applications, processes, and structured and unstructured data sources.

