The Evolution of Master Data Management at Intel: A case study of finance master data

by Vanitha Srinivasan

Master Data and its Management

The master data elements that people most often think of are customer, supplier, or product. Financial master data is equally important: it is highly shared across the entire supply chain and foundational to the operation of any enterprise. Figure 1 below describes the scope and relationships of master data in an organization.

Figure 1 – Scope of Master Data Management

Master data management (MDM) comprises a set of processes and tools that consistently defines and manages the non-transactional data entities of an organization which may include reference data.

(Source: http://en.wikipedia.org/wiki/Master_data_management).

The foundation of MDM is a governance mechanism (workflows) by which data policies and definitions can be enforced at enterprise scale. Tools and processes exist to facilitate the management of master data, but setting up an MDM system entails more than installing software and running a two-year transition project. Successful master data management requires governance: ongoing collaboration between IT and the business to monitor, adjust, and improve data management.

Intel’s evolution with MDM

Over the past seven years, Intel has been building out its master data management systems. The finance subject area has had a central master data solution since the 1990s; the other core master data areas were engaged in the mid-2000s.

The roadmap

It wasn’t until the company initiated an organizational realignment in 2003 that it came around to the idea that the integration and management of master data would be core to the success of the entire organization. The master data areas core to supporting Intel’s business were prioritized first for delivery: customer, supplier, item, worker, location, and finance.

In 2004, an MDM program was created that included a program team, product owners, and data architects. The mission of the central master data management team was to align and consolidate people, processes, and technologies. This was followed by the establishment of governance boards for each of the core master data areas in 2006. With one central team managing master data delivery across IT projects, there were efficiency improvements, better decision making, and elimination of rework.

By 2007, the need to extend MDM to off-the-shelf products became clear, as the master data management applications then in place did not meet the company’s needs. The MDM teams began engaging with vendor influence councils to drive the delivery of solutions that met manufacturing business needs.

Implementations of mature MDM solutions began in 2009, and today these applications are maturing at different rates. Some master data areas are more mature than others, and Intel continues to make refinements as the MDM solutions integrate with the ERP platform.

General Master Data Management Best Practices

A Shift in Thinking: From a Reactive to a Proactive Approach

In the early stages (pre-2004), the master data management teams were reactive to problems that came up with the MDM applications. Over the years there has been a gradual shift in approach: IT product owners have become proactive. Besides engaging with users to identify emerging requirements, they collaborate with the MDM applications’ product support groups to drive issue resolution before an issue is raised by business users through IT service desk tickets. Today each business area has funded its MDM project separately, which means there are multiple MDM capabilities; the goal state is to consolidate into a few key products.

Usage of Enterprise Architecture

An aspect of master data management architecture that has received industry focus is the use of enterprise architecture practices to coordinate the business, data, application, and technology domains and to formalize the approach to solution design and delivery. These practices keep IT agile in responding to business needs and decrease the total cost of ownership of the MDM product.

Today Intel IT utilizes an enterprise architecture framework to implement solutions. Figure 2 shows a sample set of artifacts and roadmaps that combine the processes, data models, applications and technology domains to form the “Enterprise Model.” These artifacts drive solution architecture that addresses the complete needs of an MDM solution.

Figure 2- The Enterprise Model

The enterprise architecture artifact that has driven success in MDM has been the creation of data blueprints. Data blueprints address data design and business process design. Their creation involves subject matter experts and business experts from across IT and business units. The blueprints go through constant revision and improvement under the supervision of a change control board.

The governance board ensures that the structure, quality, and workflow of data meet business needs. The board also identifies downstream applications and programs that are not in compliance and guides them through the change management process.

Financial Master Data Management

Managing finance master data poses distinct challenges compared with other types of master data because accounting controls, regulations, and reporting standards change over time, and organizations have to accommodate those changes in their processes and applications. Finance was one of the initial master data subject areas where Intel implemented MDM. Finance master data management (FMDM) has a twofold scope: organizational master data and finance master data. Organizational master data broadly refers to profit centers (business units that generate revenue), cost centers (synonymous with departments), and other conceptual entities. Finance master data refers to currencies, general ledger accounts, and the fiscal calendar. FMDM has come a long way at Intel over the past 20 years.
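
As an illustration, the twofold scope can be sketched as two small groups of entities. The class and field names here are illustrative, not Intel’s actual schema:

```python
from dataclasses import dataclass

# Organizational master data: conceptual entities that structure the business.
@dataclass(frozen=True)
class ProfitCenter:
    code: str           # e.g. "PC-1000"; a business unit that generates revenue
    name: str

@dataclass(frozen=True)
class CostCenter:
    code: str           # synonymous with a department
    name: str
    profit_center: str  # code of the owning profit center

# Finance master data: shared reference entities.
@dataclass(frozen=True)
class GLAccount:
    number: str         # e.g. "401000"
    description: str
    currency: str       # ISO currency code, e.g. "USD"
```

Keeping both groups in one governed model is what lets every downstream application share the same definitions of "department" or "account."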

  • In 1990 there were no global standards and no common fiscal calendar.
  • In 1994, Intel adopted a single fiscal calendar for all of its business units and subsidiaries around the world, though each subsidiary had a separate chart of accounts.
  • From 1995 to 1998, Intel implemented ERP in seven areas of the supply chain, and the business was introduced to concepts such as profit centers, cost centers, and a centralized account maintenance process.
  • Most of the supply chain processing was integrated into the ERP platform in 2003, which drove the consolidation of records of origin for key finance master data. In tandem, finance began using a home-grown application for workflow and life cycle management. For example, creating a general ledger account was managed as a workflow: the process began with a request for a new account; the account councils made sure it mapped properly to the global accounting standards; and the relevant business or operational unit then approved it. Attributes were added once an account was approved for creation.
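
The account-creation workflow described above can be sketched as a simple state machine. The state names and transitions are illustrative, not the actual application’s:

```python
# Hypothetical sketch of the general ledger account creation workflow:
# request -> council review -> business-unit approval -> attribute enrichment -> active.

WORKFLOW = {
    "requested":      "council_review",  # new-account request submitted
    "council_review": "bu_approval",     # account council verifies mapping to global standards
    "bu_approval":    "enrichment",      # relevant business/operational unit approves
    "enrichment":     "active",          # attributes added after approval
}

def advance(state: str) -> str:
    """Move an account request to the next stage, or raise if terminal/unknown."""
    if state == "active":
        raise ValueError("workflow already complete")
    if state not in WORKFLOW:
        raise ValueError(f"unknown state: {state}")
    return WORKFLOW[state]
```

The value of modeling it this way is that an account cannot reach the "active" stage without passing through the council and approval stages in order.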

IT Best Practices for Financial Master Data Management at Intel

  • Governance is a major factor in maintaining data quality. Today cleansing is performed by a central finance data maintenance group rather than an automated system. The group is responsible for adds, changes, and inactivations to company code configuration; synchronization of the operations and business hierarchies; configuration of bank master data and electronic bank statements; currency code configuration and rate validation; and security profile maintenance. It also conducts internal audits on data owners to determine who uses finance master data and who approved access to it. While a centralized maintenance group gives better control of data, the business process is not ideal because multiple business teams remain involved in approving maintenance decisions. This fragmentation is an issue when a data quality problem affects multiple teams. Governance in FMDM is still maturing.
  • Life cycle management is another key concept. In addition to creation, maintenance, and updates, the FMDM team proactively manages deletions and inactivations. For example, if a cost center or department hasn’t been active for 12 months, and finance operations agrees to discontinue its usage, the cost center is inactivated instead of being deleted from the system. This enables reuse of the cost center in the future if required and preserves it for auditing reasons. An alerts capability to assess the downstream impact of a data management decision, such as inactivating a cost center, is a need not met by the current application.
  • An intuitive and friendly user interface is important for business operations to use the MDM application.
  • Changes to accounting standards and regulations affect the business rules in an FMDM application. Currently the finance business unit is evaluating a move from US GAAP (U.S. Generally Accepted Accounting Principles) to IFRS (International Financial Reporting Standards). To avoid rework, the architecture of finance master data applications needs to be flexible to changing standards. Today the home-grown FMDM application can handle only a single chart of accounts. One of the challenges ahead will be to add flexibility, such as support for multiple charts of accounts, to the application or to evaluate products in the market that meet such needs.
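
The life cycle rule described above, inactivating rather than deleting a cost center after 12 months without activity, could be sketched like this. The threshold and field names are assumptions for illustration:

```python
from datetime import date, timedelta

# Assumed threshold, approximating the 12-month inactivity rule described above.
INACTIVITY_THRESHOLD = timedelta(days=365)

def should_inactivate(last_activity: date, today: date, finance_ops_agrees: bool) -> bool:
    """A cost center qualifies for inactivation only when it has been idle for the
    threshold period AND finance operations agrees to discontinue its usage."""
    return finance_ops_agrees and (today - last_activity) >= INACTIVITY_THRESHOLD

def inactivate(cost_center: dict) -> dict:
    # Flip the status flag rather than deleting the record, so it is
    # retained for auditing and can be reused in the future.
    return {**cost_center, "status": "inactive"}
```

Note that deletion never occurs: the status change preserves the audit trail, which is the point of the rule.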

Conclusion

Successful master data management requires extensive collaboration between IT and the business units. It takes years for organizations to implement an enterprise-wide MDM strategy, so planning is key to success.

Several products exist in the market; due diligence on requirements is essential to selecting the right MDM product and building a thorough understanding of it so that it can be integrated into the existing environment.

Tools and processes exist to facilitate the management of master data; however governance is the key to continuing success.

To learn more about IT@Intel, visit us at www.intel.com/IT.

ABOUT THE AUTHOR

Vanitha Srinivasan, Enterprise Architect, Intel

Vanitha Srinivasan is an enterprise architect in the IT group at Intel, the world’s largest semiconductor manufacturer and a leading manufacturer of computer, networking, and communications products. She specializes in data and applications. This article explores the ongoing evolution and implementation of master data management (MDM) procedures at Intel.
