
A Brief History of Master Data


Master data is generally described as essential business data about people, places, and things. Master Data Management, or MDM, describes a system for managing that data, which is distinct from the data itself. Master data is not typically transactional data, but in some situations transactional data can be treated as master data. For instance, if information about products, suppliers, and vendors exists only within the transactional data (orders and receipts), then that transactional data must serve as the master data.

But before master data and transactional data, there was the datum. And to understand that history, we must go back in time.

In the 1640s, a datum was a fact assumed to be true, whether it came from the Bible or from a scientific study. A datum, the singular form of data, was often a fact used as the foundation for a mathematical calculation. The word data gained significant popularity in 1939, after Willard Cope Brinton published a book titled Graphic Presentation. Throughout his book, the term “data” described structured measurements taken from scientific observations or statistical results.

In 1946, the word data had its definition expanded when it was used to describe “information that can be transmitted and stored using a computer.” In 1954, the phrase “data processing” came into being, followed by “database” (the structured storage of data in a computer) in 1962. This expanded definition and new uses for the word “data” would not have been possible without the punch card.

For the 1890 census, Herman Hollerith designed a punch card system to tabulate the results. The count was completed in roughly three years and saved the U.S. government an estimated $5 million. (Hollerith later founded the company that became IBM.) All data at this time, including what would come to be called master data, was stored on punch cards. The situation became more challenging as the U.S. population grew and the census questions changed. To deal with this, Hollerith chose to classify data as static or changing, establishing the basic distinction between master data and transactional data. As computers progressed, storage moved from punch cards to magnetic tape, and then to disk.

Master Files

In 1898, Edwin G. Seibels invented the vertical file – what we now call the filing cabinet. Prior to its invention, businesses had traditionally folded papers, contracts, and documents, placed them in envelopes, and stored them in pigeon-hole drawers or flat-drawer filing systems. (Before that, they would bundle important documents and store them in a wooden box.) Seibels’ invention eliminated the wasted time spent finding and opening envelopes and made folding the papers unnecessary.

Filing cabinets led to filing systems, which in turn led to the master file. The master file held descriptive information, such as customer names, addresses, habitual preferences, and billing information. In many cases, it took the form of a large accordion folder with smaller folders inside – a large amount of information that could easily be pulled from the filing cabinet and taken to a desk for research. In 1936, the Social Security Administration created its “Death” master file, with the appropriate names and addresses. This file was eventually transferred to a computer system and can still be accessed today (though identity theft concerns have resulted in limited access).

In the 1950s and ’60s, very few businesses had computers, and master data consisted of what was – and still is – called “contact information,” which was typically copied by hand into the master file from an address book, or vice versa. Early electronic computers (before them, “computer” was the title given to the human mathematicians who did calculations by hand) were used primarily for mathematics. Contact information was typically stored in Rolodexes and address books – the now seemingly ancient predecessors of the smartphone.

The concept of master files was transferred to some corporate computers as early as the 1960s and gained popularity during the 1970s and 80s when computers became more abundant. Master files contain essentially the same descriptive data – such as names, addresses, and basic summary information – as master data. Master files continue to be used as a part of some system databases and were a precursor to master data.

Master Data

Data Management started as a concept in the 1960s, with the Association of Data Processing Service Organizations (ADAPSO) offering guidance on the practice, focusing on quality assurance and professional training. Master data developed out of the Data Management programs that arose in the 1980s, which quickly came to include Master Data Management (MDM).

Master data can be described as an organization’s core data, containing the basic information needed to conduct business. It is fairly stable information, changing only when something significant happens, such as a client moving to a new location. Although master data may describe transactions, it is not transactional in design. Master data generally covers four domains, and subdivisions within those domains are known as subdomains, entity types, or subject areas. The four general domains of master data are:

  • Customers: Subdomains include customers, employees, and salespeople.
  • Products: Subdomains include parts, stores, and assets.
  • Locations: Subdomains include office locations and geographic divisions.
  • Other: Subdomains include things like warranties, licenses, and contracts.

Some subdomains may be further divided. For instance, customers can be further divided by classifications such as “normal” customers and “executive customers,” or by their history. Products can be divided by sector and industry. Geographic areas can be broken down further into sales territories or by the concentration of customers.

Useful analytics and research depend on the accuracy of the master data. Master data can be stored in a central repository, sourced from one or several systems, or referenced centrally through an index. However, when it is used by several groups, master data is often stored redundantly across a variety of applications in an organization, and those copies can become inconsistent or inaccurate. To remedy this, an organization should maintain a single agreed-upon view of its master data, shared across the organization, and take care that the accurate version is the one on file. Curating and managing accurate master data minimizes chaos and maximizes efficiency.
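As a minimal sketch of the central-repository idea – the class and field names below are invented purely for illustration, not taken from any real MDM product – a shared registry can hold the single agreed-upon record for each customer, so that individual applications look the record up by key instead of keeping their own, possibly inconsistent, copies:

    # Minimal sketch of a central master data repository (illustrative only).
    # Applications look records up by key instead of storing their own copies.

    class MasterDataRepository:
        def __init__(self):
            self._customers = {}  # customer_id -> the agreed-upon ("golden") record

        def upsert_customer(self, customer_id, record):
            """Store or update the single authoritative version of a customer."""
            self._customers[customer_id] = dict(record)

        def get_customer(self, customer_id):
            """Every department reads the same record, keeping the view consistent."""
            return self._customers[customer_id]

    repo = MasterDataRepository()
    repo.upsert_customer("C-1103", {"name": "Phil's Auto Shop",
                                    "address": "1103 Harbor Street"})

    # Sales and advertising both read the same authoritative record.
    print(repo.get_customer("C-1103"))

Keeping the write path in one place is what makes the agreed-upon view enforceable: every consumer sees the same record, or none at all.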

In a transaction system, master data is commonly used alongside transactional data. When a customer purchases a product or a crate of materials is delivered, master data is involved. A product stocked at a specific location would use master data to describe its placement within the store. The relationship between transactional data and master data can be seen as a noun/verb relationship. Master data captures nouns, such as “1103 Harbor Street” or “Phil’s Auto Shop,” while transactional data captures verbs, such as “deliver,” “purchase,” or “sell.” Data warehouses often separate data along these lines.
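That split can be made concrete with a small sketch – the record layouts below are hypothetical, chosen only to illustrate the idea – in which master records describe the customer and the product, while a transaction record references them by ID and adds the verb and a date:

    from dataclasses import dataclass
    from datetime import date

    # Master data: the relatively stable "nouns" of the business.
    @dataclass
    class Customer:
        customer_id: str
        name: str
        address: str

    @dataclass
    class Product:
        product_id: str
        description: str

    # Transactional data: the "verbs", referencing master records by ID
    # rather than repeating their details.
    @dataclass
    class Transaction:
        action: str          # e.g., "purchase", "deliver", "sell"
        customer_id: str
        product_id: str
        occurred_on: date

    shop = Customer("C-1103", "Phil's Auto Shop", "1103 Harbor Street")
    part = Product("P-0042", "Oil filter")
    sale = Transaction("purchase", shop.customer_id, part.product_id, date(2024, 5, 1))

Because the transaction carries only identifiers, correcting the shop’s address in the master record never requires touching the transaction history.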

It is common for Data Management programs to maintain a master data file, and the discipline of managing it has come to be called Master Data Management (MDM). MDM provides a common definition for the entire organization, with the goal of eliminating competing or ambiguous data policies and giving the organization uniform, accurate data.

Master Data Management

Master Data Management came into use in the 1990s, in part as a response to the overwhelming amount of disjointed data flowing into organizations. The increasing use of data across different business lines also coincided with the enforcement of new regulatory measures, such as the Sarbanes-Oxley Act of 2002 (a federal law establishing financial and auditing regulations for public companies) and the Solvency II Directive (an EU directive that integrates and harmonizes EU insurance regulations). Prompted by the need for new organizational programs and by the new laws, organizations quickly adopted MDM technologies.

MDM is a way of providing an organization with a link to all of its essential data in one place – referred to as a master file or master data file – which serves as a common point of reference. When done properly, Master Data Management streamlines data sharing between personnel and departments.

A common example of a poorly organized Master Data Management system is a website that sells a customer a product and then pesters the customer with targeted advertisements for the same product, which they no longer need, having just purchased it. This happens because the customer information used by the website’s sales department is not integrated with the customer information used by the advertising department. The advertising department (or its subcontractor) is completely unaware the sale has already been made and, as a consequence, wastes everyone’s time. Record linkage, a process for associating records from different sources that refer to the same person or entity, would be useful in this situation.
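A very rough sketch of the idea – the field names and the exact-match rule here are simplifying assumptions; production systems typically use more robust, often probabilistic, matching – is to normalize an identifying field such as the email address, link the sales and advertising records that share it, and then suppress ads for anything the linked customer has already bought:

    def normalize(email):
        """Crude normalization so the same person matches across systems."""
        return email.strip().lower()

    # Records held separately by two departments (invented sample data).
    sales_records = [
        {"email": "Pat@Example.com", "purchased": {"lawn mower"}},
    ]
    advertising_records = [
        {"email": "pat@example.com ", "planned_ads": {"lawn mower", "garden hose"}},
    ]

    # Link records that refer to the same person via the normalized key ...
    purchases_by_email = {normalize(r["email"]): r["purchased"] for r in sales_records}

    # ... then drop planned ads for anything the linked customer already bought.
    for ad in advertising_records:
        already_bought = purchases_by_email.get(normalize(ad["email"]), set())
        ad["planned_ads"] -= already_bought

    print(advertising_records)  # the lawn mower ad is gone; the garden hose ad remains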

