Data Quality: It’s A Supply Chain Issue

July 23, 2015

by Jennifer Zaino

Ask Mary Wilson, Vice President of Advisory Services at GS1 US, what Data Quality is, and here’s what she’ll tell you:

“Data Quality is the outcome of having sound Data Governance processes, and institutional education/ knowledge and training to support good Master Data Management (MDM),” so that there is a single point of reference when it comes to the critical data used within organizations and in conjunction with their partners.

GS1 US is a member of GS1, a standards organization that brings industry communities together to solve supply chain problems through the adoption and implementation of GS1 Standards. It is dedicated to driving the global language of business by creating a common foundation for uniquely identifying, accurately capturing, and automatically sharing vital information about products, so that anyone in the supply chain who wants that information can understand it, no matter who or where they are.

Too often, that isn’t happening: According to recent Data Quality pilots that GS1 US conducted with a couple of dozen leading companies, data accuracy is a growing problem – about 50% of the data surveyed was inaccurate. That hurts visibility across the supply chain and the efficiency of information sharing and collaboration among companies. Ultimately, it affects consumer convenience, value, and satisfaction as well.

Work is underway to get a better handle on the situation. GS1 US’s mission now includes a National Data Quality program, which has brought industry stakeholders together to respond to increasing consumer and trading partner demands for complete, accurate, and timely product information through the development of the GS1 US National Data Quality Framework. In late spring, it published two guidance documents to support organizations evaluating their Data Quality programs: the Data Governance Assessment Guide and the GS1 US Data Governance Best Practice Guidance, which discusses building a good Data Governance program.

The Framework initiative itself began about two years ago and it’s had input from companies in the general merchandise, hardlines, healthcare, consumer packaged goods, grocery, fresh foods, retail, and foodservice sectors. Some 300 people and almost 200 unique companies participate. While the goal was to make sure the program could operate universally across all sectors, retailers and suppliers from the grocery industry played a major role, Wilson says, in large part because they have been very active in measuring the accuracy of data along their supply chains over the last decade and a half. “They’ve seen some improvement over the last 12 to 15 years,” she says, “but there’s still opportunity for improvement.”

A Look at the Framework

The framework offers detailed information about the assessment criteria and scoring for what GS1 US calls the three pillars of Data Quality: Data Governance process, education and training protocol, and attribute audit. Organizations that want to implement the framework can start with any pillar, with the ultimate – but voluntary – goal being to attain GS1 US certification that they have the proper processes and procedures in place to sustain quality data over time, Wilson says.

The GS1 US National Data Quality Framework document describes the overall program and its requirements. It notes, for example, that the Data Governance pillar includes an assessment designed to determine the degree to which people, processes, and procedures are in place within an organization to validate that quality data is maintained and shared across all necessary business entities. The Data Governance Assessment Guide can serve as a self-assessment tool to help a company determine whether it is ready to begin the certification process. “When you utilize it as a self-assessment tool you can identify areas of strength as well as areas of opportunity for improvement,” she says.

The Education and Training Protocol comprises a series of assessments designed to verify comprehension and proper application of the GS1 System of Standards. That includes, Wilson explains, demonstrating an understanding of the GS1 Package Measurement Rules around accurate dimensional data; the GDSN (Global Data Synchronization Network) rules around globally sharing trusted product data; and the GTIN (Global Trade Item Number) Allocation Rules, which provide guidance on when a new GTIN needs to be assigned. “When a change occurs to a product, for example, you have to assign a new GTIN to distinguish it from previous versions,” she says.
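Whichever allocation rule triggers a new GTIN, the number itself ends in a GS1 mod-10 check digit that downstream systems can verify before accepting a record. A minimal sketch in Python (the function names are my own, not GS1’s):

```python
def gtin_check_digit(body: str) -> int:
    """GS1 mod-10 check digit for a GTIN with its final digit removed:
    weight the digits 3, 1, 3, 1, ... starting from the rightmost one."""
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(body)))
    return (10 - total % 10) % 10

def is_valid_gtin(gtin: str) -> bool:
    """Check digits, length, and check digit for a GTIN-8/12/13/14."""
    return (gtin.isdigit()
            and len(gtin) in (8, 12, 13, 14)
            and int(gtin[-1]) == gtin_check_digit(gtin[:-1]))

print(is_valid_gtin("4006381333931"))  # True: a well-formed GTIN-13
```

A supplier’s MDM pipeline could run a check like this on every record before publishing, catching transposed or mistyped digits before they ever reach a trading partner.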

The Attribute Audit encompasses an assessment of select key product attributes to validate that the attribute information being shared with trading partners matches the physical product. In the pilot GS1 US conducted last year with 29 companies – 24 suppliers and five demand-side partners, using key performance indicators that applied universally across sectors – data accuracy continued to be an issue with foundational attributes such as weights and dimensions.
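At its core, an audit of this kind compares supplier-declared values against physically measured ones within a tolerance. A hypothetical sketch (the 5% relative tolerance and attribute names are illustrative assumptions, not GS1’s published measurement tolerances):

```python
def audit_attributes(declared: dict, measured: dict, rel_tol: float = 0.05) -> dict:
    """Compare supplier-declared attribute values against physically
    measured ones; return the attributes that fall outside tolerance."""
    failures = {}
    for attr, decl in declared.items():
        meas = measured.get(attr)
        # A missing measurement and an out-of-tolerance value both fail the audit.
        if meas is None or abs(meas - decl) > rel_tol * abs(decl):
            failures[attr] = {"declared": decl, "measured": meas}
    return failures

declared = {"gross_weight_kg": 9.2, "depth_cm": 30.0, "width_cm": 20.0}
measured = {"gross_weight_kg": 10.1, "depth_cm": 30.2, "width_cm": 20.1}
print(audit_attributes(declared, measured))  # flags only gross_weight_kg
```

The dimensions pass because they sit well inside the tolerance band; the weight is off by nearly 10% and would be kicked back to the supplier for correction.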

The most basic problem that data inaccuracy within and across supply chain partners creates comes to the forefront in supply chain logistics, she says. For example, when weight and dimensional attributes are communicated incorrectly down the supply chain, it becomes a challenge to maximize truck utilization: trucks register as underweight (not full to capacity, and therefore not operating at highest efficiency) or overweight (subject to load-limit violations), Wilson says. And with so much automation in today’s warehouses, entire loading operations can be shut down if an item’s actual physical weight or size doesn’t match the data attributes ascribed to it. Supermarkets also take a hit when inaccurate dimensional data is supplied, leaving them with either too much room on the shelf or the inability to fit all products in the allocated space.
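The truck-loading problem can be made concrete with a little arithmetic: a planner who schedules cases against an understated declared weight will put the vehicle over its legal payload. All figures below are illustrative, not drawn from GS1:

```python
MAX_PAYLOAD_KG = 20000.0  # illustrative legal payload for one trailer

def cases_to_load(case_weight_kg: float, max_payload_kg: float = MAX_PAYLOAD_KG) -> int:
    """Number of cases a planner schedules, trusting the declared case weight."""
    return int(max_payload_kg // case_weight_kg)

declared_kg = 9.2   # the weight shared down the supply chain
actual_kg = 10.1    # the weight of the physical case

planned = cases_to_load(declared_kg)   # 2173 cases, per the shared data
actual_load = planned * actual_kg      # what actually goes over the scale
print(planned, round(actual_load, 1))  # the plan overshoots the payload limit
```

A 0.9 kg error per case looks trivial on a spec sheet, but multiplied across a full trailer it puts the load almost two tonnes over the limit, exactly the kind of compounding cost Wilson describes.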

“Quality data was once considered the cost of doing business, but with the advent of e-commerce, quality data has become a strategic point of differentiation among trading partners,” she adds. Consider online grocery buying, where a consumer places an order for what is described on the supermarket’s web site as a box of one brand’s pasta of a certain size and weight. When it arrives, though, the measurements don’t match the description – a real problem for the recipe he planned to make that night.

“If consumers don’t get what they expect they will likely look elsewhere next time. Consumers want the right thing at the right time, and if that expectation isn’t met they are going to blame the supermarket for loading the information wrong, not the supplier,” Wilson says.

The Framework in Action

The GS1 US National Data Quality Framework launched back in January, and many suppliers have varying levels of engagement with the three pillars.

A review of current participants’ Data Quality scorecards, drawn from the framework in aggregate, showed room for improvement, even among those who thought their Data Quality was quite high. “What they are finding is that in order to achieve sustainable, accurate data, the issue is you need to change business processes,” Wilson notes. That takes time and commitment. For instance, one root cause of Data Quality issues between suppliers and retailers in the grocery sector is that suppliers are often asked to share information far in advance of a new product going to market – information that isn’t yet available in any accurate form, but rather is pieced together from concepts, spec data, and the like.

That’s fine for starters, but as a product evolves, business processes often don’t account for updating the information that was originally shared. For example, a store brand’s soda may be reformulated to reduce sugar – a modification that may change the weight of the product – but no one ever goes back to update the data being shared. The framework’s guidelines help address this, including instituting a process that calls for recapturing data from sustainable production environments.

Or think of a bottle of water whose cap is made smaller, which will affect the overall weight of a case of bottled water. “It’s a great sustainability improvement, but in this example the manufacturer probably never considers the difference that made to case weights,” she notes. The Data Governance portion of the framework requires an audit process to ensure that product attributes remain accurate throughout a product’s lifecycle.

Another supplier has discovered that while its data is largely correct, it continues to populate database information in the wrong cells because that arrangement seemed to make better sense internally. It is now working with GS1 US to address the issue, including rethinking the business processes that were built to support that way of working.

Wilson expects to see the program’s early adopters achieve certification by the end of September. That’s good news for the demand-side partners that no longer want to be in the business of managing their own supplier scorecards. “They want the data to be accurate upon receipt,” she says. “And they want to be confident in the quality of the data throughout the lifecycle of the product.”
