By Bill O’Kane.
Organizations across industries continue to struggle with a broad-based inability to commoditize siloed business processes in favor of a digital business transformation, one that requires a common understanding of the organization’s master data. Studies have found that many businesses have recently become far more cognizant of this, or are at least admitting it for the first time, in no small part due to current economic restrictions and their long-term effects. Analytics capabilities aren’t the only thing affected: in the age of digital transformation, trusted data is both the foundation for automating current operational processes and critical to creating new and innovative ways of doing business.
After covering the Data Governance (DG) and Master Data Management (MDM) markets for a number of years, and helping clients build business cases for Data Management, I took a step back to reflect on why the state of data in most organizations is as dismal as it is, and why it is so difficult to demonstrate the value of trusted data across an enterprise’s mission-critical operations and analytics.
The succinct version of this story starts with the fact that most of us in any business environment have historically been compensated to optimize business processes in silos, whether we realized it at the time or not. Those of us fortunate enough to become involved with information technology during the last half of the 20th century were allowed, and actually encouraged, to automate those silos. Aggregated across processes, this produced tremendous productivity gains, but very little attention was paid to the technical debt created as each new business application system took its own set of data with it.
ERP and other application suites made significant strides toward at least co-locating logically similar data within their databases, but few if any capabilities were built in to enforce broad-based Data Quality and consistent semantics across the supported business processes. The result was a different set of “logical” silos within a single physical data store. Concurrently, specialized applications such as CRM arose, again increasing productivity in isolation, but again complicating the problem of trusted and, therefore, reusable data.
The technical debt of poor Data Quality was first widely exposed in the 1990s by the advent of data warehouses and data marts, and by the first attempts to consolidate and reconcile source data from these various silos for use in even basic reporting and analytics. Nascent “data analysts” discovered that the data in these systems did not conform to each system’s ostensible rules. Worse, the meanings of seemingly similar data attributes across systems bore little resemblance to one another. Because mainstream IT (i.e., those responsible for production operational systems and the like) did not yet recognize the value of these analytics efforts, the analysts’ complaints went largely ignored for the next decade or so. Some gave up; others simply created new silos in the form of analytics platforms that at least offered data availability, if not quality, for analysis.
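To make that semantic mismatch concrete, here is a deliberately simplified, hypothetical sketch; the systems, field names, and status codes are invented for illustration, not drawn from any real product:

```python
# Hypothetical example: two siloed systems each carry a one-letter
# "status" code for a customer, but their code tables disagree.
billing_codes = {"A": "active", "C": "cancelled"}   # billing system's meanings
support_codes = {"A": "archived", "C": "current"}   # support system's meanings

record = {"billing_status": "A", "support_status": "A"}

decoded = {
    "billing": billing_codes[record["billing_status"]],
    "support": support_codes[record["support_status"]],
}
# The identical raw value "A" decodes to "active" in one silo and
# "archived" in the other, so naive consolidation is misleading.
```

This is exactly the kind of trap early data-warehouse analysts fell into: the columns looked alike, so they were joined and aggregated as if they meant the same thing.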
As enterprises now pursue digital transformation, they have increasingly discovered that the status quo of largely untrusted data can no longer be tolerated if they are to implement new capabilities and automate their business processes and analytics. The technical debt of poor-quality, mission-critical data must now be paid. The most egregious scenarios are often found in the companies that were the earliest and most fervent adopters of the siloed automation described above. Many of these firms created so many silos that the fixed cost of building still more of them, in support of new products and services, absorbed any additional revenue those offerings might have generated.
As often happens with macro business trends and the broad availability of technology, some version of this effect will quite likely trickle down to organizations of more modest size as well. The good news is that technology empowering any company with the organizational discipline to resolve these issues has been available, and improving, for several years now, in the form of MDM and Data Governance solutions.
Multidomain MDM platforms provide the data model flexibility for the business to develop a common data model that accurately reflects both the current and desired future states of the business. These models also provide a map to the silos most critical to new and improved business outcomes along the digital transformation journey. With the formation of a virtual Data Governance organization, the business assumes responsibility for the state and use of the organization’s data, while IT enables the managed deployment of the newly trusted data provided by the MDM platform. These are the first steps any organization must take to begin resolving its technical debt and fully enable a digital business transformation, not to mention, once and for all, kicking its dirty data to the curb.
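The core mechanic an MDM platform applies to those silos is match-and-merge: recognize that two source records describe the same real-world entity, then apply survivorship rules to produce a single trusted “golden record.” The sketch below is purely illustrative, assuming made-up record layouts and a crude name-based match rule; it is not any vendor’s API:

```python
# Illustrative sketch of MDM match-and-merge with simple survivorship rules.
# Record fields, matching logic, and source systems are all hypothetical.

def normalize(name: str) -> str:
    """Crude normalization so trivially different spellings can match."""
    return " ".join(name.lower().replace(".", "").split())

def is_match(a: dict, b: dict) -> bool:
    """Treat two records as the same customer if normalized names agree."""
    return normalize(a["name"]) == normalize(b["name"])

def merge(records: list) -> dict:
    """Survivorship rule: keep the most recently updated non-empty value."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for key, value in rec.items():
            if value:  # later (newer) non-empty values overwrite older ones
                golden[key] = value
    return golden

crm = {"name": "ACME Corp.", "phone": "", "updated": "2021-03-01"}
erp = {"name": "acme corp", "phone": "+1-555-0100", "updated": "2020-11-15"}

golden = merge([crm, erp]) if is_match(crm, erp) else None
# golden keeps the CRM's newer name and the ERP's phone number.
```

Real platforms use far richer matching (probabilistic scoring, multiple attributes) and configurable survivorship, but the principle is the same: the golden record, not any one silo, becomes the shared version of the truth.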