Operationalizing Data Governance

by Paul Williams

Data Quality and Data Governance continue to grow in importance for data professionals, especially given the emphasis on organizational compliance in the wake of the recent financial crisis and the nearly decade-old Sarbanes-Oxley Act. Remember, officers of an enterprise can be held legally accountable for non-compliance.

Anyone establishing a Data Governance initiative at an organization needs to treat the entire task as a program optimized for permanence and growing maturity, not simply a one-off project. The UK’s Financial Services Authority clearly states the importance of creating this kind of practice, and the same principles apply no matter the country:

“A firm should have clear and documented standards and policies about the use of data in practice (including information technology standards) which should in particular cover the firm’s approach to the following:

(a) data access and security;

(b) data integrity, including the accuracy, completeness, appropriateness and testing of data; and

(c) data availability.”

Any approach to operationalizing an organization’s overall Data Management role must address the integration of business and IT Metadata, Data Quality methodologies, and, finally, the implementation of Data Governance along with stewardship.

Pete Rivett, CTO of Adaptive, a maker of enterprise information management systems, sums up the need for an organizational approach to Data Governance:

“Governance of enterprise data is no longer optional or a ‘nice to have’ – it’s being demanded by regulators and it needs to be part of everyday business operations. Without a complete understanding of how enterprise information is created, defined, managed and consumed, an organization’s ability to meet external compliance reporting demands and optimize overall business performance is severely undermined.”

Making Data Quality a Core Component of Business

It is important to realize that any enterprise-wide attempt at Data Governance needs to be treated as though the practice will become a core component of that organization’s business. Doing so allows a firm to methodically discover and handle data quality issues, and ultimately provides a common view of company data from the business, technical, and management perspectives.

This kind of collaboration is essential when developing a corporate culture where Data Quality is crucial at every step of the information management lifecycle. It allows management to set the overall policies, while their business and technical teams view integrated data models described by metadata optimized to support their own processes and systems.

This article was inspired by Adaptive’s Operationalizing Data Governance presentation at the Enterprise Data World 2012 conference, which makes the same point Rivett does above: without a complete understanding of how enterprise information is created, defined, managed, and consumed, an organization cannot reliably meet external compliance reporting demands or optimize overall business performance.

Metadata is Key to the Operationalizing Process

The lack of a common business view of the information contained in their systems remains a problem for many enterprises that, for whatever reason, have never implemented a Data Governance practice. Robust Metadata serves as the glue between the business and technical views of organizational data, allowing both sides of the shop to share a common understanding of their separate “world” views.

Developing a business repository to map IT Metadata to business Metadata is a key part of this process. The Metadata can include business rules as well as Data Quality metrics. This repository also serves to properly manage both the business and technical Metadata moving forward.
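As a rough, hypothetical sketch (not a description of any particular product), a repository entry along these lines might link a physical column in an IT system to a business term, together with its rules and quality metrics. All names below are invented for illustration:

```python
from dataclasses import dataclass, field


@dataclass
class BusinessTerm:
    """A business-side definition, owned by a data steward."""
    name: str
    definition: str
    steward: str


@dataclass
class MetadataMapping:
    """Links a physical (IT) data element to its business meaning."""
    system: str                 # source application or database
    table: str
    column: str
    business_term: BusinessTerm
    business_rules: list[str] = field(default_factory=list)
    quality_metrics: dict[str, float] = field(default_factory=dict)


# Hypothetical example: mapping a warehouse column to a glossary term
counterparty = BusinessTerm(
    name="Counterparty",
    definition="The legal entity on the other side of a financial transaction.",
    steward="trade-data-stewards@example.com",
)

mapping = MetadataMapping(
    system="trade_warehouse",
    table="trades",
    column="cpty_id",
    business_term=counterparty,
    business_rules=["Must reference an active legal entity identifier (LEI)."],
    quality_metrics={"completeness": 0.998},
)
```

Even a toy structure like this shows the point: the business definition and the physical location of the data live in one place, so either audience can start from the view it understands and navigate to the other.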

Pete Rivett of Adaptive stresses the importance of a Metadata driven repository:

“Data definitions and rules for data quality and profiling need to be business driven and IT-enabled. A business glossary is the unifying hub for business models (processes, rules, reporting) and technology models (data stores and warehouses, application models, data movement) – so that IT people can understand the meaning, and business people can understand the processing. A role-based, workflow-enabled, enterprise repository is needed to pull it all together.”

Leveraging Industry Standards in Data Governance

Given the potential for a Tower of Babel-inspired collection of definitions around similar business terms, industry-standard ontologies have gradually been developed, allowing organizations to speak the same language when describing data. These standards support business glossaries, relationships, and rules for both processes and data.

Some of these industry standards include:

  • FIBO – the Financial Industry Business Ontology
  • XBRL – the XML-based eXtensible Business Reporting Language
  • ISO 20022 – also known as UNIFI – for financial services messaging
  • FpML – the Financial products Markup Language
  • ACORD – a set of business models used in the insurance industry
  • MISMO – the Mortgage Industry Standards Maintenance Organization

The fact that so many of these standard ontologies and models serve the financial and insurance industries again speaks to the regulatory importance of Data Quality and Data Governance. Companies new to this process are able to leverage standards honed by experience to help quickly implement a governance program.

Rules Drive Data Quality Operationalization

Success when implementing Data Quality depends on clearly stated rules, whether they are defined at a high level within business processes or as more detailed rules embedded in application code or database constraints. These rules get implemented at various points throughout the information lifecycle.
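To make that concrete, here is a minimal, hypothetical sketch of one such rule expressed both declaratively, as it might appear in a database constraint, and procedurally, as an application-level check. The table and column names are invented:

```python
from datetime import date

# Declarative form of the rule, e.g. applied to the database schema.
SETTLEMENT_DATE_CONSTRAINT = """
ALTER TABLE trades
  ADD CONSTRAINT chk_settlement_after_trade
  CHECK (settlement_date >= trade_date);
"""


def check_settlement_after_trade(trade_date: date, settlement_date: date) -> bool:
    """Application-level version of the same Data Quality rule."""
    return settlement_date >= trade_date


# A trade settled two days after the trade date passes the rule.
assert check_settlement_after_trade(date(2012, 5, 1), date(2012, 5, 3))
```

The point is not the specific rule but that the same clearly stated business rule can be enforced wherever the data lives in the lifecycle, and traced back to a single definition.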

Scorecards and other reporting techniques are used throughout the lifecycle to help manage overall Data Quality. Profiling scorecards help to describe the consistency and timeliness of incoming data, while other scorecards track rules tied to specific business domains. Adjustment scorecards track both manual and automated adjustments made to the data after the fact.
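As an illustration only, a profiling scorecard can be as simple as a handful of metrics computed over an incoming batch. The sketch below, with invented field names, scores completeness and timeliness:

```python
from datetime import datetime, timedelta


def profile_batch(records: list[dict], required_fields: list[str],
                  received_at: datetime, max_age: timedelta) -> dict:
    """Compute simple completeness and timeliness scores for an incoming batch."""
    total = len(records) or 1
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    timely = sum(1 for r in records if received_at - r["as_of"] <= max_age)
    return {
        "completeness": complete / total,
        "timeliness": timely / total,
    }


# Hypothetical batch of two records
batch = [
    {"cpty_id": "LEI123", "notional": 1_000_000, "as_of": datetime(2012, 5, 1, 9)},
    {"cpty_id": "", "notional": 500_000, "as_of": datetime(2012, 4, 28, 9)},
]
scores = profile_batch(batch, ["cpty_id", "notional"],
                       received_at=datetime(2012, 5, 1, 12),
                       max_age=timedelta(days=1))
print(scores)  # {'completeness': 0.5, 'timeliness': 0.5}
```

Real scorecards aggregate metrics like these by business domain and over time, which is what makes them useful for tracking adjustments and spotting trends.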

Data Stewardship and Data Governance

As a Data Governance practice is fully implemented, stewardship becomes important in helping it achieve maturity and continuous improvement. Echoing a theme repeated throughout this article, active and operationally focused engagement from both business and IT remains vital.

A clearly defined workflow for the stewardship process needs to be in place, along with the rules and definitions stating what is to be governed. The previously mentioned enterprise Metadata repository helps greatly in linking technical and business definitions, ensuring everyone speaks a similar language.

Change management and the deployment lifecycle are also part of what gets stewarded. Robust communication and collaboration, along with procedures that are sharply defined and documented in the repository, allow this to happen smoothly.

The Enterprise Repository Ties it All Together

The enterprise repository is the piece of the operational Data Governance puzzle that ties everything together. The repository includes an ontology containing a glossary of business terms. Additionally, it provides information on each term from both the business and technical perspectives.

A quality repository also defines the ownership of each term, its versioning, and any Data Quality metrics that describe the data behind it. Security information, meaning who can see or modify the data, also gets defined, as does the set of valid domain values for the term.
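Below is a minimal sketch, purely illustrative and not tied to any specific repository product, of what one glossary entry with these attributes might look like:

```python
from dataclasses import dataclass, field


@dataclass
class RepositoryTerm:
    """One business-glossary entry: definition, ownership, versioning,
    quality metrics, security, and valid domain values."""
    name: str
    business_definition: str
    technical_definition: str          # the physical data it maps to
    owner: str
    version: str
    quality_metrics: dict[str, float] = field(default_factory=dict)
    read_roles: set[str] = field(default_factory=set)
    write_roles: set[str] = field(default_factory=set)
    valid_values: set[str] = field(default_factory=set)


# Hypothetical entry
trade_status = RepositoryTerm(
    name="Trade Status",
    business_definition="The lifecycle state of a trade.",
    technical_definition="trade_warehouse.trades.status_cd",
    owner="Head of Trade Operations",
    version="2.1",
    quality_metrics={"validity": 0.997},
    read_roles={"risk", "operations", "audit"},
    write_roles={"operations"},
    valid_values={"NEW", "CONFIRMED", "SETTLED", "CANCELLED"},
)
```
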

Ultimately, the enterprise repository is a vital tool for an organization’s risk officer, or for any corporate officer with exposure to regulatory compliance. The repository provides insight into the information received on a report. An officer can verify what the report data means, where it came from, and its underlying Data Quality metrics, all by perusing the enterprise repository.

Business Focused, IT Enabled

To summarize, Data Governance is no longer optional. It needs to be a core component of any enterprise’s data management tool shed — no matter the industry. The technical and business sides of the shop need to share an integrated involvement with the overall governance process.

Robust business semantics and thoroughly defined Data Quality rules remain vital for Data Governance. An enterprise repository serves as the glue for the business triumvirate — executive, business, and technical — providing enhanced reporting along with insight. Focused stewardship helps the practice of data governance to continually improve towards maturity.

Finally, Adaptive stresses that organizations implementing data governance live and learn the following mantra: “Business Focused, IT Enabled.”
