Assessing Data Management Maturity Using the DAMA Data Management Body of Knowledge (DMBOK) Framework – Part 2

December 27, 2011

by Charles Roe

Part 1 of “Assessing Data Management Maturity” discussed the primary elements of the Data Management Maturity Assessment (DMMA) conducted by April Reeve, an advisory consultant at EMC Consulting, for a mortgage bank. The primary elements of this maturity assessment were:

  • The Capability Maturity Model (CMM) developed by Carnegie Mellon University as the inspiration
  • The DAMA Data Management Body of Knowledge (DMBOK) as the guide used to create the Key Practice Areas (KPAs) to be assessed, especially their best practice activities and technologies
  • The incorporation of ISACA best practice activities for the Data Security Management KPA and TOGAF Version 9.0 best practice activities for the Data Architecture Management KPA.
  • The addition of Data Integration Management (DIM) as a tenth KPA, since the DMBOK defines only nine and treats Data Integration not as a separate area but as part of the other nine.

The DMMA process begins with determining which business lines to assess. The mortgage bank decided to include its primary and reverse mortgage businesses, its retail banking business and its cross-business functions. A key part of the maturity assessment process is determining the people to be interviewed. This assessment was done with a very large number of interviewees, but usually a much smaller number of people is included. It is important that those interviewed include people from both the business and technology areas involved with each business line to be assessed, and both senior-level people and operational-level people involved with the day-to-day processing of the organization's data. Once these decisions were made, a Current State Analysis (CSA) was conducted. It included a series of interviews with around 100 employees in various departments of the bank, about 15 interviews with people judged to have the best perspective on current data management maturity in the organization, and then interviews of the interviewers by Ms. Reeve. The results of the interviews were collated and an assessment was created based on the five levels of the CMM framework:

  • Immature (Initial)
  • Repeatable (Repeatable)
  • Managed (Defined)
  • Monitored (Managed)
  • Continuous Improvement (Optimizing)

Part 2 of this article will discuss the results of the maturity assessment, how the bank interpreted it, the bank’s primary decisions for the Future State Analysis roadmap and how they planned to move forward with the given improvements.

The Assessment Results – Calculating and Presenting

Once the CSA was completed, all the results had to be collated, with comments included to add more meaning to the numbers generated within the CMM framework. In the case of the mortgage bank, everyone interviewed was weighted equally – but this may not be the case for other organizations. It is possible to apply different weightings to certain individuals or departments depending on which KPAs are the focus. All the activities, tools and lines of business were also weighted equally, so no mathematical skewing of the results occurred. The calculated values, rated from 1 to 5 per the CMM levels, are important, but the comments are equally important, as they point out specific gaps and opportunities within each of the KPAs. Graphic Three shows the final results of the Current State Analysis for the mortgage bank: the left column lists each area (KPA) and the top row shows the four main business lines assessed.
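The collation step described above can be sketched in a few lines. This is a hypothetical illustration, not the consultants' actual tooling; the function name, the tuple format and the per-response weight are all assumptions (the bank itself weighted every response equally).

```python
from collections import defaultdict

def collate_scores(responses):
    """Average weighted CMM ratings (1-5) per (KPA, business line) cell.

    responses: iterable of (kpa, business_line, rating, weight) tuples.
    The mortgage bank weighted every interviewee equally (weight 1.0),
    but other organizations may weight certain roles or departments
    more heavily for particular KPAs.
    """
    totals, weights = defaultdict(float), defaultdict(float)
    for kpa, line, rating, weight in responses:
        totals[(kpa, line)] += weight * rating
        weights[(kpa, line)] += weight
    return {cell: round(totals[cell] / weights[cell], 2) for cell in totals}

# Two equally weighted interviewees rating one hypothetical cell:
scores = collate_scores([
    ("Data Security Management", "Retail Banking", 2, 1.0),
    ("Data Security Management", "Retail Banking", 3, 1.0),
])
# scores[("Data Security Management", "Retail Banking")] == 2.5
```

Each cell of a grid like Graphic Three is then one entry in the resulting dictionary, with the interview comments attached separately.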

Graphic Three

The results are color coded (from least to most successful: red, orange, yellow and green) and given a numerical value (higher is better). The areas that received the highest scores for the mortgage bank were Data Security Management (DSM) and Document and Content Management (DCM). However, the predominantly green Data Security Management values are misleading: because data security is critical to a bank, the average of 2.58 was deemed unacceptable; the bank wanted a minimum Data Security score of 4, the Monitored level. The DCM score averaged 2.82, though the bank also wanted to improve its Document Management practices to at least a 3 in that category, with a 4 being the goal. The overall goal was to have all 3's, but with no single average reaching 3 anywhere in the maturity assessment, the mortgage bank had to focus on its most pressing needs.

The question then remains: “How does an organization improve its scores to acceptable or goal levels?” The third stage of the DMMA process is the Future State Analysis (FSA), which includes an assessment of the CSA, a statement of where the organization wants to go in each KPA and the priority action steps to reach those goals.

Understanding the Assessment Results

The numerical values of Graphic Three don't, by themselves, provide much actionable information. The mortgage bank had a CSA score of 2.58 in DSM but wanted a 4, leaving a gap of 1.42. But what did that mean? The mortgage bank had a number of areas in data security that needed improvement, including, but not limited to:

  • Improved user access management and tracking, and auditing of user activities were required.  This included private client information.
  • Additional data security controls were required for the use of production data within application development environments.
  • Data security management activities needed to be enhanced to CMM level 4 to meet the requirements of the bank’s executives.
  • The MS Access environment (primarily supporting Data Consumers) needed to be upgraded to a technology that would adequately support Data Security Management controls, or managed with compensating controls that would meet the bank's requirements. The bank was not adequately tracking who had access to what data, who extracted data from the MS Access databases and what they did with that data.
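The gap arithmetic above (target score minus current score) generalizes across all the KPAs. The sketch below is purely illustrative; the function name and the target dictionary are assumptions, with the two current scores taken from the article.

```python
def maturity_gaps(current, targets):
    """Return the (KPA, target-minus-current gap) pairs, largest gap first."""
    gaps = {kpa: round(targets[kpa] - current[kpa], 2) for kpa in targets}
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

# CSA averages from the article; targets as stated by the bank.
current = {"Data Security Management": 2.58,
           "Document and Content Management": 2.82}
targets = {"Data Security Management": 4.0,
           "Document and Content Management": 3.0}

gaps = maturity_gaps(current, targets)
# → [("Data Security Management", 1.42),
#    ("Document and Content Management", 0.18)]
```

The numeric gap alone still says nothing about *what* to fix; the interview comments behind each score are what turn a 1.42 into the concrete list of data security shortfalls above.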


The FSA included a list of benefits for each of the KPAs so that the stakeholders involved in the DMMA had a better understanding of “why” they should make such changes. The possible benefits of the bank following the recommendations in the Data Security Management KPA included:

  • Reduced risk of reputational, customer relationship and financial damage due to security breaches.
  • Enhancement of the organization's ability to comply with regulatory requirements for security and privacy.
  • Enhancement of the organization's ability to control, track and audit data access.


The completed FSA for the mortgage bank included a comprehensive assessment of all 10 KPAs and detailed the findings, opportunities/recommendations, risks and benefits for each of them. This information provided a roadmap the bank could follow into the future, with prioritized items for each KPA, though some large questions remained – notably concerning costs.

Using the Assessment Results to Plan Improvement

Any organization that undertakes a DMMA needs to be aware that once the results are finalized, a specific roadmap must be developed and followed, or the entire process will likely fail. The mortgage bank could not implement every change identified in the maturity assessment: the improvements in data security management alone would take considerable time and an investment in training, upgraded software, better hardware and a range of new processes throughout the organization. The goal of having every KPA at level 3 was unrealistic in the immediate future, especially when none of the current values even reached 3. Thus, the bank had to weigh the relative importance of each KPA and move forward on the most important ones.

A Data Integration Management score of 1.7, against a target of 3, meant developing an enterprise Data Integration Strategy and Architecture, selecting a standard tool set, then installing, training on and integrating that tool set throughout the organization, all while establishing a plan to streamline the integration environment over time through phased project execution. The bank decided that improvement in that particular KPA would have to wait – the roadmap centered primarily on DSM, DCM and a few other higher priorities. All organizations that undertake such an assessment must come to an agreement on their priorities.
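One simple way to express the trade-off the bank faced is a gap-times-importance ranking: a large gap in a lower-priority area can still rank below a smaller gap in a critical one. The importance values below are invented for illustration (the article does not publish how the bank weighted its priorities), and the DCM gap here is measured against the bank's stated goal of 4.

```python
def prioritize(kpas):
    """Rank KPAs by gap x business importance, highest priority first."""
    ranked = sorted(kpas, key=lambda k: k["gap"] * k["importance"], reverse=True)
    return [k["name"] for k in ranked]

kpas = [
    # gap = target score minus CSA score; importance is hypothetical (1-5)
    {"name": "Data Security Management", "gap": 1.42, "importance": 5},
    {"name": "Data Integration Management", "gap": 1.30, "importance": 2},
    {"name": "Document and Content Management", "gap": 1.18, "importance": 4},
]

order = prioritize(kpas)
# Data Security ranks first; Data Integration is deferred despite its
# sizable gap, matching the bank's actual roadmap decision.
```

Whatever weighting scheme an organization uses, the point is that the priorities come from an agreed-upon formula rather than from whichever KPA scored worst.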

Most organizations complete a three-year roadmap, but even with such a map stabilized and under implementation, a yearly reassessment is necessary to confirm that the roadmap is still being followed, gauge how well it is working and determine what changes need to occur. Creating a roadmap from the entire DMMA process and then not following it is not only a waste of time and resources, but also costly. Graphic Four shows the basic roadmap created for the mortgage bank in 2011 after the maturity assessment (a fully functional one is much more detailed). It illustrates the enabling activities the bank needed to move the roadmap forward, lists the immediate priorities and gives a color-coded graphical illustration of the areas being worked on, with a timeline for the completion of certain projects.

Graphic Four

The DAMA DMBOK was not written to serve as a capability maturity framework for assessing an organization's data management abilities, but since it lists best practices for the entire field of Data Management, it lends itself well to such an assessment. Any organization that desires an assessment is advised to work with a consulting firm with expertise in performing data management maturity assessments: that experience will likely make performing the assessment and documenting the results and next steps more cost effective than attempting it internally. The creation, implementation, assessment and collation of the results are something any organization – no matter its size – could use, but completing a successful DMMA requires an intimate understanding of all the steps throughout the process.


  • Douglas Laney

    Looks conspicuously like the Information Maturity Model I published at Gartner (then Meta Group) in 2004…tho purely technical and lacking key dimensions such as Roles and Culture. –Doug Laney, VP Research, Gartner

  • Rod MacPherson

    From my perspective, it looks exactly like the very document on which it is based – the DAMA Data Management Body of Knowledge™ (notwithstanding the switch-out between Data Operations and Data Integration) – something which I support.

    Good Job Charles – nice to see it being applied in a practical way.
