Like many of you reading this post, I frequently find myself explaining to those around me what is meant by “data management” and why it is so important. While it’s easy for the layperson to understand data in the context of how it is entered into applications and how it is reported through various mechanisms, it is much more difficult to capture their imagination with the importance of data quality practices and things like a robust metadata framework built around a soundly architected data model.
I became acutely aware of this when my wife pointed out to me that the attention span of those who asked me what I did was typically “Gone in 60 Seconds”. That’s when I realized that I needed to find different ways of explaining things to people. I have been fortunate, on many occasions, to hear Dr. Peter Aiken, President of DAMA International, speak, and it was in one of his sessions that I first started seeing the value of comparative analogy in helping people understand what is meant by data management and why it is important.
Being a Discovery Channel buff, it didn’t take me long to find one of my own. I was watching a program about the reconstruction of the BC Place dome, with particular emphasis on the various disciplines and trades that had to be employed and coordinated to replace the 91,000 sq. ft. roof.
What struck me as most amazing was that they were able to design the world’s largest cable-supported retractable roof (capable of supporting an astonishing 7 million pounds of snow) on top of a 30-year-old stadium that, when it was built, was designed to support the world’s largest air-supported roof. The $500+ million project was a spectacular success and helped BC re-establish itself as having one of the greatest sports facilities in the world. This is by no means the only example of a successful but spectacularly complicated project that we are all aware of: fans of the Mars rover “Curiosity” are keenly aware of the incredible achievement of having it touch down safely, within 1.5 miles of its targeted landing site, after a 350,000,000-mile journey (that’s 99.9999996% accuracy).
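For the skeptics in the audience, that accuracy figure holds up to a quick back-of-envelope check using only the two numbers quoted above (a simple sketch, not an official JPL calculation):

```python
# Landing accuracy as a percentage of the total journey,
# using the figures quoted in the text.
journey_miles = 350_000_000  # Earth-to-Mars distance traveled
error_miles = 1.5            # distance from targeted landing site

accuracy_pct = (1 - error_miles / journey_miles) * 100
print(f"{accuracy_pct:.7f}%")  # prints 99.9999996%
```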
So why, I ask, are we still faced with IT project failure rates in excess of 50%, and poor data quality affecting anywhere from 10 to 50% of enterprise corporate data? The answer lies in the role played by the standards and disciplines that pervade these civil and mechanical engineering fields.
From the most basic construction project to the most complex feat of engineering ever completed, there is a universal set of disciplines and an ordered progression of engineering complexity that characterizes these projects. In all cases, it starts with a concept, usually a vision in one person’s mind that is quickly translated into a rough sketch or draft. From that visioning emerge ever more detailed architectural renderings, with specific ideas and features taking shape as the concept is fleshed out. Only after a reasonable degree of consensus has been reached on key concepts and design features does it move to the engineering phase (but always with the architect ever-present). Watching the BC Place project, you would see the architects working with the engineers to refine the concepts and design, while outside the surveyors were establishing their baseline assessments, working with the geotechnical engineers to ensure they were getting it right.

As things progress, they evolve from architecture, to engineering, and on to the actual design itself. This is where the real picture starts to emerge of what the finished product is going to look like from the user’s perspective. Often it will include a means of understanding how users interact with the concept, through 3D design, modeling, cardboard prototypes, wind tunnel tests and laboratory experimentation, before a single sod has been turned. It becomes clear that these projects succeed because they illustrate our capacity to do anything conceivable, given the right amount of planning and a standards-based approach to design.
When the actual build starts, to the untrained eye it’s absolute chaos, with hundreds and sometimes thousands of workers tackling hundreds of things at a time, like ants or bees in a colony. Rest assured, however, it is an organized chaos as these marvels of architecture rise from the ground. The next time you are around a large construction site, take a moment to observe the ratio of hands-off to hands-on workers. The numbers might surprise you. The other day I was watching the foundation work being laid for a new twin-tower, 25-storey condo. Sixty feet below grade, in explosive-blasted Canadian Shield (and less than one foot from the sidewalk’s edge), I peered deep into the base of this 21st-century project and noticed that there were as many white hats as there were yellow hats. In addition, there were no fewer than 30 people doing survey work, on a site with just over a hundred people. The yellow hats were the only ones actually lifting a hammer or moving equipment and supplies. The white hats? The surveyors? Quality control and coordination.

Hundreds of times a day, thousands of different facets are evaluated to ensure that the results of construction are being achieved within the highly engineered parameters of the relevant design and domain: concrete breaking points, steel quality and thickness, electrical relay capacities, structural fit parameters that demand sub-millimeter precision on 40-ton, 100-foot sections of beam across the full range of environmental conditions. And it all fits together seamlessly, having been relentlessly architected, engineered, designed, modeled and tested before the first bolt was turned. If only we could bring this same degree of rigor and discipline to our IT environments.
How many of you don’t recognize the analogy to data architecture, data modeling, data quality, metadata management, master data management, data governance, and so on?
This analogy is nothing new; you may recognize the parallels to John Zachman’s seminal 1987 article on enterprise architecture. John understood these things long before some of us were even born, and while most of us were still in school.
For many of us in data management, we can only dream of being able to architect, engineer and design systems with such a high degree of precision, quality and reliability. Not because we are not capable, but because we are but one small part of a very large professional and corporate ecosystem where software and hardware engineers have long reigned supreme. I have hope and faith, though, that things are evolving. DAMA International, the world’s leading data management association, continues to grow and expand throughout the world. Job titles such as Chief Data Officer, Chief Data Scientist and Chief Data Architect offer hope that the C-suite is starting to grasp how important this is. Finally, “data” as a buzzword is hitting the mainstream media through trends such as Big Data and government-driven initiatives like Open Data. Data management is becoming a curriculum-level program (versus a simple by-line on data modeling in a Software 101 course). As DM professionals we need to stay the course, promote the dialog and mentor the young professionals entering the profession. Most of all, we need to engage senior leadership in this dialog. Watch my blog for more parables, metaphors and analogies to help you move this conversation forward.
Discovery Channel’s Megaroof: Rebuilding BC Place
Zachman, J.A., “A Framework for Information Systems Architecture”, IBM Systems Journal, Vol. 26, No. 3, 1987