by Ian Rowlands
From time to time, I have to face an ugly truth. I have been working with data and metadata for a long time. It is nearly two hundred and eighty dog years! That longevity is not all bad. It does allow me to take a long view of trends and cycles. Sometimes I see things coming round again that really bother me. Today, with apologies to those who have ridden this merry-go-round with me before, I want to call out one of those things that I know will cause you pain if you let it bite you.
I have seen the development of monolithic mainframe systems, distributed database management systems and sophisticated decision support systems. Now I’m watching another “reinvention” of data processing with the emergence of Big Data technologies in cloud environments.
There’s been a standard lifecycle for each completed spin of the data processing carousel. Creative solution providers identified a new paradigm. Early adopters were excited and invested; some saw benefits; some saw none and became disenchanted. Mainstream organizations adopted the model and realized revenue gains and cost savings. Over time, costs increased and revenue gains stalled. The search for the next paradigm began.
What causes the decaying value of data processing approaches? One answer is simply the passage of time. To get into fancy academics for a moment, it could be an instance of social entropy: a systems theory suggesting that societies and social networks break down over time, moving from collaborative progress to chaotic conflict.
I like the idea of a fancy theoretical basis. People who know me will recognize the predisposition. I think the real issues are practical. In each cycle, I have seen a tendency to focus on function and ignore the challenge of management. In each cycle, the businesses gaining sustainable benefits have made an investment in long-term manageability.
Long-term manageability of data implies a commitment to knowing your data inventory, understanding the lifecycle of the assets in that inventory, and understanding the detailed characteristics of those assets. It means knowing how assets relate to each other, and the roles they play in supporting business operations, management and governance. You can’t have that understanding and knowledge without a disciplined approach to collecting, managing and sharing metadata.
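To make that concrete, here is a minimal sketch of what one entry in such an inventory might capture. All names and fields here are illustrative assumptions, not a prescribed schema; real metadata repositories are far richer, but even this shape lets you answer basic manageability questions.

```python
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    """One illustrative entry in a data inventory: what the asset is,
    where it sits in its lifecycle, and how it supports the business."""
    name: str
    lifecycle_stage: str                                     # e.g. "active", "archived", "retired"
    characteristics: dict[str, str] = field(default_factory=dict)
    related_assets: list[str] = field(default_factory=list)  # names of related assets
    business_roles: list[str] = field(default_factory=list)  # operations, management, governance

# A tiny inventory, keyed by asset name so metadata can be looked up and shared.
inventory: dict[str, DataAsset] = {}

def register(asset: DataAsset) -> None:
    inventory[asset.name] = asset

# Hypothetical example asset.
register(DataAsset(
    name="customer_orders",
    lifecycle_stage="active",
    characteristics={"format": "parquet", "owner": "sales-ops"},
    related_assets=["customer_master"],
    business_roles=["operations", "governance"],
))

# A basic manageability question: which assets support governance?
governed = [a.name for a in inventory.values() if "governance" in a.business_roles]
```

The point is not the code but the discipline it implies: each asset is registered once, its lifecycle and relationships are explicit, and questions about the inventory become queries rather than archaeology.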
If you have stayed with me this far, and you just groaned and said, “I can’t believe Ian is writing about metadata management again,” wonderful! That is exactly my point. We need to keep writing about metadata management – and much more than that, we need to be doing it! It doesn’t matter what the data processing paradigm is. Data management and metadata management are the basic requirements for substantial, sustainable value.
Back to Basics. No apologies.