
Database Reflections: Ten Things to Consider

By Michael Blaha / January 9, 2017


I’m going to do something different this month and reflect on some observations of the IT industry. My comments will focus on database-related topics. This is a smattering of ideas that is not intended to be comprehensive. I’m hoping that this article will stimulate dialogue. I welcome comments on my opinions as well as your own insights.

  • IT vs. engineering. I started my career in engineering, working as a chemical engineer for seven years. Since then I’ve worked in the IT industry for thirty years. So I’ve seen both worlds. The engineering world does not tolerate the level of failed projects, cost overruns, and quality shortfalls that happen with IT. The situation is about the same now as it was thirty years ago.
  • Data Model as a specification. A Data Model and database design should be part of the specification for building an information system. Vendors should be required to conform to the model and the schema unless a deviation is explicitly approved. The current practice of letting the vendor prepare the model and schema is flawed. The customer should provide these as specs for the vendor.
  • Advanced database skills. Businesses have a great need for advanced database skills. I see it over and over again. But there is a lack of available talent. One reason for the shortfall is that management often does not reward technical excellence, likely because management has trouble assessing and measuring it.
  • Deep thought. IT places too much emphasis on doing something and not enough on thinking. Brainstorming on different architectures and attention to Data Modeling can improve project outcomes and save a lot of work. Too many projects settle on the first Data Architecture they encounter and fail to consider alternatives.
  • Analytics. Data analysis is really taking off. I believe this is real and not a fad. Computer systems are acquiring vast amounts of data. More and more organizations are trying to mine this data for deep insights.
  • UML Data Modeling. The UML seems to have reached maturity in the data world. It is no longer seen as a new technology or in vogue. Some data projects use the UML, but most use the notations in conventional Data Modeling tools. The UML seems to be more successful with programmers.
  • Database vs. programming. The IT community is still obsessed with programming. Universities emphasize programming. Publications emphasize programming. But the big business payoff lies with data. Most organizations buy software but their data is specific to their business. Many programmers are still “afraid” of databases.
  • Two Data Architects. For complex projects it’s a good idea to have two architects in the lead. Two minds are better than one. I’ve encountered a few projects like this and two architects really increase the odds of success.
  • The consulting body game. Many organizations pay vendors on the basis of time and materials rather than for results. This gives consulting companies an incentive to sell bodies and hours. Consulting companies are disincentivized from finding quick solutions to problems and cutting costs.
  • Consulting credibility. I often see situations where I could help a customer. The business benefit would greatly exceed what it would cost to engage me. However, often nothing happens. I know I can help them, but they don’t know that, and it’s hard to bridge the chasm. Many organizations pay too much attention to cost per hour and not enough attention to cost per output.
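The "Data Model as a specification" point above can be made concrete: if the customer supplies the schema, vendor conformance becomes something you can check mechanically. Below is a minimal sketch, assuming SQLite and a hypothetical `customer` table (names and columns are purely illustrative, not from any real project), of comparing a vendor-delivered database against a customer-provided DDL spec:

```python
# Sketch: treating a customer-supplied schema as a machine-checkable spec.
# The table and columns here are hypothetical, for illustration only.
import sqlite3

# The customer-provided DDL -- the spec the vendor must conform to.
SPEC_DDL = """
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    email       TEXT UNIQUE
);
"""

def columns_of(conn, table):
    """Return (name, declared_type, notnull) for each column of a table."""
    return [(row[1], row[2], row[3])
            for row in conn.execute(f"PRAGMA table_info({table})")]

# Build a reference database directly from the spec ...
spec_db = sqlite3.connect(":memory:")
spec_db.executescript(SPEC_DDL)

# ... and compare a vendor-delivered database against it.
vendor_db = sqlite3.connect(":memory:")
vendor_db.executescript(SPEC_DDL)  # a conforming delivery, for this demo

assert columns_of(vendor_db, "customer") == columns_of(spec_db, "customer"), \
    "vendor schema deviates from the approved data model"
print("vendor schema conforms to spec")
```

A real conformance check would also compare indexes, foreign keys, and constraints, and would run against the vendor's actual delivery rather than the spec itself; the point is only that a customer-owned schema turns "conform to the model" from a request into a test.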

In Conclusion

I have presented a number of opinions. Some of these items are difficult to solve and act on. Nevertheless, savvy managers should be aware of these issues and consider them when making decisions.

About the author

Michael Blaha is a consultant and trainer who specializes in conceiving, architecting, modeling, designing, and tuning databases. He has worked with dozens of organizations around the world. Blaha has authored seven U.S. patents, seven books, many articles, and two video courses. His most recent publication is the Agile Data Warehouse Design video course from Infinite Skills. He received his doctorate from Washington University in St. Louis and is an alumnus of GE Global Research in Schenectady, New York. You can find more information on his LinkedIn profile or at superdataguy.com.

  • Beatriz Stratton

    Thank you for this blog post, Michael. I’d like to share my thoughts on some of the points you’ve raised:

    Re: IT vs. engineering

    I feel that part of the reason the IT project failure rate is so high is the constant change that affects every level of systems development. Architectures change, technologies change, programming languages change. We as IT professionals are often developing systems on “new ground”. It is harder to define success when we have no previous history to draw from. Is it “failure” if we have never built the system this way before, or is it “success” (we implemented the system, learned new technologies and a new architecture, and now know better how to work within a new technology stack, and so on)? It depends on how we define failure: if it is cost and time overruns only, then yes, most projects fail. But if we take a holistic view, perhaps we can see some success in the midst of it as well.

    I feel that IT’s tolerance for project failures may be related to this constant change. If we worked with largely the same variables in every project, we could more easily identify which variables can be tweaked to achieve better results. We would have a baseline to compare against, and perhaps it would not be as tolerable to repeatedly make the same errors in judgement.

    Re: databases vs. programming

    I believe this is an area where we may see some change in the not-so-distant future. IT started by creating systems so that people could store data on disks rather than on paper. Those applications were program-heavy while the data sat in electronic files. Systems can now store data in much more complex and efficient ways: relationships between data entities can exist, we can access data that is not contained within our systems, and businesses are demanding more intelligence from their data. Also, for some industries, current data entry methods (which may include a person entering data) will be replaced by machines (robots, drones, sensors, etc.). This may diminish the need for programming, and the sheer volume and variety of data being collected may stimulate more IT professionals to pursue a career in data management.

    • Michael Blaha

      Thanks Beatriz. Sometimes my head spins from the rapid pace of changes in IT. From my engineering experience, life is more staid and changes happen more slowly. So this could certainly be one of the reasons for the high IT failure rate.
