1. Higher quality. Just as architects consider blueprints before constructing a building, you should consider data before building an app. On average, about 70 percent of software development efforts fail, and a major source of failure is premature coding. A data model helps define the problem, enabling you to consider different approaches and choose the best one.
When I worked at GE Global Research, my colleague Bill Premerlani wrote a compiler that generated SQL (Structured Query Language) code from a picture of a model. The software was complex: it had to recognize the grammar of the input file, determine graphical connectivity, translate graphic figures to model constructs, and emit application code. Each phase centered on its own data model. Our application was unusual, and there was no commercial alternative available at that time.
Premerlani wrote the software, from start to finish, in six weeks flat. The software had few bugs, was extensible and performed well. Admittedly, Premerlani is a super programmer, but modeling facilitated his excellence.
2. Reduced cost. You can build applications at lower cost via data models. Data modeling typically consumes less than 10 percent of a project budget, and can reduce the 70 percent of budget that is typically devoted to programming. Data modeling catches errors and oversights early, when they are easy to fix. This is better than fixing errors once the software has been written or, worse yet, is in customers' hands.
Avelo (now part of Iress) is a leading financial software vendor in the United Kingdom. Avelo routinely uses data models as the nucleus for building applications. The company does this because it can build its applications faster and with fewer errors. The models promote clarity of thought and provide the basis for generating much of the needed database and programming code.
3. Quicker time to market. You can also build software faster by catching errors early. In addition, a data model can automate some tasks – design tools can take a model as an input and generate the initial database structure, as well as some data access code.
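The generation step described above can be sketched in a few lines. This is a hypothetical illustration, not any particular design tool's format or API; the model dictionary and the ddl_for() helper are invented for the example.

```python
# A tiny in-memory "data model": table name -> {column name: SQL type}.
# Both tables and the helper below are illustrative assumptions.
model = {
    "customer": {
        "customer_id": "INTEGER PRIMARY KEY",
        "name": "TEXT NOT NULL",
    },
    "invoice": {
        "invoice_id": "INTEGER PRIMARY KEY",
        "customer_id": "INTEGER NOT NULL REFERENCES customer(customer_id)",
        "total": "REAL",
    },
}

def ddl_for(model):
    """Emit one CREATE TABLE statement per table in the model."""
    statements = []
    for table, columns in model.items():
        cols = ",\n  ".join(f"{name} {sqltype}"
                            for name, sqltype in columns.items())
        statements.append(f"CREATE TABLE {table} (\n  {cols}\n);")
    return statements

for stmt in ddl_for(model):
    print(stmt)
```

Real modeling tools do considerably more (indexes, naming conventions, dialect differences), but the principle is the same: the model is the single source from which the initial schema is derived.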
UW Health, the academic medical center and health system for the University of Wisconsin, has been developing a comprehensive medical data warehouse. A recent focus has been calculating readmission statistics to gauge compliance with the Affordable Care Act. We have been accelerating development by using data models as a guide for writing SQL code. With this method we have been able to develop SQL-based logic 10 times faster than by preparing conventional ETL (extract, transform, load) programming code.
4. Clearer scope. A data model provides a focus for determining scope. It provides something tangible to help business sponsors and developers agree on precisely what is included in the software and what is omitted. Business staff can see what the developers are building and compare it with their understanding. Models promote consensus among developers, customers and other stakeholders.
A data model also promotes agreement on vocabulary and jargon. The model highlights the chosen terms so that they can be driven forward into software artifacts. The resulting software becomes easier to maintain and extend.
As part of my consulting practice, I routinely conduct live data-modeling sessions in front of audiences comprising technologists and businesspeople. Often, there are different schools of thought among departments, and the model must triangulate their respective understandings. I project the model on a screen, so audience members can see it as it evolves. The model provides a nucleus for reaching agreement.
5. Faster performance. A sound model simplifies database tuning. A well-constructed database typically runs fast, often quicker than expected. To achieve optimal performance, the concepts in a data model must be crisp and coherent (see the first benefit). Then the proper rules must be used for translating the model into a database design.
As a consultant, I’m often asked to assist projects where “the database runs too slowly.” In reality, it is seldom a problem with the database software (Oracle, SQL Server, MySQL, etc.); rather, the database is being used improperly. Once that problem is fixed, the performance is just fine. Modeling provides a means to understand a database so that you are able to tune it for fast performance.
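A common case of "using the database improperly" is querying a column the model identifies as a lookup key without indexing it. The sketch below, using SQLite and an invented orders table, shows how a query plan changes once the index the model suggests is added; EXPLAIN QUERY PLAN is SQLite's way of showing whether a query scans the whole table or searches an index.

```python
import sqlite3

# Illustrative example: table and column names are assumptions for the sketch.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE orders (order_id INTEGER PRIMARY KEY,"
    " customer_id INTEGER, total REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(i, i % 100, i * 1.5) for i in range(1000)])

query = "SELECT * FROM orders WHERE customer_id = 42"

# Without an index, SQLite reports a full SCAN of the table.
before = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# The model tells us customer_id is how orders are looked up, so index it.
con.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")

# Now the plan becomes a SEARCH using the new index.
after = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(before)
print(after)
```

The fix is trivial once you know where to look; the data model is what tells you where to look.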
6. Better documentation. Models document important concepts and jargon, providing a basis for long-term maintenance. The documentation will serve you well through staff turnover.
Today, most application vendors can provide a data model of their application upon request. That is because the IT industry recognizes that models are effective at conveying important abstractions and ideas in a concise and understandable manner.
7. Fewer application errors. A data model causes participants to crisply define concepts and resolve confusion. As a result, application development starts with a clear vision. Developers can still make detailed errors as they write application code, but they are less likely to make deep errors that are difficult to resolve.
In 2010, Thailand’s FXA Group partnered with IBM to create a system to trace food and food processes from origin to final destination. The core of the application architecture was a data model that was highly scalable; it worked for all processes – from Costco and ASDA, to small shrimp and chicken producers in remote Thai villages.
8. Fewer data errors. Data errors are worse than application errors. It is one thing to have an application crash, necessitating a restart. It is another thing to corrupt data in a large database.
A data model not only improves the conceptual quality of an application, it also lets you leverage database features that improve data quality. Developers can weave constraints into the fabric of a model and the resulting database. For example, every table should normally have a primary key. The database can enforce other unique combinations of fields. Referential integrity can ensure that foreign keys are bona fide and not dangling.
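The constraints described above can be demonstrated concretely. The following sketch uses SQLite with invented table names; note that SQLite enforces foreign keys only when the foreign_keys pragma is switched on.

```python
import sqlite3

# Hypothetical customer/invoice schema for illustration.
con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when asked

con.execute("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,   -- every table gets a primary key
        email       TEXT NOT NULL UNIQUE   -- another unique field
    )""")
con.execute("""
    CREATE TABLE invoice (
        invoice_id  INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL
                    REFERENCES customer(customer_id)  -- referential integrity
    )""")

con.execute("INSERT INTO customer VALUES (1, 'a@example.com')")
con.execute("INSERT INTO invoice VALUES (10, 1)")  # valid foreign key

try:
    con.execute("INSERT INTO invoice VALUES (11, 999)")  # dangling foreign key
except sqlite3.IntegrityError as e:
    # The database itself, not the application, rejects the bad row.
    print("rejected:", e)
```

The point is that once constraints live in the schema, every application that touches the database is protected, not just the one whose developers remembered to check.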
Consider the recent troubled rollout of Web software for the Affordable Care Act. Insurers are having difficulty providing coverage because the data they receive is too often corrupted by application errors. Data errors can have severe consequences that are often difficult to understand and correct.
9. Managed risk. You can use a data model to estimate the complexity of software, and gain insight into the level of development effort and project risk. You should consider the size of a model, as well as the intensity of inter-table connections.
Robert Hillard wrote an excellent book, “Information-Driven Business” (John Wiley, 2010), in which he equates a data model to a mathematical graph. He uses the graph as a basis for assessing software complexity. An application database with heavily interconnected tables is more complex and therefore prone to more risk of development failure.
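A rough sketch of that graph view: treat tables as nodes and foreign-key relationships as edges, then use a simple metric such as average degree as a crude complexity signal. The tables, relationships and metric below are illustrative assumptions, not Hillard's specific formulation.

```python
from collections import Counter

# (table, referenced table) pairs from a hypothetical model.
edges = [
    ("invoice", "customer"),
    ("invoice_line", "invoice"),
    ("invoice_line", "product"),
    ("shipment", "invoice"),
]

# Count how many relationships touch each table.
degree = Counter()
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

avg_degree = sum(degree.values()) / len(degree)
print(f"tables: {len(degree)}, relationships: {len(edges)}, "
      f"average degree: {avg_degree:.2f}")
```

A schema whose average degree creeps upward is one where each change ripples through more tables, which is exactly the development risk Hillard describes.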
10. A good start for data mining. The documentation inherent in a model serves as a starting point for analytical data mining. You can take day-to-day business data and load it into a dedicated database, known as a “data warehouse.” Data warehouses are constructed specifically for data analysis, leveraging the data captured by routine operations.
Ten years ago, I worked on an application for the Road Home Program. This program was mandated by Congress in the wake of Hurricane Katrina to help Louisiana homeowners rebuild. The application supported the mandated process for processing claims and approving repairs. In addition, it provided data so that sponsors could see the progress of disbursement and repairs, as well as bottlenecks.
These 10 benefits of using data models to build business applications underscore the bottom line: Data models provide the means for building quality software in a predictable manner.