Data Modeling Trends in 2018


The Database Management and Data Modeling landscapes have evolved significantly in the past few years, expanding from the traditional relational model to include non-relational models as well. The growth of Big Data, with its unstructured and semi-structured data formats, along with trends in Cloud Computing, Artificial Intelligence, Data Lakes, Machine Learning, Blockchain, and others, is pushing the need for more advanced concepts and practices.

Such a monumental shift has caused Data Modeling to also advance. The fundamental changes in data infrastructure and newer technology evolutions have together contributed to this Data Management development.

According to Paxata co-founder and Chief Product Officer Nenshad Bardoliwalla, the combined goal of newer database technologies is to “make data available to a broader community of individuals.” Unstructured data brings a new set of challenges, which can be best tackled via Artificial Intelligence (AI), self-describing data formats, and ontological models. Are advanced database technologies like other technologies, headed towards a mainstreaming of AI, Big Data, and the Cloud? What are the most prominent Data Modeling trends to be seen in 2018?

Why Data Modeling?

The DATAVERSITY® article titled State of the Art of Data Modeling? discusses the primary purpose of Data Modeling, which is essentially to provide a “context” or “coherent structure” for data and information flows within an organization. Because context is at the core of modern business decisions, modern Data Models have the potential to reshape business decision making through context-sensitive insights.

The Webinar Series Lessons in Data Modeling presents a number of advantages of Data Modeling, which ought to be presented before the new Data Modeling trends are discussed further.

Data Modeling offers a number of advantages to organizations, even with the growth of new technologies like non-relational/NoSQL data stores, Machine Learning, Artificial Intelligence, Data Lakes, the Internet of Things, Cloud Computing, and many more. The modeling of data structures is no less important in the new Big Data world than it was in the older world of Data Management and Data Governance, and arguably it’s even more important.

Some of the advantages the average business will gain through enhanced Data Modeling are:

  1. The Data Model provides a clear framework for development projects: it acts as a blueprint, giving business clients and developers more clarity on agreed terms.
  2. When the Data Model contains crisp guidelines, then it enables high performance levels. The guidelines help in resolving confusion during the development phase.
  3. Corrupt datasets get detected and cleansed during the Data Modeling stage, which is another big advantage for developers.
  4. Data Models offer a tested “blueprint” for building software, which greatly reduces the possibility of later development errors, as well as development cost and time, resulting in a shorter time to market (almost 70 percent of the initial coding budget and allocated time can be saved).
  5. The Data Model clearly outlines both the “scope” and the associated “risks” of a managed development effort. Thus, the risks can be mitigated early on by appropriately scaling the model up or down.
  6. As Data Models include detailed documentation, long-range maintenance becomes easy and transparent even when staff changes. The documentation also serves as the starting point of advanced data analysis.
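The “blueprint” role described above can be illustrated with a small sketch. The entities, field names, and data below are invented for illustration; the point is that a typed model acts as the agreed contract between business clients and developers.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical entities for an order-management model; names are illustrative.
@dataclass
class Customer:
    customer_id: int
    name: str
    email: str

@dataclass
class Order:
    order_id: int
    customer_id: int        # reference back to a Customer, as the model dictates
    order_date: date
    line_totals: list = field(default_factory=list)

    def total(self) -> float:
        return sum(self.line_totals)

# The typed entities act as the agreed "blueprint": a developer who violates
# the contract (e.g. omitting customer_id) fails fast at construction time
# rather than deep inside the application at runtime.
alice = Customer(1, "Alice", "alice@example.com")
order = Order(100, alice.customer_id, date(2018, 1, 15), [19.99, 5.00])
print(round(order.total(), 2))  # 24.99
```

Because the structure is explicit, the same definitions double as the documentation that makes long-range maintenance transparent even when staff changes.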

A Move from the Relational World to the Unstructured World

Data Modeling is suddenly facing new challenges, as database design not only includes traditional relational databases but also newer NoSQL databases handling large amounts of unstructured data. Moreover, advanced database analysts now demand Predictive Models, which can only be incorporated via Artificial Intelligence or Machine Learning technologies.

Now, it is possible for DBAs to access large libraries of Machine Learning or Deep Learning Data Models offered by third-party vendors. This was unheard of in the database world five or six years ago. Modern databases are equipped to handle cognitive technologies and live data sources provisioned through the Cloud.

Read the article You Still Need a Model! Data Modeling for Big Data and NoSQL to understand this shift in Database Management in recent years. Also read 10 Transformational Database Technologies, where NoSQL databases have been described as those “used by web giants to manage information for billions of users.”

New Data Modeling Trends for SQL/NoSQL Databases

The Analytics Week column 2018 Trends in Data Modeling states that the biggest challenge for modern databases is handling machine data. In the modern database world, the source data types, data structures, and data channels are all varied and complex, so the following technology trends may be most profoundly visible in Data Modeling this year:

  • The wide variety of machine data, from log files to live IoT feeds issuing messages about customer actions, will pose a new set of analytics challenges.
  • The scale and speed of data will demand predictive Data Models provisioned through Machine Learning or Deep Learning.
  • The demand for aggregating data from widely disparate sources will increase the complexity of data management in Data Modeling. Data Modeling has to address the co-existence of on-premise, private, and public Cloud data repositories while building useful models.
  • Data Modeling consistency will be required for Data Lakes, so that data can be stored in native formats without extensive transformation. Also, self-describing data formats will be necessary for building flexible Data Models that accommodate Big Data.
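The idea of a self-describing format in the last bullet can be sketched minimally: the payload ships with its own schema, so a Data Lake consumer can interpret it without an external catalog. The record layout and field names below are invented for illustration, not a standard format.

```python
import json

# A minimal self-describing record: schema and data travel together.
record = {
    "schema": {
        "fields": [
            {"name": "device_id", "type": "string"},
            {"name": "temp_c", "type": "float"},
        ]
    },
    "data": {"device_id": "sensor-42", "temp_c": 21.5},
}

def validate(rec):
    """Check the data section against the record's own embedded schema."""
    types = {"string": str, "float": float, "int": int}
    for f in rec["schema"]["fields"]:
        value = rec["data"].get(f["name"])
        if not isinstance(value, types[f["type"]]):
            raise TypeError(f"{f['name']} is not a {f['type']}")
    return True

payload = json.dumps(record)           # what lands in the lake, in native form
print(validate(json.loads(payload)))   # True
```

Production formats such as Avro or Parquet apply the same principle with far richer schema metadata; the sketch only shows why a consumer needs no side channel to interpret the data.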

And here are some other significant Data Modeling trends that will become apparent in 2018:

  • Automated Data Modeling (Algorithms)

The practice of using algorithms to build Data Models has been in use for some time now, but according to industry thought leaders, algorithms now automate Data Modeling to a significant extent. The algorithms can be any combination of AI, ML, Natural Language Processing (NLP), and statistical techniques; when combined with other intelligent capabilities in Self-Service Data Preparation platforms, they let ordinary business users develop reasonably good Data Models for data analysis.
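A toy version of this kind of algorithmic modeling is schema inference: deriving a column's type from a sample of raw string values, roughly what Self-Service Data Preparation tools do behind the scenes. Real platforms use far richer statistics and NLP; the column names and data here are invented.

```python
from collections import Counter

def infer_type(values):
    """Infer a column type from sample string values by majority vote."""
    def classify(v):
        try:
            int(v)
            return "int"
        except ValueError:
            pass
        try:
            float(v)
            return "float"
        except ValueError:
            return "string"
    votes = Counter(classify(v) for v in values)
    return votes.most_common(1)[0][0]   # most frequent type wins

sample = {"user_id": ["101", "102", "103"],
          "score": ["4.5", "3.9", "oops"],    # one dirty value, outvoted
          "city": ["Austin", "Oslo", "Pune"]}
model = {col: infer_type(vals) for col, vals in sample.items()}
print(model)  # {'user_id': 'int', 'score': 'float', 'city': 'string'}
```

The majority vote is also a crude form of the corrupt-data detection mentioned earlier: values that disagree with the inferred type are exactly the ones to flag for cleansing.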

  • Predictive Modeling

In advanced Machine Learning, the underlying data itself helps shape the Data Models. This type of approach is highly useful in Predictive Modeling, when dealing with huge datasets and repetitive Deep Learning tasks across multiple layers of data. Predictive Modeling is best used to determine the root causes of fraud, churn, or upsell opportunities. In Predictive Modeling, the combined capabilities of ML and DL are put to use to build learning Data Models.
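To make "the data shapes the model" concrete, here is a hand-rolled sketch of a learning Data Model for churn prediction, written without an ML library so the mechanics are visible. The features, data, and training scheme (logistic regression fit by stochastic gradient descent) are purely illustrative.

```python
import math

def train(X, y, lr=0.1, epochs=2000):
    """Fit logistic-regression weights to labeled examples via SGD."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))        # sigmoid probability
            err = p - yi                          # gradient of the log-loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Invented features: [support_tickets, months_inactive]; label 1 = churned.
X = [[0, 0], [1, 0], [5, 6], [4, 7], [0, 1], [6, 5]]
y = [0, 0, 1, 1, 0, 1]
w, b = train(X, y)
print(predict(w, b, [5, 6]) > 0.5)   # high-risk profile
print(predict(w, b, [0, 0]) > 0.5)   # low-risk profile
```

The weights were not designed by a modeler; they emerged from the data, which is the essence of the learning Data Models described above. Deep Learning stacks many such layers, but the principle is the same.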

  • Semantic Data Models

When data is stored in silos, more time and effort are spent on Data Modeling while less time is reserved for enjoying the fruits of that labor. Semantic Models are in high demand as they are believed to work across disparate data types, data structures, and database architectures. Semantic Models also include vivid descriptions, which makes them user-friendly to a broad business user base. The flexibility of Semantic Models makes them easily usable in varying use cases. When data types or structures change, the models may be recalibrated with minimal programming effort.
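A minimal sketch of that idea is a semantic layer: each source exposes different field names, and a mapping recalibrates them to one shared business vocabulary. The source systems and field names below are invented for illustration.

```python
# Per-source mappings from raw field names to the shared semantic vocabulary.
SEMANTIC_MAP = {
    "crm":       {"cust_nm": "customer_name", "rev": "revenue"},
    "warehouse": {"CustomerName": "customer_name", "TotalRevenue": "revenue"},
}

def to_semantic(source, record):
    """Translate a raw record into the shared semantic model."""
    mapping = SEMANTIC_MAP[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

crm_row = {"cust_nm": "Acme Corp", "rev": 1200}
wh_row = {"CustomerName": "Acme Corp", "TotalRevenue": 1200}

# Two disparate sources converge on identical semantic records.
print(to_semantic("crm", crm_row) == to_semantic("warehouse", wh_row))  # True
```

When a source's structure changes, only its entry in the mapping is updated, which is the "recalibrated with minimal programming effort" property the prose describes.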

Five SQL Server Database Trends for 2018

Here are some new trends that you will notice in SQL Server databases this year:

  • SQL Trend No. 1: Possible Adoption Rates Due to Security Concerns

Both SQL Server 2016 and SQL Server 2017 shipped with enhanced security features and a heavily security-focused database engine. For the new version this year, adoption rates may vary as users remain concerned about security breaches from new threats.

  • SQL Trend No. 2: Pointing Toward the Cloud

With Azure SQL Database, Microsoft now provides the same programming capabilities as on-premise SQL Server. As Cloud adoption among the DBA community continues to rise, following Azure SQL Database and its associated Cloud services may become a hot trend in 2018.

  • SQL Trend No. 3: AI in Everything

As one of the prominent technology trends of 2017 was to incorporate AI into everything within a few years, 2018 database designers must do their part to fulfill that promise. SQL Server is now leading the database industry with built-in AI capabilities. In the RDBMS world, AI enablement for SQL databases means using the functionality provisioned by the Machine Learning Services component of SQL Server 2017.

  • SQL Trend No. 4: SQL Server for Linux

Beginning with SQL Server 2017, support for Linux has made a huge impact on the user community. High performance benchmarks on Linux have already set new database records. As SQL Server design teams prepare for open-source Linux implementations, the next generation of SQL Server DBAs must also train to work with Linux and its derivatives.

  • SQL Trend No. 5: New Software Update Cycles

As Microsoft has discontinued service packs (SPs) and the practice of distributing cumulative updates (CUs) every two months, the biggest challenge for SQL Server teams in 2018 will be managing the ongoing stream of software patches and updates. Thus, DBAs must gear up for the new CU update cycles (possibly quarterly) this year.

Emerging Trends in ETL: The Data Vault

A big challenge for database technology developers now is handling the complexity of Big Data in terms of volume, variety, and velocity. Review the article titled Emerging Trends of ETL: Big Data and Beyond to understand how ETL poses a roadblock for Big Data. If ETL can be successfully utilized by the Data Vaults of the future, then Big Data adoption in databases will accelerate rapidly. Data Vaults will remove the data cleansing requirements of large datasets and make incorporating Hadoop, MongoDB, or NoSQL easier. Data Vaults will not have any impact on the underlying Data Model.
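The claim that Data Vaults leave the underlying Data Model untouched follows from their three building blocks: hubs hold business keys, links hold relationships, and satellites hold descriptive history. A minimal sketch, with invented table and field names:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class Hub:                # one row per business key; immutable
    business_key: str
    record_source: str

@dataclass(frozen=True)
class Link:               # one row per relationship between hubs; immutable
    hub_keys: tuple
    record_source: str

@dataclass
class Satellite:          # descriptive attributes, versioned by load time
    hub_key: str
    attributes: dict
    load_ts: datetime

customer = Hub("CUST-001", "crm")
order = Hub("ORD-900", "shop")
placed = Link(("CUST-001", "ORD-900"), "shop")

# New or changed attributes land as new satellite rows; the hubs and links
# (and thus the core model) never change, and nothing is cleansed on load.
history = [
    Satellite("CUST-001", {"name": "Acme"}, datetime(2018, 1, 1)),
    Satellite("CUST-001", {"name": "Acme Corp"}, datetime(2018, 6, 1)),
]
latest = max(history, key=lambda s: s.load_ts)
print(latest.attributes["name"])  # Acme Corp
```

Because raw values are simply appended as satellite rows, the pattern supports the "load first, cleanse later" style the article associates with Hadoop and NoSQL sources.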

Database Technologies of the Future

DBTA rationalizes the market demand for future database technologies in the article titled The Database Technologies of the Future. The author of that article envisions a future with a multi-model database architecture strong enough to support many storage formats, many processing types, and different languages. Massive changes have already taken place in the database design world, but more disruption is yet to come. One sweeping disruption will come when the limitation of disk access is completely resolved by future in-memory database architectures.


