
How Data Gravity Is Forcing a Shift to a Data-Centric Enterprise Architecture

By Tony Bishop


Data is the output of society and everything we do, and the enterprise is fast becoming the world’s data steward. Digitally enabled interactions are becoming the norm, driving up enterprise data exchange volumes. In fact, it’s estimated that by 2024, Global 2000 enterprises will create data at a rate of 1.1 million gigabytes per second and will require 15,635 exabytes of additional data storage annually. Applications like artificial intelligence (AI) and machine learning (ML) are fast becoming the center of today’s digital enterprise, helping to create efficiencies and improve the customer experience, but they also add to the accumulation of data that must be processed, analyzed, and applied to keep businesses running smoothly and spur innovation.
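To put that creation rate in perspective, here is a quick back-of-the-envelope conversion. The only input is the cited 1.1 million gigabytes per second; the decimal unit convention (one exabyte equals one billion gigabytes) and the per-day and per-year figures it produces are illustrative arithmetic, not additional numbers from the research.

```python
# Back-of-the-envelope conversion of the cited data-creation rate.
# Assumes decimal units: 1 EB = 1e9 GB. Illustrative arithmetic only.

rate_gb_per_second = 1.1e6                  # cited rate: 1.1 million GB per second by 2024
seconds_per_day = 24 * 60 * 60
seconds_per_year = 365 * seconds_per_day

eb_per_day = rate_gb_per_second * seconds_per_day / 1e9
eb_per_year = rate_gb_per_second * seconds_per_year / 1e9

print(f"Created per day:  {eb_per_day:,.0f} exabytes")    # roughly 95 EB
print(f"Created per year: {eb_per_year:,.0f} exabytes")   # roughly 34,700 EB
```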

The accumulation of this data produces an effect similar to the gravitational pull between objects like the Earth and the Moon, which is why it is called data gravity. The data becomes harder to move, which adds complexity and can prevent digital transformation from happening. If enterprises aren’t monitoring their data gravity challenges, for instance, the result can be slow response times, information silos, and ultimately stalled profitability and growth.

Data gravity is fast emerging as an invisible megatrend, one that has the potential to inhibit enterprise workflow performance, raise security concerns, and increase costs. If you don’t design for it, data gravity can become a company’s biggest challenge. AI applications can exacerbate its negative impact: even though AI workloads have moved to colocation facilities, data gravity continues to inhibit AI innovation and creates complexities that stand in the way of implementation.

So how can enterprises overcome data gravity challenges? Start by understanding what data gravity is and how it’s forcing a shift toward data-centric architectures that invert traffic flow and bring users, networks, and clouds to privately hosted enterprise data.

What Is Data Gravity?

Coined in 2010 by then-GE engineer Dave McCrory, data gravity describes the effect whereby, as data accumulates, additional services and applications are increasingly likely to be attracted to it, much as gravity pulls objects toward a planet. As the mass or density increases, so does the strength of the gravitational pull. The data then becomes difficult to move and process and, if it grows large enough, can become virtually impossible to relocate.
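For readers who want the physics side of the analogy spelled out, Newton’s law of gravitation is the formula being borrowed; the mapping of its terms onto data below is one illustrative reading of McCrory’s analogy rather than a formal metric from this article:

F = G \frac{m_1 m_2}{r^2}

Read this way, the two masses stand for the accumulated data on one side and the services and applications drawn to it on the other, and the distance term in the denominator is why the physical separation between data and the things that use it matters so much in the sections that follow: the “pull” grows with volume and weakens rapidly with distance.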

Today’s continuous data creation lifecycle underpins data gravity. Enterprises are serving an ever-growing number of users and endpoints that create and exchange data, and the growing volume of interactions and transactions between users and machines drives the need for more processing and storage of both structured and unstructured data. While analytics, ML, and AI enable enterprises to embed intelligence into workflows, they also fuel additional data enrichment, aggregation, and exchange. This compounding effect creates complexity that inhibits digital transformation and poses challenges that current backhaul architectures can’t fully address. Overcoming data gravity requires a connected community approach, bringing enterprises, connectivity, cloud, and content providers together at centers of data exchange to remove barriers and unlock new capabilities.

Data Gravity and AI

Take AI models, for instance. Training them can be an extremely slow and cost-prohibitive endeavor if the main data set resides solely in the cloud rather than close to the model.

This failure to factor in data gravity is often the culprit behind a common pitfall we see: “model debt.” Model debt arises when data scientists develop AI models quickly, but the models remain undeployed to production for months or longer and lose value as a result.

A key way for enterprises to break down the barriers of data gravity is to conduct model training and inferencing at a neutral meeting place where the data sets already reside, such as a multi-tenant data center. By bringing powerful computing resources closer to the data, enterprises can shrink the time and distance between the data sets that need to be crunched and the advanced AI models they power, and get those models out into the wild faster.
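As a concrete illustration of why that proximity matters, the sketch below compares moving a training data set to a distant cloud region against reading it where it already resides. The data-set size, link speeds, and refresh cadence are hypothetical assumptions chosen only to show the shape of the arithmetic, not figures from this article.

```python
# Rough "data gravity" arithmetic: move the data to the compute, or the compute to the data?
# All sizes, bandwidths, and cadences are hypothetical assumptions for illustration.

def transfer_hours(size_tb: float, link_gbps: float) -> float:
    """Hours needed to move size_tb terabytes over a link_gbps link (decimal units)."""
    bits = size_tb * 8e12                       # 1 TB = 8e12 bits
    return bits / (link_gbps * 1e9) / 3600

DATASET_TB = 50          # hypothetical training corpus
REFRESHES_PER_MONTH = 4  # hypothetical: corpus re-exported weekly for retraining

wan_hours = transfer_hours(DATASET_TB, link_gbps=10)     # long-haul link to a remote cloud region
colo_hours = transfer_hours(DATASET_TB, link_gbps=100)   # fabric inside the multi-tenant data center

print(f"Haul data to a remote region: {wan_hours:5.1f} h per refresh, "
      f"{wan_hours * REFRESHES_PER_MONTH:6.1f} h per month")
print(f"Read data where it resides:   {colo_hours:5.1f} h per refresh, "
      f"{colo_hours * REFRESHES_PER_MONTH:6.1f} h per month")
```

Even with generous bandwidth, every refresh of the corpus pays the long-haul cost again; placing training and inferencing next to the data turns that recurring tax into a local read.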

The Need for a Data-Centric Architecture

The new demands brought on by AI and ML create an opportunity for a data-centric architecture, one that supports businesses’ need to operate ubiquitously so they can meet customer expectations and make business decisions on demand. Such an architecture is informed by real-time intelligence to power innovation and scale digital business, and it must also support the data exchange that fuels this application development.

Enterprise architecture needs to be inverted into a data-centric architecture deployed at points of presence. With a modernized infrastructure strategy, enterprises can absorb the influx of data from many users, locations, clouds, and networks and create centers of data exchange. Traffic can then be aggregated and maintained via public or private clouds, at the core or the edge, and from every point of business presence, lessening data gravity barriers and their effects. By implementing a secure, hybrid IT, data-centric architecture globally at key points of business presence, businesses can harness data to create centers of data exchange for better digital decision-making.
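One way to picture that inverted, data-centric flow is a simple placement rule: run each workload at the exchange point that already holds the data it needs, rather than hauling the data back to a central location. The sketch below is a minimal illustration of the idea; the exchange points, data sets, and latencies are hypothetical stand-ins, not a description of any particular product.

```python
# Minimal sketch of a data-centric placement rule: send the workload to the
# exchange point that already hosts its data; fall back to the lowest-latency
# point otherwise. Hypothetical locations, data sets, and latencies only.

from dataclasses import dataclass

@dataclass
class ExchangePoint:
    name: str
    hosted_datasets: set[str]
    latency_ms_to_user: float

EXCHANGES = [
    ExchangePoint("nyc-metro", {"trades", "risk"}, latency_ms_to_user=5.0),
    ExchangePoint("frankfurt", {"payments"}, latency_ms_to_user=85.0),
    ExchangePoint("singapore", {"clickstream"}, latency_ms_to_user=180.0),
]

def place_workload(required_dataset: str) -> ExchangePoint:
    """Prefer an exchange point that already hosts the data set; break ties on user latency."""
    candidates = [x for x in EXCHANGES if required_dataset in x.hosted_datasets]
    pool = candidates or EXCHANGES
    return min(pool, key=lambda x: x.latency_ms_to_user)

print(place_workload("risk").name)   # -> nyc-metro: the compute goes to the data
```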

Overcoming Data Gravity Together

Data gravity impacts businesses of all sizes, and every industry has unique requirements for addressing it. To tackle the next era of compute, companies including data center, cloud, and HPC solution providers are coming together to mitigate the challenges associated with data gravity, creating an ecosystem of partners so that enterprises can meet their global coverage, capacity, and ecosystem connectivity needs. That’s why we’ve partnered with companies such as NVIDIA, Core Scientific, and leading public cloud providers, helping enterprises overcome data gravity and the other complex challenges they face today.
