Why Your Enterprise Needs Data-Centric Cybersecurity (and How to Achieve It)

By Dana Morris

Hardly a week goes by without news of another cybercrime. A recent example is the MOVEit breach, in which hackers stole data from customers of the MOVEit file transfer service. Between May and August, at least 600 organizations and 40 million users fell victim – with no apparent end in sight.

One reason for the parade of cyberattacks is the growing sophistication of attackers. Verizon’s latest Data Breach Investigations Report reveals that 83% of breaches involve external actors, about 70% of them tied to organized crime.

But the bigger issue is the way enterprises handle data today. Not long ago, organizations generated, stored, and accessed data in a central location. To protect that information, they put in place network-centric, perimeter-based defenses, layering safeguards like firewalls and endpoint security to keep attackers at bay.

Today, the network perimeter has atomized. Employees work from hundreds or thousands of remote locations. Data is generated and consumed at the network edge. Most significantly, sensitive information is routinely stored and shared by users and systems across an ecosystem of enterprises, partners, and customers.

This new way of handling information calls for a new approach to protecting it: data-centric security. Data-centric security applies policy, access controls, and encryption directly to information flowing in and out of the organization through email, files, and software-as-a-service (SaaS) applications.

But data-centric security doesn’t require discarding existing cyber protections or re-architecting existing IT infrastructure. Instead, an open standard called Trusted Data Format (TDF) can enable organizations to simply and affordably achieve the persistent access controls and strong encryption of data-centric security.

The Call for Data-Centricity

To operate efficiently and compete effectively, your enterprise must be able to share information among teams, partners, and customers. This data exchange takes place whenever your organization sends and receives emails and instant messages, shares Microsoft 365 files, uses collaboration software, connects to SaaS solutions, transfers data among systems, or transmits internet of things (IoT) data.

Throughout this activity, you need to keep sensitive data secure – to protect privacy, safeguard intellectual property (IP), and avoid regulatory violations. Crucially, you need to protect that information as it traverses internal and external networks and as it’s shared and re-shared by other organizations.

Central to that endeavor is the ability to classify which data is sensitive, tag it accordingly, and permanently bind that tagging to the data. In the past, data classification was associated with the federal government and its Secret and Top Secret designations. However, a growing number of regulations require enterprises to classify information based on its value, sensitivity, and criticality. Mandates like the California Consumer Privacy Act (CCPA), General Data Protection Regulation (GDPR), Health Insurance Portability and Accountability Act (HIPAA), and Payment Card Industry Data Security Standard (PCI DSS) all require certain data to be protected in certain ways.

Data classification and tagging involves several challenges, however:

  • Sensitive enterprise data shows up everywhere: in data centers and cloud environments, in on-prem applications and SaaS systems, in cloud storage services, and on laptops, mobile devices, and removable media.
  • Few organizations have a standardized, commonly understood mechanism for classifying and tagging data. While many organizations are improving in the way they handle structured data (such as data within databases and warehouses), there is a lag in addressing unstructured data – emails, productivity files, images, and videos – and much of this data can be sensitive in nature. 
  • Data-handling requirements can change over the information’s lifetime. For instance, data might be kept confidential initially but later made broadly available. Or, certain users might be allowed access today but restricted in the future.
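To make the third challenge concrete, here is a minimal sketch (not the TDF format itself — the class and field names are illustrative) of what it means for tags to travel with the data: the classification lives in a manifest serialized alongside the payload, so it survives wherever the object is copied.

```python
import json
from dataclasses import dataclass

# Illustrative sketch only: a data object whose classification tags are
# bound into the same serialized artifact as the payload, rather than
# living in a separate catalog that recipients may never see.
@dataclass
class TaggedObject:
    payload: bytes
    attributes: dict  # e.g. {"classification": "confidential", "regulation": "HIPAA"}

    def to_manifest(self) -> str:
        # The manifest keeps the tags permanently alongside the data,
        # so any downstream system can evaluate policy before granting access.
        return json.dumps({
            "attributes": self.attributes,
            "payload_hex": self.payload.hex(),
        })

record = TaggedObject(
    payload=b"patient-id: 12345",
    attributes={"classification": "confidential", "regulation": "HIPAA"},
)
manifest = record.to_manifest()
```

Because the tags are part of the object, updating them (say, declassifying the data later) means re-issuing the object — which is exactly why lifecycle changes need a policy layer like the one TDF provides.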

Fine-Grained, Attribute-Based Access Control

TDF addresses these challenges head-on. The open standard wraps data objects in a layer of attribute-based access control (ABAC) and encryption. This approach enables fine-grained, persistent control of data – wherever that data is shared and for as long as it exists. The original owner of the information retains control over it, even after it leaves the organization and is re-shared by others.

Using TDF, data owners can entitle users to access data tagged with particular attributes, or classifications. When defining access policies, they can also factor in system and environmental attributes such as location or time of day. By combining data attributes with environmental attributes, they can enforce value-based access decisions across data objects at scale.
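The ABAC decision described above can be sketched in a few lines of plain Python. This is not the TDF policy language — the function, field names, and policy shape are assumptions for illustration — but it shows the core idea: access is granted only when the user's entitlements cover the object's tags *and* the environmental attributes satisfy the policy.

```python
from datetime import time

# Hedged ABAC sketch (illustrative names, not a TDF API): an access
# request passes only if every data attribute on the object is covered
# by the user's entitlements AND the environment matches the policy.
def allow_access(object_tags, user_entitlements, env, policy):
    # Data attributes: the user must be entitled to every tag on the object.
    if not set(object_tags) <= set(user_entitlements):
        return False
    # Environmental attribute: request must originate from an allowed region.
    if env["location"] not in policy["allowed_locations"]:
        return False
    # Environmental attribute: request must fall within allowed hours.
    start, end = policy["allowed_hours"]
    return start <= env["time_of_day"] <= end

policy = {
    "allowed_locations": {"US", "EU"},
    "allowed_hours": (time(8, 0), time(18, 0)),
}

granted = allow_access(
    object_tags={"confidential"},
    user_entitlements={"confidential", "internal"},
    env={"location": "EU", "time_of_day": time(10, 30)},
    policy=policy,
)
```

Here `granted` is true; the same request made at 22:00, or from an unlisted location, would be denied — the policy travels with the data, so the decision is the same wherever the object ends up.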

TDF binds these policies to the data using cryptographic signatures. This approach ensures that only the data owner can change the policies and that they can’t be tampered with. What’s more, the technology logs every access request, so the data owner can easily track shared data and maintain end-to-end auditability.
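The binding idea can be illustrated with standard-library primitives. Note the hedge: real TDF implementations use public-key cryptography, whereas this sketch uses an HMAC as a stand-in, and all names are assumptions. The point it demonstrates is that the policy is authenticated together with the payload, so any change to the policy invalidates the binding.

```python
import hmac
import hashlib
import json

# Sketch of policy binding (HMAC stands in for the public-key signatures
# a real implementation would use). The policy is canonicalized and
# authenticated together with the payload, so neither can be altered
# without detection.
def bind(owner_key: bytes, payload: bytes, policy: dict) -> str:
    canonical = json.dumps(policy, sort_keys=True).encode() + payload
    return hmac.new(owner_key, canonical, hashlib.sha256).hexdigest()

def verify(owner_key: bytes, payload: bytes, policy: dict, binding: str) -> bool:
    return hmac.compare_digest(bind(owner_key, payload, policy), binding)

key = b"owner-secret"
policy = {"classification": "confidential", "allowed": ["alice"]}
payload = b"quarterly forecast"
binding = bind(key, payload, policy)

# An intermediary who quietly adds themselves to the access list
# breaks the binding, and verification fails.
tampered = {**policy, "allowed": ["alice", "mallory"]}
```

Only the holder of the owner's key can produce a valid binding for a modified policy — which is what "only the data owner can change the policies" means in practice.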

An effective solution based on TDF addresses the file-sharing challenges enterprises face every day. For instance, users of popular email systems like Microsoft Outlook and Gmail can simply click a button to protect data they share. Rules can be set to automatically encrypt sensitive data before it leaves the organization. Functionality for collaborative environments like Google Workspace puts protections in place for data shared across teams or outside the organization.

There are even options for securing data that flows through popular SaaS systems like Salesforce and Zendesk. Data tagging can be automated on IoT sensors so that data generated at the edge is tagged and encrypted before it leaves the IoT device – with validation that the data came from that specific sensor and wasn’t tampered with along the way.
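The edge-validation claim above can be sketched as follows. This is a hedged illustration, not a real IoT provisioning scheme: the device registry, key handling, and message shape are all assumptions. Each sensor authenticates its readings with a device-specific key before transmission, so a receiver can validate both origin and integrity.

```python
import hmac
import hashlib
import json

# Illustrative sketch of edge provenance: each device holds a key
# provisioned before deployment (the registry below is made up), and
# signs every reading before it leaves the device.
DEVICE_KEYS = {"sensor-42": b"provisioned-at-manufacture"}

def emit(device_id: str, reading: dict) -> dict:
    # Runs on the device: tag the reading with an HMAC over its
    # canonical serialization.
    body = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(DEVICE_KEYS[device_id], body, hashlib.sha256).hexdigest()
    return {"device": device_id, "reading": reading, "tag": tag}

def validate(msg: dict) -> bool:
    # Runs on the receiver: recompute the tag and compare in constant time.
    body = json.dumps(msg["reading"], sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEYS[msg["device"]], body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, msg["tag"])

msg = emit("sensor-42", {"temp_c": 21.5})
```

A reading altered in transit no longer matches its tag, so tampering along the way is detectable at the point of consumption.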

As an open standard, TDF is in use in private-sector enterprises large and small. It’s also deployed throughout the Department of Defense (DoD) and in dozens of intelligence and defense communities in the U.S. and in nations around the globe.

In fact, the ABAC-enabled classification and tagging made possible by TDF is central to the zero trust approach to cybersecurity advocated by the National Institute of Standards and Technology (NIST), the Cybersecurity and Infrastructure Security Agency (CISA), and other industry-leading organizations. It addresses two of the core pillars of zero trust – data and identity – and is essential to the identity, credential, and access management (ICAM) that NIST advocates.

As data exchange becomes fundamental to the way enterprises operate, network-centric security will no longer be sufficient. The TDF standard addresses today’s data-sharing demands by providing a simple and cost-efficient approach to protecting data wherever it travels, for as long as it exists. Deployed effectively, TDF can enable organizations to achieve true data-centric cybersecurity – and protect their most sensitive and valuable information, even as they share it.