The world has changed dramatically over the past couple of years – especially in the areas of business and technology. The COVID-19 pandemic accelerated digital transformation and forced a shift to remote or hybrid business models, leading to a significant spike in the adoption of public cloud services. Gartner estimates that public cloud services spending will increase by 47% from $270 billion in 2020 to a projected $397 billion in 2022.
Cloud services and data have been essential for enabling remote workers to maintain productivity, but relying on the cloud is also a double-edged sword: it can introduce complexity, excessive costs, and unnecessary security risks. As more and more data is stored in the cloud, it becomes increasingly challenging for IT teams to effectively manage and protect it all. The result is a breach culture that will only get worse, and the cost of maintaining the status quo or doing nothing grows daily.
Cloud Data Security Challenges
Cloud platforms and services have been a lifesaver for businesses during the pandemic. Companies have embraced cloud services to provide accessibility, streamline productivity, and increase operational resilience for employees working remotely.
However, for most organizations, the rapid adoption of cloud services came with consequences as well. Visibility was sacrificed and security was compromised in the name of expedience. The percentage of corporate data stored in the cloud has doubled from 30% in 2015 to 60% in 2022 and continues to grow.
This data sprawl results in unknown and unmonitored data stores. Cloud services and DevOps practices enable end users to self-provision applications and services, and allow developers to spin up new databases at the push of a button. Our State of Public Cloud Data Security Report 2022 found that less than half (49%) of survey respondents have full visibility when developers spin up a new data repository. More than a third (35%) reported partial visibility, while 12% indicated they have no visibility at all.
Not-So-Hidden Cost of Shadow Data
This complexity and lack of visibility result in “shadow data”: test environments, cloud data store backups, remnants of cloud data migrations, data logs, and other artifacts that linger unnoticed and consume resources.
Unfortunately, cloud data storage is not free, and unknown, unnecessary storage carries a real cost. In one example, a customer expects to save $100,000 per year by eliminating shadow data and consolidating data stores.
Shadow data also carries the growing cost of additional risk. These unknown data stores often contain sensitive information such as customer or employee data, financial information, intellectual property, or other classified or confidential information.
Unfortunately, the mantra that “you can’t protect what you can’t see” is very true. IT and data security teams can’t possibly enforce policies, monitor access, or protect data of which they are unaware. That is why four out of five senior data security professionals we surveyed revealed they are concerned or very concerned about shadow data.
The pace and scale of data breaches continue to grow – along with the cost of being breached. According to the annual Cost of a Data Breach Report from the Ponemon Institute, the average cost to remediate a breach was $4.24 million in 2021. In addition, the 2021 Data Breach Investigations Report from Verizon found that 90% of data breaches target the public cloud.
There is also a less tangible efficiency cost when cloud data lacks clear visibility. With a clear understanding of where data is stored, IT and data security teams can focus on speed of access – which improves productivity and streamlines operations.
Doing Nothing Is Costly
There are significant and essential benefits from cloud services and applications. Even if everything could somehow go back to the way it was before the pandemic, few businesses would choose to do so. While the circumstances that drove much of the digital transformation and cloud adoption over the past couple of years have been tragic, they forced organizations to make changes that have resulted in improved productivity and efficiency.
However, shadow data and cloud data security are very real problems. In a best-case scenario, shadow data results in unnecessary expenses for cloud storage resources, and in a worst-case scenario, the unprotected data could lead to millions of dollars in costs to remediate and recover from a data breach.
With cloud services here to stay, what matters is that data security teams have the visibility and tools to effectively mitigate and manage cloud data risks. Organizations need cloud-native security that automatically and autonomously discovers all data across the cloud ecosystem, and recognizes and classifies personally identifiable information (PII) and other sensitive data. Autonomy is important because these assets are, by definition, unknown to the data security team. Once teams have full visibility, they can prioritize data stores based on sensitivity and risk exposure, enforce data security policies, and monitor data access and egress to detect and alert on suspicious activity.
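To make the discover-classify-prioritize workflow concrete, here is a minimal Python sketch. It is an illustration only: real cloud-native security platforms use far more robust detection (validation checksums, contextual analysis, ML classifiers) than the simple regular expressions assumed here, and the store names and records are hypothetical.

```python
import re

# Hypothetical regex patterns for a few common PII types.
# A production classifier would be far more sophisticated;
# these patterns exist only to illustrate the idea.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),          # US Social Security number
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),  # naive card-number match
}

def classify_record(text: str) -> set[str]:
    """Return the set of PII types detected in one text record."""
    return {label for label, pattern in PII_PATTERNS.items()
            if pattern.search(text)}

def prioritize_stores(stores: dict[str, list[str]]) -> list[tuple[str, int]]:
    """Rank data stores by how many distinct PII types they contain,
    so the most sensitive stores are triaged first."""
    scores = {
        name: len({label for record in records
                   for label in classify_record(record)})
        for name, records in stores.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Example: a forgotten test-environment backup (shadow data) is flagged
# as higher priority than an innocuous application log store.
stores = {
    "test-env-backup": ["contact: alice@example.com", "SSN 123-45-6789"],
    "app-logs": ["request completed in 42ms"],
}
print(prioritize_stores(stores))
```

The key design point this sketch mirrors is that classification drives prioritization: a store's risk ranking is derived from what sensitive data it actually holds, not from where it sits or who provisioned it.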
Maintaining the status quo is a bad strategy. The cost of doing nothing is an expense few organizations can afford.