The Move to Public Cloud and an Intelligent Data Strategy

By Joe Gaska

It has taken a global pandemic for organizations to finally realize that the old way of doing business – and the legacy technologies and processes that came with it – is no longer going to cut it. This is especially true when it comes to applications. As organizations have transitioned to doing business remotely, the need for greater agility and speed has become ever more palpable: employees, wherever they may be, need access to the same solutions and data, whenever they need them.



Enter the public cloud. According to a 2020 IDG report, over 92% of organizations say their IT environment is at least partially in the cloud, and that number is likely to grow in coming years. Accelerated by the pandemic, enterprises have been eschewing on-premises systems in favor of public cloud infrastructure – AWS, GCP, and Azure – and cloud-based applications. Yet, despite the plethora of these modern solutions and technologies and the flexibility they offer, many IT organizations are still operating the old way – and suffering from disjointed systems, extra costs, and frustrations.

However, this doesn’t have to be the case. By creating an intelligent cloud infrastructure that makes the most of an organization’s data, enterprises can make sure they have the access they need to derive tactical and strategic value from going to the cloud.

The Big Conundrum

Cloud-based applications, such as Salesforce, contain a wealth of data about critical business actions and decisions. This data can be incredibly valuable to people throughout the company, but many enterprises can’t easily access or pull that data into their own business intelligence applications or systems. They devote dozens of IT staff to maintaining APIs for accessing or ingesting data that resides in third-party cloud applications. 

The many users who want that data often end up making copies of it, which they store in their own systems. In fact, one TB of production data can easily turn into eight to nine TB of data replicas, according to ESG. This becomes a management and compliance nightmare. Not to mention an application performance issue – particularly when API limits are reached. 
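When those API limits are reached, well-behaved ingestion jobs back off and retry rather than hammering the vendor's endpoint. A minimal sketch of such a retry schedule (the helper name and its defaults are illustrative, not any vendor's client library):

```python
import random


def backoff_delays(max_retries=5, base=1.0, cap=60.0, jitter=False):
    """Exponential backoff schedule (in seconds) for retrying rate-limited API calls."""
    delays = []
    for attempt in range(max_retries):
        # Double the wait on each attempt, but never exceed the cap
        delay = min(cap, base * (2 ** attempt))
        if jitter:
            # Full jitter spreads out retries so many clients don't retry in lockstep
            delay = random.uniform(0, delay)
        delays.append(delay)
    return delays
```

Capping the delay keeps a long outage from producing hour-long waits, and enabling jitter prevents a fleet of ingestion jobs from all retrying at the same instant.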

This brings several questions to light: 

1. Does it have to be so hard to get at your own data if it’s already in the cloud? 

2. Why do application vendors charge so much to store your data within their app, even though it’s being stored in AWS on the backend anyway?

3. Why is it so complicated to move that data to your own cloud or data lake?

The answers? 1. No. 2. Because they can. 3. It doesn’t have to be.  

Bring Your Own Storage (BYOS)

The point of going cloud-first is to have an agile, cost-effective, and connected infrastructure that enterprises can easily use for business advantage. But if you don’t own and have complete control over your data that’s in those third-party cloud applications, it’s hard to reap those benefits. 

How can you own it? By bringing your own cloud storage to the party. In other words, backing up and archiving data out of the vendor’s cloud-based app directly into your own AWS, GCP, or Azure infrastructure – and into your own data ecosystem.
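In practice, that archival step is often a small job that lands each day's exported records in your own bucket. A hedged sketch, assuming a boto3-style S3 client (the function name and date-partitioned key layout are this example's own, not a standard):

```python
import datetime
import json


def archive_records(s3_client, bucket, app_name, records):
    """Archive exported SaaS records into a date-partitioned object in your own bucket."""
    # A key like "salesforce/2021/03/15/records.json" keeps archives browsable
    # and easy to manage with lifecycle rules
    key = f"{app_name}/{datetime.date.today():%Y/%m/%d}/records.json"
    body = json.dumps(records).encode("utf-8")
    s3_client.put_object(Bucket=bucket, Key=key, Body=body)
    return key
```

Any object exposing a boto3-style `put_object(Bucket=..., Key=..., Body=...)` method works here, which also makes the helper easy to test without AWS credentials; the same idea applies to GCP or Azure object storage with their respective clients.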

First, there’s a big tactical benefit: cost. Companies that bring their own cloud often reduce storage costs by up to 50%. That’s nothing to sneeze at, especially when you consider that data volume is growing by 40% every year, according to ESG. Best to keep it in a place that minimizes the expense.
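Those two figures compound against each other, which a quick back-of-the-envelope calculation makes concrete. The helper below is illustrative: the 40% growth rate and 50% discount are the article's figures, while the volumes and price are made up for the example.

```python
def projected_storage_cost(tb_today, price_per_tb, years,
                           growth_rate=0.40, byos_discount=0.50):
    """Rough projected storage bill after `years` of compounding data growth."""
    # Data volume compounds at ~40%/year (ESG's figure);
    # bringing your own storage cuts the resulting bill by up to 50%
    tb_future = tb_today * (1 + growth_rate) ** years
    return tb_future * price_per_tb * (1 - byos_discount)


# e.g. 100 TB today at a notional $20/TB per month, two years out:
# projected_storage_cost(100, 20.0, 2) is roughly half of the ~$3,920
# you'd pay without the BYOS discount
```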

Then there are the huge strategic benefits. When your data is in your own environment, it’s much more readily accessible. You can use cloud-native tools that plug into where the data already resides, easily incorporating the cloud application data into your organization’s DataOps ecosystem. This can set your organization up for better insights, higher customer retention, and greater revenue growth. 

You also help reduce risk. Having too many copies of data in different places – especially if it’s sensitive consumer or patient data – can put you in violation of data privacy regulations, such as the GDPR’s right to be forgotten. 

As you continue your journey of establishing intelligent, cost-effective infrastructure and building out a cloud-first strategy, strive for solutions that increase your flexibility, ownership, and optionality. When your business-critical SaaS data can be easily moved and accessed, your company can learn more, adapt faster, and open up a whole new world of possibilities.
