
Moving to the Cloud – One Valuable Step at a Time

By Neil Barton  /  February 2, 2018


If you’ve ever caught a little league game filled with eight-year-olds playing baseball, chances are you’ve seen a certain type of parent on the sideline. Not content with simple encouragement, they bellow specific instructions to their child to do things that, frankly, even the world’s greatest players might struggle with. Their sense of the gap between where their child’s skills currently lie and where they want them to be is wildly off.

To continue with this analogy, it’s a phenomenon that organizations experience when they begin the move to the Cloud. They’re so enamored with the potential the Cloud offers that they are blinded to the reality of where their current infrastructure really is. Their desire to ‘get to the Cloud’ immediately is understandable. However, in most cases – like the parent who wants their child to play like Jeter – it is wholly unrealistic.

Quite simply, a combination of available skills, investment and ingrained processes means that moving to the Cloud overnight may not be realistic, practical or even desirable. While a complete migration to the Cloud remains the end goal, the ability to start smaller, pick a first project to migrate, and successfully work in a hybrid environment for the time being is crucial to continuing to meet the needs of the business. One key metric that determines IT success in doing so is Time to Value (TTV).

So, as you start your transition to the Cloud, how do you preserve and improve the all-critical Time to Value at every step along the way? You revamp your processes to take advantage of new technologies while also leveraging Agile Data Warehousing best practices that your existing data infrastructure environment and historical approach may have held you back from implementing. Okay, I hear you say, I get the theory, but how do I turn it into practice?

Around 20 years ago, when Data Warehousing first became a ‘thing,’ it was heralded as the means to transform how we do business. Data would be managed efficiently, insights would flow and businesses would benefit exponentially. Except that the logistics of building and managing a workable data warehouse were far more complex than we first hoped. The result? The power of data resided with a few specialists, its extraction in any meaningful form was slow, and the resulting political struggles actually slowed Time to Value rather than accelerating it.

Even more frustratingly, on-premises Data Warehouses were hugely expensive and inhibited agility. Building a Data Warehouse took years and cost millions, meaning only the wealthiest companies could afford one. Companies had to estimate how much storage and compute power they would need three years in advance and purchase capacity for peak workloads (e.g., end-of-month processing, nightly ELT processing), which would then sit underutilized or idle for the bulk of the time. Buy too little and you ran out of space and lost the ability to do the job; buy too much and you wasted huge amounts of money on unused processing capacity.

The Cloud has changed this dynamic by allowing organizations to pay only for what they need. Cloud-based infrastructure enables you to run a special project for a few months, or perhaps a proof-of-concept trial, and simply fire up the necessary capacity for the duration of the project. Once it is complete, you can instantly scale back down. And when you don’t have to include new hardware costs in a particular project pitch, it is far easier to make a compelling business case.

On an ongoing operational basis, elastic computing – the ability to scale up and down as workload demands – can offer your company valuable advantages. Cloud Data Platforms such as Snowflake provide this level of elastic compute flexibility to keep your costs in line with your actual need.
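To make the peak-provisioning argument concrete, here is a minimal back-of-the-envelope sketch. All figures are invented for illustration – a workload that needs 10 compute units for a nightly four-hour ELT window but only 2 units the rest of the day – and the `fixed_cost`/`elastic_cost` helpers are hypothetical, not any vendor’s pricing API:

```python
# Hypothetical comparison of fixed peak provisioning versus elastic
# pay-per-use compute. All numbers are illustrative only.

HOURS_PER_MONTH = 730


def fixed_cost(peak_units: int, unit_hour_rate: float) -> float:
    """Capacity bought for the peak load is billed around the clock."""
    return peak_units * unit_hour_rate * HOURS_PER_MONTH


def elastic_cost(hourly_demand: list, unit_hour_rate: float) -> float:
    """Pay only for the units actually consumed each hour."""
    return sum(hourly_demand) * unit_hour_rate


# Illustrative workload: 10 units for a nightly 4-hour ELT window,
# 2 units for the remaining 20 hours, over a 30-day month.
demand = ([10] * 4 + [2] * 20) * 30  # 30 days of hourly demand
rate = 1.0                           # hypothetical $ per unit-hour

print(f"Fixed (peak-provisioned): ${fixed_cost(10, rate):,.0f}")   # $7,300
print(f"Elastic (pay-per-use):    ${elastic_cost(demand, rate):,.0f}")  # $2,400
```

Even in this toy scenario, paying only for hours actually consumed costs roughly a third of provisioning permanently for the nightly peak – which is the dynamic the Cloud pricing model exploits.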

Scalability and elasticity are only a couple of the enticing advantages that Cloud Data Platforms offer companies. But from a holistic standpoint, and despite these advantages, IT still has a lot of work to do to build and manage a data warehouse that will reside on the platform. Without automating the design, development, deployment and operational aspects of your organization’s Data Warehouse, you aren’t positioned to leverage the benefits of Cloud Data Platforms quickly or show business leaders any significant improvement in Time to Value.

Automation software supplements your IT development resources – from initial design through development and operation – to speed up Data Warehouse delivery by 80 percent, while reducing cost and project risk. Companies using automation are seeing developer productivity increase five-fold, as developers spend more time working with business users to deliver new business value faster and let the automation software do the routine, tedious and time-consuming work of generating code and documentation. Not only are IT teams delivering faster Time to Value, they are seeing stronger code reliability and consistency, which better positions IT to respond to future business needs. This is largely due to the automation software’s use of built-in Agile Data Warehousing best practices and industry standards, and its high degree of optimization for the data platform in use.

As you move to the Cloud, Data Warehouse automation software that is built for a Cloud-native platform like Snowflake, Amazon Redshift or Azure SQL Data Warehouse is a must. But just as important – given you may find yourself with one foot in the on-premises world and one foot in the Cloud for quite a while – is the software’s ability to help you manage hybrid environments seamlessly and treat them as a single logical Data Warehouse. This will allow you to manage your current data infrastructure more effectively, manage the transition over time, consume as many of the Cloud Data Warehousing advantages as early as possible, and maintain a consistent overview of your infrastructure regardless of where the data lives.

Automation software fitting these criteria will ensure your IT organization’s ability not only to preserve, but to greatly accelerate, Time to Value as your company makes the great Cloud migration. And unlike the parent who wants their child to be the next Jeter, that’s a realistic goal worth aiming for.

About the author

Neil Barton is the Chief Technology Officer for WhereScape, the leading provider of data infrastructure automation software, where he leads the long-term architecture and technology vision for the company's software products. Barton has held a variety of roles over the past 20 years, including positions at Oracle Australia and Sequent Computer Systems, focused on Software Architecture, Data Warehousing and Business Intelligence. Barton is a co-inventor of three US patents related to Business Intelligence software solutions.
