We recently read reports of plans for Talend to be acquired by Thoma Bravo, a private equity investment firm. The announcement is interesting, and it prompts some of us in the tech industry to step back and consider the many factors involved in providing data technology solutions for organizations – and where we see the industry headed.
To start, I believe this announcement can be viewed from one perspective as an indicator of the strength of the market for Data Management, cloud, and data infrastructure, and just how crucial data is to running a business.
Having said that, I do wonder whether corporate consolidation or independent companies will be the wave of the future in supporting the data ecosystem. Another important question to contemplate is how organizations should manage their data as they look to migrate to the cloud: with a platform approach or a makeshift cloud offering? And what type of provider is best equipped to deliver these solutions?
When I see acquisitions occur, my first question is always: how does this impact the customer? As a CEO, I make the customer my biggest and most urgent priority of every workday. Yes, we care about profits just like any company, but I believe it all starts with being customer-centric … servicing the customer, developing new innovations for the customer, and providing leadership in the channel for the customer. When we make an important decision, such as a new technology development, we start by asking ourselves, “How will this initiative provide greater value, efficiency, and return to the customer?”
For some time, businesses were being told that data warehouses, data lakes, or data marts were not needed. Industry luminaries argued that building a data warehouse took too long and cost too much. Many insisted that businesses didn’t need a middle layer and would be fine with self-service and supportive front-ends. Everyone should know by now that this belief was a mistake.
I’ll digress just a bit, as I do agree that classical data warehousing cost too much, took too long, and required too many resources. However, the answer to this challenge should never have been to simply not build a data warehouse. Consider architects in another field: designing and building a bridge is complicated as well, and can take a lot of time and substantial resources, but people need efficient modes of transportation, so architects found inventive ways to build bridges faster and cheaper. The same is true here, as the solution for having a place to store, manage, and access data has always been to get smarter and more innovative. For this reason, I am a proponent of an automated, data-estate-platform approach.
Data is a precious commodity – the market for all things data is scorching hot and should remain so for years to come. However, software providers need to keep moving at a rapid pace to provide technology that keeps up with ever-evolving needs. Scratch that – providers shouldn’t merely follow the needs customers express but should be ahead of the curve, so that the technology is ready and available as customers reach out for help – or, even better, should lead by developing pioneering breakthroughs and bringing them to market.
Recently, we’ve heard a lot about “cloud data integration” as organizations look to unify the cloud and on-premises applications and systems they use to run their business into one cloud-based environment. Ultimately, cloud data integration tasks move data from cloud SaaS applications into analytics databases in the cloud, supported by various tools, connectors, and adapters. My concern when it comes to analytics data is that this can be complicated: getting data out of a SaaS application requires APIs, which means coding. When we rely on an API, we have no control over upgrades that take place on the back end of our database, ERP, or CRM. We change our apps and systems, but do we change the API? Complicating matters even more, these connectors don’t naturally work with each other. Take, for instance, wanting to connect Power BI and discovering that you need to obtain integration software to get to your data. All this translates into greater complexity, cost, time, and security concerns. This leads to another very important point.
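To make the coding burden concrete, here is a minimal sketch of the kind of hand-built pipeline described above. Everything specific in it is an assumption for illustration: the SaaS payload is stubbed out (a real integration would call the vendor’s REST API), the field names are invented, and SQLite stands in for the cloud analytics database.

```python
import sqlite3
import json

def fetch_orders():
    # Stub standing in for a SaaS REST call. In a real integration this
    # payload shape is controlled by the vendor and can change without
    # notice when the SaaS back end is upgraded.
    return json.loads('[{"id": 1, "total": "19.99"}, {"id": 2, "total": "5.00"}]')

def load_orders(conn, records):
    # Hand-coded schema and type mapping: every field name and type here
    # is a silent assumption about the API's current contract.
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, total REAL)")
    conn.executemany(
        "INSERT OR REPLACE INTO orders (id, total) VALUES (?, ?)",
        [(r["id"], float(r["total"])) for r in records],  # breaks if a field is renamed
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load_orders(conn, fetch_orders())
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # → 2
```

Every assumption baked into this script – the endpoint, the field names, the types – is owned by the SaaS vendor, which is why a back-end upgrade can silently break a hand-coded connector.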
There are some impressive brands in our market space, with a host of software solutions and extensive portfolios related to data. So let’s ask another question: is this really where the industry should be headed to serve customers? In some cases, successfully implementing some of these comprehensive software solutions can require numerous tools to build and deploy platforms. These tools can be siloed: you might need one tool for data cataloging, another for Data Quality, another for data cleansing, and on and on. This is all quite problematic.
Take, for instance, when the data professionals and IT department are done using these tools to build the data warehouse/lake/mart, and then they look back some time later and discover that some of the data sources are not linked. They then have to determine if they can continue using the same tools, or if they have to learn a new set of tools. Are the tools updated with the latest revision? Do they have the right staff in place to use the tools? This can be overwhelming, as these tools don’t naturally work with each other.
You’re left with software that now looks very complex and an enterprise that runs inefficiently. If a customer has all this great software but can’t utilize it, they’re looking at expenditures that aren’t put to good use, while at the same time straining to meet compliance, privacy, and security mandates.
At this point, you realize that you lack resources and skilled people, and your infrastructure is lagging behind. One of the biggest challenges in the technology industry is finding skilled people – in this instance, professionals who know how to use all the various tools. If they need training, how long will that take? If a customer relies on consultants, when will those consultants train on the new tools, given that their profession is predicated on billable hours? All this means your salary costs to recruit skilled staff can go through the roof. Bottom line: you have a big challenge on your hands. The reality is a bigger problem than most understand upfront.
With that, the question should be: how do we simplify? Can we automate? Can we use one cohesive tool set to build and manage the data system? The answer is yes on all counts. In fact, the answer is so obvious, yet we need to break through the collective mindset of the old way being the right way. We need to change the mindset of building the “perfect data warehouse” (which often leads to the excessive labor and time costs noted earlier), even though it might not meet the needs and speed of your business.
What’s more logical to overcome all of these obstacles is a cohesive Data Management platform like the automated data estate mentioned earlier. This type of platform meets modern and future needs, utilizes a single, unified tool set rather than multiple tools, and relies on Data Modeling instead of time-consuming, manual coding. With a single tool set, you can build all data components and define and fully document from the first entry throughout all data layers.
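As a toy illustration of what model-driven automation means in practice, the sketch below declares tables once as metadata and generates both the DDL and the documentation from that single model. The model format is invented for this example and is not any vendor’s actual syntax.

```python
# Metadata-driven automation in miniature: entities are declared once as
# data, and artifacts for every layer are generated from that one model
# rather than hand-coded per tool. The format below is invented for
# illustration only.
MODEL = {
    "customers": {"id": "INTEGER PRIMARY KEY", "name": "TEXT", "region": "TEXT"},
    "orders": {"id": "INTEGER PRIMARY KEY", "customer_id": "INTEGER", "total": "REAL"},
}

def generate_ddl(model):
    """Emit one CREATE TABLE statement per declared entity."""
    return [
        f"CREATE TABLE {table} ({', '.join(f'{col} {typ}' for col, typ in cols.items())})"
        for table, cols in model.items()
    ]

def generate_docs(model):
    """The same model doubles as documentation, so docs never drift from DDL."""
    return {table: sorted(cols) for table, cols in model.items()}

for stmt in generate_ddl(MODEL):
    print(stmt)
```

Because the DDL and the documentation are derived from the same model, they cannot drift apart – which is the kind of first-entry-to-final-layer documentation a unified tool set makes possible.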
With the power of automation and no coding, this method enables the platform to mostly run on its own. This is the present and the future. History has shown us time and again that automation can modernize processes. This approach is smooth, easy, and just plain smart. And more importantly, it will make that all-important customer very happy.