Why Data Analysts End Up Playing Data Detective

By Martin Brunthaler

Many data analysts are getting a raw deal. For all the optimism around cloud-based systems promising to make Data Management easier, analysts often wind up playing detective – battling through huge information stores on the hunt for useful data, instead of running analysis.

For companies, the disillusionment driven by burdening valuable (and rare) specialists with manual processing is obviously concerning. But it’s also crucial to recognize the risk it creates of misdirecting business decisions and strategies. Our research has shown four in 10 analysts (41%) have little trust in the data they handle, meaning actions based on their reports and recommendations are at best ineffective or, at worst, harmful to organizational success.

This blend of dissatisfaction and inefficiency only stands to cause more damage for both analysts and businesses in the long run. As addressing these issues becomes increasingly urgent, it’s crucial to start looking closer at key barriers stopping analysts from doing their jobs, and how they can be removed.

Heads Stuck in the Clouds

Few data analysts need reminding about the cloud’s soaring popularity. According to BARC’s 2023 business intelligence trends report, cloud for data analytics has quickly risen through the rankings: moving from a 4.3 rating in 2019 to 5.4 today, now sitting inside the top ten. And to a point, this eagerness is justified.

Developments in off-premises cloud technology have presented new possibilities for large-scale data capture, assessment, and activation that equates to less strain on resources. It’s the fabled dream of getting more for less. Lured by the prospect of gaining greater capabilities while simultaneously dialing back internal setups – including data centers, ETL (extract, transform, and load) systems, and databases – many analysts have put their faith in cloud-based solutions as a passport to faster processing, slicker data analysis, and higher maturity.

But enthusiasm seems to have driven a dangerous assumption that outsourced data hosting and management will be intrinsically more efficient. In truth, the central orchestration work – collecting, cleansing, interrogating, and making data usable – is the same wherever information is housed. As a result, analysts who mistakenly believe cloud tools can shoulder all the heavy lifting of upscaled data efforts will likely find their hopes dashed – facing a heavier workload that leaves them even less able to focus on deeper, valuable analysis.

Underestimating Foundational Essentials  

Overestimation of the benefits new tools bring isn’t helped by reliance on outdated processes and gaps in critical infrastructure. Only last year, our research revealed just 41% of analysts have access to a unified data store, while 58% routinely build reports in spreadsheets.

Lingering legacy practices and problems don’t make the best use of data analysts’ skills, or their time. Already struggling to manage fast-swelling data volumes, analysts frequently fill their days with labor-intensive organization and administration – work that is not only wasteful, but also increases the probability of human error creeping in, even for data specialists. Moreover, analysts are well aware of this risk: 63% of those who name manual wrangling as their biggest challenge admit to low trust in data, compared with 15% of those who don’t.

The main reason these issues persist is a well-worn classic: focusing on building out and upgrading data stacks, instead of getting core data operations in working order, as illustrated by this year’s leading BARC trends. While modernizing data warehouses falls within the top five trends, the practical measures required to support such goals, and generally smooth data flow, rank much further down. This includes embracing integrated analytics platforms – at number 12 – despite the fact that most analysts see data integration as business critical.

Clearly, data analysts typically understand the basics necessary for their roles, particularly data unification and streamlining intensive processes. Amid the drive to continually move forward, however, it looks like a high proportion are skipping over core essentials. It’s therefore no surprise that as vital steps keep being missed, improving data quality and master data management remains difficult.

Enabling Hassle-Free Analysis

The ability to easily tap relevant data is arguably integral to every role in a modern business. But for analysts who are meant to be premium data wielders, it’s imperative. Experts hired specifically for the purpose of analyzing data and providing actionable insights shouldn’t have to repeatedly find new ways to collect data from emerging sources, pull together siloed data sets before they can get to work, or spend hours generating manual reports in Excel.

What’s needed to help analysts fulfill their potential, and capacity, is a minimum threshold for Data Management, specifically: systematic collation of ever-refreshing data that’s effectively visualized to offer the best foundation for further evaluation and smarter decision-making.

Although setups for individual companies will vary, the two most critical components are strong data culture and automated tools that enable mature analysis. More often than not, one begets the other. Automated integration creates a cohesive view of data that allows analysts to focus on generating precise, granular insights for multiple teams at speed, thereby boosting company-wide confidence in the value of data, and eagerness to accelerate usage.
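As a minimal sketch of what that cohesive view might look like in practice, the snippet below merges rows from two hypothetical siloed sources on a shared key. All source names, field names, and values here are illustrative assumptions, not a specific vendor’s data model:

```python
from collections import defaultdict

# Hypothetical extracts from two siloed tools: an ad platform and a
# web analytics tool. Field names and values are illustrative only.
ad_spend = [
    {"date": "2023-05-01", "campaign": "spring", "spend": 120.0},
    {"date": "2023-05-02", "campaign": "spring", "spend": 95.5},
]
web_sessions = [
    {"date": "2023-05-01", "campaign": "spring", "sessions": 340},
    {"date": "2023-05-02", "campaign": "spring", "sessions": 290},
]

def integrate(*sources):
    """Merge rows from each source on (date, campaign) into one view."""
    merged = defaultdict(dict)
    for source in sources:
        for row in source:
            key = (row["date"], row["campaign"])
            merged[key].update(row)
    return [merged[key] for key in sorted(merged)]

unified = integrate(ad_spend, web_sessions)
# Each unified row now carries both spend and sessions side by side,
# so an analyst can compute, e.g., cost per session directly.
```

In a real pipeline this merging is done by an integration platform rather than hand-written code, but the principle is the same: once rows from different tools share keys and live in one view, the analyst starts from analysis-ready data instead of stitching silos together.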

As business intelligence solutions evolve, there is also greater scope for analysts to harness a sophisticated range of tools that not only cultivate robust data workflow, but also offer additional time-saving capabilities. For example, the growing availability of features such as data schema mapping is making it simpler to create comprehensive, analysis-ready data structures by instantly matching metrics and dimensions across different tools.
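The idea behind schema mapping can be sketched as a simple renaming layer. The tool names and field names below are assumptions for illustration – real connectors expose their own schemas – but the mechanism is the same: translate each source’s metrics and dimensions into one canonical vocabulary:

```python
# Hypothetical mapping from tool-specific field names to one canonical
# schema; these names are assumptions, not any vendor's real API.
SCHEMA_MAP = {
    "facebook_ads": {"adset_name": "campaign", "amount_spent": "spend"},
    "google_ads":   {"campaign_name": "campaign", "cost": "spend"},
}

def to_canonical(source, row):
    """Rename a row's fields to the canonical metric/dimension names."""
    mapping = SCHEMA_MAP[source]
    # Fields without a mapping keep their original name.
    return {mapping.get(field, field): value for field, value in row.items()}

row = to_canonical("google_ads", {"campaign_name": "spring", "cost": 95.5})
# row == {"campaign": "spring", "spend": 95.5}
```

Once every source speaks the same schema, metrics like spend can be compared or summed across tools without per-source special cases.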

Similarly, other forms of automation are supporting better, machine-assisted efficiency. As well as applying scheduling to control how data is fetched – including setting the desired frequency and time range – analysts can leverage continuous monitoring to keep careful watch over how long it takes to retrieve data and determine where processes should be optimized.
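A minimal sketch of that monitoring idea: wrap each fetch job so its retrieval time is logged, then flag slow jobs for optimization. The job name and threshold below are invented for illustration; in practice a scheduler (cron, an orchestrator, or the platform itself) would trigger such jobs at the configured frequency:

```python
import time

def timed_fetch(fetch, log):
    """Run a fetch job and record how long retrieval took."""
    start = time.monotonic()
    rows = fetch()
    log.append({"job": fetch.__name__, "seconds": time.monotonic() - start})
    return rows

# Hypothetical fetch job standing in for a real data connector.
def fetch_daily_orders():
    time.sleep(0.01)  # simulate network latency
    return [{"order_id": 1}, {"order_id": 2}]

timings = []
orders = timed_fetch(fetch_daily_orders, timings)

# Flag any job whose retrieval exceeds an (assumed) 5-second budget.
slow_jobs = [entry for entry in timings if entry["seconds"] > 5.0]
```

Collected over time, such timing logs show which sources are slowing the pipeline down – exactly the signal needed to decide where optimization effort should go.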

As our own research illustrates, solid Data Management groundwork yields many rewards beyond enhancing data usability, paving the way for successful implementation of advanced technologies and practices. Of the analysts who have a centralized data lake, for example, 69% are currently using predictive analytics, compared with 27% of those who don’t.

The source of current data woes isn’t localized. In part, challenges are driven by companies hiring data analysts and asking them to follow clunky, error-prone procedures that force them to play detective and solve problems outside their role. Many analysts have also exacerbated their own challenges by chasing cloud dreams and putting too little emphasis on baseline integration. The best route to tackling this mix of hurdles is defining how structures can be reformatted to bolster usability and enable analysts to finally do what they signed up for.