From Minutes to Microseconds: How Can Businesses Speed up Real-Time Analytics?

By Eric Raab

For many enterprises, conversations about data strategies have shifted from big data to fast data. The ability to understand and act on real-time data insights – often enriched with context from historical data – has become a critical differentiator for firms in almost every industry sector globally.

From predictive maintenance within the manufacturing sector to enhancing the performance of racing cars on the track, and a myriad of use cases in between, the business case for real-time data and analytics has been made. But what does “real-time” actually mean? How fast is fast enough when it comes to extracting insights from data at speed?

It may depend on who you ask. Recent research has revealed that while most businesses agree that real-time data and analytics should continue to be an important area of focus, there is a large variance around what “real-time” means. Only one-third of organizations define “real-time” as a second or faster, and nearly half believe “real-time” to mean anything up to an hour or longer – even up to a few days!

What these findings tell us is that many of today’s businesses are simply thinking too slowly, and that they could be missing out on the full value of their data by not executing real-time analytics at speed. A mere shift from minutes to microseconds could be a game changer, but this depends on having the right culture and capabilities to do so.
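
Before chasing microseconds, it helps to know where you stand today. As a minimal sketch (in Python, with run_query as a hypothetical stand-in for your own analytics call), you can measure end-to-end latency in microseconds and see where on the real-time spectrum you actually sit:

    import statistics
    import time

    def run_query():
        # Hypothetical stand-in for a real-time analytics call.
        return sum(x * x for x in range(1_000))

    # Time repeated calls and report latency in microseconds.
    samples = []
    for _ in range(1_000):
        start = time.perf_counter_ns()
        run_query()
        samples.append((time.perf_counter_ns() - start) / 1_000)  # ns -> µs

    print(f"median latency: {statistics.median(samples):.1f} µs")
    print(f"p99 latency:    {statistics.quantiles(samples, n=100)[98]:.1f} µs")

If the numbers come back in minutes rather than milliseconds or microseconds, that gap is the opportunity the rest of this article is about.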

Let’s take a closer look at what an operating model for microsecond-level analytics could look like.

1. Determine if continuous intelligence is right for you

Who wouldn’t want the ability to make faster, smarter decisions? While this may feel like the obvious path forward for any business, the truth is that not all businesses need to operate at the sub-second level. Determining whether or not your business is a good candidate for accelerated decision-making is a good place to start. Businesses should ask themselves the following questions:

  • How valuable is the data that flows within our business, and is it mission-critical?
  • What is the rate at which that value diminishes, and how does this affect us? (One illustrative way to put numbers on this is sketched just after this list.)
  • Do we have a strong data-led culture, with the right tools and processes in place to ensure our people and applications can extract full value from data?
  • Do we have a clear idea of what success looks like? For instance, will faster real-time analytics give us a measurable competitive advantage, such as having richer and more contextual data, or the ability to create faster or more innovative product/service delivery?
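
On the second question, one illustrative way to reason about diminishing data value is a simple half-life model. Everything below is a hypothetical sketch in Python; the dollar figure and half-life are placeholders to be calibrated against your own business:

    def data_value(initial_value, half_life_s, age_s):
        # Assumed exponential decay: value halves every half_life_s seconds.
        return initial_value * 0.5 ** (age_s / half_life_s)

    # Example: an insight worth $100 whose value halves every 10 seconds.
    for age_s in (0.000001, 1, 10, 60, 3600):
        print(f"acted on after {age_s:>10} s -> worth ${data_value(100, 10, age_s):.2f}")

If an insight’s half-life is measured in seconds, analytics delivered in minutes or hours arrives after most of the value has evaporated – which is the practical meaning of “thinking too slowly.”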

Having a clear upfront understanding of the data landscape, of how well the business supports the right technologies and people with the right skills, and of what impact faster data could have is a critical first step.

2. Understand your data

Businesses must have a clear picture of the current data environment before they can attempt to shorten decision-making windows and significantly improve operations. But when you’re working with petabytes or even zettabytes of data, getting a handle on where that data resides, what format it’s in, and which applications are using it is a key challenge.

Conducting an audit of the data landscape will reveal the various types of datasets that exist within an organization, so that they can be brought together. With so many different types of datasets – from data generated internally and externally, to streaming data and data at rest, and structured and unstructured data – having a capability to deal with multiple formats is important. This should include the ability to capture and manage historical data, as real-time analytics is most powerful when data created “in the moment” is immediately placed within the context of what a business knows already.
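
To make that concrete, here is a minimal sketch in plain Python (field names and values are hypothetical) that normalizes a streaming JSON event and a CSV record at rest into one common shape, then enriches the live reading with historical context, in this case a simple per-device baseline:

    import csv
    import io
    import json
    from statistics import mean

    def normalize(record):
        # Map differently shaped records onto one common schema.
        return {
            "device": record.get("device") or record.get("device_id"),
            "temp_c": float(record.get("temp_c") or record.get("temperature")),
        }

    # Historical data at rest (CSV) and a live streaming event (JSON).
    history_csv = "device_id,temperature\npump-1,61.0\npump-1,63.5\npump-1,62.2\n"
    live_event = '{"device": "pump-1", "temp_c": 71.4}'

    history = [normalize(row) for row in csv.DictReader(io.StringIO(history_csv))]
    event = normalize(json.loads(live_event))

    # Place the in-the-moment reading in the context of what we already know.
    baseline = mean(r["temp_c"] for r in history if r["device"] == event["device"])
    print(f"{event['device']}: live {event['temp_c']}°C vs baseline {baseline:.1f}°C")

The details will differ with a real streaming platform or historical database, but the pattern is the same: one schema, many sources, context attached on arrival.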

3. Think faster and smarter

When processes and systems are continuously fed by real-time data (enriched by the context of historic data), teams can focus their time on incidents that need investigation – even predicting them – rather than looking at every alert that appears.

Take, for example, temperature data from a sensor embedded in a machine. Understanding that data in real time is useful for checking that the machine is operating efficiently or that a temperature threshold hasn’t been breached. But when you add historic data, mapped over many days and months, you not only get a richer understanding of how a machine is performing; you can also build machine performance profiles to understand when problems are likely to occur and take action in advance.
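
A minimal sketch of that idea in Python (readings and thresholds are hypothetical) shows the difference: a hard threshold only catches the obvious failure, while a baseline learned from historical readings flags the drift well before the threshold is reached:

    from collections import deque
    from statistics import mean, stdev

    HARD_LIMIT_C = 90.0           # hypothetical absolute temperature threshold
    history = deque(maxlen=1000)  # rolling window of historical readings

    def check(reading_c):
        alerts = []
        if reading_c >= HARD_LIMIT_C:
            alerts.append("threshold breached")
        # With enough history, flag readings that drift from the machine's
        # own learned profile, long before the hard limit is reached.
        if len(history) >= 30:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(reading_c - mu) > 3 * sigma:
                alerts.append(f"anomalous vs profile (baseline {mu:.1f}°C)")
        history.append(reading_c)
        return alerts

    # Simulated feed: stable around 60°C, then a gradual drift upward.
    readings = [60.0 + 0.1 * (i % 5) for i in range(100)] + [65.0, 68.0, 72.0, 80.0, 91.0]
    for t, temp in enumerate(readings):
        for alert in check(temp):
            print(f"reading {t}: {temp:.1f}°C -> {alert}")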

Businesses also should challenge teams to apply continuous testing and learning. For example, by pairing real-time analytics with machine learning technologies, these systems can grow in sophistication, getting smarter and faster with each iteration. Analysts and scientists can then build new models and innovate quickly, working within a framework that empowers them to use data in evolved, smarter ways. It’s the bedrock for iteratively building capabilities, adding new data pipelines, and using the technology to add value over time. Sandbox environments allow data teams to build these models without affecting critical systems.
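
As a hedged illustration of that loop, the sketch below (plain Python, all names hypothetical) implements a tiny online model that takes one gradient step per event – the kind of per-observation iteration that pairing real-time feeds with machine learning makes possible:

    class OnlineLinearModel:
        # Tiny online learner: one stochastic-gradient step per event.
        def __init__(self, lr=0.1):
            self.lr = lr
            self.w = 0.0
            self.b = 0.0

        def predict(self, x):
            return self.w * x + self.b

        def update(self, x, y):
            # One gradient step on squared error; the model gets a
            # little smarter with every observation it sees.
            error = self.predict(x) - y
            self.w -= self.lr * error * x
            self.b -= self.lr * error
            return abs(error)

    # Hypothetical stream of (load, temperature) pairs following y = 2x + 5.
    model = OnlineLinearModel()
    stream = [(i / 100, 2 * (i / 100) + 5) for i in range(200)]
    for i, (x, y) in enumerate(stream):
        err = model.update(x, y)
        if i % 50 == 0:
            print(f"event {i:3d}: abs error {err:.3f}")

Run against a sandboxed copy of the stream, a loop like this lets data teams test and learn continuously without touching critical systems.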

4. Anticipate challenges

Adopting a new real-time analytics methodology is not always straightforward. Challenges and pushback could come from anywhere, with likely issues around extreme data complexity, introducing a new system into an already full data software stack, and creating a new skills requirement within the business at a time when new IT skills are not always easy to come by.

Consider how real-time analytics solutions might slot into the existing Data Management environment. Ideally, it should be compatible with the major cloud platforms and computing architectures, interoperable with popular programming languages, and flexible in terms of deployment method – depending on preferences for running on-premises, in the cloud, or in a hybrid model.

Businesses should also consider total cost of ownership (TCO) and the impact that a new deployment could have on costs. A low memory footprint and the ability to run on commodity hardware are important considerations, especially for the Internet of Things (IoT) and other scenarios where analytics at or near the edge requires software to run on devices that are unlikely to have significant compute power.

Ongoing maintenance and operational costs are other factors to account for, along with the level of professional services that are available to support the analysis, remediation, and migration of data. Businesses may also want to look at the experience that exists within the organization, to see if the appropriate skill sets exist or whether training and hiring policies need to be updated.

Achieving a Microsecond Mindset

The potential to bring real-time analytics even closer to true “real time” can be game-changing, empowering businesses across industries to make faster, smarter, and more valuable decisions. Getting started relies on realizing what value and success look like for your business; setting your data up to succeed with real-time analytics, including putting capabilities in place that effectively bring different types of datasets together; and figuring out how you’ll deal with inevitable hurdles and setbacks. Each of these steps should put you on the path to a microsecond mindset, and all the commercial and operational benefits that come with it.
