
Data and Analytics: The Pandemic Lessons

By Gary Bhattacharjee

Since the onset of the pandemic, when the world plunged into a state of suspended animation, “touchless business” has become more than a boardroom buzzword; it has become a survival mechanism. The digitization of customer interactions accelerated by roughly three years, while enterprises’ speed of response to change, such as increasing the use of technology in operations, multiplied by a factor of 25. 

The need to decide and act quickly in response to immediate and constant change – from lockdowns to supply chain disruptions – spiked demand for data and analytics (DNA) solutions. In a recent survey of data and analytics leaders in North America, three-fourths of respondents said demand for those services had increased within their organizations, and roughly half said the increase was substantial. Fortunately, the widespread adoption of data and analytics over the last few quarters has yielded significant lessons along the way. 

Here are a few lessons to consider.

Data-on-Cloud Is Table Stakes

The need to work from home and interact remotely with customers pushed most enterprises to migrate workloads to the cloud, which is where they are likely to stay. Governments, which faced the additional challenge of gathering, analyzing, and disseminating massive amounts of data concerning the virus, public health, and subsequently, vaccination, also moved to the cloud. Their use of the cloud to drive innovation – for example, building a cloud-based health management system that connects and monitors patients through sensors and wearable devices – serves as an important example for the industry. Cloud migration has also proven cost-effective for enterprises. 

That the cloud is integral to the enterprise data journey through this decade is no longer debatable – it’s one of the enduring lessons of 2021. 

Leverage the “World-Wide-Data” Ecosystem

Despite having access to rich external sources of information, businesses have always relied more on internal data for analysis. But once the pandemic started, external events reigned supreme, causing uncertainty and disruption on an unprecedented scale. With past data no longer useful for predicting the future, enterprises tuned in to external signals, from health information to regulatory restrictions to supply chain realignments. The crisis also highlighted how critical it is to verify and validate data, corroborating it across multiple sources, before making decisions with a large human impact. 
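
As a minimal illustration of that corroboration step, the sketch below cross-checks a single metric reported by several hypothetical feeds and only passes it downstream when the sources broadly agree. The source names, tolerance, and figures are illustrative assumptions, not any specific product or dataset.

```python
# A minimal sketch of cross-source corroboration before acting on a signal.
# Source names, tolerance, and figures below are illustrative assumptions.
from statistics import median

def corroborate(readings: dict[str, float], tolerance: float = 0.15) -> bool:
    """Return True only if every source agrees with the consensus (median)
    within the given relative tolerance; otherwise the signal needs review."""
    consensus = median(readings.values())
    return all(abs(value - consensus) <= tolerance * abs(consensus)
               for value in readings.values())

# Example: weekly case-rate estimates (per 100k) from three hypothetical feeds.
signal = {"public_health_api": 212.0, "hospital_reports": 198.5, "news_scraper": 305.0}
if corroborate(signal):
    print("Sources agree; safe to feed downstream analysis.")
else:
    print("Sources diverge; verify before making high-impact decisions.")
```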

The Data Narrative Is Compelling When It Is Visual

The power of data visualization and storytelling using data was on full display during the pandemic, as flattening curves and social distancing simulations depicting the spread of infection went viral and encouraged COVID-appropriate behavior. It was a reminder of the “show and tell” lesson we learned as kids. 

Since the ultimate goal of data is to convey a message, it is best presented in a way that is simple, unambiguous, and audience-centric. When information is intended to trigger a certain behavior, it also needs to be supported by decision-making tools – to compare data sets across geographic or demographic segments, simulate the consequences of alternative courses of action, predict how an event may progress over time, and so on. 
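
As a rough sketch of the “simulate alternative courses of action” idea, the snippet below projects a quantity forward under a baseline and a hypothetical intervention scenario so the two can be compared side by side. The growth rate and the 30% reduction are illustrative assumptions only.

```python
# A minimal scenario-comparison sketch: project a quantity forward under two
# hypothetical courses of action. Growth rates are illustrative assumptions.

def project(start: float, daily_growth: float, days: int) -> list[float]:
    """Compound a starting value forward at a constant daily growth rate."""
    series = [start]
    for _ in range(days):
        series.append(series[-1] * (1 + daily_growth))
    return series

baseline = project(start=1_000, daily_growth=0.12, days=14)            # no intervention
intervention = project(start=1_000, daily_growth=0.12 * 0.7, days=14)  # assumed 30% slower growth

for day in (7, 14):
    print(f"Day {day}: baseline ~{baseline[day]:,.0f}, "
          f"with intervention ~{intervention[day]:,.0f}")
```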

As the perfect test bed for driving population-scale narratives using data and analytics, the pandemic reinforced these lessons in 2021.  

Trust, but Verify

Mathematical models built on historical market transactions fed a groupthink around derivatives trading, such as credit default swaps, that played its part in causing the global financial crisis of 2007-08. 

The pandemic once again highlighted over-reliance on algorithms powered by historical data and biased knowledge. In its early days, COVID-19 models used in the United Kingdom said cases would double every six days, while the actual testing data indicated they were doubling roughly twice as fast, about every three days. Relying on the model – which was later found to be inaccurate, outdated, not validated, and unsuited to the COVID context – rather than real-world indicators delayed the imposition of the lockdown. 
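
A back-of-the-envelope check shows why that gap mattered; the figures below are illustrative, not actual UK case counts.

```python
# Illustrative doubling-time arithmetic: the same starting point grows very
# differently when cases double every three days instead of every six.

def project_doublings(start: float, doubling_days: float, horizon_days: int) -> float:
    """Project a count forward given its doubling time."""
    return start * 2 ** (horizon_days / doubling_days)

start_cases = 100
for label, doubling_days in [("model (6-day doubling)", 6), ("observed (3-day doubling)", 3)]:
    print(f"{label}: ~{project_doublings(start_cases, doubling_days, 24):,.0f} cases after 24 days")
# The six-day assumption yields ~1,600 cases; the three-day reality yields ~25,600.
```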

As the data economy gets commoditized, and enterprises become more adept at engineering the volume, variety, and velocity of data, checks and balances need to be designed to avoid groupthink and historical bias. It is crucial to provide complete traceability for decisions made through models. Techniques like “Explainable AI” will become important to avoid “data and analytics tunnel vision.” 
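
As one lightweight way to approach that traceability, the sketch below trains a model on synthetic data and reports permutation importances from scikit-learn, making visible which inputs actually drive a prediction. The feature names and data are assumptions for illustration; explainability in practice involves far more than a single importance score.

```python
# A minimal explainability sketch using permutation importance.
# The synthetic data and feature names are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))  # pretend columns: mobility, test positivity, hospital load
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Record which inputs actually drive the prediction, so the decision is traceable
# and over-reliance on any single historical signal becomes visible.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for name, importance in zip(["mobility", "test_positivity", "hospital_load"],
                            result.importances_mean):
    print(f"{name}: {importance:.3f}")
```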

In Conclusion

Necessity truly is the mother of invention: the travails of the pandemic drove the world to conduct business more intelligently. While we celebrate the extraordinary focus on data and analytics, driven by the need to keep business models alive, let us also be careful to build a practice of using data purposefully and deriving insights responsibly. 
