The global economic outlook for 2023 is tepid at best; inflation is waning, but the threat of recession still looms over many countries. IT leaders will place their bets on strategic cloud and analytics investments – two areas with a proven record of cutting costs and of extracting new value from unstructured data by leveraging affordable cloud compute. In fact, 65% of organizations are already delivering unstructured data to their big data analytics platforms, or plan to, according to a survey on unstructured data management conducted earlier this year. The survey also points to a sharper focus on intelligent automation, metrics gathering, FinOps, and AI to manage data more precisely, saving money and meeting myriad business objectives.
Storage IT and data management spending will prioritize cloud and edge analytics, data lifecycle management, and data governance.
A well-cited statistic from IDC predicts that data will grow to 175 zettabytes (ZB) by 2025, up from an estimated 97 ZB at the end of 2022 – numbers so large they are hard to visualize. The lion's share of this data is unstructured: it does not live as rows and columns in databases but is dispersed, with multiple copies backed up many times across different applications, storage technologies, and departmental silos. The risks therefore go beyond sky-high data storage costs. Organizations need to understand their data better both to protect it and to leverage it, so they can improve business strategies and develop new products and services faster.

Automating workflows to curate and deliver the right data to cloud-native AI and ML tools will be a top tactic in 2023. Edge processing and analytics are also taking hold, delivering just the information other applications need rather than dumping large data sets into a cloud data lake or other cloud services. We also predict greater adoption of adaptive, more intelligent automation that learns from customer environments and experiences to improve results. This could be particularly useful in areas such as cybersecurity, data mobility and lifecycle management (moving the right data to the right place at the right time), and digital customer experience.

All these initiatives will require more proactive unstructured data management, data governance, and monitoring programs to avoid data swamps and compliance breaches, and to reduce the manual effort data analysts and scientists spend on curation and preparation.
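The edge-filtering tactic above can be sketched in a few lines. This is a minimal, hypothetical illustration – the record schema, field names, and threshold are invented for the example, not taken from any specific product:

```python
# Minimal sketch of edge-side filtering: forward only the records an analytics
# service needs, instead of shipping the full raw data set to a cloud data lake.
# The schema (device_id, temp_c, raw) and the 80-degree threshold are hypothetical.

def filter_for_upload(records, fields=("device_id", "temp_c"), min_temp=80.0):
    """Keep only over-threshold readings, trimmed to the fields the consumer needs."""
    return [{k: r[k] for k in fields} for r in records if r["temp_c"] >= min_temp]

readings = [
    {"device_id": "a1", "temp_c": 72.0, "raw": "..."},  # dropped: below threshold
    {"device_id": "b2", "temp_c": 91.5, "raw": "..."},  # kept, minus the raw payload
]
print(filter_for_upload(readings))  # → [{'device_id': 'b2', 'temp_c': 91.5}]
```

The point of the design is that filtering and projection happen before the network hop, so only the curated subset ever reaches the cloud service.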
Successful cloud migration projects will require mature FinOps practices.
Overspending in the cloud is rampant. Nearly one-third (32%) of cloud spending is wasted, up from 30% last year, according to Flexera, and cloud projects run 13% over budget on average. As a result, FinOps – a cloud financial management discipline that strives to maximize business value by helping engineering, finance, technology, and business teams collaborate on data-driven spending decisions – will become a mainstream practice. FinOps analysis can lower the risks of a cloud data migration by showing the predicted savings and ROI of different plans. This entails gathering metrics on data usage and age to ascertain data value, per-TB costs for on-premises storage and target storage tiers, management costs on-premises versus in the cloud, performance and availability metrics of target storage, and more.
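The savings side of that analysis reduces to simple arithmetic once the per-TB metrics are gathered. Here is a rough sketch; every cost figure and data volume below is a hypothetical placeholder, not a benchmark:

```python
# Hedged sketch of a FinOps-style savings estimate for a proposed migration plan.
# All costs and volumes are hypothetical placeholders for illustration only.

def predicted_annual_savings(cold_tb, onprem_cost_per_tb_month,
                             cloud_cost_per_tb_month, migration_cost_per_tb=0.0):
    """Rough first-year savings of moving cold data to a cheaper cloud tier,
    net of a one-time per-TB migration cost."""
    monthly_delta = cold_tb * (onprem_cost_per_tb_month - cloud_cost_per_tb_month)
    return monthly_delta * 12 - cold_tb * migration_cost_per_tb

# Example: 500 TB of cold data, $40/TB-month on-premises vs. $4/TB-month in an
# archive tier, with a one-time $10/TB migration cost.
savings = predicted_annual_savings(500, 40.0, 4.0, migration_cost_per_tb=10.0)
print(f"Predicted first-year savings: ${savings:,.0f}")  # → $211,000
```

A fuller model would also weigh egress fees, retrieval costs, and the performance and availability metrics of the target tier mentioned above.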
IT managers will be measured on new, business-oriented data management metrics.
New analytics practices will also emerge across IT infrastructure. Storage teams, for example, have traditionally gathered capacity and performance metrics such as latency, I/O operations per second (IOPS), and throughput. As enterprises move from storage management to unstructured data management across hybrid cloud and edge infrastructure, those traditional metrics must expand to ones that focus squarely on the data itself. New metrics cover usage indicators such as top data owners, the percentage of "cold" files that haven't been accessed in over a year, the most common file sizes and types, storage costs per department, storage costs per vendor per TB, the percentage reduction in backups, the rate of data growth, chargeback metrics, and more. By incorporating these business-oriented data metrics, IT organizations can do a better job of matching storage strategies to actual departmental needs. For instance, a department whose data is largely active for only 30 days will benefit from a different storage strategy than one that must keep data in the highest-cost "active" tier for a year or longer.
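Two of those metrics – percentage of cold files and top data owners – can be computed directly from a file inventory. A minimal sketch follows; in practice the records would come from a filesystem crawl or storage index, and the sample data here is invented:

```python
# Sketch of computing business-oriented data metrics from a file inventory.
# Each record is assumed to carry 'owner', 'size_bytes', and 'last_access'
# (epoch seconds) – a hypothetical schema for illustration.
import time
from collections import Counter

YEAR_SECONDS = 365 * 24 * 3600

def data_metrics(files, now=None):
    """Return percent of files cold (untouched > 1 year) and top owners by bytes."""
    now = now if now is not None else time.time()
    files = list(files)
    total = len(files)
    cold = sum(1 for f in files if now - f["last_access"] > YEAR_SECONDS)
    bytes_by_owner = Counter()
    for f in files:
        bytes_by_owner[f["owner"]] += f["size_bytes"]
    return {
        "pct_cold": 100.0 * cold / total if total else 0.0,
        "top_owners": bytes_by_owner.most_common(3),
    }
```

Per-department cost metrics then fall out by multiplying each owner's byte count by the per-TB rate of the tier the data sits on.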
Unstructured data management capabilities will extend to data owners.
IT departments are drowning in data requests on top of their daily IT management responsibilities; it's time for end users and departments to play a greater role in managing their own files and data. With the appropriate security guardrails in place, storage professionals will look for easy ways to share analytics with departments, such as the amount of data in storage, top data owners, usage trends, and costs. By delivering secure self-service access to data management tools, IT can work more closely with departments to meet cost savings and governance goals. Meanwhile, department heads can ensure that their data is managed appropriately according to business needs. For example, users can identify data sets with certain characteristics (such as project or age) to move to cloud storage for cost-cutting or research initiatives.
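The age-based identification in that example is straightforward to sketch. This is a hypothetical illustration, not any product's API – the root path and the 365-day cutoff are placeholders a department head might choose:

```python
# Hypothetical self-service sketch: flag files under a project directory that
# haven't been modified within a cutoff, as candidates for a cloud archive tier.
# The root path and day count are illustrative placeholders.
from pathlib import Path
import time

def files_older_than(root, days):
    """Return files under `root` whose modification time is older than `days`."""
    cutoff = time.time() - days * 86400
    return [p for p in Path(root).rglob("*")
            if p.is_file() and p.stat().st_mtime < cutoff]

# Example usage: candidates = files_older_than("/projects/genomics", 365)
```

A real deployment would use last-access rather than modification time where the storage system records it reliably, and would hand the candidate list to a policy engine rather than moving files directly.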
AI-based automation will meet unstructured data management.
When data is the topic, AI (and ML) is increasingly the answer. Look for unstructured data management software to get more sophisticated by incorporating adaptive machine learning and automation that intelligently guide data placement, lifecycle management, search, and movement. Solutions can adapt to the customer's cost profile, data profile, and target provisioning, learning over time to refine their recommendations. This will not only save time but also help IT managers make better decisions about protecting and leveraging data assets for the enterprise – and about ensuring that data always lives in the right place at the right time.