
Could Your Big Data Use a Little Cold Storage?

September 1, 2014

by Angela Guess

Mary Shacklett of TechRepublic recently wrote, “There has been significant enterprise focus on tiered storage algorithms that route data to the most appropriate storage media based upon how frequently the data needs to be accessed, with particular attention paid to tier one, the most rapid access storage for data that requires quick and frequent retrieval. But back in the ‘data dungeon’ where up to 85% of all enterprise data resides in storage that is seldom accessed, there is an equally looming crisis of how this data can be optimally managed and maintained at the lowest cost, with appropriate data storage, retrieval, security, and access policies in place.”

Shacklett continues, “The name for this infrequently accessed but nonetheless necessary data is ‘cold storage.’ Determining whether data is ‘hot’ (frequently accessed), ‘warm’ (moderately accessed), or ‘cold’ (infrequently accessed) is often the job of a storage administrator who assesses how long it has been since various categories of data have been accessed. In some cases, data centers are even beginning to use automated storage tiering software to make these data storage decisions. Big data factors into the discussion because there is so much of it. For purposes of governance (where data is required to be retained even though it isn’t regularly being used), business continuation (where big data as well as ‘regular’ data needs multiple data repositories for disaster recovery failovers), and just general sanity reasons of needing to know where everything is, sites have to look for low-cost, slower cold storage so they can affordably keep this seldom accessed data under management.”
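The tiering decision Shacklett describes — a storage administrator (or automated tiering software) classifying data as hot, warm, or cold based on how long ago it was last accessed — can be sketched as a simple recency check. The thresholds below are illustrative assumptions for the sketch, not figures from the article; real cutoffs would come from a site's own storage policy.

```python
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical recency thresholds; actual cutoffs vary by site policy.
HOT_WINDOW = timedelta(days=7)
WARM_WINDOW = timedelta(days=90)

def classify_tier(last_accessed: datetime, now: Optional[datetime] = None) -> str:
    """Assign a storage tier based on how recently the data was accessed."""
    now = now or datetime.now()
    age = now - last_accessed
    if age <= HOT_WINDOW:
        return "hot"   # frequently accessed: fast tier-one storage
    if age <= WARM_WINDOW:
        return "warm"  # moderately accessed: mid-tier storage
    return "cold"      # seldom accessed: low-cost cold storage
```

An automated tiering pass would run a check like this over each dataset's last-access timestamp and migrate anything classified "cold" to the cheaper, slower storage the article describes.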

Read more here.
