by David Merrill on Oct 30, 2013
The past three blogs (part 1, part 2 and part 3) presented a basic framework for identifying and measuring long-term data retention options. In this final entry, I will summarize the other cost areas that need to be considered when comparing and contrasting options for 100-year (or longer) data retention plans. In my first blog, I stated that I would ignore media cost and the service delivery option. I will break that rule now, as these last three cost areas are highly dependent on the type of delivery service used.
by David Merrill on Oct 22, 2013
In my third installment on the economics behind long-term data storage (read the first two here and here), I will discuss the single largest cost component of keeping data for very long periods of time: the cost of migration and/or re-mastering. Over a 100-year horizon, migration and re-mastering represent 60-90% of the total present-value (PV) cost, depending on the method you choose for long-term retention.
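To make the PV framing concrete, here is a minimal Python sketch of how a recurring migration/re-mastering event might be discounted over a 100-year horizon. The 7-year refresh interval, per-event cost and discount rate are hypothetical placeholders for illustration, not figures from the post.

```python
# Illustrative sketch: present value (PV) of periodic data-migration /
# re-mastering events over a 100-year retention horizon.
# All numbers below are assumptions, not figures from the blog.

HORIZON_YEARS = 100
MIGRATION_INTERVAL = 7        # assumed technology refresh / re-master every 7 years
COST_PER_MIGRATION = 250_000  # assumed cost ($) per migration event
DISCOUNT_RATE = 0.03          # assumed annual discount rate

pv_migrations = sum(
    COST_PER_MIGRATION / (1 + DISCOUNT_RATE) ** year
    for year in range(MIGRATION_INTERVAL, HORIZON_YEARS + 1, MIGRATION_INTERVAL)
)

print(f"PV of migration/re-mastering over {HORIZON_YEARS} years: ${pv_migrations:,.0f}")
```

Changing the migration interval or the retention method's per-event cost in a sketch like this is what drives the 60-90% share of total PV cost mentioned above.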
by David Merrill on Oct 9, 2013
In part one of my blog on this topic, I constructed a scenario to understand and determine the operational cost factors of preserving and retaining data for very long periods of time (100 years or more). One response was that this time horizon is still too short, and that perhaps hundreds of years should be considered. I hope that the ideas I present here can work for 50-year, 100-year and even millennial time frames.
Over the past couple of weeks I have met with clients in western Canada, New York, New Zealand and Australia. There has been a long-standing comparison of disk and tape over the years. Some of the basic arguments have not changed, but now we are seeing customers face very long-term retention requirements. Some of these customers are in the oil and gas market, media, government, and university research. I noticed that the perspective on long-term differs by customer and by vertical, but each of these discussions has centered on a serious and deliberate plan to archive and access data for a minimum of 100 years.
I am supporting a large customer transformation to virtual, thin and tiered storage. The projections we made months ago about improving utilization have come true; we are forecasting a net reclamation of about 1.5 PB of storage through these transformation investments. The older/existing arrays were simply virtualized, and the volumes were then re-presented as thin volumes. The good news is they have 1.5 PB of reclaimed space. The bad news: they have 1.5 PB of capacity that is still too new to decommission.
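As a rough illustration of where reclaimed capacity comes from, here is a small sketch of the thin-provisioning arithmetic: space that was allocated to hosts but never written is returned to the pool once the volumes are re-presented as thin. The allocated/written split below is assumed for illustration, not the customer's actual data.

```python
# Minimal sketch (hypothetical numbers): capacity reclaimed when
# fat-provisioned volumes are virtualized and re-presented as thin volumes.

allocated_tb = 5_000   # assumed capacity allocated to hosts before thinning (TB)
written_tb = 3_500     # assumed capacity actually holding written data (TB)

reclaimed_tb = allocated_tb - written_tb  # allocated-but-unwritten space returns to the pool
print(f"Reclaimed by thin re-presentation: {reclaimed_tb} TB ({reclaimed_tb / 1000:.1f} PB)")
```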
by David Merrill on Aug 30, 2013
In my previous blog I discussed some of the investments and steps people can take to be cloud-ready, or cloud-enabled, without necessarily moving everything to an off-site or consumption-based delivery model. There are key ingredients that can help you get cloud-ready, and by cloud-ready I mean the same technology and processes that cloud providers use to deliver superior price and cost models for their customers. Some of these key ingredients include:
by David Merrill on Jul 26, 2013
Moore’s law has been a stable predictor of density and price in the IT world for many decades. Initially used to describe transistor density as a function of time, it has been loosely applied to the price of IT, and for our purposes today, the price of storage. Except for 2012 (in the aftermath of tsunamis and flooding), we have enjoyed storage price erosion in the range of 20-25% per year for many, many years. Storage price erosion is a function of areal density and technology improvements, not necessarily transistor density. The chart below (from IEEE Transactions on Magnetics, Vol. 48, May 2012) gives a rough idea of the future for areal density of NAND, HDD and tape.
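For readers who want to play with the erosion math, here is a quick sketch projecting $/TB under an assumed constant annual erosion rate. The starting price and the 22% rate are illustrative assumptions chosen from within the 20-25% range cited above, not figures from the post.

```python
# Rough sketch: projecting storage street price ($/TB) forward under an
# assumed constant annual price-erosion rate. All inputs are illustrative.

price_per_tb = 100.0    # assumed starting price ($/TB)
annual_erosion = 0.22   # assumed erosion rate, within the 20-25%/yr range above

for year in range(0, 11, 5):
    projected = price_per_tb * (1 - annual_erosion) ** year
    print(f"Year {year:2d}: ${projected:6.2f} per TB")
```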