by David Merrill on Jan 21, 2014
Hu Yoshida recently completed a blog series on 2014 trends and predictions, and I'd like to add a few of my own from an economic, or IT-finance, perspective. Looking back on 15 years of developing and observing economic trends in IT, I believe there is a new shift ahead of us in how we justify, grow, finance and pay for IT services. I don't think the traditions and financial practices of the past will be as relevant in the future.
by David Merrill on Oct 30, 2013
The past three blogs (part 1, part 2 and part 3) presented the basic framework for identifying and measuring long-term data retention options. In this final entry, I will summarize other cost areas that need to be considered when comparing and contrasting options for 100-year (or longer) data retention plans. In my first blog, I stated that I would ignore media cost and the service delivery option. I will break that rule now, as these last three cost areas are highly dependent on the type of delivery service used.
by David Merrill on Oct 22, 2013
In my third installment on the economics behind long-term data storage (read the first two here and here), I will discuss the single largest cost component of keeping data for very long periods of time: the cost of migration and/or re-mastering. Over a 100-year horizon, migration and re-mastering represent 60-90% of the total present-value (PV) cost, depending on the method you choose for long-term retention.
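To make the PV framing concrete, here is a minimal sketch of how recurring migration/re-mastering events can be discounted back to present value over a 100-year horizon. All of the figures (per-migration cost, migration cycle length, discount rate) are illustrative assumptions of mine, not numbers from the original analysis.

```python
def pv_of_migrations(cost_per_migration, cycle_years, horizon_years, discount_rate):
    """Sum the discounted cost of each future migration event.

    A migration is assumed to occur every `cycle_years` out to
    `horizon_years`; each event is discounted back to today at
    `discount_rate`. These parameter names are hypothetical.
    """
    total = 0.0
    year = cycle_years
    while year <= horizon_years:
        total += cost_per_migration / (1 + discount_rate) ** year
        year += cycle_years
    return total

# Illustration only: a $50,000 migration every 5 years for 100 years,
# discounted at 3% per year.
pv = pv_of_migrations(50_000, 5, 100, 0.03)
print(round(pv, 2))
```

The point of the exercise is that even heavily discounted, a long stream of migration events dominates the total: here 20 events with an undiscounted cost of $1M still carry a present value of roughly $300K, typically dwarfing the one-time media purchase.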
by David Merrill on Oct 9, 2013
In part one of this blog series, I constructed a scenario to understand and determine the operational cost factors of preserving and retaining data for very long periods of time (100 years or more). One response was that even this time horizon is too short, and that perhaps hundreds of years should be considered. I hope the ideas I present can work for 50-year, 100-year and even millennial time frames.
Over the past couple of weeks I have met with clients in western Canada, New York, New Zealand and Australia. There has been a long-standing comparison of disk and tape over the years. Some of the basic arguments have not changed, but now we are seeing customers face very long-term retention requirements. Some of these customers are in the oil/gas market, some in media and government, and some in university research. I noticed that the perspective on "long-term" differs by customer and by vertical, but each of these discussions has centered on a serious and deliberate plan to archive and access data for a minimum of 100 years.
I am supporting a large customer transformation to virtual, thin and tiered storage. The projections we made months ago about improving utilization have come true; we are forecasting a net reclamation of about 1.5 PB of storage through these transformation investments. The older/existing arrays were simply virtualized, and afterward the volumes were re-presented as thinned volumes. The good news is they have 1.5 PB of reclaimed space. The bad news: they have 1.5 PB of capacity that is still too new to decommission.
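The reclamation arithmetic behind a projection like this can be sketched simply: thin provisioning gives back the capacity that was allocated to fat volumes but never actually written. The numbers below are hypothetical stand-ins, not the customer's actual figures; they are chosen only to show how a ~1.5 PB estimate might arise.

```python
def reclaimed_pb(allocated_pb, written_fraction):
    """Estimate capacity freed by re-presenting fat volumes as thin.

    Thin provisioning releases the allocated-but-unwritten portion:
    allocated * (1 - fraction actually written). Inputs are assumed
    values for illustration.
    """
    return allocated_pb * (1 - written_fraction)

# e.g. 3 PB of fat-provisioned capacity with only 50% actually written
print(reclaimed_pb(3.0, 0.5))  # → 1.5
```

In practice the written fraction varies widely per array and per volume, which is why projections like this are made months ahead and then validated against actual post-virtualization utilization.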
by David Merrill on Aug 30, 2013
In my previous blog I discussed some of the investments and steps people can take to be cloud-ready, or cloud-enabled, without necessarily moving everything to an off-site or consumption-based delivery model. There are key ingredients that can help you get cloud-ready, and by cloud-ready I mean the same technology and processes that cloud providers use to deliver superior price and cost models to their customers. Some of these key ingredients include: