Noted data storage expert Henry Newman argues, in contrast to other tech analysts, that budgeting for storage requires more than simply extrapolating from today's costs.
Those of you who read my column regularly know that I look at things a little differently than some of the big-name industry analysts, such as Gartner or IDC, and try to examine the reasons behind the predictions. Most of the articles I read from these two groups and others talk about budgets in terms of how much total hardware will be sold, with much of that based on the economy. I see the budgeting problem much differently. In my view, the problem is that you need to buy terabytes or petabytes of storage capacity (if you are Google, maybe tens of petabytes), and I believe there are going to be some significant technology changes for disk storage that will, in turn, impact your budgets.
The way I view the problem is that you will need to buy a defined amount of storage, and that has a cost. Too many bean counters, accountants, university researchers and big industry analysts just draw straight lines based on recent history of density and cost. They assume they have solved the problem and know what the cost will be. The last time I checked, technology growth, for the most part, does not happen on a straight line. When a new disk technology is introduced, its cost per GB is flat compared to the old technology's cost per GB. Density then quickly increases and cost per GB drops through the middle of the lifecycle, and the line flattens again at the end of the technology lifecycle.
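To illustrate the difference between the two forecasting approaches described above, here is a minimal sketch comparing a straight-line extrapolation of cost per GB against an S-curve lifecycle model (flat at introduction, steep decline mid-life, flat again at end of life). All numbers are hypothetical, chosen only for demonstration; they are not based on any vendor's actual pricing data.

```python
import math

def linear_cost_per_gb(year, start_cost=0.10, yearly_drop=0.015):
    """Straight-line extrapolation: cost falls by a fixed amount each year.

    Hypothetical numbers: $0.10/GB today, dropping $0.015/GB per year.
    """
    return max(start_cost - yearly_drop * year, 0.0)

def scurve_cost_per_gb(year, start_cost=0.10, floor=0.02,
                       midpoint=5.0, steepness=1.2):
    """Lifecycle (logistic) model: cost is flat early in the technology's
    life, drops fast mid-lifecycle, then flattens out at a floor as the
    technology matures. Parameters are illustrative assumptions.
    """
    return floor + (start_cost - floor) / (1 + math.exp(steepness * (year - midpoint)))

if __name__ == "__main__":
    for year in range(0, 11):
        print(f"year {year:2d}: linear ${linear_cost_per_gb(year):.3f}/GB, "
              f"s-curve ${scurve_cost_per_gb(year):.3f}/GB")
```

The straight-line model eventually predicts a cost of zero, which is obviously wrong, while the S-curve captures the flattening at both ends of the lifecycle. The point is not that this particular curve is right, but that the shape of the forecast, not just its slope today, drives the budget.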
Here is a history of Seagate Enterprise disk drives:
Read the rest at Enterprise Storage Forum.