Tuesday 14 December 2010

Gartner Data Center Summit comes to a close

Well, the Gartner Data Center Summit is over and we have all headed back home, typically to parts of the US or elsewhere that are considerably colder than Vegas has been this week. Everyone seems overwhelmed by the volume of new information that has been made available to us – it has been a packed agenda spread over several very long days.
One of the things that seems to have been at the forefront of discussion all week is that we all appear to be much further into the Cloud than a year ago. Last year only 2% of delegates professed to have any actual Cloud implementation; this year the figure is 15%. For virtualization, the key enabling technology, organizations seem to be putting 50% of all new workloads onto virtual platforms, and on average about 25% of older systems have been virtualized. The top of the virtual hill seems to be in sight, as there was talk of a pool of roughly 25% of applications that look as though they will be too large or too difficult to virtualize.
Although it was not particularly prominent on the agenda, the ‘Green IT’ issue came to the fore. Key Performance Indicators (KPIs) for capacity, previously expressed purely in utilization terms, now seem to be broader – for example utilization per core per kilowatt, not just utilization per core. eBay spoke of tracking power per eBay listing and using this as the driver for their capacity planning. Their servers are now leased rather than bought, and refreshed every eighteen months. The power savings outweigh any cost issues associated with leasing, since each refresh brings in new kit with greater power efficiency thanks to technological advance – a ‘Moore’s Law’ for power consumption, perhaps? Reaching the peak of virtualization could become a green issue as well: if there are few applications left that can be virtualized, further cost savings will only come from driving up VM densities.
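For readers who like to see a metric made concrete, here is a minimal sketch of what a power-normalized capacity KPI might look like in Python. The server names and figures are invented purely for illustration – they are not drawn from eBay or from any delegate’s data – and the calculation simply divides utilization by core count and by kilowatts drawn.

```python
# Hypothetical illustration of a power-normalized capacity KPI:
# average CPU utilization per core per kilowatt of power drawn.
# All figures below are made-up sample values for illustration only.

servers = [
    # name, average CPU utilization (%), core count, average power draw (watts)
    {"name": "web-01", "cpu_util_pct": 45.0, "cores": 8,  "power_w": 320.0},
    {"name": "web-02", "cpu_util_pct": 60.0, "cores": 16, "power_w": 450.0},
    {"name": "db-01",  "cpu_util_pct": 30.0, "cores": 32, "power_w": 700.0},
]

for s in servers:
    power_kw = s["power_w"] / 1000.0
    # Utilization per core per kilowatt: how much useful work each core
    # delivers for every kilowatt the server consumes.
    kpi = s["cpu_util_pct"] / s["cores"] / power_kw
    print(f'{s["name"]}: {kpi:.2f} util-% per core per kW')
```

A capacity team could track a figure like this over time: as older kit is refreshed or workloads are consolidated onto denser virtual platforms, the number should rise, giving a single indicator that ties utilization to power consumption.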
What does this mean from a capacity perspective? Certainly, having a capacity solution that can expand from traditional areas such as hardware utilization to other factors such as power consumption looks increasingly important. Everyone was talking about the broader range of environments under their (or should I say out of their) control with the Cloud. Breadth of measurement tools and data integration will therefore become more and more vital. Likewise, the more complex the environment, the greater the benefits that will come from automation.
The Gartner event also confirmed for me that capacity management is a discipline whose time has come. Gartner cites performance and capacity as the #2 concern about moving to the Cloud; Forrester puts it at #1. People are worried about capacity because the Cloud means aspects of the environment go beyond their control. Whilst the Cloud offers elasticity of resources on demand, it will not be infinitely elastic, and those resources will take time to configure and will cost more if brought online at short notice. The sooner organizations start aligning their capacity needs with the business, using capacity tools that reach more broadly across IT than before, the greater the savings and the smaller the chance of performance crises.

Andrew Smith
Chief Sales and Marketing Officer
