Friday, 22 June 2012

Virtualization changes the Storage Capacity Planning game


An excellent article on storage capacity planning by Manek Dubash via TechTarget (www.techtarget.com) crossed my desk recently. It triggered a few thoughts about Metron and our approach to capacity planning, whether for servers, storage, networks or anything else.

The gist of the article was that storage demands are increasing rapidly and that virtualization complicates the picture. Where a physical server has hard physical limits on resources such as storage, a virtual server in theory has none. What’s more, the speed with which virtual technology lets you scale up makes planning harder.

The need for speed in planning pushes people towards the simplest possible route. Often, though, that is just not appropriate. It will suit a virtualization sysadmin to take a view of the average requirements (e.g. memory, storage, CPU) per additional VM, but that average can conceal a multitude of very different workloads. The averaging approach could lead to the same over-provisioning that so many organizations achieved when they used ‘standard’ server builds in the early days of distributed systems. Ah, the joy of virtualization: this time we can over-provision so much more quickly.
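To make the danger concrete, here is a minimal sketch in Python; the fleet mix and per-VM figures are invented purely for illustration:

    # Invented figures: 800 small web VMs and 200 large database VMs.
    fleet = {
        "web": {"count": 800, "gb_per_vm": 20},
        "db":  {"count": 200, "gb_per_vm": 170},
    }

    total_vms = sum(w["count"] for w in fleet.values())
    total_gb = sum(w["count"] * w["gb_per_vm"] for w in fleet.values())
    avg_gb = total_gb / total_vms  # works out at a plausible-looking 50 GB

    # Now size every new VM at that fleet average and see who gets what:
    for name, w in fleet.items():
        delta = avg_gb - w["gb_per_vm"]
        print(f"{name}: {delta:+.0f} GB per VM versus actual need "
              f"({'waste' if delta > 0 else 'shortfall'})")

Sized at the 50 GB average, each small VM carries 30 GB of waste while each large one is 120 GB short; the average satisfies nobody, and the error scales with every VM you add.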

To counter this, the storage admin or capacity planner needs to define future storage requirements in much greater detail. This means profiling different workloads and using planning techniques to ‘mix and match’ what might happen with the business; 1,000 more VMs might actually be 200 email, 100 database, 100 application and 600 web servers, each type having very different storage requirements. Profiling like this means capturing and analyzing relevant data over longer periods of time – there is simply no way around it. Without it, you run the risk of under- or over-provisioning and the attendant performance crises or out-of-control hardware spend.
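Here is a hypothetical sketch of that mix-and-match arithmetic, using the 1,000-VM split above; the per-profile storage figures are assumptions, not measured values:

    # The VM split comes from the text; the GB-per-VM figures are
    # assumptions for the sake of the example.
    profiles = {
        "email":       {"vms": 200, "gb_per_vm": 120},
        "database":    {"vms": 100, "gb_per_vm": 500},
        "application": {"vms": 100, "gb_per_vm": 80},
        "web":         {"vms": 600, "gb_per_vm": 25},
    }

    profiled_gb = sum(p["vms"] * p["gb_per_vm"] for p in profiles.values())
    total_vms = sum(p["vms"] for p in profiles.values())

    # Contrast with a flat 'average VM' of, say, 150 GB (another assumption):
    flat_gb = total_vms * 150

    print(f"profiled forecast:     {profiled_gb / 1024:.1f} TB")
    print(f"flat-average forecast: {flat_gb / 1024:.1f} TB")

With these invented numbers the profiled forecast comes to roughly 95 TB against nearly 147 TB for the flat average – a 50 TB procurement difference that rests entirely on how well you know your workloads.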

There are plenty of steps you can take to make storage systems more efficient, for example tiered storage, data de-duplication or thin provisioning. The same dangers still exist, though; those technologies only mitigate them, or potentially delay the day of reckoning. Thin provisioning, for example, avoids over-provisioning storage upfront, but in doing so it shifts the risk from purchase time to run time: logical allocations are over-committed against a smaller physical pool, and if that pool fills up, applications start failing writes. Far from removing the need for capacity planning, thin provisioning makes planning of the real, physical storage more important than ever.
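As a minimal sketch of the run-time discipline thin provisioning demands – pool size, threshold and usage figures are all assumptions – it is the physical pool, not the logical allocations, that has to be watched:

    # Assumed figures throughout; the point is the metrics, not the numbers.
    POOL_PHYSICAL_GB = 10000    # real disk behind the thin pool
    ALERT_THRESHOLD = 0.80      # start the procurement cycle at 80% used

    logical_allocated_gb = 25000  # sum of thin volumes handed out to VMs
    physical_written_gb = 8400    # blocks actually consumed so far

    overcommit = logical_allocated_gb / POOL_PHYSICAL_GB
    utilization = physical_written_gb / POOL_PHYSICAL_GB

    print(f"overcommit: {overcommit:.1f}x, pool utilization: {utilization:.0%}")
    if utilization >= ALERT_THRESHOLD:
        print("physical pool nearing capacity: plan procurement now, "
              "before over-committed volumes start failing writes")

The overcommit ratio and physical utilization become first-class capacity metrics, to be trended over time rather than discovered during an outage.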

All of this reinforces Metron’s 360-degree view of Capacity Management, underpinned by good practice guidelines such as those provided by ITIL®.
You need to split the ‘capacity planning’ tasks of the sysadmin from those of capacity management. Each can feed and support the other, but they have different perspectives and address different questions. Longer-term capacity management of storage, CPU, memory or network needs a good database of quality data and a depth of analysis that short-term, day-to-day capacity decisions neither need nor have time for.
Doing capacity planning within just one group, whichever you choose, will cost you money sooner or later.

Andrew Smith
Chief Sales & Marketing Officer

Manek’s full article is available at http://searchstorage.techtarget.co.uk/feature/Storage-capacity-planning-tools-Why-virtual-machines-change-everything.
