Friday 24 December 2010

Holiday Greetings from our CEO

Well, what a great year!
We turned 25 this year, and if you’ve been reading our press releases you’ll know that we followed up last year’s record revenues with another ‘best ever’ first six months to 30th September 2010. We are really looking forward to a good year end in 2011.
Capacity Management has certainly come to the fore this year, and with Forrester and Gartner predicting performance and capacity as the number one and number two concerns (respectively) of companies migrating to the Cloud, the need for our solutions is going to become even greater.
Increasing business in a recession is never an easy achievement and our success has been hard won.
The winning combination of our world-class Capacity Management solution, Athene, with our knowledgeable, helpful and professional people is making us the number one choice of the world’s leading companies... and long may it last.
May I therefore take this opportunity to wish all our staff, customers and acquaintances a very happy holiday. We look forward to speaking to you again next year.

Paul Malton
Chief Executive Officer

Monday 20 December 2010

Proven Practice: Implementing ITIL v3 Capacity Management in a VMware environment

Implementing an effective ITIL Capacity Management process can be challenging in all but the most static of environments. When a dynamic and feature-rich product such as VMware vSphere is brought into scope, it can make a challenging exercise seem almost impossible.
When incorporating VMware into an existing Capacity Management process or even starting from scratch, we tend to hear the same questions regardless of the business type or size, namely:
·         Where should we monitor VMware (i.e. cluster, host, guest, etc.) and what should we monitor at each level?
·         How do we integrate these new capacity and performance aspects into our Change process?
·         How do these new metrics relate to the services or do we need to redefine the relationship between the services and supporting components?
·         My Key Performance Indicators (KPIs) no longer appear to be valid; what should I use instead?
·         I’m not sure what all this new terminology means and how I can manage the performance and capacity of it.
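To make the first question concrete, here is a minimal sketch of one way to organize what you watch at each level. The metric names and thresholds below are illustrative assumptions only, not a definitive or complete list for any environment:

```python
# Illustrative watchlist: example VMware metrics per monitoring level
# (cluster, host, guest) with assumed alert thresholds.
WATCHLIST = {
    "cluster": {"cpu_utilisation_pct": 80, "memory_utilisation_pct": 85},
    "host":    {"cpu_ready_pct": 5, "memory_ballooned_mb": 0},
    "guest":   {"cpu_ready_pct": 10, "memory_swapped_mb": 0},
}

def breaches(level, sample):
    """Return the metrics in `sample` that exceed the example thresholds."""
    limits = WATCHLIST[level]
    return {m: v for m, v in sample.items() if m in limits and v > limits[m]}

# Usage: a host sample showing elevated CPU ready time
print(breaches("host", {"cpu_ready_pct": 7.5, "memory_ballooned_mb": 0}))
```

A real implementation would feed this from your monitoring tool; the point is simply that "what should we monitor" has a different answer at each level of the VMware hierarchy.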
The answers to these questions (and many others) will be addressed in a series of whitepapers discussing how VMware and virtualization will integrate with ITIL Capacity Management; the first of which can be downloaded from
In addition to the whitepaper, Metron Professional Services also have a wide range of consultancy and training offerings that can provide anything from a gap analysis, to a workshop in VMware performance management or a full process implementation.  The full extent of our service catalogue can be viewed on our website at
Follow our blog to keep posted on the next instalment of our “Proven Practice” series.

Rob Ford
Consultant

Friday 17 December 2010

Resource Consumption within VMware

I've just hosted an Athene Training and User Forum for our clients, which looks in detail at resource consumption within VMware and thought I would share some of my thoughts with a wider audience.
With VMware now rapidly deployed in most large organizations, tracking resource consumption within a VMware environment has become one of the major Capacity Management activities in the Enterprise.
One of the strengths of Athene, our Capacity Management software, is its ability to report on the various resource metrics that will be critical to your VMware environment.
The key to making virtualization a success is to ensure you have sufficient resources at any time to satisfy demand. In order to ensure you have sufficient resources, you need to have a consistent reporting process.  This process can consist of a Web Portal / Dashboard for daily reports and ad-hoc reports for troubleshooting.
Trend reports are a critical item in your reporting process, as these will allow you to forecast the future and keep you abreast of potential areas for review based upon both increased demand and current consumption rates. 
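As a minimal illustration of the trending idea, the sketch below fits a straight line to monthly utilization samples and estimates when a threshold would be crossed. The data is invented, and a real trend report would need to account for seasonality and confidence intervals:

```python
# Minimal trend sketch: ordinary least-squares line through monthly CPU
# utilisation samples, then a forecast of when an 80% threshold is hit.

def linear_fit(xs, ys):
    """Return (slope, intercept) of the least-squares line through the points."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

months = [0, 1, 2, 3, 4, 5]
cpu_pct = [40, 43, 46, 49, 52, 55]        # invented data: 3% growth per month
slope, intercept = linear_fit(months, cpu_pct)
months_to_80 = (80 - intercept) / slope   # months until the 80% line is crossed
print(round(slope, 2), round(months_to_80, 1))
```

Even a crude fit like this turns a utilization chart into an answer to the question the business actually asks: "how long until we run out?"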
The traditional areas Performance Analysts focus on are CPU, memory and IO. However, with virtualization you need to make sure that these components are looked at from the top down and then in reverse. This means starting by reviewing your resource pool allocations, moving on to the clusters, hosts and finally the virtual machines, and then reversing the process.
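The top-down-then-reverse review described above can be sketched as a simple tree walk. The inventory below is hypothetical (one resource pool, one cluster, two hosts) and the figures are invented purely to illustrate the traversal order:

```python
# Sketch of the top-down review: walk a hypothetical inventory tree from
# resource pool down to hosts, collecting (name, demand, capacity) rows,
# then reverse the list to review bottom-up.

def walk(node, report=None):
    """Depth-first walk returning (name, demand_ghz, capacity_ghz) rows."""
    if report is None:
        report = []
    report.append((node["name"], node["demand_ghz"], node["capacity_ghz"]))
    for child in node.get("children", []):
        walk(child, report)
    return report

# Hypothetical inventory: Pool-A -> Cluster-1 -> two ESX hosts
inventory = {
    "name": "Pool-A", "demand_ghz": 40, "capacity_ghz": 64, "children": [
        {"name": "Cluster-1", "demand_ghz": 40, "capacity_ghz": 64, "children": [
            {"name": "esx01", "demand_ghz": 28, "capacity_ghz": 32},
            {"name": "esx02", "demand_ghz": 12, "capacity_ghz": 32},
        ]},
    ],
}

top_down = walk(inventory)            # pool first, hosts last
bottom_up = list(reversed(top_down))  # then review in reverse
print(top_down[0], bottom_up[0])
```

The value of the reverse pass is that an imbalance invisible at the pool level (esx01 running much hotter than esx02 here) shows up when you work back from the leaves.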
One item you must take note of is the headroom for the various performance metrics, as this will affect the high-availability and dynamic resource procedures that have been put in place within an organization.
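A hedged illustration of why headroom and high availability interact: usable capacity is not the raw sum of host capacity, because some must be reserved for failover. The N+1 policy and the GHz figures below are assumptions for the sketch:

```python
# Headroom sketch: cluster CPU headroom after reserving capacity for an
# assumed N+1 host-failover policy, minus current peak demand.

def cluster_headroom(host_capacities_ghz, peak_demand_ghz, failover_hosts=1):
    """Headroom left after setting aside the largest `failover_hosts` hosts."""
    reserved = sum(sorted(host_capacities_ghz, reverse=True)[:failover_hosts])
    usable = sum(host_capacities_ghz) - reserved
    return usable - peak_demand_ghz

# Four 32 GHz hosts, N+1 HA, 70 GHz peak demand -> 26 GHz of headroom
print(cluster_headroom([32, 32, 32, 32], 70))
```

Note that moving from N+1 to N+2 in this example flips the cluster from having headroom to being over-committed, which is exactly why HA policy belongs in the capacity calculation.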
In future blogs, we‘ll discuss how to track those resource wasters, along with how modeling factors into your tracking of resources.

http://www.metron-athene.com/athene/environments/virtualized/virtualized_platforms.html

Charles Johnson
Consultant

Wednesday 15 December 2010

Farewell to CMG2010 Orlando

With both the temperatures and the Holiday season starting to warm up in Orlando, the curtain finally came down on a well-attended and successful CMG 2010. As expected, the ‘hot topic’ papers were popular, with standing room only in some sessions, notably Dr Tim Norton’s “Cloud Computing: The New Reality or just vapour?”, which highlighted key considerations for using Public Clouds, such as performance issues, software licensing and security.

From Application Tuning to z/OS Optimizing, and from Architectural Efficiency to Modeling Oracle RAC, vast amounts of information were presented and digested by a grateful audience.

Our presentations were very well received, and thanks to all the people who paid us a visit at the booth and attended my paper on Thursday. If you didn’t get the chance to be with us in Orlando but would like to catch my presentation, please join me at my free webinar:
‘VMware Capacity Management - Now the Dust is Settling...’ on January 13th 2011.

Warm congratulations to Erik Ostermueller of FIS Global, who won an iPod Touch 32GB in our prize draw.

We enjoyed a wonderful evening with our customers at the Old Hickory Steakhouse, great food (alligator?) and fun enjoyed by all! 
Our sincere thanks to all our customers, it was great to finally meet you in person and I look forward to working with you in the future.

See you in Washington, D.C. at CMG 2011!

Jamie Baker
Consultant

Tuesday 14 December 2010

Gartner Data Center Summit comes to a close

Well, the Gartner Data Center Summit is over and we have all headed back home, typically to parts of the US or elsewhere that are considerably colder than Vegas has been this week.  Everyone seems overwhelmed by the volume of new information that has been available to us – it’s been a good and very large agenda running over several very long days.
One of the things at the forefront of discussion all week is that we all seem to be much further into the Cloud than a year ago. Last year only 2% of delegates professed to have any actual Cloud implementation; this year the figure is 15%. As for virtualization, the key enabling technology, organizations seem to be putting 50% of all new workloads onto virtual platforms, and on average about 25% of older systems have been virtualized. The top of the virtual hill seems to be in sight, as there was talk of a pool of about 25% of applications that look to be too large or too difficult to virtualize.
Without being particularly at the forefront of the agenda, the ‘Green IT’ issue came to the fore.  Key Performance Indicators (KPIs) for capacity, previously expressed purely in utilization terms, now seem to be broader, for example utilization per core per kilowatt, not just utilization per core.  eBay spoke of tracking power per eBay listing, using this as the driver for their capacity planning.  Their servers are now leased, not bought, and refreshed every eighteen months.  The power savings outweigh any cost issues associated with lease, the new kit coming in each time with increased efficiency in power usage thanks to technological advance – a ‘Moore’s Law’ for power consumption perhaps?  Reaching the peak of virtualization could become a green issue as well.  If there are few applications left that can be virtualized, further cost savings will only come from driving up VM densities.
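The broader KPI mentioned above can be made concrete with some simple arithmetic. The sketch relates useful work delivered to power drawn, rather than utilization alone; the throughput and power figures are invented for illustration:

```python
# Rough sketch of a power-aware capacity KPI: business throughput per
# kilowatt of power drawn, comparing an older server with a newer,
# more power-efficient one delivering the same throughput.

def work_per_kw(transactions_per_sec, power_kw):
    """Business throughput delivered per kilowatt of power drawn."""
    return transactions_per_sec / power_kw

old_server = work_per_kw(transactions_per_sec=500, power_kw=0.8)  # 625.0
new_server = work_per_kw(transactions_per_sec=500, power_kw=0.5)  # 1000.0
print(old_server, new_server)
```

On numbers like these, a refresh to more power-efficient kit delivers 60% more work per kilowatt at identical throughput, which is the kind of calculation behind the lease-and-refresh strategy described above.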
What does this mean from a capacity perspective?  Well, certainly having a capacity solution that can expand from traditional areas such as hardware utilization to other factors such as power consumption looks increasingly important.  Everyone was talking of the broader range of environments under their (or should I say out of their) control with the Cloud.  Breadth of measurement tools and data integration will thus be more and more vital.  Likewise, the more complex the environment, the more benefits that will come from automation. 
The Gartner event also confirmed for me that capacity management is a discipline whose time has come.  Performance and capacity are cited as the #2 concern about moving to the Cloud by Gartner, #1 with Forrester.  People are worried about capacity as the Cloud means aspects of the environment go beyond their control.  Whilst the Cloud offers elasticity of resources on demand, it will not be infinitely elastic, and those resources will take some time to configure in and cost more if brought on line at short notice.  The sooner organizations start aligning their capacity needs with the business, with capacity tools that reach broader across IT than before, the greater the savings and the less chance of performance crises.

Andrew Smith
Chief Sales and Marketing Officer

Friday 10 December 2010

Don’t be one of those suffering from capacity products that have performance problems getting data from VMware...


There are a large number of third-party performance and capacity products out there for VMware now. At VMworld in San Francisco in September 2010 I heard tell of 40+ products claiming to be capacity management software. Perhaps some of the vThis and vThat product names should be changed to vTroubled or vChallenged.
Being associated with a product that has a 25-year heritage, I of course have a healthy cynicism about many of these offerings. Gartner recently stated that for virtualization, two out of every three new companies disappeared within three years. Not necessarily for bad reasons: some were bought out by other organizations and continue to provide solutions. Many fell by the wayside, though, as they failed to achieve market penetration. Beware which horse you back!
I have further cynicism over what constitutes capacity management. With capacity management now viewed as a more strategic initiative due to virtualization and Cloud computing, many new offerings are just not ‘industrial strength’ capacity management. Whilst they might be good and pretty for day-to-day performance reporting, they lack facilities for the more sophisticated aspects of capacity planning, such as modeling and predicting future performance and integrating business data into capacity plans. Many are purely x86 solutions as well, meaning that UNIX and mainframe virtual platforms are not covered. This means that large organizations have to implement various point solutions to meet all their needs, and lose the broader picture provided by cross-platform capacity products.
Specifically for VMware, there is a further danger. Many of these products have been developed from a small-scale environment perspective. The same can be true of VMware itself, and just as VMware has had to address scalability issues, word has been spreading around recent events such as the Computer Measurement Group (CMG) conference and the Gartner Data Center Summit that some of these VMware point solutions for capacity management have scalability and capacity issues of their own. For larger environments, such as those with 100+ VMware hosts, many products seem to be struggling to bring back data quickly enough, leading to lost data and gaps in performance and capacity reports. The issue is not totally their own: VMware admits to inefficiencies in the timeliness of data retrieval when such products use the vCenter web service API to retrieve such data.
Metron saw the writing on the wall for the web service API as a means of gathering large volumes of VMware data some time ago.  As VMware installations have scaled up, Metron has migrated to an alternative approach, accessing the vCenter database directly.  This means that Athene captures data in a few seconds that takes other products several minutes.  If you have a small installation with few hosts this is of no interest, but if you have an enterprise environment with thousands of hosts like many of our clients, you need to make sure your capacity solution has enough capacity of its own. 
http://www.metron-athene.com/athene/environments/virtualized/virtualized_platforms.html

Andrew Smith
Chief Sales and Marketing Officer

Tuesday 7 December 2010

Metron - live from CMG2010


As the holiday season gets into full swing, in a sunny but slightly chilly Orlando, Florida I'm here at CMG2010.

Today kicked off with a very hot topic, Dr Annie Shum’s The “Measured” Way to Cloud Computing, pulling in over 100 delegates from the estimated 300+ attending and providing the start of many interesting and varied papers and sessions throughout the coming week.

Cloud Computing is an area that is attracting a lot of attention in the IT industry and it looks as though all organizations are going to have some form of Cloud implementation over the next few years.

We recognise that Capacity Management in the Cloud is going to be a challenge and we'd love to chat with you about Integrator, part of our own Cloud strategy.

We're one of the many vendors attending; you'll find Metron-Athene at booth 305. Come along and see us to discuss a range of topics: Capacity and Performance Management, Cloud, Virtualization, Green IT and many more.

Presentations to look forward to this week include Adam Grummitt's 'Why do we Capacity Model in the UK, Performance Test/Report in the US and Resource Monitor in Japan' and my own vSphere Performance and Capacity Overview. If you’re attending CMG, I hope to see you there.

I'll look forward to giving you a report on how the week went and updating you on more hot topics on Friday.

Jamie Baker
Consultant

Monday 6 December 2010

Gartner Data Center Summit 2010

British weather permitting, I should be at the Gartner Data Center Summit in Las Vegas today.  Lots of things to look forward to, not least getting myself up to speed with what Gartner and the user community see as their key issues for 2011. 
I notice that one of the keynotes features a presenter from eBay.  It will be good to hear how their IT is progressing as it is a few years since I had any direct contact with it.  In those days their business growth was so phenomenal that they spoke of how many servers were being installed per day, not per month or quarter.  It was the epitome of server sprawl.  How do you ‘plan’ capacity in an environment like that?  It showed me that new business challenges need new approaches, and both Metron and the industry at large have taken a broader view of capacity planning and the need to tie it closely to business strategy since then.  I’ll be keen to see if virtualization has helped them cope with that sort of growth thanks to the flexibility of provisioning it offers.  I also wonder if they have had to respond to the pressure to be ‘green’ with IT.  Without good planning tied to business metrics, I don't see how you can adopt a green strategy in an environment as dynamic as theirs.  I’d be interested to hear anyone’s thoughts on this: how do you optimize your server usage to minimize environmental impact when your business is growing at that kind of rate?
Many of the hot topics at the summit are around virtualization and Cloud, as you would expect from a data center event this year. Last year the figures from the show for take-up of Cloud services amongst delegates were very low. How much that has changed this year might be a good indicator of just how pervasive Cloud architectures will become.
A major change I have felt through this last year is that when you mentioned Cloud last December, people seemed to talk first of public services. Now, private Cloud seems the paradigm most likely to be of concern to major IT users. Public, private or whatever hybrid model evolves, at Metron we see this as meaning the environment that our clients manage is becoming ever more complex. To handle this we are concentrating our product enhancements on making the software capable of handling an even wider range of inputs. In the future you might not be able to influence sources of performance and capacity data in the way you could when everything was in-house, so being able to take what is available and use it well now has more significance.
Someone far wiser than me once told me that one way to handle stress is not to worry about things you can’t change. This could be appropriate for Cloud. Capacity management of Cloud could cause stress for any of us, and there could be many parts of our environment that we can’t change or affect. Measure what you can, use that data well and hopefully you won’t get too stressed.
I’ll be giving you an update on events at the Gartner Data Center Summit later in the week.

Andrew Smith
Chief Sales and Marketing Officer

Wednesday 1 December 2010

How can you realistically capacity manage the Cloud?

Cloud Computing is an area that is garnering much attention in the IT industry and it looks as though all organizations will have some form of Cloud implementation over the next few years.
What Cloud Computing allows organizations to do is manage their resources so that the infrastructure does not continually spiral in growth. This could be a mix of private and public Cloud, and for public Cloud it could involve a variety of external providers.
Cloud Computing works mainly because of virtualization, on both the x86 and Unix/Linux platforms. The key to making Cloud Computing work within an enterprise is an effective Capacity Management process. An enterprise should have a Dashboard/Portal, along with regularly scheduled reports, that not only the IT staff but also the Business Owners can use to assess their environments. It is critical to keep the Business Owners apprised of their resource usage, as it could change dynamically as other factors in the environment change. The types of reports you would produce for a Private, Community or Hybrid Cloud architecture may have similar qualities, but each needs to focus on the particular resources used.
 
What is beyond doubt is that the range of environments you will have to manage will become ever more complex. What you can and should do in terms of capacity management will vary with the nature of your own implementation. Join me at a webinar on 9 December to find out what it is realistic for the capacity manager to provide to the business in this complex world of interacting services, and how we can help you achieve it.


Charles Johnson, Consultant