Carrying on from my previous blog, it is evident that benchmarking the cloud is not an easy task.
After all, clouds evolve. Providers don't tell you when changes go in, so historical data is not reliable.
“commercial clouds such as Amazon EC2 add frequently new functionality to their systems, thus, the benchmarking results obtained at any given time may be unrepresentative for the future behaviour of the system.” Alexandru Iosup, Radu Prodan, and Dick Epema
So why don’t we continually benchmark the cloud? Because it’s complex and expensive (Challenge 1: how to do it cheaply).
“A straightforward approach to benchmark both short-term dynamics and long-term evolution is to measure the system under test periodically, with judiciously chosen frequencies [26]. However, this approach increases the pressure of the so-far unresolved Challenge 1.” Alexandru Iosup, Radu Prodan, and Dick Epema
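The core of "measure periodically" is simple enough; the expense is in keeping it running across services and regions. Here is a minimal sketch in Python, assuming you supply your own run_benchmark() routine (the function name, interval and CSV output are my illustrations, not anything from the paper):

```python
import csv
import time
from datetime import datetime, timezone

def run_benchmark() -> float:
    """Placeholder: time one cloud operation (e.g. provisioning a VM) and return seconds."""
    start = time.perf_counter()
    # ... invoke the cloud operation you care about here ...
    return time.perf_counter() - start

def benchmark_periodically(interval_seconds: int, results_path: str = "results.csv") -> None:
    """Re-run the same benchmark at a fixed interval, timestamping every sample
    so later analysis can separate short-term dynamics from long-term drift."""
    with open(results_path, "a", newline="") as f:
        writer = csv.writer(f)
        while True:
            writer.writerow([datetime.now(timezone.utc).isoformat(), run_benchmark()])
            f.flush()
            time.sleep(interval_seconds)
```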
Even with lots of data you’ll have a hard time making it fit reality, because you cannot replicate all the software involved.
“We have surveyed in our previous work [26], [27] over ten performance studies that use common benchmarks to assess the virtualization overhead on computation (5–15%), I/O (10–30%), and HPC kernels (results vary). We have shown in a recent study of four commercial IaaS clouds [27] that virtualized resources obtained from public clouds can have a much lower performance than the theoretical peak, possibly because of the performance of the middleware layer.” Alexandru Iosup, Radu Prodan, and Dick Epema
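To put those overhead percentages in context: overhead here is simply the fraction of theoretical peak performance lost once virtualization and middleware sit in the way. A quick illustration (the GFLOPS figures are made up for the example, not taken from the study):

```python
def overhead_pct(measured: float, theoretical_peak: float) -> float:
    """Percentage of theoretical peak performance lost to virtualization/middleware."""
    return 100.0 * (theoretical_peak - measured) / theoretical_peak

# A vCPU measured at 8.5 GFLOPS against a 10 GFLOPS theoretical peak:
print(f"{overhead_pct(8.5, 10.0):.0f}% overhead")  # prints: 15% overhead
```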
Over long-term observation the trend was clear (ok, the dates are old but this still stacks up): things were slowing down, and the cloud experiences seasonality of some description.
“We have observed the long-term evolution in performance of clouds since 2007. Then, the acquisition of one EC2 cloud resource took an average time of 50 seconds, and constantly increased to 64 seconds in 2008 and 78 seconds in 2009. The EU S3 service shows pronounced daily patterns with lower transfer rates during night hours (7PM to 2AM), while the US S3 service exhibits a yearly pattern with lowest mean performance during the months January, September, and October. Other services have occasional decreases in performance, such as SDB in March 2009, which later steadily recovered until December [26].” Alexandru Iosup, Radu Prodan, and Dick Epema
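Daily patterns like the S3 night-time dip are straightforward to surface once you have timestamped samples. As a minimal sketch, assuming the results.csv layout written by the earlier loop, group the samples by hour of day and compare the means (grouping by month instead would expose the yearly pattern):

```python
import csv
from collections import defaultdict
from datetime import datetime
from statistics import mean

def hourly_means(results_path: str = "results.csv") -> dict[int, float]:
    """Mean benchmark value per hour of day, to expose daily seasonality."""
    buckets: defaultdict[int, list[float]] = defaultdict(list)
    with open(results_path) as f:
        for timestamp, value in csv.reader(f):
            buckets[datetime.fromisoformat(timestamp).hour].append(float(value))
    # A pronounced dip in the 19:00-02:00 buckets would mirror the EU S3 pattern above.
    return {hour: mean(values) for hour, values in sorted(buckets.items())}
```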
The final nail in the coffin when trying to benchmark the cloud is the flexibility and shifting nature of the hardware, workloads and software involved.
“Depending on the provider and its middleware abstraction, several cloud overheads and performance metrics can have different interpretation and meaning.” Alexandru Iosup, Radu Prodan, and Dick Epema
So you can’t trust the data from clouds to be what you expect, and you can’t trust your existing benchmarks to represent the future.
I'll answer this question on Monday. In the meantime, why not sign up to our Community and listen to our on-demand webinar 'Cloud Computing and Capacity Management':
http://www.metron-athene.com/_downloads/on-demand-webinars/index_2.asp
Phil Bell
Consultant