The more I investigate cloud infrastructure, the more I'm convinced that any organization with annual revenues under $1 billion should be using it. Usage rates vary, but one thing is certain: even with VM technology you will never utilize 100% of your capacity. You will hit RAM bottlenecks, or CPU bottlenecks, or disk bottlenecks.
Most cloud infrastructure vendors, including Amazon with its Elastic Compute Cloud (EC2) offering, charge based on actual usage. Need an application for only one week out of the month? Turn it off until you need it, then turn it back on. You pay only for what you use.
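The savings from usage-based billing are easy to see with a little arithmetic. Here is a minimal sketch; the hourly rate is a hypothetical round number for illustration, not an actual EC2 price.

```python
# Hypothetical hourly rate -- real EC2 pricing varies by instance type and region.
HOURLY_RATE = 0.10  # dollars per instance-hour (illustrative assumption)

def monthly_cost(hours_running: float, rate: float = HOURLY_RATE) -> float:
    """Cost of a usage-billed instance for the hours it actually runs."""
    return hours_running * rate

always_on = monthly_cost(30 * 24)  # instance left running all month
one_week = monthly_cost(7 * 24)    # instance on only for the week it's needed
print(f"always on: ${always_on:.2f}/mo, one week: ${one_week:.2f}/mo")
```

Running the application one week instead of all month cuts the bill to roughly a quarter, because you are billed for hours on, not hours owned.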
But aren't virtual servers a good thing, you ask? Yes, they are: they are the middleman for savings based on shared environments. In fact, virtual machines (VMs) are what Microsoft, Amazon, and a host of other companies use to build their cloud offerings. But VMs only get you so far.
I put together a graph of the cost to a business of running a physical server, a VM, and the cloud. Basic assumptions:
1) An organization replaces its physical servers every 4 years (this may be too aggressive for your organization) due to end of life, warranty expiration, etc.
2) Average CPU utilization is 20%. (I find that average CPU usage is usually much lower than this on a dedicated box; it's not uncommon to see average CPU at 1% with spikes at certain times.)
3) A VM is 1/5th the capacity and cost of the physical server. You can load more VMs onto a physical server, but you start to hit RAM or CPU bottlenecks if you carve it up into too many chunks.
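The three assumptions above can be turned into a rough back-of-the-envelope model. This is only a sketch of the comparison, not the actual graph: the server purchase price and the cloud hourly rate are hypothetical figures I've picked for illustration, and I treat the 20% utilization as if it were concentrated into 20% of the month's hours, which is a simplification.

```python
# Illustrative cost model; dollar figures are assumed, not quoted prices.
MONTHS = 4 * 12           # assumption 1: 4-year replacement cycle
SERVER_PRICE = 6000.0     # hypothetical purchase price of one physical server
UTILIZATION = 0.20        # assumption 2: average CPU utilization

# Physical: you amortize the whole box even though you use ~20% of it.
physical_monthly = SERVER_PRICE / MONTHS

# VM: assumption 3 -- 1/5th the capacity and cost of the physical box,
# so a 20%-of-a-box workload fits in one VM.
vm_monthly = physical_monthly / 5

# Cloud: usage billing. Assume a rate where a full month of the instance
# costs about what the VM does, but you pay only for hours actually on.
hours_on = UTILIZATION * 30 * 24        # simplification: usage packed into 20% of hours
cloud_hourly = vm_monthly / (30 * 24)   # hypothetical rate for comparison
cloud_monthly = cloud_hourly * hours_on

print(f"physical: ${physical_monthly:.2f}/mo")
print(f"vm:       ${vm_monthly:.2f}/mo")
print(f"cloud:    ${cloud_monthly:.2f}/mo")
```

Even with these rough numbers the ordering is the point: virtualization cuts the monthly cost by the consolidation factor, and usage billing cuts it again by only charging for the capacity you actually consume.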