I remember wondering about this when I was in college -- this is before I really knew anything about computers.
I noticed that our computer center charged by the minute of processor time and it was a lot! I remember people freaking out because their random prime number generator ran overnight and the department got billed an unexpected $1,000.00. And they were in trouble!
So it must cost a lot to get these computers to do all these calculations, I thought. So I asked the local expert: how much does it cost, then, when the computer isn't doing any calculations? He laughed. He said it costs the same, whether it's calculating or not. How weird.
Here's an interesting article in the New York Times about how the data centers of companies like Google, Microsoft, Facebook, and many others use and waste a ton of electricity. Here's the hook:
"Jeff Rothschild’s machines at Facebook had a problem he knew he had to solve immediately. They were about to melt. The company had been packing a 40-by-60-foot rental space here with racks of computer servers that were needed to store and process information from members’ accounts. The electricity pouring into the computers was overheating Ethernet sockets and other crucial components." (from the New York Times)
But beware: here's an excerpt from an immediate rebuttal in Forbes, "Why The New York Times Story 'Power, Pollution and the Internet' Is a Sloppy Failure":
"So here’s the first problem that requires a clarification if not a correction. The utilization rates of servers in data centers is cited as between 7 and 12 percent. Nowhere is it pointed out that this statistic is derived from IT data centers, not from the state of the art data centers run by the Internet companies. Huan Liu based on an external model, estimates Amazon’s EC2 utilization at 7 to 25 percent. But Amazon, Facebook, and Google, don’t report their utilization rates. It is not accurate to make this implied association." (from Forbes)
You may read both articles and draw your own conclusions.