So you’ve heard the hype about going green, but have you wondered how the internet fits into the big picture of saving energy? By some estimates, the energy behind a single Google search could power an 11-watt light bulb for an hour… Think that’s bad? Wait until you put that into perspective alongside sites like YouTube and every other bandwidth hog on the web…
A single Google search query is estimated to consume 2 to 8 watt-hours of energy. To put this on a scale, Google processes petabytes of information daily while indexing the web and handling various other tasks. If we average this out to 4.5 watt-hours per query, and consider that Google is easily handling 400 million queries a day based on comScore metrics, we get 1,800,000,000 (1.8 billion) watt-hours of energy used daily just for basic search queries. The Googleplex itself reportedly uses as much power as 3,333 California homes.
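The arithmetic above can be sanity-checked in a few lines. Note that both inputs are the estimates quoted in this post (a 4.5 Wh midpoint and comScore's 400-million-queries-a-day figure), not measured values:

```python
# Back-of-the-envelope check of the search-energy math above.
WH_PER_QUERY = 4.5            # midpoint of the 2-8 Wh estimate
QUERIES_PER_DAY = 400_000_000 # comScore-based figure from the text

daily_wh = WH_PER_QUERY * QUERIES_PER_DAY
print(f"{daily_wh:,.0f} Wh/day")         # 1,800,000,000 Wh/day
print(f"{daily_wh / 1e6:,.0f} MWh/day")  # 1,800 MWh/day
```

That 1.8 GWh/day covers only the queries themselves; crawling, indexing and video serving come on top of it.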
If you dare to look at the bandwidth from a site like YouTube (reportedly making up over 10% of total internet bandwidth usage) and put into perspective how much power is required to run a simple search query versus serving up video content, the issue becomes a bit clearer. MySpace, Facebook and other social networks like them are also bandwidth hogs, sharing huge amounts of digital media at high speeds.
When a big company like Intel or Google refers to “greening” or “going green” with their server technologies, the first step is finding a renewable or more efficient power source. Wind power and biomass generally come to mind, but as you can already see, we’re going to need more than that to power the web.
Right now there are strong efforts by Intel and Google to start going green with their data center technology. Is this enough? I’m convinced that if the public keeps up the pressure on going green and saving our environment, our technology will keep pace and hopefully save the day… but in a less ideal world we’ve got bigger problems.
The amount of heat produced by these data centers is another intimidating issue we have to face… With the technology currently available, we’re forced to use AC units and massive cooling systems to keep all of the servers at around 70 degrees Fahrenheit or less. The hot air gets pumped out of the centers, usually outside, although some companies are finally using this excess heat to warm office buildings and pools in surrounding areas. Again, this is a good step, and recycling is always ideal… but what about all this heat?
Shouldn’t the focus be on creating less heat rather than reusing it more effectively? A great deal of the power used at these data centers goes into cooling alone, sometimes more than is used to actually run the servers. There’s a debate over forced-air versus liquid cooling, but either way you’re burning a ton of power just to cool your servers. While the big chip manufacturers seem to be focusing on this, the issue feels less pressing once you find better ways to use the heat… but that, too, is a problem.
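The “more power for cooling than for computing” situation has a standard industry measure: Power Usage Effectiveness (PUE), total facility power divided by the power that actually reaches the IT equipment. Here’s a minimal sketch; the sample wattages are hypothetical, purely for illustration:

```python
# Power Usage Effectiveness: total facility power / IT equipment power.
# A PUE of 2.0 means half the facility's electricity is overhead
# (cooling, power distribution, lighting) rather than computation.

def pue(it_kw: float, cooling_kw: float, other_kw: float = 0.0) -> float:
    """Return facility PUE from hypothetical power draws in kilowatts."""
    return (it_kw + cooling_kw + other_kw) / it_kw

# Cooling draws as much as the servers themselves (the worst case above):
print(pue(1000, 1000))       # 2.0
# A more efficient facility:
print(pue(1000, 300, 100))   # 1.4
```

A PUE above 2.0 is exactly the scenario described above: the cooling plant consumes more energy than the servers it protects, which is why generating less heat in the first place matters so much.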