Equipment powering the Internet accounts for 9.4% (or 350 billion kWh) of the total annual electricity consumption in the US, and 5.3% (or 868 billion kWh) of global usage.
That's from a study conducted by David Sarokin at Uclue, an online pay-for-answers service. The figures cover computers and monitors (roughly two-thirds of the total) and data centers (about one-eighth), as well as networking and transmission equipment. They do not cover the energy that goes into producing and distributing computers and equipment, nor that powering printers and other non-communicating devices. Also left out is the fast-growing set of non-computer Internet-enabled devices, such as PDAs and smartphones. Sarokin has published the details of his calculations, and from what I can judge they look generally accurate -- although a generous margin of error should be assumed, given both the difficulties inherent in such a calculation and specific factors such as the relative growth of laptops, which consume less power than desktops.
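As a quick sanity check on those headline figures (my own back-of-the-envelope arithmetic, not Sarokin's), the percentages and the kWh totals together imply overall consumption numbers that can be compared against published statistics:

```python
# Back-of-the-envelope check of Sarokin's figures (assumption: the
# percentages and the kWh totals refer to the same year's consumption).

us_internet_kwh = 350e9      # 350 billion kWh, said to be 9.4% of US total
us_share = 0.094
global_internet_kwh = 868e9  # 868 billion kWh, said to be 5.3% of global total
global_share = 0.053

implied_us_total = us_internet_kwh / us_share               # ~3,700 billion kWh
implied_global_total = global_internet_kwh / global_share   # ~16,400 billion kWh

print(f"Implied total US consumption:     {implied_us_total / 1e9:,.0f} billion kWh")
print(f"Implied total global consumption: {implied_global_total / 1e9:,.0f} billion kWh")

# Breakdown cited above: computers and monitors ~2/3, data centers ~1/8,
# with the remainder being networking and transmission equipment.
remainder = 1 - 2/3 - 1/8
print(f"Implied networking/transmission share: {remainder:.0%}")  # ~21%
```

Both implied totals land roughly where published mid-2000s statistics put overall US and world electricity consumption, which is reassuring about the internal consistency of the numbers.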
Even allowing for such a margin, these figures are huge, and they underscore an often untold reality: the supposedly immaterial information economy runs on very material infrastructure and requires a great deal of energy. The idea that an average Second Life avatar consumes about as much electricity as an average Brazilian is not just the stuff of urban legend. And energy consumption is spiking upwards: according to a September 2006 report by IDC, today every dollar invested in computer hardware in data centers entails 54 cents in energy costs; IDC projects this figure to grow to 71 cents by 2010, and to one euro by 2012.
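To put IDC's projection in perspective (again my own arithmetic, assuming the 54-cent figure refers to 2006 and compound growth in between), the jump from 54 to 71 cents implies roughly 7% annual growth in energy cost per hardware dollar:

```python
# Implied compound annual growth of data-center energy cost per hardware
# dollar, from IDC's figures (assumption: 54 cents in 2006, 71 cents in 2010).

cost_2006, cost_2010 = 0.54, 0.71
years = 2010 - 2006

cagr = (cost_2010 / cost_2006) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # ~7.1% per year
```

Note that reaching one euro by 2012 -- well above a US dollar at current exchange rates -- would imply a marked acceleration beyond that rate after 2010.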
(Cross-posted on WattWatt)