THE MAGAZINE FOR THE FUTURE BY TÜV SÜD

HOW MUCH DOES IT COST TO RUN THE INTERNET?

TEXT THOMAS SCHMELZER

—— An insight into global data center energy consumption and its costs.

Calculating all the costs of the global internet’s infrastructure would be a Herculean task. However, it is possible to make a rough estimate for some areas, such as the volume and expense of global energy consumption by data centers. This amounts to between 200 and 500 billion kilowatt-hours of electricity annually. By 2030, demand could rise to as much as 2 trillion kilowatt-hours, equivalent to around half of the annual electricity consumption of the United States of America. Since data centers also draw some of their electricity from renewable sources or are integrated into larger industrial facilities, and energy prices vary from country to country, it isn’t possible to put a reliable price on this power consumption.
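The scale of that comparison can be checked with back-of-envelope arithmetic. The minimal Python sketch below assumes annual US electricity consumption of roughly 4 trillion kilowatt-hours; that figure is an illustrative assumption, not something stated in the article.

    # Back-of-envelope check of the scale comparison above.
    # Assumption: US annual electricity consumption is roughly
    # 4 trillion kWh (~4,000 TWh); illustrative, not from the article.
    DATA_CENTER_DEMAND_2030_KWH = 2e12   # projected demand: 2 trillion kWh
    US_ANNUAL_ELECTRICITY_KWH = 4e12     # assumed US consumption

    share = DATA_CENTER_DEMAND_2030_KWH / US_ANNUAL_ELECTRICITY_KWH
    print(f"Projected demand equals {share:.0%} of US electricity use")
    # -> Projected demand equals 50% of US electricity use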

A study by the German digital association Bitkom shows how large the differences are in Europe alone. Data center operators in Germany, for example, had to pay 113.11 euros per megawatt-hour in taxes, levies and network fees. The biggest price driver here is the EEG levy, a surcharge under the German Renewable Energy Sources Act. By comparison, the figure in the Netherlands is just 17.08 euros per megawatt-hour, a mere 15 percent of the cost in Germany. Hidden costs in the form of carbon dioxide emissions haven’t yet been factored in. According to the Borderstep Institute, an hour of video streaming in 4K resolution on a 65-inch television requires almost 1,300 watt-hours of energy, which corresponds to around 610 grams of carbon dioxide.
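Both figures in this paragraph follow from simple arithmetic, as the Python sketch below shows: it reproduces the 15 percent ratio and derives the emission factor implied by the streaming example. Reading that factor as the carbon intensity of the underlying electricity mix is an inference, not something the Borderstep study states.

    # Reproduce the two arithmetic claims in the paragraph above.

    # Electricity cost comparison (euros per MWh, from the Bitkom study)
    germany_eur_per_mwh = 113.11
    netherlands_eur_per_mwh = 17.08
    ratio = netherlands_eur_per_mwh / germany_eur_per_mwh
    print(f"Dutch costs are {ratio:.0%} of German costs")  # -> 15%

    # Emission factor implied by the streaming example
    # (Borderstep figures: 1,300 Wh and 610 g CO2 per hour of 4K streaming)
    energy_kwh = 1.3      # 1,300 watt-hours
    co2_grams = 610
    intensity = co2_grams / energy_kwh
    print(f"Implied emission factor: about {intensity:.0f} g CO2 per kWh")
    # -> about 469 g CO2 per kWh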
