Computers have grown into energy gluttons, and it can’t go on like this

22 October, 2018

It’s natural to assume that the IT revolution will continue forward at a cracking pace, but what if there are limits on how much energy humanity can actually put towards it? That’s the focus of research by Professor Michael Fuhrer, who is speaking at the Materialise conference in Wellington this week.

The trend is known as Moore’s Law. At a simple level, it says the number of transistors in a dense integrated circuit should double roughly every two years. What that means in practice is that computing and information technology keep getting smaller and faster, and keep crunching more numbers. That pace of development has had dramatic consequences for how technologically advanced societies have evolved in recent years, enabling the widespread digitisation of infrastructure, commerce, government, our social lives and entertainment. It is no exaggeration to say that we now live in a digital world, and that this revolution has taken place in the space of just a few decades.
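
As a rough illustration of how that doubling compounds, here is a minimal Python sketch; the 1971 starting count and the fixed two-year doubling period are simplifying assumptions for illustration, not figures from the article.

```python
# Illustrative sketch of Moore's Law: transistor counts doubling every two years.
# The 1971 starting point of 2,300 transistors (the Intel 4004) and the fixed
# two-year doubling period are simplifying assumptions, not figures from the article.

def transistor_count(year, base_year=1971, base_count=2300, doubling_period=2):
    """Estimate transistor count for a given year under a fixed doubling period."""
    return base_count * 2 ** ((year - base_year) / doubling_period)

for year in (1971, 1991, 2011, 2018):
    print(year, f"{transistor_count(year):,.0f}")
```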

But is continuing to march further into the digital future sustainable? A desktop computer running at peak capacity might draw around 175 watts of power, compared with about 15 watts for a compact fluorescent lightbulb. Now picture an entire office building full of those desktop computers. Now multiply that by 100 for every building in the city. Now add the energy used to charge the smartphones sitting in every one of the office workers’ pockets. Now add the energy used by the servers that store all the data sitting on both sets of devices. Now add all the energy used to build the devices in the first place. Now add the electricity needed to power the various wifi networks. All of a sudden, using only the most basic examples, the energy that powers the digital revolution starts to add up dramatically.
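
To see how quickly those numbers compound, here is a rough back-of-envelope sketch in Python. The 175-watt desktop and 15-watt bulb are the figures quoted above; the number of desktops per building is an assumption made purely for illustration.

```python
# Back-of-envelope sketch of the office-computing example above.
# The 175 W desktop and 15 W bulb come from the article; the number of desktops
# per building and the 100 buildings are assumptions for illustration only.

desktop_watts = 175            # one desktop running at peak capacity
bulb_watts = 15                # one compact fluorescent lightbulb
desktops_per_building = 500    # assumed occupancy of one office building
buildings = 100                # "multiply that by 100" across the city

city_watts = desktop_watts * desktops_per_building * buildings
print(f"Desktops alone: {city_watts / 1e6:.2f} MW")            # 8.75 MW
print(f"Equivalent CFL bulbs: {city_watts / bulb_watts:,.0f}")  # ~583,000
```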

Or consider the more hot-button example of Bitcoin, which some consider the future of currency. The energy that powers the generation of new Bitcoins is astonishing – by one measurement, the network uses about as much energy as the whole of Ireland. The process known as Bitcoin mining heavily incentivises users to upgrade their computing power, to accelerate how quickly they can build up their stock of Bitcoins. As of 2017, the Bitcoin network emitted the equivalent of 17.7 million tonnes of carbon dioxide every year, and that figure was predicted to keep rising quickly. For a world grappling with the need to take action on climate change, it is a microcosm of how the increasing digitisation of all aspects of life could work against that goal. Around the world, about 8% of all energy use already goes into computing, and if that share keeps growing, the energy available for computing could become a real constraint.

It’s these sorts of questions that Monash University Professor Michael Fuhrer spends his time working on as director of an Australian initiative called FLEET – or, to give it its full title, the Australian Research Council Centre of Excellence for Future Low-Energy Electronics Technologies (you can see why they call it FLEET). He is an American physicist who has researched how different materials could be used to keep increasing computing capacity, and he will be speaking in Wellington at the Materialise: A Sustainable Future conference this week. (There’s also a group of New Zealand scientists who have found a way to take the heat off data centres.) Professor Fuhrer spoke to Alex Braae about the challenges the world faces in this area, and what solutions might look like.

Read the full article on The Spinoff.

Tags: the spinoff