Artificial intelligence – the energy challenge for data centres

Artificial intelligence is driving both data centre growth and energy efficiency.
Published: Wed 26 Sep 2018

The energy impacts of blockchain, and of bitcoin mining in particular, are by now well known. But what about another new technology that is fast finding widespread application – artificial intelligence (AI)?

With its power to provide deep insights based on significant computing capability, the technology is turning into something of a double-edged sword as it impacts the energy system. On the one hand, AI is driving growth in the number of data centres needed to process all the associated data. On the other, it is being harnessed to make those energy-guzzling centres more efficient.

Ultimately, however, the upshot is yet another energy-consuming load in an increasingly electrified world, one that brings with it both challenges and opportunities for utilities in areas such as supply, energy efficiency and flexibility.

AI and data centres

With the move to digitalisation and the shifting of data and analytics to the cloud, the use of data centres has mushroomed, with the numbers now running into the millions worldwide – and growing. In Sweden, for example, according to Christoffer Svanberg, Chief Marketing Officer of the investment and development hub Node Pole, the country’s data centre industry is currently growing at around 14% year on year, with no signs of slowing down.

Indeed, although not putting a number to it, Svanberg anticipates that AI will push the data centre industry into a much more rapid expansion in the future.

While statistics on data centres, their growth and any correlation with the growth of AI appear to be limited, this is clearly an issue that countries and companies need to consider as AI advances. Svanberg, for his part, would like to see companies invest in Sweden’s clean energy resources – hydropower in particular is abundant – to ensure future sustainability.

As an example of a company that has taken the leap, Facebook, which also makes extensive use of AI, has located the first of its two data centre campuses outside the US in Luleå in northern Sweden, where Node Pole is based. Thanks to the reliability of the regional grid serving the town, Facebook was able to reduce the number of backup generators by 70% compared with its US facilities, and the cold outside air – Luleå is a mere 100km from the Arctic Circle – is tapped for cooling.

The data centre energy challenge

The key challenge with data centres is optimising energy usage, both for the banks of servers and other devices that run constantly and for the cooling all of this equipment requires.

Currently data centres consume approximately 200TWh of electricity per year, corresponding to about 1% of global electricity demand. However, this is expected to rise considerably. A 2017 analysis by Huawei researcher Anders Andrae projects data centre traffic to increase from around 10ZB/year currently to almost 180ZB/year by 2025, with data centre electricity usage rising to between a best case of 1,200TWh and as much as 3,400TWh/year, approaching 8% of global electricity demand at that time.
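
As a rough sanity check on those percentages, the short calculation below reproduces them; the global electricity demand figures used as denominators (roughly 20,000TWh today and 42,500TWh projected for 2025) are illustrative assumptions rather than figures from the article or from Andrae’s analysis.

    # Rough check on the shares quoted above. The global demand denominators
    # are illustrative assumptions, not figures from the article.
    def share_of_demand(data_centre_twh: float, global_twh: float) -> float:
        """Data centre consumption as a percentage of global electricity demand."""
        return 100 * data_centre_twh / global_twh

    # Today: ~200TWh against an assumed ~20,000TWh of global electricity demand
    print(f"Current share: {share_of_demand(200, 20_000):.1f}%")      # ~1.0%

    # 2025 worst case: 3,400TWh against an assumed ~42,500TWh of projected demand
    print(f"2025 worst case: {share_of_demand(3_400, 42_500):.1f}%")  # ~8.0%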

One novel approach to the cooling challenge is being investigated by Microsoft in its Project Natick, which was launched in 2015 to test the feasibility of under-sea data centres. Cooling requirements are all but eliminated by the low water temperatures, seasonal variations are minimal and, moreover, the modules can be located in close proximity to large numbers of users.

In phase 2, which is currently underway, a full-scale module comprising 864 standard data centre servers and 27.6PB of storage is being tested off the coast of Scotland, powered by renewables from the Orkney grid. In the future, however, the intention is to trial powering it directly from offshore renewables.

While the long-term feasibility of such an approach still needs to be proven – Microsoft is aiming for a five-year subsea lifetime – other efficiency measures are already being implemented, most notably AI itself.

Microsoft is among the companies pursuing this approach, claiming its Cloud is between 22% and 93% more energy efficient than traditional enterprise data centres thanks to gains in IT operational efficiency, IT equipment efficiency and data centre infrastructure efficiency. For example, AI is employed for the dynamic provisioning of server resources, better matching server capacity to demand and maximising server utilisation rates.
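
A minimal sketch of the idea behind such demand-based provisioning is given below; the capacity and headroom figures, the simple load forecast and the function names are illustrative assumptions and do not describe Microsoft’s actual system.

    # Illustrative sketch of demand-based provisioning: keep only as many servers
    # active as a load forecast requires, plus some headroom, so the machines that
    # stay on run at high utilisation. All figures here are assumptions.
    import math

    SERVER_CAPACITY_RPS = 500   # requests/second one server can handle (assumed)
    HEADROOM = 1.2              # 20% spare capacity for forecast error (assumed)

    def servers_needed(forecast_rps: float) -> int:
        """Active servers required for the forecast load."""
        return max(1, math.ceil(forecast_rps * HEADROOM / SERVER_CAPACITY_RPS))

    # As forecast demand changes through the day, idle servers can be powered down,
    # raising utilisation on the servers that remain active.
    for hour, forecast in [(3, 2_000), (9, 12_000), (14, 18_000), (21, 7_500)]:
        n = servers_needed(forecast)
        utilisation = 100 * forecast / (n * SERVER_CAPACITY_RPS)
        print(f"{hour:02d}:00  forecast={forecast} rps  servers={n}  utilisation={utilisation:.0f}%")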

Another is Google, whose DeepMind subsidiary has been working on introducing AI into data centre operations for several years. In 2016 the company developed an AI-based framework for data centre energy efficiency, which is now implemented in multiple centres and currently delivers savings averaging around 30%, a figure expected to increase as more and more data is gathered.

DeepMind’s cloud-based AI pulls a snapshot of the data centre cooling system every five minutes from thousands of sensors, which is then analysed to predict how different combinations of potential actions will affect future energy consumption. The AI system then identifies which actions will minimise the energy consumption while satisfying a robust set of safety constraints. Those actions are sent back to the data centre, where they are verified by the local control system and then implemented.
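
In outline, that loop might look like the sketch below; the function names and parameters are illustrative placeholders rather than details drawn from DeepMind’s published description.

    # Schematic of the control loop described above: every five minutes, take a
    # sensor snapshot, score candidate cooling actions with a learned energy model,
    # discard any action that violates the safety constraints, and send the best
    # remaining action to the local control system for verification.
    import time

    def control_loop(get_snapshot, candidate_actions, predict_energy,
                     is_safe, send_to_local_controller, interval_s=300):
        while True:
            snapshot = get_snapshot()                  # thousands of sensor readings
            safe = [a for a in candidate_actions(snapshot) if is_safe(snapshot, a)]
            if safe:
                best = min(safe, key=lambda a: predict_energy(snapshot, a))
                send_to_local_controller(best)         # verified locally, then implemented
            time.sleep(interval_s)                     # ~5-minute cycle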

Data centres are, of course, just one area where AI is being applied in the energy sector, and clearly the potential is significant. Moreover, these solutions also have wider applicability in other industrial settings, paving the way for AI to become a key driver of more widespread energy efficiency.