The growing influence of artificial intelligence (AI) not only raises ethical and technological questions, but also presents significant challenges in terms of energy consumption.
Sergey Edunov, head of generative AI engineering at Meta, the company behind Facebook, leads the training of the open-source language model Llama 2.
It was during a round table in Silicon Valley that Edunov shared his perspective on the energy consumption of generative AI.
Nuclear energy and Meta’s AI
According to the engineer, as reported by VentureBeat: "Nvidia will launch between one million and two million H100 GPUs next year."
“If all those GPUs were used to generate ‘tokens’ for reasonably sized language models, they would add up to around 100,000 tokens per person per day across the planet,” he added.
He explained that "each H100 consumes about 700 watts; adding data center cooling, around 1 kW. On a human scale that is not so much: it would only take two nuclear reactors to power all those H100s."
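Edunov's estimate can be checked with simple arithmetic. The sketch below uses the figures from his remarks (1–2 million GPUs at roughly 1 kW each including cooling) plus one assumption not stated in the article: that a typical nuclear reactor outputs on the order of 1 GW.

```python
# Back-of-envelope check of Edunov's estimate.
# Figures from his remarks; reactor output (~1 GW) is an assumption.
GPUS_HIGH = 2_000_000           # upper bound on H100s Nvidia ships next year
WATTS_PER_GPU = 700             # H100 power draw alone
WATTS_WITH_COOLING = 1_000      # ~1 kW per GPU including data-center cooling
REACTOR_OUTPUT_W = 1e9          # assumed ~1 GW per nuclear reactor

total_watts = GPUS_HIGH * WATTS_WITH_COOLING        # 2,000,000 kW = 2 GW
reactors_needed = total_watts / REACTOR_OUTPUT_W    # 2.0

print(f"Peak demand: {total_watts / 1e9:.1f} GW")
print(f"Reactors needed: {reactors_needed:.0f}")
```

At the upper bound this comes out to about 2 GW of continuous demand, which matches the "two nuclear reactors" figure under the 1 GW-per-reactor assumption.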
AI and sustainability
At a global level, concern about the environmental impact of AI is increasing.
The intensive use of graphics cards and data centers contributes directly to greenhouse gas emissions.
Meta is therefore not the only company exploring the connection between nuclear energy and AI; Microsoft has also announced plans to develop nuclear reactors to power its data centers and artificial intelligence systems.
Likewise, new regulations, such as government initiatives in California and the European Commission's proposal to make data centers carbon neutral by 2030, seek to require large technology companies to disclose and mitigate their carbon footprint.
These measures could accelerate the AI sector's transition towards zero-emission energy, although there is still a long way to go.