It’s difficult to predict exactly how artificial intelligence will change our lives in the coming years, but one thing seems clear: It will place big demands on energy supplies. AI systems require large amounts of computing power, relying on servers in vast, purpose-built data centers.
Remarkably, this situation is now brightening the prospects for nuclear power. Amazon, Microsoft and Google recently signed deals to purchase nuclear energy, including from small modular reactors. All of these deals, under which power would not start flowing for years, are based on nuclear fission, the process that has powered reactors for decades.
However, some in the tech industry think a breakthrough in nuclear fusion could address the spiraling power demand and climate impact of AI. While fission releases energy by splitting heavy atomic nuclei into lighter ones, fusion does so by combining two light nuclei into a heavier one. The same basic process powers the Sun.
In theory, fusion can produce large amounts of power from small amounts of fuel, emits no greenhouse gases during operation and yields little long-lived radioactive waste. But dig below the surface and the reality isn’t so rosy, writes Sophie Cogan, who studies the ethical dimensions of fusion energy. Fusion has not yet been shown to be economically viable, and scientists have yet to get more energy out of the process than it takes to run the reactor.
AI could actually help tackle climate change, improving our ability to forecast extreme weather or closely track greenhouse gas emissions. To some, this might make AI’s big energy footprint justifiable. But Cogan says that technology giants need to consider approaches to fusion that meet the general population’s energy needs, rather than just those of data centers. It’s also important to note that fusion is not a technology we can deploy now – it will need more development before it can meet either goal.