Proposed solutions to high energy consumption of Generative AI LLMs: optimized hardware, new algorithms, green data centers

Introduction:

Many generative AI tools rely on a type of natural-language processing model called the large language model (LLM), which first learns from, and then makes inferences about, the languages and language-like structures (such as code, or text used for legal-case prediction) found throughout the world. Companies using LLMs include Anthropic (now collaborating with Amazon), Microsoft, OpenAI, Google, Amazon/AWS, Meta (Facebook), SAP, and IQVIA. Examples of LLMs include Google’s BERT and LaMDA, OpenAI’s GPT-3 and GPT-4, Meta’s Galactica, TII’s Falcon 40B, Hugging Face’s BLOOM, Nvidia’s NeMo LLM, and the foundation models offered through Amazon’s Bedrock service.

The training process of the large language models (LLMs) used in generative artificial intelligence (AI) is a cause for concern: training a single model can consume many terabytes of data and use over 1,000 megawatt-hours of electricity.

Alex de Vries, a Ph.D. candidate at VU Amsterdam and founder of the digital-sustainability blog Digiconomist, published a report in Joule predicting that current AI technology could be on track to annually consume as much electricity as the entire country of Ireland (29.3 terawatt-hours per year).
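As a rough sanity check on that figure (a back-of-envelope sketch, assuming the standard 8,760 hours in a year), Ireland-scale annual consumption corresponds to a continuous power draw of a few gigawatts:

```python
# Convert de Vries's annual estimate into an average continuous power draw.
twh_per_year = 29.3                 # projected annual AI electricity consumption
hours_per_year = 8_760              # 24 hours * 365 days
avg_power_gw = twh_per_year * 1_000 / hours_per_year  # TWh -> GWh, spread over the year
print(f"~{avg_power_gw:.1f} GW of continuous draw")   # prints ~3.3 GW
```

That is on the order of three large (roughly 1-gigawatt) power plants running around the clock.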

That demand is tied to a cloud market that keeps expanding. “As an already massive cloud market keeps on growing, the year-on-year growth rate almost inevitably declines,” John Dinsdale, chief analyst and managing director at Synergy, told CRN via email. “But we are now starting to see a stabilization of growth rates, as cloud provider investments in generative AI technology help to further boost enterprise spending on cloud services.”

Hardware vs. Algorithmic Solutions to Reduce Energy Consumption:

Roberto Verdecchia, an assistant professor at the University of Florence and first author of a paper on developing green AI solutions, says that de Vries’s predictions may even be conservative when it comes to the true cost of AI, especially given the non-standardized regulation surrounding the technology.  AI’s energy problem has historically been approached through optimizing hardware, says Verdecchia. However, continuing to make microelectronics smaller and more efficient is becoming “physically impossible,” he adds.

In his paper, published in the journal WIREs Data Mining and Knowledge Discovery, Verdecchia and colleagues highlight several algorithmic approaches that experts are taking instead. These include improving data-collection and processing techniques, choosing more-efficient libraries, and improving the efficiency of training algorithms.  “The solutions report impressive energy savings, often at a negligible or even null deterioration of the AI algorithms’ precision,” Verdecchia says.
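To make one of those levers concrete, here is a minimal sketch (not taken from Verdecchia’s paper) of mixed-precision training in PyTorch, one widely used way to improve training efficiency by running most of the forward pass in 16-bit floating point; the tiny model and synthetic data are placeholders for a real LLM workload:

```python
# Minimal sketch of mixed-precision training with PyTorch AMP.
import torch
from torch import nn

device = "cuda" if torch.cuda.is_available() else "cpu"
use_amp = device == "cuda"  # float16 autocast pays off on GPUs

# Placeholder model and optimizer standing in for a real LLM.
model = nn.Sequential(nn.Linear(512, 2048), nn.GELU(), nn.Linear(2048, 512)).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler(enabled=use_amp)

for step in range(100):
    x = torch.randn(32, 512, device=device)   # synthetic batch
    y = torch.randn(32, 512, device=device)
    optimizer.zero_grad()
    # Autocast runs eligible ops in float16, cutting compute and memory traffic.
    with torch.autocast(device_type=device, enabled=use_amp):
        loss = nn.functional.mse_loss(model(x), y)
    scaler.scale(loss).backward()  # loss scaling prevents float16 gradient underflow
    scaler.step(optimizer)
    scaler.update()
```

On GPUs with hardware float16 support, a change like this can cut the time, and therefore the energy, needed per training step, typically with little or no loss of model quality, which matches the trade-off the paper describes.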


Another Solution – Data Centers Powered by Alternative Energy Sources:

The immense amount of energy needed to power these LLMs, like the one behind ChatGPT, is creating a new market for data centers that run on alternative energy sources such as geothermal, nuclear, and flared gas (a byproduct of oil production). Grid electricity, which currently powers the vast majority of data centers, is already strained by existing demand. AI could consume up to 3.5% of the world’s electricity by 2030, according to an estimate from IT research and consulting firm Gartner.

Amazon, Microsoft, and Google were among the first to explore wind- and solar-powered data centers for their cloud businesses, and they are now among the companies exploring new ways to power the next wave of AI-related computing. But experts warn that, given their high risk, cost, and difficulty of scaling, many nontraditional sources cannot solve near-term power shortages.

Exafunction, maker of the Codeium generative-AI coding assistant, chose energy startup Crusoe Energy Systems to train its large language models because Crusoe offered better prices and availability of graphics processing units (GPUs), the advanced AI chips primarily produced by Nvidia, said Exafunction’s chief executive, Varun Mohan.

AI startups are typically looking for five to 25 megawatts of data center power, or as much as they can get in the near term, according to Pat Lynch, executive managing director for commercial real-estate services firm CBRE’s data center business. Crusoe will have about 200 megawatts by year’s end, said Crusoe CEO Chase Lochmiller. Training one AI model like OpenAI’s GPT-3 can use up to 10 gigawatt-hours, roughly equivalent to the amount of electricity 1,000 U.S. homes use in a year, according to University of Washington research.
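As a quick check on that comparison (assuming an average U.S. household uses about 10.5 megawatt-hours of electricity per year, in line with EIA figures):

```python
# How many U.S. homes' annual electricity use equals one GPT-3-scale training run?
training_gwh = 10             # upper-bound training estimate cited above
home_mwh_per_year = 10.5      # assumed average U.S. household consumption
homes = training_gwh * 1_000 / home_mwh_per_year  # GWh -> MWh, divided per home
print(f"~{homes:.0f} homes")  # prints ~952, consistent with "roughly 1,000 homes"
```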

Major cloud providers capable of providing multiple gigawatts of power are also continuing to invest in renewable and alternative energy sources to power their data centers, and use less water to cool them down. By some estimates, data centers account for 1% to 3% of global electricity use.

An Amazon Web Services spokesperson said the scale of its massive data centers means it can make better use of resources and be more efficient than smaller, privately operated data centers. Amazon says it has been the world’s largest corporate buyer of renewable energy for the past three years.

Jen Bennett, a Google Cloud leader in technology strategy for sustainability, said the cloud giant is exploring “advanced nuclear” energy and has partnered with Fervo Energy, a startup beginning to offer geothermal power for Google’s Nevada data center. Geothermal, which taps heat under the earth’s surface, is available around the clock and not dependent on weather, but comes with high risk and cost.

“Similar to what we did in the early days of wind and solar, where we did these large power purchase agreements to guarantee the tenure and to drive costs down, we think we can do the same with some of the newer energy sources,” Bennett said.

References:

AWS, “What is a Large Language Model?”: https://aws.amazon.com/what-is/large-language-model/

IEEE Spectrum, on AI’s energy consumption: https://spectrum.ieee.org/ai-energy-consumption

The Wall Street Journal, “AI’s Power-Guzzling Habits Drive Search for Alternative Energy Sources”: https://www.wsj.com/articles/ais-power-guzzling-habits-drive-search-for-alternative-energy-sources-5987a33a

CRN, on Microsoft, AWS, and Google cloud market share (Q3 2023): https://www.crn.com/news/cloud/microsoft-aws-google-cloud-market-share-q3-2023-results/6

Amdocs and NVIDIA to Accelerate Adoption of Generative AI for $1.7 Trillion Telecom Industry

SK Telecom and Deutsche Telekom to Jointly Develop Telco-specific Large Language Models (LLMs)

AI Frenzy Backgrounder; Review of AI Products and Services from Nvidia, Microsoft, Amazon, Google and Meta; Conclusions

 

2 thoughts on “Proposed solutions to high energy consumption of Generative AI LLMs: optimized hardware, new algorithms, green data centers”

  1. I think AI-assisted design and decision-making will expedite nuclear energy production, help address global warming, reduce energy consumption, and even speed up chip design. Nvidia develops a new SoC generation every 6 months while other semiconductor companies take about 2 years. The potential of AI is to assist everything in the future.

    DeepMind’s AlphaFold has solved the protein-folding problem. The ability to predict the shape of proteins is essential for addressing numerous scientific challenges, from vaccine and drug development to curing genetic diseases. But in the 50-plus years since the protein-folding problem was first posed, scientists had made frustratingly little progress.

    I am more concerned about how humans and AI will finally meet at a singularity, and what humanity will be defined by at that point. Most predictions have already been shortened from 2060 to the 2030s. In the near future, quantum computing with an AI assist could change the whole world beyond our imagination.

    Governments must ensure that AI systems do no harm. There is always someone smart enough, but without ethics, to create an AI that destroys humanity. It could be worse than an atomic bomb!

  2. Great article, Alan.

    I agree with Bob’s comment. In addition to power consumption, the benefits of AI must be looked at to see the entire picture: productivity gains, growing the pie (improved quality of life), improved efficiencies (e.g., detecting patterns in grid energy consumption and shifting loads), and the use of fuel/energy that would otherwise be wasted (e.g., powering distributed data centers in the hinterlands instead of flaring methane).

    Buttressing the central point of this article, the Open Source community is concerned about the recent White House Executive Order on AI and the advantage it potentially provides to the incumbents and large entities.

    https://twitter.com/martin_casado/status/1720517026538778657

    It seems like the real advantage is owning large datasets to train the algorithms. This would seem to provide an advantage to larger entities, whether in technology fields or outside of them, like insurance.

    Tesla/X.ai with its Grok should be added to the list of formidable competitors. Tesla has 150 million miles of self-driving data (plus countless hours of recorded human driving).

    https://www.teslarati.com/tesla-fsd-beta-150-million-miles/

    It could probably recreate every city in America from its training data and make a business of selling special effect footage to Hollywood generated from that data.

    The X.ai Grok will supposedly operate on Teslas as a personal assistant. Perhaps the Tesla vehicles will even operate as distributed data centers.

    https://www.teslarati.com/tesla-native-xai-grok/

    Grok will also be trained on X/Twitter’s live data, so it could effectively become an AI-media company.

    I share Bob’s concerns about the potential implications of generative AI. At the same time, perhaps it is too late as the genie may be out of the bottle.

    From a personal standpoint, the use of generative AI, whether to assist with writing, video editing, or image generation has been a great help. My fear is that it becomes a crutch that weakens my ability to think and communicate.
