Large Language Models
Proposed solutions to the high energy consumption of generative AI LLMs: optimized hardware, new algorithms, and green data centers
Introduction:
Many generative AI tools rely on a type of natural-language-processing model called the large language model (LLM), which first learns and then makes inferences about the languages and linguistic structures (such as code or legal-case prediction) used throughout the world. Companies that use LLMs include Anthropic (now collaborating with Amazon), Microsoft, OpenAI, Google, Amazon/AWS, Meta (Facebook), SAP, and IQVIA. Examples of LLMs include Google’s BERT, Amazon’s Bedrock, Falcon 40B, Meta’s Galactica, OpenAI’s GPT-3 and GPT-4, Google’s LaMDA, Hugging Face’s BLOOM, and Nvidia’s NeMo LLM.
The training process of the large language models (LLMs) used in generative artificial intelligence (AI) is a cause for concern: LLMs can consume many terabytes of training data and use over 1,000 megawatt-hours of electricity.
Alex de Vries, a Ph.D. candidate at VU Amsterdam and founder of the digital-sustainability blog Digiconomist, published a report in Joule predicting that current AI technology could be on track to annually consume as much electricity as the entire country of Ireland (29.3 terawatt-hours per year).
“As an already massive cloud market keeps on growing, the year-on-year growth rate almost inevitably declines,” John Dinsdale, chief analyst and managing director at Synergy, told CRN via email. “But we are now starting to see a stabilization of growth rates, as cloud provider investments in generative AI technology help to further boost enterprise spending on cloud services.”
Hardware vs. Algorithmic Solutions to Reduce Energy Consumption:
Roberto Verdecchia, an assistant professor at the University of Florence, is the first author of a paper on developing green AI solutions. He says that de Vries’s predictions may even be conservative when it comes to the true cost of AI, especially considering the non-standardized regulation surrounding this technology. AI’s energy problem has historically been approached through optimizing hardware, says Verdecchia. However, continuing to make microelectronics smaller and more efficient is becoming “physically impossible,” he added.
In his paper, published in the journal WIREs Data Mining and Knowledge Discovery, Verdecchia and colleagues highlight several algorithmic approaches that experts are taking instead. These include improving data-collection and processing techniques, choosing more-efficient libraries, and improving the efficiency of training algorithms. “The solutions report impressive energy savings, often at a negligible or even null deterioration of the AI algorithms’ precision,” Verdecchia says.
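To make one of these algorithmic levers concrete: mixed-precision training runs most forward- and backward-pass arithmetic in 16-bit rather than 32-bit floating point, cutting per-step compute and memory traffic. Below is a minimal PyTorch sketch of this technique (not drawn from Verdecchia’s paper; the model and data are placeholders, and a CUDA-capable GPU is assumed):

    # Mixed-precision training loop: one common energy-saving technique.
    # Placeholder model and random data; assumes a CUDA-capable GPU.
    import torch
    from torch import nn

    model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10)).cuda()
    opt = torch.optim.AdamW(model.parameters(), lr=1e-3)
    scaler = torch.cuda.amp.GradScaler()   # rescales loss so fp16 gradients don't underflow
    loss_fn = nn.CrossEntropyLoss()

    for step in range(100):                # placeholder training loop
        x = torch.randn(64, 512, device="cuda")
        y = torch.randint(0, 10, (64,), device="cuda")
        opt.zero_grad(set_to_none=True)
        with torch.cuda.amp.autocast():    # forward pass runs largely in fp16
            loss = loss_fn(model(x), y)
        scaler.scale(loss).backward()      # backward pass on the scaled loss
        scaler.step(opt)                   # unscales gradients, then updates weights
        scaler.update()

The same loop without autocast and the scaler would do every operation in 32-bit; the savings compound over the millions of steps a large training run requires.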
Another Solution – Data Centers Powered by Alternative Energy Sources:
The immense amount of energy needed to power these LLMs, like the one behind ChatGPT, is creating a new market for data centers that run on alternative energy sources such as geothermal, nuclear, and flared gas, a byproduct of oil production. The supply of electricity, which currently powers the vast majority of data centers, is already strained by existing demands on U.S. electric grids. AI could consume up to 3.5% of the world’s electricity by 2030, according to an estimate from IT research and consulting firm Gartner.
Amazon, Microsoft, and Google were among the first to explore wind- and solar-powered data centers for their cloud businesses, and they are now among the companies exploring new ways to power the next wave of AI-related computing. But experts warn that, given their high risk, cost, and difficulty scaling, many nontraditional sources aren’t capable of solving near-term power shortages.
Exafunction, maker of the Codeium generative AI-based coding assistant, sought out energy startup Crusoe Energy Systems to train its large language models because Crusoe offered better prices and availability of graphics processing units (GPUs), the advanced AI chips primarily produced by Nvidia, said Exafunction’s chief executive, Varun Mohan.
AI startups are typically looking for five to 25 megawatts of data center power, or as much as they can get in the near term, according to Pat Lynch, executive managing director for commercial real-estate services firm CBRE’s data center business. Crusoe will have about 200 megawatts by year’s end, said Crusoe CEO Chase Lochmiller. Training one AI model like OpenAI’s GPT-3 can use up to 10 gigawatt-hours of electricity, roughly equivalent to the amount 1,000 U.S. homes use in a year, University of Washington research estimates.
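That comparison is easy to sanity-check with back-of-the-envelope arithmetic, assuming the commonly cited average of roughly 10,000 kilowatt-hours of electricity per U.S. home per year (the exact household figure is an assumption here, not from the research cited above):

    # Does 10 gigawatt-hours really equal about 1,000 home-years of electricity?
    gpt3_training_kwh = 10_000_000      # 10 GWh expressed in kWh
    home_kwh_per_year = 10_000          # assumed U.S. household annual average
    print(gpt3_training_kwh / home_kwh_per_year)   # -> 1000.0 home-years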
Major cloud providers capable of providing multiple gigawatts of power are also continuing to invest in renewable and alternative energy sources to power their data centers, and use less water to cool them down. By some estimates, data centers account for 1% to 3% of global electricity use.
An Amazon Web Services spokesperson said the scale of its massive data centers means it can make better use of resources and be more efficient than smaller, privately operated data centers. Amazon says it has been the world’s largest corporate buyer of renewable energy for the past three years.
Jen Bennett, a Google Cloud leader in technology strategy for sustainability, said the cloud giant is exploring “advanced nuclear” energy and has partnered with Fervo Energy, a startup beginning to offer geothermal power for Google’s Nevada data center. Geothermal, which taps heat under the earth’s surface, is available around the clock and not dependent on weather, but comes with high risk and cost.
“Similar to what we did in the early days of wind and solar, where we did these large power purchase agreements to guarantee the tenure and to drive costs down, we think we can do the same with some of the newer energy sources,” Bennett said.
References:
https://aws.amazon.com/what-is/large-language-model/
https://spectrum.ieee.org/ai-energy-consumption
https://www.crn.com/news/cloud/microsoft-aws-google-cloud-market-share-q3-2023-results/6
SK Telecom and Deutsche Telekom to Jointly Develop Telco-specific Large Language Models (LLMs)
SK Telecom and Deutsche Telekom announced that they have signed a Letter of Intent (LOI) to jointly develop a telco-specific Large Language Model (LLM) that will enable global telecommunication companies (telcos) to develop generative AI models easily and quickly. The LOI signing ceremony took place at the SK Seorin Building in Seoul, attended by key executives from both companies, including Ryu Young-sang, CEO of SKT; Chung Suk-geun, Chief AI Global Officer of SKT; Tim Höttges, CEO of Deutsche Telekom; Claudia Nemat, Board Member for Technology and Innovation at Deutsche Telekom; and Jonathan Abrahamson, Chief Product and Digital Officer of Deutsche Telekom.
This marks the first concrete outcome of discussions held by the Global Telco AI Alliance, which was launched by SKT, Deutsche Telekom, e&, and Singtel in July 2023, and it lays the foundation for entering the global market. SKT and Deutsche Telekom plan to collaborate with AI companies such as Anthropic (Claude 2) and Meta (Llama 2) to co-develop a multilingual large language model (LLM), covering German, English, Korean, and other languages, tailored to the needs of telcos. They plan to unveil the first version of the telco-specific LLM in the first quarter of 2024.
The telco-specific LLM will have a better understanding of telecommunication service-related areas and of customers’ intentions than general-purpose LLMs, making it well suited to customer services such as AI contact centers. The goal is to support telcos across the world, including in Europe, Asia, and the Middle East, in flexibly developing generative AI services such as AI agents according to their respective environments. That will enable telcos to save both the time and the cost of developing large platforms, and to secure new business opportunities and growth engines through AI innovation that shifts the paradigm of the traditional telecommunications industry. To this end, SKT and Deutsche Telekom plan to jointly develop AI platform technologies that telcos can use to create generative AI services, reducing both development time and cost.
For instance, a telco that wants to build an AI contact center based on generative AI will be able to build one suited to its own environment more quickly and flexibly. In addition, AI can be applied to other areas such as network monitoring and on-site operations to increase efficiency, resulting in cost savings over the mid to long term.
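Neither company has published implementation details, but in broad strokes a telco-specific LLM is typically produced by fine-tuning an open base model on domain data. A minimal sketch, assuming the Hugging Face transformers and datasets libraries, Meta’s Llama 2 as the base checkpoint, and a hypothetical telco_support.jsonl file of multilingual support transcripts:

    # Illustrative only: the base model, data file, and hyperparameters are
    # assumptions, not details disclosed by SKT or Deutsche Telekom.
    from datasets import load_dataset
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer,
                              TrainingArguments)

    base = "meta-llama/Llama-2-7b-hf"          # assumed open base checkpoint
    tok = AutoTokenizer.from_pretrained(base)
    tok.pad_token = tok.eos_token              # Llama 2 ships without a pad token
    model = AutoModelForCausalLM.from_pretrained(base)

    # Hypothetical multilingual support transcripts, one {"text": ...} per line.
    data = load_dataset("json", data_files="telco_support.jsonl", split="train")
    data = data.map(lambda r: tok(r["text"], truncation=True, max_length=512),
                    batched=True, remove_columns=data.column_names)

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="telco-llm", num_train_epochs=1,
                               per_device_train_batch_size=2, bf16=True),
        train_dataset=data,
        data_collator=DataCollatorForLanguageModeling(tok, mlm=False),  # builds causal-LM labels
    )
    trainer.train()

In practice a parameter-efficient method such as LoRA would likely be layered on top, which would also keep training cost, and hence energy use, down.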
Through this collaboration, the two companies will proactively respond to the recent surge in AI demand from telcos, while also promoting the expansion of the global AI ecosystem through the successful introduction of generative AI optimized for specific industries or domains.
“AI shows impressive potential to significantly enhance human problem-solving capabilities. To maximize its use especially in customer service, we need to adapt existing large language models and train them with our unique data. This will elevate our generative AI tools,” says Claudia Nemat, Member of the Board of Management for Technology and Innovation at Deutsche Telekom.
“Through our partnership with Deutsche Telekom, we have secured a strong opportunity and momentum to gain global AI leadership and drive new growth,” said Ryu Young-sang, CEO of SKT. “By combining the strengths and capabilities of the two companies in AI technology, platform and infrastructure, we expect to empower enterprises in many different industries to deliver new and higher value to their customers.”
References: