Tech Giants Look to Curb AI's Energy Demands: The Kiplinger Letter

The expansion of AI is pushing tech giants to explore new ways to reduce energy use, while also providing more transparency about AI's energy consumption.

To help you understand how AI and other new technologies are affecting energy consumption, the trends in this space and what we expect to happen in the future, our highly experienced Kiplinger Letter team will keep you abreast of the latest developments and forecasts. (Get a free issue of The Kiplinger Letter or subscribe.) You'll get all the latest news first by subscribing, but we will publish many (but not all) of the forecasts a few days afterward online. Here's the latest…

The rise of AI is pushing tech giants to find new ways to curb energy use. Firms like Alphabet and Microsoft have long strived for energy efficiency, but AI chips are extra power-hungry, and their energy demands will be unsustainable without big changes.

Cue new tools that help users reduce energy usage. Researchers at the Massachusetts Institute of Technology (MIT) are developing tools that lower power needs with simple techniques, such as capping the amount of power the hardware is allowed to draw.
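
To make the power-capping idea concrete, here is a minimal, hypothetical sketch of how a GPU's power limit can be capped in Python using the nvidia-ml-py (pynvml) bindings to NVIDIA's management library. This is not the MIT researchers' tooling, and the 150-watt cap is an arbitrary example value; setting a limit typically requires administrator privileges.

```python
# Minimal sketch of hardware power capping via NVML (nvidia-ml-py / pynvml).
# Illustrative only; not the MIT researchers' actual tools.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Query the allowed power-limit range for this card (values are in milliwatts).
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

# Example: cap the card at 150 W, or at the minimum the hardware allows.
target_mw = max(min_mw, 150_000)
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)

current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
print(f"Power limit set to {current_mw / 1000:.0f} W "
      f"(allowed range: {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")

pynvml.nvmlShutdown()
```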

The researchers have found that such tweaks don't hinder the AI's performance. Another idea is optimizing the mix of AI chips and traditional ones for efficiency. Though it could take a while, expect more energy transparency around AI. Users will eventually get an energy report along with their answers from ChatGPT, a sign of just how large-scale AI has become.

According to an article in Scientific American, if current trends in AI capacity and adoption continue, NVIDIA, a leader in AI computing, will be shipping 1.5 million AI server units per year by 2027. Running at full capacity, those 1.5 million servers would consume at least 85.4 terawatt-hours of electricity annually, more than many small countries use in a year.
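
As a rough, back-of-the-envelope illustration of what that total implies (our own arithmetic, not a figure from the article), dividing 85.4 terawatt-hours across 1.5 million servers running around the clock works out to roughly 6.5 kilowatts of continuous draw per server:

```python
# Back-of-the-envelope check of the projected AI server energy figures.
# Assumes all servers run continuously at full power for an entire year.

servers = 1_500_000          # projected annual NVIDIA AI server shipments by 2027
annual_twh = 85.4            # projected total consumption, terawatt-hours per year
hours_per_year = 24 * 365    # 8,760 hours

annual_kwh = annual_twh * 1e9                    # 1 TWh = 1 billion kWh
kwh_per_server = annual_kwh / servers            # ~56,900 kWh per server per year
kw_per_server = kwh_per_server / hours_per_year  # ~6.5 kW of continuous draw

print(f"Implied continuous draw per server: {kw_per_server:.1f} kW")
```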

This forecast first appeared in The Kiplinger Letter, which has been running since 1923. It is a collection of concise weekly forecasts on business and economic trends, as well as what to expect from Washington, to help you understand what's coming up so you can make the most of your investments and your money. Subscribe to The Kiplinger Letter.

John Miley
Senior Associate Editor, The Kiplinger Letter

John Miley is a Senior Associate Editor at The Kiplinger Letter. He mainly covers technology, telecom and education, but will jump on other important business topics as needed. In his role, he provides timely forecasts about emerging technologies, business trends and government regulations. He also edits stories for the weekly publication and has written and edited e-mail newsletters.

He joined Kiplinger in August 2010 as a reporter for Kiplinger's Personal Finance magazine, where he wrote stories, fact-checked articles and researched investing data. After two years at the magazine, he moved to the Letter, where he has been for the last decade. He holds a BA from Bates College and a master’s degree in magazine journalism from Northwestern University, where he specialized in business reporting. An avid runner and a former decathlete, he has written about fitness and competed in triathlons.