What Do Google’s AI Answers Cost the Environment?

Google is bringing AI answers to a billion people this year, but generative AI requires much more energy than traditional keyword searches

Photo illustration of a server room.

Cosmin4000/Getty Images

What medications cure headaches? Should you kill cicadas? If you Google these questions in the U.S., answers may pop up directly at the top of the results page—products of “AI Overviews,” the latest artificial intelligence offerings from the search engine giant. Google, which handles more than 90 percent of Internet searches worldwide, has promised to bring this feature to one billion people by the end of 2024. Other tech companies may follow suit; Microsoft’s search engine Bing has also introduced a similar AI option.

Gemini, Google’s family of large language models, generates the text found in AI Overviews. These systems are powerful but not flawless. In the weeks since the feature’s debut, it has recommended gluing cheese to pizza, for instance, and changing a car’s blinker fluid.* But these early flubs, which Google chalked up to misinterpretations or a lack of data, have so far overshadowed another possible drawback: the financial and environmental cost of AI search. Embedding generative AI in such a widely used application is likely to deepen the tech sector’s hunger for fossil fuels and water.

Compared with traditional search engines, AI uses “orders of magnitude more energy,” says Sasha Luccioni of the AI research company Hugging Face, who studies how these technologies affect the environment. “It just makes sense, right?” While a mundane search query finds existing data from the Internet, she says, applications like AI Overviews must create entirely new information; Luccioni’s team has estimated that it takes about 30 times as much energy to generate text as to simply extract it from a source.



Big tech companies typically do not disclose the resources required to run their models, but outside researchers such as Luccioni have come up with estimates (though these numbers are highly variable and depend on an AI’s size and its task). She and her colleagues calculated that the large language model BLOOM emitted greenhouse gases equivalent to 19 kilograms of CO2 per day of use, or the amount generated by driving 49 miles in an average gas-powered car. They also found that generating two images with AI could use as much energy as the average smartphone charge. Others have estimated in research posted on the preprint server arXiv.org that every 10 to 50 responses from ChatGPT running GPT-3 evaporate the equivalent of a bottle of water to cool the AI’s servers.
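These comparisons are back-of-envelope estimates, and the driving equivalence can be sanity-checked with a quick calculation. The per-mile emissions figure below is an assumption on our part (roughly 390 grams of CO2 per mile, in line with commonly cited U.S. fleet averages), not a number from the study:

```python
# Sanity-check the article's driving comparison for BLOOM's daily emissions.
CO2_PER_DAY_KG = 19.0        # BLOOM's estimated emissions per day of use (from the article)
CAR_KG_CO2_PER_MILE = 0.39   # assumed average gas-car emissions (~390 g/mile)

equivalent_miles = CO2_PER_DAY_KG / CAR_KG_CO2_PER_MILE
print(f"{equivalent_miles:.0f} miles")  # ~49 miles, matching the article

# The water estimate likewise works out to a per-response range,
# assuming a standard 500-milliliter bottle:
BOTTLE_ML = 500
low, high = BOTTLE_ML / 50, BOTTLE_ML / 10
print(f"{low:.0f} to {high:.0f} mL of water per ChatGPT response")
```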

Such demands translate to financial costs. John Hennessy, chair of Google’s parent company Alphabet, told Reuters last year that an exchange with a large language model could cost 10 times more than a traditional search, though he predicted that those costs would decrease as the models are fine-tuned. Analysts at Morgan Stanley estimated that if AI generated 50-word answers in response to 50 percent of queries, it could cost Google $6 billion per year. (Alphabet reported $300 billion in revenue in 2023.)
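An estimate of this kind multiplies search volume by the share of queries that get an AI answer and the incremental cost per answer. The sketch below uses illustrative assumptions (the search volume and per-answer cost are ours; Morgan Stanley’s actual inputs were not published):

```python
# Rough reconstruction of a Morgan Stanley-style cost estimate.
# The two inputs below are illustrative assumptions, not the analysts' figures.
SEARCHES_PER_DAY = 9e9          # assumed global Google search volume
AI_ANSWER_SHARE = 0.5           # 50 percent of queries get an AI answer (from the article)
COST_PER_AI_ANSWER = 0.0036     # assumed incremental cost per answer, in dollars

annual_cost = SEARCHES_PER_DAY * 365 * AI_ANSWER_SHARE * COST_PER_AI_ANSWER
print(f"${annual_cost / 1e9:.1f} billion per year")  # on the order of $6 billion
```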

When asked about the costs of generative AI compared to traditional search, a Google spokesperson told Scientific American that “machine costs associated with [generative] AI responses have decreased 80 percent from when [the feature was] first introduced in [Google’s feature incubator] Labs, driven by hardware, engineering and technical breakthroughs.”

Data centers, including those that house AI servers, currently account for about 1.5 percent of global energy usage, and their consumption is projected to double by 2026, at which point they may collectively use as much power as the country of Japan does today. Generative AI itself is expected to consume 10 times more energy in 2026 than it did in 2023. To accommodate these growing power needs, Google and other big tech companies are building out their computing infrastructure. Google is spending billions on new data centers in Missouri, Indiana and Iowa. Microsoft and OpenAI, the company behind ChatGPT, are meanwhile working on a $100 billion plan to increase their power capacity.

Many tech companies say they plan to shift away from using fossil fuels for their growing computing needs. IBM, for example, has said that renewable energy accounted for 74 percent of its data center energy usage in 2023. Google has set a goal to match 100 percent of its energy usage each hour with purchases of carbon-free energy by 2030, and it achieved an average of 64 percent in 2022, the most recent year for which data are available. This number was relatively flat compared with that of 2021 “despite [a] significant expansion of carbon-free energy supply ... because of the significant growth in our total electricity load,” according to the company’s 2023 Environmental Report.

A major challenge in switching to renewable energy is that data centers’ need for constant, stable power often does not match the rhythm of renewable energy production, which fluctuates—sometimes unpredictably—throughout the day, explains Shaolei Ren, a computer engineer at the University of California, Riverside, who studies sustainable AI. To support the demand for stable power, fossil-fuel plants are staying online longer than planned, especially in Virginia’s “Data Center Alley” and other tech-dense areas of the country.

“There’s a supply and demand mismatch for this renewable energy,” says Benjamin Lee, a computer scientist at the University of Pennsylvania, who studies AI’s environmental impact. He adds that data centers may solve this problem in the future by scheduling their computing needs to match the availability of renewable energy or by investing in batteries to store it.
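The scheduling idea Lee describes is sometimes called carbon-aware computing: deferrable jobs are shifted toward hours when the grid is cleanest. A toy sketch of the logic, with an entirely hypothetical carbon-intensity forecast (real schedulers are far more sophisticated):

```python
# Toy carbon-aware scheduler: given an hourly forecast of grid carbon
# intensity (gCO2/kWh, hypothetical values), pick the cleanest hours
# in which to run a deferrable batch job.

def pick_greenest_hours(intensity_by_hour, hours_needed):
    """Return the hours with the lowest forecast carbon intensity, in order."""
    ranked = sorted(intensity_by_hour, key=intensity_by_hour.get)
    return sorted(ranked[:hours_needed])

# Hypothetical 24-hour forecast: midday solar output pulls intensity down.
forecast = {h: 500 - 300 * max(0, 1 - abs(h - 13) / 6) for h in range(24)}
print(pick_greenest_hours(forecast, 4))  # [11, 12, 13, 14]: the sunniest hours
```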

While Google’s AI answers may initially cause a big jump in the search engine’s energy costs, the costs may begin to decrease again as engineers figure out how to make the system more efficient, says Rama Ramakrishnan, a professor at the MIT Sloan School of Management with expertise in the tech industry. “The number of searches going through a [large language model] is going to go up..., and therefore, probably, the energy costs are going to go up. But I think the cost per query seems to be going down” as the company works to make both the hardware and software more efficient, he says.

“It’s difficult to predict” what this means for the cost of generative AI compared to traditional search, Ramakrishnan says. “My guess is that it’s probably going to go up, but it’s probably not going to go up dramatically.”

Whether this increased cost is worth it depends on the value users find in the AI answers. Links provided by AI Overviews get more traffic than those in traditional search listings, and people wind up spending more time on those websites, an indication that users are satisfied with the search result, the Google spokesperson says.

“Some people will be like, ‘Okay, that’s worth it. I’m ready to emit that much more carbon in order to have that kind of result,’” Luccioni says, “whereas other people [would] rather use the legacy Google search.” Users currently do not have a simple option to turn the AI Overviews feature off, however, meaning that the search engine is computing answers that many people may not want or need.

Still, even when given a choice, the public may have a hard time connecting digital actions to invisible real-world costs. “Cloud [computing] has made things incredibly convenient, and as a result, many people use it as if it were free because they don’t incur the cost of it,” Lee says.

Luccioni is currently working on a project to give Energy Star ratings to different AI models and types of tasks. She says she hopes this would help AI users make more informed choices. “We know that there’s a cost to this,” she says. “We should know what that cost is.”

*Editor’s Note (6/12/24): This sentence was edited after posting to remove a reference to an AI Overviews recommendation for eating undercooked chicken. According to a Google spokesperson, this was a faked result.

Allison Parshall is an associate news editor at Scientific American who often covers biology, health, technology and physics. She edits the magazine's Contributors column and has previously edited the Advances section. As a multimedia journalist, Parshall contributes to Scientific American's podcast Science Quickly. Her work includes a three-part miniseries on music-making artificial intelligence. Her work has also appeared in Quanta Magazine and Inverse. Parshall graduated from New York University's Arthur L. Carter Journalism Institute with a master's degree in science, health and environmental reporting. She has a bachelor's degree in psychology from Georgetown University. Follow Parshall on X (formerly Twitter) @parshallison