I taught writing to college freshmen back in the 1980s, when the internet was young. As a tool for doing research, the new technology seemed almost magically powerful and seductive. Some of my students, who were required to cite their sources properly, actually included endnotes in their papers that gave the source of a supposed fact simply as “The Internet.”
These days, students who use OpenAI’s text-generating program ChatGPT to write their essays might get caught by colleges using Turnitin, a plagiarism detector that its maker says is designed to “help students do their best, original work.” Still, the AI frenzy has risen to a new level with the introduction of Google’s AI Overview, which has “caused a furor online,” according to The New York Times.
Google’s search engine is an indispensable tool for editors. I use it constantly to check the facts and citations in reporters’ stories. But that work is slowed now by search results that start with a single AI-generated hodgepodge of an answer, with no sources listed.
My mind numb tonight from editing, I ask Google, “How likely is a shark attack?” The answer: “The likelihood of being attacked by a shark while visiting a beach in the United States is 1 in 11.5 million, and the chance of dying from a shark attack is less than 1 in 264.1 million.” I feel the authoritative pull of numbers. But there’s no indication of where those numbers came from. It’s “The Internet” again.
The Times reported last week that AI Overview has “generated a litany of untruths and errors.” Among the absurd answers that have gotten the most attention are a recipe for pizza that recommended adding glue to the sauce to keep the cheese from falling off, and a suggestion that eating rocks is good for you.
AI systems like these are trained on information scraped from the web, and in many cases, clearly, they can’t distinguish between truth, satire, and pure nonsense. But there’s another reason to be worried about the explosive growth of so-called generative AI: the amount of energy it consumes.
Writing in the climate newsletter Heated, Arielle Samuelson reported this week that AI-assisted searches require 10 times more power than traditional searches. One recent study found that “AI servers could be using as much energy as Sweden by 2027,” Samuelson wrote. (That means the technology’s energy appetite is catching up with Bitcoin’s, which, according to the Cambridge Bitcoin Electricity Consumption Index, consumes as much energy per year as Poland.)
But Samuelson’s most important point goes to the question of integrity: “The real climate impact of AI is still unknown … because tech companies don’t release any data on how much power products like ChatGPT require.”
Google is charging ahead, publishing a steady stream of what tech folk used to call garbage. As the Times reported, Google “doesn’t have a choice…. Companies need to move really fast, even if that includes skipping a few steps along the way.” For the sake of what, we might ask, with whatever shards of choice remain to us.