Google scales back AI search answers after it told users to eat glue

SAN FRANCISCO — Google said it was scaling back the use of AI-generated answers in some search results, after the tech made high-profile errors including telling users to put glue on their pizza and saying Barack Obama was Muslim.
Google’s head of search, Liz Reid, confirmed in a blog post Thursday afternoon that the company was scaling back some of the AI answers, which it calls “AI Overviews.” The company cut down on using social media posts as source material for the AI answers, is pausing some answers on health-related topics and “added triggering restrictions for queries where AI Overviews were not proving to be as helpful,” Reid said in the post.
The change is the latest example of Google launching an AI product with fanfare and then rolling it back after it goes awry. In February, the company blocked users from making images of people with its AI image tool after conservative commentators accused it of anti-White bias.
The tech industry is in the throes of an AI revolution, with start-ups and Big Tech giants alike trying to find new ways to put the tech into their products and make money from it. Many of the tools have been launched before they’re ready for prime time, as companies jostle to be the first to market their products and cast themselves as cutting-edge.
Google, whose employees invented much of the technology underlying breakthrough AI tools like ChatGPT, has been trying to prove to investors, consumers and its own employees that it is still the most important player in the industry. At its I/O conference this month, the company made more than 100 AI-related announcements.
The biggest one was a confirmation that it would begin rolling out AI-generated answers in search results to most of its users. Google has been testing the AI answers for a year with a select group of users, but adding them to more search results meant most people would now begin to interact directly with generative AI on a tool they use every day.
The tech works by reading websites that would otherwise show up in Google search results and then summarizing them into multi-paragraph answers. Publishers have cried foul, accusing the company of hurting their businesses by taking their content and regurgitating it for users directly in search results, depriving them of important web traffic.
But journalists, search engine experts and social media users quickly began spotting problems with the answers. Some of the responses were funny, while others were concerning. They showed up on sensitive queries as well, including health-related ones.
One answer, which Google has since fixed, told people to drink plenty of urine to help pass a kidney stone. Another said John F. Kennedy graduated from the University of Wisconsin at Madison in six different years, three of which were after his death.
Google tried to test the tool as much as it could before the broader rollout, but Reid said the full-scale launch revealed many situations the company hadn’t prepared for.
“There’s nothing quite like having millions of people using the feature,” Reid said.