
Google and OpenAI argue that AI fabrications are an ‘inherent feature’


Google's AI search function has been dishing out wildly incorrect information.  Photo: Getty

When Google launched its AI-driven search function on May 14, it promised users the research, planning and brainstorming tool of the future.

“We’ve meticulously honed our core information quality systems to help you find the best of what’s on the web,” Liz Reid, head of Google search, said.

“We’ve built a knowledge base of billions of facts about people, places and things, all so you can get information you can trust in the blink of an eye.”

Instead, Google’s AI search function has been dishing out wildly incorrect information.

Users of AI Overview have received advice ranging from jumping off the Golden Gate Bridge to treat depression to eating rocks for their nutritional value.

Google’s new AI Overview has had a rough launch. Photo: Google

Artificial intelligence ‘hallucinations’ aren’t a new phenomenon: Since ChatGPT first popularised generative AI and large language models (LLMs), ‘facts’ have been spawned out of thin air.

Google CEO Sundar Pichai said that “there is a lot of nuance” to generative AI giving clearly incorrect answers.

“You’re getting at a deeper point where hallucination is still an unsolved problem. In some ways, it’s an inherent feature,” Pichai said in an interview with The Verge.

“LLMs aren’t necessarily the best approach to always get at factuality.”

Fixing the problem?

Toby Walsh, a professor of AI at UNSW Sydney, explained that these false answers occur because generative AI doesn’t know what is true, just what is popular.

“For example, there aren’t a lot of articles on the web about eating rocks as it is so self-evidently a bad idea,” he said in The Conversation.

“There is, however, a well-read satirical article from The Onion about eating rocks and so Google’s AI based its summary on what was popular, not what was true.”
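To see, in very rough terms, why popularity can beat truth, here is a deliberately simplified sketch in Python. It is purely illustrative: the training snippets and the function are hypothetical, and this is not how Google's or OpenAI's systems actually work. The toy "model" simply completes a prompt with whatever continuation it has seen most often, with no notion of whether the claim is correct.

```python
# Illustrative toy only: completes a prompt with the most frequent continuation
# seen in a (made-up) corpus, regardless of whether that continuation is true.
from collections import Counter

# Hypothetical training snippets: the satirical claim is simply more common.
training_text = [
    "you should eat one small rock per day",
    "you should eat one small rock per day",
    "you should check health claims with a reliable source",
]

def most_popular_completion(prompt: str, corpus: list[str]) -> str:
    """Return the continuation of the prompt that appears most often in the corpus."""
    continuations = Counter(
        text[len(prompt):].strip() for text in corpus if text.startswith(prompt)
    )
    completion, _count = continuations.most_common(1)[0]
    return completion

print(most_popular_completion("you should", training_text))
# Prints "eat one small rock per day": popular in the data, but not true.
```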

Pizza enthusiasts, or anyone who likes food, were shocked by this suggestion. Photo: Google

Google’s own promotional material about its Bard chatbot (now Gemini) made false claims about the James Webb Space Telescope, while a study found that when ChatGPT generated 178 scientific references for a research article, 69 could not be substantiated.

Sam Altman, CEO of OpenAI, made a similar argument to Pichai during an interview in September, describing AI hallucinations as just as much of a feature as a bug.

“One of the sort of non-obvious things is that a lot of value from these systems is heavily related to the fact that they do hallucinate,” he said.

“If you want to look something up in a database, we already have good stuff for that.”

Personal impact

Although Google and OpenAI argue that the creativity of their AI models’ answers is a feature, real people are being affected by the results.

Brian Hood, the mayor of Hepburn Shire outside Melbourne, threatened to sue OpenAI, ChatGPT’s creator, after the chatbot falsely named him as a participant in a bribery scandal involving the Reserve Bank of Australia. He later abandoned the planned lawsuit.

This pregnancy advice was not well received. Photo: Google

Other cases have landed people in hot water, including lawyers who used ChatGPT to cite non-existent case law, and US-based professor Jonathan Turley, who was falsely accused of sexually harassing one of his students.

“The program promptly reported that I had been accused of sexual harassment in a 2018 Washington Post article after groping law students on a trip to Alaska,” Turley said in USA Today.

“It was a surprise to me since I have never gone to Alaska with students, The Post never published such an article, and I have never been accused of sexual harassment or assault by anyone.”

Google’s promise of trustworthy information is starting to look as accurate as the AI Overview search results.
