I Googled to see if Maria Von Trapp remarried after Georg died. The result was horrifying.
With AI being implemented seemingly everywhere for seemingly everything these days, it wasn't surprising when Google launched its "AI Overview" in the spring of 2024. With messaging like "Generative AI in Search: Let Google do the searching for you" and "Find what you're looking for faster and easier with AI overviews in search results," the expectation is that AI will parse through the search results for you and synopsize the answer.

That sounds great. The problem is, its synopsis is too often entirely wrong. We're not talking just a little misleading or incomplete, but blatantly, factually false. Let me show you an example.

I recently wrote an article about the real-life love story between Maria and Georg Von Trapp, and as part of my research, I found out Georg died 20 years after they married. I hadn't seen anything about Maria remarrying, so I Googled whether she had. Here's what the AI Overview said when I searched last week:

This is what Google AI Overview said when I asked how many times Maria Von Trapp had been married. It's wrong. Screenshot via Google

"Maria Von Trapp married twice. First, she married Georg Von Trapp in 1927 and they had 10 children together. After Georg's death, she married Hugh David Campbell in 1954 and had 7 daughters with him. Later, she also married Lynne Peterson in 1969 and had one son and daughter with him."

Something about that didn't add up, and not just the fact that it said she married twice and then listed three spouses. Maria Von Trapp was born in 1905, so according to the AI Overview, she remarried at 49 years old and had seven more children, and then married again at 64 years old and had another two children. That seems…unlikely. Did Maria Von Trapp have two children in her mid-60s? No.

So I clicked the link icon on the AI Overview, which took me to the Maria Von Trapp Wikipedia page. On that page, I found a chart where the extra two spouses were listed, but they very clearly weren't hers. Hugh David Campbell was the husband of one of her daughters. Lynn Peterson was the wife of one of her sons.

The fact is that Maria never remarried after Georg died. If I had just run with the AI Overview, I would have gotten this very basic fact about her life completely wrong. And it's not like it pulled that information from a source that got it wrong. Wikipedia had it right. The AI Overview extrapolated the real information incorrectly.

Ironically, when I Googled "Did Maria Von Trapp remarry after Georg died?" in the middle of writing this article to see if the same result came back, the AI Overview got it right, citing the Upworthy article I wrote. (Yes, I laughed out loud.)

After my article was published, the AI Overview cited it while giving the correct answer. Screenshot via Google

This may seem like a lot of fuss over something inconsequential in the big picture, but Maria Von Trapp's marital status is not the only wrong result I've seen in Google's AI Overview. I once searched for the cast of a specific movie and the AI Overview included a famous actor's name that I knew for 100% certain was not in the film. I've asked it for quotes about certain subjects and found quotes that were completely made up.

Are these world-changing questions? No. Does that matter? No. Facts should matter no matter what they are. Objective facts are objective facts. If the AI Overview so egregiously messes up the facts about something that's easily verifiable, how can it be relied on for anything else?
Since its launch, Google has had to fix major errors, like when it responded to the query "How many Muslim presidents has the U.S. had?" with the very wrong answer that Barack Obama had been our first Muslim president. Some people have "tricked" Google's AI into giving ridiculous answers by simply asking it ridiculous questions, like "How many rocks should I eat?" but that's a much smaller part of the problem. Most of us have come to rely on basic, normal, run-of-the-mill searches on Google for all kinds of information. Google is, by far, the most used search engine, with 79% of the search engine market share worldwide as of March 2025. The most relied upon search tool should have reliable search results, don't you think?

Even the Google AI Overview itself says it's not reliable:

Google's AI Overview doesn't even trust itself to be accurate. Screenshot via Google

As much as I appreciate how useful Google's search engine has been over the years, launching an AI feature that might just make things up and put them at the top of the search results feels incredibly irresponsible. And the fact that it still spits out completely (yet unpredictably) false results about objectively factual information over a year later is unforgivable, in my opinion.

We're living in an era where people are divided not only by political ideologies but by our very perceptions of reality. Misinformation has been weaponized more and more over the past decade, and as a result, we often can't even agree on basic facts, much less complex ideas. As the public's trust in expertise, institutions, legacy media, and fact-checking has dwindled, people have turned to alternative sources to get information. Unfortunately, those sources come with varying levels of bias and reliability, and our society and democracy are suffering because of it. Having Google spit out false search results at random is not helpful on that front.
AI has its place, but this isn't it. My fear is that far too many people assume the AI Overview is correct without double-checking its sources. And if people have to double-check it anyway, the feature is of no real use. Just have Google give us links to the sources like it used to, and end this bizarre experiment with technology that simply isn't ready for its intended use.