We gave Google’s AI Overviews the benefit of the doubt. Here’s how they did.
Credit: NurPhoto via Getty Images
By now I don’t need to tell you Google has rolled out a feature called “AI Overviews” in Search and made it so that millions of people accustomed to receiving a page of links when they query the world’s most popular website now get an AI chatbot’s answer at the top of the results page.
I probably also don’t need to tell you the AI answers have been shown to occasionally be bizarre or even seemingly dangerous. Users on social media have caught Google Search apparently telling people to put glue on their pizza, and claiming that Wario is canonically gay. The CEO of The Onion, Ben Collins, even pointed out that it was apparently mistaking “facts” from Onion articles for, y’know, facts.
None of this hilarity can possibly be good for Google’s bottom line, but it doesn’t necessarily say all that much about the average Google user’s experience in aggregate. In recent years, regular Google Search has come under fire for serving deceptive or untrustworthy answers, or just plain old spam. So is using the new Google, warts and all, actually a downgraded experience?
You’ll answer that question for yourself over the coming months and years because you, like everyone, are going to have to use the new Google. In the meantime, though, here are some side-by-side comparisons that shed a little bit of light on exactly what we’re all getting here. I gathered a few dozen searches that I suspect might be found in the wild, used them to trigger AI responses, and compared those results to the results for the exact same queries with AI Overviews disabled. These are the four results pages that I found the most illustrative of the contrast.
In my (admittedly unscientific) tests, I tried not to judge too harshly if the answers from human beings or the AI weren’t technically accurate — I’m mostly not qualified to evaluate them on that basis anyway. Instead, I focused on what Google probably focuses on: speed and user satisfaction. Even with the bar lowered in this way, the weaknesses of the new Google Search regime are glaring. But the AI scored some wins, and there are even some glimmers of hope and possibility.
Searches attempting to confirm what I already believe

Test search: “proof that standing desks are bad”
Credit: Mashable screenshot from Google
This search is based on a hunch that Google users like to query the search engine in ways intended to simply confirm their biases.
For the most part, the AI Overview reads like a pretty uncontroversial list of possible problems that could arise while using a standing desk. It says “Standing for long periods can cause pain in your knees, hips, and feet,” and points to the possibilities of “Musculoskeletal disorders” and “Varicose veins.”
It also says, “Standing desks aren’t designed to support your weight all day,” which — wait, what? Are people standing on their standing desks? Silly stuff.
Credit: Mashable screenshot from Google
The first result for the non-AI version is an article on the Harvard Health Blog called “The truth behind standing desks.” The article — which supposedly informs the AI answer — reads like a pretty fair evaluation of the relevant studies as of 2016, when it was written. And while it doesn’t make the case that standing desks are harmful, it makes them seem pretty worthless. It notes that subjects in one study burned 80 calories per hour sitting, compared to about 88 calories per hour standing.
That’s such a negligible difference, and given that sitting is a wonderful thing to do, I’m ready to call this a total vindication of my extremely unfair Google query. I feel very satisfied with this answer.

It’s safe to say the AI Overview bombed this one.
Science topics way too complicated to glean from a Google search

Test search: “do parrots understand what they’re saying”