AI search answers are the fast food of your information diet – convenient and tasty, but no substitute for good nutrition

Written by Chirag Shah, Professor of Information Science, University of Washington

If you have used Google lately and been lucky – or unlucky – enough to encounter an answer to your query rather than a bunch of links, you have been subjected to something called AI Overviews[1]. This is a new core feature that Google has been rolling out, a move widely anticipated since the company’s experiments with its LaMDA[2] large language model in 2021, and since OpenAI’s ChatGPT artificial intelligence chatbot rocketed to prominence in 2023[3].

Google’s knowledge panels summarize information from search results, in contrast to AI Overviews, which generate answers based on their training data. Screen capture by The Conversation, CC BY-ND[4][5]

This feature is yet another addition to the increasing number of add-ons and tools being integrated into search engines like Google. Some of the notable examples include knowledge graph[6]-driven knowledge panels[7], which are used to populate relevant factual information in an infobox next to search results, and featured snippets[8], which are blurbs excerpted from a search result and provided before the link to that page.

But what’s different about AI Overviews is that they are not simply extracted from relevant sources but generated behind the scenes by Google’s generative AI technology. The company’s goal is to give you a personalized, on-demand answer instead of a standard set of documents or even an answer box[9] matching your query.

This seems almost magical and potentially useful in many situations. After all, people use search engines primarily to find answers and not lists of documents. But there’s more to the picture.

My colleague Emily Bender[10] and I have written about what search engine users need, want and have. We have shown that they want not only information but also the ability to discover, learn and question[11] what they find. In other words, users have a wide range of situations and objectives, and compressing them down to a set of links or, worse, a single answer is problematic[12].

Bad advice

These AI features vacuum up information from the internet and other available sources and spit out an answer based on how they are trained to associate words. A core argument against them is that they mostly remove from the equation the user’s judgment, agency and opportunity to learn[13].

This may be OK for many searches. Want a description of how inflation has affected grocery prices in the past five years, or a summary of what the European Union AI Act includes? AI Overviews can be a good way to cut through a lot of documents and extract those specific answers.

But people’s searching needs don’t end with factual information. They look for ideas, opinions and advice. Looking for suggestions about how to keep the cheese from sliding off your pizza? Google will tell you that you should add some glue to the sauce[14]. Or wondering if running with scissors has any health benefits? Sure, Google will say, “it can also improve your pores and give you strength[15]”.

Computer scientist Paulo Shakarian explains why ‘hallucinations’ – incorrect and often weird answers – are likely to continue to plague large language models and therefore tools like Google’s AI Overviews.

While a reasonable user can tell that such outrageous answers are likely wrong, errors in answers to factual questions are much harder to spot.

For example, in response to a query about the faith of U.S. presidents, Google’s AI Overviews gave the incorrect answer[16] that Barack Obama is a Muslim. This piece of misinformation was widely circulated and debunked years ago, yet Google regurgitated it with no good way for users to recognize it as misinformation.

What about a student using Google for homework and asking which countries in Africa start with the letter K? While Kenya clearly meets this criterion, Google’s AI Overviews incorrectly answered[17] that there are no such countries.

Google has acknowledged issues with AI Overviews and said it has addressed them[18]. But the concern remains: Can you really trust any answers you receive through this service?

How to avoid AI answers

There are alternatives. You can always go back to the traditional Google search with its 10 blue links. Click on “More” in the menu – All, News, Images, Maps, Videos and More – directly below the search field at the top of the Google search page and select “Web.”
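For readers who want a shortcut, the same link-only view can also be reached directly by URL. The minimal Python sketch below builds such an address; the udm=14 parameter is an assumption based on how the “Web” filter is exposed at the time of writing, and Google may change or drop it.

    # Minimal sketch: build a Google search URL that opens the link-only "Web" view.
    # Assumption: the udm=14 query parameter selects that view; Google may change
    # or remove it without notice.
    from urllib.parse import urlencode

    def web_only_search_url(query: str) -> str:
        """Return a search URL that requests the traditional list of links."""
        return "https://www.google.com/search?" + urlencode({"q": query, "udm": 14})

    print(web_only_search_url("countries in Africa that start with K"))
    # https://www.google.com/search?q=countries+in+Africa+that+start+with+K&udm=14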

You can then do what you have likely done for decades now[19] – sift through some of the top results, visit a few of those sites and decide for yourself. It does take a little work, but it gives you back the ability to examine multiple sites and evidence to support or refute something. More importantly, you leave open the possibilities for learning, discovery and serendipity[20].

AI Overviews is like fast food that gets delivered through a drive-through window – it’s quick, hot and convenient, but not the healthiest choice. Going through Google’s traditional search results is like examining a menu in a sit-down restaurant and placing an order that will take a while to make it to your table. You can ask your server questions about the items and even request small changes to the restaurant’s offerings. The result is prepared with more care, customization and control, but it also takes longer and may cost more.

These aren’t the only methods of finding information, however. There are alternatives to Google’s search engine, including specialty search tools.

For scholarly needs, Google Scholar[21], Semantic Scholar[22] and CORE[23] are helpful places to look for research papers and citations. Looking for medical information? Try PubMed[24], ScienceDirect[25] and OpenMD[26]. For legal needs, some services include Fastcase[27], Caselaw Access Project[28] and CourtListener[29].

Concerned about privacy? Check out DuckDuckGo[30], Startpage[31] and Swisscows[32]. If you still want AI-generated answers, alternatives to Google’s AI Overviews and Microsoft’s rival Copilot in Bing include You.com[33] and Komo[34], which are more transparent about the data they collect, offer greater privacy and let you opt out of having your data used to train their AI models.

A balanced information diet

Perhaps you can’t afford to eat out at a nice restaurant or prepare every meal from scratch, but it’s important not to rely on the drive-through for all of your nourishment. After all, you are what you eat[35], and in a similar vein, you are how you search.

It’s easy to fall for sensational headlines[36] and bite-size news that lack context. But you don’t have to let that define you. You can expand the scope of how you search. It’s OK to hit the drive-through every now and then and go for AI Overviews, but it’s important to also find other more wholesome ways to fulfill your needs – for food and for information.

References

  1. ^ AI Overviews (blog.google)
  2. ^ LaMDA (blog.google)
  3. ^ rocketed to prominence in 2023 (theconversation.com)
  4. ^ Screen capture by The Conversation (www.google.com)
  5. ^ CC BY-ND (creativecommons.org)
  6. ^ knowledge graph (blog.google)
  7. ^ knowledge panels (support.google.com)
  8. ^ featured snippets (support.google.com)
  9. ^ answer box (www.seoclarity.net)
  10. ^ Emily Bender (scholar.google.com)
  11. ^ ability to discover, learn and question (dl.acm.org)
  12. ^ is problematic (doi.org)
  13. ^ opportunity to learn (doi.org)
  14. ^ add some glue to the sauce (x.com)
  15. ^ it can also improve your pores and give you strength (x.com)
  16. ^ incorrect answer (www.nbcconnecticut.com)
  17. ^ incorrectly answered (www.theatlantic.com)
  18. ^ it has addressed them (blog.google)
  19. ^ do what you have likely done for decades now (www.pcmag.com)
  20. ^ possibilities for learning, discovery and serendipity (searchresearch1.blogspot.com)
  21. ^ Google Scholar (scholar.google.com)
  22. ^ Semantic Scholar (www.semanticscholar.org)
  23. ^ CORE (core.ac.uk)
  24. ^ PubMed (pubmed.ncbi.nlm.nih.gov)
  25. ^ ScienceDirect (www.sciencedirect.com)
  26. ^ OpenMD (openmd.com)
  27. ^ Fastcase (www.fastcase.com)
  28. ^ Caselaw Access Project (case.law)
  29. ^ CourtListener (www.courtlistener.com)
  30. ^ DuckDuckGo (duckduckgo.com)
  31. ^ Startpage (www.startpage.com)
  32. ^ Swisscows (swisscows.com)
  33. ^ You.com (you.com)
  34. ^ Komo (komo.ai)
  35. ^ you are what you eat (healthnews.com)
  36. ^ fall for sensational headlines (theconversation.com)


Read more https://theconversation.com/ai-search-answers-are-the-fast-food-of-your-information-diet-convenient-and-tasty-but-no-substitute-for-good-nutrition-230759

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more