Mayonnaise Rule (files.catbox.moe)
submitted 9 months ago by Gork@lemm.ee to c/196@lemmy.blahaj.zone
[-] Umbrias@beehaw.org 10 points 9 months ago

This is genuinely great content for demonstrating that AI search engines and chatbots are not at a point where you can trust them implicitly, though many people do.

[-] Daxtron2@startrek.website 1 point 9 months ago

Which is exactly why every LLM explicitly says this before you start.

[-] Umbrias@beehaw.org 12 points 9 months ago

"Why, we aren't at fault people are using the tool we are selling for the thing we marketed it for, we put a disclaimer!"

[-] Daxtron2@startrek.website 1 point 9 months ago

You've seen marketing for the big LLMs that presents them as search engines?

[-] Umbrias@beehaw.org 3 points 9 months ago
[-] Daxtron2@startrek.website 1 point 9 months ago

Bing with LLM summarization of results is not an LLM being used as a search engine.

[-] Umbrias@beehaw.org 3 points 9 months ago
[-] ProgrammingSocks@pawb.social 1 point 9 months ago

Bing literally has a Copilot frame pop up every time you search with it that tries to answer your question.

[-] Daxtron2@startrek.website 1 point 9 months ago

Again, LLM summarization of search results is not using an LLM as a search engine.
