Half of LLM users (49%) think the models they use are smarter than they are, including 26% who think their LLMs are “a lot smarter.” Another 18% think LLMs are as smart as they are. Here are some of the other attributes they see:

  • Confident: 57% say the main LLM they use seems to act in a confident way.
  • Reasoning: 39% say the main LLM they use shows the capacity to think and reason at least some of the time.
  • Sense of humor: 32% say their main LLM seems to have a sense of humor.
  • Morals: 25% say their main model acts like it makes moral judgments about right and wrong at least sometimes.
  • Sarcasm: 17% say their primary LLM seems to respond sarcastically.
  • Sad: 11% say the main model they use seems to express sadness, while 24% say that model also expresses hope.
Waraugh@lemmy.dbzer0.com 3 points 5 hours ago

Is stringing words together really considered knowledge?

CalipherJones@lemmy.world 2 points 5 hours ago

If they're strung together correctly then yeah.

ILikeBoobies@lemmy.ca 1 point 5 hours ago

As much as a search engine is

Donkter@lemmy.world 0 points 5 hours ago

It's semantics. The difference between an LLM and "asking" Wikipedia a knowledge question is that the LLM will "answer" you with predictive text. Both things contain more knowledge than you do, in the sense that they hold answers to more trivia and test questions than you ever will.
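
To make that distinction concrete, here is a toy Python sketch. Everything in it (the article table, the bigram table, the function names) is invented purely for illustration and is not how any real search engine or LLM is implemented: a lookup returns a stored passage verbatim, while the "predictive text" path builds an answer one word at a time from a tiny table of which word tends to follow which.

```python
# Toy contrast between "looking something up" and "predicting the next word".
# All data and names here are hypothetical illustrations.

import random

# 1) Retrieval: the answer is stored verbatim and returned as-is.
ARTICLES = {
    "capital of france": "Paris is the capital and largest city of France.",
}

def lookup(query: str) -> str:
    """Return a stored passage, like querying an index of articles."""
    return ARTICLES.get(query.lower(), "No article found.")

# 2) Prediction: the "answer" is generated one word at a time by sampling
#    whichever word tends to follow the previous one in the training text.
BIGRAMS = {
    "the":     ["capital", "largest"],
    "capital": ["of"],
    "of":      ["france"],
    "france":  ["is"],
    "is":      ["paris", "paris", "a"],
    "paris":   ["."],
    "a":       ["city"],
    "city":    ["."],
}

def generate(start: str = "the", max_words: int = 10) -> str:
    """Produce text by repeatedly predicting a plausible next word."""
    words = [start]
    for _ in range(max_words):
        candidates = BIGRAMS.get(words[-1])
        if not candidates:
            break
        words.append(random.choice(candidates))
    return " ".join(words)

if __name__ == "__main__":
    print("lookup:  ", lookup("capital of France"))
    print("predict: ", generate())
```

Real LLMs replace the hand-written bigram table with a large neural network and word pieces instead of whole words, but the generation loop (pick a likely next token, append it, repeat) has the same shape, which is why the output can sound right without being retrieved from anywhere.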

Shanmugha@lemmy.world 2 points 2 hours ago

I have a new word for you: information

Waraugh@lemmy.dbzer0.com 1 point 4 hours ago

I guess I can see that; maybe my understanding of the words or their implications is off. While I would agree they contain more knowledge, that reads differently to me than being more knowledgeable. It comes across as anthropomorphizing a dataset of information to me. I could easily be wrong.