[-] Soyweiser@awful.systems 13 points 5 months ago

I think solving the AI hallucination problem — I think that’ll be fixed.

Wasn't this an unsolvable problem?

[-] Amoeba_Girl@awful.systems 20 points 5 months ago* (last edited 5 months ago)

it's unsolvable because it's literally how LLMs work lol.

though to be fair i would indeed love for them to solve the LLMs-outputting-text problem.

[-] aStonedSanta@lemm.ee 2 points 5 months ago

Yeah. We need another program to control the LLM tbh.

[-] zogwarg@awful.systems 5 points 5 months ago

Sed quis custodiet ipsos custodes = But who will control the controllers?

Which in a beautiful twist of irony is thought to be an interpolation in the texts of Juvenal (in manuscript speak, an insert added by later scribes)
