[-] JoeyJoeJoeJr@lemmy.ml 10 points 9 months ago

You are falling into a common trap. LLMs do not have understanding - asking one to do things like convert dates and put them on a number line may yield correct results sometimes, but since the LLM does not understand what it's doing, it may "hallucinate" dates that look correct but don't actually align with the source.

[-] Byter@lemmy.one 1 point 9 months ago

Thank you for calling that out. I'm well aware, but appreciate your cautioning.

I've seen hallucinations from LLMs at home and at work (where I've literally had them transcribe dates like this). They're still absolutely worth it for their ability to handle unstructured data and the speed of iteration you get -- whether they "understand" the task or not.

I know to check my (its) work when it matters, and I can add guard rails and selectively make parts of the process more robust later if need be.
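For what it's worth, here is a minimal sketch of the kind of guard rail I mean, assuming the LLM has already returned a list of ISO-formatted date strings (the `validate_extracted_dates` helper and the extraction step feeding it are hypothetical, not any particular library's API):

```python
from datetime import datetime

def validate_extracted_dates(source_text: str, extracted: list[str]) -> list[str]:
    """Keep only dates that parse cleanly AND appear verbatim in the source.

    A hallucinated date (one the model invented) will usually fail the
    substring check, because it never occurred in the original text.
    """
    confirmed = []
    for raw in extracted:
        # 1. Make sure the string is a real, parseable ISO date.
        try:
            parsed = datetime.strptime(raw, "%Y-%m-%d")
        except ValueError:
            continue
        # 2. Make sure the date shows up in the source in some expected
        #    rendering, rather than having been invented by the model.
        variants = {raw, parsed.strftime("%d %b %Y")}
        if any(v in source_text for v in variants):
            confirmed.append(raw)
    return confirmed

source = "The post was submitted on 05 Feb 2024 and updated on 07 Feb 2024."
llm_output = ["2024-02-05", "2024-02-07", "1999-01-01"]  # last one is a fake
print(validate_extracted_dates(source, llm_output))  # ['2024-02-05', '2024-02-07']
```

The point isn't this exact check; it's that the LLM handles the messy, unstructured part, and a cheap deterministic pass catches the cases where it made something up.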
