Mayonnaise Rule (files.catbox.moe)
submitted 9 months ago by Gork@lemm.ee to c/196@lemmy.blahaj.zone
[-] fidodo@lemmy.world 2 points 9 months ago

There are already techniques that make these kinds of errors less common. For example, you can ask it to think through its answer step by step from first principles. If you ask an LLM to do that, it will write out the letters line by line, which gives it enough context in the window to answer correctly. You can even ask it to write a program to answer the question, so it could knock out a quick script and solve it programmatically.
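
As a rough sketch of what that last step might look like (the word and letter are just placeholder values, not from the post), the quick script could be as simple as:

```python
# Hypothetical example: counting occurrences of a letter in a word
# programmatically instead of relying on the model's raw token prediction.
word = "mayonnaise"   # placeholder word
letter = "n"          # placeholder letter

count = sum(1 for ch in word if ch == letter)
print(f"'{letter}' appears {count} time(s) in '{word}'")
```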

The main reason you don't see AIs doing this today is that producing all that extra context is slow and expensive, and it's unnecessary for most prompts. As the technology gets faster and cheaper, and as the use cases get more complex, these techniques will be used more and more often.

While the technology does have fundamental flaws, that doesn't mean there aren't ways to work around those flaws rather than relying on the raw output.
