submitted 10 months ago by alessandro@lemmy.ca to c/pcgaming@lemmy.ca
[-] bionicjoey@lemmy.ca 3 points 10 months ago

That assumes the model is trained on a large dataset of worldstate encodings and understands what that worldstate means in the context of its actions and responses. That's basically impossible with the current state of language models.

[-] kakes@sh.itjust.works 1 point 10 months ago

I disagree. Take this paper for example - keeping in mind it's already a year old (it used GPT-3.5 Turbo).

The basic idea is pretty solid, honestly. Representing worldstate for an LLM is essentially the same as how you would represent it for something like a GOAP system anyway, so it's not a new idea by any stretch.
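
Rough sketch of what I mean, in Python (the names and fields are made up for illustration, not taken from the paper): a GOAP-style (goal-oriented action planning) worldstate is basically a flat set of facts, and the same structure can be serialized straight into an LLM prompt.

```python
# Minimal sketch (assumed names, not from the paper): a GOAP-style worldstate
# is a flat set of boolean facts, and the same structure can be serialized
# into plain text for an LLM prompt.

npc_worldstate = {
    "player_nearby": True,
    "has_weapon": False,
    "shop_open": True,
    "quest_offered": False,
}

def worldstate_to_text(state: dict[str, bool]) -> str:
    """Render each fact on its own line, e.g. 'player_nearby: true'."""
    return "\n".join(f"{key}: {str(value).lower()}" for key, value in state.items())

def build_prompt(state: dict[str, bool], player_utterance: str) -> str:
    """Prepend the serialized worldstate to the player's line of dialogue."""
    return (
        "You are a shopkeeper NPC. Current world state:\n"
        f"{worldstate_to_text(state)}\n\n"
        f"Player says: {player_utterance}\n"
        "Respond in character and stay consistent with the world state."
    )

print(build_prompt(npc_worldstate, "Do you have any swords for sale?"))
```

The representation itself is the easy part - planners have been consuming exactly this kind of state for years - which is why I don't think it's a new idea so much as a new consumer of it.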
