SwampYankee

joined 2 years ago
[–] SwampYankee@mander.xyz 0 points 1 year ago

Your mistake was trying to do literally anything yourself. Just sit down and let daddy Microsoft take care of everything for you.

[–] SwampYankee@mander.xyz 2 points 1 year ago

Don't forget Meatball Ron while we're at it.

[–] SwampYankee@mander.xyz 3 points 1 year ago

Has no one here ever used a debit card?

[–] SwampYankee@mander.xyz 2 points 1 year ago

Or a debit card...

[–] SwampYankee@mander.xyz 1 point 1 year ago

Well, if each universe itself is infinite, then there are infinite planets, meaning the number of planets that are not earth is infinite. So, if there are infinite universes that are, themselves, infinite, then the infinity of universes that do not contain earth is infinitely larger than the infinity of universes that do.

[–] SwampYankee@mander.xyz 4 points 1 year ago (2 children)

With infinite universes, every possible eventuality is realized an infinite number of times. There are infinite universes without earth and infinite universes with it.

[–] SwampYankee@mander.xyz 1 point 1 year ago (1 child)

VTOL VR is awesome too. The problem with a lot of games that support VR is they don't support the controllers to the same extent. Playing VR with an Xbox controller instead of the motion tracking Index controllers just ain't the same.

[–] SwampYankee@mander.xyz 1 point 1 year ago

I guess I'm wondering if there's some way to bake the contextual understanding into the model instead of keeping it all in VRAM. Like, if you're talking to a person and you refer to something that happened a year ago, you might have to provide a little context, and it might take them a minute, but eventually they'll usually remember. Same with AI: you could say, "hey, remember when we talked about [x]?" and it would recontextualize by bringing that conversation back into VRAM.

Seems like more or less what people already do with Stable Diffusion by training custom models, LoRAs, or embeddings. It would just be interesting if it were a more automatic process as part of interacting with the AI, where the model is always being updated with information about your preferences instead of having to be told explicitly. Something like the sketch below.
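Roughly this, maybe (a toy sketch, not any real library's API: `embed()` here is a fake bag-of-words stand-in for an actual embedding model, and `ConversationMemory` is a made-up name):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Fake "embedding": bag-of-words counts. A real setup would call an
    # actual embedding model here.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two bag-of-words vectors.
    dot = sum(a[word] * b[word] for word in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

class ConversationMemory:
    """Archive old conversations cheaply; pull them back into context on demand."""

    def __init__(self) -> None:
        self.archive: list[tuple[Counter, str]] = []  # long-term memory (disk-ish)
        self.context: list[str] = []                  # what's "in VRAM" right now

    def remember(self, message: str) -> None:
        self.archive.append((embed(message), message))

    def recall(self, query: str, top_k: int = 3) -> list[str]:
        # "hey, remember when we talked about [x]?" -> find the most similar
        # old messages and load them back into the active context.
        q = embed(query)
        ranked = sorted(self.archive, key=lambda item: cosine(q, item[0]), reverse=True)
        hits = [message for _, message in ranked[:top_k]]
        self.context.extend(hits)
        return hits

memory = ConversationMemory()
memory.remember("we compared LoRAs and embeddings for Stable Diffusion")
memory.remember("you said you prefer short answers")
print(memory.recall("remember when we talked about LoRAs?", top_k=1))
```

The "automatic" part would just be calling remember() on everything and recall() whenever the current prompt looks related, instead of the user having to ask explicitly.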

But mostly it was just a joke.

[–] SwampYankee@mander.xyz 4 points 1 year ago

It's amazing the way you NOTICE TWO THINGS.

[–] SwampYankee@mander.xyz -1 points 1 year ago (3 children)

Basically, the more VRAM you have, the better their contextual understanding, i.e. their memory, is. Otherwise you'd have a bot that maybe only knows how to contextualize the last couple of messages.

Hmm, if only there were some hardware analogue for long-term memory.
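Point being: the context is a fixed-size window, and when it's small, old stuff just falls off the end. A rough toy illustration (made-up numbers, whitespace-split "tokens", nothing from a real library):

```python
def fit_context(messages: list[str], token_budget: int) -> list[str]:
    # Keep only the newest messages that fit in the budget; everything
    # older falls out of the bot's "memory".
    kept: list[str] = []
    used = 0
    for message in reversed(messages):      # newest first
        cost = len(message.split())         # crude stand-in for a tokenizer
        if used + cost > token_budget:
            break
        kept.append(message)
        used += cost
    return list(reversed(kept))             # back to chronological order

chat = [
    "my cat is named Marmalade",
    "what should I name my dog?",
    "something orange themed, to match",
]
# With a tiny budget, the cat's name has already been forgotten:
print(fit_context(chat, token_budget=12))
```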

[–] SwampYankee@mander.xyz 1 point 1 year ago (1 child)

Yes, who knows? If only someone had transcribed the article into the Lemmy post and mentioned armed security in the third sentence. But I guess since that didn't happen, we'll never know if the Governor's home is protected at all times by the Massachusetts State Police.

[–] SwampYankee@mander.xyz 16 points 1 year ago

Top Dakota and Bottom Dakota
