this post was submitted on 15 Oct 2025
TechTakes
Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.
This is not debate club. Unless it’s amusing debate.
For actually-good tech, you want our NotAwfulTech community
you are viewing a single comment's thread
view the rest of the comments
I imagine there are a few reasons. An LLM is a narcissist's dream: it stays focused on you, tells you what you want to hear, and is always willing to be corrected.
In addition, LLMs are easy to manipulate, and they mimic a person just enough to give you a sense of power or authority. So if you're the type of person who gets something out of that, there's likely a strong draw.
Those are just guesses, though. I don't use LLMs myself, so I don't really know.
Thanks, that sounds reasonable. Especially the focus/attention.
Maybe it's the same as with other games or computer games... Some people also really get something out of fantasy achievements, out of winning and feeling like the main character... in a weird way...