The thing is, it's not about whether it's convincing, it's about reinforcing problematic behaviors. LLMs are, at their core, agreement machines that work to fulfill whatever goal the user seems to have (it's why they fabricate answers instead of responding in the negative when a request is beyond their scope). And when it comes to the mentally fragile, it doesn't even need to be particularly complex to "yes, and..." them swiftly into full-on psychosis. Their brains only need the slightest bit of unfettered reinforcement to fall into that hole.
A properly responsible company would see this and take measures to limit or eliminate the problem, but these companies see users becoming obsessed with their product as easy money. It's sickening.
I didn't get a light until the GB Color. It held in the port well enough, but yeah, that thing chewed through batteries as well. I was so happy when I got an SP.