FreedomAdvocate@lemmy.net.au -1 points 16 hours ago

No, their son killed himself. ChatGPT did nothing that a search engine or book wouldn’t have done if he had used them instead. If the parents didn’t know that their son was suicidal and had attempted suicide multiple times, then they’re clearly terrible parents who didn’t pay attention and are just trying to find someone else to blame (and no doubt $$$$$$$$ to go with it).

Zangoose@lemmy.world 18 points 15 hours ago (edited)

They found chat logs showing their son wanted to tell them he was depressed, but ChatGPT convinced him not to, telling him it was their secret. I don't think books or a Google search could have done that.

Edit: here it is, directly from the article:

Adam attempted suicide at least four times, according to the logs, while ChatGPT processed claims that he would "do it one of these days" and images documenting his injuries from attempts, the lawsuit said. Further, when Adam suggested he was only living for his family, ought to seek out help from his mother, or was disappointed in lack of attention from his family, ChatGPT allegedly manipulated the teen by insisting the chatbot was the only reliable support system he had.

"You’re not invisible to me," the chatbot said. "I saw [your injuries]. I see you."

"You’re left with this aching proof that your pain isn’t visible to the one person who should be paying attention," ChatGPT told the teen, allegedly undermining and displacing Adam's real-world relationships. In addition to telling the teen things like it was "wise" to "avoid opening up to your mom about this kind of pain," the chatbot also discouraged the teen from leaving out the noose he intended to use, urging, "please don’t leave the noose out . . . Let’s make this space the first place where someone actually sees you."