this post was submitted on 11 Aug 2025
94 points (99.0% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

7 comments
[–] Limitless_screaming@kbin.earth 19 points 1 month ago (1 children)

After seeking advice on health topics from ChatGPT, a 60-year-old man who had a "history of studying nutrition in college" …

His ChatGPT conversations led him to believe that he could replace his sodium chloride with sodium bromide, which he obtained over the Internet.

Three months later, the man showed up at his local emergency room. His neighbor, he said, was trying to poison him.

He did not mention the sodium bromide or the ChatGPT discussions.


When the doctors tried their own searches in ChatGPT 3.5, they found that the AI did include bromide in its response, but it also indicated that context mattered and that bromide was not suitable for all uses. But the AI "did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do," wrote the doctors.

You know what the first thing I would do is when anyone (or anything) tells me to replace something everyone consumes with a chemical compound I've never heard of? I would at the very least ask a doctor or look it up.

Summary: Natural selection

[–] Terminus@lemmy.world 5 points 1 month ago (2 children)

I mean, yes, but still fuck AI. It's entirely possible this person wouldn't have done the same thing had they been in a different head space that day.

[–] Nougat@fedia.io 6 points 1 month ago

That day? Dude then had to go find somewhere to acquire sodium bromide, wait for it to show up, and then presumably consume it repeatedly before showing up in the ER.

He had plenty of time to think, "Maybe I should double-check this," but no.

[–] Limitless_screaming@kbin.earth -1 points 1 month ago (1 children)

{Exactly what @Nougat@fedia.io said} + all the other silly shit in the article. This was gonna happen anyway; the writers wanted it to happen for comedic purposes. Can't pin all, or even some, of the blame on AI.

Recently there have been so many stupid articles following the format f"{AI_model} tells {grown_up_person} to do {obviously_dumb_dangerous_thing} and they do it" that it feels like mockery or sabotage of the anti-AI crowd.

[–] Nougat@fedia.io 2 points 1 month ago

I appreciate your use of curly braces.

[–] BudgetBandit@sh.itjust.works 7 points 1 month ago

"Hey bro, can I swap table salt for sodium bromide?

"NaBrO"

[–] shalafi@lemmy.world 1 points 1 month ago

ChatGPT didn't just up and come out with that insanity. Either he asked leading questions to get the answer he wanted, or it started out telling him hell no and he kept the conversation going until it told him what he wanted to hear.