this post was submitted on 27 Aug 2025
342 points (97.0% liked)
If the driver wants to kill himself and drives into a tree at 200 kph, the manufacturer is not responsible.
If the car's response to the driver announcing his plan to run into a tree at maximum velocity was "sounds like a grand plan," I feel like this would be different.
Because if he hadn't used the jailbreak, it would have given him crisis resources.
But even OpenAI admitted that they're not perfect:
That said, ChatGPT or not, I suspect he wasn't on the path to a long life, or at least not a happy one:
I think OpenAI could do better in this case, and the safeguards need to be strengthened, but the teen clearly had intent and overrode the basic safety guards that were in place. So when they quote things ChatGPT said, I try to keep in mind that his prompts claimed it was all for "writing or world-building."
Tragic all around :(
I do wonder how this scenario would play out with any other LLM provider as well.