As soon as we have hard data from real-world use showing that FSD is safer than the average human driver, it would be unethical not to solve the regulatory and legal issues and deploy it at scale to save human lives.
If a human driver causes a crash, insurance pays. Why shouldn't it pay when a computer causes the crash, given that the computer drives more safely overall, even if only by, say, 10%?
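To put rough numbers on that argument, here's a minimal back-of-the-envelope sketch in Python. Only the 10% figure comes from the comment above; the baseline crash rate and annual fleet mileage are made-up placeholders, not real statistics:

```python
# Back-of-the-envelope sketch of the "safer by 10%" argument.
# All inputs except fsd_improvement are hypothetical assumptions.

human_crashes_per_million_miles = 2.0       # assumed baseline rate (hypothetical)
fsd_improvement = 0.10                      # "even if only by, say, 10%"
miles_driven_per_year = 3_000_000_000_000   # assumed fleet mileage (hypothetical)

# If FSD is 10% safer, its crash rate is 90% of the human baseline.
fsd_crashes_per_million_miles = human_crashes_per_million_miles * (1 - fsd_improvement)

human_crashes = human_crashes_per_million_miles * miles_driven_per_year / 1_000_000
fsd_crashes = fsd_crashes_per_million_miles * miles_driven_per_year / 1_000_000

print(f"Human-driven crashes per year: {human_crashes:,.0f}")
print(f"FSD crashes per year:          {fsd_crashes:,.0f}")
print(f"Crashes avoided:               {human_crashes - fsd_crashes:,.0f}")
```

Under these toy assumptions, even a modest 10% improvement avoids hundreds of thousands of crashes per year, which is the core of the ethical claim.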
I agree that it would be unethical to ignore self-driving, since it has the potential to be far safer than a human driver. I just have a problem with companies overpromising what their software can do.
As for the insurance part: why should my insurance premium increase because of a software defect? If a manufacturing defect causes me to crash my car, the manufacturer is at fault, not me. You wouldn't be liable if the brakes gave out on a new car.
Also keep in mind that getting hard data from the real world means putting these vehicles on the road alongside other drivers. Deficiencies in the software mean potential crashes and deaths. The data will be valuable, but we can't forget that there are people behind it. Self-driving is going to shake things up and will probably be a net positive overall; I just think we should be mindful as we begin to embrace it.