This is the best summary I could come up with:
In March 2023, a North Carolina student was stepping off a school bus when he was struck by a Tesla Model Y traveling at “highway speeds,” according to a federal investigation published today.
The Tesla driver was using Autopilot, the automaker’s advanced driver-assist feature that Elon Musk insists will eventually lead to fully autonomous cars.
NHTSA was prompted to launch its investigation after several incidents of Tesla drivers crashing into stationary emergency vehicles parked on the side of the road.
Most of these incidents took place after dark, with the software ignoring scene control measures, including warning lights, flares, cones, and an illuminated arrow board.
Tesla issued a voluntary recall late last year in response to the investigation, pushing out an over-the-air software update to add more warnings to Autopilot.
The findings cut against Musk’s insistence that Tesla is an artificial intelligence company that is on the cusp of releasing a fully autonomous vehicle for personal use.
The original article contains 788 words, the summary contains 158 words. Saved 80%. I'm a bot and I'm open source!
Cameras and AI aren't a match for radar/lidar. This is the big issue with Tesla's camera-only approach to autonomy: you can only guess whether there are hazards in the way.
Most algorithms are designed to work and then statistically tested to validate that they work. When you develop an algorithm with AI/machine learning, there is only the statistical step: you have to infer whole-system performance purely from that. There isn't a separate process for verification and validation. It's validation alone.
When something is developed with only statistical evidence that it works, you can't be reliably sure it works in most scenarios, only in the exact ones you tested for. When you design an algorithm to work, you can assume it works in most scenarios if the results are as expected when you validate it. With machine learning, the algorithm itself is obscured and uncertain (unless it's only used for parameter optimisation).
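The distinction can be made concrete with a toy sketch (plain Python, nothing to do with Tesla's actual software). A designed braking rule can be verified directly against its written specification, while a "learned" rule fit from data can only be validated statistically on held-out samples. The braking distance, function names, and training procedure below are all made up for illustration:

```python
# Minimal sketch contrasting verification of a designed rule with
# validation of a learned one. Everything here is hypothetical.
import random

BRAKING_DISTANCE_M = 25.0  # hypothetical spec: brake when obstacle < 25 m


def rule_based_brake(distance_m: float) -> bool:
    """Designed algorithm: behaviour follows the written spec by construction."""
    return distance_m < BRAKING_DISTANCE_M


def train_learned_brake(samples, labels):
    """'Learned' detector: fit a threshold from data instead of taking it from the spec."""
    # Pick the threshold that maximises training accuracy (a stand-in for ML training).
    best_t, best_acc = samples[0], 0.0
    for t in sorted(samples):
        acc = sum((d < t) == y for d, y in zip(samples, labels)) / len(samples)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return lambda d: d < best_t


random.seed(0)
train = [random.uniform(0, 100) for _ in range(200)]
train_y = [d < BRAKING_DISTANCE_M for d in train]   # noiseless labels for the toy
learned_brake = train_learned_brake(train, train_y)

# Verification (only possible for the designed rule): check it against the spec itself.
assert all(rule_based_brake(d) == (d < BRAKING_DISTANCE_M) for d in range(0, 100))

# Validation (all you get for the learned rule): measure accuracy on held-out data.
test = [random.uniform(0, 100) for _ in range(1000)]
correct = sum(learned_brake(d) == (d < BRAKING_DISTANCE_M) for d in test)
print(f"learned detector held-out accuracy: {correct / len(test):.1%}")
# High held-out accuracy is statistical evidence, not proof it behaves correctly
# on inputs unlike the test set -- which is the point being made above.
```

High accuracy on the test set tells you nothing guaranteed about scenarios the test set doesn't cover; for the hand-written rule you can check the behaviour against the specification itself.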
Machine learning is never used because it's a better approach; it's used when the engineers don't know how to develop the algorithm any other way. Once you understand this, you understand the hazard it presents. If you don't understand, or refuse to understand, this, you build machines that drive into children. Through ignorance, greed and arrogance, Tesla built a machine that deliberately runs over children.