As much as I like Mark, he's got some explaining to do.
At 15:42 the center console is shown, and Autopilot is disengaged before impact. It was also engaged at 39mph during the YouTube cut, and he struck the wall at 42mph (i.e. the car accelerated into the wall).
Mark then posted the 'raw footage' on Twitter. This also shows Autopilot disengaging before impact, but shows it was engaged at 42mph. This was a separate take.
/edit:
YouTube, the first frames showing Autopilot being enabled: 39mph
Twitter, the first frames showing Autopilot being enabled: 42mph
https://www.motortrend.com/news/nhtsa-tesla-autopilot-investigation-shutoff-crash/
No. That's by design. "Autopilot" is made to disengage whenever a collision looks imminent, to reduce the likelihood of anyone finding Tesla liable for the system being unsafe.
Not saying you're wrong (I've always found it suspicious how Tesla always seems to report that Autopilot was disengaged in fatal accidents), but there are probably some people asking themselves "how could it detect the wall in order to disengage itself?"
The image on the wall has a perspective baked into it, so it only looks right from one position: the distance at which the lines of the real road match up perfectly with the lines of the road painted on the wall. As you get closer than that distance, the illusion starts to break down. The object-tracking software effectively says, "There are things moving in ways I can't predict. Something is wrong here. I give up. Hand control to the driver."
Autopilot disengaged.
(And it only noticed a fraction of a second before hitting it, while Mark was clearly well aware of it. He's screaming.)
Sidenote: the same is true as you move further from the wall than the ideal distance. The illusion will break down in that way too. However, the effect is far more subtle when you're too far away. After all, the wall is just a tiny bit of your view when you're a long way away, but it's your whole view when you're just about to hit it.
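If anyone wants a feel for why the mismatch grows so fast as you close in (and stays subtle when you're too far back), here's a rough back-of-the-envelope sketch in Python. The lane width, camera height and "painted for" distance are all made-up numbers, not anything measured from the video:

```python
# Rough geometric sketch of why the illusion only works at one distance.
# All numbers (lane offset, camera height, design distance) are invented
# for illustration -- nothing here comes from the actual video or from Tesla.

def project(x, y, z, f=1.0):
    """Pinhole projection: image coordinates of a point in camera space."""
    return f * x / z, f * y / z

CAM_HEIGHT = 1.4    # metres above the road (assumed)
LANE_HALF  = 1.8    # lateral offset of a lane line (assumed)
D_DESIGN   = 50.0   # distance from which the wall painting lines up (assumed)

def painted_point(z_far):
    """Where a road point z_far metres ahead gets painted on the wall so
    that it looks correct from exactly D_DESIGN metres away."""
    t = D_DESIGN / z_far            # ray from the design viewpoint hits the wall
    return LANE_HALF * t, -CAM_HEIGHT * t

z_far = 80.0                        # the road point the painting represents
xp, yp = painted_point(z_far)

for d in (70.0, 50.0, 30.0, 15.0, 5.0):   # current camera-to-wall distance
    u_paint, _ = project(xp, yp, d)                                    # painted line
    u_real, _ = project(LANE_HALF, -CAM_HEIGHT, z_far - (D_DESIGN - d))  # real line
    print(f"{d:5.1f} m from wall: painted u={u_paint:.4f}, "
          f"expected u={u_real:.4f}, mismatch={abs(u_paint - u_real):.4f}")
```

Run it and the mismatch is essentially zero at the design distance, small when you're further back than that, and huge in the last few metres, which is roughly where a tracker would start throwing up its hands.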
If you watch the footage, it's clearly the same take.
From the Twitter footage:
This is from the first couple of frames showing that Autopilot is enabled, just as the blue lines appeared on screen: 42mph displayed on the center console.
And from the YouTube footage:
Again, from the first couple of frames as Autopilot is enabled, just as the blue lines appear: 39mph displayed on the center console.
Here's more from the YouTube footage, taken several seconds apart:
They are very, very similar, but they do appear to be two different takes.
He stated in the video that he tapped the brakes. Doesn't that disengage Autopilot?
It would, but he explicitly says 'without even a slight tap on the brakes' in the YouTube video.
Then:
- Mark Rober
Twitter.
He did state that he hit the brakes. Just on the fog one, not the wall. 🤷
But the fact that FSD disengages 17 frames before the crash also implies the software, along with the car, crashed 😂 I’d love to see the logs. I imagine the software got real confused real quick.
This is likely what happened. The software hit an invalid state and defaulted to disengaging the Autopilot feature. It likely hit an impossible state because the camera could no longer reconcile the "wall" portion of the road with the actual road as it got too close.
An error condition likely occurred, the same kind that would happen if an object suddenly covered the camera while driving. It would be even more concerning if Autopilot DIDN'T disengage at that point.
The software likely disengaged because the wall was so close it could not function. It would be the same as if an object suddenly covered up the camera while you were driving. It would turn off the Autopilot/FSD or whatever they call it. (At least I hope it would.)
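Nobody outside Tesla has seen the actual code, so this is purely a guess at the general pattern being described in this thread: a watchdog-style sanity check that hands control back to the driver when perception stops being self-consistent. Every name and threshold here is invented for illustration:

```python
# Speculative sketch of the failure mode described above: a perception
# watchdog that gives up and hands control back to the driver when its
# view of the world stops making sense. This is NOT Tesla's code or API.

from dataclasses import dataclass

@dataclass
class PerceptionFrame:
    tracked_objects: int      # objects the tracker is still following
    lost_tracks: int          # objects it was following but suddenly lost
    lane_confidence: float    # 0..1, how well detected lanes fit a road model

def should_disengage(frame: PerceptionFrame) -> bool:
    """Disengage if tracking falls apart or the lane model stops fitting."""
    too_many_lost = frame.lost_tracks > frame.tracked_objects
    lanes_unreliable = frame.lane_confidence < 0.3
    return too_many_lost or lanes_unreliable

# Example: very close to the wall, the painted lanes no longer fit a
# plausible road and tracked features behave unpredictably, so the check
# trips and control is handed back just before impact.
near_wall = PerceptionFrame(tracked_objects=2, lost_tracks=5, lane_confidence=0.1)
print("disengage:", should_disengage(near_wall))   # disengage: True
```

The point of the pattern, if something like it is in play, is that "give up and disengage" is the designed fallback for any state the perception stack can't explain, which would also explain a disengagement showing up a fraction of a second before impact.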
He explicitly states that he didn't tap the brakes.
He said he tapped the brakes on one of the earlier non-wall runs.