submitted 6 months ago* (last edited 6 months ago) by dragontamer@lemmy.world to c/realtesla@lemmy.world

In total, NHTSA investigated 956 crashes, starting in January 2018 and extending all the way until August 2023. Of those crashes, some of which involved other vehicles striking the Tesla vehicle, 29 people died. There were also 211 crashes in which “the frontal plane of the Tesla struck a vehicle or obstacle in its path.” These crashes, which were often the most severe, resulted in 14 deaths and 49 injuries.

[-] candybrie@lemmy.world 6 points 6 months ago* (last edited 6 months ago)

They don't want a statistical test for how often Autopilot fails. They want the investigation contextualized. 29 people died over 5 years due to Tesla Autopilot. About 35,000 to 43,000 people die in car accidents in the US every year. Without proper contextualization, I can't tell if Tesla Autopilot is doing great or awful.
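
To make that concrete, here's a rough sketch (in Python, with placeholder numbers, since the real Autopilot mileage is exactly the exposure data we don't have) of the kind of per-mile comparison that would actually answer the question:

```python
# Rough sketch of the contextualization being asked for.
# Every figure below except the 29 deaths is a placeholder, NOT real data.

AUTOPILOT_DEATHS = 29        # NHTSA-investigated deaths, Jan 2018 - Aug 2023
AUTOPILOT_MILES = 5e9        # hypothetical miles driven under Autopilot (unknown)

US_DEATHS_PER_YEAR = 40_000  # roughly 35k-43k US road deaths per year
US_MILES_PER_YEAR = 3.2e12   # approximate total US vehicle-miles traveled per year

# Deaths per 100 million vehicle-miles, the usual way fatality rates are compared.
autopilot_rate = AUTOPILOT_DEATHS / AUTOPILOT_MILES * 1e8
human_rate = US_DEATHS_PER_YEAR / US_MILES_PER_YEAR * 1e8

print(f"Autopilot:      {autopilot_rate:.2f} deaths per 100M miles (placeholder exposure)")
print(f"All US driving: {human_rate:.2f} deaths per 100M miles")
```

Until the Autopilot mileage is filled in with a real number, the ratio on the first line is meaningless, which is the whole point.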

[-] dragontamer@lemmy.world -2 points 6 months ago* (last edited 6 months ago)

TheRegister compiled this count back in 2022, but Tesla's issues continue to this day. Reported ADAS crashes for Tesla are an abnormal outlier. Anyone looking at the reports can see something is grossly wrong here.

[-] candybrie@lemmy.world 5 points 6 months ago* (last edited 6 months ago)

That's also just raw numbers without contextualization. How many miles, and of what kind, were driven by each of those cars' Autopilot-like systems? If this is over the lifetime of the vehicles, I imagine Tesla has more, as they had a head start on including these types of features. But maybe the other companies make it up in volume? Or maybe Tesla's ADAS is engaged for more miles and in more dangerous situations than its competitors' and does much better in those situations than they and humans do. I don't know at all. And the graphs and investigations don't tell me.
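
As a sketch of why raw crash counts alone can't rank manufacturers, here's the same kind of per-mile comparison across two hypothetical fleets (all numbers made up, purely illustrative):

```python
# Why raw ADAS crash counts alone can't rank manufacturers: exposure matters.
# Every number here is hypothetical; real per-manufacturer ADAS mileage is
# exactly the data missing from the reports.

fleets = {
    # name: (reported ADAS crashes, ADAS miles driven)
    "Manufacturer A": (800, 4.0e9),   # many crashes, but huge ADAS mileage
    "Manufacturer B": (50, 0.1e9),    # few crashes, but tiny ADAS mileage
}

for name, (crashes, miles) in fleets.items():
    rate = crashes / miles * 1e6      # crashes per million ADAS miles
    print(f"{name}: {crashes} crashes, {rate:.1f} per million ADAS miles")

# A works out to 0.2 per million miles, B to 0.5 --
# the fleet with more raw crashes can still be the safer one per mile.
```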

[-] dragontamer@lemmy.world -1 points 6 months ago* (last edited 6 months ago)

Y'all are overcomplicating this. Death car kills people because asshole CEO lies about how safe it is.

There is footage of Tesla's autopilot crashing into medians and driving on the wrong side of the road right now.


As I said earlier: it's an automation. An automaton. When this 'Autopilot' or 'Full Self Driving' gets placed in front of still objects (like firetrucks with their sirens on), the damn thing crashes into them. It's clearly fucking blind to still objects, and no one at Tesla has figured out how to solve that yet.

Still median? Crash.

Still firetruck on the side of the road? Crash.

Still balloon in the shape of a child at a live CES / Luminar tech demo? Crashes every time.

It's a god-awful system that is only saved by human intervention in these cases. When Tesla's ADAS fails, it fails near-certainly and repeatably.


Despite that, we have a CEO lying to convince people otherwise about this automation's capabilities.

Now we have an NHTSA investigation into deaths and crashes, and the fanbase is still pretending the emperor has clothes on.

[-] candybrie@lemmy.world 5 points 6 months ago* (last edited 6 months ago)

There's video footage of people doing the same stupid shit and worse. Constantly. Is it better than people, yes or no? Is it better than its competitors? Those are the fundamental questions. Obviously, it could be better. But is it already better than anything else? Raw numbers don't tell you. Videos of it doing idiotic things don't tell you. Proper contextualization and comparison do. No one in these comments is arguing it is better than anything else, just that the data we have doesn't tell us anything.

[-] dragontamer@lemmy.world 1 points 6 months ago

Is it better than people, yes or no?

Can people regularly stop for a balloon child in broad daylight?

https://www.youtube.com/watch?v=azdX_6L1SOA

Answer: Tesla fails this test 100% of the time. That's why it's a tech demo for Luminar, because they're selling LIDAR units to the public. They chose Tesla because it reliably fails against these balloon children.

[-] candybrie@lemmy.world 2 points 6 months ago

People regularly screw things like that up. It's why we have all kinds of campaigns about going slow in school zones and neighborhoods, why we teach children not to play in streets, why people are concerned about really tall vehicles making it even harder for drivers to notice children in front of their cars. People suck at driving.

[-] dragontamer@lemmy.world 0 points 6 months ago

Other ADAS systems actually have functioning emergency braking.

You know, the ones with functioning RADAR units.

[-] candybrie@lemmy.world 2 points 6 months ago

That's great! The contextualized data should show that, making them look better than Tesla. Why are you against the idea of contextualizing the data?

[-] dragontamer@lemmy.world 0 points 6 months ago* (last edited 6 months ago)

I'm against you ignoring the data that we have in favor of data we don't have.

It's a fucking propaganda technique and I'm calling you out on it. You aren't even arguing against this data, you are just shitting on it.


The context is that Mobileye has more ADAS deployments, more miles traveled and a safer record than Tesla's FSD.

[-] candybrie@lemmy.world 2 points 6 months ago

You aren't even arguing against this data, you are just shitting on it.

Because the data we do have says that Tesla is freaking amazing compared to people. 29 deaths over 5 years compared to 35k deaths a year? But there are a million caveats to that. What are they? And when we put them in, do they show Tesla sucks like some of those videos do?

The context is that Mobileye has more ADAS deployments, more miles traveled and a safer record than Tesla's FSD.

Great! Where is that shown and explained in any of these things?

[-] dragontamer@lemmy.world 0 points 6 months ago* (last edited 6 months ago)

Because the data we do have says that Tesla is freaking amazing compared to people

https://www.youtube.com/watch?v=azdX_6L1SOA

Are you looking at the same data as me? Humans don't make mistakes like this.

[-] candybrie@lemmy.world 2 points 6 months ago

Yes, they do. They play with their phones. Or fall asleep at the wheel. Or try to do their makeup. Humans suck at driving. Sometimes they're worse in super easy circumstances, because that's when they're most likely to stop paying attention. Over 35k people dead every year in just the US.

[-] dragontamer@lemmy.world 0 points 6 months ago

And when the CEO says that Full Self Driving will go coast-to-coast by 2017, and that "the human is only here for regulatory purposes," overselling the capabilities of the "FSD" system, what do you think that causes?

Drunk people relying upon FSD to drive them head-first into a firetruck.

[-] okamiueru@lemmy.world 3 points 6 months ago

Sheesh. You don't understand the argument and/or discussion in question. Don't be defensive, and instead try to understand the argument.
