submitted 1 year ago by L4s@lemmy.world to c/technology@lemmy.world

Safe Streets Rebel's protest comes after autonomous vehicles were blamed for incidents including crashing into a bus and running over a dog. City officials in June said...

[-] catsarebadpeople@sh.itjust.works 123 points 1 year ago

Thousands of accidents a year from human drivers. I sleep

90 accidents a year from autonomous vehicles. Lazer eyes

[-] over_clox@lemmy.world 89 points 1 year ago

You make it sound like it's a 50/50 split between human drivers and autonomous vehicles, which is definitely not the case.

There are way more human drivers than autonomous vehicles. So, when an autonomous vehicle runs your child or pet over or whatever, who do you blame? The company? The programmers? The DMV for even allowing them on the road in the first place?

What's an autonomous vehicle do if it gets a flat? Park in the middle of the interstate like an idiot and phone home for a mechanic, instead of pulling over?

[-] donalonzo@lemmy.world 16 points 1 year ago* (last edited 1 year ago)

You need to first ask yourself whether it's more important to assign blame than to minimize risk.

"Autonomous vehicles could potentially reduce traffic fatalities by up to 90%."

"Autonomous vehicle accidents have been recorded at a slightly lower rate compared with conventional cars, at 4.7 accidents per million miles driven."

https://blog.gitnux.com/driverless-car-accident-statistics/

[-] HedonismB0t@lemmy.ml 38 points 1 year ago

That opinion puts a lot of blind faith in the companies developing self driving and their infinitely altruistic motives.

[-] donalonzo@lemmy.world 13 points 1 year ago

That's one way of strawmanning your way out of a discussion.

[-] rtxn@lemmy.world 34 points 1 year ago* (last edited 1 year ago)

It's not a strawman argument, it is a fact. Without the ability to audit the entire codebase of self-driving cars, there's no way to know whether the manufacturer has knowingly hidden something in the code that might have caused the accidents and fatalities, too numerous to recount but too important to ignore, that have been linked to faults in self-driving technology.

I was actually trying to find an article I'd read about Tesla's self-driving software reverting to manual control moments before impact, but I was literally flooded by fatality reports.

[-] HobbitFoot@thelemmy.club 11 points 1 year ago

We can't audit the code for humans, but we still let them drive.

If the accident rate for computer drivers is lower than for humans, and the computer designers are forced to be just as financially liable for car crashes as humans are, why shouldn't we let computers drive?

[-] Shayreelz@sh.itjust.works 18 points 1 year ago

I'm not fully in either camp in this debate, but fwiw, the humans we let drive generally suffer consequences if there is an accident due to their own negligence

[-] Obi@sopuli.xyz 12 points 1 year ago

Also we do audit them, it's called a license. I know it's super easy to get one in the US but in other countries they can be quite stringent.

[-] rambaroo@lemmy.world 12 points 1 year ago* (last edited 1 year ago)

Because there's no valid excuse to prevent us from auditing their software, and it could save lives. Why the hell should we allow them to use the road if they won't even let us inspect the engine?

A car isn't a human. It's a machine, and it can and should be inspected. Anything less than that is pure recklessness.

[-] kep@lemmy.world 11 points 1 year ago

Strawman arguments can be factual. The entire point is that you're responding to something that wasn't the argument. You're putting words in their mouth to defeat them instead of addressing their words at face value. It is the definition of a strawman argument.

[-] donalonzo@lemmy.world 4 points 1 year ago* (last edited 1 year ago)

It is most definitely a strawman to frame my comment as considering the companies "infinitely altruistic", no matter what lies behind the strawman. It doesn't refute my statistics, but rather tries to make me look like I'm making an extremely silly argument that I'm not, which is the definition of a strawman argument.

[-] rambaroo@lemmy.world 7 points 1 year ago* (last edited 1 year ago)

The data you cited comes straight from manufacturers, who've repeatedly been shown to lie and cherry-pick their data to intentionally mislead people about driverless car safety.

So no it's not a straw man argument at all to claim that you're putting inordinate faith in manufacturers, because that's exactly what you did. It's actually incredible to me how many of you are so irresponsible that you're not even willing to do basic cross-checking against an industry that is known for blatantly lying about safety issues.

[-] IntoDaLagoon@lemmygrad.ml 9 points 1 year ago

What do you mean, I'm sure the industry whose standard practices include having the self-driving function turn itself off nanoseconds before a crash to avoid liability is totally motivated to spend the time and money it would take to fix the problem. After all, we live in a time of such advanced AI that all the news sites and magazines tell me we're on the verge of the Singularity, and they've never misled me before.

[-] RedWizard@lemmygrad.ml 5 points 1 year ago

I feel like I'm taking crazy pills because no one seems to know or give a shit that Tesla was caught red-handed doing this. They effectively murdered those drivers.

[-] biddy@feddit.nl 7 points 1 year ago

That wasn't an opinion, it's a statistic.

No (large public) company ever has altruistic motives. They aren't inherently good or bad, just machines driven by profit.

[-] HobbitFoot@thelemmy.club 4 points 1 year ago

You don't need to put faith into companies beyond the faith that is put into humans. Make companies just as financially liable as humans are, and you'll still see a decrease in accidents.

[-] xavier666@lemm.ee 7 points 1 year ago

You mean those companies who will lobby and spend a fraction of their wealth to make those lawsuits disappear?

[-] HobbitFoot@thelemmy.club 3 points 1 year ago

How is that different from the current system of large vehicular insurance companies spending a fraction of their wealth to make their lawsuits disappear?

[-] xavier666@lemm.ee 6 points 1 year ago

It's no different at all. We should have stronger laws for such scenarios.

[-] over_clox@lemmy.world 2 points 1 year ago

So...

Your car is at fault. Their kid is dead.

Who pays for the funeral?

Does your insurance cover programming glitches?

[-] HumbertTetere@feddit.de 11 points 1 year ago

If your insurance determined that an autonomous vehicle will cause less damage over time than a human driver, they will do that, yes.

[-] CoderKat@lemm.ee 6 points 1 year ago

I mean, why shouldn't it? Is a programming glitch in a self-driving car all that different from a mechanical issue in a manually driven car?

[-] over_clox@lemmy.world 3 points 1 year ago

AI-driven cars are just as prone to mechanical issues as well. Is AI smart enough to deal with a flat tire? Will it pull over to the side of the road before phoning in for a mechanic, or will it just ignorantly hard-stop right in the middle of the interstate?

What's AI do when there's a police officer directing traffic around an accident or through a faulty red light intersection? I've literally seen videos on that before, AI couldn't give two shits about a cop's orders as to which way to drive the vehicle.

[-] jtmetcalfe@lemmy.fmhy.ml 41 points 1 year ago

Using the public as guinea pigs for corporate profits: priceless

[-] randon31415@lemmy.world 41 points 1 year ago

DARPA figures out how to safely drive cars using LIDAR. Musk asks for a self-driving car. Engineers come back with the LIDAR solution. Musk fires them, saying if humans can drive with two eyes, then so can computers. Cameras are cheaper than LIDAR. A second group tries it with cameras, can't get it to work, and asks why they can't use LIDAR. The second group of engineers is fired. A third group comes up with something that 'kind of works'. People die. Big companies avoid self-driving altogether, even though we have a perfect solution with LIDAR, all because Musk wanted to save a buck and couldn't get out of the way of his engineers.

[-] Yendor@reddthat.com 13 points 1 year ago

I’ve worked on serious projects involving LiDAR. The LiDAR you need at these speeds and resolutions costs almost as much as an electric car - it’s too expensive to reach wide adoption. But video processing with CNNs/RNNs has proven you can build the same level of data with cameras. You don’t even need binocular cameras now - if objects are moving, you can generate binocular data by combining IMU data with time-series imagery.

As I understand it, Tesla’s delays aren’t related to image capture (which is where LiDAR could help). They’re related to trying to find universal actions to take against an almost infinite number of possible scenarios (mostly actions by human drivers).

[-] pickle_party247@lemmy.world 28 points 1 year ago

The real funny here is how the USA has the most lax driving test standards in the developed world, resulting in crazy amounts of road traffic accidents and really high mortality rates. But instead of dealing with shitty driving at the source, there's a billion-dollar industry in autonomous driving.

[-] Clown_Tempura@lemmy.world 3 points 1 year ago

Exploitation is the American way, bro. Create problems where there are none, offer a solution, profit.

[-] Snapz@lemmy.world 24 points 1 year ago

When a for profit company is deciding how much time/energy/funds they want to invest in pedestrian safety, you get LOUD and you stay that way forever.

Your comment is blind to the reality we live in and the broken, out-of-touch people deciding whether human lives are a business's priority, and at what percentages, as these types of vehicles scale.

When humans get in an accident, there were choices/mistakes made, but there are things we can understand in certain situations, and we can often find closure. When elon's failed experiment decapitates your grandmother by driving her under a semi and shearing off the top of the car, you'll probably never shake that image as long as you live - and you'll see elon in the news each day being a tool and never seeing justice for that moment.

There's a difference with distinction in this conversation.

[-] Event_Horizon@lemmy.ml 6 points 1 year ago

That's a really good point.

Imagine your dog gets run over. You rush them to the vet, but ultimately they die and you're thousands out of pocket. You call the corporate helpdesk to log a claim because there isn't anyone else to contact; they offer you $300 in credit for immediate resolution, or you can dispute. You become upset because your dog was worth more than a credit refund; the call centre drone says you've become aggressive, tells you to call back during business hours, and hangs up.

What a hell scape.

[-] Sethayy@sh.itjust.works 19 points 1 year ago

Did you read the article? The protests are in favour of affordable public transit, instead of using 'surveillance pods' as a way to build even MORE roads. The accidents are probably the least of their concerns, although still on the list

[-] lobut@lemmy.ca 14 points 1 year ago

I mean, there are probably millions of human drivers doing vastly more driving than the comparatively tiny number of autonomous vehicles.

I personally can't wait for autonomous vehicles to take over but the argument would be clearer with percentages and stuff.

[-] bighi@lemmy.world 8 points 1 year ago

90 accidents a year is a LOT, if you stop to think that there are only a few dozen of them out there, versus more than a hundred million human drivers.

[-] agitatedpotato@lemmy.world 6 points 1 year ago* (last edited 1 year ago)

They stop for no reason, cause gridlocks that require a human to come out and pilot them, they've run over fire hoses in use, and they don't always get out of the way for emergency service vehicles. Nice statistic though.

[-] Mudflap@lemm.ee 24 points 1 year ago

So are you talking about autonomous cars or...

[-] Kausta@lemmy.world 5 points 1 year ago

Comparing these two requires the number of cars with human drivers and the amount of time humans spend driving per year versus the number of autonomous vehicles and the amount of time they spend driving per year. I am not saying that you are wrong, I am just saying that comparing these numbers directly is like comparing apples with oranges.

I agree completely. My original post was just a stupid meme. I don't really think putting cones on the hoods of the cars is helping, and it's kind of dumb to do that and act smug about it. I'd rather people were suing or something. I'm sure there's precedent for stopping manufacturers from making their vehicles more dangerous just to save a small percentage of money. I guess we do live in a capitalist utopia, though, so maybe I'm wrong, but it seems like court might be more effective than trying to make these cars even more dangerous by adding a cone to the hood.
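As an illustration of the normalization the comments above are asking for: a raw accident count only becomes comparable once you divide by exposure, e.g. miles driven. All fleet and mileage figures below are made up purely for illustration - the thread doesn't provide the real ones.

```python
# Hypothetical figures, purely for illustration -- not real statistics.
def accidents_per_million_miles(accidents: int, miles_driven: float) -> float:
    """Normalize a raw accident count by exposure (total miles driven)."""
    return accidents / (miles_driven / 1_000_000)

# Human drivers: huge fleet, enormous annual mileage (made-up numbers).
human_rate = accidents_per_million_miles(
    accidents=6_000_000, miles_driven=3_000_000_000_000
)

# Autonomous vehicles: tiny fleet, tiny annual mileage (made-up numbers).
av_rate = accidents_per_million_miles(accidents=90, miles_driven=20_000_000)

print(f"human: {human_rate:.1f} accidents per million miles")  # 2.0
print(f"AV:    {av_rate:.1f} accidents per million miles")     # 4.5
```

With these invented inputs the tiny AV fleet's 90 accidents works out to a *higher* per-mile rate than the humans' millions of accidents - which is exactly why comparing the raw counts is apples to oranges.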

this post was submitted on 11 Jul 2023
559 points (96.5% liked)
