this post was submitted on 03 Sep 2023
207 points (100.0% liked)
Technology
A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.
Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.
This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.
Those damn things are not ready to be used on public roads. Allowing them is one of the more prominent examples of corruption that we’ve seen recently.
Statistically they're still less prone to accidents than human drivers.
I never quite understood why so many people seem to be against autonomous vehicles. Especially on Lemmy. It's unreasonable to demand perfection before any of these are used on public roads. In my view the bar to reach is human-level driving, and after that it seems quite obvious that from a safety point of view it's the better choice.
This is just such a bad take, and it's so disappointing to see it parroted all over the web. So many things are completely inaccurate about these "statistics", and that's probably why it "seems" like so many are against autonomous vehicles.
So no, I would absolutely not say they are "less prone to accidents than human drivers". And that's just the statistics, to say nothing of the legal questions that will come up. Especially given just how averse companies seem to be to admitting fault for anything.
Accidents are less likely on highways. Most accidents occur in urban settings. Most deadly accidents occur outside of cities, off-highway.
Sure, mile for mile they are less likely. But when they happen they are generally more serious, as higher speeds are involved. And if Tesla has shown anything, it's that navigating and dealing with edge cases (like vehicles on the side of the road, emergency or otherwise) is a much more complicated process for autonomous vehicles. It's much harder (and more dangerous) to just slam on the brakes and put on your hazards on a highway than on a side street if the car gets confused.
I could see accidents being more likely for autonomous cars on highways though
Why? Driving on highways is the easiest kind of driving?
For humans, but not necessarily for camera-based autonomous cars. They also can't just stop on a highway to prevent accidents.
Avoiding dangerous scenarios is the definition of driving safely.
This technology is still an area under active development and nobody (not even Elon!) is claiming this stuff is ready to replace a human in every possible scenario. Are you actually suggesting they should be testing the cars in scenarios that they know wouldn't be safe with the current technology? Why the fuck would they do that?
OK... if you won't accept the company's reported data, whose data will you accept? Do you have a more reliable source that contradicts what the companies themselves have published?
No that's a non issue. When a human driver runs over a pedestrian/etc and causes a serious injury, if it's a civilised country and a sensible driver, then an insurance company will pay the bill. This happens about a million times a week worldwide and insurance is a well established system that people are, for the most part, happy with.
Autonomous vehicles are also covered by insurance. In fact it's another area where they're better than humans - because humans frequently fail to pay their insurance bill or even deliberately drive after they have been ordered by a judge not to drive (which obviously voids their insurance policy).
There have been debates over who will pay the insurance premium, but that seems pretty silly to me. Obviously the human who ordered the car to drive them somewhere will have to pay for all costs involved in the drive. And part of that will be insurance.
Well hey - at least I provided some statistics to back me up. That's not the case with the people refuting those stats.
I honestly can't tell if that's a passive-aggressive swipe at me or not; but just in case it was: stats mean very little w/o context. I believe the quote was "Lies, damned lies, and statistics". I simply pointed out a few errors with the foundation of these "statistics". I didn't need to quote my own statistics because, as I was pointing out, this is a completely apples to oranges comparison. The AV companies want at the same time to preach about how many miles they go w/o accident while comparing themselves to an average they know doesn't match their own circumstances. Basically they are taking their best case scenario and comparing it against average/worst case scenario stats.
I'd give more weight to the stats if they were completely transparent, worked with a neutral 3rd party, and gave them access to all their video/data/etc to generate (at the very least) proper stats relative to their environment. Sure, I'll believe Waymo's/Cruise's numbers far more readily than Tesla's, but I still take them with a grain of salt. Because again, they have a HUGE incentive to tweak their numbers to put themselves in the very best light.
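To make the apples-to-oranges problem concrete, here's a toy calculation (all numbers invented for illustration, not real crash data): a fleet that drives only easy highway miles can look safer than the blended all-roads human average even if it is worse than humans on highways themselves.

```python
# Toy illustration (invented numbers): per-mile accident rates differ
# sharply by road type, so comparing a highway-heavy AV fleet against
# the all-roads human average is misleading.

human_rates = {"highway": 1.0, "urban": 4.0}  # accidents per million miles
human_miles = {"highway": 3.0, "urban": 1.0}  # millions of miles driven

# The human average blends easy highway miles and hard urban miles.
human_avg = (
    sum(human_rates[k] * human_miles[k] for k in human_rates)
    / sum(human_miles.values())
)  # (1.0*3 + 4.0*1) / 4 = 1.75

# An AV fleet that is *worse* than humans on highways (1.5 vs 1.0)
# but drives only highways still beats the blended human average.
av_highway_rate = 1.5

print(human_avg)                     # 1.75
print(av_highway_rate < human_avg)   # True: looks "safer" than average
```

This is the same kind of mix-shift effect the comment above is pointing at: the headline per-mile number says nothing unless the driving conditions are matched.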
No, I see your point, and I agree. These companies are almost guaranteed to cherry-pick those stats, so only a fool would take that as hard evidence. However, I don't think these stats flat-out lie either. If they show a self-driving car is three times less prone to accidents, I doubt the truth is that humans, in fact, are twice as good. I believe it's safe to assume that these stats at least point us in the right direction, and that seems to correlate with the little personal experience I have as well. If these systems really sucked as much as the most hardcore AV-skeptics make it seem, I doubt we'd be seeing any of these in use on public roads because the issues would be apparent.
However, the point I'm trying to highlight here is that I make a claim about AV safety and then provide some stats to back me up. People then come telling me that's nonsense and list a bunch of personal reasons why they feel so, but provide no concrete evidence except maybe links to articles about individual accidents. That's just not the kind of data that's going to change my mind.
With Tesla the complaint is that the statistics are almost all highway miles, so they don't represent the most challenging conditions, which is city driving. Cruise then drives exclusively in a city, and yet this isn't good enough either. The AV-sceptics are really hard to please.
You'll always be able to find individual incidents where these systems fail. They're never going to be foolproof and the more of them that are out there the more news like this you're going to see. If we reported about human-caused crashes with the same enthusiasm that would be all the news you're hearing from then on and letting humans drive would seem like the most scandalous thing imaginable.
That article you linked isn't about a self-driving car. It's about Tesla "autopilot", which constantly checks that a human is actively holding the steering wheel and depends on the human watching the road ahead for hazards so they can take over instantly. If the human sees flashing lights, they are supposed to do so.
The fully autonomous cars that don't need a human behind the wheel have much better sensors which can see through fog.
The fact that Tesla requires a human driver to take over constantly makes it not self-driving.
The human isn't supposed to be "doing nothing". The human is supposed to be driving the car. Autopilot is simply keeping the car in the correct lane for you, and also adjusting the speed to match the car ahead.
Tesla's system won't even stop at an intersection if you need to give way (for example, a stop sign. Or a red traffic light). There's plenty of stuff the human needs to be doing other than turning the steering wheel. If there is a vehicle stopped in the middle of the road Tesla's system will drive straight into it at full speed without even touching the brakes. That's not something that "might happen" it's something that will happen, and has happened, any time a stationary vehicle is parked on the road. It can detect the car ahead of you slowing down. It cannot detect a stopped vehicle.
They've promised to ship a more capable system "soon" for over a decade. I don't see any evidence that it's actually close to shipping though. The autonomous systems by other manufacturers are significantly more advanced. They shouldn't be compared to Tesla at all.
Yes. Tens of millions of miles of testing, and they pay especially close attention to any situations where the sensors could potentially fail. Waymo says their biggest challenge is mud (splashed up from other cars) covering the sensors. But the cars are able to detect this, and the mud can be wiped off. It's a solvable problem.
Unlike Tesla, most of the other manufacturers consider this a research project and are focusing all of their efforts on making the technology better/safer/etc. They're not making empty promises and they're being cautious.
On top of the millions of miles of actual testing, they also record all the sensor data for those miles and use it to run updated versions of the algorithm in exactly the same scenario. So the millions of miles have, in fact, been driven thousands and thousands of times over for each iteration of their software.
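The replay idea described above can be sketched roughly like this (a toy illustration with hypothetical planner functions, not any company's actual pipeline): recorded scenarios become a fixed regression suite, and each new software version is re-run against them before it ever touches a road.

```python
# Rough sketch of log-replay regression testing (all names and logic
# hypothetical; real AV pipelines are vastly more complex).

def plan_v1(obstacle_distance_m: float) -> str:
    # Old planner: only brakes when very close to an obstacle.
    return "brake" if obstacle_distance_m < 10 else "cruise"

def plan_v2(obstacle_distance_m: float) -> str:
    # Updated planner: brakes earlier.
    return "brake" if obstacle_distance_m < 25 else "cruise"

# Recorded logs from real drives: (sensor distance reading, the action
# considered correct in that moment).
recorded_logs = [(5.0, "brake"), (20.0, "brake"), (80.0, "cruise")]

def replay(planner, logs):
    """Re-run a planner over recorded scenarios; count correct decisions."""
    return sum(planner(dist) == expected for dist, expected in logs)

print(replay(plan_v1, recorded_logs))  # 2 of 3 scenarios handled
print(replay(plan_v2, recorded_logs))  # 3 of 3 scenarios handled
```

The design point is that the same real-world miles get re-used for every software iteration, which is why the effective tested mileage is far larger than the physical mileage.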
You don’t understand why people on Lemmy, an alternative platform not controlled by corporations, might not want to get in a car literally controlled by a corporation?
I can easily see a future where your car locks you in and drives you to a police station if you do something “bad”.
As to their safety, I don't think there are enough AVs to really judge this yet; of course Cruise's website will claim Cruise AVs cause fewer accidents.
I can imagine in the future there will be gridlocks in front of the police station with AV cars full of black people when the cops send out an APB with the description of a black suspect.
We’ve seen plenty of racist AI programs in the past because the programmers, intentionally or not, added their own bias into the training data.
Any dataset sourced from human activity (eg internet text as in Chat GPT) will always contain the current societal bias.
You're putting words in my mouth. I wasn't talking about people on Lemmy not wanting to get into one of these vehicles.
The people here don't seem to want anyone getting into these vehicles. Many here are advocating for an all-out ban on self-driving cars and demand that they're polished to near perfection on closed roads before being allowed for public use, even when the little statistics we already have mostly seem to indicate these are at worst as good as human drivers.
If it's about Teslas, the complaint is often the lack of LiDAR and radar; when it's about Cruise, which has both, it's then apparently about corruption. In both cases the reaction tends to be mostly emotional, and that's why every time one provides statistics to back up the claims about safety, it just gets called marketing bullshit.
Honestly? I don’t want anyone to use AVs because I fear they will become popular enough that eventually I’ll be required to use one.
I honestly haven’t done enough research on AV safety to feel comfortable claiming anything concrete about it. I personally don’t feel comfortable with it yet since the technology is very new and I essentially need to trust it with my life. Maybe in a few years I’ll be more convinced.
I hear you. I love driving and I have zero interest in buying a self-driving vehicle. However, I can still stand outside my own preferences and look at it objectively enough to see that it's just a matter of time until AI gets so good at it that it could be considered irresponsible to let a human drive. I don't like it, but that's progress.
Fine by me, as long as the companies making the cars take all responsibility for accidents. Which, you know, the human drivers do.
But the car companies want to sell you their shitty autonomous driving software and make you be responsible.
If they don't trust it enough, why should I?
I saw a video years ago discussing this topic.
How good is “good enough” for self-driving cars?
The bar is much higher than it is for human drivers because we downplay our own shortcomings and think that we have less risk than the average driver.
Humans can be good drivers, sure. But we have serious attention deficits. This means it doesn’t take a big distraction before we blow a red light or fail to observe a pedestrian.
Hell, a lot of humans fail to observe and yield to emergency vehicles as well.
None of that is newsworthy, but an autonomous vehicle failing to yield is.
My personal opinion is that the Cruise vehicles are as ready for operational use as Tesla's FSD, i.e. they should not be allowed.
Obviously corporations will push to be allowed so they can start making money, but this is probably also the biggest threat to a self-driving future.
Regulated so strongly that humans end up being the ones in the driver's seat for another few decades - with the cost in human lives which that involves.
I'm not gonna join in the discussion, but if you cite numbers, please don't link to the advertising website of the company itself. They have a strong interest in cherry-picking the data to make positive claims.
They can't come quickly enough for me. I can go to work after a night out without fear I might still be over the limit. I won't have to drive my wife everywhere. Old people will not be prisoners in their own homes. No more nobheads driving about with exhausts that sound like a shootout with the cops. No more arseholes speeding about and cutting you up. No more hit and runs. Traffic accident numbers falling through the floor. In fact, it could even get to a point where the only accidents are the fault of pedestrians/cyclists not looking where they are going.
All of these are solved by better public transport, safe bike routes, and more walkable city designs. All of which we can do now, instead of relying on some shiny new tech just to keep car companies' profits up.
The day I can get in a car and not be simultaneously afraid of my own shortcomings and the fact that there are strangers driving massive projectiles around me is a day I will truly celebrate. The fact is that automobiles are weapons, and I don't want to be the one wielding it when a single mistake can cost an entire family their lives, although I would like to be there to slam on the brakes and prevent it if needed.
The possibilities really are endless.
When the light turns green, the entire row of cars can start moving at the same time, like in motorsports. Perhaps you don't even need traffic lights, because they can all just drive into the intersection at the same time and keep barely missing each other but never crash, due to the superior reaction times and processing speeds of computers. You could also let your car taxi other people around when you don't need it.
What if we tied that entire row of cars together as one unit so we could save cost on putting high end computers in each car? Give them their own dedicated lane because we will never have 100% fully autonomous cars on the road unless we make human drivers illegal.
I'll call my invention a train.
I think you might need lights for pedestrians at crossings.
I did wonder if ambulances would need sirens but again, pedestrians!
Just ban pedestrians. Problem solved.
For me it's because they're controlled by a few evil companies. I'm not against them in concept. Human drivers are the fucking worst.