Here we go. Autonomous vehicles are now mowing down cyclists.
I hope that Waymo’s insurance is good!
The car’s driving AI is becoming more human-like in running down cyclists; that’s concerning.
Given the choice between the full range of human drivers on the roads or only good self-driving cars, give me the self-driving cars.
But they aren’t there yet; they’re still being developed.
Sure, you want to get rid of cars on the roads, I get that. But frankly it won’t happen, ever. Trying to stop self-driving cars is trying to stop a future where you don’t have to worry about some distracted driver killing you.
Self-driving cars are going to be the only thing that gets me on most of the roads that have no designated cycle lane.
I’ll take a good human driver over a good self-driving car. Humans can anticipate with foresight in a way that autonomous vehicles can’t.
For example, there are areas or times of the day where it might be common for pedestrians to walk out from behind a parked vehicle. A good human driver would know this, and drive defensively.
A self-driving car only knows how to react to what it sees. And it can often get it wrong in certain situations. There are quite a few videos online of Teslas wanting to steer into danger or ignoring traffic stops.
I think the only way that self-driving cars can work is if they are on designated roads (i.e., highways) with no random events like human drivers, cyclists, or kids nearby.
But I really think it’s irresponsible for our governments to allow beta technology on public roads. There is no real accountability when it fails. Maybe a small fine or settlement in court, but that’s about it.
How do you ensure all human drivers are good ones and not distracted? Those are the roads I want to be on. If you know how to do that, countries around the world want to hear from you.
Waymo actually seems very cautious. Being overly cautious was actually a known issue, especially at the start. You can also programme it to be cautious at certain points.
We are talking self-driving cars, think Waymo. We are not talking about lane assist; what Tesla does is irrelevant. I was also talking about the tech when it’s more developed, but right now, at this moment in time, it is already safer than drivers in the US.
According to Waymo, the company’s vehicle fully stopped at a four-way intersection before proceeding into the intersection as a large truck was driving through in the opposite direction. “The cyclist was occluded by the truck and quickly followed behind it, crossing into the Waymo vehicle’s path,” the company said in a statement. “When they became fully visible, our vehicle applied heavy braking but was not able to avoid the collision.”
So what I’m hearing is that the cyclist was hidden behind a truck until the last second. Would a standard driver have been able to see the cyclist? It applied the brakes as soon as it saw the cyclist; not sure what else they expected it to be able to do.
So basically the car gunned it trying to shave .02 seconds off the drive? I mean, how fast of an acceleration did you need to hit someone not “fully visible” behind a truck?
I was actually curious about this so I started looking into it, and this article doesn’t do it justice. Most articles on it give a clearer picture of how the intersection was laid out.
The vehicle definitely didn’t gun it to race through the intersection. It started moving as soon as it was clear that the truck entering the intersection was going straight and not turning. However, the cyclist behind the truck didn’t stop at the intersection like the truck did, and continued following behind it until deciding to blindly turn left.
I really don’t think that was the fault of the machine, and I think a human driver would have done the same thing and really might not have stopped in time. I think this is a clear no-fault or cyclist-fault situation because the machine followed road laws. I’m not sure why the cyclist would decide to blindly turn left in a four-way intersection, knowing that the opposite side can go at the same time.
A non-negligent driver would practice defensive driving: check that no vehicle is coming up behind the truck, and only then apply the accelerator.
This is just a lame excuse to avoid responsibility.
When handling a >2-ton machine capable of speeds over 30 km/h, you have to be that careful.
I recommend stripping negligent drivers of their driving license and forcing them to relearn and apply again.
The bots can’t let humans have all the fun.
I hate the lane assist in our car, I had to shut it off. It kept steering the car back toward obstacles I was trying to avoid, like trucks with wide loads, potholes, and cyclists. I don’t know why anyone thinks that shit is a good idea. I’m sure driverless cars will make the same stupid mistakes.
I’ve never had issues with it because it shuts off when indicating in the direction that you are turning.
Mine disconnects when I steer. Both my cars will just beep at you for actively steering across a line, though you do need to overpower the machine’s steering.
Yeah that “machine’s steering” is what pisses me off (and got shut off).
I’ve been driving for decades. Hundreds of thousands of miles. Everything from small cars to pickups towing 30’ trailers, and motorhomes towing cars. I have ONE at-fault accident in my record during all that time (which likely wouldn’t have been helped by the nanny features). I do not need the help.
I’m not saying I’m perfect; I make mistakes, too. But the fucking false alarms and shit from the car are distracting. I almost hit the brakes the first time I got a false alarm, because I thought there was something seriously wrong with our new car. It’s training me to ignore the alarm, so when I actually do make a mistake it catches, I’ll probably just ignore it - making them useless.