WaPo journalist verifies that robotaxis fail to stop for pedestrians in a marked crosswalk 7 out of 10 times. Waymo admitted that it follows “social norms” rather than laws.

The likely reason: to compete with Uber. 🤦

Wapo article: https://www.washingtonpost.com/technology/2024/12/30/waymo-pedestrians-robotaxi-crosswalks/

Cross-posted from: https://mastodon.uno/users/rivoluzioneurbanamobilita/statuses/113746178244368036

86 points

People, and especially journalists, need to get this idea of robots as perfectly logical computer code out of their heads. These aren’t Asimov’s robots we’re dealing with. Journalists still cling to the idea that all computers are hard-coded. You still sometimes see people navel-gazing on self-driving cars, working the trolley problem. “Should a car veer into oncoming traffic to avoid hitting a child crossing the road?” The authors imagine that the creators of these machines hand-code every scenario, like a long series of if statements.

But that’s just not how these things are made. They are not programmed; they are trained. In the case of self-driving cars, they are fed a mass of video footage and radar records, together with the driver inputs that accompanied those conditions, and the model is trained to map the camera and radar inputs to whatever the human drivers did.
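To make that concrete, here’s a minimal behavioral-cloning sketch in PyTorch. Everything in it (the feature sizes, the network shape, the driving_log) is a made-up stand-in for illustration, not Waymo’s actual pipeline, but it shows the shape of the approach:

```python
import torch
import torch.nn as nn

# Stand-in "driving log": random tensors in place of real fused
# camera/radar features and the human driver's recorded actions.
driving_log = [(torch.randn(512), torch.randn(2)) for _ in range(100)]

# The policy network maps sensor features to control outputs.
policy = nn.Sequential(
    nn.Linear(512, 256),  # 512 = stand-in size for fused sensor features
    nn.ReLU(),
    nn.Linear(256, 2),    # 2 outputs: e.g. steering angle, brake/throttle
)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

for sensor_features, human_action in driving_log:
    predicted_action = policy(sensor_features)
    # The only training signal: how far was this from what the human did?
    loss = loss_fn(predicted_action, human_action)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Notice there is no rule about crosswalks, or any law at all, anywhere in that loop. The only signal is whatever the humans in the logs did.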

This behavior isn’t at all surprising. Self-driving cars, like any similar AI system, are not hard-coded, coldly logical machines. They are trained on us, on our responses, and they exhibit all of the mistakes and errors we make. The reason Waymo cars don’t stop at crosswalks is that human drivers don’t stop at crosswalks. The machine is simply copying us.

75 points

All of which takes you back to the headline: “Waymo trains its cars to not stop at crosswalks.” The company controls the input, so it needs to be responsible for the results.

33 points

Some of these self-driving car companies have successfully lobbied to stop cities from ticketing their vehicles for traffic infractions. They claim these cars are so much better than human drivers, yet they won’t stand behind that claim; instead, they demand special rules for themselves and no consequences.

35 points

The machine can still be trained to actually stop at crosswalks, the same way it is trained not to collide with other cars even though people do that.

18 points

I think the reason non-tech people find this so difficult to comprehend is a poor understanding of which problems are easy for (classically programmed) computers to solve and which are hard.

if ( person_at_crossing ) then { stop }

To the layperson it makes sense that self-driving cars should be programmed this way. After all, this is a trivial problem for a human to solve. Just look, and if there is a person, you stop. Easy peasy.

But for a computer, how do you know? What is a ‘person’? What is a ‘crossing’? How do we know if the person is ‘at/on’ the crossing as opposed to simply near it or passing by?
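To make that concrete, here’s a sketch of what the “easy” check actually rests on, assuming a perception system that emits detections with confidences (every name and threshold below is invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str                  # detector's best guess: "pedestrian", "bicycle", "bin", ...
    confidence: float           # detector's belief in that label, 0.0-1.0
    dist_to_crosswalk_m: float  # how far from the painted lines

def person_at_crossing(detections: list[Detection]) -> bool:
    # The "trivial" boolean from the pseudocode above. Every term in it
    # hides a judgment call: what confidence counts as a person? How
    # close to the paint counts as "at" the crossing?
    return any(
        d.label == "pedestrian"
        and d.confidence > 0.6           # arbitrary threshold
        and d.dist_to_crosswalk_m < 1.5  # arbitrary definition of "at"
        for d in detections
    )
```

The if-statement itself really is trivial; the hard part is everything that has to happen before its condition can even be evaluated.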

To me it’s this disconnect between the common understanding of computer capability and the reality that causes the misconception.

9 points

I think you could liken it to training a young driver who doesn’t share a language with you. You can demonstrate the behavior you want once or twice, but unless all of the observations demonstrate that behavior, you can’t say “yes, we specifically told it to do that.”

5 points

But for a computer, how do you know? What is a ‘person’? What is a ‘crossing’? How do we know if the person is ‘at/on’ the crossing as opposed to simply near it or passing by?

Most crosswalks are marked. The vehicle is able to identify obstructions in the road, and things on the side of the road moving toward it, just as it identifies cross traffic.

If (thing) is crossing the street, stop. If (thing) is stationary near a marked crosswalk, stop, and go if it doesn’t start crossing within (x) seconds.

You know, the same way people are supposed to handle the same situation.
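For what it’s worth, those rules really are easy to write down once you grant yourself the perception outputs. A sketch taking the comment literally (the state names and timing are invented):

```python
def crosswalk_decision(thing_state: str, waited_s: float, patience_s: float = 5.0) -> str:
    """The rules above, taken literally. thing_state is assumed to come
    from a perception system that can already tell a crossing (thing)
    from a waiting one - which is the genuinely hard part."""
    if thing_state == "crossing":
        return "stop"
    if thing_state == "waiting_at_crosswalk":
        # Stop, give them (x) seconds to start crossing, then proceed.
        return "stop" if waited_s < patience_s else "go"
    return "go"
```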

8 points

Most crosswalks in the US are not marked, and in all places I’m familiar with, vehicles must stop or yield to pedestrians at unmarked crosswalks.

At unmarked crosswalks, and at marked but uncontrolled crosswalks, we have to handle the situation with social cues: which direction does the pedestrian want to cross the street/road/highway, and will they feel safer crossing after a vehicle has passed than before (almost always for homeless pedestrians, and frequently for pedestrians in moderate traffic)?

If Waymo can’t figure out whether something intends, or is likely, to enter the roadway, it can’t drive a car. That something could be people at crosswalks, people crossing at places other than crosswalks, blind pedestrians crossing anywhere, deafblind pedestrians crossing even at controlled intersections, kids or wildlife or livestock running toward the road, etc.

3 points

Thing? Like a garbage bin? Or a sign?

3 points

You can use that logic to argue it would be difficult to do the right thing in every case, but we can start with the ideal cases:

  • For a clearly marked crosswalk with a pedestrian in the street, stop.
  • For any pedestrian in the street, stop.
3 points

The difference is that humans (usually) come with empathy, or at least self-preservation, built in. With self-driving cars we aren’t building in empathy and self- (or at least passenger-) preservation; we’re hard-coding the scenarios where the law says they have to do X or Y.

17 points

Whether you call it programming or training, the designers still designed a car that doesn’t obey traffic laws.

People need to get it out of their heads that AI is some kind of magical monkey-see-monkey-do. AI isn’t magic; it’s just a statistical model. Garbage in = garbage out. If the machine fails because it’s only copying us, that’s not the machine’s fault, not AI’s fault, not our fault: it’s the programmer’s fault. It’s fundamentally no different than if they had designed a complicated set of logical rules to follow. Training a statistical model is programming.

Your whole “explanation” sounds like a tech-bro capitalist press-conference sound bite released by a corporation to avoid guilt for running down a child in a crosswalk.

8 points

It’s not apologia. It’s illustrating the foundational limits of the technology, and it’s why I’m skeptical of most machine-learning systems. You’re right that it’s a statistical model. But what people miss is that these models are black boxes. That is the crucial distinction between programming and training that I’m trying to get at. Imagine being handed a 10 million × 10 million matrix of real numbers and being told, “Here, change this so it always stops at crosswalks.” It isn’t just some line of code that can be edited.

The distinction between training and programming is absolutely critical here. You cannot hand-wave that distinction away. These models are trained the way we train animals; they aren’t taught through hard-coded rules.

And that is a fundamental limit of the technology. We don’t know how to program a computer to drive a car. We only know how to make a computer mimic human driving behavior. That means the computer can ultimately never perform better than an attentive, sober human, beyond some gains in reaction time and visibility. And any errors that humans frequently make will be duplicated in the machine.

-4 points

It’s obvious now that you literally don’t have any idea how programming or machine learning works, and thus you think no one else does either. It is absolutely not some “black box” where the magic happens. That attitude (combined with your oddly misplaced condescension) is toxic and honestly kind of offensive. You can’t hand-wave away responsibility like this when doing any kind of engineering. That’s like first-day ethics-101 shit.

7 points

That all sounds accurate, but what difference does it make how the shit works if the real-world results are poor?

5 points

It’s telling that Tesla and Google, together worth over 3 trillion dollars, haven’t been able to solve these issues.

3 points

Training self-driving cars that way would be irresponsible, because the car would behave unpredictably and could be really dangerous. In reality, self-driving cars use AI only for the tasks it’s really good at, like object recognition (e.g., recognizing traffic signs, pedestrians, and other vehicles). The car uses all this data to build a map of its surroundings and tries to predict what the other participants are going to do. Then it decides whether it’s safe to move the vehicle, and the path it should take. All of this can be done algorithmically; AI is only necessary for the object recognition.
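As a sketch of that architecture (all the types and numbers here are invented, not any vendor’s actual code), the decision layer can be plain deterministic code sitting on top of the ML detector’s output:

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    kind: str            # from the ML detector: "pedestrian", "car", "sign", ...
    dist_ahead_m: float  # distance ahead along our planned path
    closing_mps: float   # speed at which it is closing on our path

def plan(objects: list[TrackedObject], cruise_mps: float) -> str:
    """Deterministic planner over ML-detected objects: the recognition
    is learned, but the driving decision is ordinary code."""
    for obj in objects:
        # Simple constant-velocity prediction: where is it in 2 seconds?
        predicted_dist = obj.dist_ahead_m - obj.closing_mps * 2.0
        if obj.kind == "pedestrian" and predicted_dist < 10.0:
            return "brake"  # a hard-coded rule, not a learned habit
    return f"cruise at {cruise_mps:.1f} m/s"

# e.g. plan([TrackedObject("pedestrian", 20.0, 8.0)], 13.0) -> "brake"
```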

In cases such as this, just follow the money to find the incentives. Waymo wants to maximize their profits. This means maximizing how many customers they can serve as well as minimizing driving time to save on gas. How do you do that? Program their cars to be a bit more aggressive: don’t stop on yellow, don’t stop at crosswalks except to avoid a collision, drive slightly over the speed limit. And of course, lobby the shit out of every politician to pass laws allowing them to get away with breaking these rules.

2 points

According to some cursory research (read: Google), obstacle avoidance uses ML to identify objects, and uses those identities to predict their behavior. That stage leaves room for the same unpredictability, doesn’t it? Say you only have 51% confidence that a “thing” is a pedestrian walking a bike, 49% that it’s a bike on the move. The former has right of way and the latter doesn’t. Or even 70/30. 90/10.

There’s some level at which you have to set the confidence threshold to choose a course of action, and you’ll be subject to some ML-derived unpredictability as confidence fluctuates around it… right?
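A sketch of that threshold problem (numbers invented), which also anticipates the reply below: one defensible answer is to refuse to act on a near coin flip and default to the most protected class:

```python
def planning_class(p_pedestrian: float, p_cyclist: float) -> str:
    # When the detector can't really decide (51/49, 70/30...), don't act
    # on a coin flip: inside an ambiguity band, default to the class
    # that demands the most caution from the car.
    if abs(p_pedestrian - p_cyclist) < 0.2:  # arbitrary ambiguity band
        return "pedestrian"  # err toward the more protected road user
    return "pedestrian" if p_pedestrian > p_cyclist else "cyclist"
```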

1 point

In such situations, the car should take the safest action and assume it’s a pedestrian.

66 points

I’m sure a strong legal case can be made here.

An individual driver breaking the law is bad enough, but the legal system can be “flexible” there, because it’s hard to enforce the law against a generalized (bad) social norm, and each individual lawbreaker can argue their individual case, etc.

But a company systematically breaking the law on purpose is different. Scale matters here. There are no individualized circumstances, and no crying to a judge that the fine will leave a single mother unable to pay rent this month. This is systematic and premeditated. Inexcusable in every way.

Like, a single cook forgetting to wash their hands once after going to the bathroom is gross, but a franchise chain building a business model around adding small quantities of poop to the food is indefensible.

19 points

I really want to agree, but conservative Florida ruled that people don’t have the right to clean water, so I doubt the conservative Supreme Court will think we have the right to safe crosswalks.

12 points

I am not intimately familiar with your country’s legal conventions, but there is already a law (pedestrians having priority in crosswalks) that is being broken here, right?

7 points

Driving laws are broken by human drivers every day. The speed limit is secretly +15, rolling stops at stop signs are standard, and many treat a right turn on red as a yield instead. It’s so common and normalized that actually enforcing all the driving laws now would take a massive increase in the number of police doing traffic control, assisted by cameras throughout the city to catch speeding and red-light running.

The truth is, North America has no interest in making its roads safer; you can see that in the way the roads are designed: vehicle speed and throughput above all else. North America has seen increasing pedestrian deaths over the last several years, while the rest of the developed world has seen decreasing pedestrian deaths.

1 point

Well put 👏

51 points

I remember seeing a video from inside a Waymo waiting to make a left against traffic.

It turned the wheel before moving, in anticipation of the turn, which is normal for most drivers I see on the road.

It’s also the exact opposite of what you should do for safety and legality.

Keep the wheel straight until you’re ready to move; turning the wheel early means that if someone rear-ends you, you get pushed into traffic instead of along your current lane.

It’s the social norm, not the proper move.

23 points

I was involved in a crash many years ago where exactly this pushed the car in front of us into an oncoming car. We were stopped behind a car indicating to turn, got hit from behind by a bus (going quite fast), and were pushed hard into the car in front, which ended up getting smashed from behind and in front.

Don’t turn your wheel until you’re ready to move, folks.

5 points

I haven’t driven in a bit, so thank you for that reminder. It’s scary that people instinctively do that.

2 points

Can’t be good for your car either to be turning the wheels while stopped.

3 points

On a similar note, I’ve noticed the Waymos don’t start their turns when there’s a pedestrian in the crosswalk, whereas I see human drivers do that very often.

1 point

A left against traffic? Left turns don’t go against traffic. That’s right turns.

38 points

The recent Not Just Bikes video about self-driving cars is really good on this subject. Very dystopian.

36 points

And again… if I break the law, I get a large fine or go to jail. If companies break the law, at worst they get a small fine.

Why does this disconnect exist?

Am I so crazy to demand that companies are not only treated the same, but held to a higher standard? If I don’t stop at a zebra crossing, that is me breaking the law once. Waymo programming their cars not to stop is multiple violations per day, every day. It’s a company deciding it’s above the law because it wants more money. It’s a company deciding to risk the lives of others to earn more money.

For me, all managers and engineers who signed off on this and worked on it should be jailed, the company should be barred from doing business for a month and required to immediately ensure all laws are followed, or else…

This is the only way we get companies to follow the rules.

Instead, though, we just ask companies to treat laws as suggestions, sometimes requiring small payments if they cross the line too far.

13 points

Why does this disconnect exist?

Because the companies pay the people who make the law.

Stating the obvious here, but it’s the sad truth.

9 points

Funny that you don’t mention company owners or directors, who are supposed to oversee what happens, are in practice the ones putting on the pressure to make it happen, and are the ones liable before the law.

1 point

I thought that was obviously implied.

If the CEO signed off on whatever is illegal, jail him or her too.

1 point

Do you have an example of a company getting a smaller fine than an individual for the same crime? Generally company fines are much larger.

