Tesla Whistleblower Says ‘Autopilot’ System Is Not Safe Enough To Be Used On Public Roads

“It affects all of us because we are essentially experiments in public roads.”

245 points

I lost all trust in their ‘Autopilot’ the day I read that Musk said (paraphrasing), “All we need are cameras; there’s no need for secondary/tertiary LIDAR or other expensive setups.”

Like TFYM? No backups?? Or backups to the backups?? On a life fucking critical system?!

109 points

or other expensive setups

As much as I lost trust in his bullshittery a long time ago, his need to mention the cost of critical safety systems is what stuck out to me the most here. That’s how you know the priorities are backwards.

94 points

Also, my robot vacuum has LiDAR. It’s not expensive relative to a car.

34 points

Hell, even iPhones have LiDAR on the Pro models. The tech is not very expensive, especially not for an $80,000 car.

My partner’s econobox has lidar for its cruise control, but Tesla can’t seem to figure out how to make it work.

5 points

It’s actually gotten cheaper since they figured out how to make it solid state.

30 points

Skimping on cost is how disasters happen. Ask Richard Hammond. “Spared no expense” my ass, hire more than 2 programmers, you cheap fuck.

Edit: This was supposed to be a Jurassic Park reference, but my dumb ass mixed up John Hammond and Richard Hammond. That’s what I get for watching Top Gear and reading at the same time.

5 points

Were Richard Hammond’s many crashes a result of cost skimping? If so, I had no idea. Could you elaborate?

3 points

As someone who hasn’t much watched Top Gear, I was cracking up at your Jurassic Park reference until I saw your edit and was like “Wait a minute.”

Top Gear? Jurassic Park? Either way: Hold on to your butts.

😆

101 points

The crazier and stupider shit is that part of his justification was “people drive and they only have eyes. We should be able to do the same.”

It’s a stunningly idiotic justification, and yet here we are with millions of these “eyes only” Teslas on the road.

43 points

That’s terrifying for showing how little he understands about the problem he is attempting to solve.

Humans use up to four senses at times to accomplish the task of driving.

@mosiacmango
@cm0002

31 points

I can add more; we don’t only have five senses. That’s elementary-school propaganda. Here are all the ones I can think of that come into play while driving:

  1. Vision
  2. Hearing
  3. Tactile feedback from the wheel and pedals. You could break this down further into skin pressure receptors and receptors of muscle tension, though muscle-tension and stretch receptors are also involved in number 4
  4. Proprioception, where your limbs and body are in space
  5. Rotational acceleration (semi circular canals)
  6. Linear acceleration (utricle and saccule)
  7. Smell. Okay, this might be a stretch, but some engine issues can be smelly

And that doesn’t even consider higher-order processing and the actual integration of all these things, which AI, despite all its recent gains, can’t match: the brain’s ability to integrate all that information and deal with novel stimuli. Point is, Elon, add more sensors to your dang cars so they’re less likely to kill people. And people aren’t even perfect at driving, so why would we limit a car to only our senses anyway? So dumb.

20 points

Licking the steering wheel makes it five

23 points

Reminds me of Mao not brushing his teeth, because tigers didn’t brush theirs either.

13 points

Did he also eat his meat raw and sleep in trees?

45 points

Ah, but you see, his reasoning is: what if the camera and lidar disagree, then what? With only a camera-based system, there is only one truth and no conflicts!

Like when the camera sees the broad side of a white truck as clear skies and slams right into it, there was never any conflict anywhere, everything went just as it was suppo… Wait, shit.

30 points

sees the broad side of a white truck as clear skies and slams right into it

RIP Joshua Brown:

The truck driver, Frank Baressi, 62, told the Associated Press that the Tesla driver Joshua Brown, 40, was “playing Harry Potter on the TV screen” during the collision and was driving so fast that “he went so fast through my trailer I didn’t see him”.

13 points

“he went so fast through my trailer I didn’t see him”

Lidar would still prevail over stupidity in this situation. It does a better job detecting massive objects cars can’t go through.

5 points

The in-car system shouldn’t allow you to watch a movie wtf

0 points

what if the camera and lidar disagree, then what?

This (sensor fusion) is a valid issue in mobile robotics. Adding more sensors doesn’t necessarily improve stability or reliability.

25 points

After a point, yes. However, that point only comes once you are adding more than a second sensor type to the system. The correct answer is to work a weighting system into your algorithm so the car can decide which sensor it trusts not to kill the driver, i.e. if the LIDAR sees the broadside of a trailer and the camera doesn’t, the car should believe the LIDAR over the camera, as applying the brakes is likely the safer option compared to speeding into the obstacle at 60 mph.
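
A minimal sketch of that kind of weighting (hypothetical sensor names, weights, and thresholds, purely illustrative and not any automaker’s actual logic):

```python
# Toy confidence-weighted fusion: bias toward the sensor whose miss is most
# dangerous. All names, weights, and thresholds here are made up for illustration.
from dataclasses import dataclass

@dataclass
class Detection:
    obstacle_ahead: bool  # did this sensor report an obstacle in the lane?
    confidence: float     # sensor's own confidence estimate, 0.0 to 1.0

def should_brake(lidar: Detection, camera: Detection,
                 lidar_weight: float = 0.7, camera_weight: float = 0.3,
                 threshold: float = 0.5) -> bool:
    """Weight LIDAR above the camera: a missed obstacle costs far more than a
    spurious brake, so disagreements resolve toward braking."""
    score = 0.0
    if lidar.obstacle_ahead:
        score += lidar_weight * lidar.confidence
    if camera.obstacle_ahead:
        score += camera_weight * camera.confidence
    return score >= threshold

# The trailer scenario: LIDAR sees the broadside, the camera reads it as open sky.
print(should_brake(Detection(True, 0.9), Detection(False, 0.8)))  # True -> brake
```

In practice the weights would come from measured sensor reliability, but the point is the deliberate bias toward the sensor whose missed detection is most dangerous.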

-5 points

To be fair, humans have proven all you need are visual receptors to navigate properly.

7 points

To be fair, current computers / AI / whatever marketing name you call them aren’t as good as human brains.

-8 points

No, but they can be improved to the point where all that’s necessary are cameras and the means to control the vehicle.

4 points

Visual receptors… And 3-dimensional vision with all the required processing and decision making behind that based on the visual stimuli, lol.

2 points

  1. And how many vehicle accidents and deaths are there today? Proven that humans suck at driving, maybe.

  2. No, we don’t; we use sight, sound, and touch/feeling to drive, at a minimum.

-1 points

Touch? Sure, barely. But you can drive without being able to hear.

I’d also wager you can get a license if you have that rare disease that prevents you from feeling. Since, you know, how little we use touch and hearing to drive.

But hey? Maybe I’m wrong. Maybe you can provide a source that says you can’t get licensed if you have that disease or if you’re deaf. That would prove your point. Otherwise, it proves mine.

-13 points
Deleted by creator
25 points

Uhhhh…

…any level 4 car actually, according to the federal governments and all the agencies who regulate this stuff.

NAVYA, Volvo/Audi, Mercedes, Magna, Baidu, Waymo.

Tesla isn’t even trying to go past level 3 at this point.

6 points
Deleted by creator
17 points

A 2014 Infiniti can drive itself more safely on the highway than a Tesla. The key here is that they didn’t lie about the car’s capabilities, so they didn’t encourage complacency.

In the city though, yeah you’ll need to look at other level 4 cars.

3 points

What brand of car has better autopilot with other sensors?

All of them. The other automakers didn’t fire their engineers during a hissy fit.

-17 points

Not to be a hard-on about it, but if the cameras have any problem autopilot ejects gracefully and hands it over to the driver.

I ain’t no Elon dick rider, but I got FSD, and the radar would see manhole covers and freak the fuck out. It was annoying as hell and pissed my wife off. The optical depth estimation is now far more useful than the radar sensor.

Lidar has severe problems too. I’ve used it many times professionally for mapping spaces. Reflective surfaces fuck it up. It delivers bad data frequently.

Cameras will eventually be great! Really, they already are, but they’ll get orders of magnitude better. Yeah, 4 years ago the AI failed to recognize a rectangle as a truck, but it ain’t done learning yet.

That driver really should have been paying attention. The car fucking tells you to all the time.

If a camera has a problem, the whole system aborts.

In the future this will mean the car will pull over, but it’s, as it makes totally fucking clear, in beta. So for now it aborts and passes control to the human who is paying attention.

17 points

So I drive a Tesla as well. Quite often I get the message that a camera is blocked by something (like sun, fog, or heavy rain).

You can’t have a reliable self-driving system if that is the case.

Furthermore, isn’t it technically possible to train the lidar and radar with AI as well?

3 points

Furthermore, isn’t it technically possible to train the lidar and radar with AI as well?

Of course it is; functionally, both the camera and lidar solutions work in vector space. The big difference is that a camera feed holds a lot more information, beyond simple vector space, to feed AI training than a lidar feed ever will.

12 points

any problem autopilot ejects gracefully and hands it over to the driver.

Gracefully? LMAO

You can come back when it gives at least 3 minutes warning time in advance, so that I can wake up, get my hands out of the woman, climb into the driver seat, find my glasses somewhere, look around where we are, and then I tell that effing autopilot that it’s okay and it is allowed to disengage now!

0 points

Yes, that’s exactly how autopilots in airplanes work too… 🙄

I think camera-based FSD will get there, but I also think additional sensors are needed (perhaps not necessarily lidar) to increase safety, and, like the article states… a shitload more testing before it’s allowed on public roads. But let’s be reasonable about how the autopilot can disengage.

8 points

Starting off with 3D data will always be better than inferring it. Go fire up Adobe After Effects and do a 3D track and see how awful it is; now that same awful process drives your car.

The AI argument falls short too, because that same AI will be better if it just starts off with mostly complete 3D data from lidar and sonar.
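
As a rough illustration of why inferred depth degrades with distance while a direct range measurement stays roughly constant, here is a back-of-the-envelope sketch using the standard stereo-triangulation error relation; the focal length, baseline, disparity error, and lidar figure are all assumed numbers, not anyone’s real specs:

```python
# Illustrative only: how depth error scales for inferred (stereo) 3D versus a
# direct range measurement. All parameter values below are assumptions.

def stereo_depth_error(z_m: float, focal_px: float = 1000.0,
                       baseline_m: float = 0.3, disparity_err_px: float = 0.5) -> float:
    """Stereo-triangulation error growth: dz ≈ z² · Δd / (f · B).
    Depth error grows with the square of distance."""
    return (z_m ** 2) / (focal_px * baseline_m) * disparity_err_px

LIDAR_RANGE_ERROR_M = 0.03  # direct range measurement: a few cm, roughly constant with range

for z in (10, 50, 100):
    print(f"{z:>3} m ahead: inferred ≈ ±{stereo_depth_error(z):.2f} m, "
          f"measured ≈ ±{LIDAR_RANGE_ERROR_M:.2f} m")
```

At 100 m the inferred depth is uncertain by several metres under these assumptions, which is the kind of gap the comment above is pointing at.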

1 point

Lidar and sonar are way lower resolution.

Sonar has a hard time telling the difference between a manhole cover, a large highway sign, and a brick wall.

4 points

ejects gracefully and hands it over to the driver

This is exactly the problem. If I’m driving, I need to be alert to the driving tasks and what’s happening on the road.

If I’m not driving because I’m using autopilot, … I still need to be alert to the driving tasks and what’s happening on the road. It’s all of the work with none of the fun of driving.

Fuck that. What I want is a robot chauffeur, not a robot version of everyone’s granddad who really shouldn’t be driving anymore.

1 point

After many brilliant people trying for decades, it seems you can’t get the robot chauffeur without several billion miles of actual driving data, sifted and sorted into what is safe, good driving and what is not.

2 points

ejects gracefully and hands it over to the driver.

Just in time to slam you into an emergency vehicle at 80… but hey… autopilot wasn’t on during the impact, so not Musk’s fault.

0 points

Nah, with hands on the wheel, looking at the road, the driver, who agrees they will pay attention, will have disengaged the system long before it gets to that point.

The system’s super easy to disengage.

It’s also getting better every year.

Five years ago my car could barely change lanes on the highway. Now it navigates left turns at 5-way lighted intersections in big-city traffic, with idiots blocking the intersection and suicidal cyclists running red lights, as well as it used to change lanes on the highway… And highway lane changes are extremely reliable. Can’t remember my last lane-change disengagement. Same car; just better software.

I bet 5 years from now it’ll be statistically safer than humans… Maybe not the same car. Hope it’s my car too, but it’s unclear whether that processor is sufficient…

Anyway, it’ll keep improving from there.

1 point

good thing regular cameras aren’t affected by reflective surfaces

oh wait

