10 points

This is the best summary I could come up with:


Back in 2016, Tesla CEO Elon Musk stunned the automotive world by announcing that, henceforth, all of his company’s vehicles would be shipped with the hardware necessary for “full self-driving.” You will be able to nap in your car while it drives you to work, he promised.

But while Musk would eventually ship an advanced driver-assist system that he called Full Self-Driving (FSD) beta, the idea that any Tesla owner could catch some z’s while their car whisks them along is, at best, laughable — and at worst, a fatal mistake.

Since that 2016 announcement, hundreds of fully driverless cars have rolled out in multiple US cities, and none of them bear the Tesla logo.

His supporters point to the success of Autopilot, and then FSD, as evidence that while his promises may not exactly line up with reality, he is still at the forefront of a societal shift from human-powered vehicles to ones piloted by AI.

You’ll also hear from a former Tesla employee who was fired after posting videos of FSD errors, experts who compare the company’s self-driving efforts to its competitors, and even from the competitors themselves — like Kyle Vogt, CEO of the General Motors-backed Cruise, who is unconvinced that Musk can fulfill his promises without rethinking his entire hardware strategy.

Listen to the latest episode of Land of the Giants: The Tesla Shock Wave, a co-production between The Verge and the Vox Media Podcast Network.


The original article contains 497 words, the summary contains 236 words. Saved 53%. I’m a bot and I’m open source!

16 points

I’ve been ranting about this since 2016.

Having consumer trust in developing AI vehicles is hard enough without this asshole’s ego and lies muddying the water.

7 points

TBF, we did achieve an FSD that is safer than one human this year. But we took away grandma’s driver’s license, so now we have to find another human who’s worse than FSD.

2 points

I live in a small town with a large college. The students just came back for fall semester. I believe we have quite a few candidates for your list.

15 points

Wow. Impressive collection.

Somehow reminds me of Jehovah’s Witnesses and the end of the world :-)

2 points

Lmaooo too real

16 points

I wonder how much impact there might have been on code quality when Elon forced lead devs from their projects at Tesla to work on Twitter. I’ve never seen a situation like that turn out well for either party.

-17 points

I wonder how this statistically compares to non-Tesla crashes?

Edit: quick Google/math shows the average rate of fatal automobile crashes at 12 per 100,000 drivers. Tesla has supposedly sold 4.5 million cars. 4.5 million divided by the 17 deaths from the article = roughly 1 death per 265,000 Tesla drivers.

This isn’t exactly apples-to-apples, and I’d love for someone to “do the math” more accurately, but it seems like Tesla is much safer than a standard driver.

The other confounding factor is that we don’t know how many of these drivers were abusing Autopilot by cheating the rules (it requires hands on the wheel and full attention on the road).
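Redoing that division as a sanity check (same assumed figures as the comment; note this is still not apples-to-apples, since it mixes an annual per-driver rate with cumulative deaths per car sold):

```python
# Back-of-envelope check of the numbers above. These are the comment's own
# assumed figures, not verified data, and the units don't match:
# the US figure is per driver per YEAR, the Tesla figure is cumulative.
US_FATAL_RATE = 12 / 100_000      # assumed: fatal crashes per driver per year
TESLA_SOLD = 4_500_000            # assumed: cumulative Teslas sold
AP_DEATHS = 17                    # deaths cited in the article

tesla_rate = AP_DEATHS / TESLA_SOLD
print(f"naive Tesla rate: ~1 death per {TESLA_SOLD // AP_DEATHS:,} cars sold")
print(f"US baseline:      {US_FATAL_RATE:.5f} per driver-year")
```

The division actually comes out near 1 in 265,000, not 1 in 200,000, though the mismatched units make either number hard to interpret.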

17 points

Your statistical analysis is so bad that it’s not even wrong. It’s just a pile of disparate data strung together with false assumptions.

So all of those Teslas were sold in America? And all 4.5 million of those Teslas have Autopilot? And they’re in Autopilot mode 100% of the time?

-14 points

Fix it for me then daddi

9 points

You forgot the most important issue: Tesla drivers are not representative of the average driver. They have more money and more education. They live in places with nicer weather. These all contribute to lower crash rates without self-driving. I bet high-end Mercedes have lower crash rates too, because people don’t defer maintenance and then drive them crazily in the snow.

Compare apples to apples and I bet Teslas have average crash rates for luxury cars.

1 point

Whatabout what your mom does, down by the docks at night?

15 points

It is not a valid comparison. Many deaths happen in bad weather or on bad roads. Tesla’s self-driving will not even turn on in those conditions. I do not believe apples-to-apples data exists.

5 points

The true comparison is miles per accident. Fatal accidents will be higher for older-model cars. Not all Tesla cars have FSD, and in many situations FSD is not available even on equipped cars. Nothing in the current data indicates that Tesla FSD is safer or more dangerous than the median driver.
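This comment's per-mile framing can be sketched in a few lines. The human-driver inputs below roughly match public US annual totals; the FSD inputs are pure placeholders, since (as the comment says) no apples-to-apples dataset exists, so the printed comparison illustrates only the setup, not a conclusion:

```python
# Sketch of a per-mile fatality comparison. Human inputs approximate US
# annual totals; the FSD deaths/miles figures are invented placeholders.
def deaths_per_100m_miles(deaths: int, miles: float) -> float:
    """Fatalities normalized per 100 million vehicle-miles traveled."""
    return deaths / (miles / 100_000_000)

human = deaths_per_100m_miles(deaths=40_000, miles=3.2e12)  # ~US annual totals
fsd = deaths_per_100m_miles(deaths=17, miles=5.0e8)         # placeholder miles

print(f"human: {human:.2f}  fsd: {fsd:.2f}  (deaths per 100M miles)")
```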

-4 points

Let’s see it, show me the numbers! Everyone’s critiquing my quick mental math, but I don’t see anyone contributing a fix 🤷‍♀️. Will edit my comment once I do!

1 point

This isn’t necessarily true either. The NHTSA Standing General Order data shows that Tesla reports a large number of crashes under ADAS use compared to other brands (and it gets to cherry-pick in a LOT of cases). Taking conservative rollout numbers from companies like Honda shows that Tesla’s crash rate per ADAS-equipped vehicle is significantly higher.

The real red flag in all of this is that Tesla’s own reported marketing numbers for ADAS crashes weren’t declining with newer releases over time. A rate that doesn’t improve while the CEO claims the software already performs better than humans should instantly discredit the software, its performance, and any claims about new or improving features.

5 points

this math is too sloppy to draw any conclusions

83 points

Without LIDAR, this is a fool’s endeavor.

66 points

I wish this were talked about every single time the subject comes up.

Responsible, technologically progressive companies have been developing excellent, safe, self-driving car technology for decades now.

Elon Musk is eviscerating the reputation of automated vehicles with his idiocy and arrogance. They don’t all suck, but Tesla sure sucks.

15 points

Just like that cheaper non-lidar Roomba with room mapping technology, it will get lost.

23 points

Even with LIDAR there are just too many edge cases for me to ever trust a self driving car that uses current-day computing technology. Just a few situations I’ve been in that I think a FSD system would have trouble with:

  • I pulled up at a red light where a construction crew was working on the side of the road. They had a police detail with them. As I was watching the red light, the cop walked up to my passenger side and yelled “Go!” at me. Since I was looking at the light, I didn’t see him trying to wave me through the intersection. How would a car know to drive through a red light because a cop is telling it to?

  • I’ve seen cars drive the wrong way down a one way street because the far end was blocked due to construction and backtracking was the only way out. (Residents were told to drive out the wrong way) Would a self driving car just drive down to the construction site and wait for hours for them to finish?

  • I’ve seen more than one GPS try to route cars improperly. In some cases it thinks a practically impassable dirt track is a paved road. In other cases I’ve seen chains and concrete barriers blocking intersections that cities/towns have decided traffic shouldn’t go through.

  • Temporary detour or road closure signs?

  • We are having record amounts of rain where I live and we’ve seen roads covered by significant flooding that makes them unsafe to drive on. Often there aren’t any warning signs or barricades for a day or so after the rain stops. Would an FSD car recognize a flooded out road and turn around, or drive into the water at full speed?

12 points

In my opinion, FSD isn’t attempting to solve any of those problems. Those will require human intervention for the foreseeable future.

1 point

Or there are other, better ways to tell an FSD car that the road is closed. We could use a QR code or something like it that includes info about the blockade, where you can drive around it, and how long it will stay blocked. An FSD car should be connected enough to call home and report it to the servers, which then update the other FSD cars, et voilà.
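The commenter's idea could be sketched as a tiny structured notice a car scans and relays to a fleet server. Every field name here is invented for illustration; no such standard exists:

```python
# Hypothetical road-closure notice a car could scan (QR) or fetch and relay.
# Field names are made up for this sketch, not any real protocol.
import json
from dataclasses import dataclass, asdict

@dataclass
class ClosureNotice:
    road_id: str           # identifier of the blocked segment
    detour_via: list[str]  # suggested route around the blockade
    closed_until: str      # ISO-8601 timestamp for end of closure

notice = ClosureNotice("elm-st-100", ["oak-st", "main-st"], "2023-09-30T18:00:00Z")
payload = json.dumps(asdict(notice))  # what a QR code or server push would carry
print(payload)
```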

7 points

Musk’s vision is (was?) to eventually turn Teslas into driverless robo-taxis. At one point he even said he could see regular Tesla owners letting their cars drive around like automated Ubers, making money for them instead of sitting idle in garages.

1 point

Well, FSD is supposed to be Level 5, according to the marketing and the description when it went on sale. Of course, we know Tesla’s lawyers told California that the company has nothing more than Level 2, has no timeline to begin building anything beyond Level 2, and that the entire house of cards hinges on courts and regulators continuing to turn a blind eye.

-12 points

Do you have lidar on your head? No, yet you’re able to drive with just two cameras on your face. So no, lidar isn’t required. Driving in a very dynamic world is very difficult for computers, but it’s not a matter of if, just a matter of when.

Would lidar allow “superhuman” driving abilities, like seeing through fog and in every direction in the dark? Sure. But it’s not required for the job at hand.

4 points

Humans don’t drive on sight alone.

3 points

Uhhhh… What the fuck else are the rest of you using?!

-1 points

What’s the human equivalent for lidar then?

2 points

I remember watching a video asking whether any camera can see as well as a human eye. The conclusion was that some cameras come close, but they’re very big and expensive, and the human brain filters much of what the eye takes in without you realizing it. Maybe it could be done with a camera or two, but we’re not close to that technology in the near future.

5 points

You have eyes that are way more amazing than any cameras that are used in self driving, with stereoscopic vision, on a movable platform, and most importantly, controlled via a biological brain with millions of years of evolution behind it.

I’m sorry, you can’t attach a couple cameras to a processor, add some neural nets, and think it’s anything close to your brain and eyes.

2 points

And also, cameras don’t work that great at night. Lidar would provide better data.

4 points

Do you have lidar on your head?

Nope.

And that’s exactly why humans crash. Constantly.

Even when paying attention.

They don’t have lidar’s depth resolution, nor its FOV.

2 points

And that’s exactly why humans crash. Constantly.

No, it isn’t. Everywhere in the world, the vast majority of crashes are caused by negligence, speeding, and distraction: all factors that can be avoided without improving our depth-perception accuracy.

0 points

A lot of LIDAR fans here for some reason, but you’re absolutely right.

There’s just not much evidence that the accurate depth perception only LIDAR provides is required for self-driving, and LIDAR also won’t solve the complex navigation of real-world scenarios. A set of visible-spectrum cameras can reconstruct a 3D environment over time well enough for navigation, and that’s quite literally what Tesla’s FSD does.

I don’t know why someone would still say it’s not possible when we already have an example running in production.

“But Tesla FSD has a high disengagement rate.” For now, yes. But those scenarios are more often solvable with high-definition maps than with LIDAR. For anyone who disagrees: go to YouTube, pick a recent video of Tesla’s FSD, and try to find a scenario where a disengagement would have been avoided by LIDAR alone.

There are many parts missing for a complete autonomous driving experience. LIDAR is not one of them.

1 point

Do you have CCDs in your head? No? This argument is always so broken it’s insane to see it still typed out as anything but sarcasm.

-3 points

Don’t let them know about that, I don’t want my radar detector flipping out over laser lol

2 points

what?

-2 points

K and Ka band are used for blind-spot monitoring and would make radar detectors go nuts until the filtering got worked out. Cars that use lidar will set them off as well, though those are still rarer.

1 point

I don’t know why people are so quick to defend the need for LIDAR when it’s clear the challenges in self-driving are not with data acquisition.

Sure, there are a few corner cases where it would perform better than visible-light cameras, but a new array of sensors won’t solve self-driving. Similarly, the lack of LIDAR doesn’t rule self-driving out; otherwise we wouldn’t be able to drive either.

5 points

challenges in self driving are not with data acquisition.

What?!?! Of course it is.

We can already run all this shit through a simulator and it works great, but that’s because the computer knows the exact position, orientation, velocity of every object in a scene.

In the real world, the underlying problem is that the computer doesn’t know what’s around it, or what the things around it are doing or going to do.

It’s 100% a data acquisition problem.

Source? I do autonomous vehicle control for a living, in environments much more complicated than a paved road with an accepted set of rules.

0 points

You’re confusing data acquisition with interpretation. A LIDAR won’t label the data for your AD system and won’t add much to an existing array of visible spectrum cameras.

You say the underlying problem is that the computer doesn’t know what’s around it. But its surroundings are reliably captured by functional sensors. Therefore it’s not a matter of acquisition, but processing of the data.

1 point

Yes, self-driving is not computationally solved at all. But the reason people defend LIDAR is that visible-light cameras are very bad at depth estimation. Even with parallax, a lot of software has a very hard time accurately calculating distance and motion.
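One way to see why depth from parallax is hard: in the pinhole stereo model, depth = focal_length × baseline / disparity, so a fixed one-pixel matching error costs far more accuracy at range. A toy sketch (camera parameters are illustrative, not any real vehicle's hardware):

```python
# Pinhole-stereo depth: depth = focal_length_px * baseline_m / disparity_px.
# Depth is inversely proportional to disparity, so a one-pixel matching
# error hurts much more for distant objects. Numbers are illustrative only.
def stereo_depth(f_px: float, baseline_m: float, disparity_px: float) -> float:
    return f_px * baseline_m / disparity_px

f, b = 1000.0, 0.3                    # 1000 px focal length, 30 cm baseline
near_err = stereo_depth(f, b, 30.0) - stereo_depth(f, b, 31.0)  # object at 10 m
far_err = stereo_depth(f, b, 3.0) - stereo_depth(f, b, 4.0)     # object at 100 m
print(f"1-px error at 10 m:  ~{near_err:.2f} m")
print(f"1-px error at 100 m: ~{far_err:.2f} m")
```

With these toy numbers the one-pixel error is about 0.32 m at 10 m but about 25 m at 100 m, which is the kind of gap LIDAR's direct ranging avoids.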

17 points

Lol, ok. Your anecdotal experience can totally be believed over all the data gathered over years. Great. Thanks.

6 points

Yeah, perfect! No more debates, this guy settled it.

14 points

Counter-counterpoint: I’ve been using it since 2019. I think you’re exaggerating.

  • It aggressively tries to center itself, always. If you’re in a lane and it merges with a second lane, the car will swerve sharply to the right as it attempts to go back to the middle of the lane.

  • It doesn’t allow space for cars to merge until the cars are already merging. It doesn’t work with traffic; it does its own thing and is discourteous to other drivers. It doesn’t read turn signals; it only reacts to drivers getting over.

  • If a motorcycle is lane-splitting, it doesn’t move out of the way for the motorcycle. In fact, it assumes anything between lanes isn’t an issue. If something is partially blocking a lane but the system doesn’t recognize it as fully “your lane”, the default is to ignore it. The number of times I’ve had to disengage to dodge a wide load or a camper straddling two lanes is crazy.

  • With the removal of radar, phantom braking has become far, far worse. Any kind of weather condition causes issues. Even driving at sunset, the sun can dazzle the cameras so they fail to detect things they should, or worse, they detect problems that aren’t there.

  • It doesn’t understand road hazards. It will happily hit a pothole at 70 MPH. It will ignore road flares and traffic cones. When the lanes aren’t clearly marked (because the paint has worn away or because of construction), it can behave erratically.

  • It waits so long to brake, and when it brakes it brakes hard. It accelerates just as suddenly, leading to a very jerky ride that makes my passengers carsick.

The only time I trust FSD is when it’s stop-and-go traffic. Beyond that I have to pay so much attention to the thing that I might as well just drive myself. The “worst thing it can do” isn’t just detour; it’s “smash into the thing that it thought wasn’t an issue”.

3 points

I only drove a Tesla for a few days in 2022, but I fully agree with you. I specifically wanted to test FSD, and I had so many incidents: it tried to move into a newly appearing turning lane even though it should have gone straight, and it slowed to 10 km/h in a tunnel with a 50 km/h limit because of blind corners (“bad vision conditions”). Even the cruise control was annoying. It felt like my steering input was basically just a “suggestion” that I sometimes really had to force through against the will of the car, because otherwise bad things would have happened. Sport-mode steering made that only slightly better in the dual-motor Model Y.

Overall I actually enjoyed driving the ID3 more. At least it had solid, responsive steering that felt, compared to the Tesla, like driving a sports car. And I drove the ID3 directly after the Tesla.

Only good thing about the Tesla was acceleration.

2 points

It doesn’t read turn signals

It does in the FSD beta (somewhat). It even brakes and lets them in if it detects that they have a signal on. It doesn’t understand merges as well, but it’s still better than regular Autopilot. All your other points are pretty valid. I am constantly taking it out of AP and putting it back in during a city drive, even though I have “FSD”.

2 points

The worst it will do is pick the wrong lane and detour a bit to get back on track.

https://www.cbsnews.com/amp/news/tesla-car-crash-nhtsa-school-bus/

