Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.
Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.
The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!
Did it slow down when it was covered in canvas?
I seem to recall that fElon prevented the self driving team from utilizing LIDAR for any part of the system, instead demanding that everything run off of optical input. Does anyone else remember the same?
I’m trying to find an article that covers what I remember, but it’s been a good while since I saw it. Hopefully I can dig something up.
Iirc they were using a combination of lidar and radar, but Elmo wanted to cut costs.
Funny thing is, the price of lidar is dropping like a stone; they are projected to be sub-$200 per unit soon. The technical consensus seems to be settling in on 2 or 3 lidars per car plus optical sensors, and Chinese EV brands are starting to provide self driving in baseline models, with lidars as part of the standard package.
Did he want to cut costs or did he want a network of cameras at his control all over the world?
What’s cool is that Teslas used to have radar sensors, at least, but Elon removed them from production to save money. Even if you have a car from back then, the software no longer uses them and they’ll just physically unplug them the next time you have the car serviced, as it’s just a drain on the battery at this point 🙃
meanwhile our subaru has lidar for adaptive cruise control and emergency braking
they’ll just physically unplug them the next time you have the car serviced
So, (looks at watch), in an hour?
I remember there being claims from him or his team about lidar being a dead end that would not scale as well as computer vision.
I believe he claimed that since humans use their vision to drive that computer vision was more than enough.
I don’t know about you, but I also rely on sounds & feel when I drive. I also know that the human eye has evolved to detect motion, filter out extraneous information, and send just the important bits to the brain so that it doesn’t get overloaded with everything the eye sees. Computer vision is the exact opposite of that: it has to process every bit of every image the camera sees.
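To put a rough number on “every bit of every image”, here’s a back-of-the-envelope calculation in Python. The camera count, resolution, and frame rate are assumed round figures for illustration, not any manufacturer’s actual spec:

```python
# Back-of-the-envelope: raw pixel throughput for a camera-only sensor suite.
# Camera count, resolution, and frame rate are assumed round numbers, not real specs.
cameras = 8
width, height = 1280, 960   # assumed per-camera resolution
fps = 30                    # assumed frame rate

pixels_per_second = cameras * width * height * fps
print(f"{pixels_per_second / 1e9:.2f} billion pixels/s before any filtering")
# ~0.29 billion pixels/s, every one of which the vision stack has to at least
# ingest before it can decide what counts as "extraneous".
```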
since humans use their vision to drive that computer vision was more than enough
Surprised he didn’t swap out the wheels with legs while he was at it
I don’t know about you, but I also rely on sounds & feel when I drive.
Of course. When I feel myself driving into a wall, I stop immediately.
Came here to actually write this. Everyone remembers that. He made Tesler the hated shit it is today.
As a space nut I seriously hope that he never gets a chance to do anything similar with SpaceX. Thankfully he’s mostly been kept away from important things thus far.
Don’t get me wrong, I know SpaceX’s closet is overflowing with skeletons. But since Congress has been so kind as to continuously cut NASA’s budget for the last few decades, I have to rely on SpaceX and other private companies to keep our space endeavors going.
I’m (well, was) a huge SpaceX nerd, but over the last year or so, less and less. He was always a dumb narcissist asshole, but now I can’t take it anymore. Also, the idea that we’ve fucked up this planet and need to move somewhere else, finishing this one off with thousands of launches, has always made me sick. If someone took him out of the picture, I’d probably come back to liking the company.
Yes. He took too much inspiration from Stanford University’s “Stanley” winning the DARPA Grand Challenge in 2005. This was an early competition to build viable autonomous vehicles. Most of them looked like tanks covered in radar dishes, but Stanford wound up taking home the gold with just an SUV with cameras on it.
It was an impressive achievement in computer vision, and the LiDAR-encrusted vehicles wound up looking like over-complex dinosaurs. There’s a great documentary about it narrated by John Lithgow (who, throughout it, pronounces the word robot as “ro-butt”). Elon watched it, made up his mind, and like a moron, hasn’t changed it in 20 years. I’m almost Musk’s age so I know how the years speed up as we go on. He probably thinks about the Stanford win as something that happened relatively recently. Especially with his mind on - ahem - other things, he’s not keeping up with recent developments out in the real world.
Rober just made Musk look like the absolute tool he is. And I’m a little worried that we may see people out there staging real world versions of this somehow with actual dangerous obstacles, not a cartoonish foam wall.
I did low-key get the squiggles before writing the article. I thought, from an ethical hacking disclosure-type perspective, that this info might cause folks to… well, ya know, paint tunnels on walls.
Then I looked, and the cat was already out of the bag: the video had something like 5 million views in the 4 hours it took me to draft the article. So I shared it, but I definitely did have that thought cross my mind. I am also a little worried on that score.
Tesla never had LIDAR. That’s the little spinny thing you see on Waymo cars. They had RADAR, and yes it was removed in 2021 due to supply shortages and just…never reinstalled.
Still, this is something the car ought to take into account. What if there’s a pane of glass in the way?
A camera will show it as being more convincing than it is. It would be way more obvious in real life when seen with two eyes. These kinds of murals are only convincing from one specific point.
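For the curious, here’s a toy sketch in Python of why a second viewpoint gives the mural away. With a stereo pair, depth comes from disparity (Z = f * B / d), and a painted wall collapses to a single depth plane. The focal length, baseline, and disparity values below are made up purely for illustration:

```python
# Toy stereo example: depth Z = f * B / d
# (f: focal length in pixels, B: camera baseline in metres, d: disparity in pixels).
# All numbers are invented for illustration.

def depth_from_disparity(disparity_px: float, focal_px: float = 800.0,
                         baseline_m: float = 0.12) -> float:
    """Triangulated depth in metres for one matched point."""
    return focal_px * baseline_m / disparity_px

# A real road scene: nearby features have large disparity, distant ones small,
# so recovered depths spread from a few metres to far away.
real_road_disparities = [48.0, 12.0, 3.0, 0.8]

# A painted mural: every feature physically sits on the same wall,
# so every matched point comes back at roughly the wall's distance.
mural_disparities = [6.4, 6.3, 6.4, 6.5]

for label, ds in [("real road", real_road_disparities), ("mural", mural_disparities)]:
    depths = [round(depth_from_disparity(d), 1) for d in ds]
    print(f"{label:9s} -> depths (m): {depths}")
# The "road" in the mural all sits on one plane about 15 m out, i.e. a wall,
# which a second viewpoint (or lidar) exposes immediately.
```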
That’s true, but it’s still way more understandable that a car without lidar would be fooled by it. And there’s no way you would ever end up in such a situation, whereas the image in the thumbnail could actually happen. That’s why it’s so misleading; can people not see that?
I absolutely hate Elon Musk and support a boycott of Tesla and Starlink, but this is a bit too misleading even with that in mind.
As much as I want to hate on Tesla, seeing this, it hardly seems like a fair test.
From the perspective of the car, it’s almost perfectly lined up with the background. It’s a very realistic painting, and any AI that is trained on image data would obviously struggle with this. AI doesn’t have that human component that allows us to infer information based on context. We can see the borders and know that they don’t fit. They shouldn’t be there, so even if the painting is perfectly lined up and looks photorealistic, we can know something is up because it’s got edges and a frame holding it up.
This test, in the context of the title of this article, relies on some fairly dumb pretenses:
- Computers think like humans
- This is a realistic situation that a human driver would find themselves in (or that realistic paintings of very specific roads exist in nature)
- There is no chance this could be trained out of them. (If it mattered enough to do so)
This doesn’t just affect Teslas. This affects any car that uses AI assistance for driving.
Having said all that… fuck elon musk and fuck his stupid cars.
This doesn’t just affect Teslas. This affects any car that uses AI assistance for driving.
Except for, you know… cars that don’t solely rely on optical input and have LiDAR, for example.
I am fairly dumb. Like, I am both dumb and I am fair-handed.
But, I am not pretentious!
So, let’s talk about your points and the title. You said I relied on fairly dumb pretenses; let’s walk through those.
- The title of the article… there is no obvious reason to think that I think computers think like humans, certainly not from that headline. Why do you think that?
- There are absolutely realistic situations exactly like this; it’s not a pretense. Don’t think Looney Tunes. Think an 18-wheeler with a realistic photo of a highway on its side, or a billboard with the same. There’s an academic article, linked in mine, where three PhD-holding engineering types discuss the issue at length. This is accepted in peer-reviewed science and has been for years.
- Yes, I agree. That’s not a pretense, that’s just… a factually correct observation. You can’t train an AI to avoid optical illusions if its only sensor input is optical; you need an independent sensor that can veto the illusion (see the sketch below). That’s why the Tesla choice to skip LiDAR and remove radar is a terminal case of the stupids. They’ve invested in a dead-end sensor suite, as evidenced by their earning the title of Most Lethal Car Brand on the Road.
This does just impact Teslas, because they do not use LiDAR. To my knowledge, they are the only popular ADAS in the American market that would be fooled by a test like this.
Near as I can tell, you’re basically wrong point by point here.
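Here’s the sketch referenced above: a minimal, made-up example of what “an independent sensor to veto the illusion” means in practice. The function name, thresholds, and readings are illustrative, not anyone’s actual ADAS code:

```python
# Rough fusion sketch: a camera fooled into seeing "open road" gets vetoed by
# any independent range sensor (lidar or radar). Illustrative only.

def should_brake(camera_says_clear: bool,
                 range_readings_m: list[float],
                 speed_mps: float,
                 min_stopping_margin_s: float = 2.0) -> bool:
    """Brake if the closest ranged return is inside our stopping margin,
    regardless of what the vision stack believes."""
    if not range_readings_m:               # no returns at all: fall back to vision
        return not camera_says_clear
    closest_m = min(range_readings_m)
    time_to_obstacle_s = closest_m / max(speed_mps, 0.1)
    if time_to_obstacle_s < min_stopping_margin_s:
        return True                        # the wall wins, the mural loses
    return not camera_says_clear

# Painted-wall scenario: vision says "clear", lidar says "solid surface at 30 m",
# and 18 m/s is roughly the 40 MPH from the video.
print(should_brake(camera_says_clear=True, range_readings_m=[30.0], speed_mps=18.0))  # True
```

The camera can be painted into believing in a tunnel, but the wall behind the paint still returns a hard lidar or radar echo, and taking the more pessimistic sensor is the whole point of carrying redundant ones.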
I agree that this just isn’t a realistic problem, and that there are way more problems with Teslas that are much more realistic.
I’m so glad I wasn’t the only person who immediately thought “This is some Wile E. Coyote shit.”
I mean, it is also referenced in the article and even in the summary from OP.
I read something a while back about a guy who, while wearing a T-shirt with a stop sign on it, had a couple of robotaxis stop in front of him. It got me thinking that you could cause some chaos walking around in a Speed Limit 65 shirt.
Teslas did this in the past. There was also the issue of thinking that the moon was a red light or something.
Or when a truck is hauling traffic lights.
So don’t delay, act now, missiles are running out. Allow, if you’re still alive, six to eight years to arrive. And if you follow, there may be a tomorrow, but if the offer’s shunned, you might as well be locking on the sun.
They’re not reading speed limit signs; they’ll follow the speed limit noted on the reference maps, like what you see in the app on your phone.
There are a lot of cars that also double-check via camera, to catch missing or outdated map data and temporary speed limit signs.
Lots of places also have variable limit signs that get updated based on traffic, accidents etc.
Here in NZ those seem to all be marked on the speed limit maps as 100 km/h, even if in some places the signs never go above 80.
Ngauranga Gorge is one such location, and I believe it has the country’s highest-grossing speed camera.
Yikes, there’s a 25 around here that shows up as a 55 in Google Maps.
Also a 55 that goes down to I think 35 for just a moment when it joins up with a side road. I wonder what a Tesla would do if it was following that data.
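To make the map-versus-camera point above concrete, here’s a toy Python sketch of one conservative way to reconcile the two sources, using the kind of 100-vs-80 mismatch described for NZ. Function names and values are invented; real systems are far more involved:

```python
# Toy reconciliation of a (possibly stale) map speed limit with a sign the
# camera actually read. Illustrative only; not any vendor's real logic.
from typing import Optional

def effective_limit(map_limit_kmh: Optional[int],
                    camera_limit_kmh: Optional[int]) -> Optional[int]:
    """Prefer a freshly read sign; otherwise fall back to the map.
    When both exist, take the lower (more cautious) of the two."""
    if camera_limit_kmh is not None and map_limit_kmh is not None:
        return min(camera_limit_kmh, map_limit_kmh)
    return camera_limit_kmh if camera_limit_kmh is not None else map_limit_kmh

# Stale map says 100, the variable sign on the gantry currently shows 80:
print(effective_limit(map_limit_kmh=100, camera_limit_kmh=80))   # 80
# No sign visible, so the map value is all we have:
print(effective_limit(map_limit_kmh=50, camera_limit_kmh=None))  # 50
```

The min() is the crude part; a production system would also weigh how confident the sign reader is and how fresh the map data is, but the fall-back-to-the-cautious-value idea is the same.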
Who was the idiot that removed LiDAR to cut costs?
/s
He’s said humans don’t use LiDAR so his cars shouldn’t have to. Of course humans have a brain, and his cars don’t, but you can’t tell him anything.
Tesla had camera + radar + sonar, and that wasn’t their own tech; they used Mobileye EyeQ back then. When they switched to in-house tech they gradually ditched the radar and sonar, which made no sense to me. At the time I saw their lead say in an interview that this was superior, and I believed it. Not anymore.
They said doing so cut costs, but obviously lidar/radar/sonar only get cheaper over time, to say nothing of the extra R&D costs, since a vision-only system is much more difficult to develop.
It was removed because it was giving false positives. They should have upgraded it with lidar but decided to just remove it.
They are so expensive too! /s
Who would have known electronics gets cheaper all the time?? /j