Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.

Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.

The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!

      • bstix@feddit.dk · +1 · 2 hours ago

        A camera makes it look more convincing than it is. It would be far more obvious in real life, seen with two eyes. These kinds of murals are only convincing from one specific vantage point.

        • ParadoxSeahorse@lemmy.world · +1 · 12 minutes ago

          …and clearly this photo wasn’t the point. In fact, it looks like a straight road from one of the camera angles he chooses later, not, as far as I can tell, from the POV of the car.

        • Buffalox@lemmy.world · +1 · 1 hour ago

          That’s true, but it’s still far more understandable that a car without LiDAR would be fooled by it. And there is no way you would ever end up in such a situation, whereas the image in the thumbnail could actually happen. That’s why it’s so misleading; can people not see that?
          I absolutely hate Elon Musk and support the boycott of Tesla and Starlink, but this is a bit too misleading even with that in mind.

      • Gonzako@lemmy.world · +28 · 10 hours ago

        Still, this is something the car ought to take into account. What if there’s glass in the way?

        • Buffalox@lemmy.world · +5/−2 · 6 hours ago

          Yes, I think a human driver who isn’t half asleep would notice that something is weird, and would at least slow down.

            • snooggums@lemmy.world · +7/−1 · 5 hours ago

              Glass is far more likely to injure the driver or the people around the set, simply because it’s a heavier material than styrofoam.

      • Mr_Dr_Oink@lemmy.world · +9/−34 · 10 hours ago

        As much as I want to hate on Tesla, seeing this, it hardly seems like a fair test.

        From the perspective of the car, it’s almost perfectly lined up with the background. It’s a very realistic painting, and any AI trained on image data would obviously struggle with this. AI doesn’t have the human component that lets us infer information from context. We can see the borders and know that they don’t fit; they shouldn’t be there. So even if the painting is perfectly lined up and looks photorealistic, we can tell something is up because it’s got edges and a frame holding it up.

        This test, in the context of the title of this article, relies on a fairly dumb pretense that:

        1. Computers think like humans
        2. This is a realistic situation that a human driver would find themselves in (or that realistic paintings of very specific roads exist in nature)
        3. There is no chance this could be trained out of them. (If it mattered enough to do so)

        This doesn’t just affect Teslas. This affects any car that uses AI assistance for driving.

        Having said all that… fuck Elon Musk and fuck his stupid cars.

        • KayLeadfoot@fedia.io (OP) · +1 · 21 minutes ago

          I am fairly dumb. Like, I am both dumb and I am fair-handed.

          But, I am not pretentious!

          So, let’s talk about your points and the title. You said I had fairly dumb pretenses, let’s talk through those.

          1. The title of the article… there is no obvious reason to think that I think computers think like humans, certainly not from that headline. Why do you think that?
          2. There are absolutely realistic situations exactly like this; it’s not a pretense. Don’t think Looney Tunes. Think of an 18-wheeler with a realistic photo of a highway printed on the side, or a billboard with the same. The academic article linked in my piece, where three PhD-holding engineers discuss the issue at length, covers exactly this. It’s been accepted in peer-reviewed science for years.
          3. Yes, I agree. That’s not a pretense, that’s just… a factually correct observation. You can’t train an AI to avoid optical illusions if its only sensor input is optical. That’s why the Tesla choice to skip LiDAR and remove radar is a terminal case of the stupids. They’ve invested in a dead-end sensor suite, as evidenced by their earning the title of Most Lethal Car Brand on the Road.

          This does just impact Teslas, because they do not use LiDAR. To my knowledge, they are the only popular ADAS in the American market that would be fooled by a test like this.

          Near as I can tell, you’re basically wrong point by point here.

        • teuniac_@lemmy.world · +67 · 10 hours ago

          This doesn’t just affect Teslas. This affects any car that uses AI assistance for driving.

          Except for, you know… cars that don’t rely solely on optical input and have LiDAR, for example.
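
          To make the camera-vs-LiDAR point concrete, here is a minimal, hypothetical sketch (not any vendor’s actual ADAS logic; all names and thresholds are made up for illustration) of why a second, range-measuring modality defeats a painted wall: the camera classifier can be fooled into reporting a clear road, but a LiDAR return reports the true distance to the surface, and the fused decision resolves toward the safer action.

```python
# Hypothetical, simplified sensor-fusion check (illustrative only).
# A photorealistic mural can fool a camera-only perception stack,
# but a LiDAR return still measures true range to the surface.

def should_brake(camera_sees_clear_road: bool,
                 lidar_min_range_m: float,
                 speed_mps: float,
                 max_decel_mps2: float = 6.0) -> bool:
    """Brake if LiDAR reports an obstacle inside our stopping distance,
    even when the camera classifies the path ahead as clear."""
    # Kinematic stopping distance: v^2 / (2a), plus a safety margin.
    stopping_distance = speed_mps ** 2 / (2 * max_decel_mps2)
    lidar_obstacle = lidar_min_range_m < stopping_distance + 5.0  # 5 m margin
    # Disagreement between modalities resolves toward the safer action.
    return lidar_obstacle or not camera_sees_clear_road

# The painted-wall scenario: camera says "road ahead", LiDAR says "wall at 30 m".
speed = 40 * 0.44704  # 40 mph ≈ 17.9 m/s
print(should_brake(camera_sees_clear_road=True,
                   lidar_min_range_m=30.0,
                   speed_mps=speed))  # → True (brakes); camera alone would not
```

          A camera-only stack has no second opinion to fall back on here, which is the crux of the thread’s disagreement.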

          • Mr_Dr_Oink@lemmy.world · +8/−15 · 9 hours ago

            Fair point. But it doesn’t really address the other things I said.

            I suppose, based on already getting downvoted, that I’ve got a bad take. Either that, or the people downvoting me don’t understand that I can hate Tesla and Elon, think their cars are shit, and still see that tests like this can be nuanced. The attitude that paints with a broad brush is the same attitude that got Trump elected…

            • Reyali@lemm.ee · +8 · edited · 3 hours ago

              I agree the wall is convincing and that it’s not surprising the Tesla didn’t detect it, but I think where your comment rubs people the wrong way is that you seem to be letting Tesla off the hook for choosing the wrong technology.

              I think you and the article/video agree on the point that any car based only on images will struggle with this but the conclusion you drew is that it’s an unfair test while the conclusion should be that NO car should rely only on images.

              Is this situation likely to happen in the real world? No. But that doesn’t make the test unfair to Tesla. This was an intentional choice they made, and it’s absolutely fair to call them out on the dangers of that choice.

              • Mr_Dr_Oink@lemmy.world · +5 · 5 hours ago

                That’s fair.

                I didn’t intend to give Tesla a pass. I hoped that qualifying what I said with a “fuck Tesla and fuck Elon” would show that.

                But I hadn’t thought about it that way.

                In my defense, my point was more “what did you expect?” The test was designed to show that a system not built to perform a specific function can’t perform that function.

                We know that self-driving is bullshit, especially the Tesla brand of it. So what are Mark’s test and video really accomplishing?

                But on reflection, I guess there are still a lot of people out there who don’t know this stuff, so at the very least a popular channel like his will go a long way toward raising awareness of this sort of flaw.

            • Voroxpete@sh.itjust.works · +41/−1 · 9 hours ago

              No, it’s just a bad take. Every other manufacturer of self driving vehicles (even partial self driving, like automatic braking) uses LiDAR because it solves a whole host of problems like this. Only Tesla doesn’t, because Elon thinks he’s a big brain genius. There have been plenty of real world accidents with less cartoonish circumstances involving Teslas that also would have been avoided if they just had LiDAR sensors. Mark just chose an especially flashy way to illustrate the problem. Sometimes flashy is the best way to get a point across.

            • sugar_in_your_tea@sh.itjust.works · +6/−1 · 7 hours ago

              based on already getting downvoted

              In this case, yes, but in general, downvotes just mean your take is unpopular. The downvotes could be from people who don’t like Tesla and see any defense of Tesla as worthy of downvotes.

              So good on you for making the point that you believe in. It’s good to try to understand why something you wrote was downvoted instead of just knee-jerk assuming that it’s because it’s a “bad take.”

          • Daefsdeda@sh.itjust.works · +1/−6 · 8 hours ago

          I agree that this just isn’t a realistic problem, and that there are far more realistic problems with Teslas.