Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.

Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.

The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!

      • humanspiral@lemmy.ca · 3 months ago

        The biggest problem will always be a backdoor that allows remote control of the car for the purpose of killing the driver or other people. The Wile E. Coyote attack is far more expensive and puts the attacker in jeopardy for the time spent constructing the “trap”.

        • TXL@sopuli.xyz · edited · 3 months ago

          Most dangerous? Yes. Maybe.

          Biggest? No. Its plausibility is in the realm of a spy-thriller plot.

          • humanspiral@lemmy.ca · 3 months ago

            Exploding pagers would seem like something a government should never be bold enough to do, lest it destroy confidence in the industrial economy. Instead, it was praised as brilliant by our rulers.

            Governments demanding backdoors is common, even when it results in hackers and foreign agents finding and exploiting them. The forward-thinking component of our government is not as important as maximum control value. The most dangerous car for this application is one that is purely drive-by-wire, without a fully mechanical brake pedal or steering, and that can override any driver input to the “wire controls” for steering and power.

            Over-the-air updates are a possible vector for backdoor control, and the FSD feature that summons the vehicle from a parking spot to the front door is already a remote-control feature. Just as key fobs get “cloned”, the security is not foolproof. The ultra-discreet assassination potential makes backdoors/hacking features very valuable.

    • nomy@lemmy.zip · 3 months ago

      People will definitely fuck with autonomous cars though, so you have to plan for it.

        • shawn1122@lemm.ee · 3 months ago

          There’s no way the wall would look real as your perspective shifts while you move closer to it. Most humans would react to that by at least slowing down.

        • chaogomu@lemmy.world · 3 months ago

          I watched the video. The wall would not fool a human with object permanence.

          Anyone who is fooled is likely impaired enough that they are not legal to drive.

            • Syrc@lemmy.world · 3 months ago

              Probably in the sense that as you’re approaching it in the distance you can see the lines around it. When you’re that close you can’t see them anymore, but you should’ve realized that it was a wall way before that point.

      • ChaoticNeutralCzech@feddit.org · edited · 3 months ago

        Many people tend to doze off so much they would absolutely get fooled. I admit I might, too, especially if the wall is made of a material that needs no guy wires to prop it up. Either they used digital effects or a very good color-grading job; it’s uncanny.


    • KayLeadfoot@fedia.io (OP) · 3 months ago

      The scientists in Ireland calling their data set to prevent this exact fucking thing “Coyote” sent me over the moon.

  • acockworkorange@mander.xyz · 29 days ago

    Anyone who works in automation will tell you that you can’t use software to overcome the deficiencies of your sensors. They were too cheap to include a reliable lidar, and now they can’t have a car that knows where obstacles are.
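
    To illustrate why lidar makes this particular failure hard to miss (a toy sketch, not any vendor’s actual stack; all numbers and thresholds are made up): a flat wall across the lane returns nearly the same range on every forward beam, a signature that is trivial to flag regardless of what the camera thinks it sees.

    ```python
    import math

    def lidar_ranges_to_wall(distance_m, beam_angles_deg):
        """Ranges a forward-facing lidar would report for a flat wall
        perpendicular to the road at distance_m."""
        return [distance_m / math.cos(math.radians(a)) for a in beam_angles_deg]

    def looks_like_obstacle(ranges, max_spread_m=0.5):
        """A wall across the lane produces tightly clustered returns;
        an open road produces returns at wildly different distances."""
        return max(ranges) - min(ranges) < max_spread_m

    # Beams sweeping +/-5 degrees all strike a wall ~30 m ahead.
    wall_returns = lidar_ranges_to_wall(30.0, range(-5, 6))
    print(looks_like_obstacle(wall_returns))        # True: clustered returns
    print(looks_like_obstacle([5.0, 40.0, 120.0]))  # False: open-road scatter
    ```

    A camera-only system has to infer the same geometry from a 2D image, which is exactly what the painted wall defeats.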

  • ikidd@lemmy.world · 3 months ago

    To be fair, I’d be surprised if half the humans driving didn’t do the same.

    • osugi_sakae@midwest.social · 3 months ago

      This. Watching the video, I had some trouble telling the difference. Sure, from some angles it is obvious, but from others it is not.

      That said, other cars, with more types of sensors, would probably have “seen” the obstruction on the road.

      • Mic_Check_One_Two@reddthat.com · 3 months ago

        That said, other cars, with more types of sensors, would probably have “seen” the obstruction on the road.

        Well yeah, that’s sort of the entire point of the video. He ran the test with a lidar-equipped vehicle, and it saw the wall right away. Hell, a radar-equipped car (like early Teslas) probably would have seen the “kid” behind the wall as well. But since Musk has decided that cars should be able to self-drive with only cameras, the newer Teslas will just plow straight into the wall without braking.

      • hardcoreufo@lemmy.world · 3 months ago

        I would have liked to see how a more typical car with automatic cruise control/braking fared. I think they use ultrasonic sensors and would have done better than the Tesla.

    • LeninOnAPrayer@lemm.ee · edited · 3 months ago

      I mean, you’re watching a recorded video. I really doubt it would be anything but obvious to actual human eyes; our depth perception alone would tell us something is wrong. You’re just not watching this in 3D.

      Maybe at 55 or 65 mph on a foggy day. But I doubt any person paying attention would miss the obvious anchors holding the wall up and the incorrect perspective at 40 mph.

        • LeninOnAPrayer@lemm.ee · 3 months ago

          You’re right. I miss the days when it was just “these damn kids” on their cellphones. Now, more often than not, it’s a boomer on their cellphone. At the end of the day: fuckcars.

  • fubarx@lemmy.world · 3 months ago

    There’s a very simple solution to autonomous driving vehicles plowing into walls, cars, or people:

    Congress will pass a law that makes NOBODY liable – as long as a human wasn’t involved in the decision making process during the incident.

    This will be backed by car makers, software providers, and insurance companies, who will lobby hard for it. After all, no SINGLE person or company made the decision to swerve into oncoming traffic. Surely they can’t be held liable. 🤷🏻‍♂️

    Once that happens, Level 4 driving will come standard and likely be the default mode on most cars. Best of luck everyone else!

    • Korhaka@sopuli.xyz · 3 months ago

      If no one is liable, then it’s tempting to deliberately confuse them into crashing.

    • chilicheeselies@lemmy.world · 3 months ago

      There is no way insurance companies would go for that. What is far more likely is that policies simply won’t cover accidents caused by autonomous systems. I’m honestly surprised they would cover them now.

      • rumba@lemmy.zip · 3 months ago

        Not sure how it plays out for Tesla, but for Waymo, accidents per mile driven are WAY below those of human drivers. Insurance companies would LOVE to charge a premium for automated-driving insurance while paying out fewer claims.

      • P03 Locke@lemmy.dbzer0.com · 3 months ago

        What is far more likely is that policies simply wont cover accidents due to autonomous systems.

        If the risk is that insurance companies won’t pay for accidents and put people on the hook for hundreds of thousands of dollars in medical bills, then people won’t use autonomous systems.

        This cannot go both ways. Either car makers are legally responsible for their AI systems, or insurance companies are legally responsible to pay for those damages. Somebody has to foot the bill, and if it’s the general public, they will avoid the risk.

        • ebolapie@lemmy.world · edited · 3 months ago

          I don’t know if I believe that people will avoid the risk. Humans are god awful at wrapping their heads around risk. If the system works well enough that it crashes, let’s say, once in 100,000 miles, many people will probably find the added convenience to be worth the chance that they might be held liable for a collision.

          E, I almost forgot that I am stupid too

      • ThePantser@sh.itjust.works · 3 months ago

        If it’s a feature of the car when you bought it, and the insurance company insured the car, then anything the car does by design must be covered. The only way an insurance company gets out of this is by making the insured sign a statement that using the feature voids their policy, the same way they can with rideshare apps if you don’t disclose that you’re driving for one. They can also refuse to insure the car unless the feature is disabled; I can see insurance companies demanding that in the future. They could demand that the giant screens go blank, or that displayed content be simplified, while the car is in motion, too.

    • Ulrich@feddit.org · 3 months ago

      Once that happens, Level 4 driving will come standard

      Uhhhh absolutely not. They would abandon it first.

    • hedge_lord@lemmy.world · 3 months ago

      Kids already have experience playing hopscotch, so we can just have them jump between the roofs of moving cars in order to cross the street! It will be so much more efficient, and they can pretend that they are action heroes. The ones who survive will make for great athletes too.

    • lonerangers1@lemmy.world · 3 months ago

      In Silicon Valley there’s an episode where a bunch of phones explode because of a software problem. A lot like the pager attack Trump got a trophy for. And Musk could take any of these cars, “self-drive” them wherever, and “update” their discharge parameters or something, then boom. The trucks are 10k lbs, too. Bet you could take a small building down with one without much fuss. They’re pretty fast. Scary shit. Musk is a huge problem. Watch all the government envoys buying his swasticars, and then he can take people out Russian style. Oops, accident, again.