In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also striking a child-sized mannequin. The Tesla was also fooled by simulated rain and fog.

  • Critical_Thinker@lemm.ee · 13 points · 14 hours ago

    Of course it disengages self driving modes before an impact. Why would they want to be liable for absolutely anything?

      • ArchRecord@lemm.ee · 3 points · 12 hours ago

        It doesn’t guarantee them protection from liability, but it makes it easier to muddy the waters.

        They never have to admit to the press or the courts that Autopilot or Full Self-Driving was on during a crash. They never have to concede that the system directly caused the crash, only that it “could have” contributed to it.

        It just makes PR easier and lets them delay the resolution of court cases.