Last year, two Waymo robotaxis in Phoenix “made contact” with the same pickup truck that was in the midst of being towed, which prompted the Alphabet subsidiary to issue a recall on its vehicles’ software. A “recall” in this case meant rolling out a software update after investigating the issue and determining its root cause.

In a blog post, Waymo revealed that on December 11, 2023, one of its robotaxis collided with a backwards-facing pickup truck being towed ahead of it. The company says the truck was being towed improperly and was angled across a center turn lane and a traffic lane. Apparently, the tow truck didn’t pull over after the incident, and another Waymo vehicle came into contact with the pickup truck a few minutes later. Waymo didn’t elaborate on what it meant by saying its robotaxis “made contact” with the pickup truck, but it did say the incidents resulted in no injuries and only minor vehicle damage. The self-driving vehicles involved in the collisions weren’t carrying any passengers.

After an investigation, Waymo found that its software had incorrectly predicted the future movements of the pickup truck due to a “persistent orientation mismatch” between the towed vehicle and the one towing it. The company developed and validated a fix for its software to prevent similar incidents in the future and started deploying the update to its fleet on December 20.
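Waymo hasn’t published implementation details, but the failure mode is easy to sketch: if perception reports the towed pickup’s orientation (facing backward) while its actual motion follows the tow truck, a predictor that extrapolates along the reported heading will project the truck the wrong way. Below is a minimal, purely illustrative Python sketch of that mismatch and one way to guard against it; all names and the constant-velocity model are assumptions, not Waymo’s code.

```python
import math
from dataclasses import dataclass

@dataclass
class TrackedObject:
    x: float        # position (m)
    y: float
    heading: float  # reported orientation (rad); points backward for a towed pickup
    speed: float    # scalar speed (m/s)
    vx: float       # observed velocity from frame-to-frame tracking (m/s)
    vy: float

def predict_heading_based(obj: TrackedObject, dt: float) -> tuple[float, float]:
    """Extrapolate along the reported heading. If the object is towed
    backward, the heading opposes travel and the prediction drifts the wrong way."""
    return (obj.x + obj.speed * math.cos(obj.heading) * dt,
            obj.y + obj.speed * math.sin(obj.heading) * dt)

def predict_motion_consistent(obj: TrackedObject, dt: float) -> tuple[float, float]:
    """Fall back to the observed velocity when it persistently disagrees
    with the reported heading (an 'orientation mismatch')."""
    motion_dir = math.atan2(obj.vy, obj.vx)
    mismatch = abs(math.remainder(motion_dir - obj.heading, math.tau))
    if mismatch > math.pi / 2:  # heading and motion point in opposite directions
        return (obj.x + obj.vx * dt, obj.y + obj.vy * dt)
    return predict_heading_based(obj, dt)

# A pickup towed east at 10 m/s while "facing" west:
towed = TrackedObject(x=0.0, y=0.0, heading=math.pi, speed=10.0, vx=10.0, vy=0.0)
print(predict_heading_based(towed, 1.0))      # ≈ (-10, 0): projected the wrong way
print(predict_motion_consistent(towed, 1.0))  # (10.0, 0.0): follows actual motion
```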

  • Tetsuo@jlai.lu · 10 months ago

    Honestly, I think only trial and error will let us get a proper autonomous car.

    And I still think autonomous cars will save many more lives than they endanger once they become reliable.

    But for now, this is bound to happen…

    To be clear, they are still responsible for these cars and the safety of others. They didn’t test properly.

    They should be trying every edge case they can think of.

    A large screen on the side of a truck? What if a car is displayed on it? Would the car’s sensors notice the difference?

    A farmer dropped a hay bale on the road? It got flattened by rain? Does the car understand that this might not be safe to drive on or brake on?

    There are hundreds of unique situations that they should be trying before an autonomous car gets even close to a public road.

    But even if you try everything, there will be mistakes and fatalities.
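    In simulation terms, “trying every edge case you can think of” might look like enumerating a scenario matrix like the screen-on-a-truck and hay-bale cases above. A rough sketch of the idea, where every name (SCENARIOS, run_scenario) is invented for illustration and not any real AV test framework:

    ```python
    # Rough sketch of exhaustive scenario-matrix testing in simulation.
    # Every name here is invented for illustration.
    import itertools

    SCENARIOS = {
        "obstacle": ["hay bale", "rain-flattened hay bale", "towed pickup, backward-facing"],
        "visual_decoy": ["none", "car displayed on a truck-side screen"],
        "weather": ["clear", "heavy rain"],
    }

    def run_scenario(obstacle: str, visual_decoy: str, weather: str) -> bool:
        """Hypothetical stub: would drive the virtual car through the scene
        and report whether it slowed, stopped, or avoided safely."""
        raise NotImplementedError("a real simulator would go here")

    # Enumerate every combination before the car gets near a public road.
    for obstacle, decoy, weather in itertools.product(*SCENARIOS.values()):
        print(f"queued test: obstacle={obstacle!r}, decoy={decoy!r}, weather={weather!r}")
    ```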

    • threelonmusketeers@sh.itjust.works · 10 months ago

      There are hundreds of unique situations that they should be trying before an autonomous car gets even close to a public road.

      Do you think “better than human drivers” is sufficient for deployment on public roads, or do you think the bar should be higher?

      • Tetsuo@jlai.lu · 10 months ago

        Honestly, I’m pragmatic: if fewer people die in accidents involving autonomous cars, then yes.

        The thing is, we shouldn’t be trusting the manufacturers for these stats. They have to be reported by a government agency or something.

        Similarly, autonomous car software should have to be certified by an independent organization before being deployed. Same thing for updates to the software. Otherwise we would get deadly updates from time to time.

        If we deploy and handle autonomous cars with the same safety approach as in aviation, I’m sure this transition can be done fairly safely.