In a pivotal moment for the autonomous transportation industry, California chose to expand one of the biggest test cases for the technology.

  • jeffw@lemmy.world · 1 year ago

    Good. I’m sick of the fearmongering. “OH NO, THIS ONE CAR GOT IN A CRASH!!!”

    Yeah, but humans crash too?

    • Chozo@kbin.social · 1 year ago

      When I worked on Google’s Waymo project, we only had a small handful of our cars involved in any collision on public roads. And every single one of them was from a human driver running into the SDC. I dunno if that’s changed since I left, but even in the early stages, SDCs are remarkably safe compared to human drivers.

      • sky@codesink.io · 1 year ago

        Cruise has hit an oncoming car, smashed into the back of a Muni bus, and constantly stops in emergency zones, making first responders’ lives harder.

        Seven hours of debate made it clear how much the community doesn’t want this, how much the city’s leaders don’t want this, but the state doesn’t give a shit.

        They may be “safe” because they avoid difficult maneuvers and only drive at like 25–30 mph, but that doesn’t mean they’re practical or should be welcome in our cities.

      • NeoNachtwaechter@lemmy.world · 1 year ago

        “And every single one of them was from a human driver running into the SDC”

        Yea, me too. I’m such a good driver, others are crashing into me every day…

    • Gsus4@feddit.nl · 1 year ago

      It hits different when you’re the one being crashed into, but if it crashes less often than monkeys behind the wheel and liabilities are all accounted for and punished accordingly, bring it!

        • Gsus4@feddit.nl · 1 year ago

          Because corporations can’t be allowed to get away with what would land any of us in jail. We know they will cut corners if allowed, so make sure FSD is safer and that citizens are not defrauded when dealing with economic behemoths.

          In other words, it’s good that they have fewer accidents, but the ones they do have should be treated the same way we treat human drivers, or harsher, so that playing the odds is not just an economic factor to optimize and cut corners on. Compare aviation safety rules: even low-cost airlines have to follow them, instead of the legal Wild West they created with social media.

          With FSD the example is: LIDAR is more expensive, but it is an evolving technology that is essentially safe. Elon wants to use just cameras because it’s cheaper, and much less safe; it’s not a solved problem on the cheap. That’s why you need to penalize them for making such choices, or outright forbid them from making them. They are going to be setting the standards here, and there is a risk that a shittier technology wins a few bucks for Elon at the cost of lives into the future. We can’t half-ass this forever just because Elon wants his cars to cost half of what it takes to do it right.