• psud@lemmy.world
        1 year ago

        That’s all the people who were asleep on the highway or driving at very high speed in town

        The recent versions don’t allow either of those behaviours now, so those crashes aren’t happening anymore.

        Full self driving doesn’t do that

        And the deaths I’m interested in are the ones being caused by FSD, not lane keeping and cruise control. Loads of brands do lane keeping and cruise control, and implement it no better than Tesla.

        • NotMyOldRedditName@lemmy.world
          1 year ago

          Just keep in mind that FSD is only as safe as they claim because it’s supervised.

          I would hope that even a reasonably working system would be better with a human vigilantly watching it than a human driving regularly.

          The system would have to be really bad to be worse than that.

        • Zink@programming.dev
          1 year ago

          But does FSD change the logic for the lane keeping and the speed & distance?

          Isn’t one of the features “Navigate on Autopilot”?

          • psud@lemmy.world
            1 year ago

            It is quite different. Navigate on Autopilot is lane keeping, cruise control, and automatic highway exits. FSD tries to do all driving tasks: turns at stop signs and at lights, keeping to the correct side on roads with no centre line, negotiating with oncoming traffic on narrow roads…

            • Zink@programming.dev
              1 year ago

              Yeah it adds more capabilities for sure. But if you are on a moderate to high speed road where autopilot works fine, then is the control logic any different?

              Obviously there are various types of accidents that Autopilot would never get the chance to cause, like maybe turning right at an intersection and hitting a pedestrian. But do they act differently on a main road, where Teslas have done things like run into tractor trailers?

              • psud@lemmy.world
                1 year ago

                The one that hit a tractor trailer was years ago. They are far better now; in particular, they can see low contrast objects, and that’s on Autopilot. The biggest difference to the user will be the ability to have hands off the controls.

                It isn’t the same though. FSD is written completely differently from Autopilot. It’s a different program.

                Other accidents it won’t have on those roads include the driver falling asleep and running off the road, or being surprised by someone braking ahead and running into them.

                I’m sure it will be worse than humans around animals on the road. I wonder if it will see a wombat before it hits it.

    • Voroxpete@sh.itjust.works
      1 year ago

      I don’t need to provide you with evidence that FSD has caused crashes. There’s plenty; if you can’t find it you’re not looking.

      As to your point about accident statistics, that’s responding to a different point than the one I was making. I didn’t say that it kills people more often than they kill themselves (through dangerous, inattentive or reckless driving). I just said that it regularly kills people. There’s potentially some hyperbole there; you can quibble over definitions of “regularly” if you want to be a pedant, but I really don’t care.

      The point is that when it does go wrong, it often goes spectacularly wrong, such as this case where a Tesla plowed into a truck or this thankfully low speed example of a very confused Tesla driving into oncoming traffic.

      Could a human make these errors? Absolutely. But would you, as a human, want to trust yourself to a vehicle that is capable of making these kinds of errors? Are you happy with the idea of possibly dying because the machine you’re in made one critical error? Perhaps an error that you yourself would not have made under the same circumstances?

      A lot of people will answer “yes” to that, but for me personally any autopilot that requires constant supervision to make sure it doesn’t kill me is more of a negative than a positive. Even if you try to pay attention, automation blindness will inevitably kick in. And really, what is even the point of self driving if you have to keep paying attention? If it’s not freeing you up to focus on other things, it might as well not be there at all.