I have no confidence that Tesla will fix this before the planned Robo-Taxi rollout in Austin in 2 weeks.
After all, they haven’t fixed it in the last 9 years that self-driving Teslas have been on the road.
Waymo is really interesting. You probably wouldn't guess it, but I'm a cautiously optimistic autonomy person! Waymo is already 12x safer than human drivers; that's brilliant, I love that.
Teslas will (allegedly) start on a small, low-complexity street grid in Austin; the exact size is TBA. Presumably, they're mapping the shit out of it and throwing compute power at analyzing their existing data for that postage stamp.
The rub is that all of this points out the obvious danger of the wild-west FSD that Tesla drivers are currently employing everywhere else. If it's safe enough to trust to drive your car for you, why does it need a ton of additional guardrails to operate without a safety driver?
Yeah, it's scary to think about. There should be laws, though, that you're still 100% at fault if you weren't driving during an accident. I imagine another issue with FSD is the government having a backdoor into your car to immobilize you or whatever they want. Part of me is in favor of that, but of course that's a huge responsibility that can be abused.
The fun part is Tesla FSD shuts off just before accidents, so you're always the one at fault.
You would be the one at fault in most states anyway, as long as you're technically operating the car. They do that mostly to shield themselves from potential lawsuits by their customers.
So you're saying it's good at anticipating accidents, haha.
Lol where are the Tesla fanboys insisting that geofencing isn’t useful for developing self driving tech?
Can Waymo sometimes use a remote human driver?