I've said it over and over: the American automobile transit system is inherently unsafe. That's why I've been a self-driving skeptic. We have a system in which drivers must regularly break the rules and/or drive in an unsafe manner. This involves: exceeding the speed limit when others are driving too fast, entering the oncoming lane to pass someone double-parked, making blind turns because a big truck is blocking your view, slamming on the brakes because there's no way to know when a pedestrian will enter a crosswalk, having to yield when crossing a lane of traffic... the list goes on and on. There isn't an actual safe set of rules, and the rules we have are broken so commonly that a learning algorithm won't follow them.
I'd put this inherent danger at somewhere around 0.1% of driving time: I'd say I'm forced to drive dangerously about once every 10 miles. That may sound rare, but the fact that people must drive dangerously at all to use the system means a machine can't ever learn to use it safely.