Originally Posted By: Taym
Originally Posted By: mlord
I think what he meant was that most of the car features related to this are to detect critical situations, warn the driver of them, and react autonomously (eg. emergency braking) if the driver fails to respond.


Precisely.

Current Autopilot will eventually lead to actually self-driving cars, of course. But even when the vehicle CAN drive itself, that will not imply that the driver becomes a passenger and that accidents are considered legally (or morally?) caused "by the car". That would be a further step in the future, when society decides we can rely entirely and only on technology to control a vehicle...
Perhaps the terminology contributes to the confusion. If a car is nominally self-driving, but not good enough that the driver can become a passenger, is it actually self-driving?

My point was this: if the car is not fully self-driving capable, that is, it requires the driver to remain situationally aware and ready to take control at any moment, yet the driving assistance is quite good and continues to improve, then the problem of driver inattention and non-readiness to take control grows as the car's ability gets better and actual control handoffs to the human driver become less frequent.

To me it is not at all clear that the current approach to self-driving will inevitably lead to fully self-driving (autonomous) capability. Effectively, the Tesla-style approach uses the driver as the fail-safe 'airbag' for the technology. When the technology runs up against situations where it cannot continue driving with confidence, it falls back on the human as backup, an escape route for the tech.

The gap between 99% (or whatever) capable and 100% capable (driver as passenger) may prove to be a wide chasm. Much of this comes from the roads themselves, which do not have anywhere near the design, configuration, and supporting technology needed to provide highly reliable information to the car. Another portion comes from the mix of human-piloted vehicles and vehicles with varying degrees of automation. And yet another missing piece is technology on all the other cars to provide reliable information about the location, configuration, and intentions of those other vehicles.

Some of the methods utilized in Tesla-type self-driving cars amount to 'cheat codes', such as following the tail lights of the car ahead in poor weather. We have all done this when driving at speed in poor visibility, and we know there is a real risk that the driver in front goes the wrong way (for us) and we follow them onto a bad vector.

Imagine that the car in the linked video (audio can be muted) was being followed by a 'self-driving' car in poor weather at night. Could the self-driving car simply follow the first car into the water? Would it suddenly hand off control to the human driver halfway down the embankment?

https://youtu.be/4NJmB1F2mdE

Construction, temporary road changes, road damage (which can occur suddenly), non-standard road configurations, and other weird problems can conflict with GPS-mapped routes and 'normal' visual driving cues.

https://youtu.be/9S26YPzLiDQ

https://youtu.be/u3_5qYBk6Cw


Edited by K447 (04/07/2016 14:53)