The terminology is misleading, I agree. Having followed Tesla very closely over the last two years, I only focused on what Tesla specifically explains "Autopilot" is today. I just realized, by reading this thread, that "Autopilot" may in fact suggest that the car drives itself now, in every imaginable way. That will eventually be the case, but it is not the case yet.

Quote:
If a car is actually self-driving, but it is not sufficiently good that the driver becomes a passenger, then is it actually self-driving?

I would say definitely not. In principle, a car is fully self-driving (meaning 100% so) when it drives as well as or better than humans in all possible conditions. While it may be hard to actually assess that, in principle it seems to me a reasonable and simple definition. It also seems to be the meaning Tesla assigns to a fully self-driving car everywhere they mention the idea.

Quote:
To me it is not at all clear that the current approach to self-driving will inevitably lead to completely self-driving (autonomous) capable.

That is their goal, as Elon Musk has stated repeatedly. He claimed they'll succeed in this, from a technological perspective, in 3-5 years, and definitely in no more than 10. That does not mean autonomous cars will be legal at that point, though. Statistics will need to show that cars actually drive themselves better than humans in all conditions before institutions make it legal for a human to give up control. At that point, a number of other things would need to change as well: driver training, for example, and codes and regulations.
Until then, humans are responsible for what their cars do.

Interesting related point: Tesla claims that, technologically, a real self-driving car is a solved problem. All they need TODAY is more computing power and improved sensors in some areas. In other words, it is just a matter of when, not how.
This is just FYI, not to say one must necessarily believe Tesla. I personally do believe they will succeed, but that is, of course, just my best educated guess.

Quote:
Effectively the Tesla style approach uses the driver as the fail safe 'airbag' for the technology.

Honestly, I don't think so. They never said such a thing, or even suggested it, although admittedly using the "Autopilot" name for their technology may be misleading. Tesla keeps repeating the opposite: that humans should do the driving in real life, outside of testing environments. You're even required to keep your hands on the steering wheel, or the car will progressively slow down to a stop.
This is precisely because of some of the limits of the technology you point out.

Please also note that more computing power and better sensors will, I think, address your concerns about roads.
As for the mix of human-vehicle intervention, I am not sure I share your concern. There MAY be issues the moment we start to rely on self-driving in the future, but it would be interesting to see what interface/interaction designs emerge to address them. Keep in mind that we have already been living in a world where humans and machines share control, in varying degrees, for decades. Even a completely mechanical car is a layer between you and the road that you largely do not control, and yet we successfully rely on it every day. I think you're deliberately drawing a line in a world where machine control and human control already fade into each other. No?
_________________________
= Taym =
MK2a #040103216 * 100Gb *All/Colors* Radio * 3.0a11 * Hijack = taympeg