@JPWhite: “Certainly the Model S knows the speed limits of most roads and could be set to comply with regulations when in autonomous mode.”
I won’t disagree with this statement. However, we are still early in the development of autonomous systems, and I believe Musk in particular is trying not to insult the intelligence of the average driver by planting super-nannies into his programming. Just imagine if he had started out by saying, “Autopilot will never allow you to drive over the speed limit.” How would people have reacted to that? We’ve got enough griping already about how much control has been taken out of our hands by federally-mandated nannies, and quite honestly I’m one of those griping, because I’ve personally experienced how their function *can* be counter-productive. There will always be a need for a manual override, and sometimes you just don’t have the time to punch buttons when the need arises.
So Musk has tried to give the driver the benefit of the doubt from the beginning, and one way or another it’s been biting him in the tail ever since. Take the so-called ‘wonky wheels’, where suspension parts break–supposedly arbitrarily, but in every case obviously from the massive torque and maneuvering stresses of people trying to drive a three-ton behemoth like a one-ton rally car. Or take Autopilot, which tries to make the car perform completely autonomously under conditions where the computer can only barely hold the lane, and in which, off the expressway, even a human driver can’t see what’s in front of his hood, much less a radar aimed toward outer space. As student drivers we were all supposedly taught about blind curves, and about even blinder hills which may have a curve at or just over the crest. GPS can help the car prepare, but without physical experience and an AI capable of learning from that experience, the first time over the hill could be its last.
I’ve mentioned Google’s cars before, and an episode of Top Gear UK a few years ago demonstrated an autonomous truck using similar technology. The drawback with that system is that the current implementation of Lidar requires a physically rotating sensor head and needs several scans before it can generate a picture of the area around the vehicle. Quite bluntly, it needs to build that picture far more quickly before that form of autonomy is really highway-worthy. Optical cameras get the picture more quickly, but on their own they lose the three-dimensional aspect of the image, which has to be supplemented by radar and/or ultrasonics. Google is approaching it one way; Tesla another. The best system will need to be a compromise and a merging of the two. I would think a scanning Lidar doing a vintage-television-style raster scan to the front, with camera and radar support, would speed both the responsiveness and the clarity of the Lidar in the direction of travel, while other sensors could effectively cover the sides and rear of the car. Something almost like KITT’s scanning light array on the nose, only operating at a much higher scan rate, is what comes to mind. Even the Cylons of the original Battlestar Galactica emulated the concept, though theirs was intended more for visual effect than any functionality for television.
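To put some rough numbers on why refresh rate matters for the forward view, here’s a back-of-the-envelope sketch. All the figures (a 10 Hz spinning head, a hypothetical 40 Hz forward raster) are illustrative assumptions, not specs of any real sensor:

```python
# Toy comparison: how far the car travels between full refreshes of its
# forward picture, for two hypothetical scan strategies. The scan rates
# below are assumptions for illustration, not real hardware specs.

def travel_between_scans(speed_mps: float, scan_rate_hz: float) -> float:
    """Distance in meters covered while waiting for one full scan."""
    return speed_mps / scan_rate_hz

HIGHWAY_SPEED = 31.0  # roughly 70 mph, in meters per second

# Hypothetical spinning Lidar: the full 360-degree picture refreshes
# only as fast as the head rotates (assume ~10 Hz).
spinning_gap = travel_between_scans(HIGHWAY_SPEED, 10.0)

# Hypothetical forward raster scan: a narrow forward sector swept much
# faster (assume ~40 Hz), with cameras and radar covering sides and rear.
raster_gap = travel_between_scans(HIGHWAY_SPEED, 40.0)

print(f"spinning lidar: {spinning_gap:.2f} m traveled per refresh")
print(f"forward raster: {raster_gap:.2f} m traveled per refresh")
```

At highway speed the spinning head leaves the car traveling several meters blind between refreshes, while a faster forward-only sweep cuts that gap to well under a meter, which is the whole argument for concentrating the Lidar’s attention in the direction of travel.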
The United States has become an incredibly litigious society. It has also become an instant-gratification society (and we are hardly alone in that). Our TV commercials exemplify this very clearly: “It’s My Money and I Want It NOW!” Certain political and corporate entities have worked to make us all a nation of spoiled brats. I don’t intend to make this a political argument, but the simple point is that AI is coming, and because of all these ‘spoiled brats’, autonomy is not only inevitable but will eventually become mandatory, simply to stop us from killing ourselves and others with our careless behavior.
Source: http://www.thetruthaboutcars.com/2016/07/tesla-faces-backlash-autopilot-technology-wake-crash/