An excellent article by Levi Tillemann and Colin McCormick in The New Yorker lays out the advantages Tesla has in the race toward driverless cars. Some, however, think that Tesla is driving recklessly toward that goal. Is Tesla racing toward victory or calamity? The answer hinges on a key issue in human-robot interaction.
Don Norman, the director of the Design Lab at the University of California, San Diego, argues that the most dangerous model of driving automation is the “mostly-but-not-quite-fully-automated” kind. Why?
"Because the more reliable the automation, the less likely the driver will be to respond in time for corrective action. Studies of airline pilots who routinely fly completely automated airplanes show this (as do numerous studies over the past six decades by experimental psychologists). When there is little to do, attention wanders." [Source: San Diego Union-Tribune]
Norman contends that car autopilots are more dangerous than airplane autopilots. Airplanes fly high in the sky, widely separated from one another; their pilots are well trained and have several minutes to respond. Drivers are not nearly as well trained and may have only seconds.
Yet, Tesla’s “Autopilot” follows exactly the “mostly-but-not-quite-fully-automated” model about which Norman warns.
I asked Norman about Tesla’s approach. His response:
"Tesla is being reckless. From what I can tell, Tesla has no understanding of how real drivers operate, and they do not understand the need for careful testing. So they release, and then they have to pull back."
As examples, Norman pointed to Tesla’s highway passing, automatic parking and Summon features.
Norman considers Tesla’s highway passing feature dangerous because its cars do not have sufficient rear-facing sensors. Mercedes and Nissan, he noted, have far better ones.
Consumer Reports found serious issues with Tesla’s automatic parking and Summon features. The car’s sensors had high and low blind spots, so it failed to stop before hitting objects such as duffel bags and bicycles. There were also problems with the user interface: the parking control buttons on the key fob were unmarked, and the car continued to move after the iPhone app was closed. Consumer Reports told its readers, “It is critical to be vigilant when using this feature, especially if you have children or pets.” Tesla fixed these problems once Consumer Reports raised its safety concerns.
Here’s Don Norman’s observation about Tesla’s quick response:
"Good for Tesla, but it shows how uninformed they are about real-world situations.""Tesla thinks that a 1 in a million chance of a problem is good enough. No. Not when there are 190 million drivers who drive 2.5 trillion miles."
If Norman is right, Tesla owners will grow less attentive, rather than more vigilant, as Tesla’s Autopilot software gets better. Situations where their intervention is needed will become rarer but also more time-sensitive and more dangerous.
Indeed, customer experience with Tesla’s early Autopilot software produced a number of reports and videos of silly antics and near-calamities. Here are just a few:
Jump to time mark 2:45 for the near accident:
Be sure to read the background comments from Joey Jay, the uploader of the video:
Jump to time mark 4:00 for a particularly “fun and scary” segment:
If Norman is wrong, Tesla does have a huge advantage, as Tillemann and McCormick note.
Other companies pursuing a semi-autonomous approach, like GM and Audi, have been slower to deploy new models with comparable capabilities.
Google, which advocates a fully driverless approach for the reasons Norman cites, is mired in state and national struggles to remove the regulatory limits on its approach. Even if Google gets the green light, its pace is constrained by its relatively small fleet of prototypes and test vehicles.
Tesla, on the other hand, has a powerful software platform that allows it to roll out semi-autonomous capability now, as it deems appropriate. And it is doing so aggressively: Autopilot is already on more than 35,000 Teslas on the road, and Tesla just announced a promotion offering one-month free trials to all Model S and X owners.
Soon, it will come preinstalled on the more affordable Model 3, of which more than 300,000 have been preordered.
That’s a critical advantage. The quality of autonomous driving software depends in large part on the test cases that feed each developer’s deep learning AI engines. More miles enable more learning, and could help Tesla’s software outdistance its competitors.
The challenge, however, is that Tesla is relying on its customers to discover the problems. As noted in Fortune, Elon Musk has described Tesla drivers as essentially “expert trainers for how the autopilot should work.”
Tom Mutchler wrote in Consumer Reports that “Autopilot is one of the reasons we paid $127,820 for this Tesla.” But, he also noted, “One of the most surprising things about Autopilot is that Tesla owners are willingly taking part in the research and development of a highly advanced system that takes over steering, the most essential function of the car.”
Tesla’s early-adopter customers are willing, even enthusiastic, users of Autopilot. But should untrained, non-professional drivers be relied upon to be ready when Autopilot needs to return control to the human driver? Can they anticipate problems and intervene to retake control without being asked? Will they follow safety guidelines and use Autopilot only under recommended conditions, or will they push the limits as their confidence grows?
Imagine the consequences if a new slew of Tesla owner videos ended with a catastrophic failure rather than a nervous chuckle. It would be tragic for the victims, and for Tesla. It might also dampen enthusiasm for driverless cars in general and derail the many benefits the technology could deliver.
While the mantra in Silicon Valley is “move fast and break things,” Elon Musk needs to reconsider how much that principle should apply to Tesla’s cars and customers.