Toyota is considering skipping the development of so-called Level 3 autonomous vehicles because of the safety and legal ramifications such vehicles present, a top safety and technology executive said.
At issue is the Level 3 “handoff” between a semi-autonomous self-driving vehicle and a possibly inattentive driver, which occurs when a situation arises that the car’s computer cannot interpret or comprehend. If the human in the driver’s seat is not paying attention, he or she may suddenly be forced to make a worse, and potentially lethal, decision than the computer would.
Instead of walking into that minefield, Toyota might proceed from driver-assistance technology, such as smart cruise control, directly to a vehicle that controls every aspect of transport, likely without brakes, a steering wheel, or an accelerator for a human to operate. Whether such a vehicle would be sold to the public as a passenger car or limited to taxis and shuttles on regulated roads has yet to be decided.
“Autonomy is safety-first,” said Kiyotaka Ise, president of Toyota’s advanced R&D and engineering efforts. “The human-machine interface is the biggest concern with Level 3, in terms of the limbo of several seconds during the exchange between the system and human. Going straight to Level 4 may make better sense.”
Although a Level 4 vehicle might retain driver controls, they would be used only in cases such as off-road driving or inclement weather. Level 5 would be a fully autonomous vehicle, with no possibility of human intervention.
Toyota might not be the only automaker looking to avoid Level 3. Already, Ford and Volvo have indicated they likely will skip this step.
But other automakers, such as Tesla and Audi, have aggressively embraced Level 3 development, seeing it as a crucial step before automobiles can become fully self-driving. In some scenarios, rather than bringing the human back into the equation, the car would simply come to a stop if its computer became confused.
Critics contend that Level 3 could put humans in the role of beta testers for unproven technology.
“There is a challenge of handing the car over to the driver when the system cannot drive the car. People need to understand the system’s capabilities and limitations. People should not over-trust safety technology. They should understand the technology correctly,” Ise said in an interview at the Tokyo Motor Show.
Level 3 also presents a legal liability snarl over who is at fault should a semi-autonomous car get into an accident. If the computer was driving, is the automaker at fault, or the driver? Even if the driver was controlling the vehicle, it could be argued that the computer had not warned or engaged the driver sufficiently for him or her to take command of the vehicle.
Automakers are racing to create a supposedly safer self-driving environment. According to Toyota, 1.3 million lives are lost annually in vehicle-related accidents, though many of those deaths occur in countries where the infrastructure for autonomous technology could be decades away. In more advanced countries with stronger connectivity, Level 4 could arrive as soon as the early 2020s, Ise said.
Toyota is also careful not to refer to its systems as autonomous, describing them instead as an “advanced driver-support system.”
Toyota’s current Level 2 system under development, known as Lexus Safety System Plus, involves an advanced version of smart cruise control but requires the driver to remain engaged with the car’s actions.
The software would provide highway driving assistance with following, passing, and changing lanes. It would also be able to tell when a driver is drowsy (even through sunglasses) or has lost alertness and awareness of his or her surroundings, at which point it would rouse the driver from a stupor.
Said Ise: “We need to be 100 percent perfect on these machines.”