A crew loads the wreckage of Asiana Flight 214 onto a truck at San Francisco International Airport in 2013. The crash was later blamed in part on an undue reliance on the jetliner’s automated systems.
The Wall Street Journal
By Daniel Michaels and Andy Pasztor
July 27, 2016 5:30 a.m. ET
Aviation-safety experts have advice for the car industry as it moves to autonomous-driving vehicles: Proceed slowly and make sure drivers realize the limits of the technology.
As airline pilots became increasingly reliant on automation over the years, the focus shifted to simplified cockpit displays and enhanced crew training.
Many high-end cars now offer sophisticated cruise-control and collision-avoidance systems that can maneuver in traffic and on highways without driver commands. From automatic braking to steering assist, manufacturers rely on them as marketing tools.
But several accidents involving Tesla Motors Inc. cars with such technology, including a fatal crash in Florida, have federal highway-safety regulators investigating the company’s onboard automation. Some aviation-safety experts warn that existing automotive controls shouldn’t be considered mature autopilot systems.
“It’s quite ridiculous we would give somebody such a complex vehicle without training,” said former commercial pilot Shawn Pruchnicki, who teaches air safety at Ohio State University. It is a mistake, he added, “to assume that the general public is going to be able to jump in and understand what [the] limitations are.”
Early autopilots originated in airplanes around World War I, but it wasn’t until 1947 that a U.S. Air Force plane made the first trans-Atlantic flight using hands-off flying controls. Such features evolved and only became widespread gradually, allowing individual elements to be rigorously tested and verified before plane makers integrated them into complex flight-management systems.
Today, jumbo jets and supersonic fighters rely on a web of computerized equipment and software to get airborne, cruise and even land with little or no human input.
A key lesson from aviation is that reducing risks takes time.
“We have to create perfection in the collision-avoidance systems before we have the car drive itself,” said Mark Rosenker, a former chairman of the National Transportation Safety Board who dealt with safety issues affecting roads, rails and aircraft. So far, “there are nuances to these technologies that we have not yet perfected,” he noted, such as the difference between avoiding a car in front and detecting when a driver is veering off the road.
Tesla Chief Executive Elon Musk has rejected calls to disable his company’s autopilot function, which steers cars more actively than those installed in other brands. But he has embraced one of the lessons from aviation: the importance of training users to interact with complicated automation. Tesla has vowed to step up efforts to educate customers about the way its autopilot works.
The push toward self-driving vehicles has a long way to go to match advances in commercial planes. In theory, many jetliners can fly autonomously, except for taxiing to the gate and shutting off the engines.
Some computerized safeguards are designed to kick in automatically during particularly dangerous types of emergencies, including engine failure during takeoff.
Airplane automation can also be calibrated to different levels, based on crews’ preference. That is supposed to keep pilots more aware and engaged. Airlines also require at least two pilots in cockpits, and the one not manipulating the controls is trained to monitor the automated systems.
But psychologists and automation experts say people tend to be poor monitors, whether they are behind the steering wheel or flying in an aluminum tube miles above the Earth. So airliners have extensive visual and aural warnings in case something goes wrong.
For cars, it is essential to develop auto-drive systems sophisticated enough to recognize when a vehicle is swerving between lanes or performing other dangerous maneuvers, according to Martin Chalk, an Airbus A380 pilot and president of the International Federation of Air Line Pilots’ Associations.
More important, according to Mr. Chalk, full-blown automation must be able to intercede to enforce highway rules and keep track of speed limits.
Automotive systems, however, need to avoid inundating operators with nonessential data, according to Yannick Malinge, product safety chief for Airbus Group SE: “A key point is to give information to the driver that he or she needs to have, not what simply would be nice to have.”
Mr. Malinge emphasized that Airbus has devised its flight-management computers so that during each phase of flight, they prominently display just the most relevant information. The system knows “what information you will need under various circumstances, and what isn’t essential” to show pilots.
To be sure, aviation has suffered from pilots relying unduly on cockpit automation. High-profile accidents include an Air France Airbus A330 that crashed into the Atlantic in 2009 after pilots were confused by automated warnings, and a perfectly functioning Asiana Airlines Boeing 777 that slammed down short of a San Francisco runway in clear weather four years later because the crew failed to monitor airspeed.
Drivers, of course, can’t depend on simulators or a co-pilot for help. But Tesla advises drivers to stay alert and keep their hands on the wheel in case they need to take over unexpectedly.
Original article can be found here: http://www.wsj.com