Federal regulators are investigating Tesla’s Autopilot feature to determine whether it played a role in two recent crashes. Tesla has branded Autopilot as a safety feature and an incremental step toward self-driving cars. But it doesn’t fulfill the promise implied by the term “autopilot” – by definition, a technology that can drive itself in place of a person – and the government could take issue with that contradiction.
In a blog post after the crashes — one of which is the first known self-driving car fatality — Tesla referred to Autopilot as a “driver assistance system,” not autonomous vehicle technology. But since its inception, Tesla has promoted Autopilot as technology that “relieves drivers of the most tedious and potentially dangerous aspects of road travel” and can reduce the risk of crashes that result from human error. Founder and CEO Elon Musk has personally pushed that narrative.
David Strickland, who ran NHTSA as administrator from 2010 to 2014, told BuzzFeed News, “the agency looks very hard and specifically at the notion of how a manufacturer pulls in a technology, and whether they have taken into account how a consumer may use a product in a way that shouldn’t [be] done.” (Strickland now leads a self-driving car lobbying group that includes Google, Ford, and Uber.)
Drivers who activate Autopilot see an acknowledgement box on the display each time they turn it on, warning that it is “an assist feature that requires you to keep your hands on the steering wheel at all times.” But to Strickland’s point, that is often not how people use it.
Numerous YouTube videos show drivers going hands-free while driving on Autopilot. Musk himself went hands-free during a demo in 2014. And in a complaint filed with the NHTSA earlier this month, a driver reported that a Tesla sales representative told them they did not need to keep their hands on the wheel while driving on Autopilot.
Musk has said Autopilot is named after the aircraft technology because “the intended implication is that a driver must remain alert, just as a pilot must remain alert.” But that is not how autopilot works in plane cockpits. While flying, autopilot allows pilots to go hands-free to focus on other aspects of the flight, like the weather and trajectory. And so the big question is, what does the term lead people to expect?
“It’s the terminology thing. Perhaps Autopilot with an asterisk would’ve been a better marketing technique,” Mark Peters, a 52-year-old American Airlines pilot and Tesla Model S owner, told BuzzFeed News.
Autopilot includes several functions: autosteer, auto lane change, automatic emergency steering, side collision warning, and autopark. The Model S owner’s manual mentions Autopilot just four times, and those mentions are lumped into other sections. Under the section for Traffic-Aware Cruise Control, which is only in cars equipped with Autopilot hardware, there are nine warnings for drivers. They include:
- Don’t depend on the car to determine accurate cruising speeds.
- The technology might not be able to detect all objects that cross its path, so “Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death.”
- “Traffic-Aware Cruise Control may react to vehicles or objects that either do not exist or are not in the lane of travel, causing Model S to slow down unnecessarily or inappropriately.”
- “Depending on Traffic-Aware Cruise Control to slow the vehicle down enough to prevent a collision can result in serious injury or death.”
Tesla Motors Inc. Model S owner's manual
When BuzzFeed News asked a Tesla spokeswoman why the feature is called Autopilot if drivers are told to keep hold of the steering wheel, the company pointed to the notices it gives customers about Autopilot’s limitations.
Strickland, the former NHTSA administrator, said auto manufacturers should consider all the ways consumers could use their products – even uses that go against a company’s recommendations. In 2009, Toyota recalled more than 5 million vehicles after finding, among other factors, that driver-side floor mats could trap the gas pedal and cause unintended acceleration. (Four people died in San Diego, California, after their Lexus sped out of control and crashed into a ravine.) Strickland, who oversaw the agency’s investigation, said part of Toyota’s problem was that people were stacking floor mats on top of each other in their cars, which also contributed to the unintended acceleration.
“They recalled the mats, they put vigorous warnings out there, but people still kept stacking mats,” Strickland said. “You cannot warn your way out of the responsibility of anticipating foreseeable use and abuse.”
A NHTSA spokesman declined to comment when asked whether the agency’s investigation includes looking at how Tesla markets Autopilot to consumers, citing the open crash investigations. The agency is expected to release guidelines for autonomous vehicles this summer.