Yesterday, during a Senate hearing, senators criticized Tesla (TSLA) for safety protocol lapses in its Autopilot feature. Stricter safety rules could further delay the release of fully autonomous cars from Tesla and others. We'll discuss this in detail, but first, let's look at the Uber self-driving crash that led to the recent investigations and tighter rules around self-driving.
Tesla autopilot car concerns after Uber accident
Fully functional self-driving or autopilot cars are still a few years off, not only because of software flaws and other technical issues but also because of concerns about vehicle safety. We highlighted in Uber at Fault for Self-Driving Car Crash, Other Problems? that the NTSB (National Transportation Safety Board) partly blamed Uber (UBER) for an accident involving a self-driving car in March 2018.
In that accident, a 49-year-old woman was killed while walking her bicycle across the street. It was the first pedestrian death involving a self-driving vehicle. The crash made companies testing autonomous cars more cautious, and it signaled that self-driving cars from Tesla and others are still some time away from public roads.
After the Uber incident, the Senate gave Tesla (TSLA) heat for safety lapses in its Autopilot feature. CNBC reported that several senators came down hard on Tesla and Uber yesterday during a Commerce Committee hearing on self-driving vehicles.
Tesla autopilot “cheats”
There are concerns about Tesla's Autopilot features. One is the "cheats" that Tesla drivers can use to trick Autopilot into believing they are alert. Senator Ed Markey said, "Tesla drivers have identified a variety of tricks to make autopilot believe they are focused on the road even if they are literally asleep at the wheel." He added that in one such cheat, drivers can attach a weight to the steering wheel, such as a water bottle or an orange, and the car will keep driving.
During the company's Q3 earnings call, Tesla's CEO Elon Musk said, "There's the car being able to be autonomous, but requiring supervision and intervention at times." Pointing to how easily the Autopilot system can be manipulated, Markey said, "That's not safe! Somebody's gonna die!"
The Senate has already sent a formal letter asking the company to fix these issues with the Autopilot design. Under the NHTSA's framework, such safety self-assessments are voluntary. CNBC reported that Senator Tom Udall criticized this approach, saying, "The self-certification approach did not work out well for the Boeing 737 MAX 8 and now Boeing is paying the price." The fatal crash of Ethiopian Airlines Flight 302, a Boeing (BA) 737 MAX 8, on March 10 led to the grounding of the model worldwide.
Autopilot automakers and tech companies take the lead
Several companies have joined the self-driving bandwagon because of its potential. In addition to automakers, tech companies are taking the lead in this space. Alphabet's (GOOG) self-driving car project, Waymo, has already built robotaxis, which are being tested in several US cities. Waymo is also partnering with Lyft (LYFT) in some locations.
Besides Tesla, Apple (AAPL) is also quietly making progress on self-driving car technology and acquired Drive.ai to boost those efforts. Uber (UBER) announced in April that it was working with Volvo to build self-driving cars, and automakers such as Ford (F) and General Motors (GM) are investing heavily in the technology. In October, Tesla acquired DeepScale, a computer vision start-up, to advance its autonomous driving goals.
Safety rules for autopilot cars
In August, several companies asked regulators to update laws to allow fully self-driving vehicle testing. However, a lot of work still needs to be done on safety protocols, so fully autonomous cars from Tesla and others appear to be a few years away. Meanwhile, the self-driving race between companies is only expected to heat up.