
Tesla’s autonomous cars are facing speedbumps – Should we give up hope?

When your autonomous technology has been involved in three fatal crashes, it is hard to stay hopeful about its future.

- The first-ever autonomous Tesla crash occurred in 2016
- Drivers seem to misunderstand the true capabilities of Tesla's Autopilot mode, which is only a "driver assistance tool"
- Critics believe that a lot of the blame for the crashes lies in Tesla's hands

Tesla’s autonomous cars have been involved in multiple fatal crashes in just a few years, and critics are calling for accountability.

Is it already over for the company’s self-driving aspirations?  

Why do Teslas on Autopilot keep crashing?

Just last week, details emerged about the third fatality linked to Tesla's autonomous technology.

In a preliminary report released May 15th, the US National Transportation Safety Board (NTSB) said that 50-year-old driver Jeremy Banner engaged Autopilot about 10 seconds before colliding with a semi-truck trailer that was crossing his path, The Verge reported. According to the NTSB, he was driving at 68 mph (109 kph), and from roughly 8 seconds before the crash until impact the car did not detect his hands on the steering wheel, suggesting he let go almost immediately after toggling Autopilot on.

When software engineer Wei Huang died in another Autopilot-related Tesla crash in March 2018, a similar sequence had played out inside the car. Huang had engaged Autopilot, and his hands were not detected on the wheel for the six seconds before the crash, after the vehicle had given him several visual warnings and one audible warning earlier in the drive to put his hands back on the wheel.

(Video by Tesla showing the capabilities of its self-driving cars)

Tech news site Wired notes: “Tesla says the logs in the car’s computer show Autopilot was on, with the adaptive cruise control distance set to the minimum. The car stays in its lane and a fixed distance from the vehicle ahead, but the driver is supposed to keep his hands on the wheel and monitor the road, too.”

“Take your hands off the wheel for too long, and you get a visual warning, on the dashboard,” Wired continued. “Ignore that, and the system will get your attention with a beep. If you’re stubborn or incapacitated, the car will turn on its flashers and slow to a stop.”
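For illustration only, the escalation Wired describes can be sketched as a simple ladder keyed to how long the driver's hands have been off the wheel. The thresholds below are invented for the example; Tesla does not publish its exact timings, which also vary with speed and road conditions.

```python
from enum import Enum, auto

class Alert(Enum):
    NONE = auto()
    VISUAL = auto()     # dashboard warning
    AUDIBLE = auto()    # beep
    SAFE_STOP = auto()  # flashers on, car slows to a stop

# Hypothetical thresholds in seconds of continuous hands-off driving;
# the real values are not public and depend on driving context.
VISUAL_AFTER = 30
AUDIBLE_AFTER = 45
STOP_AFTER = 60

def alert_level(hands_off_seconds: float) -> Alert:
    """Map continuous hands-off time to the escalating alerts described above."""
    if hands_off_seconds >= STOP_AFTER:
        return Alert.SAFE_STOP
    if hands_off_seconds >= AUDIBLE_AFTER:
        return Alert.AUDIBLE
    if hands_off_seconds >= VISUAL_AFTER:
        return Alert.VISUAL
    return Alert.NONE

print(alert_level(50))  # Alert.AUDIBLE
```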

The first-ever Autopilot Tesla crash, also the first autonomous car-related fatality in history, took place in 2016 and involved a 40-year-old man named Joshua Brown. The US National Highway Traffic Safety Administration (NHTSA) similarly found that Brown was not paying attention to the road when his Tesla Model S collided with a truck. He was also believed to be speeding, at 74 mph (119 kph) in a 65 mph zone. Tesla was exonerated at the time.

Culpability on both sides?

After three eerily similar crashes, it seems that Tesla’s self-driving software wasn’t necessarily to blame, especially given that “the car’s manual reminds Tesla drivers that Autopilot is a driver assistance tool, not a replacement, and that they retain responsibility for driving safely,” Wired stated. “The big center screen conveys the same message when you engage Autopilot for the first time.”

At this stage, Autopilot is not truly self-driving software. It still requires human input for certain split-second decisions, even if the system can lull drivers into overconfidence in its capabilities.

Dozens of micro-decisions, both navigational and ethical and of varying weight, go into driving a vehicle on your everyday commute. Do you swerve right onto a sidewalk to avoid crashing into a truck that’s driving illegally down your lane, saving yourself but killing several pedestrians, or do you hit the brakes and hope for the best? It’s a classic debate about computer versus human decision-making and ethics, and one we can’t hope to settle anytime soon.

Tesla critics: The company has “been using human drivers as guinea pigs”

On the other side of the debate are Tesla's critics, who are calling for the company to be held accountable.

David Friedman, acting head of the NHTSA in 2014 and current vice president of advocacy for Consumer Reports, told the Washington Post: “Tesla has for too long been using human drivers as guinea pigs. This is tragically what happens. There are multiple systems out on the roads right now that take over some level of steering and speed control, but there’s only one of them that we keep hearing about where people are dying or getting into crashes. That kind of stands out.”

Tesla needs a better system to more quickly detect whether drivers are paying attention and warn them if they are not, Friedman said. 

Others dismiss Tesla's Autopilot software outright as not self-driving at all.

“Vehicles that don’t have Lidar, that don’t have advanced radar, that haven’t captured a 3-D map are not self-driving vehicles,” Ken Washington, Ford’s chief technology officer, said during a recent interview with Recode. “They are great consumer vehicles with really good driver-assist technology.”

LIDAR, which stands for Light Detection and Ranging, is a technology similar to radar that uses pulses of light instead of radio waves to gather information about the surrounding environment. Nearly every company building self-driving cars uses LIDAR in its vehicles, save for Tesla.
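As a rough illustration of the ranging principle (not any particular vendor's implementation), distance is derived from the time a light pulse takes to travel to an object and back: distance equals the speed of light times the round-trip time, divided by two.

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def lidar_range(round_trip_seconds: float) -> float:
    """Distance to a target given the round-trip time of a reflected light pulse."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after about 200 nanoseconds indicates a target roughly 30 m away.
print(round(lidar_range(200e-9), 1))  # ~30.0
```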

As The Detroit News explains, Tesla’s tech is “different from the self-driving systems being built by nearly every other company in the industry, including Google spinoff Waymo, General Motors’ Cruise Automation, and Ford-affiliated Argo AI. They all use cameras and radar covering 360 degrees, and also have light beam sensors called Lidar to the mix as a third redundant sensor, as well as detailed three-dimensional mapping.” 

Robotaxi dream scrapped?

(Tesla CEO Elon Musk at the company's Autonomy Day)

Tesla CEO Elon Musk stated last month, during the company’s Autonomy Day, that Tesla wants to venture into robotaxis by next year.

“From our standpoint, if you fast forward a year, maybe a year and three months, but next year for sure, we’ll have over a million robotaxis on the road,” Musk declared. “The fleet wakes up with an over the air update; that’s all it takes.”

A few weeks after that event, with news of yet another fatality tied to a Tesla on Autopilot, those bold claims look even harder to achieve, even if Tesla wasn't necessarily responsible for the crashes. The company will need to quell public distrust with more modest ambitions and more concrete safety measures.