By Akash Sriram and Abhiroob Roy
(Reuters) – A self-driving Tesla (NASDAQ:TSLA) carrying a passenger for Uber (NYSE:UBER) collided with an SUV at an intersection on the outskirts of Las Vegas in April, an accident that raised new concerns that a growing fleet of self-styled Tesla “robotaxis” is exploiting a regulatory gray area in US cities, putting people’s lives at risk.
Tesla CEO Elon Musk aims to showcase plans for a robotaxi, or self-driving car used for ride-hailing services, on October 10, and has long considered creating a Tesla-operated taxi network of self-driving vehicles owned by individuals.
However, do-it-yourself versions are already catching on, according to 11 ride-hailing drivers who use Tesla’s Full Self-Driving (FSD) software. Many say the software, which costs $99 a month, has limitations, but that they use it because it reduces fatigue, allowing them to work longer hours and earn more money.
Reuters was the first to report on the Las Vegas accident, the related investigation by federal safety officials, and the widespread use of Tesla’s autonomous software by ride-hailing drivers.
While beta versions of self-driving taxis with human backup drivers from robotaxi operators like Alphabet (NASDAQ:GOOGL)’s Waymo and General Motors (NYSE:GM)’s Cruise are strictly regulated, state and federal authorities say Tesla drivers alone are responsible for their vehicles, whether or not they use driver-assistance software. Waymo and Cruise use test versions of software rated as fully autonomous, while Tesla’s FSD is rated at a level that requires driver monitoring.
The other driver in the April 10 Las Vegas accident, who was taken to the hospital, was faulted for failing to yield the right of way, according to the police report. Tesla driver Justin Yoon said on YouTube that Tesla’s software failed to slow his car even after the SUV emerged from a blind spot created by another vehicle.
Yoon, who posts videos on YouTube under the banner “Project Robotaxi,” was in the driver’s seat of his Tesla with his hands off the steering wheel when it entered the intersection in a Las Vegas suburb, footage from inside the car shows. With FSD engaged, the Tesla was traveling at 46 mph (74 km/h) and initially did not register an SUV crossing the road in front of Yoon. At the last moment, Yoon took control and turned the car so that it struck the SUV only a glancing blow, the footage shows.
“It’s not perfect, it’ll make mistakes, it will probably continue to make mistakes,” Yoon said in a video after the incident. Yoon and his passenger suffered minor injuries and the car was totaled, he said.
Yoon discussed the use of FSD with Reuters before he posted videos of the incident publicly but did not respond to requests for comment afterward.
Tesla did not respond to requests for comment. Reuters was unable to reach the Uber passenger and the other driver for comment.
Ride-hailing companies Uber and Lyft (NASDAQ:LYFT) responded to questions about FSD by saying that drivers are responsible for safety.
Uber, which said it had been in contact with the driver and passenger in the Las Vegas crash, cited its community guidelines: “Drivers are expected to maintain an environment that makes passengers feel safe, even if their driving practices do not violate the law.”
Uber also cited Tesla’s instructions, which alert drivers using FSD to keep their hands on the wheel and be ready to take over at a moment’s notice.
“Drivers agree that they will not engage in reckless behavior,” Lyft said.
Big ambitions
Musk has big plans for self-driving software based on the FSD product. The technology will serve as the foundation of the planned robotaxi program, and Musk envisions a Tesla-operated autonomous ride service using vehicles owned by its customers when they are not otherwise in use.
But drivers who spoke to Reuters also described serious flaws in the technology, including sudden, unexplained acceleration and braking. Some have stopped using it in complex situations such as airport pickups, navigating parking lots and construction zones.
“I use it, but I’m not completely comfortable with it,” said Sergio Avedian, a ride-hailing driver in Los Angeles and a senior contributor to “The Rideshare Guy” YouTube channel, an online community of ride-hailing drivers with nearly 200,000 subscribers. Avedian avoids using FSD while carrying passengers. Based on his conversations with fellow drivers on the channel, he estimates that 30% to 40% of Tesla ride-hailing drivers across the United States use FSD regularly.
FSD is classified by the federal government as a type of partial automation that requires the driver to be fully engaged and attentive while the system steers, accelerates and brakes. It has come under increased regulatory and legal scrutiny, with at least two fatal accidents involving the technology. But using it for ride-hailing trips is not against the law.
“Ride-sharing services allow these partial automation systems to be used in commercial environments, and that’s something that should face significant scrutiny,” said Jake Voss, an analyst at Guidehouse Insights.
The US National Highway Traffic Safety Administration said it was aware of Yoon’s accident and had reached out to Tesla for additional information, but did not respond to specific questions about additional regulations or guidelines.
Authorities in California, Nevada and Arizona, which oversee ride-hailing and taxi companies, said they do not regulate the practice because FSD and other similar systems fall outside the scope of taxi or autonomous-vehicle regulation. They did not comment on the accident.
Uber recently enabled its software to send riders’ destination details to Tesla’s dashboard navigation system, a move that helps FSD users, wrote Omar Qazi, an X user with 515,000 followers who posts under the handle @WholeMarsBlog and often gets public replies from Musk on the platform.
“This will make it easier to take Uber rides on FSD,” Qazi said in a post on X.
Tesla, Uber and Lyft have no way of knowing whether a driver is both working for a ride-hailing company and using FSD, industry experts said.
While almost all major automakers offer some version of partial automation technology, most are limited in their capabilities and restricted to highway use. Tesla, on the other hand, says FSD helps the car drive itself almost anywhere with active driver supervision but minimal intervention.
“I’m glad Tesla is doing this and is able to make it happen,” said David Kidd, a senior research scientist at the Insurance Institute for Highway Safety. “But from a safety standpoint, it raised a lot of hairs.”
Instead of new regulations, Kidd said the National Highway Traffic Safety Administration should consider providing basic, non-binding guidelines to prevent misuse of such technologies.
Any federal oversight would require a formal investigation into how ride-hailing drivers use all driver-assistance technologies, not just FSD, said Missy Cummings, director of the Center for Autonomy and Robotics at George Mason University and a former adviser to the National Highway Traffic Safety Administration.
“If Uber and Lyft were smart, they would have gotten ahead of it and banned it,” she said.
Meanwhile, ride-hailing drivers want more from Tesla. Kaz Barnes, who has made more than 2,000 trips with passengers using FSD since 2022, told Reuters he looked forward to the day when he could get out of the car and let Musk’s network send it out to work.
“You’re going to take the training wheels off,” he said. “I hope I can do that with this car someday.”