Lawmakers have passed a bill that could force Tesla to stop using the term “Full Self-Driving” in California.
The bill requires a signature from the governor and is aimed at marketing around driver assistance programs.
California DMV has accused Tesla of using deceptive marketing practices when advertising FSD.
Tesla may be forced to stop using the term “Full Self-Driving” (FSD) in California.
On Tuesday, California lawmakers passed a bill in the Senate that could make it illegal for the electric carmaker to use the software’s current name. The legislation, which was sponsored by Senate Transportation Committee chair Lena Gonzalez (D-Long Beach), now requires a signature from Governor Gavin Newsom.
A spokesperson for Newsom declined to comment on the bill. Tesla did not respond to a request for comment from Insider before publication.
The bill doesn’t directly target Tesla, but Gonzalez told the Los Angeles Times she believes other automakers, including Ford, GM, and BMW, have been clearer about the limits of their technology.
The bill would not address safety issues surrounding the software, focusing only on its marketing. It would also set new standards for automakers when it comes to explaining the capabilities of driver assistance technology. However, it is unclear how the rule would be enforced, as the California Department of Motor Vehicles (DMV) would be responsible for taking action against Tesla.
Lawmakers passed the bill just weeks after the state DMV accused Tesla of using deceptive marketing to advertise its Autopilot and FSD software.
The Autopilot software acts as a driver assistance feature that allows a Tesla to automatically steer, accelerate, and brake within its lane, while FSD is an optional add-on that can change lanes and stop at traffic lights and stop signs. Tesla has told drivers that both features require a licensed driver behind the wheel who is ready to take over at any time.
Some regulators fear that the marketing around the software could give drivers a false sense of security. Last year, a man was arrested for riding in the back seat while using FSD on the highway. In June, the National Highway Traffic Safety Administration said it had expanded its Autopilot investigations, including the software’s possible role in several fatal car accidents.
Gonzalez said she and other lawmakers moved to draft the bill after the state DMV was slow to enforce its rules banning the advertising of cars as “self-driving” if they are not truly autonomous. Tesla’s FSD is currently classified as a level two driver assistance system and is in beta testing. The software has more than 100,000 subscribers who test it on public roads in real time, letting the system’s AI learn from experienced drivers.
“Are we just going to wait for another person to be killed in California?” Gonzalez told the publication. “People in California think Full Self-Driving is fully automated when it isn’t,” she added.
While there have been investigations into accidents in the state involving Tesla’s Autopilot, it’s unclear whether a driver in California has died while using FSD.
Tesla coined the term “Full Self-Driving” in 2016. Elon Musk has repeatedly predicted, since as early as 2015, that Tesla would achieve autonomous driving. Most recently, in July, he said FSD would move out of beta testing by the end of the year. But FSD beta testers have repeatedly pointed to bugs in the software.
Most recently, a Tesla critic launched a viral ad campaign showing the software failing to recognize a child-sized doll in the street and striking it. Earlier this month, Musk scolded a driver on Twitter who shared a video of the software struggling with cornering and lane changes.
Do you work at Tesla or own one of its electric cars? Contact the reporter via a non-work email at firstname.lastname@example.org
Read the original article on Business Insider