The U.S. Justice Department has requested documents from Tesla related to its Autopilot and “Full Self-Driving” features, according to a regulatory filing.
“To our knowledge no government agency in any ongoing investigation has concluded that any wrongdoing occurred,” Tesla said in the filing Tuesday with the Securities and Exchange Commission.
The Austin, Texas, electric vehicle maker cautioned that if the government decides to pursue an enforcement action, it could have a material adverse impact on its business.
Messages were left Tuesday seeking comment from the Justice Department and from Tesla, which has disbanded its media relations department.
Tesla Inc. is already facing multiple investigations by the National Highway Traffic Safety Administration for problems with its two driver-assist systems, Autopilot and “Full Self-Driving.”
Despite their names, Tesla still says on its website that the cars can’t drive themselves. Teslas using “Full Self-Driving” can navigate roads in many cases, but experts say the system can make mistakes, which even CEO Elon Musk acknowledges. “We’re not saying it’s quite ready to have no one behind the wheel,” Musk said in October.
Michael Brooks, executive director of the nonprofit Center for Auto Safety, said the Justice Department could be looking at safety issues with the systems, or it could be investigating Tesla claims that the cars can drive themselves when they can't.
“When you get the car, it really can't do everything that's been promised,” Brooks said. “Tesla is putting a vehicle out on the road that is unable to perform to the capabilities claimed. Yet we have drivers relying on those promises and essentially not paying attention to the drive because they think it is more capable than it is.”
The systems have been under investigation by NHTSA since June 2016, when a driver using Autopilot was killed after his Tesla went under a tractor-trailer crossing its path in Florida. A separate probe into Teslas that were using Autopilot when they crashed into emergency vehicles started in August 2021. At least 14 Teslas have crashed into emergency vehicles while using the Autopilot system.
Including the Florida crash, NHTSA has sent investigators to 35 Tesla crashes in which automated systems are suspected of being used. Nineteen people have died in those crashes, including two motorcyclists.
The agency also is investigating complaints that Teslas can brake suddenly for no reason.
“Full Self-Driving” went on sale late in 2015, and Musk has used the name ever since. It currently costs $15,000 to activate the system.
In 2019 he promised a fleet of autonomous robotaxis by 2020, and he said in early 2022 that the cars would be autonomous that year.
Since 2021, Tesla has been beta-testing “Full Self-Driving” using owners who haven’t been trained on the system but are actively monitored by the company. Tesla said this month that 400,000 owners are participating.
Auto safety advocates and government investigators have long criticized Tesla’s monitoring system as inadequate. Three years ago the National Transportation Safety Board listed poor monitoring as a contributing factor in a 2018 fatal Tesla crash in California. The board recommended a better system, but said Tesla has not responded.
NHTSA has noted in documents that numerous Tesla crashes have occurred in which drivers had their hands on the wheel but still weren’t paying attention. The agency has said that Autopilot is being used in areas where its capabilities are limited and that many drivers aren’t taking action to avoid crashes despite warnings from the vehicle.
In addition, the National Transportation Safety Board determined in 2020 that Tesla’s system to make sure drivers are paying attention is not adequate, and it should be limited to areas where it can safely operate.
Tesla shares were up just under 4% in Tuesday morning trading.
Tom Krisher And Michelle Chapman, The Associated Press