Tesla FSD faces yet another probe after fatal low-visibility crash

Publish date: Sat, 19 Oct 2024, 06:06 AM

Tesla is facing yet another government investigation into the safety of its Full Self-Driving (FSD) software after a series of accidents in low-visibility conditions. 

In the opening resume [PDF] for its latest Tesla FSD investigation, the National Highway Traffic Safety Administration (NHTSA) said it is taking a look at Tesla's self-driving systems following four accidents, one involving an injury and another in which a pedestrian was fatally struck. All four accidents occurred in "an area of reduced roadway visibility conditions" with FSD, either Beta or Supervised, engaged. 

"In these crashes, the reduced roadway visibility arose from conditions such as sun glare, fog, or airborne dust," the NHTSA noted, adding that it was opening the inquiry to assess "the ability of FSD's engineering controls to detect and respond appropriately to reduced roadway visibility conditions." 

In addition, the NHTSA will examine whether any further accidents, beyond the four mentioned, are linked to FSD's poor performance in low-visibility conditions. The agency will also review whether Tesla implemented any changes to FSD that could have worsened its ability to handle such scenarios.

The investigation covers all 2016-2024 Model S and Model X, 2017-2024 Model 3, 2020-2024 Model Y, and 2023-2024 Cybertruck vehicles outfitted with FSD, around 2.4 million vehicles in total. 

Maybe removing all those sensors wasn't a great idea after all?

Tesla CEO Elon Musk has long opposed the use of ultrasonic sensors, radar, and lidar in his quest to achieve unassisted full self-driving, preferring AI and cameras to do the job instead. 

The EV maker has since committed entirely to computer vision, eliminating additional sensors from its latest model-year vehicles. That means, aside from being able to see in the dark like most modern cameras, Tesla vehicles are ostensibly only as good at seeing through fog, sun glare, or excess airborne dust as a human driver, with the added caveat that it's an AI making the decisions, not a human. 

Plenty of other self-driving companies and their engineers have disagreed with Tesla's vision-only approach, and this may be the first time FSD is formally evaluated to determine whether it can actually drive better than a human in low-visibility environments. 

We asked the NHTSA whether concern over the lack of sensors is behind this Tesla investigation, but the agency declined to comment, citing the ongoing nature of the case. 

Automotive safety experts The Register spoke with for this story noted that Tesla's reliance on computer vision, especially via low-resolution RGB cameras, could be a contributing factor in the accidents.

The type of cameras Tesla uses are vulnerable to blinding at dusk and dawn, in fog and rain, and when driving into the sun because of their low dynamic range, we're told. Relying solely on cameras could also be an issue because they are passive receivers of light, meaning they can't measure distance directly unless set up in stereo pairs, which Tesla's cameras reportedly are not.
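
To make the stereo point concrete, here is a minimal sketch of the classic pinhole-stereo relation, depth = f × B / d: a pair of cameras separated by a known baseline B can convert the pixel disparity d of a matched point into metric depth, while a single camera has no baseline and must instead infer distance from learned visual cues. The focal length, baseline, and disparity figures below are invented for illustration; Tesla's actual camera geometry is not public.

```python
# Illustrative sketch: depth from a calibrated stereo pair via the
# pinhole relation depth = f * B / d. All numbers are made up for
# the example; they do not describe Tesla's cameras.

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return depth in metres for a point matched across a stereo pair.

    focal_px     -- focal length expressed in pixels
    baseline_m   -- separation between the two cameras, in metres
    disparity_px -- horizontal pixel shift of the point between images
    """
    if disparity_px <= 0:
        # Zero disparity means the point is at infinity (or was never
        # matched); a single camera is effectively always in this
        # situation, since its baseline is zero.
        raise ValueError("non-positive disparity: depth is unobservable")
    return focal_px * baseline_m / disparity_px


# A hypothetical 1200 px focal length and 30 cm baseline: an 18 px
# disparity puts the object about 20 metres away.
print(stereo_depth(focal_px=1200.0, baseline_m=0.3, disparity_px=18.0))  # ~20.0
```

The same relation also shows why matching noise hurts most at long range: depth scales with 1/d, so a one-pixel disparity error on a distant object shifts the estimate far more than it would up close.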

Tesla has, of course, had to patch FSD several times over the years to address other safety issues, including twice last year alone. Tesla patched FSD in January 2023 to address NHTSA reports that it was behaving unsafely around intersections, and again in December because the attention controls were "insufficient to prevent misuse" of the system by drivers paying less attention than they should. 

In both cases, the NHTSA's investigation covered fewer vehicles than this latest one does, suggesting an even bigger fix could be needed if the probe doesn't play out in Tesla's favor.

We've reached out to Tesla for comment, but didn't hear back. ®

 

https://www.theregister.com//2024/10/18/tesla_fsd_lowvisibility_accident/
