Some 2.4 million Tesla cars are under investigation over accidents related to their autonomous driving system

Tesla is undergoing a new investigation. Elon Musk’s electric car brand is under scrutiny due to four accidents – one of them fatal – related to its autonomous driving technology (FSD).

This isn’t the first time this technology has been the subject of debate and review, but recent incident reports are putting this intelligent system under even harsher scrutiny.

That is why the National Highway Traffic Safety Administration (NHTSA) has just launched an investigation into the entire range of models (Model 3, Model S, Model X and Model Y). In total it affects about 2.4 million units manufactured between 2016 and 2024.

All of the incidents in question occurred in conditions of reduced visibility due to weather and environmental factors. As NHTSA explained in a statement, “the accidents occurred because the FSD system did not react appropriately to these factors.”

A camera-based technology

The accidents highlight a key weakness in Tesla’s approach to autonomous driving: its reliance on a system based solely on cameras.

Unlike some competitors, which use lidar or radar along with cameras, Tesla has doubled down on visual technology. This decision has drawn criticism from industry experts, especially in cases where weather or environmental conditions impair visibility.

Jeff Schuster, vice president of GlobalData, noted that relying solely on cameras could be problematic. “Weather conditions can affect the camera’s ability to see things and I think the regulatory environment will certainly influence this,” he said.

This is not the first safety issue Tesla has faced. In December 2023, it issued a massive recall that affected more than two million vehicles to fix problems with its autopilot system. The company rolled out a software update aimed at improving safety features, but the recall didn’t stop the rising tide of concern over its autonomous technology.

Tesla's Autopilot has been criticized by industry experts. Reuters Photo

“At Tesla, we believe we have a moral obligation to continue improving our safety systems, which are already best-in-class,” the company said at the time. “At the same time, we also believe it is morally indefensible to not make these systems available to a broader set of consumers, given the irrefutable data showing they are saving lives and preventing injuries.”

NHTSA is also continuing to investigate whether the recall was sufficient or whether Tesla users still face risks when using FSD. The Justice Department has been investigating Tesla’s FSD and Autopilot systems since 2022.

Technology under the magnifying glass

NHTSA’s preliminary investigation will focus on evaluating whether the FSD system can accurately “detect and respond” to situations with limited visibility, whether caused by sun glare, fog or airborne dust.

The agency will also investigate whether other accidents have occurred under similar conditions and what factors contributed to them. Additionally, the investigation will evaluate whether Tesla has made any software updates that could affect the performance of the FSD system in these low visibility scenarios.

Despite the growing challenges, Elon Musk and Tesla remain committed to the dream of fully autonomous driving. Musk recently unveiled the Cybercab, a two-seater robotaxi concept designed without a steering wheel or pedals. In his vision, the Cybercab will be based solely on cameras and artificial intelligence to navigate the roads.

But it’s not just about technology: Tesla also faces regulatory hurdles. The Cybercab would need NHTSA approval before it could hit the streets. And given the ongoing FSD investigation, that approval could take a long time to arrive.
