
Tesla will recall 362,000 vehicles over concerns about the safety of its Autopilot system in the US

A new setback for Elon Musk. Tesla, the electric vehicle maker he founded in 2003, will recall 362,000 vehicles because its autonomous driving software can cause accidents, the National Highway Traffic Safety Administration (NHTSA), a federal agency under the Department of Transportation, announced Thursday. The news caused an immediate drop in the company’s share price on the Nasdaq, down 3.35% at mid-session.

The traffic safety agency reported that the autonomous driving software allows a vehicle to “exceed speed limits or travel [through intersections] in an illegal or unpredictable manner, increasing the risk of accidents.” Tesla will release a software update, free of charge for customers, although no accidents resulting in injury or death have been attributed to this cause.

The recall affects the Model S, Model X, Model 3, and Model Y, all manufactured between 2016 and 2023 and equipped with the driver-assistance software known as FSD Beta. “The [autonomous driving] feature could potentially violate local traffic rules while performing certain driving maneuvers,” NHTSA said. Possible examples the agency cites include crossing intersections, proceeding through yellow lights, and changing lanes.

The federal agency’s announcement came hours after the White House, which promotes the manufacture and use of electric vehicles, announced that Musk’s company will open a portion of its network of charging stations (at least 7,500 charging spots in total) to its competitors by the end of 2024, a commitment President Joe Biden wrested from Musk. The Democratic administration aims to have at least 500,000 publicly accessible electric vehicle chargers on American highways by 2030, open to drivers regardless of the make of their vehicle.

When it comes to the safety of Tesla vehicles, manned or unmanned, accident reports keep coming. A runaway Tesla killed two people in Guangdong province, China, last November, although there is no record that the driver, who tested negative for alcohol and drugs, had activated the Autopilot system. That same month, it was revealed that a Tesla that crashed in San Francisco had the Autopilot system activated. The accident, which involved eight vehicles, injured several people, including a child.

Tesla cars feature two different automated driving systems: Autopilot, which is intended for use on roads and expressways, and FSD Beta, which is intended for urban environments and is said to be able to recognize traffic lights, stop signs, and turns in the city. The beta is the software implicated in the recall. In both the Guangdong and San Francisco cases, the origin of the accident was so-called phantom braking, in which the Autopilot or FSD system forcibly applies the brakes for no apparent reason, sometimes with dire consequences. NHTSA reminds drivers that, despite these automated driving programs, operating the vehicle always requires human supervision.

Alongside a lawsuit against Musk over his tweets about the company’s future, Tesla has posted one of Wall Street’s worst stock market performances in recent months. The South African-born tycoon’s company suffered its worst stock market streak since 2018 last year, with its shares down almost 70% for the year by the end of December. In addition to his high-profile involvement with the social network Twitter, doubts about a slowdown in demand weighed on the pioneering electric car company, even as the government promoted electric cars with subsidies for individual buyers, pushing Tesla to offer deep discounts on its cars.
