Tesla rebuts Washington Post claims about Autopilot safety risks

Electric car maker defends driver-assist technology amid scrutiny over crashes

Technology designed to support drivers

Tesla has issued a forceful rebuttal to a recent Washington Post article suggesting the company's Autopilot system poses risks by allowing use beyond its intended design parameters. The Post identified about 40 fatal or serious crashes since 2016, with at least eight occurring on roads where Autopilot was not built to operate.

The article argued Tesla holds some responsibility since it permits the driver-assist technology's activation in potentially unsafe situations. "Even though the company has the technical ability to limit Autopilot's availability by geography, it has taken few definitive steps to restrict use of the software," it stated.

Tesla staunchly defends safety record

In its response, Tesla strongly disputed claims that it fails to prioritize safety, saying its data clearly shows that systems like Autopilot drastically cut accidents when used properly. The company reiterated that features like Traffic-Aware Cruise Control are SAE Level 2 driver-assist systems, requiring constant driver oversight.

"We are committed to making technology like this safer and better over time," said a Tesla spokesperson.

Ongoing debate over automation

The exchange highlights ongoing debate regarding the appropriate deployment of emerging vehicle automation. While Tesla maintains its systems boost safety under supervision, critics argue more restrictions may be needed to prevent misuse.

Following is the pertinent section of Tesla’s response.

While there are many articles that do not accurately convey the nature of our safety systems, the recent Washington Post article is particularly egregious in its misstatements and lack of relevant context. 

We at Tesla believe that we have a moral obligation to continue improving our already best-in-class safety systems. At the same time, we also believe it is morally indefensible not to make these systems available to a wider set of consumers, given the incontrovertible data that shows it is saving lives and preventing injury. 

Regulators around the globe have a duty to protect consumers, and the Tesla team looks forward to continuing our work with them towards our common goal of eliminating as many deaths and injuries as possible on our roadways. 

Below are some important facts, context and background.

Background

1. Safety metrics are emphatically stronger when Autopilot is engaged than when not engaged.

a. In the 4th quarter of 2022, we recorded one crash for every 4.85 million miles driven in which drivers were using Autopilot technology. For drivers who were not using Autopilot technology, we recorded one crash for every 1.40 million miles driven. By comparison, the most recent data available from NHTSA and FHWA (from 2021) shows that in the United States there was an automobile crash approximately every 652,000 miles.

b. The data is clear: The more automation technology offered to support the driver, the safer the driver and other road users. Anecdotes from the WaPo article come from plaintiff attorneys—cases involving significant driver misuse—and are not a substitute for rigorous analysis and billions of miles of data.

c. Recent data continues this trend and is even more compelling. Autopilot is ~10X safer than the US average and ~5X safer than a Tesla with no AP tech enabled. More detailed information will be publicly available in the near future.

2. Autopilot features, including Traffic-Aware Cruise Control and Autosteer, are SAE Level 2 driver-assist systems, meaning –

a. Whether the driver chooses to engage Autosteer or not, the driver is in control of the vehicle at all times. The driver is notified of this responsibility, consents, agrees to monitor the driving assistance, and can disengage anytime.

b. Despite the driver being responsible for control of the vehicle, Tesla has a number of additional safety measures designed to monitor that drivers remain actively engaged in supervision, including torque-based and camera-based monitoring. We have continued to make progress in improving these monitoring systems to reduce misuse.

c. Based on the above, among other factors, the data strongly indicates our customers are far safer by having the choice to decide when it is appropriate to engage Autopilot features. When used properly, it provides safety benefits on all road classes.
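
For readers who want to sanity-check the multiples Tesla cites, the Q4 2022 figures above convert to ratios directly, since each is expressed as miles driven per recorded crash. Below is a minimal sketch (values copied from the statement above; note that the ~10X and ~5X figures refer to newer data Tesla says it has not yet published, so only the Q4 2022 ratios can be reproduced here):

```python
# Sanity check of the safety multiples implied by the Q4 2022 figures
# Tesla cites above (each value is miles driven per recorded crash).
MILES_PER_CRASH_AUTOPILOT = 4_850_000  # Tesla vehicles, Autopilot engaged
MILES_PER_CRASH_NO_AP = 1_400_000      # Tesla vehicles, Autopilot not engaged
MILES_PER_CRASH_US_AVERAGE = 652_000   # NHTSA/FHWA 2021 US average

# Higher miles-per-crash is safer, so dividing gives the safety multiple.
print(f"Autopilot vs US average:   {MILES_PER_CRASH_AUTOPILOT / MILES_PER_CRASH_US_AVERAGE:.1f}x")  # ~7.4x
print(f"Autopilot vs non-AP Tesla: {MILES_PER_CRASH_AUTOPILOT / MILES_PER_CRASH_NO_AP:.1f}x")       # ~3.5x
```

On the Q4 2022 numbers alone, then, Autopilot works out to roughly 7.4 times safer than the US average and 3.5 times safer than a Tesla driven without it, consistent with Tesla's claim that the newer, unpublished data is "even more compelling."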

Tesla also provided context for some of the crashes highlighted by The Washington Post. According to the electric vehicle maker, the incidents the publication cited involved drivers who were not using Autopilot correctly. In framing its narrative around Autopilot’s alleged risks, Tesla argued, the publication therefore omitted several important facts.

Following is the pertinent section of Tesla’s response.

The Washington Post leverages instances of driver misuse of the Autopilot driver assist feature to suggest the system is the problem. The article got it wrong, misreporting what’s actually alleged in the pending lawsuit and omitting several important facts:

1. Contrary to the Post article, the Complaint doesn’t reference complacency or Operational Design Domain.

2. Instead, the Complaint acknowledges the harms of driver inattention, misuse, and negligence.

3. Mr. Angulo and the parents of Ms. Benavides, who tragically died in the crash, first sued the Tesla driver—and settled with him—before ever pursuing a claim against Tesla.

4. The Benavides lawsuit alleges the Tesla driver “carelessly and/or recklessly” “drove through the intersection…ignoring the controlling stop sign and traffic signal.”

5. The Tesla driver didn’t blame Tesla, didn’t sue Tesla, didn’t try to get Tesla to pay on his behalf. He took responsibility.

6. The Post had the driver’s statements to police and reports that he said he was “driving on cruise.” It omitted that he also admitted to police, “I expect to be the driver and be responsible for this.”

7. The driver later testified in the litigation that he knew Autopilot didn’t make the car self-driving and that he was the driver, contrary to the Post and Angulo claims that he was misled, over-reliant, or complacent. He readily and repeatedly admitted:

a. “I was highly aware that was still my responsibility to operate the vehicle safely.”

b. He agreed it was his “responsibility as the driver of the vehicle, even with Autopilot activated, to drive safely and be in control of the vehicle at all times.”

c. “I would say specifically I was aware that the car was my responsibility. I didn’t read all these statements and passages, but I’m aware the car was my responsibility.”

8. The Post also failed to disclose that Autopilot restricted the vehicle’s speed to 45 mph (the speed limit) based on the road type, but the driver was pressing the accelerator to maintain 60 mph when he ran the stop sign and caused the crash. The car displayed an alert to the driver that, because he was overriding Autopilot with the accelerator, “Cruise control will not brake.”
