Tesla Says Driver Is at Fault in Fatal Model X Crash

Kirsten Korosec 2018-04-15
Tesla says the only way the crash could have occurred is if the owner was not paying attention to the road, despite multiple warnings from the car.

Tesla says a Model X owner who died in a crash last month near San Francisco is at fault, not the semi-autonomous Autopilot system that was engaged when the SUV slammed into a highway divider.

The electric carmaker issued the statement blaming the driver, Walter Huang, after his family hired a law firm to explore its legal options. Huang, 38, died when his 2017 Model X drove into the unprotected edge of a concrete highway median that was missing its crash guard.

“According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location,” Tesla said in an emailed statement. “The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.”

This is the third public statement by Tesla since the March 23 crash—and its strongest language yet in an effort to distance itself from the fatality. Tesla previously said Huang’s hands were not on the steering wheel for six seconds prior to the collision.

Tesla’s complete statement:

We are very sorry for the family’s loss.

According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location. The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.

The fundamental premise of both moral and legal liability is a broken promise, and there was none here. Tesla is extremely clear that Autopilot requires the driver to be alert and have hands on the wheel. This reminder is made every single time Autopilot is engaged. If the system detects that hands are not on, it provides visual and auditory alerts. This happened several times on Mr. Huang’s drive that day.

We empathize with Mr. Huang’s family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road. NHTSA found that even the early version of Tesla Autopilot resulted in 40% fewer crashes and it has improved substantially since then. The reason that other families are not on TV is because their loved ones are still alive.

A Tesla spokesman would not provide additional information about the crash, including how many times the vehicle issued the alerts to Huang while driving or how many warnings are typically issued before the Autopilot system disengages. Autopilot issues visual and audio alerts if the driver’s hands leave the steering wheel for a period of time. The system is supposed to disengage after several warnings.
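To make that escalation pattern concrete, here is a minimal sketch of a hands-on-wheel warning ladder of this general shape. It is an illustration only; the class name, the thresholds, and the three-strike disengagement rule are assumptions for the sketch, not Tesla's actual Autopilot parameters.

```python
from dataclasses import dataclass

# Illustrative sketch only: thresholds, alert order, and strike count
# are assumptions, not Tesla's actual Autopilot parameters.
@dataclass
class HandsOffMonitor:
    visual_alert_after_s: float = 15.0   # assumed delay before a visual nag
    audio_alert_after_s: float = 30.0    # assumed delay before an audible chime
    max_audio_alerts: int = 3            # assumed strike count before disengaging
    hands_off_s: float = 0.0
    audio_alerts: int = 0

    def update(self, dt: float, hands_on_wheel: bool) -> str:
        """Advance the monitor by dt seconds and return the action to take."""
        if hands_on_wheel:
            # Torque on the wheel resets the warning ladder.
            self.hands_off_s = 0.0
            self.audio_alerts = 0
            return "ok"
        self.hands_off_s += dt
        if self.audio_alerts >= self.max_audio_alerts:
            return "disengage"            # hand control back to the driver
        if self.hands_off_s >= self.audio_alert_after_s:
            self.audio_alerts += 1
            self.hands_off_s = 0.0        # each chime restarts the countdown
            return "audio_alert"
        if self.hands_off_s >= self.visual_alert_after_s:
            return "visual_alert"
        return "ok"
```

Under this sketch, a long enough hands-free stretch produces a visual alert, then repeated chimes, and finally a disengagement, which is the escalation the article attributes to Autopilot.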

The law firm representing the Huang family said that a preliminary review has uncovered complaints by other Tesla drivers of navigational errors by the Autopilot feature. “The firm believes Tesla’s Autopilot feature is defective and likely caused Huang’s death, despite Tesla’s apparent attempt to blame the victim of this terrible tragedy,” according to the law firm, Minami Tamaki.

Tesla markets Autopilot as a semi-autonomous driving system, not a fully autonomous one that handles all aspects of driving in certain conditions without any expectation of driver involvement. Autopilot includes several features: an automatic steering function that helps drivers steer within a lane, adaptive cruise control that maintains the car's speed in relation to surrounding traffic, and an auto lane change feature that is supposed to move the vehicle into an adjacent lane automatically when the turn signal is activated, but only when it is safe to do so.
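For readers unfamiliar with the adaptive-cruise-control piece of that feature set, the basic idea (hold a set speed, but slow down to keep a safe time gap behind a lead vehicle) can be sketched as follows. The function name, the two-second headway, and the proportional slowdown term are invented for illustration and are not Tesla's control law.

```python
# Illustrative sketch only: the headway constant and slowdown term are
# assumptions for the example, not a production control law.
def acc_target_speed(set_speed: float,
                     lead_speed: float | None = None,
                     gap_m: float | None = None,
                     time_headway_s: float = 2.0) -> float:
    """Return a target speed in m/s: cruise at set_speed unless a lead
    vehicle forces the car to hold roughly time_headway_s of following gap."""
    if lead_speed is None or gap_m is None:
        return set_speed                  # open road: plain cruise control
    desired_gap = lead_speed * time_headway_s
    if gap_m >= desired_gap:
        return set_speed                  # comfortable gap: keep the set speed
    # Too close: match the lead vehicle and shed extra speed in
    # proportion to how far inside the desired gap the car is.
    gap_error = desired_gap - gap_m
    return max(0.0, min(set_speed, lead_speed - gap_error / time_headway_s))
```

For example, `acc_target_speed(set_speed=30.0, lead_speed=25.0, gap_m=30.0)` returns 15.0 m/s, because the car is 20 m inside the desired 50 m gap and must fall below the lead vehicle's speed until the gap reopens.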

Tesla’s system does give warnings to remind drivers to keep hands on the wheel when Autopilot is activated. However, not all drivers heed those warnings, and Tesla has been criticized for failing to protect against misuse of the system. A fatal May 2016 crash in Florida was caused partly by the driver overly relying on Autopilot, according to a National Transportation Safety Board investigation.

At the time, the NTSB said that “operational limits,” such as Tesla being unable to ensure that drivers are paying attention when a car travels at high speed, played a major role in the 2016 fatal crash. The agency recommended that Tesla and other automakers take steps to ensure that semi-autonomous systems are not misused. For instance, GM's semi-autonomous Super Cruise system, an option on the Cadillac CT6, has a camera system that ensures the driver is looking ahead, or it will disengage.
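As a rough contrast with Autopilot's torque-based hands-on-wheel checks, a camera-based attention gate of the kind the article attributes to Super Cruise could be sketched as follows. The gaze classification, timeouts, and escalation steps here are assumptions for illustration, not GM's actual implementation.

```python
# Illustrative sketch only: the gaze timeouts and escalation steps are
# assumptions, not GM's actual Super Cruise implementation.
def attention_gate(eyes_on_road: bool, eyes_off_s: float, dt: float,
                   warn_after_s: float = 4.0,
                   disengage_after_s: float = 8.0) -> tuple[float, str]:
    """Camera-based check: warn, then disengage, if the driver's gaze
    stays off the road for too long. Returns (updated timer, action)."""
    if eyes_on_road:
        return 0.0, "ok"                  # looking ahead resets the timer
    eyes_off_s += dt
    if eyes_off_s >= disengage_after_s:
        return eyes_off_s, "disengage"    # hand control back to the driver
    if eyes_off_s >= warn_after_s:
        return eyes_off_s, "warn"         # escalate before disengaging
    return eyes_off_s, "ok"
```

The design difference matters for the misuse problem the NTSB raised: a torque sensor can be satisfied by a hand resting on the wheel, while a gaze check ties supervision directly to where the driver is looking.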

The National Highway Traffic Safety Administration, which also investigated the 2016 crash, found no defects with Autopilot.
