INSIGHT-Next Autopilot trial to test Tesla's blame-the-driver defense
Mar 11, 2024 3:22 AM

March 11 (Reuters) - Six weeks before the first fatal U.S. accident involving Tesla's Autopilot in 2016, the automaker's president Jon McNeill tried it out in a Model X and emailed feedback to automated-driving chief Sterling Anderson, cc'ing Elon Musk.

The system performed perfectly, McNeill wrote, with the smoothness of a human driver.

"I got so comfortable under Autopilot, that I ended up blowing by exits because I was immersed in emails or calls (I know, I know, not a recommended use)," he wrote in the email dated March 25 that year.

Now McNeill's email, which has not been previously reported, is being used in a new line of legal attack against Tesla over Autopilot.

Plaintiffs' lawyers in a California wrongful-death lawsuit cited the message in a deposition as they asked a Tesla witness whether the company knew drivers would not watch the road when using its driver-assistance system, according to previously unreported transcripts reviewed by Reuters.

The Autopilot system can steer, accelerate and brake by itself on the open road but can't fully replace a human driver, especially in city driving. Tesla materials explaining the system warn that it doesn't make the car autonomous and requires a "fully attentive driver" who can "take over at any moment".

The case, set for trial in San Jose the week of March 18, involves a fatal March 2018 crash and follows two previous California trials over Autopilot that Tesla won by arguing the drivers involved had not heeded its instructions to maintain attention while using the system.

This time, lawyers in the San Jose case have testimony from Tesla witnesses indicating that, before the accident, the automaker never studied how quickly and effectively drivers could take control if Autopilot accidentally steers towards an obstacle, the deposition transcripts show.

One witness testified that Tesla waited until 2021 to add a system monitoring drivers' attentiveness with cameras - about three years after first considering it. The technology is designed to track a driver's movements and alert them if they fail to focus on the road ahead.
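Tesla has not publicly detailed how that system works, and nothing in this story describes its internals. As a purely illustrative sketch of the general approach such camera-based monitors take (track the driver's gaze per frame, escalate alerts after sustained inattention), consider the following minimal Python example; every name and threshold in it is invented for illustration:

```python
# Illustrative sketch only: Tesla's production driver-monitoring logic is not
# public. All names and thresholds here (GazeSample, ATTENTION_TIMEOUT_S, etc.)
# are hypothetical assumptions, not Tesla's values.
from dataclasses import dataclass

ATTENTION_TIMEOUT_S = 2.0   # hypothetical: seconds of inattention before a warning
ESCALATION_TIMEOUT_S = 5.0  # hypothetical: sustained inattention triggers escalation

@dataclass
class GazeSample:
    timestamp: float      # seconds since start of drive
    eyes_on_road: bool    # output of an upstream camera-based gaze estimator

def monitor(samples):
    """Yield (timestamp, alert_level) whenever the driver looks away too long."""
    inattentive_since = None
    for s in samples:
        if s.eyes_on_road:
            inattentive_since = None  # attention restored, reset the timer
            continue
        if inattentive_since is None:
            inattentive_since = s.timestamp
        elapsed = s.timestamp - inattentive_since
        if elapsed >= ESCALATION_TIMEOUT_S:
            yield s.timestamp, "escalate"   # e.g. audible chime, then disengage
        elif elapsed >= ATTENTION_TIMEOUT_S:
            yield s.timestamp, "warn"       # e.g. visual warning on the display

# Example: a driver who looks away from roughly t=10s to t=16s.
feed = [GazeSample(t * 0.5, not (10 <= t * 0.5 <= 16)) for t in range(40)]
for ts, level in monitor(feed):
    print(f"t={ts:.1f}s alert={level}")
```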

The case involves a highway accident near San Francisco that killed Apple engineer Walter Huang. Tesla contends Huang misused the system because he was playing a video game just before the accident.

Lawyers for Huang's family are raising questions about whether Tesla understood that drivers - like McNeill, its own president - likely wouldn't or couldn't use the system as directed, and what steps the automaker took to protect them.

Experts in autonomous-vehicle law say the case could pose the stiffest test to date of Tesla's insistence that Autopilot is safe - if drivers do their part.

Matthew Wansley, a Cardozo law school associate professor with experience in the automated-vehicle industry, said Tesla's knowledge of likely driver behavior could prove legally pivotal.

"If it was reasonably foreseeable to Tesla that someone would misuse the system, Tesla had an obligation to design the system in a way that prevented foreseeable misuse," he said.

Richard Cupp, a Pepperdine law school professor, said Tesla might be able to undermine the plaintiffs' strategy by arguing that Huang misused Autopilot intentionally.

But if successful, the plaintiffs' attorneys could provide a blueprint for others suing over Autopilot. Tesla faces at least a dozen such suits now, eight of which involve fatalities, putting the automaker at risk of large monetary judgments.

Musk, Tesla and its attorneys did not answer detailed questions from Reuters for this story.

McNeill declined to comment. Anderson did not respond to requests. Both have left Tesla. McNeill is a board member at General Motors and its self-driving subsidiary, Cruise. Anderson co-founded Aurora, a self-driving technology company.

Reuters could not determine whether Anderson or Musk read McNeill's email.

NEARLY 1,000 CRASHES

The crash that killed Huang is among hundreds of U.S. accidents where Autopilot was a suspected factor in reports to auto safety regulators.

The U.S. National Highway Traffic Safety Administration (NHTSA) has examined at least 956 crashes in which Autopilot was initially reported to have been in use. The agency separately launched more than 40 investigations into accidents involving Tesla automated-driving systems that resulted in 23 deaths.

Amid the NHTSA scrutiny, Tesla recalled more than 2 million vehicles with Autopilot in December to add more driver alerts. The fix was implemented through a remote software update.

Huang's family alleges Autopilot steered his 2017 Model X into a highway barrier.

Tesla blames Huang, saying he failed to stay alert and take over driving. "There is no dispute that, had he been paying attention to the road he would have had the opportunity to avoid this crash," Tesla said in a court filing.

A Santa Clara Superior Court judge has not yet decided what evidence jurors will hear.

Tesla also faces a federal criminal probe, first reported by Reuters in 2022, into company claims that its cars can drive themselves. It disclosed in October it had received subpoenas related to driver-assistance systems.

Despite marketing features called Autopilot and Full Self-Driving, Tesla has yet to achieve Musk's oft-stated ambition of producing autonomous vehicles that require no human intervention.

Tesla says Autopilot can match speed to surrounding traffic and navigate within a highway lane. The step-up "enhanced" Autopilot, which costs $6,000, adds automated lane-changes, highway ramp navigation and self-parking features. The $12,000 Full Self-Driving option adds automated features for city streets, such as stop-light recognition.

'READY TO TAKE CONTROL'

In light of the McNeill email, the plaintiffs' lawyers in the Huang case are questioning Tesla's contention that drivers can make split-second transitions back to driving if Autopilot makes a mistake.

The email shows how drivers can become complacent while using the system and ignore the road, said Bryant Walker Smith, a University of South Carolina professor with expertise in autonomous-vehicle law. The former Tesla president's message, he said, "corroborates that Tesla recognizes that irresponsible driving behavior and inattentive driving is even more tempting in its vehicles".

Huang family attorney Andrew McDevitt read portions of the email out loud during a deposition, according to a transcript. Reuters was unable to obtain the full text of McNeill's note.

Plaintiffs' attorneys also cited public comments by Musk while probing what Tesla knew about driver behavior. After a 2016 fatal crash, Musk told a news conference that drivers struggle more with attentiveness after they have used the system extensively.

"Autopilot accidents are far more likely for expert users," he said. "It is not the neophytes."

A 2017 Tesla safety analysis, a company document that was introduced into evidence in a previous case, made clear that the system relies on quick driver reactions. Autopilot might make an "unexpected steering input" at high speed, potentially causing the car to make a dangerous move, according to the document, which was cited by plaintiffs in one of the trials Tesla won. Such an error requires that the driver "is ready to take over control and can quickly apply the brake".
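A quick back-of-the-envelope calculation shows why split-second takeover matters: at highway speed, a car covers tens of meters during even a short human reaction time. The figures in this sketch are illustrative assumptions, not values from the case or the Tesla document:

```python
# Back-of-the-envelope: distance traveled while a driver reacts and takes over.
# The speed and reaction-time figures below are illustrative assumptions,
# not values from the case.
MPH_TO_MPS = 0.44704  # exact conversion factor, miles per hour to meters per second

def takeover_distance_m(speed_mph: float, reaction_s: float) -> float:
    """Distance covered, in meters, before the driver even begins to act."""
    return speed_mph * MPH_TO_MPS * reaction_s

for reaction in (1.0, 1.5, 2.5):  # commonly cited human reaction times, in seconds
    d = takeover_distance_m(70, reaction)
    print(f"at 70 mph, a {reaction:.1f}s reaction covers about {d:.0f} m")
# at 70 mph: 1.0s -> ~31 m, 1.5s -> ~47 m, 2.5s -> ~78 m
```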

In depositions, a Tesla employee and an expert witness the company hired were unable to identify any research the automaker conducted before the 2018 accident into drivers' ability to take over when Autopilot fails.

"I'm not aware of any research specifically," said the employee, who was designated by Tesla as the person most qualified to testify about Autopilot.

The automaker redacted the employee's name from depositions, arguing that it was legally protected information.

McDevitt asked the Tesla expert witness, Christopher Monk, if he could name any specialists in human interaction with automated systems whom Tesla consulted while designing Autopilot.

"I cannot," said Monk, who studies driver distraction and previously worked for the NHTSA, the depositions show.

Monk did not respond to requests for comment. Reuters was unable to independently determine whether Tesla has since March 2018 researched how fast drivers can take back control, or if it has studied the effectiveness of the camera monitoring systems it activated in 2021.

LULLED INTO DISTRACTION

The National Transportation Safety Board (NTSB), which investigated five Autopilot-related crashes, has since 2017 repeatedly recommended that Tesla improve the driver-monitoring systems in its vehicles, without spelling out exactly how.

The agency, which conducts safety investigations and research but cannot order recalls, concluded in its report on the Huang accident: "Contributing to the crash was the Tesla vehicle's ineffective monitoring of driver engagement, which facilitated the driver's complacency and inattentiveness."

In his 2016 comments, Musk said drivers would ignore as many as 10 warnings an hour about keeping their hands on the wheel.

The Tesla employee testified that the company considered using cameras to monitor drivers' attentiveness before Huang's accident, but didn't introduce such a system until May 2021.

Musk, in public comments, has long resisted calls for more advanced driver-monitoring systems, reasoning that his cars would soon be fully autonomous and safer than human-piloted vehicles.

"The system is improving so much, so fast, that this is going to be a moot point very soon," he said in 2019 on a podcast with artificial-intelligence researcher Lex Fridman. "I'd be shocked if it's not by next year, at the latest ... that having a human intervene will decrease safety."

Tesla now concedes its cars need better safeguards. When it recalled vehicles with Autopilot in December, it explained that its driver-monitoring systems may not be sufficient and that the alerts it added during the recall would help drivers "adhere to their continuous driving responsibility".

The recall, however, didn't fully solve the problem, said Kelly Funkhouser, associate director of vehicle technology at Consumer Reports, one of the leading U.S. product-testing companies. Its road tests of two Tesla vehicles after the automaker's fix found the system failed in myriad ways to address the safety concerns that sparked the recall.

"Autopilot usually does a good job," Funkhouser said. "It rarely fails, but it does fail."
