
Tesla may face less accountability for crashes under Trump : NPR

by Curtis Jones

The aftermath of a fatal collision between a Tesla Model S and a motorcycle last year outside Seattle.

Washington State Patrol



In April of last year, a distraught driver called 911 to report a crash in Snohomish County, Wash., outside Seattle.

“I hit a person on a motorcycle in my car, on the freeway, on the way home,” said the driver, who identified himself to the dispatcher as Scott Hunter.

Hunter was behind the wheel of his Tesla Model S, following a motorcycle in stop-and-go traffic, when the car ran over the motorcycle and its rider, pinning both underneath.

“I’m the driver. I’m not sure how it happened, but I am freaking out. So please get here,” Hunter told the dispatcher as he tried to summon help.

The motorcyclist, 28-year-old Jeffrey Nissen, was pronounced dead at the scene. Hunter, 56, was arrested for vehicular homicide. He told police he had been “distracted” by his phone moments before the low-speed crash.

Police eventually extracted more data and video of the crash from the car, with Tesla’s help. NPR obtained some of those materials through a public records request to the Washington State Patrol.

The documents reveal that about two minutes before the crash, Hunter put his car into what Tesla calls Full Self-Driving (Supervised) mode — its most advanced driver-assistance system, for which drivers typically pay extra. The car did not detect Hunter’s hands on the steering wheel for more than a minute before the crash, according to a report by the detective investigating the incident.

That and other details from the police investigation have not been publicly reported before. But the broader narrative they suggest — a driver not paying attention to the road — is something Tesla’s critics say they’ve seen many times.

“This is yet another case where the lack of driver engagement is directly responsible for the crash,” said Missy Cummings, a professor of robotics and engineering at George Mason University, who reviewed documents from the crash investigation at NPR’s request.

Crash reporting requirements aided push for improved safety measures

Car companies including Tesla are required to report the details of crashes involving advanced driver-assistance systems, like the one Hunter was using. Regulators and safety advocates insist the reporting requirement is crucial to understanding how well the technology is working — and is necessary to hold carmakers accountable for deaths and injuries when the technology fails.

But safety advocates fear that this requirement will be scrapped by the incoming Trump administration, along with other federal investigations into Tesla and its advanced driving systems.

Tesla CEO Elon Musk, the world’s richest person, spent more than a quarter of a billion dollars to help President-elect Donald Trump return to the White House. Now Musk may be poised to influence federal policy at the agencies that regulate his businesses, including Tesla.

The carmaker is at odds with regulators over the crash reporting requirement, which Tesla executives argue is unfair because it makes the company’s safety record look worse than it is.

That reporting requirement helped federal regulators build the case for a recall of Tesla’s Autopilot system, the predecessor to its Full Self-Driving mode.

“I saw crash after crash after crash, predominantly Teslas,” said Cummings, who previously worked as an adviser at the National Highway Traffic Safety Administration (NHTSA). Part of her job was to study data from crashes that involved advanced driver-assistance systems.

“The drivers would see a car that’s doing pretty good,” Cummings said in an interview. “They get a false sense of security, and they don’t understand that they should be paying attention.”

Under the voluntary recall announced in late 2023, Tesla agreed to update its software, adding new safeguards to try to keep drivers more engaged.

Minutes before the fatal crash, police say Scott Hunter put his Tesla Model S into Tesla’s Full Self-Driving (Supervised) mode, its most advanced driver-assistance system.

Washington State Patrol



Since 2021, Tesla and other carmakers have been required to report all serious incidents involving autonomous vehicles or advanced driver-assistance technology to regulators at NHTSA under what’s known as the Standing General Order on Crash Reporting.

“That information has really given NHTSA an important window into what is happening on the road,” said Ann Carlson, the former acting administrator and chief counsel at NHTSA, who signed the standing general order.

“Without it, the agency simply doesn’t have the window into what is going on on the roadways with what is really novel and interesting, important technology — but technology that we really need to make sure is safe,” said Carlson, who is also a professor at the UCLA School of Law.

But the standing general order may not be standing much longer. 

Tesla advocates say the reporting requirements give a misleading impression of the cars’ safety

Current and former Tesla officials take a very different view of the standing general order and its crash reporting requirement. To them, it's unfair — even misleading — because it makes Tesla's overall safety record seem worse than it really is.

“The public is being confused by erroneous data that lacks context and lacks analysis,” said Rohan Patel, a former vice president of policy and business development at Tesla.

Tesla collects far more data on crashes than other carmakers and has many more cars on the road that are using advanced driver-assistance technology. That could help explain why Tesla has reported far more crashes than other carmakers under the standing general order: over 1,600 in total, more than 4 out of every 5 crashes that involved advanced driver-assistance systems.

“The truth is,” Patel said, “that the rate of those incidents is quite low.”

Now it appears the incoming administration may share the company's concerns. Reuters reported last month that the Trump transition team has recommended getting rid of the crash reporting requirement.

The transition team did not respond to questions from NPR about the crash reporting requirement or whether Musk would weigh in on policies that affect his own businesses.

Trump has tasked Musk and businessman Vivek Ramaswamy with leading what he calls the Department of Government Efficiency (DOGE), which is expected to recommend drastic cuts to the federal workforce and regulations. But it’s not clear how much influence, if any, Musk will wield over the federal agencies that regulate his companies. 

Tesla CEO Elon Musk at the U.S. Capitol, with businessman Vivek Ramaswamy (third from right, wearing blue tie) and House Speaker Mike Johnson, R-La., (fourth from right) last month.

Andrew Harnik/Getty Images



Musk has publicly defended Tesla’s safety record too.

“Inevitably when there are a lot of cars and you’ve got billions of miles, you do have the law of large numbers where, OK, there’s a small chance of something bad happening,” Musk said at a shareholder meeting last June.

Still, some Tesla shareholders seem nervous about the company’s approach. At that meeting last year, Musk was asked how he thinks about the “unfortunate mishaps” that have plagued other companies that are working on autonomous cars.

“Those are real consequences of developing this technology, and I’m just wondering where your mind is on that,” asked one shareholder who did not give his name.

Musk replied that Tesla is trying to be careful with the rollout of Full Self-Driving mode. “Human driving is not perfect,” he said, noting that roughly 40,000 people are killed every year on U.S. roadways. “What matters is, like, are we making that number smaller? And as long as we’re making that number smaller, we’re doing the right thing,” Musk said.

At the shareholder meeting last year, Musk touted the latest version of the company’s Full Self-Driving technology. With each release, he said, the number of miles the system can drive without human intervention has increased. 

“It’s headed towards unsupervised full self-driving very quickly, at an exponential pace,” Musk said. “When you look at the sort of safety per mile, because we’ve got a lot of miles, it’s very clear that the safety per mile is better than human driving.”

That’s a claim Musk has made many times, though his critics note that he has never released the evidence to back it up.

“I have never seen one ounce of data that would suggest that Teslas are safer than human drivers,” Cummings said. “They’re nowhere near ready to drive without a driver behind the wheel.”

An uncertain future for regulation

Musk has been promising the imminent arrival of autonomous cars since at least 2016. He has been telling customers and investors for years that the cars might become fully self-driving vehicles with a future software update. And last year, Tesla unveiled the prototype of its Cybercab, a taxi that Musk says will be fully autonomous.

But that’s not what Tesla is telling regulators, or its customers, about its current fleet.

Tesla says drivers must remain attentive at all times, even in Full Self-Driving mode. The system uses an internal camera to monitor the driver's head and eyes and confirm that the driver is paying attention. If the driver isn't, the car issues a series of escalating warnings.

The front of Scott Hunter’s Tesla sustained minor damage during the fatal low-speed collision in April 2024 outside Seattle, according to police.

Washington State Patrol



According to police in Washington state, that is what happened in the moments before the fatal crash outside Seattle in April. Data recovered from Scott Hunter’s Tesla shows the car tried to get the driver’s attention as it approached the motorcycle, which was moving slowly in traffic.

But then, moments before impact, the report says Hunter pressed his foot down on the accelerator, overriding the car’s automatic braking system — and kept it there for 10 seconds after the collision with motorcyclist Jeffrey Nissen, even though the car was no longer moving.

“I wish this had never happened,” Nissen’s sister, Jenessa Fagerlie, told local TV station KING 5 last year. “And I wish people would really stay off their phones more.”

Hunter has not been charged with a crime by prosecutors, though the investigation is continuing, and his lawyer declined to comment for this story.

Tesla did not respond to requests for comment about the crash. But safety advocates say the carmaker bears some responsibility here, too.

“Tesla has been overselling the effectiveness of its technology for years,” said Michael Brooks, the executive director of the nonprofit Center for Auto Safety. “And a lot of people buy into that. They’re kind of wrapped up in this belief that this is an autonomous vehicle, because it’s tweeted about that way.”

A week after the crash in Snohomish County, investigators at NHTSA published the results of a three-year investigation into Tesla’s Autopilot system. They found a “critical safety gap” between drivers’ expectations of the driver-assistance system and its true capabilities. Investigators identified at least 13 fatal crashes, as well as many more involving serious injuries, in which “foreseeable driver misuse of the system played an apparent role.”

Regulators at NHTSA also announced an investigation into the effectiveness of Tesla’s Autopilot recall and whether the company is doing enough to keep drivers engaged and off their phones. Regulators have since announced that they’re also looking into a series of crashes involving cars using Full Self-Driving in low-visibility conditions.

But safety advocates worry that those investigations may be in jeopardy too, along with the crash reporting requirement. Without it, they say, regulators — and the public — will know less about who is keeping their eyes on the road and who isn’t.
