A Tesla driver in Seattle has admitted to police that he was using Tesla’s controversial Autopilot feature before fatally hitting a motorcyclist riding in front of him.
The 56-year-old man was driving a 2022 Tesla Model S through the streets of a Seattle suburb on Friday afternoon, April 19th, when the crash took place.
The driver was arrested at the scene by a Washington State Patrol trooper on suspicion of vehicular manslaughter due to “inattention to driving, while in Autopilot mode and the distraction of the cell phone while moving forward.”
The driver told the state trooper that he was using the Autopilot driving system and looked at his cellphone while the Tesla was moving, according to the Associated Press.
“The next thing he knew there was a bang and the vehicle lurched forward as it accelerated and collided with the motorcycle in front of him,” the trooper wrote in the police report.
The 28-year-old motorcyclist died after being trapped under the Tesla, according to the driver, who was released on $100,000 bail on Sunday.
On Wednesday, a Washington State Patrol captain said the department had not verified the driver’s statements and that the investigation was still in the “very early stages.”
Autopilot comes standard on all new Tesla vehicles, although owners can pay to upgrade to an enhanced version. Tesla founder and CEO Elon Musk has often stated his ambition to eventually manufacture autonomous vehicles that require no human intervention.
It’s not clear which Autopilot version the driver had installed in the Model S, but the Tesla website states that “the currently enabled Autopilot, Enhanced Autopilot and Full Self-Driving features require active driver supervision and do not make the vehicle autonomous.”
That active supervision includes the driver being “fully attentive,” with hands on the steering wheel at all times, ready to “take over at any moment,” Tesla states.
“Bottom line – if you are behind the wheel you are responsible,” said Chris Loftis, a spokesperson for the Washington State Patrol.
“It is imperative that the traveling public remember the driver operator is always responsible for the safe and legal operation of their vehicle in any locale,” Loftis said.
Autopilot fatal crash cases
Tesla has been forced to rethink how it advertises its standard Autopilot, Enhanced Autopilot, and Full Self-Driving features after the US National Highway Traffic Safety Administration (NHTSA) forced the carmaker to recall nearly two million Teslas in December 2023 over safety concerns with the advanced system.
After a two-year investigation, federal regulators said they had found that drivers “are not always paying attention when that system is on.”
For the recall, Tesla deployed a software update designed to “incorporate additional controls and alerts” to encourage driver engagement when the Autopilot feature was activated.
The investigation was triggered after the death of a Los Angeles driver who crashed while using a beta version of Tesla’s Autopilot system.
Although a jury found last November, in a much-publicized trial, that Tesla was not liable for the man’s death, the recall was still considered one of the largest in Tesla’s history.
This February, another 350,000 Teslas were recalled over faulty self-driving software.
The NHTSA said the defective software allowed cars with the feature to "exceed speed limits or travel through intersections in an unlawful or unpredictable manner," making them susceptible to crashing.
Since 2021, regulators have opened cases into at least 956 crashes in which Autopilot was initially reported to have been in use, resulting in at least 23 deaths.
Earlier this week, Tesla began rolling out new features and software updates for Full Self-Driving (Supervised) on the Model S, 3, X, and Y.