
A Tesla owner from Ohio has posted a disturbing video clip. His vehicle didn’t notice a passing train until the very last moment, when it suddenly turned right and crashed into the crossing gate. The owner says this is the second time it has happened, each time at a different railroad crossing, while the Tesla was in Full Self-Driving (FSD) mode.
According to NDTV, Craig Doty II from Camden, Ohio, claimed that his vehicle didn’t slow down while approaching the passing train. He went looking for explanations on the Tesla Motors Club forum.
“I have owned my Tesla for less than a year, and within the last six months, it has twice attempted to drive directly into a passing train while in FSD mode. The most recent incident occurred on May 8th, 2024,” Doty posted, sharing his dashcam footage.
The footage shows the car turning suddenly at the last moment and hitting the railroad crossing gate. It’s unclear whether that was the driver’s desperate intervention or the Tesla’s automatic emergency braking. In a separate photo, the driver showed that the car suffered serious damage.
Doty suffered no significant injuries – only back pain and a deep bruise on his right elbow, neither of which required medical attention. He admits that he, as the driver, is ultimately responsible for the vehicle, which is also a reason why lawyers seem unwilling to take his case.
“Cameras are good enough for FSD - Elon Musk. Unless you don’t want to get hit by a train,” Artem Russakovskii (@ArtemR) posted on May 19, 2024, sharing Doty’s account: “I have owned my Tesla for less than a year, and within the last six months, it has twice attempted to drive directly into a passing train while in FSD mode. The most recent incident…” pic.twitter.com/XAQccItBYw
“However, my issue lies with the FSD system's failure to recognize the train. I was attentive and aware during the incident. I expected the system to brake for the train, as it should, based on previous experience where the FSD system is typically more cautious than a regular driver in many situations,” he explained to skeptical forum members.
According to the Tesla owner, after using the FSD feature for a while, drivers tend to trust it and assume the vehicle will slow down when approaching an obstacle ahead. That is, until it doesn’t.
“You’re suddenly forced to take control,” Doty noted.
He claims the Tesla Data Report confirmed that FSD was active at the time of the incident and that this was not the first time it had happened.
“Both incidents happened at different railroad crossings, where I was the first car at the tracks. I have a 55-mile commute one-way each morning, and 98% of the time, the FSD does exactly what it’s supposed to do. Since an update about a month ago, the car has been setting the speed on the rural state route (with a 55 mph speed limit) to around 61 to 63 mph, which is consistent with normal traffic,” he posted.
Tesla’s Owner’s Manual clearly states that FSD does not make the vehicle autonomous “and requires a fully attentive driver who is ready to take immediate action at all times.”
“Driver intervention may be required in certain situations, such as on narrow roads with oncoming cars, in construction zones, or while going through complex intersections,” the manual reads.
Tesla relies on camera inputs to build a model of the vehicle’s surroundings and on neural networks to make driving decisions.
After hundreds of crashes, the National Highway Traffic Safety Administration (NHTSA) opened an investigation into Tesla's Autopilot system. The investigation found that in many crashes, drivers had their hands on the wheel but took no evasive action, while Autopilot aborted control only moments before impact. In some cases, Teslas crashed into stationary vehicles parked on the side of the road.
NHTSA found that Tesla drivers involved in the crashes “were not sufficiently engaged in the driving task and that the warnings provided by Autopilot when Autosteer was engaged did not adequately ensure that drivers maintained their attention on the driving task.”
“Crashes with no or late evasive action attempted by the driver were found across all Tesla hardware versions and crash circumstances,” NHTSA’s document said.