Heart-stopping moment Tesla owner nearly plows into a moving TRAIN in 'self-drive' mode (and he says it wasn't the first time!)

A Tesla owner is blaming his vehicle's Full Self-Driving feature for veering toward an oncoming train before he could intervene.

Craig Doty II, from Ohio, was driving down a road at night earlier this month when his Tesla quickly approached a passing train with no sign of slowing down, dashcam footage shows.

He claimed his vehicle was in Full Self-Driving (FSD) mode at the time and did not slow down despite the train crossing the road, but he did not specify the car's make or model.

In the video, the driver appears to have been forced to intervene by veering right through the railway crossing sign and coming to a stop mere feet from the moving train.

Tesla has faced numerous lawsuits from owners who claim the FSD or Autopilot feature caused them to crash because it failed to stop for another vehicle or swerved into an object, in some cases killing the drivers involved.

The Tesla driver claimed his vehicle was in Full Self-Driving (FSD) mode but didn't slow down as it approached a train crossing the road. In the video, the driver was allegedly forced to swerve off the road to avoid colliding with the train. Pictured: The Tesla vehicle after the near-collision

As of April 2024, Autopilot systems in Tesla's Model Y, X, S and 3 vehicles had been involved in a total of 17 fatalities and 736 crashes since 2019, according to the National Highway Traffic Safety Administration (NHTSA).

Doty reported the issue on the Tesla Motors Club website, saying he has owned the car for less than a year but 'within the last six months, it has twice attempted to drive directly into a passing train while in FSD mode.'

He has reportedly tried to report the incident and find similar cases, but has not found a lawyer willing to accept his case because he did not suffer significant injuries, only backaches and a bruise, Doty said.

DailyMail.com has reached out to Tesla and Doty for comment.

Tesla cautions drivers against using the FSD system in low light or poor weather conditions like rain, snow, direct sunlight and fog, which can 'significantly degrade performance.'

This is because such conditions interfere with the operation of the Tesla's sensors, which include ultrasonic sensors that bounce high-frequency sound waves off nearby objects.

The car also uses radar systems, which emit radio waves to determine whether another vehicle is nearby, as well as 360-degree cameras.

These systems collectively gather data about the surrounding area, including road conditions, traffic and nearby objects, but in low visibility they cannot accurately detect the conditions around them.
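To illustrate the idea, here is a minimal toy sketch of how degraded sensor confidence might be combined; the sensor names, penalty values and braking threshold are hypothetical assumptions for illustration, not Tesla's actual software.

```python
# Toy illustration only -- not Tesla's actual software. Assumes a hypothetical
# model where each sensor reports a detection confidence between 0 and 1,
# and poor visibility scales every reading down, mirroring the degradation
# described above.

VISIBILITY_PENALTY = {"clear": 1.0, "rain": 0.6, "fog": 0.4, "night": 0.5}

def fused_confidence(readings, conditions):
    """Average per-sensor confidences, then apply visibility penalties."""
    penalty = 1.0
    for condition in conditions:
        penalty *= VISIBILITY_PENALTY.get(condition, 1.0)
    return penalty * sum(readings.values()) / len(readings)

# At night in fog, even strong raw detections can fall below a braking
# threshold, which is how low visibility can mask an obstacle such as a train.
readings = {"camera": 0.9, "radar": 0.8, "ultrasonic": 0.7}
if fused_confidence(readings, ["night", "fog"]) < 0.5:
    print("Detection confidence too low - driver must be ready to intervene")
```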

Craig Doty II reported that he had owned his Tesla vehicle for less than a year but the Full Self-Driving feature had already caused two near-crashes. Pictured: The Tesla approaching the train without slowing down or stopping

Tesla is facing numerous lawsuits from owners who claim the FSD or Autopilot feature caused them to crash. Pictured: Craig Doty II's car swerving off the road to avoid the train

When asked why he had continued to use the FSD system after the first near collision, Doty said he trusted it would perform correctly because he hadn't had any other issues for a while.

'After using the FSD system for a while, you tend to trust it to perform correctly, much like you would with adaptive cruise control,' he said. 

'You assume the vehicle will slow down when approaching a slower car in front until it doesn't, and you're suddenly forced to take control. 

'This complacency can build up over time due to the system usually performing as expected, making incidents like this particularly concerning.'

Tesla's manual does warn drivers against relying solely on the FSD feature, saying they need to keep their hands on the steering wheel at all times, 'be mindful of road conditions and surrounding traffic, pay attention to pedestrians and cyclists, and always be prepared to take immediate action.'

Tesla's Autopilot system has been accused of causing a fiery crash that killed a Colorado man as he was driving home from the golf course, according to a lawsuit filed May 3.

Hans Von Ohain's family claimed he was using the 2021 Tesla Model 3's Autopilot system on May 16, 2022, when it sharply veered to the right off the road, but Erik Rossiter, a passenger, said the driver was heavily intoxicated at the time of the crash.

Von Ohain tried to regain control of the vehicle but could not, and died when the car collided with a tree and burst into flames.

An autopsy report later revealed he had three times the legal alcohol limit in his system when he died.

A Florida man was killed instantly in 2019 when his Tesla Model 3's Autopilot failed to brake as a semi-truck turned onto the road, causing the car to slide under the trailer.

In October of last year, Tesla won its first lawsuit over allegations that the Autopilot feature led to the death of a Los Angeles man when the Model 3 veered off the highway and into a palm tree before bursting into flames.

The NHTSA investigated the crashes associated with the Autopilot feature and said a 'weak driver engagement system' contributed to the car wrecks. 

The Autopilot feature 'led to foreseeable misuse and avoidable crashes,' the NHTSA report said, adding that the system did not 'sufficiently ensure driver attention and appropriate use.'

Tesla issued an over-the-air software update in December for two million vehicles in the US that was supposed to improve the vehicles' Autopilot and FSD systems, but the NHTSA has since suggested the update likely wasn't enough in light of further crashes.

Elon Musk has not commented on the NHTSA's report, but previously touted the effectiveness of Tesla's self-driving systems, claiming in a 2021 post on X that people who used the Autopilot feature were 10 times less likely to be in an accident than the average vehicle.

'People are dying due to misplaced confidence in Tesla Autopilot capabilities. Even simple steps could improve safety,' Philip Koopman, an automotive safety researcher and computer engineering professor at Carnegie Mellon University, told CNBC.

'Tesla could automatically restrict Autopilot use to intended roads based on map data already in the vehicle,' he continued.

'Tesla could improve monitoring so drivers can't routinely become absorbed in their cellphones while Autopilot is in use.'
