
Investigators Fault Driver in Tesla Autopilot Crash

The latest automotive technology—driverless vehicles—promises a world where accidents caused by human error are a thing of the past. Several companies, most notably Tesla, have made great strides toward bringing this future into reality, but, as of 2017, we are not quite there yet. Most vehicles equipped with self-driving technology sold today are what are known as semi-autonomous vehicles: a human driver is still the primary operator, but the vehicle can assist with a variety of tasks, including automatic braking, self-parking, and lane detection. While these technologies are a promising start toward completely autonomous vehicles, they still have their limitations, which were tragically illustrated last year when an inattentive driver’s overreliance on his Tesla Model S sedan’s semi-autonomous driving system caused a deadly crash.

Joshua Brown, 40, was traveling on a divided highway near Gainesville, Florida using the Tesla’s automated driving system, known as Autopilot, when a truck driver made a left-hand turn in front of him. The vehicle did not recognize the oncoming truck, resulting in a fatal collision. Tesla stated that it warned Model S drivers that the automated systems should be used only on limited-access highways, where vehicles do not suddenly turn into the car’s path. Despite this warning, however, the company did not incorporate protections against using Autopilot on other types of roads.

The Model S is rated Level 2 on a self-driving scale of 0 to 5; Level 5 vehicles can operate autonomously in nearly all circumstances. Level 2 automation systems are generally limited to use on interstate highways, which don’t have intersections, and drivers are supposed to continuously monitor vehicle performance and be ready to take control if necessary. In its investigation of the Brown accident, the National Transportation Safety Board (NTSB) found that the car’s cameras and radar weren’t capable of detecting a vehicle turning into its path; rather, the systems are designed to detect vehicles they are following in order to prevent rear-end collisions. Investigators also found that Brown had his hands on the steering wheel for only 25 seconds out of the 37.5 minutes the vehicle’s cruise control and lane-keeping systems were in use prior to the crash. As a result, Brown’s attention wandered and he did not see the truck turning into his path.


At the conclusion of its investigation, the board reissued its earlier recommendation that the government require all new cars and trucks to be equipped with technology that wirelessly transmits the vehicle’s location, speed, heading, and other information to other vehicles in order to prevent these types of collisions.

Last December, the Obama administration proposed that new vehicles be able to wirelessly communicate with each other, with traffic lights, and with other roadway infrastructure. Automakers were generally supportive of the proposal, but it has not been acted on by the Trump administration. At this time, the Department of Transportation provides a 12-point set of safety guidelines to makers of autonomous and semi-autonomous vehicles, but the department makes clear that the guidelines are voluntary and are not regulations.

Contact an Atlanta Personal Injury Attorney Today for a Free Case Evaluation

If you or a loved one has been injured or killed in a crash involving a self-driving or semi-autonomous vehicle, you may be entitled to recover compensation. Contact the attorneys at Slappey & Sadd for a free consultation to discuss your case by calling 404.255.6677. We serve the entire state of Georgia, including the following locations: Atlanta, Roswell, and Sandy Springs.
