Although the introduction of self-driving automobiles comes with the advantage of reducing human error in operating a vehicle, it appears that these cars are not without flaws. On May 7th, the first known self-driving car fatality occurred when the driver of a Tesla Model S died in a collision in Williston, Florida while using the vehicle's "Autopilot" system. While driving on the highway, Joshua Brown put his Model S into Autopilot mode and subsequently collided with a large tractor-trailer.
While the crash and Tesla's system are being investigated by the government, preliminary reports indicate that the collision occurred when the tractor-trailer made a left turn in front of the Tesla, causing the car to strike the trailer. The National Highway Traffic Safety Administration, which investigated the crash, stated that the driver died from injuries sustained in the accident.
According to Tesla, neither the driver nor the vehicle's sensors detected the large white side of the trailer as it turned left against the brightly lit sky. Unable to distinguish the 18-wheeler as it passed in front of Brown's vehicle, the sensor system never applied the brakes prior to impact. Instead, the Tesla drove at full speed under the trailer, and the windshield struck the trailer's underside. The car then continued to travel after passing under the trailer, veering off the road and ultimately crashing into a power pole, according to local police. The company stated that the combination of the trailer's height, its positioning on the road, and the color and lighting conditions caused the accident.
Tesla has emphasized the rarity of such accidents, stating that this fatality is the first known death in over 130 million miles of Autopilot operation, whereas among all vehicles in the U.S. there is on average one fatality every 94 million miles. This accident, however, demonstrates that even in the rarest of circumstances, an Autopilot error can result in a serious accident or death. While Tesla stands behind its Autopilot system, the company has explained that the system is not perfect and still requires the driver to remain alert when operating the vehicle. The software is in fact designed to nudge drivers to keep their hands on the wheel, ensuring that they pay attention to road conditions rather than relying wholly on the Autopilot system.
Other automakers have stated that they intend to release vehicles with similar self-driving capabilities in the near future. In fact, within a year, General Motors plans to test self-driving taxis through the ride-hailing app Lyft. Given the increasing popularity and rapid development of autopilot technology, this accident sheds light on both the work that remains to perfect these systems and the need for drivers to stay alert rather than reliant on the systems themselves.