Who is Responsible When a “Self-Driving” Car Crashes?
Unfortunately, Joshua Brown made history on May 7, 2016. He was the first person killed in a self-driving vehicle when his car crashed into a tractor-trailer that was legally crossing the road at the intersection ahead. Brown had engaged Autopilot, the vehicle's autonomous driving system, and set the cruise control at 74 mph.
The vehicle was a Tesla Model S. Although all systems appeared to be working, due to the time of day and location of the sun, the camera failed to recognize that a tractor-trailer was driving across the roadway. The car did not stop, nor did it give a warning signal to alert the driver to danger.
Although drivers of this car, even while on Autopilot, were still expected to keep their eyes on the road, it was determined that Brown was not paying attention. Had he been watching, he would have had time to brake after seeing the large truck, and the collision likely would have been avoided. There was even some evidence that Brown had been watching a movie on his cell phone when the accident occurred.
The National Highway Traffic Safety Administration (NHTSA) conducted an investigation and determined Tesla was not at fault, since the “Autopilot” system was intended to prevent the car from rear-ending other cars, not to handle situations where vehicles crossed roads in front of the car at intersections. The fault in this case was attributed to Brown.
In March 2018, near Phoenix, an Uber self-driving vehicle slammed into a pedestrian who was pushing her bike across the road. The vehicle was in “autonomous mode” at the time of the wreck, with a human test driver behind the wheel. The pedestrian was killed instantly. She was not in a crosswalk, was wearing dark clothes on a dark night, and came suddenly out of the shadows onto the roadway.
The police chief who viewed a video of the accident believed it was unavoidable. An investigation, however, suggested that a human driver would have been able to stop, or at least swerve, to avoid the collision. A mechanical engineer writing for Forbes magazine viewed the same video as the police chief and opined that the “automated system should have outperformed a human,” noting that the pedestrian should never have been struck.
Only 10 days after the crash, Uber negotiated a financial settlement with the woman’s husband and daughter. There have been several claims involving injuries and deaths in self-driving vehicle accidents, and to date, all of them have been settled out of court. Pre-litigation and pre-trial settlements allow claimants to avoid the risks of going to trial against the manufacturers of self-driving vehicles.
Who is Responsible: The Car or the Driver?
The ABA Journal notes that “the law, as it stands now, is simple. Human beings cannot delegate driving responsibility to their cars. In self-driving cars, a human must be ready to override the system and take control.” But it is time to update the law. The NHTSA reported in September 2016 that different legal standards need to be developed that are “based on whether the human operator or the automated system is primarily responsible for monitoring the driving environment.”
The NHTSA recommends that in highly automated vehicles (HAVs), the system should be considered the driver and, therefore, the entity responsible for accidents. If the person in the vehicle has no way of operating it, that person should not be liable when an accident occurs.
It is expected that negligence cases that were once fairly straightforward as to fault will become more complex. Possible defendants will include:
- The company responsible for the software used in the car.
- The manufacturers of the car’s components.
- The manufacturer of the car.
- The fleet owner, such as Uber, if the car is part of a fleet.
Detailed technical investigations will need to be conducted to discover exactly what part of the system failed and caused the auto accident and the resulting personal injury. This will complicate negligence and product liability cases.
The ABA Journal continues, stating that because “a negligence standard might make it too expensive for crash victims to obtain justice and a strict liability standard might discourage companies from putting HAVs on the road,” it may be time to develop “less traditional methods” for handling damages claims by those who are injured.
An insurance executive points out that car insurance currently focuses on drivers and their driving records. As self-driving vehicles become the norm, the insurance industry will begin requiring more information about the automated systems themselves.
If you were injured in an accident involving a self-driving vehicle, or any other type of car accident, contact Georgia Trial Attorneys for a free consultation.