Many researchers and automakers insist that driving automation can help prevent car accidents because it removes human error from the equation. However, there is a strong possibility that self-driving cars still pose risks, including risks involving defective auto technology.
“The technology can confuse operators if it’s poorly designed or leads to complacency that breeds its own hazards,” according to a recent article in Bloomberg News about the crash of a Tesla Motors Inc. car (now under investigation by the NTSB).
The reality is that many cars being manufactured today boast safety features that make them semi-autonomous. See the IIHS report for a detailed look at these advances in crash-avoidance technology.
Still, are fully automated vehicles actually something we should be striving to see on the roads?
The fatal Tesla accident, which has resulted in the federal investigation by the NTSB as well as the National Highway Traffic Safety Administration (NHTSA), occurred on May 7 in Williston, Florida.
A 40-year-old man suffered fatal injuries when his 2015 Tesla Model S collided with a semi-truck on a highway, according to Bloomberg.
At the time of the crash, the Tesla Autopilot had been engaged.
As Bloomberg explains, the Autopilot “can guide the vehicle in certain conditions.” Yet, in this instance, the feature “didn’t notice the white side of the tractor trailer as it turned in front of the car against a brightly lit sky, so the brake wasn’t applied.” In short, the Autopilot made a deadly error.
Tesla indicated that “the system may have confused the truck with an overhead highway sign,” Bloomberg reports.
At the time of the fatal May 7 crash, more than 130 million miles of driving had been conducted with the Tesla Autopilot without any known fatalities. Since that deadly collision, however, more Tesla accidents have occurred, all while the vehicles were in Autopilot mode.
According to USA Today, a second collision happened in July “when a Michigan art dealer and his son-in-law were traveling in a Tesla Model X that hit a guardrail along the Pennsylvania Turnpike” and “crossed over several lanes before hitting a concrete median.” The vehicle rolled over as a result of the crash.
A third Tesla accident occurred a short time later when a vehicle traveling from Seattle to West Yellowstone, Montana, crashed. The driver indicated that the Autopilot had been engaged before the start of the trip.
With the recent Tesla crashes, federal officials have launched an investigation into the general safety of automated technology in automobiles. A major focus is on how drivers interact with this technology.
According to a report in The Truth About Cars, the NTSB is working to determine whether drivers are letting their guard down as vehicles increasingly rely on electronic aids for safety.
In other words, is one of the major problems with self-driving cars that drivers simply are not paying sufficient attention in the event of an automated technology error?
“Autopilot can be fooled, but it isn’t clear why [the Florida driver] himself didn’t try to avoid the large obstacle that appeared directly in front of him on a dry, sunny day,” the report notes.
As NTSB Chairman Christopher Hart emphasized following a crash involving automated technology in 2013, drivers “must understand and command automation, and not become over-reliant on it.”
Is there a distinction between the technology employed in self-driving cars and recent advances in crash-avoidance technology?
As the IIHS noted in a March 2016 report, crash-prevention technology greatly reduces the rate of automobile accidents.
“Systems with forward-collision warning and automatic braking reduce rear-end crashes by about 40 percent on average, while forward-collision warning alone cuts them by 23 percent,” the report states.
Studies suggest that automation, in some circumstances, is more helpful than harmful. Yet, when systems such as Tesla’s Autopilot make errors and cause fatal accidents, who is to blame?
While previous NTSB reports suggest that drivers need to avoid relying entirely on autopilot systems, is the automaker nevertheless liable for injuries in a crash?
When you buy an automobile and engage its safety technologies, you should be able to expect that those technologies will work as designed and will not make mistakes that result in serious and fatal accidents.
While we await the results of the NTSB investigation into the fatal Tesla crash, it is important to consider the relationship between self-driving cars and product liability claims.
If Tesla’s Autopilot was, in fact, responsible for a deadly collision, should the automaker be liable?
As the Cornell Legal Information Institute (LII) explains, products liability law is an area of personal injury law that allows injured plaintiffs to seek compensation from “any or all parties along the chain of manufacture of any product for damage caused by that product.”
In product liability cases, people who sustained injuries or lost loved ones because of an auto product defect may be able to file claims against the designer or manufacturer of the vehicle or its technology, as well as against the car dealer.
If you or someone you love got hurt in a car accident caused by an auto product, you may be able to file a claim for compensation. An experienced product liability attorney in Chicago can assist you. Contact Salvi, Schostok & Pritchard P.C. today to learn more about our services in a free consultation.