The Fine Print

LeFebvre Law's blog exploring legal issues related to the small business community, entrepreneurs, contractors, the construction industry, along with occasional pieces about futurism, the commercial space industry and space law, and other emergent technologies and novel legal fields. "Always read The Fine Print!"

Tesla, Driverless Cars, and Liability

Image courtesy of Tesla Motors - www.teslamotors.com

Tesla Motors, Inc. has taken bold steps to introduce driverless features to its futuristic fleet of electric vehicles. As a pioneer, Elon Musk's ambitious electric car company has opened itself up to some unique risks, the most noteworthy of which is liability for an injury or death that occurs while the "autopilot" feature is engaged. This heretofore hypothetical scenario became a reality on May 7, 2016, when a Tesla Model S with autopilot engaged was involved in a fatal accident on a Florida highway.

Tesla’s autopilot is a semi-autonomous system that uses an array of sensors and cameras to let the car accelerate, brake, and steer under certain conditions, provided the driver’s hands are on the steering wheel. The National Highway Traffic Safety Administration (NHTSA) classifies the vehicle as Level 2 Autonomy, meaning it can handle at least two functions at the same time, such as maintaining a consistent speed on cruise control and making lane changes.

No technology being flawless, it was inevitable that an accident would occur, and sadly inevitable that one would eventually prove fatal. That is what happened on May 7, when a Tesla owner driving on a highway in Williston, Florida died after his electric vehicle drove under the trailer of an 18-wheel truck while the autopilot was engaged. Tesla states that the autopilot's sensors failed to detect the white truck against a bright May sky as it turned in front of the Model S, resulting in the fatal accident.

In the wake of such a tragedy involving a novel technology, the question turns to liability. Tesla has made some efforts to disclaim its own liability for such mishaps and has implemented a number of features in the autopilot intended to keep drivers engaged. But it is still early days, and whether these efforts will be sufficient remains to be seen.

There is a serious possibility that the family of the man killed in the accident has grounds to bring a products liability claim against Tesla. A court might well find that the risks associated with the autopilot were not adequately communicated to the consumer, and that as a result the consumer was not aware of potential defects in the system and the risks they might pose, which in this case include death.

Additionally, the branding may prove problematic. The term "autopilot" has a meaning in the popular vernacular: it is generally associated with a machine that essentially operates itself, typically an airplane. An argument could be made that branding the feature this way is deceptive, lulling consumers into believing the technology is more capable than it actually is.

Tesla would counter that the autopilot comes with many safety features. Among them are sensors that trigger visible and audible warnings if the driver does not keep at least one hand on the wheel at all times, and the system will even slow the vehicle until a hand is placed back on the wheel. The manufacturer stated in a blog post that "Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled." This beta concept is novel to the automotive world. In the tech sector, where Elon Musk hails from, releasing a product at an early stage of development is an important way to weed out bugs and improve the product before it is widely released in final form. But in the world of automotive products liability, a judge might perceive this as a rush to put a product on the market before it has been properly vetted and tested, with deadly consequences.

This technological and legal landscape continues to evolve. Regulators are expected to release driverless car guidelines sometime this summer, in an attempt to balance the potential safety benefits of driverless vehicles against the limitations of the nascent and imperfect technology. It is worth noting that over 90% of road fatalities are the result of human error, and autonomous vehicles are very likely to reduce that number significantly as the technology matures and becomes more widely distributed. These benefits are extremely encouraging, but as the technology matures we are likely to see other mishaps, and the auto industry and the tech industry will no doubt be paying very close attention to how regulators, lawmakers, and judges react to these incidents.

 

Tim LeFebvre, Esq.