Self-Driving Cars: What Does This Mean for the Legal Field?
Car accidents happen on a daily basis for a variety of reasons, but most commonly because of human error. With the rise of technology, however, has come the self-driving car. This product, although still in the testing phase, is touted as a better means of driving: these vehicles are alleged to eliminate the human error associated with driving and replace it with a system free of distraction and negligence. Regardless of these intentions, the self-driving car is sure to bring forth a new set of legal issues, and a niche in the law in which many attorneys may find themselves practicing. The question then becomes: how will the law develop in this regard? Will new laws be enacted to govern accidents involving self-driving cars? Will self-driving car accidents be litigated as negligence suits or products liability suits? Only time will tell.

What we have seen so far, however, is that the self-driving cars being developed are far from perfect. For example, a tragic accident involving an Uber self-driving vehicle occurred on March 18, 2018 in Arizona. News reports state that on that date, a woman was walking across the street with her bicycle when an Uber self-driving vehicle, traveling autonomously but with a human "safety driver" inside, tragically struck the pedestrian. More recent reports indicate the accident is believed to have occurred because the software in Uber's vehicle that would have prevented it from striking the woman "was tuned in such a way that it 'decided' it didn't need to take evasive action, and possibly flagged the detection as a 'false positive'. The reason a system would do this, according to the report, is because there are a number of situations where the computers that power an autonomous car might see something it thinks is a human or some other obstacle. Uber reportedly set that threshold so low, though, that the system saw a person crossing the road with a bicycle and determined that immediate evasive action wasn't necessary," Sean O'Kane wrote in a May 7th article.

From these reports, then, it appears that the accident was caused by a defect in Uber's software; the reports also state, however, that the "safety driver" in the vehicle was looking down at the time of the incident and thus failed to prevent the accident as well. The question then arises: who is responsible for such an accident? Uber? The safety driver? The manufacturer of the vehicle? As self-driving vehicles become more prevalent, it is likely that all of these issues will be thoroughly discussed and argued, and that this area of the law will continue to emerge.