
Who's to blame when a self-driving car has an accident?


By Park So-hyun
Posted 2020/12/02 [10:37]

▲ Self-driving cars are programmed to identify and avoid risk, but in the case of an accident, who is legally responsible?

 

With self-driving cars gaining traction in today’s automobile landscape, the issue of legal liability in the case of an accident has become more relevant.

 

Research in human-vehicle interaction has shown time and again that even systems designed to automate driving — like adaptive cruise control, which maintains the vehicle at a certain speed and distance from the car ahead — are far from being error-proof.
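To make the reported fragility concrete, below is a minimal sketch of the kind of gap-keeping logic adaptive cruise control relies on. It is purely illustrative: the parameters (a desired time gap, proportional gains, acceleration limits) are invented for this example, and real systems add sensor fusion, comfort constraints and fault handling on top.

```python
# Illustrative sketch of adaptive cruise control gap-keeping, not any real vehicle system.
# Assumptions: a fixed desired time gap and simple proportional control with invented gains.

def acc_acceleration(ego_speed, lead_speed, gap,
                     set_speed=30.0, time_gap=1.8,
                     k_gap=0.3, k_speed=0.8,
                     max_accel=1.5, max_brake=-3.0):
    """Return a commanded acceleration in m/s^2.

    ego_speed and lead_speed are in m/s; gap is the distance to the lead car in metres.
    """
    desired_gap = ego_speed * time_gap      # keep a time-based following distance
    gap_error = gap - desired_gap           # positive: more room than needed
    speed_error = lead_speed - ego_speed    # positive: the lead car is pulling away

    # Track the lead car, but never accelerate past the driver's set speed.
    accel = k_gap * gap_error + k_speed * speed_error
    if ego_speed >= set_speed and accel > 0:
        accel = 0.0

    return max(max_brake, min(max_accel, accel))


# Example: cruising at 25 m/s behind a car doing 20 m/s only 40 m ahead -> the sketch brakes.
print(acc_acceleration(ego_speed=25.0, lead_speed=20.0, gap=40.0))
```

Even a toy controller like this shows where mental models go wrong: the logic covers one narrow, well-defined situation and says nothing about stopped vehicles, cut-ins or sensor failures, exactly the kinds of cases in which a driver's understanding of the system can diverge from what it actually does.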

 

Recent evidence points to drivers’ limited understanding of what these systems can and cannot do (also known as mental models) as a contributing factor to system misuse.

 

A webinar on the dangers of advanced driver-assistance systems.

 

There are many issues troubling the world of self-driving cars, including the less-than-perfect technology and lukewarm public acceptance of autonomous systems. There is also the question of legal liability. In particular, what are the legal responsibilities of the human driver and of the car maker that built the self-driving car?

 

Trust and accountability

 

In a recent study published in Humanities and Social Sciences Communications, the authors tackle the issue of over-trusting drivers and the resulting system misuse from a legal viewpoint. They look at what the manufacturers of self-driving cars should legally do to ensure that drivers understand how to use the vehicles appropriately.

 

One solution suggested in the study involves requiring buyers to sign end-user licence agreements (EULAs), similar to the terms and conditions that require agreement when using new computer or software products. To obtain consent, manufacturers might employ the omnipresent touchscreen, which comes installed in most new vehicles.
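As a rough illustration of what touchscreen consent capture could look like, the sketch below gates driver-assistance features behind a recorded, versioned agreement. Everything in it (the agreement version, the storage file, the function names) is hypothetical; the study proposes the general idea, not an implementation.

```python
# Hypothetical sketch of in-vehicle EULA consent gating.
# The version string, file name and prompts are invented for illustration and
# are not taken from any real vehicle platform or from the study itself.

import json
import time
from pathlib import Path

EULA_VERSION = "2020-12"             # hypothetical agreement revision
CONSENT_FILE = Path("consent.json")  # hypothetical local consent record

def has_current_consent(driver_id: str) -> bool:
    """Has this driver accepted the current version of the agreement?"""
    if not CONSENT_FILE.exists():
        return False
    records = json.loads(CONSENT_FILE.read_text())
    return records.get(driver_id, {}).get("version") == EULA_VERSION

def record_consent(driver_id: str) -> None:
    """Store the acceptance with a timestamp so it can be audited later."""
    records = json.loads(CONSENT_FILE.read_text()) if CONSENT_FILE.exists() else {}
    records[driver_id] = {"version": EULA_VERSION, "accepted_at": time.time()}
    CONSENT_FILE.write_text(json.dumps(records, indent=2))

def engage_assistance(driver_id: str, vehicle_moving: bool) -> bool:
    """Enable assistance only with recorded consent; never prompt while moving."""
    if has_current_consent(driver_id):
        return True
    if vehicle_moving:
        # Deferring the prompt avoids turning the consent screen into a distraction.
        return False
    # A real interface would use the touchscreen; a console prompt stands in here.
    answer = input(f"Accept the driver-assistance terms (version {EULA_VERSION})? [y/N] ")
    if answer.strip().lower() == "y":
        record_consent(driver_id)
        return True
    return False
```

The deferral while the vehicle is moving mirrors the distraction concern raised later in the article, and the versioned record is what would let a manufacturer show that consent was obtained; neither, of course, shows that the driver read or understood the terms.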

 

The issue is that this approach is far from ideal, or even safe. The touchscreen interface may not give the driver enough information, leading to confusion about what they are being asked to agree to and what that agreement implies.

 

The problem is that most end users don’t read EULAs: a 2017 Deloitte study found that 91 per cent of people agree to them without reading. The figure is even higher among young people, with 97 per cent agreeing without reviewing the terms.

 

Unlike using a smartphone app, operating a car carries intrinsic and sizeable safety risks, whether the driver is human or software. Human drivers would need to consent to taking responsibility for the outcomes of the software and hardware.

 

“Warning fatigue” and distracted driving are also causes for concern. A driver annoyed by continuous warnings could decide simply to ignore the message. And if the message is presented while the vehicle is in motion, it could itself become a distraction.

 

Given these limitations and concerns, even if this mode of obtaining consent is to move forward, it likely won’t fully shield automakers from their legal liability should the system malfunction or an accident occur.

 

Driver training for self-driving vehicles can help ensure that drivers fully understand system capabilities and limitations. That training needs to continue beyond the point of purchase: recent evidence shows that relying on the information provided by the dealership will still leave many questions unanswered.

 

All of this considered, the road forward for self-driving cars is not going to be a smooth ride after all.


Copyright ⓒ 코인캣미디어. All rights reserved.