Thank you very much for Winnie’s article. Driverless technology is still immature today, and because traffic is inextricably linked to human life, hard questions remain: when an accident involves a driverless car, who should bear responsibility? And by what moral standard should people judge the outcome?
At present there is no agreed moral standard to apply. Autonomous driving systems, depending on their programmed rules or machine-learning procedures, will make different choices when facing the same ethically fraught accident, and in some situations there is no precedent to follow and no clear moral standard at all.
Image 1: Driverless car. Source: Dezeen.
In handling traffic accidents, the reasonable goals are to ensure safety, mobility, and legality; in some cases these three goals conflict with one another, or a single goal conflicts with itself. A human driver’s subconscious reaction begins with self-preservation, but when forced to choose between the safety of the occupants of two vehicles, a system may need to pick the option that causes the least total harm. Generally speaking, driverless technology must demonstrate better safety before it can earn people’s trust.
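The "least harm" trade-off described above could be sketched as a simple cost-minimizing rule. Everything in the sketch below, including the maneuver names, harm scores, and the scenario itself, is a hypothetical illustration invented for this comment; it is not how any real autonomous driving system works, and it deliberately sidesteps the open ethical question of how such harm scores could ever be justly assigned.

```python
# Hypothetical illustration of a "least total harm" decision rule.
# The maneuvers and harm estimates below are invented for this sketch;
# real systems use far richer models and face unresolved ethical questions.

def choose_action(options):
    """Pick the option with the lowest estimated total harm.

    `options` maps an action name to estimated harm scores
    (arbitrary units) for each affected party.
    """
    return min(options, key=lambda action: sum(options[action].values()))

# Two hypothetical maneuvers in an unavoidable-collision scenario.
options = {
    "brake_straight": {"own_passengers": 3, "other_vehicle": 5},
    "swerve_left":    {"own_passengers": 6, "other_vehicle": 1},
}

print(choose_action(options))  # prints "swerve_left" (total harm 7 vs. 8)
```

Even this toy rule exposes the dilemma in the text: minimizing total harm can mean deliberately increasing the risk to the car’s own passengers, which is exactly the kind of choice with no agreed moral standard.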