11-24-2019, 03:51 PM
ColdNoMore

Quote:
Originally Posted by rustyp
Ethics/morality question. Since the car knows exactly how long it takes to stop at any given speed, what happens when a child runs out to fetch a ball? The situation is as follows: the car calculates its stopping distance and finds it cannot stop in time to avoid the child. The only other option is to swerve, but the car detects a group of people at the bus stop and calculates that it cannot stop within that distance either. Which programmer gets to write that decision into the software, i.e. take one life or several, when the one life is a child's and the people at the bus stop are adults?
There will always be no-win situations...or situations with only bad and arguably worse choices.

In fact, ethics classes have a plethora of these types of scenarios.

More important, though, is the big picture: how many accidents will be avoided by automation/sensors that can process all of the factors and react in a fraction of the time...that even the best drivers need?
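
To make the dilemma concrete, here is a minimal Python sketch of the arithmetic rustyp describes: estimate stopping distance from speed and reaction time, then fall back on some programmer-chosen rule when neither braking in lane nor swerving avoids harm. The friction coefficient, reaction times, distances, and the "fewest people at risk" fallback are illustrative assumptions only, not how any real self-driving stack actually decides.

Code:
# Minimal sketch (assumptions only) of the forced-choice scenario above:
# estimate stopping distance, then apply some programmer-chosen rule when
# neither braking in lane nor swerving avoids harm.

def stopping_distance_m(speed_mps, reaction_time_s, friction_coeff=0.7, g=9.81):
    """Reaction distance plus braking distance v^2 / (2 * mu * g)."""
    return speed_mps * reaction_time_s + speed_mps ** 2 / (2 * friction_coeff * g)

def choose_action(speed_mps, dist_to_child_m, dist_to_bus_stop_m, reaction_time_s=0.1):
    """Pick an action; the final fallback rule is the ethics question."""
    stop_d = stopping_distance_m(speed_mps, reaction_time_s)
    if stop_d <= dist_to_child_m:
        return "brake in lane"      # can stop short of the child
    if stop_d <= dist_to_bus_stop_m:
        return "swerve and brake"   # can stop short of the bus stop
    # Neither option avoids harm: a "fewest people at risk" rule is one
    # possible answer, written here only to show where the choice lives.
    return "brake in lane"

if __name__ == "__main__":
    speed = 15.6  # ~35 mph in m/s
    # Automated reaction (~0.1 s) vs. an alert human driver (~1.5 s)
    for label, rt in (("automated", 0.1), ("human", 1.5)):
        print(label, round(stopping_distance_m(speed, rt), 1), "m to stop")
    print("decision:", choose_action(speed, dist_to_child_m=15.0, dist_to_bus_stop_m=18.0))

With these assumed numbers, the same arithmetic also shows the reaction-time point above: cutting reaction time from ~1.5 s to ~0.1 s shortens the stopping distance by roughly 22 metres at 35 mph, which is the difference between hitting the child and never facing the dilemma at all.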