"You confine testing of these vehicles to the neighbourhoods where the company directors and programmers send their children to school."

Sad and, unfortunately, inevitable. It is notable that pedestrian accidents are generally way up lately, and not because of AVs.
I would still like to see a TV show called the "Autonomous Vehicle Challenge," in which the producers stage undisclosed challenges for some AVs and we see what happens. An obvious one is somebody cluelessly crossing the street while texting. I'm not saying we know that's the scenario in Tempe, but it wouldn't surprise me. I could think of 100 others. I suspect everybody can think of things they have experienced while driving that the AV software engineers haven't thought about or planned for. I also envision them working in a room where a big "scenario to-do" list is posted, with about 1000 entries on it, and growing....
"In a variant of that, self-driving cars are programmed to kill the occupants if the programming considers it the least bad thing to do."

There is a runaway trolley barreling down the railway tracks. Ahead, on the tracks, there are five people tied up and unable to move. The trolley is headed straight for them. You are standing some distance off in the train yard, next to a lever. If you pull this lever, the trolley will switch to a different set of tracks. However, you notice that there is one person tied up on the side track. You have two options:
1. Do nothing, and the trolley kills the five people on the main track.
2. Pull the lever, diverting the trolley onto the side track, where it will kill one person.
Which is the most ethical choice?
I concur. As for the speeding ticket (fines, suspension of license, prison terms, etc.), those should be given to the programmers!
If the company (Cruise) thinks the safety driver did everything right, they need to step up and take responsibility for the ticket.

"The company claims the human test driver did everything right but is now responsible for the citation. San Francisco police did not immediately respond to a KPIX inquiry about the incident."
"Given that the safety driver is explicitly there to ensure the software does not cause accidents or violate traffic laws, the safety driver is the one who is liable. Admittedly, the safety driver concept is pretty asinine, given that humans are terrible at monitoring something in which nothing happens except on rare occasions."

I concur. As for the speeding ticket (fines, suspension of license, prison terms, etc.), those should be given to the programmers!
Most jurisdictions have a rule that if you get too many tickets too close together, they suspend (or even revoke) your license. How does that work with self-driving cars? Do they count each car separately, or aggregate them and ground the entire fleet?
A self-driving car was just ticketed for failing to yield to a pedestrian at a crosswalk, and the cop seems to have issued the ticket to the safety driver, even though the software, not the safety driver, was the one driving at the time.
"A self-driving car was slapped with a ticket after police said it got too close to a pedestrian on a San Francisco street."

If the company (Cruise) thinks the safety driver did everything right, they need to step up and take responsibility for the ticket.

"The company claims the human test driver did everything right but is now responsible for the citation. San Francisco police did not immediately respond to a KPIX inquiry about the incident."