An Autonomous Car Might Decide You Should Die

By refusing to release a safer technology to the public until it is “perfect,” these parties are telling us they don’t want to be responsible for the decision to kill one person, because letting five die is an outcome society has already learned to accept without judgment.

The intersection between Artificial Intelligence, ethics, and the possible applications of these technologies is as interesting as it is scary. The dilemmas that arise from plausible AI scenarios bring a new perspective to the whole of human existence. They reshape everything as we know it.

Trouble is, for self-driving cars to solve these problems, they need a grasp of that moral “currency” in order to work out the decision logic as situations present themselves. In short: they need a level of awareness we don’t have about ourselves.
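To make the idea concrete, the “currency-based” decision logic described above can be sketched as a simple expected-harm comparison. Everything here is a hypothetical placeholder: the action names, probabilities, and harm weights are illustrative assumptions, not a proposal for real values or any vendor’s actual policy.

```python
def expected_harm(outcome):
    """Sum probability-weighted harm over the people affected."""
    return sum(p * harm for p, harm in outcome)

def choose_action(actions):
    """Pick the action whose total expected harm is lowest."""
    return min(actions, key=lambda name: expected_harm(actions[name]))

# Each hypothetical action maps to (probability, harm) pairs,
# one pair per person affected. Numbers are purely illustrative.
actions = {
    "swerve":   [(0.9, 1.0)],      # likely harms one bystander
    "continue": [(0.8, 1.0)] * 5,  # likely harms five pedestrians
}

print(choose_action(actions))  # → swerve
```

The unsettling part is not the arithmetic, which is trivial, but that someone must choose and defend the weights before the machine ever faces the situation.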

It's interesting, though, how we can "accept" our own imperfections, yet we couldn't tolerate a "currency-based" decision from a machine, even if the overall outcome were far better than ours. 99.99% is not enough; it has to be 100%.