Imagine sitting back and relaxing in a driverless car...
Getting from A to B with your feet up and your eyes off the road... but then there's a girl and an elderly woman on a pedestrian crossing ahead. Should the car keep going and hit them, almost certainly killing them?
Or should the car swerve, but crash into a concrete barrier, which will kill a boy and an elderly man? What's the lesser of two evils in this terrible situation?
This exact kind of moral dilemma is put to you in the Moral Machine, a morbid sort of game that in fact has some pretty serious implications for technology.
When it comes to robots, machines and especially driverless cars, there's a growing debate over the ethics and morals these not-quite-human beings should have.
Should complex machines behave like humans? If so, how do we behave? Scientists at MIT created the game to answer just this question, by crowdsourcing a picture of how humans respond to such events.
Should a car – and would a human – choose between saving an older person or a younger person? One person or 20 dogs? A doctor or a criminal?
The game poses these very questions – and the choices make for uncomfortable reading. Even more uncomfortable is the collation of all your answers into one handy chart, indicating at a glance just what kind of monster you really are...
It's like a modern-day (and much more morbid) snog, marry, avoid...
Give it a go yourself (if you dare).