It isn't so obvious in whifflier domains such as morality; hence the enduring popularity of talking about The Trolley Problem, wherein we are faced with a moral dilemma well out of the bounds of any experience1. Morality is custom (update: see also The Foundations of Morality) and so things well outside experience, and therefore outside custom, aren't subject to our moral intuitions.
This smacks once again of the softer sciences thoughtlessly aping the harder ones. If there genuinely are strict laws, then testing them with edge cases makes sense. If there aren't, trying to interpret out-of-bounds information within your (admittedly unclearly-)bounded framework will only confuse you.
Even less sensible is the attempt to think about TTP in the context of Implications for autonomous vehicles. No-one is going to write their software in a way that makes the question come up.
Refs
* (I Don't Want to Go to) Chelsea.
* Philosophy of Physics Seminar: Sabine Hossenfelder (Munich Center for Mathematical Philosophy): 'Superdeterminism – The Forgotten Solution'
* The FTC’s Confused Case Against Amazon.
* In talking about The Ethical Case for a Siege of Gaza, Richard Hanania says that Individual morality does not transfer to geopolitical issues. This is consistent with what I'm saying here.
Notes
1. This also doesn't begin to cover what people would do in practice if faced with such problems. And in practice they can't be so faced: the conditions are not real-world.