Plamenatz Philosophy Society considers the ‘trolley problem’ at first meeting

“The store of wisdom does not consist of hard coins which keep their shape as they pass from hand to hand; it consists of ideas and doctrines whose meanings change with the minds that entertain them.” John Plamenatz – Old Clayesmorian.

Clayesmore’s Chaplain, Mr Andrew West, writes about the first meeting of the newly-formed Plamenatz Philosophical Society, named for the eminent Old Clayesmorian political philosopher and Oxford Fellow John Plamenatz.

“A group of sixth form students launched the newly-formed Plamenatz Philosophical Society on Tuesday 29 January. At the meeting the society discussed the well-known ‘trolley problem’, a thought experiment in ethics first introduced by the philosopher Philippa Foot in 1967.

The problem has recently been revived and revised by the Massachusetts Institute of Technology (MIT) Media Lab in order to “initiate a conversation about ethics in technology”.

In the first part of the meeting the group discussed the original problem and its many variations, and the students were encouraged to identify the principles on which they were basing their actions. Were they being consistent utilitarians? How relevant is the Kantian maxim of not treating people as a ‘means to an end’? Could we use the concept of ‘rights’ to help us solve the problem? Or was there a way of looking at the problem that focused less on ‘outcomes’ and more on the ‘characters’ of those involved – a virtue ethics approach?

In the second half of the session the students discussed the ‘Moral Machine’, a platform “for getting a human perspective on moral decisions made by machine intelligence, such as self-driving cars.” The ‘Moral Machine’ invites you to make judgements in various scenarios – all variations of the trolley problem – about who should live and who should die. Should it be the passengers in the car? The old people? The pedestrians crossing the road? Should the car ever swerve, ie initiate an intervention, or should it always go straight ahead regardless of the consequences?

The students quickly, and pleasingly, decided that the occupation, gender and status of the people involved were irrelevant, though there was quite a bit of discussion around whether children took priority over old people. The researchers at MIT have concluded from all those who have contributed to the ‘Moral Machine’ across the world that people in more individualistic cultures are more likely to prioritise the young. This was certainly borne out in our discussion. The students also felt that priority ought to be given to pedestrians, particularly when the pedestrians are ‘obeying the rules’ and crossing on a green light. This raised an interesting question for the group: what if you are the owner of, and a passenger in, a driverless car? Wouldn’t you expect the car to prioritise your safety? Would manufacturers build this into the design of their driverless cars?

The discussion was rigorously philosophical but good fun, and the group also acknowledged the seriousness of the issues at stake. While the original trolley problem might seem to be merely of academic interest, the cross-cultural preferences collected by the ‘Moral Machine’ platform will be used by researchers at MIT to ‘contribute to developing global, socially acceptable principles for machine ethics.’ But we worry that the deep philosophical thinking that is needed is coming too late to the party. Decisions will already have been made by the machine engineers and high-tech investors: the people with the technical expertise and the money but without the ability to think philosophically about the impact these emerging technologies will have on humanity.”