Robert O’Toole, September 2024
I’m listening to the philosophical podcast series This is Technology Ethics by John Danaher and Sven Nyholm. Much of this concerns, but is not limited to, the advance of AI and its implications.
Episode 2 (20th September 2023) uses the case of self-driving cars (SDCs) to explore methods in ethics. As the authors say, this is a current hot-spot for technology ethics research. It also works well to illustrate the difference between narrow approaches to ethics, and a broader systems-focussed approach (that I’m interested in). Similar methods can be applied to other fields, and similar issues appear in those fields – such as education (although that is a much more open and emergent field).
Two obvious questions are introduced:
- How should self-driving cars behave in risky situations?
- Who is responsible when a self-driving car has an accident?
These questions are derived from the “trolley problem” tradition of ethics, where the focus is on how people or a system should behave in specific imagined situations (raising tricky dilemmas that are hard even for humans to resolve).
But we can add further significant questions from a systems perspective. This is a lesson already learned from the ethics of automation in the airline industry. Some I can think of are:
- If SDCs will only be safe when all vehicles are self-driving, should we ban human-driven cars (HDCs), removing unpredictability from the system?
- Should we then also ban cyclists and pedestrians?
- If we ban human drivers, what will the psychological and emotional impacts be?
- Should SDCs be allowed to mix with HDCs?
- Should we mandate that every SDC in use is occupied by a fully competent human driver who can take over as required? If so, do we also mandate that every human driver spends a sufficient number of hours driving manually, so that they retain the required competence (as with airline pilots)?
- If we are mandating fully competent drivers, do we introduce a method of state surveillance to enable this?
- Should SDCs only be allowed if their operation is always predictable and easily understood by a human driver in attendance, so that the human can safely supplement or take over control?
The “human competency degradation” issue is a good example of ethics from a systems perspective: considering how a technology may improve a specific situation at a specific time, while degrading the wider system (the sum total of driving competence) over a longer period. This can lead us to consider a major change to the system, such as banning all human drivers (since SDCs and semi-competent human drivers don’t mix well), which brings other negative consequences to weigh. The question “if we ban human drivers, what will the psychological and emotional impacts be?” could ultimately be the most significant, and real people in the real world recognise that. Many people enjoy driving (and even more motorcyclists enjoy riding). It’s a big part of their lives, and when it goes well, it provides significant joy. So is it right to take that away from them in the interests of efficiency?

There’s a similar question we can ask about the use of AI to make work practices more efficient, to cut out meetings and communications. Maybe for many people the reason they come to work, and the reason the enterprises they work for exist, is to give an excuse for that social activity? Take it away and the enterprise might become much less attractive. What would we be left with? Sitting around in perfectly manicured spaces, sipping yet another non-alcoholic gin and tonic, while our AI assistants send bland corporate emails to each other.
A better approach is to introduce smaller, constrained innovations that we can adapt to. For example, hybrid cars don’t require an entirely new fuelling system (and lots of other changes), but they do allow people to get used to automation (the computer has much more control over the operation of the engine and transmission). It doesn’t take long to learn how a hybrid car behaves, and how to intervene (I can hit the power button to override the computer). In driving my hybrid I’ve learned new skills, and let others fall into disuse (I still drive a manual 4×4 occasionally for off-roading, but eventually that will become obsolete too).