Year
2024
Authors
WAARDENBURG Lauren, MÁRTON Attila
Abstract
AI systems are commonly believed to be able to aid in more objective decision-making and, eventually, to make objective decisions of their own. However, such a belief is riddled with fallacies rooted in an overly simplistic approach to organizational decision-making. Based on an ethnography of the Dutch police, we demonstrate that making decisions with AI requires practical explanations that go beyond an analysis of the computational methods used to generate predictions, to include an entire ecology of unbounded, open-ended interactions and interdependencies. In other words, explaining AI is ecological. Yet, this typically goes unnoticed. We argue that this is highly problematic, as it is through acknowledging this ecology that we can recognize that we are not, and never will be, making objective decisions with AI. If we continue to ignore the ecology of explaining AI, we end up reinforcing, and potentially even further stigmatizing, existing societal categories.
WAARDENBURG, L. and MÁRTON, A. (2024). Chapter 12: It takes a village: the ecology of explaining AI. In: Ioanna Constantiou, Mayur P. Joshi, Marta Stelmaszak (eds.), Research Handbook on Artificial Intelligence and Decision Making in Organizations. 1st ed. Edward Elgar Publishing Ltd, pp. 214–225.