Artificial intelligence and discrimination
The whitepaper concludes that learning systems have the potential to adopt or even exacerbate forms of discrimination that already exist in society. An example: An AI system is trained to pre-sort job applications using data on employees successfully hired in the past. If the majority of them are male, the algorithm learns to rate applications from men more highly than those from women.
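To illustrate this mechanism, here is a minimal, hypothetical sketch in Python: a classifier is trained on synthetic "historical" hiring data in which male applicants were hired more often at the same skill level, and it then scores two equally skilled applicants differently by gender. The data, feature names, and model choice are illustrative assumptions, not taken from the whitepaper.

```python
# Hypothetical sketch: a classifier trained on biased historical hiring
# data reproduces that bias. Data and model are purely illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Synthetic historical data: gender (1 = male, 0 = female) and a skill score.
gender = rng.integers(0, 2, size=n)
skill = rng.normal(0.0, 1.0, size=n)

# Past hiring decisions depended on skill *and* on gender (historical bias):
# male applicants were hired more often at the same skill level.
hired = (skill + 1.5 * gender + rng.normal(0.0, 0.5, size=n)) > 1.0

X = np.column_stack([gender, skill])
model = LogisticRegression().fit(X, hired)

# The trained model now scores two applicants with identical skill
# differently, simply because gender was part of the training signal.
same_skill = 0.5
print("P(hire | male):  ", model.predict_proba([[1, same_skill]])[0, 1])
print("P(hire | female):", model.predict_proba([[0, same_skill]])[0, 1])
```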
Independent supervisory authority to verify AI decisions
Due to the complexity of AI systems that develop autonomously, the authors recommend an independent authority to monitor the decisions of learning systems. Similar to a data protection officer, this authority should assist citizens who may have been discriminated against in asserting their rights. According to the whitepaper, manufacturers and operators should also be obligated to monitor their systems during operation and to improve them if they make discriminatory decisions.
Another recommendation is to pre-select the criteria on which the algorithm's learning is based. To this end, society has to agree on the characteristics that are regarded as discriminatory (e.g., ethnic origin) so that they can be eliminated from the input of machine learning techniques.
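As a rough sketch of this recommendation, the agreed-upon protected characteristics could simply be removed from the feature set before training. The column names and data structure below are illustrative assumptions; note that dropping a protected attribute does not remove correlated proxy features, which is one reason the whitepaper also calls for monitoring during operation.

```python
# Hypothetical sketch: eliminating agreed-upon protected characteristics
# (e.g., ethnic origin, gender) from the model input before training.
import pandas as pd

# Illustrative applicant data; column names are assumptions for this example.
applications = pd.DataFrame({
    "years_experience": [3, 7, 2, 10],
    "degree_level":     [2, 3, 1, 3],
    "gender":           ["f", "m", "f", "m"],
    "ethnic_origin":    ["A", "B", "A", "C"],
})

# Characteristics society has agreed to treat as discriminatory.
PROTECTED_ATTRIBUTES = ["gender", "ethnic_origin"]

# Remove them from the training input for the learning system.
features = applications.drop(columns=PROTECTED_ATTRIBUTES)
print(features.columns.tolist())  # ['years_experience', 'degree_level']
```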
Background of the whitepaper
The whitepaper “Künstliche Intelligenz und Diskriminierung” (Artificial Intelligence and Discrimination) was written by Armin Grunwald, head of ITAS, together with Susanne Beck (Leibniz University Hannover), Kai Jacob (SAP), and Tobias Matzner (Paderborn University). The authors are members of the working group “IT Security, Privacy, Legal and Ethical Framework” of the platform Lernende Systeme, which was founded in 2017 by the Federal Ministry of Education and Research (BMBF).
Risks of discrimination by algorithms are also the subject of an ITAS project carried out on behalf of the Federal Anti-Discrimination Agency. The final report is scheduled for publication in late summer 2019. (19.07.2019)
Further links
- Download the whitepaper Künstliche Intelligenz und Diskriminierung – Herausforderungen und Lösungsansätze (PDF)