Project on systemic and existential risks of AI
Besides great hopes, especially economic ones, the rapidly advancing development of artificial intelligence also seems to substantiate fears that until recently belonged more to the realm of science fiction. For example, could applications based on increasingly powerful AI models further facilitate the spread of disinformation and fundamentally disrupt democratic processes? Or could AI systems escape human control entirely and thus become an existential threat?
Systematically investigating these questions and developing viable assessment and action approaches is the aim of a project now launched at ITAS on “Systemic and existential risks of AI.” The two-part project is funded by the Federal Ministry of Education and Research (BMBF).
In the first subproject, the researchers will focus on systemic risks. “Such risks are characterized by complex interactions or tipping points that lead to malfunctions or the collapse of systems, such as in the financial or climate crisis,” explains project manager Carsten Orwat. The project aims to examine the growing evidence of such risks in AI applications. In addition, causes, specific effects, and forms of damage will be analyzed in order to derive suitable forms of regulation.
The second subproject also aims to provide recommendations for action that help consider possible risks and concerns at an early stage. “Here, we investigate the fears of AI experts and developers that AI systems could be developed that are completely beyond human control, pursue goals that conflict with human interests, and even threaten the survival of human civilization,” says project manager Reinhard Heil. (14.02.2024)
Further information:
- Unkontrollierbare Künstliche Intelligenz – ein existenzielles Risiko? Lecture by Reinhard Heil at Heidelberg University (in German)
- Subproject page Systemic risks of artificial intelligence on the ITAS website
- Subproject page Uncontrollable artificial intelligence: An existential risk? on the ITAS website