Project

The rapid pace of technological innovation challenges established structures of knowledge production and dissemination. This project proposes a philosophical investigation of Large Language Models (LLMs) and their impact on democratic societies. Capable of generating vast quantities of human-like text, LLMs are likely to have significant epistemic effects.

Our research question is: What epistemic effects can be expected from LLMs in democratic settings, and how can we improve the odds that this technology will serve democracy rather than harm it?

We focus on three areas: (1) LLMs’ impact on democratic discourse; (2) their implications for the epistemic authority of expertise and the legitimacy of democratic institutions; and (3) the ethical responsibilities of governmental bodies and citizens in shaping and using LLMs.

Utilising methods of analysis and conceptualisation common in social epistemology and the philosophy of technology, we aim to advance understanding of the intricate relationship between technology and democracy.

The project is supported by the Czech Science Foundation (GACR) under grant No. 24-11697S.