I2SC Lecture Series (Recording): David Garcia (Computer Science, University of Konstanz), Language Understanding as a Constraint on Consensus Size in LLM Societies
Date: December 13, 2024
Abstract:
Applications of Large Language Models (LLMs) are moving towards collaborative tasks in which several agents interact with each other, as in an LLM society. In such a setting, large groups of LLMs could reach consensus about arbitrary norms for which there is no information supporting one option over another, regulating their own behavior in a self-organized way. In human societies, the ability to reach consensus without institutions is limited by the cognitive capacities of humans. To understand whether a similar phenomenon also characterizes LLMs, we apply methods from complexity science and principles from behavioral sciences in a new approach to AI anthropology. We find that LLMs are able to reach consensus in groups and that the opinion dynamics of LLMs can be understood with a function parametrized by a majority force coefficient that determines whether consensus is possible. This majority force is stronger for models with higher language understanding capabilities and decreases for larger groups, leading to a critical group size beyond which, for a given LLM, consensus is infeasible. This critical group size grows exponentially with the language understanding capabilities of models; for the most advanced models, it can reach an order of magnitude beyond the typical size of informal human groups.
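To give a feel for the kind of opinion dynamics the abstract describes, here is a minimal simulation sketch. It is not the model presented in the talk: it assumes binary opinions and an adoption probability of the sigmoidal form m^β / (m^β + (1−m)^β), where m is the fraction of the group holding an opinion and β plays the role of a majority force coefficient; the functional form, agent count, and update rule are illustrative assumptions.

```python
import random


def adoption_probability(m: float, beta: float) -> float:
    """Probability of adopting opinion A when a fraction m of the group holds A.

    beta is an assumed majority force coefficient: beta > 1 amplifies the
    majority, beta = 1 is neutral copying, beta < 1 dampens the majority.
    """
    if m in (0.0, 1.0):
        return m
    return m**beta / (m**beta + (1.0 - m) ** beta)


def simulate(n_agents: int, beta: float, steps: int, seed: int = 0) -> float:
    """Asynchronous updates: each step a random agent re-samples its opinion
    from the current majority fraction. Returns the final fraction holding A."""
    rng = random.Random(seed)
    # Start from an even split: no information favors either option.
    opinions = [1] * (n_agents // 2) + [0] * (n_agents - n_agents // 2)
    for _ in range(steps):
        i = rng.randrange(n_agents)
        m = sum(opinions) / n_agents
        opinions[i] = 1 if rng.random() < adoption_probability(m, beta) else 0
    return sum(opinions) / n_agents


if __name__ == "__main__":
    # A stronger majority force tends to pull the group toward consensus
    # (fraction near 0 or 1); weaker values leave opinions drifting,
    # illustrating why a critical strength separates the two regimes.
    for beta in (0.8, 1.0, 1.5, 3.0):
        final = simulate(n_agents=50, beta=beta, steps=20_000, seed=42)
        print(f"beta={beta:>3}: final fraction holding A = {final:.2f}")
```

In this toy setting, the talk's finding corresponds to the effective β of an LLM group shrinking as the group grows, so that for each model there is a group size beyond which the consensus-driving regime is no longer reachable.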
You can watch the recording of the talk below.