LONDON - The capacity to change minds through reasoned discourse lies at the heart of democracy. Clear and effective communication forms the foundation of deliberation and persuasion, which are essential for resolving competing interests. Unfortunately, there is also a dark side to persuasion - false motives, lies, cognitive manipulation, and other malicious behavior that artificial intelligence (AI) at times facilitates.
In the not-so-distant future, generative AI (GenAI) will likely enable the creation of new user interfaces designed to persuade people on behalf of any person or entity with the means to establish such a system. By leveraging private knowledge bases, these specialized models would offer competing versions of the truth, each vying to generate the most convincing responses for a target group - in effect, an AI for each ideology. A wave of AI-assisted social engineering would surely follow, with escalating competition making it easier and cheaper for bad actors to spread disinformation and perpetrate scams.
The emergence of GenAI has thus fueled a crisis of epistemic insecurity. The initial policy response has been to ensure that humans know they are engaging with an AI. Last June, the European Commission urged large tech companies to start labeling text, video, and audio created or manipulated by AI tools, and the European Parliament is now pushing for a similar rule in the forthcoming AI Act. This awareness, the argument goes, will prevent artificial agents - no matter how convincing they may be - from misleading people.
Alerting people to the presence of AI is a good start, but still not enough to safeguard them against manipulation.