CAMBRIDGE, MASSACHUSETTS - Not only are billions of people around the world glued to their mobile phones, but the information they consume has changed dramatically - and not for the better. On dominant social media platforms like Facebook, researchers have documented that falsehoods spread faster and more widely than comparable accurate content. Even though most users are not demanding misinformation, the algorithms that determine what people see tend to favor sensational, inaccurate, and misleading content, because that is what generates "engagement" and thus advertising revenue.
As internet activist Eli Pariser noted in 2011, Facebook also creates filter bubbles, whereby individuals are more likely to be shown content that reinforces their ideological leanings and confirms their existing biases. More recent research has demonstrated that this filtering process significantly shapes the information users encounter.
Even leaving aside Facebook's algorithmic choices, the broader social-media ecosystem allows people to find subcommunities that align with their interests. This is not necessarily a bad thing. Someone who is the only ornithology enthusiast in town no longer has to pursue the interest alone, because the same platforms connect them with fellow enthusiasts anywhere in the world. Unfortunately, the same applies to lone extremists, who can use these platforms to access or propagate hate speech and conspiracy theories.
The content herein is subject to copyright by Project Syndicate. All rights reserved. The content of the services is owned or licensed to The Yuan. The copying or storing of any content for anything other than personal use is expressly prohibited without prior written permission from The Yuan, or the copyright holder identified in the copyright notice contained in the content.