Citizens versus the Internet: How can we protect ourselves against manipulation, fake news, and other digital challenges?

Tips from a behavioral science perspective

February 12, 2021

In contrast to the offline world, the online world is largely driven by the logic of the attention economy: Users’ attention is a precious currency, and online environments are designed to capture and steer that attention. Yet users and legislators currently have little say in how these environments are regulated and controlled; instead, this responsibility is mostly left in the hands of corporations. How can users respond to these challenges of the digital age and how might the design of the online world be improved? A team of researchers from the Max Planck Institute for Human Development and the University of Bristol has addressed these questions from the perspective of behavioral science. In an article published in Psychological Science in the Public Interest, they propose some answers.

The Internet has revolutionized our lives – whether in terms of working, finding information or entertainment, connecting with others, or shopping. The online world has made many things easier and opened up previously unimaginable opportunities. At the same time, it presents both individuals and societies with major challenges: The underlying technologies do not necessarily serve users’ best interests.

“We’re interested in questions such as: How can we create online environments that respect human autonomy and promote truth? And what can people themselves do to avoid being misled and manipulated?” says Anastasia Kozyreva, lead author and researcher at the Center for Adaptive Rationality at the Max Planck Institute for Human Development. The research team began by examining the differences between the online and offline worlds, and identified four major challenges.

(1) User behavior is influenced by manipulative choice architectures. These “dark patterns” steer users toward unintended behaviors; they include advertising that blends into the content or navigation of a page to generate more clicks, or confusing privacy settings that prompt people to share more information than they really want to.

(2) The information presented by AI-powered information architectures is not neutral; it is personalized on the basis of the data collected from users. This means that two people who enter the same term into a search engine will probably be shown different results. That can be helpful if, for example, we want to look up a restaurant and the search engine displays hits in our neighborhood at the top of the list, rather than a restaurant with the same name on the other side of the world. But if we are shown news or political content solely on the basis of our preferences, we risk finding ourselves in a filter bubble where we are no longer exposed to any other opinions.

(3) The research team sees false and misleading information as another challenge for people online. Videos and posts propagating conspiracy theories and unverified rumors can spread rapidly through social media, causing real harm. For example, people may decide not to get vaccinated due to misinformation about vaccines, putting themselves and others at risk.

(4) Distracting online environments constantly seek to attract users’ attention – whether by means of push notifications, flashing displays, pop-up ads, or constantly updated content. The aim is to capture and hold users’ attention for as long as possible: That is the very basis of Internet platforms’ business models. We find ourselves spending far more time on our screens than we intended – with no real benefit and at the cost of our attention for other things. 

Taking a behavioral science perspective, the researchers propose specific interventions to address these four challenges. They suggest that “boosting tools” can be used to train new competencies and enable better, more autonomous decisions in the online world.

Self-nudging is one of the cognitive tools people can use to create “healthier” choice and information environments for themselves. It empowers them to set up their digital environment in the way that works best for them. This might involve turning off app notifications or rearranging one’s smartphone home screen so that only useful apps are displayed: the calendar, camera, and maps, for example, along with meditation and weather apps. Anything overly distracting, such as social media and games, is better tucked away in folders. The researchers also recommend that users consciously set time limits on their social media use.

“The digital world is full of traps,” says Ralph Hertwig, Director of the Center for Adaptive Rationality at the Max Planck Institute for Human Development. “But we can take steps to avoid falling into them. In the same way as we might hide our chocolate stash at the back of the cupboard and put a bowl of apples on the table, we can turn off notifications from apps that permanently demand our attention. Out of sight, out of mind – whether in real life or in the digital world.”

And just as we look right and left before crossing a street, we should make a habit of asking certain questions to evaluate the content we encounter online. Questions such as: What is the origin of the information? Which sources are cited? Can I find similar content on reputable websites? This approach can boost users’ competence in evaluating the reliability of online information. But Internet platforms could also help users to assess content – for example, by displaying decision trees that remind users to check the source and the facts before sharing content.

More generally, however, policymakers also need to consider putting in place stronger regulatory measures to ensure that Internet users retain control over the digital environment and their personal data – for example, through privacy-friendly default settings. Last but not least, the smart and self-determined use of digital technologies needs to be taught both in schools and in adult education. The earlier, the better.

The researchers emphasize that none of the interventions they propose can singlehandedly counter online manipulation or prevent the spread of misinformation. “It will take a combination of smart cognitive tools, early media literacy education, and a regulatory framework that limits the power of commercial interests to hijack people’s attention to make the online world a more democratic and truthful place,” says Stephan Lewandowsky, professor of cognitive psychology at the University of Bristol.

Original publication

Kozyreva, A., Lewandowsky, S., & Hertwig, R. (2020). Citizens versus the Internet: Confronting digital challenges with cognitive tools. Psychological Science in the Public Interest, 21(3), 103–156. https://doi.org/10.1177/1529100620946707
