Fake News and Ideological Polarization

Filter Bubbles and Selective Exposure on Social Media

As humans, we're drawn to people who share our beliefs; it's simply our nature. And with the advent of the digital age, finding like-minded people has become even easier. Thanks to social media and its algorithms, we are now closer than ever to those who agree with us.

This pull toward similarity may appear harmless, but it can be extremely dangerous. Living in a bubble of like-minded people distorts our perception of the world, and in many cases it does so without our knowledge. Because it is so easy to gather information that confirms our beliefs, we may be unaware of the more powerful forces at work.

And yet, we have almost grown accustomed to it. With today's smart algorithms and customized software, social media platforms can study us in unprecedented detail, examining which brands we like and which pages we follow. Once collected, that data is returned to us in the form of targeted ads and personalized campaigns.

While some may argue that collecting information about brand preferences is harmless and can even help us make better purchasing decisions, this is not entirely accurate. In reality, even seemingly innocuous data, such as brand preferences, can be used for broader and more harmful forms of personalization, including ideological alignment.

What Role Does Social Media Play in Politics and Elections?

In his 2017 study, “Fake News and Ideological Polarization: Filter Bubbles and Selective Exposure on Social Media,” Dominic Spohr posed this exact question. In researching the role of social media in news consumption, he drew several important conclusions about how politics and media intersect.

One of Spohr's many findings was that political polarization has long-term consequences, including reduced diversity of opinion, less common ground, and a weakening of democracy. And it wasn't confined to the internet, either: political divisions frequently spilled over into offline life, contributing to today's political schisms. Furthermore, because online communities often amplify extreme views, they deepen the discord between citizens.

Yet it almost seemed unavoidable. The ease with which individuals could form communities of like-minded people bred political intolerance so extreme that national consensus came to be regarded as an impossible goal. And as politics became more divisive, other aspects of daily life were affected as well.

According to theorists, this was never supposed to happen. Indeed, the introduction of new technologies and social media was expected to improve political communication. But because platforms were designed to manage information selectively, surfacing some content while withholding the rest, the opposite happened.

We were surrounded by so much data that algorithms began presenting us with only the most relevant information. In other words, social media introduced the "filter bubble" to eliminate the issue of "information overload."
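
To make that mechanism concrete, here is a minimal, purely illustrative Python sketch (not Spohr's model, and not any platform's actual code): a recommender that ranks items only by how closely they match a user's past engagement will, by construction, keep the feed inside a narrow ideological band and rarely surface cross-cutting content.

    # Toy example only: a relevance-only ranker that scores items by how
    # closely their "leaning" matches the user's profile. The feed it builds
    # stays in a narrow band, so cross-cutting content rarely appears.
    import random

    random.seed(1)

    # Each item carries an ideological leaning between -1.0 and +1.0.
    CATALOG = [{"id": i, "leaning": random.uniform(-1, 1)} for i in range(200)]

    def rank_feed(profile, catalog, size=5):
        # Pick the items whose leaning sits closest to the user's profile.
        return sorted(catalog, key=lambda item: abs(item["leaning"] - profile))[:size]

    def update_profile(profile, feed, weight=0.3):
        # Nudge the profile toward the average leaning of what was just consumed.
        consumed = sum(item["leaning"] for item in feed) / len(feed)
        return (1 - weight) * profile + weight * consumed

    profile = 0.1  # a user who starts only slightly off-centre
    for session in range(5):
        feed = rank_feed(profile, CATALOG)
        leanings = [item["leaning"] for item in feed]
        print(f"session {session}: profile={profile:+.2f}, "
              f"feed spans {min(leanings):+.2f} to {max(leanings):+.2f}")
        profile = update_profile(profile, feed)

Because nothing in this toy ranker rewards diversity, everything the user sees sits within a sliver of their own leaning; real recommender systems are far more sophisticated, but the narrowing tendency it illustrates is the same one described above.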

Taking things a step further, these platforms began curating content to cater to their users' preferences at the expense of reliable information, which helped fake news stories spread. Constructive debate became increasingly scarce, making it even more difficult for users to stay informed.

In this new reality, people chose affirmation and reinforcement over debate and discussion, regardless of the truth. And with time, things only got worse.

Even so, our journey is not yet over. The truth is that there is still reason for optimism, and understanding how it all began is the first step toward taking action. After all, knowledge, not chance, determines change.

How Did We Get So Politically Divided in the First Place?

Theorists present two arguments in response to this question. The first is that algorithms shape our online experiences, placing us in echo chambers with people who believe the same things we do. According to this theory, technology is to blame for political polarization. On the other hand, some theorists argue that we actively seek information that confirms our pre-existing beliefs, or simply put, that we divide ourselves.

Whichever framework you favour, the fact remains that users now prefer to receive their information online, whether they do so deliberately or not. In many ways, it is clear that we have abandoned traditional methods of information gathering in favour of a "news-finds-me" mentality. As a result, we no longer engage with cross-cutting content, preferring to consume only what is most easily accessible to us.

In the absence of a middle ground, we rarely encounter information outside our field of interest and tend to consume media that reinforces our beliefs. In part, this is due to what researchers call "selective exposure" and "availability bias": we seek out content that confirms what we already think and place undue trust in whatever the algorithms make most readily available.

Is it Possible to Escape a Filter Bubble?

The bad news is that we no longer seek information the way we once did, which leads us to trust the wrong channels. The good news is that studies have shown these habits are easily broken.

Reporting on a study of ideological polarization, the Guardian noted that reading news from outside your echo chamber exposes you to opposing viewpoints and encourages you to seek out more information. The finding confirmed one thing: for some people, political opinions can change, creating what is currently lacking, a middle ground.

In fact, finding a happy medium may be the key to resolving all of this. By dismantling "echo chambers" and "filter bubbles," we can facilitate progressive public discussion and increase exposure to political differences. In turn, this encourages political diversity and broadens our understanding.

How Can We Find a Happy Medium?

To move forward, it is necessary to investigate the factors that contribute to political polarization. After all, understanding the source will help us figure out how to stop it.

Technology companies such as Google, Facebook, and Twitter may need to be held accountable for the spread of false news and misinformation. That could mean redesigning algorithms so that tracking systems no longer monitor users' online activities to personalize what they see. If carried out, such changes would mean fewer echo chambers and greater access to content that crosses ideological boundaries.

Furthermore, we must recognize that consuming cross-cutting content is an active process, not something that happens automatically when we are online. Therefore, it is critical to make a concerted effort to seek out multiple reliable sources when looking for political information.

In the end, fake news is only as powerful as we make it. By thinking critically, we can prevent its spread. Observe things with curiosity, but don't let them blind you. And, as always, do your own research by visiting a variety of reputable government websites and credible sources that are backed up by research.

To begin, here are some tips you can use to identify trustworthy media.


Study Objective & Methods

Fake News and Ideological Polarization: Filter Bubbles and Selective Exposure on Social Media

Dominic Spohr, Media and Communications Graduate


Published in Business Information Review, 2017.

The study examines recent social media trends, along with key political events such as the 2016 US Presidential Election and the 2016 UK EU Referendum, to assess ideological polarization on social media. Researcher Dominic Spohr goes a step further, investigating contributing factors such as availability bias and selective exposure.
Based on concepts such as the "Risky Shift" phenomenon and the "news-finds-me" perception, he concludes that ideological polarization undermines public discourse by creating echo chambers. In the final section, the study discusses future research opportunities as well as what industry, society, and policymakers can do.

Become E Certified

This research (and all our social media and well-being articles) has laid the foundation for our E Certification training: a 3-course program for anyone wanting to approach social media and communications in a way that protects well-being and puts people first. Learn more here.
