How Social Media Algorithms Drive Political Polarization

Emilie Robichaud
Published in The Startup
5 min read · Oct 9, 2020

Social media connects people in ways never before possible: family members reunited, love found online, a sea of information at your fingertips... right? Well, yes, but social media and the algorithms working behind the scenes also have detrimental effects.

Behind your Google search bar, on your Facebook feed, and in the hashtags you see on Twitter, there is one common element: algorithms. An algorithm can be defined as a set of mathematical instructions or rules that, especially if given to a computer, will help to calculate an answer to a problem.¹ The algorithms behind social networks are how information is catered specifically to you. That can be beneficial, but there is also a dark side that many people are unaware of.
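To make that concrete, here is a deliberately simplified sketch — my own illustration, not any platform's actual code — of how a feed-ranking algorithm might cater content to a specific user by scoring posts against their past interests:

```python
# Toy feed-ranking sketch (hypothetical; real platforms use far more signals).
def rank_feed(posts, user_interests):
    """Order posts so the ones matching the user's interests come first."""
    def score(post):
        # Count how many of the post's topics the user has engaged with before.
        return sum(1 for topic in post["topics"] if topic in user_interests)
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "topics": {"sports"}},
    {"id": 2, "topics": {"politics", "economy"}},
    {"id": 3, "topics": {"politics"}},
]

# A politics-focused user sees the politics posts first.
print([p["id"] for p in rank_feed(posts, user_interests={"politics"})])  # [2, 3, 1]
```

Even this tiny rule set already "caters" the feed: two users with different interest profiles would see the same three posts in a different order.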

Myth of the Black Box

We often see social media as a tool we use, but it uses us as well. The algorithms behind social media are persuasive: they can influence what you buy, what news you consume, and who you vote for. This is referred to as persuasive technology, or persuasive algorithms: technology designed with the underlying motive of modifying a certain attitude or behavior by exploiting psychological and sociological theories such as persuasion and social influence.²

Let’s look at Google: you input the information you’re trying to learn about and you see results. How often have you thought about the order of these results? Why is one particular article ranked higher than the rest? Is it most popular? Has that spot been paid for? Questioning these things is important; as Crawford (2015) notes:

“[Describing a search engine as a black box] doesn’t acknowledge the ways in which individuals, institutions, and industries have emerged to attempt to “game” search algorithms. From paid Search Engine Optimization services to the tricks that people try to make their name appear first in Google searches, the spaces of intersection between humans and algorithms can be competitive and rivalrous, rather than being purely dictated by algorithms that are divorced from their human creators.” ³

These search engines are not “black boxes”; the algorithms behind them are built with a purpose or goal, and that purpose can have a major impact on real-world issues. For example, during the 2016 presidential campaign, Google was accused of manipulating search results to favor Hillary Clinton’s candidacy.⁴ And in 2017, the company was fined billions of dollars by the European Union for manipulating search results.⁵

One danger that accompanies these persuasive technologies is how little we may be aware of them. The relevant concept here is ‘algorithmic awareness’: the extent to which people are aware that ‘our daily digital life is full of algorithmically selected content’.⁶ In a study conducted by Eslami et al. (2015), ‘more than half of participants (62.5%) were not aware of the News Feed curation’ on Facebook (p. 1).⁶ The major consequence of this lack of algorithmic awareness is a distorted version of reality, and we can see its impact in recent politics.

The “Bubble”: How the Algorithm Drives Political Polarization

It’s clear that political polarization is increasing in America. The visual below from Pew Research Center shows how political beliefs have become more divisive in the past twenty years; liberals and conservatives are finding less and less common ground.

And there’s reason to believe that social media is driving some of this political polarization. As mentioned earlier, many people are unaware of the algorithms that work behind the scenes. If people think what they see on their feed is “news” rather than content curated specifically for them, and they only engage with people who share their beliefs, this creates a bubble: an online space that only reinforces what they already think.
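The reinforcement loop behind such a bubble can be sketched in a few lines. The following toy simulation — entirely my own illustration, with made-up weights and no claim to match any real platform — shows how a feed that boosts whatever a user engages with drifts toward showing only one side:

```python
import random

def simulate_feed(steps=50, seed=0):
    """Toy filter-bubble loop: engagement teaches the feed to show more of the same."""
    rng = random.Random(seed)
    # Start with equal odds of showing "left" or "right" content.
    weights = {"left": 1.0, "right": 1.0}
    user_preference = "left"  # the user only engages with content they agree with
    for _ in range(steps):
        total = sum(weights.values())
        shown = "left" if rng.random() < weights["left"] / total else "right"
        if shown == user_preference:
            weights[shown] += 1.0  # engagement boosts that side's future odds
    total = sum(weights.values())
    return weights["left"] / total  # final share of the feed on the preferred side

print(f"Share of feed matching the user's view: {simulate_feed():.2f}")
```

Starting from a 50/50 feed, the preferred side ends up dominating: each click makes similar content more likely, which produces more clicks. No one sets out to build a bubble; it emerges from the feedback loop itself.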

We’ll examine the concept of ideological homophily, defined as the tendency to choose to associate with others similar to oneself politically,⁷ within the scope of the social media platform Twitter. Takikawa & Nagayoshi (2017) studied Japanese Twitter to see whether users frequently engage with or follow people of differing beliefs. They found that:

“Each community discusses different issues that rarely overlap. Right-wing followers write about Korea, Korean Japanese, and dual nationality issues. Left-wing followers write about conspiracy law and the corruption of the government.”⁸

Their findings suggest that since each side is normally concerned with different issues, users rarely cross community lines; this is how an echo chamber forms. Furthermore, the introduction of Twitter’s “Who to Follow” feature has encouraged users to follow the same accounts as their alters, nudging them toward greater homophily.⁷ An increase in homophily drives political polarization because, as Myers and Lamm (1976) note, individuals who participate in homogeneous discussion groups tend to adopt more extreme positions after deliberating with like-minded peers. The frightening part is that the algorithms underlying social media today encourage exactly this.
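The nudge from a “Who to Follow”-style feature can be illustrated with a toy friends-of-friends recommender — a hypothetical sketch, not Twitter’s actual algorithm. Because suggestions come only from accounts that your existing follows already follow, they stay inside your own community:

```python
# Toy "Who to Follow" sketch (hypothetical): suggest accounts your follows follow.
def recommend(follows, user):
    """Suggest accounts followed by the accounts `user` already follows."""
    suggestions = set()
    for friend in follows.get(user, set()):
        suggestions |= follows.get(friend, set())
    # Don't suggest people already followed, or the user themselves.
    return suggestions - follows.get(user, set()) - {user}

# Two communities with no follow edges between them: a/b/c and x/y/z.
follows = {
    "a": {"b"}, "b": {"c"}, "c": {"a"},
    "x": {"y"}, "y": {"z"}, "z": {"x"},
}

print(recommend(follows, "a"))  # {'c'} — only accounts from a's own community
```

Since no edge crosses between the two communities, no recommendation ever will either: the feature can only deepen the existing clusters, which is exactly the homophily-reinforcing effect described above.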

Designing a Better Future

This technology-driven world is relatively new to all of us, but that is no excuse to ignore the real-world ramifications unfolding as a consequence. Understanding the underlying problem with these algorithms is the first step toward designing a better future. These algorithms were not developed with ethics in mind, but that can be changed. We can design a better future.

“If you are getting all of your information off algorithms on a phone, it’s just reinforcing whatever biases you have …. That’s what is happening with these Facebook pages where more and more people are getting their news from. At a certain point, you just live in a bubble, and that’s part of why our politics is so polarized right now. I think it’s a solvable problem, but it’s one that we need to spend some valuable time thinking about.”

-Barack Obama on David Letterman, 2018

[1]: ALGORITHM: Meaning in the Cambridge English Dictionary. (n.d.). Retrieved October 08, 2020, from https://dictionary.cambridge.org/dictionary/english/algorithm

[2]: What is Persuasive Technology. (n.d.). Retrieved October 08, 2020, from https://www.igi-global.com/dictionary/persuasive-technology/22565

[3]: Crawford, K. (2015). Can an Algorithm be Agonistic? Ten Scenes from Life in Calculated Publics. Science, Technology, & Human Values, 41(1), 77–92. doi:10.1177/0162243915589635

[4]: Sullivan, D. (2016, June 7). Google says it’s not deliberately filtering “Crooked Hillary” suggested search to favor Clinton. Retrieved October 08, 2020, from https://searchengineland.com/google-crooked-hillary-251152?utm_content=buffer3e25f

[5]: Romm, T. (2017, June 27). Europe has fined Google $2.7 billion for manipulating search results. Retrieved October 08, 2020, from https://www.vox.com/2017/6/27/15878980/europe-fine-google-antitrust-search

[6]: Bucher, T. (2019). The algorithmic imaginary: Exploring the ordinary affects of Facebook algorithms. The Social Power of Algorithms, 30–44. doi:10.4324/9781351200677–3

[7]: Boutyline, A., & Willer, R. (2016). The Social Structure of Political Echo Chambers: Variation in Ideological Homophily in Online Networks. Political Psychology, 38(3), 551–569. doi:10.1111/pops.12337

[8]: Takikawa, H., & Nagayoshi, K. (2017). Political polarization in social media: Analysis of the “Twitter political field” in Japan. 2017 IEEE International Conference on Big Data (Big Data). doi:10.1109/bigdata.2017.8258291
