At first glance, social media platforms appear to be free spaces for political discussion. Instagram, for instance, has recently become a booming application for sharing political content, organizing protests, and calling for change (Stewart, 2020). The Black Lives Matter movement shook the online world so profoundly that its momentum translated into in-person protests across the globe (Stewart, 2020). Although this is a striking example of Instagram serving as a democratic space, there is more to what appears on screen. The unfortunate reality is that all social media platforms are designed for profit (Wheeler, 2017). These platforms are wired to curate a user’s newsfeed based on their personality and behaviours, making the app addictive; longer sessions let the companies play more ads and maximize revenue (Wheeler, 2017). Social media can look like a democratic space, but only when the right causes happen to trend. As soon as misinformation spreads, it can easily polarize nations and create chaos. Companies have failed to acknowledge that “prey[ing] on [human] psychology for their own profit” has damaged democracy (TED, 2017).
Instagram has over 500 million active users every day (Mohsin, 2020). With so many users online, what are they looking at? The answer depends on the user. No two newsfeeds are alike, because Instagram is designed to let users select whom they follow and what they like, and an algorithm tracks those digital footprints to curate a feed that constantly satisfies their interests. If a user consistently likes cute dog videos, the algorithm will keep serving cute dog videos. If a user repeatedly checks a certain Instagram account, that account’s posts will appear more often in their feed. With a design like this, the algorithm focuses solely on showing content that triggers one’s interest, which means surfacing any kind of content, whether fake or true. This leads to the next problem: social media fails to filter out fake news.
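The feed logic described above can be sketched in a few lines of Python. This is an illustrative toy, not Instagram’s actual system: the function `rank_feed`, its inputs, and the topic labels are all invented for the example. It shows the core idea that an engagement-based ranker scores posts by how often the user has engaged with similar content, regardless of whether that content is true.

```python
from collections import Counter

def rank_feed(candidate_posts, liked_topics):
    """Rank candidate posts by how often the user has liked each topic.

    candidate_posts: list of (post_id, topic) tuples
    liked_topics: topics of posts the user previously liked (the
        "digital footprint" the essay describes)
    """
    # Count past engagement per topic.
    affinity = Counter(liked_topics)
    # Posts on frequently liked topics float to the top. Note what is
    # missing: the score measures engagement, not accuracy.
    return sorted(candidate_posts,
                  key=lambda post: affinity[post[1]],
                  reverse=True)

# A user who mostly likes dog videos sees the dog post ranked first.
likes = ["dogs", "dogs", "dogs", "news", "dogs"]
posts = [("p1", "news"), ("p2", "dogs"), ("p3", "politics")]
print(rank_feed(posts, likes))  # the "dogs" post sorts ahead of the rest
```

Because the ranking criterion is engagement alone, a fabricated story on a favoured topic outranks an accurate story on an unfamiliar one, which is exactly the filtering failure the next paragraph takes up.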
Around 3.8 billion people across the globe use social media, and that number is expected to grow by 9% each year (Kemp, 2020). As more people gain access, information will spread at a greater rate to a greater audience. The concern is that social media platforms have become a primary news source for many people, especially the younger generations. According to YPulse (2020), an organization that collects data on young consumers, “Instagram is set to overtake Twitter as the top news source among young consumers with 75% of under-25-year-olds saying they use the platform for news” (para. 6). Yet companies built these platforms with the goal of profit; they did not design the space to host regulated political discussion. Many of those young users may not know that the news they see might be fake (YPulse, 2020). The dissemination of news also differs for every user: political views vary from person to person because the algorithm only shares information that will prolong a user’s screen time. The problem with social media as a news source is that posts are catered to one’s existing views. There is no transparency. The information fed into people’s accounts consists of ideas and discussions that reinforce and back up the views they already hold.
The lack of transparency in how news is distributed can create distrust between groups of people. The spread of misinformation can split communities and, even worse, polarize a nation. Sunstein (2018) describes a social experiment conducted in Colorado in which sixty American citizens were asked to deliberate on controversial issues in groups of six. The groups were divided into two categories, liberals and conservatives, and each subject was carefully screened to make sure they conformed to the stereotypes of their category (Sunstein, 2018). In short, the results were disturbing: “[i]n almost every group, members ended up with more extreme positions after they spoke with one another” (Sunstein, 2018, p. 3). This is group polarization in action (Sunstein, 2018). The Colorado experiment reveals the danger of echo chambers: when people are never shown the other side, they can easily push their own views to extremes. Social media platforms replicate this experiment on a daily basis, as users continue to like and share posts that reflect their opinions, and it shows how the algorithm can be detrimental to democracy.
Social media platforms have broken the barriers of time and distance between people. They have become a globalized space where anyone with a phone can like, comment on, and share any idea, and they have become a news source for a majority of young users. These aspects of social media seem to favour democracy. What people do not realize, however, is that every like, comment, and share feeds an algorithm that determines the next post or story on one’s newsfeed. This is how information circulates on all forms of social media. The platforms are designed to grab people’s attention and keep it for as long as possible (TED, 2017), because extending a user’s screen time lets the platform share as many ads as possible and maximize ad revenue. An algorithm designed this way does not favour the transparent dissemination of news; it focuses on maximizing monetization (Wheeler, 2017). A business model focused on profit, paired with an algorithm that craves attention, will only further divide people and destroy democracy.
Ghaffary, S., & Stewart, E. (2020, June 24). It’s not just your feed. Political content has taken over Instagram. Vox.
Gen Z & Millennials Have Very Different News Sources. (2020, July 20). YPulse.
Kemp, S. (2020, January 30). Digital 2020: 3.8 billion people use social media. We Are Social.
Mohsin, M. (2020, July 6). 10 Instagram stats every marketer should know in 2020.
Sunstein, C. (2018, January). Is social media good or bad for democracy? International Journal on Human Rights, 1(27), 1-4.
TED. (2017, July 28). How a handful of tech companies control billions of minds every day | Tristan Harris [Video file].
Wheeler, T. (2017, November 2). How social media algorithms are altering our democracy. Brookings.