Apr 10, 2024

The Divisive Force of Technological Polarization: Understanding the TikTok Ban

Written by: Alexandra Mork and Njambi Karobia

In the ever-evolving landscape of social media platforms, TikTok has captivated millions with its short-form videos and trend-setting content. However, beneath its entertaining surface lies a growing concern: the threat technological polarization poses to democratic societies. As governments grapple with the implications of this phenomenon, the recent legislative push to ban TikTok serves as a stark reminder of the divisive forces at play.

The allure of TikTok lies in its algorithmically curated feed, which tailors content to users’ preferences and creates a filter bubble in which individuals are exposed primarily to viewpoints that align with their own. While this personalized experience may seem harmless, it fosters a sense of tribalism and reinforces pre-existing beliefs, contributing to the polarization of society. Moreover, the viral nature of TikTok content amplifies extreme viewpoints, drowning out moderate voices and exacerbating societal divisions. As users engage with content that confirms their biases, they become entrenched in their perspectives, further widening the ideological chasm.
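To make the feedback loop concrete, the toy simulation below shows how engagement-driven ranking can narrow what a user sees over time. It is a minimal sketch under stated assumptions: the one-dimensional “ideological axis,” the update rule, and every number are illustrative inventions, not a description of TikTok’s actual recommendation system.

```python
import random
import statistics

# Toy model (illustrative assumption, not TikTok's real system): each video and
# each user's inferred interest is a point on a one-dimensional axis in [-1, 1].

def recommend(user_interest, catalog, k=10):
    """Rank videos by closeness to the user's inferred interest; return the top k."""
    return sorted(catalog, key=lambda video: abs(video - user_interest))[:k]

def simulate(steps=100, seed=42):
    rng = random.Random(seed)
    catalog = [rng.uniform(-1, 1) for _ in range(2000)]   # all available videos
    interest = 0.2                                        # mild initial lean
    exposure = []
    for _ in range(steps):
        feed = recommend(interest, catalog)
        watched = rng.choice(feed)                        # user engages with a nearby item
        interest = 0.9 * interest + 0.1 * watched         # platform nudges its estimate toward it
        exposure.extend(feed)
    # Compare the spread of everything available with the spread of what was actually shown.
    return statistics.pstdev(catalog), statistics.pstdev(exposure)

if __name__ == "__main__":
    catalog_spread, feed_spread = simulate()
    print(f"spread of all content:  {catalog_spread:.2f}")
    print(f"spread of content seen: {feed_spread:.2f}")   # much narrower: the filter bubble
```

Even in this simplified setting, the content the simulated user actually sees spans a far narrower ideological range than the catalog as a whole, which is the dynamic the paragraph above describes.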

The possible decision to ban TikTok reflects growing concerns about its potential to undermine democratic principles. By perpetuating echo chambers and amplifying extreme viewpoints, TikTok threatens the fabric of civil discourse essential for a functioning democracy. In response, governments have moved to curb its influence, citing national security concerns and the need to safeguard democratic values.

However, a ban on TikTok raises questions about the balance between security and freedom of expression. While governments have a responsibility to protect citizens from external threats, they must also uphold the principles of free speech and open discourse. Banning platforms like TikTok may quell immediate concerns but risks setting a dangerous precedent for censorship and government overreach.

Additionally, TikTok’s global reach allows for the rapid dissemination of information, making it a powerful tool for shaping public opinion and influencing political discourse. However, this same feature also makes TikTok susceptible to manipulation by malicious actors seeking to spread disinformation or sow discord. The proliferation of fake news and conspiracy theories on the platform undermines trust in democratic institutions and erodes the foundation of informed citizenship.

The decision to ban TikTok in certain countries reflects concerns about its potential to undermine national security and democratic values. For instance, the United States cited data privacy concerns and the perceived threat of Chinese government influence as primary reasons for moving to potentially ban the platform. Similarly, India banned TikTok amid escalating tensions with China and concerns about the platform’s role in spreading misinformation and inciting violence.

While the motivations behind these bans may vary, they underscore the need for robust regulation and oversight of social media platforms. However, the effectiveness of bans in addressing the root causes of technological polarization is questionable. Merely prohibiting access to TikTok does not address the underlying issues of algorithmic bias, echo chambers, and the spread of disinformation. Instead, it may push users towards alternative platforms with similar pitfalls, perpetuating the cycle of polarization.

To effectively combat technological polarization, a holistic approach is required, encompassing both regulatory interventions and societal initiatives. Platforms must prioritize the ethical design of algorithms, ensuring transparency, accountability, and fairness in content moderation and recommendation systems. This includes mechanisms for users to customize their content preferences and control the information they are exposed to, empowering them to break free from filter bubbles and engage with diverse viewpoints.
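One way a platform could expose such a user control is sketched below: a greedy re-ranking step with a user-adjustable diversity weight that trades personal relevance against variety in the feed. The function names, scoring formula, and parameters are hypothetical illustrations, not any platform’s real API.

```python
# Hypothetical sketch of a user-facing diversity control for a recommendation feed.
# Items are again points on an illustrative one-dimensional axis (an assumption).

def rerank(candidates, user_interest, diversity_weight=0.3, k=10):
    """Greedily pick k items, balancing closeness to the user's interest
    against distance from items already selected (a simple diversity bonus)."""
    selected = []
    pool = list(candidates)
    while pool and len(selected) < k:
        def score(item):
            relevance = 1.0 - abs(item - user_interest)                    # closeness to the user's lean
            novelty = min((abs(item - s) for s in selected), default=1.0)  # distance from picks so far
            return (1 - diversity_weight) * relevance + diversity_weight * novelty
        best = max(pool, key=score)
        selected.append(best)
        pool.remove(best)
    return selected

# Usage: a user who opts into more diverse recommendations raises the weight.
candidates = [i / 10 for i in range(-10, 11)]
feed_default = rerank(candidates, user_interest=0.6, diversity_weight=0.1)
feed_diverse = rerank(candidates, user_interest=0.6, diversity_weight=0.7)
print(feed_default[:5])   # clustered near the user's existing lean
print(feed_diverse[:5])   # spread across a wider range of the axis
```

The design choice here is simply to make the trade-off visible and adjustable by the user, rather than fixed by the platform, which is the kind of transparency the paragraph above calls for.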

In addition to platform-level interventions, media literacy programs are essential to equip users with the skills and knowledge necessary to critically evaluate information and navigate the digital landscape responsibly. By promoting digital literacy and critical thinking skills, individuals can become more discerning consumers of online content, less susceptible to manipulation, and better equipped to engage in constructive dialogue across ideological divides.

Governments also play a crucial role in shaping the regulatory framework for social media platforms, balancing the need for security and public safety with the protection of fundamental rights and freedoms. This includes enacting legislation to safeguard data privacy, combat disinformation, and promote online civility and respect for diverse viewpoints. However, regulatory measures must be carefully crafted to avoid unintended consequences, such as censorship or stifling innovation.

Ultimately, addressing the threat of technological polarization requires collective action and collaboration across stakeholders, including governments, technology companies, civil society organizations, and individual users. By working together to promote digital literacy, foster online civility, and uphold democratic values, we can mitigate the risks posed by polarization and create a more inclusive and resilient digital society.

TikTok is emblematic of the challenges posed by technological polarization and the need for proactive measures to address them. While bans may offer short-term solutions, they do not address the root causes of polarization and may have unintended consequences. Instead, we must focus on promoting transparency, accountability, and digital literacy to empower users to navigate the digital landscape responsibly and foster a more inclusive and democratic online discourse.



2 Comments

  1. Kate Hein

    One of the solutions mentioned, allowing users to customize their content preferences, may also increase the confirmation bias the algorithm already creates. If users are allowed to choose what content they see, such as opting into both viewpoints, it would create a less polarized algorithm. However, if users are allowed to select the content they do not see, would that not allow people to choose to see only one viewpoint? For instance, instead of people opting to receive both Fox News and CNN related videos as intended, could users choose not to watch Fox News related videos and only receive CNN videos?

  2. Maria Meco

    While the concept of enabling users to filter their content may initially appear as an optimal solution, I contend that it undermines our capacity to engage with diverse perspectives. Such filtering perpetuates the polarization of viewpoints and the dissemination of misinformation, as individuals are only exposed to content that aligns with their beliefs, perpetuating a cycle of reinforcement. Instead, I advocate for implementing filtering mechanisms at the app level to prevent the spread of misinformation from occurring in the first instance.
