Mar 21, 2021

The Challenge of Propaganda in Today’s Social Media

Written By: Margaret Purnhagen

The internet has become a net of propaganda with little prospect for change. As social media plays an ever larger role in everyday life, propaganda designed to influence public opinion has grown alongside it. Bad actors have learned from past successes how to make effective propaganda and have translated those strategies into modern technology. Teams of analysts have found new ways to use the data collected by social media companies, allowing propaganda creators to personalize every piece of information users consume.

Using media to spread propaganda has significant historical precedent, and that history shows when the messaging is most effective. When in power, the Nazis manipulated radio broadcasts to amplify their views, gather supporters, and stoke anti-Semitic feeling in their base. Many Germans believed the propaganda because radio was their only source of current information, leaving them no way to check the Nazis' messaging against other accounts. However, Maja Adena and her fellow researchers found the propaganda was only successful where the target region had prior anti-Semitic beliefs and a historical context of discrimination against the Jewish population. A preexisting bias is essential for propaganda promoting extremist ideas to be effective. The bias feeds into an "us versus them" narrative, turning a minority group into a villain or scapegoat for the target population's problems.

There are countless examples of actors adapting Nazi propaganda techniques. In Rwanda, Hutu extremists used radio to radicalize listeners against the Tutsis. Like the Nazis, they used radio to reach masses of people efficiently, repeating dehumanizing misinformation and appealing to preexisting biases. In Myanmar, by contrast, Facebook was used to spread misinformation about the Rohingya. The military and other extremist groups played on the public's lack of internet literacy and exploited Facebook to spread propaganda rapidly and without any monitoring. In both cases, the propaganda scapegoated a minority group, which led to extreme violence against that group. How rapidly propaganda reaches its target audience can determine how quickly the violence escalates, and that speed depends on the technology available.

Today’s media acts as a pipeline linking mainstream ideas to extremist propaganda. Social media users search for a phrase or watch a video, and the “watch next” algorithm recommends more content on the same topic. As Kevin Roose shows in his in-depth look at how YouTube has radicalized viewers, the algorithms used to personalize the viewer’s experience now surface ever more extreme material to keep viewers engaged. Viewers help the algorithm along, self-selecting into similar material rather than actively looking for counterarguments. They learn to see propaganda as the truth and fall deeper into the echo chamber. These viewers let the algorithm choose their worldview, often coming to see the opposing political party as an enemy working against their values. This dynamic has allowed propaganda to spread throughout the internet, quickly normalizing extremist ideas and indoctrinating those most vulnerable to its message. In The Origins of Totalitarianism, Hannah Arendt argues that indoctrination through propaganda is a vital step in a radical party's taking and holding power. The party targets those who feel disconnected and disenfranchised from society, offering them both an explanation for those feelings and the party’s solution. The algorithms link fringe ideas together, giving them a consistency that makes them more believable. Viewers are then primed for further propaganda, bias, misinformation, and conspiracy.

Using social media to spread propaganda extends beyond internal power struggles. The Oxford Internet Institute found that 70 countries use social media to shape public opinion. The 2019 study explains how states manipulate public and private platforms to censor dissenting views, spread misinformation to sway public opinion, or flood forums with disinformation to confuse the public. Computational propaganda, the practice of deploying both human-run and automated bot profiles to influence media, exploits the same “watch next” algorithms. Bot-created fake groups and profiles with large followings are constantly recommended by the algorithm, pumping out fake news to influence or confuse their members. These fake groups and profiles disguise the truth and blame every problem on the opposition party, once again casting it as the villain. In 2016, Russia used computational propaganda in an attempt to influence multiple elections around the world. Through propaganda, China, Russia, and others have mastered Arendt’s spheres of influence: the external sphere focused on broad appeals to allies abroad, the internal sphere on keeping control of their own citizens. Specialized teams within these governments continuously analyze the data collected through social media to find ways to push propaganda within each sphere, controlling narratives and furthering their goals worldwide.

As social media continues to gain a significant foothold in everyday life, it collects more data about its users. Social media companies gather data on nearly every aspect of life, and bad actors can use that data to create more impactful propaganda. This cycle of data analysis and data-driven propaganda creation has woven a net that is nearly impossible to escape once a user is indoctrinated, since the propaganda creators determine the facts that user sees. There is some hope for change: users have begun turning the propaganda creators’ own tricks against them. Roose highlights a group of YouTubers who use the same tags as the radical right to post videos presenting opposing viewpoints and identifying misinformation. These counter-videos give viewers a more balanced set of recommendations and can halt further indoctrination. Without widespread and effective intervention, however, many will remain inside the propaganda echo chamber.


2 Comments

  1. Amna Rana

    Margaret, I think you have identified an extremely relevant issue of our times. Social media is such a dangerous tool of misinformation, especially in places that lack media literacy. There are also several pockets of the population in the U.S. that suffer from this lack of media literacy, and I think that conspiracy theories like QAnon and dangerous actions like the Capitol riot can partly be blamed on it. This is also such a difficult issue to solve because it is so fundamental. Many people do not know how to properly research or distinguish between facts and opinion/misinformation, because they have not been taught to do this. Hopefully, with time, as more social media challenges come to the forefront, we as a society will address these challenges aptly. The main concern is one of time, since technological developments will almost always be one step ahead of regulation. This is also one area where democracies are at a disadvantage because authoritarian governments can not only freely spread misinformation, but they can also regulate the media in ways that democracies simply can’t (or at least they shouldn’t).

    Furthermore, your description of social media as an “echo chamber” is very pertinent. The mechanism at play here is reinforcement. Maja Adena identified this mechanism in Nazi Germany – propaganda was more successful in places where there was more pre-existing anti-Semitic bias. So, the issue of a misinformation echo chamber is not only that social media companies use data to present viewers with similar ads/content, but also that this content reinforces people’s existing views and biases. Therefore, the social media algorithm does not necessarily “choose their worldview,” but it certainly reinforces and solidifies it!

  2. Jacob Waddell

    Margaret, I really enjoyed reading this post and wholeheartedly agree with you about social media’s ability to spread propaganda at an unprecedented pace. Though your post mainly covers what content gets posted to social media sites, I couldn’t help but think how the platforms themselves are becoming propaganda battlegrounds as well and how confusing the messaging can get from elected officials, particularly in the prior administration. Donald Trump was obviously notorious for using Twitter to reach his audience but would rail against the company whenever it took any action he perceived to be infringing upon his freedom of speech. Then, in August last year, his administration attempted to ban WeChat and TikTok over fears of Chinese intelligence gathering, abandoning the American ethos of free markets and free speech in the process. To be fair, plenty of Chinese propaganda is spread through WeChat and TikTok as well, but the incident further demonstrates to me that few if any of our elected officials have any real idea how to handle the content being spread on social media, let alone the platforms’ competition for market share and users.
