In the past, the word “authoritarian” brought to mind images of violent repression, censorship, and intensive surveillance. Today, however, autocrats have a new set of tools that allow them to be subtler in their methods of control, both at home and abroad. The internet and social media are key to autocrats’ strategies for domestic stability and have been used to great effect in damaging confidence in democracy worldwide. Though it is not the only country to use social media in this way, Russia is the most controversial and sensational in the American consciousness, and a perfect case study of the ways authoritarian governments use the internet and disinformation to increase control at home and destabilize the idea of liberal democracy.
How is disinformation used within Russia?
While Russia does engage in censorship and restrict criticism of the government, it also has a unique method of spreading propaganda, which Christopher Paul and Miriam Matthews of RAND call a “firehose of falsehood”. Disinformation is spread through every type of media, constantly, and at a high volume. Competing narratives are completely drowned out by the sheer amount of false information being disseminated, and the high volume of different sources both increases the number of recipients and renders similar content more believable. Falsehoods are often founded in truth, and desirable interpretations of real events are attributed to experts who in fact hold opposing views.
These tactics make truth and reality difficult to grasp. With so much information being put out constantly, it is impossible to refute all of it. Moreover, it makes domestic political engagement difficult. Russian trolls invade popular discourse (Kurowska & Reshetnikov), making it harder for citizens to know whether what they are receiving is real political dissent or manufactured content. For some, this might result in cynicism and disengagement, born of a perception that their voices will never be heard.
How is social media used to spread disinformation abroad?
Russia’s aggressive disinformation campaign in the US began in late 2014, with general attempts to sow chaos rather than to promote any specific narrative. The goals of the propaganda followed four general themes, including attempts to undermine confidence in democratic leaders and institutions, as well as finding and exploiting fault lines in society to increase social divisions. Even their attempts to influence the 2016 US presidential election were focused as much on creating distrust in elected officials as on securing any particular outcome. The “firehose of falsehood” was turned on the US, aiming to create a similar suspicion of media sources and making reality hard to distinguish from lies.
All of these tactics were used to great effect in Ukraine in 2013-2014. Crimea’s ethnic Russian population was bombarded with disinformation from Russian state media, widely available and popular on the peninsula. The news being presented portrayed the Euromaidan protests against corruption as a threat to the safety of ethnic Russians in Crimea. This exploitation of social tensions ultimately led to the annexation of Crimea by Russia. The ethnic Russian populations of Donetsk and Luhansk were also targeted, contributing to the ongoing war in the Donbass region of Ukraine.
What can be done to prevent interference and destabilizing actions by Russia and other actors?
Because there is so much disinformation, it is impossible to refute every untruth on the internet. Nor would doing so be particularly effective, given the distrust in institutions and the media that this propaganda has created. So if contradicting falsehoods does not work, what can be done?
It’s important to know that we do not have to start from scratch. In the case of Russian interference, the Baltic States have made significant efforts to counter Russian disinformation targeting their ethnic Russian residents. Targeting ethnic Russians in Crimea and the Donbass aided Russian efforts to destabilize Ukraine, and similar efforts have been made to persuade Russian speakers in Estonia, Latvia, and Lithuania of viewpoints more closely aligned with Russia than with the mainstream of Baltic politics. Looking at how these countries have responded to attempts to sow division can inform efforts to defend against disinformation campaigns from any origin.
All three Baltic states have suspended or fined news channels that present biased or fake information as news. Estonia launched a Russian-language channel to provide an alternative for its Russian population, but it has so far been unsuccessful. Estonia and Latvia have also worked with NATO to establish institutions that fight the spread of disinformation: the Cooperative Cyber Defense Center of Excellence researches cyber defense and training, and the Strategic Communications Center of Excellence publishes reports on Russian disinformation.
The US also has institutions working to fight the spread of disinformation. The International Republican Institute works to promote democracy abroad, and has established the Beacon Project specifically to combat Russian propaganda.
One interesting proposition is that the US create a volunteer Cyber Defense Unit based on an Estonian organization. Under that model, civilians train with experts to expand the capabilities of Estonia’s cyber defense. What seems most effective is that training is not limited to those who intend to volunteer but is open to Estonians as a whole; the goal is to increase awareness of cyber security for all internet users. And because the volunteers often work in the technology sector and do not need additional training to help improve cyber security, the Cyber Defense Unit is extremely cost effective.
That specific tactic may be most effective against more technical cyber-attacks, but the model could also be applied to civic education and media literacy. If one of the primary goals of Russian propaganda is to decrease trust in democratic institutions, building trust might be the way forward. The “firehose of falsehood” study suggests “redirecting the firehose” rather than trying to struggle against it directly. Civic education builds trust in democratic institutions: it helps citizens understand how their governments work, and demystifying those processes makes it harder for disinformation to spread. Media literacy is the ability to engage proficiently with all forms of media, and includes discerning what is and is not a legitimate news source.
While both civic education and media literacy should be added to curricula around the world, doing so would take time, especially in nations like the US, where education standards are not centralized. Volunteer networks that train citizens in civic education and media literacy could be a ground-up counterpart to the top-down approach of increased cyber security and suspending false news sources. Such programs would defend not only against Russian disinformation but also against cyber-attacks and disinformation from other actors, by creating a society with a high level of trust in democracy and a sense of which media it can trust.
Like you, my peer, I’m also interested in disinformation campaigns and the Russian model. I learned a lot reading your post. I really enjoyed how you first orient readers with the tactics the Russian government employed, then discuss how they are used overseas, and finally what is to be done to prevent further destabilizing actions. I totally agree with you that “redirecting the firehose” instead of trying to confront it directly is the right way to go. In my recent blog about democracy in the age of artificial intelligence, I also mentioned how civic education is a pivotal part of the solution. On that note, you might find some of the media and news literacy projects from the Center for News Literacy at Stony Brook University School of Journalism interesting and helpful, as I did. Finally, I suggest that when thinking about the Russian disinformation campaigns, it is helpful to also consider the role social media and the internet play in the whole process. Thinking about what factors external to the Russian endeavours are also contributing to the success of the Russian narratives might yield valuable insights into the whole story too.
I am glad I found your post; I was not aware there were cases of Russians interfering with the political processes of countries other than the United States using fake or incendiary information. I wrote about something similar, but I had a narrower field of vision and was looking into the role of Facebook, including how foreign influence can proliferate through social media. Besides groups such as the Beacon Project that you mention, the U.S. has had federal laws against foreign interference in elections since 2002, stating that non-American citizens cannot “directly or indirectly” spend money “in connection with” any U.S. election. It seems the Baltic States were able to take legal action against the Russian media, and I wonder if Ukraine has been able to do the same.
The concept of encouraging public awareness of misinformation as a means of combatting its threat to democratic institutions is very interesting. One question it raises for me is how you factor in people’s different opinions and political leanings when encouraging this awareness. In many cases these attacks would act to aggravate existing ideological divisions and animosities. How do you convince individuals to look past these differences and be more aware, when some of the misinformation is specifically intended to appeal to certain viewpoints and encourage rejection of others?
I found your article super interesting, as I am also interested in disinformation, especially in the Baltic states. I think you are right in stating that the Baltics have had to directly target their ethnic Russian (or Russian-speaking) minorities, as these groups make up a substantial portion of the population. However, I don’t know how effective this has been, as the Baltic countries still regularly pursue policies that put their ethnic minorities on the outs, particularly when it comes to language policy. Thus, a large portion of the population may still be ostracized and, regardless of any ability to recognize disinformation, may still be inclined to side with Russia. I also agree that civic education via media literacy/disinformation workshops is one of the most effective ways to combat disinformation. One thing I would note is that while I believe these programs are a step in the right direction, I wonder about their effectiveness in the United States. It may be the case that the veracity of information actually matters very little. Some research has theorized that fact-checking, for example, while helping individuals adjust their understanding of reality, does little to impact their vote. While I believe that disinformation/media literacy education is essential for citizens living in the 21st century, I wonder whether it would be enough to actually change individuals’ preferences.
I would like to commend the author for supporting his argument about Russia’s motives for undermining democracy abroad with evidence, and for offering solutions to eradicate the problem. The point that refuting every piece of misleading information Russia releases to decrease people’s trust in democratic institutions would be counterproductive is commendable as well. The facts presented further support Russia’s zero rating on the free and independent media indicator, since its disinformation has been extending beyond its borders. Honestly, if an individual does not read about Russian politics, Russia appears to be a free country, especially given Vladimir Putin’s charisma toward other states.
In the book How Democracies Die, Steven Levitsky and Daniel Ziblatt highlight the importance of civil liberties, media, and institutions in countering authoritarian rule. Interestingly, the author was able to present solutions originating from civil liberties and institutions at various levels, namely: the individual level, where media literacy was highlighted; the institutional level, with reference to actions taken by the USA such as the establishment of a Cyber Defense Unit and the Beacon Project to combat Russian propaganda; and the state level, following the initiatives taken by Baltic States such as Estonia, Latvia, and Lithuania.
However, if we are to analyze the problem carefully, the root cause emanates from the actions of the Russian government itself, so plans to deliberately eliminate the risk should start with the government and the Russians themselves. While it might be difficult to pass reforms during Vladimir Putin’s regime, people can only rely on institutions and civil liberties to minimize the damage. After all, in an anarchic world order, a government might perish but institutions would continue to exist.