The tools to understand and respond
to disinformation


Disinformation is one of the key challenges of our times. Sometimes it may seem that it is all around us: from family gatherings where heated discussions on politics, society and even personal health choices take place, to the internet, social media and even international politics.

It is not just individuals on the internet who are creating and spreading disinformation now and then. Foreign states, particularly Russia and China, have systematically used disinformation and information manipulation to sow division within our societies and to undermine our democracies, by eroding trust in the rule of law, elected institutions, democratic values and media. Disinformation as part of foreign information manipulation and interference poses a security threat affecting the safety of the European Union and its Member States.

What is disinformation exactly? How can we avoid falling for it, if at all? How can we respond to it? The Learn platform aims to help you find answers to these and other topical questions based on EUvsDisinfo’s collective experience gained since its creation in 2015. Here, you will find some of our best texts and a selection of useful tools, games, podcasts and other resources to build or strengthen your resilience to disinformation. Learn to discern with EUvsDisinfo, #DontBeDeceived and become more resilient.


Fake news

Inaccurate, sensationalist, misleading information. The term “fake news” has strong political connotations and is woefully inaccurate to describe the complexity of the issues at stake. Hence, at EUvsDisinfo we prefer more precise definitions of the phenomenon (e.g. disinformation, information manipulation).


Propaganda

Content disseminated to further one's cause or to damage an opposing cause, often using unethical persuasion techniques. This is a catch-all term with strong historical connotations, so we rarely use it in our work. Notably, the International Covenant on Civil and Political Rights, adopted by the UN in 1966, states that propaganda for war shall be prohibited by law.


Misinformation

False or misleading content shared without intent to cause harm. However, its effects can still be harmful, e.g. when people share false information with friends and family in good faith.


Disinformation

False or misleading content that is created, presented and disseminated with an intention to deceive or secure economic or political gain and which may cause public harm. Disinformation does not include errors, satire and parody, or clearly identified partisan news and commentary.

Information influence operation

Coordinated efforts by domestic or foreign actors to influence a target audience using a range of deceptive means, including suppressing independent information sources in combination with disinformation.

Foreign information manipulation and interference (FIMI)

A pattern of behaviour in the information domain that threatens values, procedures and political processes. Such activity is manipulative (though usually not illegal), conducted in an intentional and coordinated manner, often in relation to other hybrid activities. It can be pursued by state or non-state actors and their proxies.


What is Disinformation?

And why should you care?

Some would say that disinformation, or lying, is a part of human interaction. White lies, blatant lies, falsifications, “alternative facts”; propaganda has followed humankind throughout our history. Even the snake in the garden of Eden lied to Adam and Eve!

Others would add that disinformation, especially when used for political or geopolitical purposes, is a much more recent invention that became widely used by the totalitarian regimes of the 20th century. And that it was perfected by the KGB, the Soviet Union's main security agency, which developed so-called "active measures"[1] to sow division and confusion in attempts to undermine the West. And that disinformation continues to be used by Russia for the same purpose to this day. (You can learn more about how Russia has revitalised KGB disinformation methods in our 2019 interview with independent Russian journalist Roman Dobrokhotov.)

There are many ways to answer the question of what disinformation is, and at EUvsDisinfo we have considered its philosophical, technological, political, sociological and communications aspects. We have tried to cover them all in this LEARN section.

Our own story began in 2015, after the European Council, the highest level of decision-making in the European Union, called out Russia as a source of disinformation and tasked us with challenging Russia's ongoing disinformation campaigns. Read our story here. In 2014 – the year before EUvsDisinfo was set up – a European country had, for the first time since World War II, used military force to attack and take land from a neighbour: Russia illegally annexed the Ukrainian peninsula of Crimea. Russia's aggression against Ukraine was accompanied by an overwhelming disinformation campaign, and it culminated in the all-out invasion of 2022 and large-scale genocidal violence against Ukraine. Countering Russian disinformation means fighting Russian aggression – as told by Ukrainian fact-checkers VoxCheck, who talked to us quite literally from the battlefield trenches where they continue to defend Ukraine.

It is hard to overstate the role of Russian state-controlled media and the wider pro-Kremlin disinformation ecosystem in mobilising domestic support for the invasion of Ukraine.[2] The Kremlin's grip on the information space in Russia also illustrates how authoritarian regimes use state-controlled media as a Tribune, a platform to disseminate instructions to their subjects on how to act and what to think, demanding unconditional loyalty from the audience. This stands in sharp contrast to the understanding of media as a Forum, where a free exchange of views and ideas takes place; where debates, scrutiny and criticism create the public discourse that sustains democracies. (We explore these concepts in our text on propaganda and disempowerment.)

Alongside the use of media as a Tribune, pro-Kremlin disinformation is amplified by a megaphone of manipulative tactics. The use of bots, trolls, fake websites, fake experts and many other activities that distort the genuine discussions we need for democratic debate is designed to reach as many people as possible, to make them feel uncertain and afraid, and to instil hatred in them. This is not a matter of free speech: the right to say false or misleading things is protected in our societies. It is, rather, a matter of the Kremlin using all this manipulation as a way to be louder than everyone else. Such information manipulation and interference, including disinformation, is what EUvsDisinfo wants to expose, explain and counter.

Disinformation and other information manipulation efforts, which we also cover in LEARN, attempt to poison such public discourse. Thus, countering disinformation also means defending democracy and standing up against authoritarianism.

Scroll through this section and make sure to check the others, to learn more about the Narratives and Rhetoric of pro-Kremlin disinformation; Disinformation Tactics, Techniques and Procedures; the Pro-Kremlin Media Ecosystem; and Philosophy and Disinformation. Check out the Respond section to learn what you can do about it. And if you are still curious – we have something special for you too!

[1] The New York Times made an excellent documentary on this back in 2018, called "Operation InfeKtion" (available in English).

[2] It is for this reason that the EU has sanctioned several dozen Russian propagandists and suspended the broadcasting of Russian state-controlled outlets such as RT on the territory of the EU.

“To Challenge Russia’s Ongoing Disinformation Campaigns”: The Story of EUvsDisinfo


With this article, we invite our readers on a tour behind the scenes of the output you see on the EUvsDisinfo website, including the public disinformation database, our presence on Facebook and Twitter, and our weekly newsletter, the Disinformation Review.

We go through some key moments of our history in changing contexts; from the 2015 decision to set up the East StratCom Task Force in the light of the conflict in Ukraine to the reality of the COVID-19 infodemic in the spring of 2020. As a part of this story, we share some of the thinking behind our methods and approaches.

A unique mandate

On 19 and 20 March 2015, the leaders of the 28 EU countries gathered in Brussels. One of the decisions the participants in this summit put down on paper was the following:

“The European Council stressed the need to challenge Russia's ongoing disinformation campaigns and invited the High Representative, in cooperation with Member States and EU institutions, to prepare by June an action plan on strategic communication. The establishment of a communication team is a first step in this regard.”

With a unanimous decision coming from the highest level of decision-making in the European Union – 28 heads of state and government – and with the clear language in which Russia was called out as a source of disinformation, the future work of what became the East Stratcom Task Force had been given a unique and strong mandate.

In March 2015, the leaders of the 28 EU member states decided to set up the East Stratcom Task Force.

A team of experts, who mainly had their background in communications, journalism and Russian studies, was formed in the EEAS – the EU’s diplomatic service, which is led by the EU’s High Representative.


2015: Russian aggression in Ukraine

Before looking into how this mandate was turned into practice, let us recall the situation in the eastern part of the European continent at that time.

In 2014 – the year before the team was set up – a European country had, for the first time since World War II, used military force to attack and take land from a neighbour: Russia illegally annexed the Ukrainian peninsula of Crimea.

Fog of falsehood: From the "little green men" in Crimea to the killing of 298 innocent civilians in the sky over Ukraine in July 2014, Russian authorities and state-controlled media cooperated to spread confusion and hide the truth.

Russia-backed armed separatist groups had also taken control over a part of eastern Ukraine in the Donbas region, which borders Russia. In July 2014, this conflict suddenly moved closer to the European Union itself when Malaysia Airlines Flight MH17, with 298 people on board – among them 80 children and 196 Dutch nationals – was shot down by a Russian missile launched from the part of Ukraine controlled by Russia-backed separatists.


Ukraine and EU as targets of disinformation

Russia’s aggression in Ukraine had been accompanied by an overwhelming disinformation campaign, in which outright lies played a central role. This hybrid operation was integrated into the overall attempt to destabilise Ukraine, and its aim was to undermine Ukraine’s position – both directly in the conflict with Russia and in the eyes of the international community, sowing doubt and confusion.

Russian state-controlled TV showed a woman who claimed to be an eyewitness to Ukrainian forces crucifying a local boy in eastern Ukraine; but the woman turned out to be an actress and the crucifixion had never happened. In another case, Russian audiences were told about a little girl who had been killed by Ukrainian shelling, also in eastern Ukraine; but a BBC journalist managed to make producers from Russia's NTV admit that they had reported this story knowing that it was not true. Russian government representatives and Russian media spread dozens of different, contradictory stories about what had happened to Flight MH17 – a smokescreen meant to spread uncertainty and avoid accepting Russia's responsibility for this horrendous crime.


Reporters on the ground and investigative journalists have played a key role in exposing pro-Kremlin disinformation. Above, an example of Vice News' award-winning video reporting from Crimea.

At the same time, the EU and its relationship with Ukraine – the largest partner country in the EU’s Eastern Partnership policy – was targeted by disinformation. Among the examples was reporting which accused the EU of financing the construction of “concentration camps” in Ukraine. Similar examples of pro-Kremlin disinformation targeting the relationship between the EU and Ukraine were exposed in the important work of Ukrainian fact-checkers.

This was the geopolitical situation and the information environment which in March 2015 made European leaders take this first political step against disinformation.

Three responses to disinformation

In order to move into action, the formulation of the mandate needed to be translated into concrete work descriptions: What should be understood by the phrase “to challenge”? How can you challenge disinformation? In other words: What should the team be doing?

In consultation with international experts, the EEAS identified not one, but three different strands of work as both politically acceptable and effective means of challenging disinformation:

  • The Task Force should make the EU's own communication more effective, with special focus on the Eastern Partnership countries. In other words, if audiences are to be resilient to e.g. the disinformation targeting the relationship between the EU and Ukraine, it makes sense to raise the level of knowledge about what the EU is and what it does.
  • The Task Force should also help to strengthen free and independent media in the same region. One of the best ways of keeping a society resilient to disinformation is to have strong and trusted, independent outlets – including public service media – upholding fundamental journalistic standards.
  • Finally, as its third strand of work, the Task Force should raise awareness of the disinformation problem by running an advocacy campaign. This campaign should collect examples of disinformation and exhibit them in a framing that would not reinforce, but instead challenge the disinformation. EUvsDisinfo is this awareness raising campaign.

Since the East Stratcom Task Force began its operations in Brussels on 1 September 2015, these three strands have shaped our work.

In other words, what we publish under the brand EUvsDisinfo is one of the EU’s responses to disinformation – an answer to the fundamental question: How to challenge disinformation?


The EUvsDisinfo awareness raising campaign

The EUvsDisinfo.eu website is the hub of our campaign to raise awareness of pro-Kremlin disinformation. The campaign website includes a number of different products:

  • A publicly available disinformation database: since 2016, we have collected individual examples of disinformation with links to the originals and added short debunks. By April 2020, the database contained more than 8,000 examples. Initially, we relied on the team members' media monitoring and a network of sympathising volunteers who forwarded examples they had spotted to us. Later, we received funding from the European Parliament which allowed us to systematise this work with the help of professional media monitoring services. However, the final judgment, i.e. the decision whether to include an example of disinformation in the database, lies with us.
  • A weekly Disinformation Review which presents the latest examples in the disinformation database in order to outline current trends. When similar disinformation messages appear in different media outlets, and sometimes in different languages, it can be a sign that certain narratives are emerging, i.e. attempts to form and spread particular perceptions of reality among audiences using (social) media.
  • If the Disinformation Review resembles a news section, we also have a feature section with articles that look deeper into specific topics and narratives over a longer period of time. This section includes different approaches, and you will see that we use different writing styles: longer analytical pieces, interviews, a "figure of the week", as well as articles with an entertaining twist – for example when attempts to spread disinformation have exposed themselves in an embarrassing way. Finally, while the Disinformation Review is primarily based on examples of disinformation we ourselves have detected with the help of our professional media monitors, our feature articles frequently highlight examples where investigative and fact-checking journalists have exposed disinformation. These are often articles, radio and TV productions in Russian, made by independent Russian journalists, whose important work we thereby make available in English to a wider international audience.
  • We have dedicated topical sections, currently two: one about elections, which includes the online output of a campaign we ran in cooperation with colleagues in the European Parliament and at the European Commission representations in the member states to raise awareness of disinformation ahead of the European elections in 2019; and a dedicated COVID-19 section, which collects our output in response to disinformation about the coronavirus pandemic.
  • We also produce videos that raise awareness of disinformation, with special attention to examples taken from Russian TV. Sometimes, the simplest response to the problem is to add English subtitles to news broadcasts or talk shows from Russian TV. We present these videos on our Facebook page and on Twitter.


Watch our compilation of highlights from Russian TV in 2019.


Terminology and methodology on EUvsDisinfo

We acknowledge that the results of our work very much depend on the definitions and approaches we use. Some key notions deserve to be singled out as particularly important:

  • We prefer to say pro-Kremlin disinformation because our focus is on the message. While it is well known that the Kremlin issues guidelines for media messaging, there are also actors that operate in different degrees of dependency on, loyalty to, or simply inspiration from the narratives of the Russian authorities. For our understanding of this "ecosystem", see the article, "The Strategy and the Tactics of the Pro-Kremlin Disinformation Campaign."
  • Since our work is a part of the EU’s foreign policy, we focus on disinformation coming from sources that are external to the EU; and with our mandate in mind, these sources must have a clear connection to the pro-Kremlin ecosystem. The original mandate is the reason why EUvsDisinfo looks specifically at pro-Kremlin disinformation.
  • We highlight examples of disinformation messages unless the context clearly states that the claim is untrue. This means, for example, that we do not include clearly labelled satire. Disinformation cases can, however, appear in the context of e.g. a televised talk show discussion where competing opinions are also made available: when the context of a clear disinformation message legitimises it as a relevant "opinion" – even if it is part of a "mix" of different points of view – we consider the case to be of relevance to our reporting.
  • The EUvsDisinfo database includes "disproofs", which explain the components that make a certain claim disinformation, i.e. verifiably false or misleading information that is created, presented and disseminated for economic gain or to intentionally deceive the public, and which may cause public harm. We put the disinformation examples in context with the weekly newsletter, the Disinformation Review, and with the feature articles. Our focus on context stems from the important distinction between misinformation and disinformation, i.e. the difference between an incorrect claim seen in isolation, and the way such a claim can be used intentionally, systematically and manipulatively to pursue political goals. Awareness raising is not only about knowing why something is not correct; it is also about understanding how the systems in which such a claim appears work. Searching through the database, the reader sees a timeline of how a certain disinformation message first appears, and how it changes and develops. These findings give researchers, journalists and other users hints on where to look for more. For a detailed breakdown of the terminology we support, we refer you to the terminology table in this article.


From 2018: Increased support and a new mandate

Since 2015, our focus on Ukraine has remained strong; but we have also looked into other areas where pro-Kremlin disinformation has been active, including: migration; the MeToo movement; election interference; human rights; the anti-vaccination movement; the chemical attack in Salisbury; climate; conspiracy theories, and many other topics.

We have also seen growing interest in and support for our work, including from the European Parliament. As of 2018, the European Parliament granted us an earmarked budget to support our work. A part of this funding is spent on contracting a systematic media monitoring service, which replaced the initial network of volunteers. We wanted to move from presenting illustrative examples to also including a quantitative approach. As a result, we see and hear more now, and in more languages, than we did in the beginning; and we have become able to identify larger tendencies thanks to access to larger bodies of data.

In December 2018, the European leaders gathered again in Brussels for a new discussion on disinformation. This time, they adopted an Action Plan on Disinformation, which acknowledged East Stratcom's work; this means that our original mandate remained intact. The new Action Plan added new policies and initiatives, including a Rapid Alert System, in which the EU's member states keep each other informed about disinformation, and a Code of Practice, which pushes social media and other tech giants to take more responsibility for information appearing on their platforms. The Action Plan also mentioned the important role played by our colleagues in the EEAS Task Forces for the Western Balkans and the South (the latter covering the Middle East and North Africa). The relevance of these regions is visible in articles published on the EUvsDisinfo website, e.g. about RT disinformation in Arabic and Sputnik in the Western Balkans. Disinformation operations originating from China have recently been added to our broader working environment as a topic of interest, and we have increased our capability to perform data analysis. An example of this work is an article on how a Facebook page pretending to represent the European Parliament has systematically been sharing publications from RT.

In addition to running the EUvsDisinfo online campaign, the East Stratcom Task Force has also begun to organise conferences with the aim of raising awareness of disinformation and bringing experts together; so far, one such event has been held in Brussels and one in Tbilisi, Georgia. In addition, the team cooperates with different actors in the Eastern Partnership countries, including representatives of government institutions.

In 2018, the European leaders also stressed the particular importance of protecting elections against disinformation; in response to that, we launched an awareness raising campaign specifically ahead of the European elections in May 2019.

An introduction to the work of the St. Petersburg "troll factory" is among the features in our 2019 campaign to raise awareness of election interference.

Finally, we acknowledge that large parts of our audience prefer to read in languages other than English. Since the very beginning, we have published and promoted Russian versions of all our articles and of the Disinformation Review; we translate selected publications into German and are now beginning similar work in French, Italian and Spanish.

The standard products in the EUvsDisinfo output – the disinformation database, the Disinformation Review and our analytical articles – are labelled as “not an official EU position”. We find that a review of other actors’ communications should not be seen as a policy; we want our work to be considered an analytical product made available to the public by the EU.

See also:

Questions and Answers about the East StratCom Task Force

The Strategy and the Tactics of the Pro-Kremlin Disinformation Campaign



Read more

Propaganda and Disempowerment


Mass media can be grouped under two major concepts: media as a Forum and media as a Tribune. This is, of course, a theoretical model describing two ideal types of relations between the media and its audience.

The concept of the Forum is based on a horizontal exchange of ideas and views. In general, the media lends itself to functioning as a space for public discourse. The Forum is not a place where decisions are made; it is a place for debate, questioning, scrutiny, criticism. A successful Forum can be loud, rough and even vulgar. It can be moderated, but never controlled.

The Tribune is first and foremost a platform for disseminating the ideas and values of whoever controls it. It is a top-down process, in which the audience is expected to passively accept the notions; to receive instructions from the rulers on how to act and what to think. The concept is based on unconditional loyalty on the audience's part.

The Forum and the Tribune have different views on the concept of "fake". For the Forum, fake is information lacking a factual base. The participants in the discourse demand sources, they have a critical approach to statements. Attempts to doctor pictures, forge documents, hide details or just lie will sooner or later be brought to public attention.

For the Tribune, "fake" is anything that challenges the authority of the broadcaster. Whether or not a statement is based on fact is less important; the truth is anything that benefits the broadcaster. It is true, because the rulers say it is.

It is easy to see that most propaganda outlets have all the features of the Tribune. The media is an instrument, "The Party's Sharpest Weapon", wielded only by the powerful men in charge. The audience is disempowered, force-fed views and thoughts.

Yet the audience possesses a powerful tool: it can stop listening. The former Czech President, poet and dissident Václav Havel called this "the power of the powerless". The Tribune rests on the acceptance of a set of ideological rituals, which erode quickly because they have never been tested in a fair contest of ideas.

Historically, Forums for public discourse have appeared in unexpected places when public debate has been forced out of the media. People have found spaces and rooms elsewhere, means of questioning, discussing and challenging authority, when the media has degraded into dissemination of the Party Line.

The Forum corresponds with Democracy, just as the Tribune corresponds with Authoritarianism. The core of democracy is dissent; its method is critical thinking. The core of authoritarianism is submission; its method is disempowerment and corruption.

Read more

“Our job helps fight the Russian aggression. So we keep going.” Interview with Ukrainian fact-checkers VoxCheck


VoxCheck is an independent non-profit fact-checking project in Ukraine. In early February 2022, as the Kremlin was still massing its tanks and soldiers at the borders of Ukraine, we sent VoxCheck a couple of questions in writing about their work and pro-Kremlin disinformation in Ukraine. Before we could finalise the interview, Russia invaded Ukraine. On 2 March, the seventh day of the invasion, we received the replies to our questions and decided to publish them in their current form, as an illustration of the perseverance and resilience of the Ukrainian people.

VoxCheck continues to fight disinformation even as its work is interrupted by the howl of air raid sirens.

The language has been slightly edited for better readability.

What is VoxUkraine and what is its mission?

VoxUkraine is an independent analytical platform. We help Ukraine move into the future. We focus on economics, governance, social developments and reforms. We are supported by neither political parties nor oligarchs. The quality of our materials is ensured by the editorial process.

VoxCheck is a fact-checking unit of VoxUkraine. We monitor and verify the statements that politicians and opinion leaders deliver to a wider audience, for example in interviews with leading media or on political talk shows. The other directions of our work are debunking fakes and countering Russian disinformation. The goal of our project is fewer lies from politicians and more critical thinking by people.

In the current context of Russia's military escalation, many observers point out that the Kremlin has stepped up its disinformation activities (the question was asked before the beginning of the Russian invasion – EUvsDisinfo). Do you agree, and if so, what are the major disinformation narratives targeting Ukrainian audiences?

The general narrative of Russian disinformation stays the same: "Ukraine is governed by Nazis"; "Russian language and culture are suppressed in Ukraine"; "Ukraine is governed by Western handlers"; "the Ukrainian army is shelling civilians in Donbas" and many others.

However, the number of misleading stories has increased with Russia’s military build-up and the beginning of the large-scale invasion. For example, there were allegations that Ukrainians had attacked and shelled some locations (while in fact that did not happen), fabricated stories of Ukrainian soldiers surrendering and high-level officials leaving Ukraine, claims about the defeat of the Armed Forces of Ukraine (the goal of these stories is to demoralise the Ukrainian army).

You may find more examples at VoxUkraine.

Can you compare the present situation in the information environment to 2014? Have the Ukrainian people become more resilient to disinformation?

One thing to mention: there was no VoxCheck in 2014.

In our opinion, Ukrainians are more resilient than in 2014. But there is room for improvement.

According to a 2020 survey conducted by “Detector Media”, 15 per cent of Ukrainians had a low level and 33 per cent a below average level of media literacy (conversely, 52 per cent had above average or high media literacy - EUvsDisinfo).

It should be noted that nearly half (45 per cent) of the low media literacy segment were people aged 56-65 years. More than half of the segment with a below average level of media literacy had vocational secondary education.

Hence, we see two key factors affecting media literacy: education and age. Some people grew up in the former USSR where nobody had even heard about media literacy given the massive propaganda and information manipulation at the time.

In your work you focus a lot on disinformation targeting the reform process in Ukraine. Can you single out one theme that, in your opinion, is particularly important to challenge?

The reform of Ukraine's healthcare system is, in our opinion, the most heavily targeted one. Since it has many facets, there are many stories and sub-topics to lie about.

What is the answer to the challenge of disinformation? Can disinformation be stopped, or is it an inevitable feature of a modern information environment? What can other countries learn from Ukraine?

Although some may consider it a violation of freedom of speech, we would highlight the banning of pro-Russian TV channels and websites that spread disinformation, or that acted as platforms hosting disinformation mouthpieces, as a recommended intervention to fight disinformation. If a disinformation platform is banned, it takes resources to restore it, and repeated attempts at restoration can eventually deplete the resources behind the disinformation.

Although banning outlets from television and radio frequencies may not prevent them from using other channels (for example, those sanctioned in Ukraine continued using Facebook, Telegram and YouTube), this administrative step may decrease their reach. In addition, it may lead to their total shutdown, as happened with one of the TV channels sanctioned in Ukraine.

What are the biggest obstacles in your work, and what keeps you optimistic about the future?


  • The influx of misinformation: it is difficult for the team to deal with all the incoming messages.
  • The psychological toll.
  • At the moment, safety concerns: many members of our team are staying in Ukrainian cities at risk of bombardment, so our work is often interrupted by air raid alarms.

What keeps us optimistic:


  • Support from our readers, who thank us and emphasise that we are doing an important job.
  • Hope that our society will be able to think critically.
  • The understanding, now clearer than ever, that our work helps fight Russian aggression. So, we keep going.

Read more

“‘Information War’ is a Term Used by the Kremlin to Justify Disinformation”


Roman Dobrokhotov is a Russian journalist and the editor-in-chief of the independent outlet The Insider.

He has earned international recognition for exposing government-sponsored disinformation in Russia, often working together with international partners. Notably, The Insider’s collaborative investigation with Bellingcat into the chemical attack in Salisbury in March 2018 received the 2019 European Press Prize Investigative Reporting Award.

In this exclusive interview, Roman Dobrokhotov shares his reflections on the nature of the disinformation campaign in Russia. He also tells how his work has resonated inside and outside Russia and why he is not very concerned about his personal safety.

Disinformation vs. Propaganda

Q. First a question about terminology. Which term do you prefer: propaganda, fakes, disinformation? Or another term?

A. Disinformation and propaganda are two completely different things. Propaganda highlights events which are favourable to the authorities, while disinformation spreads falsehoods. As a rule, disinformation is intentional – it is deliberately spread by someone who knows that he or she is deceiving the audience.

The term “disinformation” became widespread after the Cold War, and it was taken from the Soviet lexicon. It is very much a Soviet concept, which was used not only in the context of the media, but also in the context of the work of the KGB: there were special units of the KGB which engaged professionally in disinformation – by the way, they still exist nowadays, just in a different form. There are parts of [Russia’s military intelligence agency] the GRU whose task is to officially conduct disinformation campaigns.

The 2019 European Press Prize Investigative Reporting Award was awarded to The Insider and Bellingcat for their collaborative investigation, "Unmasking the Salisbury Poisoning Suspects: A Four-Part Investigation."

Soviet ideological heritage

Q. Do you think that the situation with disinformation is special in Russia, or can it be compared with the situation in other countries?

A. It can be compared with the situation in those countries that share the Soviet ideological heritage – like China, for example, which borrowed a lot from the USSR.

Something similar may exist in other authoritarian countries, even those that have not taken over anything from the Soviet experience. The very logic of the existence of such states in the modern information environment implies that they have to limit their citizens’ access to information, disseminate defamatory information about their enemies, and spread favourable information about themselves.

The besieged fortress

Q. What purpose or purposes does disinformation serve, in your opinion, if it is possible to generalise?

A. The main goal is to discredit the enemy or bring discord into the enemy camp. Disinformation is in fact a military concept: when you spread propaganda and disinformation leaflets behind enemy lines, you do it because you consider it part of an “information war”.

The Kremlin likes to talk about an “information war” in order to legitimise military terminology and thereby justify the disinformation campaigns: They say that they do not just cover current affairs, but participate in an “information war”, in which all means are good.

Again, comparing with China: even though China is a totalitarian country, there is still no feeling of a war with the West, at least not today. In China, the propaganda does not describe China’s position in the world as a state of war, it does not claim that the country is surrounded by enemies. The Chinese have a different agenda: They underline how they are different; that they have an ancient culture; that they are a big country and so on. In other words, they advertise themselves in a more positive context.

The first association with the situation in Russia will rather be North Korea, which also supports a “besieged fortress” ideology: they also like to conduct hacker attacks, disinformation campaigns and shameless propaganda.

In 2018, Roman Dobrokhotov received the Journalism as a Profession award in the Investigative Journalism category for the investigation into the Salisbury attack.

International investigations

Q. You have been doing investigative journalism for a long time. Can you give an example of how you have exposed disinformation and other problems, and what this exposure has led to? Has your work ever had concrete consequences?

A. The Insider has been doing investigative reporting since 2013, so we have published a huge variety of stories, with a wide variety of resonances.

When we write about local issues in Russia, we know that, as a result of our articles, various crooks have faced criminal charges. This happens rarely, of course, but there are such examples. These stories have usually dealt with social topics; when we wrote about an orphanage, for example, we saw that the local authorities took care of the problems.

There are also more politically sensitive topics, but in those cases it is clear that we cannot make a big difference; if the state were to react, it would mean that it would not be an authoritarian state anymore.

In 2017, Roman Dobrokhotov and The Insider received the Council of Europe Democracy Innovation Award. Image: Council of Europe.

And then there are our international investigations; obviously, these are the ones that have the greatest effect. When we cover something that happens in Western countries, the democratic governments there are more willing to react to coverage in the press; they carefully read everything that is published and try to rectify the situation.

For example, when we showed that one of the participants in the Salisbury case had also been in Poland and in Bulgaria when the businessman Gebrev was poisoned, the special services of those two countries began to cooperate and exchange information, and realised that this was a Russian strategy, not just two random episodes.

There are many such examples. In Spain, the law enforcement agencies have also used the information that we provided on the GRU officers who traveled to Barcelona.

But the reaction is not always as strong as we expected; for example, in Germany, after we were able to link an assassin to the Russian special services, the investigation did not reach the political level, and they are still trying to hush up the case and investigate it as if it were an ordinary criminal incident.

In other words, the democracies are also different and show varying degrees of courage, but we have nothing to complain about in terms of resonance, despite the fact that our media outlet is rather small. In general, when we look at the citations of our work and at the influence we have, we see that reactions are quite strong.

In the 2018 documentary Factory of Lies, Roman Dobrokhotov was one of a number of Russian journalists who spoke about uncovering the hidden processes behind disinformation campaigns coming from Russia.

Competing with disinformation

Q. So do you think that journalistic investigations and exposure can pose a significant threat to disinformation? Or compete with it?

A. We have a separate section on our website called “Anti-Fake,” which is dedicated specifically to disinformation and propaganda, where we monitor and expose fakes. It aims precisely at confronting, and, if you will, competing with the propaganda.

Q. For which audience do you write, for example, about the Salisbury case? As a journalist, you probably keep in mind a specific reader – what can you say about him or her?

A. In terms of demographic criteria — gender, age, and regional location of the reader — our audience is fairly evenly distributed across Russia. The ratio of men and women, old and young people will be about the same as the average for Russia. Obviously, more people from large cities read us, but there are also more people living in large cities, so I don’t see any deviations here, and we are not trying to single out some kind of core audience and shape our texts accordingly.

An entire "Antifake" section of The Insider is devoted to analysing and debunking disinformation appearing in pro-Kremlin media. The section is updated with examples several times every week.

Of course, we understand that on average, a reader of The Insider will be a little more educated, advanced and progressive than the reader of a regular site. Why would people open The Insider if our articles made them angry? It happens by itself; we do not try to focus only on liberal circles, unlike, for example, Grani.ru, which now refers to the Russian police as “punishment forces” in its headlines. We try to be neutral in our language and the way we work.

So actually, I think that all citizens of Russia as a whole are our target audience. And if they are ideologically far from us – well, there is not much you can do about that; maybe they will read our stories and stop being ideologically far from us.

Q. What can you tell about the methods you use in exposing disinformation?

A. For example, we investigate who runs the propaganda. Tomorrow, we will publish an article about this kind of question, namely about [a consultant working for the Kremlin, Konstantin] Kostin and how the Presidential Administration organises provocations against [the opposition politician and anti-corruption activist Alexei] Navalny; that is one kind of story.

Monitoring fakes and exposing them, fact checking – that is a completely different story. These are different things, and different people are doing them at The Insider; they require completely different approaches.


Click to watch Roman Dobrokhotov explain to Associated Press how he uncovered the identity of one of the suspects in the Skripal case.

“How can you work from Russia at all?”

Q. The last question, on a more personal note, and I think you are often asked about this. Are you not afraid for your safety? Have you considered stopping your investigations, somehow changing your profile?

A. I was just on a business trip to the US, where I had an average of ten meetings every day for three days in a row. During those three days, one question was repeated constantly, coming first at every meeting: do you fear for your safety, and how can you work from Russia at all?

I think that journalists in Russia are in a privileged position compared to activists and even members of NGOs; we do not see the kind of mass repression of journalists that we see, for example, in Turkey, where hundreds of people are imprisoned, not to mention countries like China, Iran or Egypt.

In the Russian regions, it can indeed be difficult for journalists to work; there, the value of a human life is even lower, so that is where we see more murders, attacks and arrests of journalists. In Moscow, you can work more or less without being touched, so I see no reason to worry.

But everything is relative; the risks are different everywhere. If I lived, say, in Switzerland, maybe I would have given more thought to whether it is worth undertaking dangerous investigations. But we live in Russia and share the experience of having lived in the Soviet Union, the experience of our parents – I also remember it well. Compared to that period, the risks are smaller now, so perhaps we shouldn’t complain; Russia is not such a totalitarian country, and even if the risks are big, sometimes you have to accept that risk is part of the job.

Other articles in our series of interviews with Russian journalists:

“Propaganda Must be Opposed by the Language of Values”: Andrei Arkhangelsky is one of Russia’s most active commentators on disinformation and propaganda.

“Distracting the audience from the real problems”: An exclusive interview with Russian journalist Maria Borzunova, who exposes pro-Kremlin disinformation every week in her programme ‘Fake News’ on the independent TV Rain.

“The Propaganda digs a cultural ditch between Russia and Europe”: Pavel Kanygin covered the MH17 case as an investigative journalist with Novaya Gazeta. In this exclusive interview, he shares his experience of challenging state-sponsored disinformation.

Top photo: Roman Dobrokhotov on Facebook

Read more

Narratives and Rhetoric of Disinformation

The Narratives section will introduce the key narratives repeatedly pushed by pro-Kremlin disinformation and the cheap rhetorical tricks that the Kremlin uses to gain the upper hand in the information space. This section also discusses the lure of conspiracy theories and finally uncovers the dangers of hate speech. We will explain all pro-Kremlin tools and tricks used to erode trust; discourage, confuse and disempower citizens; attack democratic values, institutions and countries; and incite hate and violence.

The Key Narratives in Pro-Kremlin Disinformation

A narrative is an overall message communicated through texts, images, metaphors, and other means. Narratives help relay a message by creating suspense and making information attractive. Pro-Kremlin narratives are harmful and form a part of information manipulation. They are designed to foster distrust and a feeling of disempowerment, and thus increase polarisation and social fragmentation. Ultimately, these narratives are intended to undermine trust in democratic institutions and liberal democracy itself as a form of governing.

We have identified six major repetitive narratives that pro-Kremlin disinformation outlets use in order to undermine democracy and democratic institutions, in particular in “the West”.

These narratives are: 1) The Elites vs. the People; 2) The ‘Threatened Values’; 3) Lost Sovereignty; 4) The Imminent Collapse; 5) Hahaganda; and 6) Unfounded Accusations of Nazism.

Rhetorical Devices as Kremlin Cheap Tricks

The Kremlin's cheap tricks are a series of rhetorical devices used to, among other purposes, deflect criticism, discourage debate, and discredit any opponents.

These rhetorical devices are designed to occupy the information space, create an element of uncertainty, and to exhaust any opposition. They are often used in combination with each other to create a more effective disinformation campaign.

The rhetorical devices that pro-Kremlin outlets and online trolls alike use include the straw man, whataboutism, attack, mockery, provocation, exhaust, and denial.

For example, the straw man is a rhetorical device where the troll attacks views or ideas never expressed by the opponent. The Kremlin also frequently uses attack as a cheap trick to discourage the opposition from continuing the conversation. Sarcasm, mockery, and ridicule are also common Kremlin tactics to gain advantage in a debate. Finally, the Kremlin often uses denial to discredit opponents and dismiss any evidence that raises questions about Russian accountability.

The Lure of Conspiracy Theories for Authoritarian Leaders

Conspiracy theories are a potent element not only for creating an enticing plot in thrillers, but also for propaganda. One of the many conspiracy theories that has made its way onto Russian TV is the Shadow Government conspiracy theory. It is based on the belief that a small group of people, hidden from us, controls the world.

From the propaganda perspective, the charm of the Shadow Government theory is that it can be filled with anything you want. Catholics, bankers, Jews, feminists, freemasons, “Big Pharma”, Muslims, the gay lobby, bureaucrats - all depending on your target audience.

The goal of the Shadow Government narrative is to question the legitimacy of democracy and our institutions. What is the point of voting if the Shadow Government already rules the world? What is the point of being elected if the Deep State resists all attempts at reform? We, as voters, citizens and human beings, are disempowered by the Shadow Government narrative. Ultimately, the narrative is designed to make us give up voting or exercising our right to express our views.

Hate Speech Is Dangerous

Hate speech is any kind of communication in speech, writing or behaviour that attacks a person or group, or uses pejorative or discriminatory language about them, on the basis of who they are: their religion, ethnicity or other affiliation.

Hate speech is dangerous as it can lead to wide-scale human rights violations, as we have witnessed most recently in Ukraine. It can also be used to dehumanise an opponent, making them seem less than human and therefore not worthy of the same rights and treatment.

Russian leaders and media have been using genocide-inciting hate speech against Ukraine and its people since the annexation of Crimea in 2014, with increasing intensity before the full-scale invasion in February 2022. By portraying the legitimate government in Kyiv and the wider Ukrainian population as sub-human, this rhetoric allows the general Russian population and Russian soldiers alike to justify atrocities against them.



A defining feature of pro-Kremlin disinformation is its repetitiveness. For all the outrageous claims they make, pro-Kremlin outlets often sound like a broken record, sticking to just a handful of basic messages for domestic and international audiences. This is not by accident or oversight; it is by design: repetition makes lies sound more believable. Pro-Kremlin disinformation outlets achieve this by sticking to a set of recurring narratives that work as templates for particular stories.

A narrative is an overall message, communicated through texts, images, metaphors, and other means. Narratives help relay a message; they create suspense and make information attractive. They can be combined and modified based on current events and prevailing attitudes. Some of them have been around for hundreds of years – variations of the narrative of the “decaying West” have been documented since the 19th century. EUvsDisinfo has identified a set of five dominant narratives used by pro-Kremlin disinformation outlets, and the key elements of Kremlin story-telling. We have seen these key pro-Kremlin disinformation narratives deployed on many occasions: in attempts at election interference, throughout the COVID-19 pandemic, and in efforts to justify the unprovoked war in Ukraine.

We bring you an updated overview of the most common disinformation narratives that continue to appear in Russian and pro-Kremlin disinformation outlets.

The first key narrative in pro-Kremlin disinformation: the Elites v the People

The idea of an elite disconnected from the hard-working people runs strongly through political history. Several politicians and political movements have claimed to represent the voice of the common man, the little guy, the silent majority, against a corrupt and smug clique comprising representatives of political parties, corporations and the media. This narrative is not the Kremlin’s invention, but pro-Kremlin disinformation outlets exploit it frequently.

Smorgasbord of scapegoats

This narrative can be very successful, as it provides a scapegoat for the target audience to blame for any grievances: bankers, Big Corporations, Jews, oligarchs, Muslims, Brussels bureaucrats, you name it. Russian disinformation outlets heavily exploited this narrative throughout the COVID-19 pandemic, notoriously alleging that Bill Gates either invented the coronavirus, or was using vaccines against it to implant “microchips”.

The narrative is also strongly connected with various conspiracy beliefs. A common feature is a claim about the existence of secret elites: shadow rulers, puppet-masters with odious intentions. Throughout the pandemic, it has proven to be a working, efficient and comfortable template for producers of disinformation. The EUvsDisinfo website contains numerous claims on the virus being man-made and the measures to curb its spread merely the elites’ ways of destroying the lives of ordinary people.

Beyond the pandemic, this narrative was deployed on the eve of the 2016 Brexit referendum, as these two Sputnik articles demonstrate: “The Threat from Eurocracy threatens Europe” and “Waffen-EU”.

Anglo-Saxons and Ukraine

The narrative of “the Elites v the people” has also been used in the context of Russia’s invasion of Ukraine. Pro-Kremlin outlets have tried to paint Russia’s invasion as an “Anglo-Saxon” plot, pitting Slavs against each other.

In pro-Kremlin parlance, “Anglo-Saxons” is used as a catch-all term to vilify the West, and in particular the UK and the US. Anglo-Saxons are supposedly cunning and bloodthirsty, devising nefarious plots for global domination. The term is frequently used to construct conspiracy beliefs and has a “clash of civilisations” element to it, helping to frame the West as the “other” and reinforcing the idea that Russia belongs to a “different civilisation”.

Therefore, in Ukraine, we have the (Anglo-Saxon) elites vs the (Slavic) people, according to the pro-Kremlin spin-doctors: the Anglo-Saxons are seeking conflict with Russia at all costs, organised the 2014 coup d’état disguised as a democratic protest, want to involve Ukraine in war against Slavs, and are using Ukraine as an anti-Russian outpost, etc.

Lies of Reason

The Elites v the People narrative has a history stretching back more than a hundred years. Its purveyors claim to be the voice of reason and to advocate on behalf of disenfranchised citizens, speaking truth to power against elites that seek to hide the “truth” at any cost.

The “truth” can relate to a broad variety of issues, including war and peace, migration, and the economy, while the particular elites deemed “guilty” of hiding the truth are strategically selected to suit the grievances of the target audience. Indeed, this narrative can be adapted and applied to a seemingly infinite number of issues: “The migration crisis is caused by big corporations in order to obtain cheap labour”; “The Global Warming Hoax is used by bankers to divert public attention from real-world problems”; “Global corporations, mainly arms manufacturers, are responsible for the war in Ukraine”.

Ultimately, while this narrative appears on its surface to sympathise with ordinary people, its roots are, in fact, authoritarian. Evidence is rarely provided to substantiate the claims made and, following the principles of conspiracy thinking, the very absence of evidence is sometimes used as proof: “See how powerful the elites are, hiding all trace of their conspiracy!” Typically, this narrative also demands that the reader rely exclusively on the word of the narrator: “I know the truth, trust me!” Indeed, like all narratives based on conspiracy beliefs, this one requires its audiences to accept the claims on the basis of faith rather than fact.

Read more on the Elites vs. the People here. Read more about Anglo-Saxons here.

The second key narrative in pro-Kremlin disinformation: the ‘Threatened Values’

The narrative about ‘Threatened Values’ is adapted to a wide range of topics and typically used to challenge Western attitudes about the rights of women, ethnic and religious minorities, and LGBTQI+ groups, among others. Pro-Kremlin commentators ridicule alleged Western ‘moral decay’ or ‘depraved attitudes’. By contrast, Russia and Orthodox Christianity stand out as the true defenders of traditional values, as this official Russian promotional video illustrates.

According to this narrative, the ‘effeminate West’ is rotting under the onslaught of decadence, feminism and ‘political correctness’ and driving down its economy, while Russia embodies traditional paternal values. This narrative is depicted in a 2015 cartoon by Russian state news agency RIA Novosti, illustrating Europe’s apparent moral decay: from Hitler, to sexual deviance, to a future of rabid hyenas.

The idea of a decaying West, juxtaposed against a Russia that is the ‘guardian’ of decency and morality, emanates from the very top of the Kremlin. According to an analysis by the European Council on Foreign Relations, Putin had already assumed this posture back in 2013, condemning ‘Euro-Atlantic’ countries for their moral decadence and immorality.

Talking about sex…

Pro-Kremlin media eagerly followed suit. Russian state broadcaster Sputnik described Western mass culture as ‘various forms of paedophilia’. Pro-Kremlin outlets operating in Arabic claimed that the West attempts to destroy basic values, such as those related to the state and the family. In Armenia, pro-Kremlin disinformation outlets alleged that the West was ‘planting’ alien moral foundations to undermine national identities of other states. In this narrative, the Kremlin not only manages to ‘preserve’ basic decency and values, but also ‘defends’ them against the onslaught of immorality from the West.

According to the European Council on Foreign Relations’ analysis:

‘One clever propaganda trick was to enhance the image of the evil West by merging together the social conservative and the anti-Western posture. In this way, the West and Westernisers, gay people, liberals, contemporary artists and their fans, those who did not treat the Russian Orthodox Church with due respect, and those who dared to doubt Russia’s unblemished historical record were all presented as one ‘indivisible evil’, a threat to Russia, its culture, its values, and its very national identity.’

Homophobia goes hand in hand with the claim of protecting traditional values, so it is no surprise that Russian state media spend time ridiculing, for instance, rights for sexual minorities as illustrated in these examples. Also recently, pro-Kremlin outlets have resorted to homophobic tropes in their denigration of Ukrainian service members.

Pro-Kremlin outlets show special delight when they can kill two birds with one stone: accusing the almighty EU of tyrannical behaviour in ordering individual states to abolish or destroy their own values. One example is the manipulated and severely misrepresented story, ‘European Parliament bans the words Mother and Father’.

Common Sense

Value-based disinformation narratives usually centre on threatened concepts like ‘tradition’, ‘decency’, and ‘common sense’ – terms that all have positive connotations, but are rarely clearly defined.

The narrative creates an ‘us vs. them’ framework, which suggests that those who are committed to traditional values are now threatened by those who oppose them and instead seek to establish a morally bankrupt dystopia. Russian and pro-Kremlin disinformation outlets pushed variations of this narrative in the run-up to the 2018 Swedish general election, as can be seen here and here. In Russian-language outlets, like the infamous St. Petersburg Troll Factory News Agency RIAFAN, the language of this narrative is particularly aggressive: ‘What it is like in the country of victorious tolerance: gays and lesbians issue dictates, oppression of men and women, Russophobia and fear’.

In contrast to the Western conception of values, which favours the individual rights of personal integrity, safety, and freedom of expression, the Russian value system entails a set of collective norms to which every individual is expected to conform.

Yet the Threatened Values narrative is always expressed from a position of perceived moral high ground in which the silent majority, committed to decency and traditionalism, is under attack from liberal ‘tyranny’. The target audience is invited to join the heroic ranks under the Kremlin banner, boldly fighting for family values, traditional Christianity, and purity.

Read more on the ‘Threatened Values’ here.

The third key narrative in pro-Kremlin disinformation: Lost Sovereignty

Russian and pro-Kremlin disinformation sources like to claim that certain countries are no longer truly sovereign. Back in 2015, a cartoonist for the Russian state news agency RIA Novosti illustrated this idea with an image: Uncle Sam is turning up the flame on a gas stove, forcing Europeans to jump up and down while crying for sanctions against Russia.

Original illustration: RIA Novosti

Since then, there have been many more examples of this narrative proliferating in pro-Kremlin outlets: for example, that Ukraine is ruled by foreigners and that the Baltic states are not really countries. Pro-Kremlin disinformation outlets also claim that, with their accession to NATO, Finland and Sweden are now about to lose their sovereignty and are acting under foreign (US, NATO) pressure. Further examples of this narrative abound: the EU is directed by Washington, Japan is a vassal state, Germany is an occupied territory, decisions in Ukraine are made not by its president but by the US, and so on.

Pro-Kremlin outlets even use a specific vocabulary to define states that are ‘not sovereign’: ‘limitrophes’, or frontier territories providing subsistence and subservience to their masters. The Polish edition of the pro-Kremlin propaganda outlet RuBaltica explains:

‘There are real states, which are capable of implementing all state functions, and then there is geopolitical scum or fictional countries that have formal attributes of statehood but which are not real states. These countries include the post-Soviet limitrophe states separating Russia from the West. The pseudo-elites of these countries are not able to respond to any serious historical challenge such as overcoming the migration crisis, protecting the border and fighting the epidemic. They keep asking Western Europe and the United States for help because they cannot cope on their own. These countries are ruled by puppets – they are able only to speak about “stopping Russia” and vote for anti-Russian resolutions in the EU and UN.’

The narrative of ‘lost sovereignty’ is ultimately a narrative of disempowerment aimed at eroding the very foundations of democracy. Why would anyone care about democratic processes and elections, if powerful outsiders rule their country?

In contrast, pro-Kremlin outlets claim, real sovereignty is possible only under Russian control.

No independent political will of the people

Closely related to the ‘lost sovereignty’ narrative are pro-Kremlin disinformation messages about so-called ‘colour revolutions’. These messages allege that social upheavals or political protests are orchestrated by powerful outsiders (the West) and never a genuine expression of citizens’ activism or grievances. The examples are many and date back to the Euromaidan protests in Ukraine in 2013-2014, when pro-Kremlin outlets falsely accused US officials of staging the popular protests.

The Kremlin’s dismissal of the independent political will and aspirations of other peoples is arrogant and often hides an imperialistic approach to the people or country in question. Failing to grasp the concept of free will, the Kremlin resorts to conspiratorial thinking – ‘Why would anyone in their right mind want to distance themselves from Russia? This can only be because of Anglo-Saxon-led manipulation.’ Not surprisingly, in the Kremlin’s view, the EU is also under Anglo-Saxon control.

Undermining the statehood of Ukraine

The narrative of ‘lost sovereignty’ aims to erode trust in and ultimately corrupt the foundations of democratic institutions. It has also gained a special significance in the attempts of pro-Kremlin outlets to justify military aggression against Ukraine. An infamous example was Putin alleging that ‘Lenin created Ukraine’ a few days before Russia launched a full-scale invasion (a claim that had already been circulating in the pro-Kremlin disinformation ecosystem).

In the context of Ukraine, the pro-Kremlin narrative of ‘lost sovereignty’ takes on an even more sinister, imperialist hue. It denies not only Ukrainian statehood, but also its very existence by alleging that ‘a state of Ukraine has never existed before’. This narrative, along with the myth of ‘Nazi Ukraine’, has been one of the central disinformation tropes justifying Russia’s unprovoked invasion of Ukraine. Related disinformation narratives include claims that Ukrainians, Russians, and Belarusians are ‘one nation’ and multiple allegations that Ukraine is on the verge of disintegration.

Read more about the ‘lost sovereignty’ narrative here.

The fourth key narrative in pro-Kremlin disinformation: Imminent Collapse

In Aristotelian rhetoric, the concept of kairos denotes a sense of urgency for action. Most speakers utilise this concept when they claim: act now, before it’s too late! In the pro-Kremlin disinformation context, the narrative of the ‘Imminent Collapse’ fulfils this function.

Russian and pro-Kremlin disinformation outlets regularly employ this narrative. Examples include: the EU is on the verge of collapse, the US is collapsing, NATO is breaking down, Ukraine’s entry into the EU would provoke the bloc’s collapse, and the financial system is collapsing.

According to Kremlin propaganda, other factors are also hastening this alleged collapse. For example, a RIA Novosti cartoonist depicts terrorism in Europe as a deadly scorpion that Europeans have unwittingly placed in their pocket.

An Old Story

Russia has been prophesying Europe’s imminent collapse for well over a century. Describing Europe or EU member states as ‘on the verge of civil war’ worked just as well in 2019 as it did back in 1919.

Russian state-run outlets feed their local audiences multiple stories of how terrible life is in the EU: unrest, violence, poverty, political extremism, and so forth. All of this creates a sense of comfort for audiences living inside Russia, which is apparently not on the verge of imminent collapse, in clear contrast to the situation elsewhere.

The ‘imminent collapse’ is a hard-working narrative that usually resonates well with both local and international target audiences, despite the fact that there has not been a civil war in the EU, nor has Europe collapsed. By many metrics, the bloc continues to flourish.

Target audiences that – legitimately or not – already fear political and social turmoil in their countries are particularly susceptible to this narrative.

Thus, this narrative works especially well during periods of real political challenges, like during the migration crisis in the fall of 2015, during the pandemic, and now during the Russian invasion of Ukraine.

Original illustration: RIA Novosti

The Final Battle

The enormous influx of migrants to Europe certainly posed a major challenge to European governments, but Russian and pro-Kremlin outlets portrayed the situation in grossly overstated and apocalyptic terms, reporting about the crisis as though it constituted a systemic collapse. Of course, the system survived intact, but the image of a collapse lingers on in Kremlin parlance.

During the early stages of the COVID-19 outbreak, in March 2020, the nationalist philosopher Aleksandr Dugin relished witnessing the collapse of Western democracies:

"The global capitalist society collapsed immediately. Not everyone has understood it yet. But they will very soon. This means that the very substance of Liberal Globalism has collapsed, the world of office workers and beauty bloggers, transgender persons and climate activists, human right defenders and hipsters, migrants and feminists".

The approach was also visible in Russian and pro-Kremlin coverage of the Yellow Vest protests in France. The right to express discontent with government and politics is an integral part of democracy, and the citizens of any European state enjoy the right to protest peacefully. The Yellow Vest movement and other expressions of discontent like it belong to the European democratic tradition. They are not proof of the breakdown of the system, but of its ability to reinvent and rejuvenate itself.

Despite this poor record of prophecy, pro-Kremlin fantasising about the imminent collapse of perceived adversaries has continued even after the Russian invasion of Ukraine. In pro-Kremlin outlets’ disinformation, the war in Ukraine is just a Western ruse to postpone the ‘inevitable collapse of global capitalism.’

The ‘imminent collapse’ narrative is also sometimes used to lament the alleged breakdown of European moral values and traditions. Russian and pro-Kremlin disinformation outlets for instance regularly describe children’s rights in Europe as an attack on family values. Their bottom line: Europe is dying by abandoning all decency and morals.

However, the reports of Europe’s death may have been greatly exaggerated. Navigating a tumultuous economic situation, wading through a global pandemic, or witnessing a heated political landscape should not be mistaken as an existential collapse.

Read more on the ‘Imminent Collapse’ here.

The fifth key narrative in pro-Kremlin disinformation: the Hahaganda

A final resort in disinformation, typically when confronted with compelling evidence or unassailable arguments, is to make a joke about the subject, or to ridicule the topic at hand.

The Skripal poisoning case is an excellent example of this strategy. Russian and pro-Kremlin disinformation outlets have continued their attempts to drown out the assassination attempt with sarcasm, turning the entire tragedy into one big joke. A similar approach has been employed in the case of the attempted assassination of Alexei Navalny, where pro-Kremlin media have competed in delivering “fun” stories on how better to kill the Russian dissident.

The methods of ‘hahaganda’ also involve the use of various derogatory words to belittle the concept of democracy, democratic procedures, and candidates.

Kremlin aide Vladislav Surkov describes the concept of democracy as “a battle of bastards” and instead recommends the ‘enlightened rule’ of Vladimir Putin as an alternative for Europe. Former Ukrainian President Petro Poroshenko was almost constantly ridiculed in pro-Kremlin media, as is Ukraine’s entire election process. According to Russian state media, an election with several candidates and no obvious outcome is considered a circus.

Ukraine’s sitting president Volodymyr Zelensky has also received more than his fair share of ridicule and humiliation in pro-Kremlin disinformation outlets during his time in office. Among other ridiculous allegations, he has been claimed to take military advice from his 9-year-old son and to dance to the tune of the US, and allegedly also to that of Turkey. No pro-Kremlin-orchestrated humiliation would be complete without the mandatory Nazi and Soros affiliations.

The Kremlin Weaponising Jokes

This weaponisation of jokes and public ridicule is so favoured by the Kremlin that the state news agency RIA Novosti employed two pranksters tasked with setting up fake telephone conversations with politicians, activists and decision-makers. Posing as representatives of the Alexei Navalny team, as environmental activist Greta Thunberg, and most recently as Ukraine’s prime minister, the Kremlin-affiliated jesters attempt to con the interlocutor into saying something politically destructive.

Of course, satire, humour, and parody are all integral components of public discourse. The right to poke fun at politicians or make jokes about bureaucrats is important to the vitality of any democracy.

It is ironic, then, that Russian and pro-Kremlin disinformation outlets often seek to disguise their anti-Western lies and deception behind a veil of satire, claiming it falls within their right to free speech. At the same time, however, they aggressively refuse to tolerate any satire that is critical of the Kremlin or undermines its political agenda. An example of this hypocrisy is Russia’s ban on the 2018 British comedy The Death of Stalin.

Ridicule and humiliation

In their 2017 report, NATO’s StratCom Centre of Excellence explained how Russian and pro-Kremlin disinformation outlets use humour to discredit Western political leaders.

One of its authors, the Latvian scholar Solvita Denisa-Liepniece, has suggested the term ‘hahaganda’ for this particular brand of disinformation, which is based on ridiculing institutions and politicians.

The grotesque feature of hahaganda is that it is very hard to defend yourself against it. There is no point in protesting. The joke is not supposed to convey factual information. It’s a joke! Don’t you guys have a sense of humour? Do you have to be so politically correct all the time?

The goal of hahaganda is not to convince audiences of the truth of a particular joke, but rather to undermine the credibility and trustworthiness of a given target, such as an individual or an institution, via constant ridicule and humiliation. At times, hahaganda takes a truly morbid turn, when the pro-Kremlin disinformation machinery decides to turn an attempted political assassination into a laughing stock.

Read more on Hahaganda here.

One more key narrative in pro-Kremlin disinformation: "Nazis"

Nazis East and West, but mostly in Ukraine

For many years, Russian state-controlled media have claimed that various states and entities are ruled by Nazis or permeated by Nazi ideology. In this jargon, ‘Nazi’ and ‘fascist’ have become synonyms. The examples are many and well documented in the EUvsDisinfo database since 2015: Moldova is ruled by Fascists, and so are the Baltic States and Poland. Europe is ‘supporting’ Fascism and so is the European Parliament. Speaking in Vladivostok six months into Russia’s invasion of Ukraine, Putin even speculated that the EU’s foreign policy chief, High Representative Josep Borrell, ‘would have been on the side of the fascists had he lived in the 1930s’ because the EU is supporting the fascists in Kyiv.

But for the Kremlin, Ukraine – a country where millions died fighting Nazism in WW2, where the Nazi ideology is banned, and that is currently governed by the grandson of a Holocaust survivor – is the most ‘Nazi’ of them all. The EUvsDisinfo database contains nearly 500 examples of pro-Kremlin disinformation claims about ‘Nazi/Fascist Ukraine’. It has been a cornerstone of the Kremlin’s propaganda since the Euromaidan protests in 2013-2014, when the Kremlin sought to discredit pro-European protests in Kyiv and, subsequently, the broader pro-Western shift in Ukraine’s foreign policy as a ‘Nazi coup’.

This is because in the Kremlinverse, ‘Nazis’ and ‘Nazism’ are in no way linked to the actual history or ideology of National Socialism or fascism, nor to contemporary manifestations of far-right ideas. Rather, anyone deemed hostile to Russia or the idea of ‘Russkiy Mir’ – a geopolitical project of uniting the Russian-speaking world under the sceptre of the Kremlin – is labelled a ‘Nazi’. First and foremost – Ukraine.

Glossing over history, hiding the Nazi contacts

The potency of the ‘Nazi’ narrative for Russian audiences is no accident. It is something that the Kremlin has been building systematically for years. From the perspective of the Kremlin, history is not something to be remembered and studied; it is something to be managed. This is how historical memory was turned into a tool to fulfil the Kremlin’s geopolitical ambitions.

For years, Russian state-controlled media along with politicians – from Putin himself to the now-deceased right-wing extremist Vladimir Zhirinovsky – hammered home the notion that only the Soviet Union had genuinely fought the Hitler regime. All others in the West did not really contribute but somehow provided a space for Hitler. By slinging history-laden accusations against the West, Russia weaponised history, much in the same way it has weaponised its media as well as energy, food exports and trade in a broader sense.

It is difficult to see how this twisted perception of history squares with the fact that the Soviet Union and Nazi Germany had a pact of non-aggression – the Molotov-Ribbentrop Pact, accompanied by secret protocols and significant trade relations – from August 1939 until June 1941. As documented by historian Roger Moorhouse, this equals a third of the duration of WW2, during which Hitler’s Nazi regime had a free hand and built its war machine with Soviet imports. The Soviet Union significantly aided the training of Hitler’s modern war machine and provided critical resources and fuel for Nazi Germany.

However, pointing out these facts, let alone the atrocities committed by the Red Army, including the occupation of large swathes of Europe, is taboo in Russia. For several years now, it has been a criminal offence in Russia to engage in what is termed ‘staining the reputation of the Red Army’s heroic deeds 1941-1945’. Since 2021, it is also criminal to ‘insult war veterans’. Memorial International, an NGO that has documented the repression and human rights violations under Stalin’s regime, was harassed for years and finally dissolved just weeks ahead of Russia’s invasion of Ukraine.

The Kremlin has silenced critical research by historians and stifled all free debate. With a free hand to manipulate and instrumentalise historical memory, the Kremlin stoked and appropriated feelings of heroism and national pride, to define itself as the sole force of resistance against Nazism now, throughout history, and into the future. From this, the Kremlin continues to draw its legitimacy in Russia, fuelling its neo-imperial ambitions abroad. The invasion of Ukraine, which Putin infamously attempted to justify as ‘de-nazification’, is the most extreme example of this.

Now – Nazi battle cry all over the place

The Kremlin has deployed “fighting Nazism” as a battle cry in Ukraine, with all the horrific consequences. In the months preceding the invasion, Russian state-controlled media went into overdrive to portray Ukraine as a ‘Nazi state’.

This was a signal for its affiliates and subjects that from that moment on, all means were permitted, justified by the ultimate end of a new victory over ‘Nazism’ – this time in Ukraine. This is how the Kremlin’s propagandists tried to justify Russian atrocities in Ukraine that they couldn’t deny, including the bombing of a maternity ward in Mariupol. This is a justification we are likely to see again, as the full picture of atrocities committed under Russian control in the Kharkiv region emerges.

Nazi = Dehumanising Ukrainians

It is bad enough to weaponise history, but the Kremlin has taken it a step further. The ‘Nazi’ narrative has been used to dehumanise Ukrainians. The genocide-inciting rhetoric about everything Ukrainian has moved from fringe to Russian mainstream, e.g., to Russian state-controlled news agency RIA Novosti. And the effect is more brutality on the battlefield and against civilians.

The lack of success on the battlefield led the Kremlin and its propagandists to call not just for getting rid of the ‘Nazi junta’ in Kyiv, but for the large-scale ‘denazification’ of Ukraine, which was to take generations. In the blink of an eye, the Kremlin expanded the circle of ‘Nazis’ – from the Ukrainian authorities to the whole population and anyone who supports Ukraine. And as Ukraine continues to liberate occupied territories, the rhetoric on the Russian side keeps hardening and getting more extreme. In recent days, prominent Russian voices have called for pushing 20 million Ukrainians from their homes and destroying region after region in Ukraine along with civilian and critical infrastructure, and have suggested that the laws of armed conflict are just recommendations that should not restrict total war.

Can there be a world without Nazis?

Can there be a world, a Ukraine, where the Kremlin and dominant layers of Russia do NOT brand Ukrainian leaders as Nazis?

This question is relevant in light of the current intense debate inside Russia, including in leading circles, about the failures of the Russian armed forces around Kharkiv and Kherson. A few voices who have been allowed to speak on Kremlin-affiliated key TV are beginning to wonder whether it is productive to keep denying that Ukrainians exist as a people and a nation – whether the branding of Ukrainians may be counterproductive. The majority, Putin included, is, however, still throwing the Nazi label at everybody. For now.

If you want to follow the trail see also our article from 2017: “Nazi east, Nazi west, Nazi over the cuckoo’s nest” and an overview of the Kremlin’s historical revisionism attempts: In the shadow of revised history.

Read more

Modus Trollerandi


How Democracy is SWAMPED: Seven Cheap Tricks

Since 2015, EUvsDisinfo has detected, documented and debunked disinformation from pro-Kremlin media. All individual cases are collected and available in a large and steadily growing database.

A consequence of EUvsDisinfo having followed pro-Kremlin disinformation outlets for several years is that certain patterns appear. EUvsDisinfo has, on earlier occasions, observed storytelling as a means of persuasion and certain fixed narratives; we have scrutinised the fears and phobias exploited by the disinformation outlets.

The Rhetoric of Disinformation

This article will demonstrate a few of the cheap tricks of the Rhetoric of Disinformation: how the producers of disinformation systematically derail an exchange of ideas – the core of democracy – through a set of handy devices. The Kremlin trolls get the public discourse bogged down in a quagmire of pointless contestation. A Swedish activist has coined the term Modus Trollerandi to describe ways of spoiling public debate through a set of cheap tricks. EUvsDisinfo has developed the concept further, to show how democracy gets SWAMPED by malign manipulation:

S: Straw Man:
Attack views or ideas never expressed by the target.

W: Whataboutism:
Deflect the discussion away from the subject.

A: Attack:
Use brutal language to discourage the opposition.

M: Mockery:
Use sarcasm to belittle the opposition.

P: Provocations:
Allege staged ‘provocations’ to occupy the information space.

E: Exhaust:
Drown the opposition in details and technicalities.

D: Denial:
Flatly deny any evidence.

The common feature of all these devices is that they exclude any possibility for dialogue. Dissent is at the core of democracy; a democratic society moves forward through debate, discussion, compromise, and attempts to find common ground and agree on an acceptable solution. The Kremlin trolls are not interested in “questioning more”: they want to control the discourse; manage our thoughts; and shut down dissent. They attack the core of democracy: the concept of a respectful public dialogue.

Identifying the cheap tricks of Kremlin demagoguery helps us preserve a culture of dissent and questioning in the democratic discourse. We can stay on topic and continue talking about pressing issues instead of falling into the traps of the Kremlin’s spin doctors.

S for The Straw Man

The straw man is a rhetorical device where the troll attacks views or ideas never expressed by the opponent. EUvsDisinfo has occasionally highlighted the method in articles: sometimes using “neoliberalism” as a straw-man; on other occasions suggesting very sinister forces behind activists. Russian state broadcaster RT has claimed that Swedish teenage eco-activist Greta Thunberg wants to “reduce the world population” and argues loudly against such malevolent Malthusian plans.

The Modus Trollerandi of straw men is convenient and effective. From the perspective of classical rhetoric, it is an informal fallacy – both logically and factually false. That does not exclude its effectiveness as it appeals to the emotions of the audience: “Are you really going to allow Greta Thunberg to kill you and your children to save the rich people of the world?”

The EUvsDisinfo database contains several examples of the straw man device. The above-mentioned “Malthusian Straw Man” has been popular during the pandemic. Accusations of “Satanism” appear occasionally, suggesting democracy is a scheme to divert Europeans from God, decency and sound traditions.

The most prominent straw man exploited by pro-Kremlin media is the accusation of Nazism/Fascism. Any kind of criticism of Kremlin policies might be labelled “Fascist”. Actual Fascism, as in Italy under Mussolini or Spain under Franco, is meanwhile met with understanding and sympathy. The EUvsDisinfo database contains numerous examples of virtually anyone being labelled a “Nazi/Fascist”. For instance, Russia-based SouthFront calls the German Green Party candidate for Chancellor, an outspoken critic of the Kremlin, “An American Nazi”. Earlier, Ukraine was labelled “Nazi”, the United States “Fascist”, France an ally of Hitler, and the Baltic States and Poland ruled by Nazi sympathisers… In all these cases, Nazism functions as a straw man: the Kremlin can pose as a fighter against Fascism while ignoring any critical remarks.

Another popular straw man is “neoliberalism” – a vague, unpleasant-sounding label that can be glued onto any form of dissent.

The antidote to this device: stay on track! Refuse to take the bait! Continue insisting on talking about the core issue.

W for Whataboutism

This device was frequently employed in connection with the Belarusian president Alyaksandr Lukashenka’s act of air piracy in late May. The Kremlin spin doctors made furious attempts to deflect attention from the incident, claiming that sending fighter jets to intercept civilian aircraft and forwarding false bomb threats were common practice. This claim has been repeated ad nauseam (a device to be covered in a later article). Examples can be found here, here and here.

Whataboutism is a rhetorical device to deflect attention away from an unpleasant issue. The Oxford English Dictionary defines it as:

The technique or practice of responding to an accusation or difficult question by making a counter-accusation or raising a different issue.

The method is very popular in the pro-Kremlin disinformation ecosystem. “Russian military shot down a passenger plane? Well, what about all the aircraft the US has shot down?” “Eastern Ukraine? Well, what about all the countries in whose affairs the US has interfered?” “Demonstrators detained in Russia? Well, what about the police violence in Europe and the gilets jaunes? What about the persecution of the participants in the peaceful protests in Washington?”

The Tu Quoque Fallacy

Classical rhetoric describes this technique as a version of the tu quoque fallacy. In Russian, it is “сам дурак” – “you are the idiot!”. This approach has deep roots in Soviet rhetorical traditions, and the Kremlin disinformation ecosystem often uses it. EUvsDisinfo has described the method in several articles – here, here and here, for instance.

Whataboutism is an efficient cheap trick, and it has been a core element of Kremlin disinformation for a very long time. Even at the highest political level, instead of replying to questions on mass detentions of peaceful demonstrators, Russian officials apply whataboutism instead of dialogue:

We sent compilation of materials about how detentions are carried out and peaceful protests are broken up, how the police act in EU countries to Mr. Borrell so that he and his delegation could have a chance to answer many questions themselves before posing them to us.

Whataboutism is an aggressive means of assuming control over a debate. Through whataboutism, the Kremlin deflects criticism and turns the accusation against the opposition. The attacked party is forced into a defensive position and the Kremlin claims moral high ground.

The antidote to whataboutism: stay on track, and insist on continuing debate on the core issue.

A for Attack

Overblown language and exaggerated statements are to some extent standard practice in any polemical debate, e.g.: “The proposal of the opposition is plain madness!”; “The prime minister has lost her last piece of credibility”. But democratic discourse usually refrains from ridicule, dehumanisation and abuse. The democratic conversation is ideally connected to a fundamental respect and fair play between participants.

While most of the Kremlin’s cheap tricks are aggressive, the goal in this case is to discourage the opposition from continuing the conversation. Kremlin media are prone to label dissidents as “fascists”, “extremists” and even “paedophiles”. The Russian Foreign Ministry spokesperson, Maria Zakharova, frequently does this in her statements:

The russophobic epileptic seizure in the Swedish parliament.

Statement of Maria Zakharova, Facebook, 2 June 2021


Our western partners live in a world of fantasy. I think they only see what does not exist and cannot see what stands in front of them. This is a strange ability – to be able to live in a world of illusions. My impression is that the collective West lives in a world of illusions.

RT, 23 April 2021

According to Ms. Zakharova, Russia acts impeccably, pragmatically and is above reproach. Criticism of Russian activities is nothing but a mental deviation, a sickness. A strange kind of epilepsy. Fantasies.

The EUvsDisinfo database contains several examples of this trick. Pro-Kremlin outlets describe the West as “satanists” and “perverts”. Russian state media suggest political decision-making is based on “mental illness”. The Swedish activist Greta Thunberg is a “victim of political paedophilia”.

As with most other of the Kremlin cheap tricks, the aim is to deflect. Instead of commenting on the issue, the opponent is attacked and dehumanised. The purpose is to discourage the opponent. Silence the conversation.

Of course, this method is not only used by the Kremlin. It is also frequently employed by Belarusian TV. The channel broadcasts a programme called “The Order of Judas”, and the host, Ryhor Azaronak, explains the purpose of the show:

“When nothing is clean, when the Liberal filth has covered almost everything, the world stands on the verge of extinction. Belarus is still standing. Thrice cursed is the son of Judas; may he suffocate from his love of silver. This is The Order of Judas. We talk about those who have forgotten about the good, and soiled their lives with the sin of betrayal.”

The antidote to this device: ignore the attack. Continue insisting on answers to the core issues, as such an attack clearly demonstrates the absence of viable arguments.

M for Mockery

Like a crazy person shooting deadly flaming arrows are those who deceive their neighbour and say, “Hey, I was only joking!”

Proverbs 26:18

Sarcasm, mockery, ridicule – “hahaganda” – are well-functioning means of gaining advantage in a debate. EUvsDisinfo has frequently described the device in previous articles: here and here, for example.

Sarcasm, jokes and exaggerations can all be perfectly fine in a debate, but in the Kremlin’s case they are a core element of the rhetoric. The Foreign Ministry spokesperson, Maria Zakharova, uses them as a staple of her weekly briefings, where ironic remarks about “our Western partners” and “our esteemed colleague” are deployed condescendingly, with the intention to belittle and defame.

“This is not the first time that the daily has published blatant nonsense.”

Briefing by Foreign Ministry Spokeswoman Maria Zakharova, Moscow, 4 March, 2021

“All the nonsense that is now being issued publicly or through EU institutions’ accounts on social media, we will respond to it with clear argumentation and truthful factual information.”

Briefing by Foreign Ministry Spokeswoman Maria Zakharova, Moscow, 4 March, 2021

“Many, unfortunately, still believe in this inculcated nonsense.”

Briefing by Foreign Ministry Spokeswoman Maria Zakharova, Moscow, 10 December, 2020

“Repeated thunderous statements that no one, except Moscow, could perpetrate this cyber-attack because it is impossible to implement such an attack without using a special state resource are absurd. This amounts to a pseudo-legal position and nonsense.”

Briefing by Foreign Ministry Spokeswoman Maria Zakharova, Moscow, 4 June, 2020

“The British government has come to the correct conclusion that the term “highly likely” no longer makes any sense.”

Briefing by Foreign Ministry Spokeswoman Maria Zakharova, Moscow, 16 July, 2020

“Most likely, these fantasies (and this is how they should be qualified) should be considered in the context of London’s tactics of accusing Russia in their preferred “highly likely” style we know so well.”

Briefing by Foreign Ministry Spokeswoman Maria Zakharova, Moscow, 16 July, 2020

In 2017, the NATO Strategic Communications Centre of Excellence in Riga published a report on the systematic defamation of foreign political leaders on Russian state-controlled TV. While satire and humour are fundamental elements of public discourse in a democratic society, the satire on these Russian TV programmes was deliberately intended to defame and belittle. The same approach was employed during the demonstrations in Russia in January 2021: pro-Kremlin media outlets and pundits carefully labelled the hundreds of thousands of protesters “Navalny cubs and Mommy’s revolutionaries”.

Mockery is a powerful device and very hard to defend against. Mockery has to be endured, while not being diverted from the core issues. Kremlin humour is no laughing matter.

P for Provocations

Occasionally, the purpose of a rhetorical device is to occupy the information space. This is in line with military theory, outlined in the Journal of the Russian Institute of Strategic Studies:

“A preventively shaped narrative, answering to the national interests of the state, can significantly diminish the impact of foreign forces’ activities in the information sphere, as they, as a rule, attempt to occupy “voids” [in the information flow].”

The Russian military in Syria habitually forwards claims that anti-Assad groups are planning chemical attacks against civilians in order “to stage a provocation”, i.e. so that the Syrian regime’s troops can be blamed.

Moscow, 23 May. INTERFAX. “On the eve of the presidential elections in Syria, militants from the terror group Hayat Tahrir Al-Sham (forbidden in Russia) are preparing provocations with the use of poisonous substances in the Western parts of the Idlib province. So reports the deputy head of the Russian Reconciliation Centre for Syria, Rear Admiral Aleksandr Karpov.”

Moscow, 20 February. RT. “The deputy head of the Russian Reconciliation Centre for Syria, Rear Admiral Vyacheslav Sytnik reports that militants in the North-East part of the Idlib zone are preparing a provocation, using poisonous substances.”

Moscow, 14 October. TASS. “Militants are preparing provocations with the use of poisonous chemical substances in the southern parts of the Idlib zone of de-escalation in Syria. So reports the deputy head of the Russian Reconciliation Centre for Syria, Rear Admiral Aleksandr Grinkevich.”

Moscow, 23 September. Izvestiya. “Terrorists are planning a provocation, using poisonous substances in Idlib in order to blame the government of Syria for using chemical weapons against the civilian population. This was reported Wednesday by the deputy head of the Russian Reconciliation centre for Syria, Rear Admiral Aleksandr Grinkevich.”

A provocation is, in the pro-Kremlin disinformation outlets’ toolbox, an operation performed by the opposition, targeting their own, in order to justify an attack. Other terms for the same concept are “set-up” or “false flag operation”. A similar idea is “the sacrifice of the sacred”. The EUvsDisinfo disinformation database contains numerous examples of the device: Navalny was poisoned by Western intelligence; MH17 was shot down by the Brits; the Democrats set up the January riots in Washington DC; and, most recently, Lukashenka fell into a trap rigged by British intelligence.

The statements are, as a rule, never actually intended to convince anyone, and the predictions are 100 per cent inaccurate. They aim to establish an element of uncertainty, despite a complete lack of foundation or proof for the claims. In some cases, the “provocation device” might tempt professional media to suggest “alternative versions”. The world is a globe, but there might be some truth in the flat Earth theory.

E for Exhaust

An efficient way of destroying a debate is to endlessly bring forward technical details and peripheral elements of earlier statements, while carefully avoiding the core issue. The term “sea-lioning” appeared in a 2014 comic, in which a sea lion with exaggeratedly polite manners intrudes into a conversation, demanding explanations of an earlier statement. The point of this device is to pose as a persistent hunter for the truth.

This method was on display in connection with the act of air piracy committed by the president of Belarus, Alyaksandr Lukashenka. Pro-Kremlin outlets compared it with an incident in 2013, when US authorities demanded that a plane carrying the Bolivian president, Evo Morales, land in Vienna. In that case, there were no false bomb threats and no fighter jets intercepted the aircraft. The Vienna incident is still used as a comparison, however, with the unique features of the Lukashenka act totally disregarded. Examples can be found here, here and here.

138 and Counting

Another means of exhausting the opponent in a debate is by simply flooding the information space with versions, conflicting theories and fake details. For example, King’s College London has collected 138 separate and contradictory narratives on the Skripal poisoning. A similar method is on display regarding the attempted assassination of Navalny, and the shooting down of MH17: new lies, new attempts to deflect… eventually one gets exhausted and is reluctant to engage in a debate on the issues.

Herein lies the goal of the Kremlin’s methods: the less the facts around MH17, the Skripals and Navalny are talked about, the better for the Kremlin. Permanent attacks and persistent denials of apparent evidence will eventually discourage criticism.

D for Denial

Among the deflective methods of the pro-Kremlin disinformation outlets, denial is arguably the favourite. EUvsDisinfo has over 500 examples of statements in pro-Kremlin outlets containing the words “There is no proof”. In the pro-Kremlin ecosystem, “no proof” means “no proof we accept”. Any evidence not in line with the Kremlin narratives does not exist. Evidence, be it witness accounts, hard evidence, poison residue, photos or videos, or even detained individuals, is “no proof”. Any individual, NGO or authority forwarding data that raises questions about the Russian authorities’ accountability, or links them to crimes, is, according to the Kremlin, biased, and the evidence has no value. Even the United Nations is biased against Russia and Belarus.

A special case of denial is Russia’s persistent repudiation of deploying military units in Crimea in the spring of 2014, suggesting that the military there were local activists, purchasing equipment at army surplus stores. President Putin later acknowledged that the personnel were from Russian regular units, brought to Crimea to control the “referendum”.

Denial, despite mountains of evidence, can be effective, especially if combined with other tricks from the rhetorical toolbox.

The Limits of the Cult of Denial

The “cult of denial” occasionally creates problems for the Kremlin. Even the editor-in-chief of the Kremlin disinformation mouthpieces Sputnik and RT, Margarita Simonyan, doubts the value of blunt denial:

“There are two religions as to how a large and demanding country should behave itself in case of a disastrous screw-up. The first suggests that the country should enter into a state of strong denial; admit to nothing, atone for nothing. Otherwise everything will get much worse. Well, just because. Most responsible comrades in most powerful countries, including ours, adhere to this one.

The other religion suggests that one should act like Iran. This is closer to me. For simple human reasons. In my system of values, Iran acted like a real man.”

Iran admitted to accidentally shooting down a Ukrainian passenger aircraft and quickly issued apologies and reparations. The incident was a tragedy, but the story soon left the front pages of the world media. The US acted in a similar way when an Iranian civilian plane was shot down by a US naval vessel in 1988.

Evidence in the MH17 case has been collected by investigators, NGOs and authorities and is being examined by a court of law in the Netherlands. The body of evidence points to Russian state structures. Russia continues to deny, deny and deny, and the case has poisoned Russia’s relations with virtually the rest of the world.

This concludes the series of the Kremlin’s cheap tricks. Keep your eyes on the core issue, don’t get SWAMPED!

Read more

Why Authoritarians Love the Concept of the Big Conspiracy


The idea of a Shadow World Government has always been very popular among conspiracy theorists. Its manifestations might be different, but generally the concept conjures up the image of a small group of men, deciding the fate of the world behind the scenes; puppet-masters, covertly controlling the world. Rulers of the rulers. EUvsDisinfo has described such cases frequently; here, for instance, or here.

The appeal of conspiracy theories is that they are virtually impossible to debunk. The premise on which they are based is that everything goes on behind the scenes; behind veils, covertly, in the shadows, carried out by the world’s most powerful people with limitless resources. The complete absence of proof of the existence of the Shadow Rulers becomes the proof that they exist and that they are successful. We have encountered this logic in other cases as well. Any attempt to disprove the concept will be met with the counter-claim: “Yes, of course you deny it; you are also part of it!” Since the Shadow Government, the Deep State, is by definition covert, it must, logically, be malign. Why would they hide if they weren’t up to something bad?

The concept is usually cultivated among die-hard conspiracy theorists, and it occasionally appears in popular culture. We find conspiracy theories involving powerful people in James Bond movies, in Tintin comic books, in novels like The Da Vinci Code by Dan Brown, and in science-fiction thrillers featuring Reptilians ruling the world. It is a potent element for creating an enticing plot in thrillers.

And it is a potent element in propaganda too. One of the most odious examples of the use of this concept for propaganda purposes is The Protocols of the Elders of Zion. Published in 1903 in Russia, the text describes a Jewish plan for global domination. It has been proved to be a forgery made by the Tsarist secret police, but it is still distributed in anti-Semitic circles as “proof” of a Jewish plot to rule the world.

As late as 2016, a columnist for the Russian daily Komsomolskaya Pravda referred to the “Elders of Zion” in an article describing how the Brexit referendum in Britain and the election of Donald Trump in the US proved that the “Shadow Government” was not, in fact, omnipotent. The same publication has on several occasions brought forward the idea of “the Rothschilds” controlling the world.

Recently, the same paper gave space to an elaborate conspiracy theory that the Yellow Vest movement was the manifestation of a struggle between Donald Trump and “the Rothschilds”, with the latter using the alleged threat of global warming to promote their own pecuniary interests: Trump (and Komsomolskaya Pravda) knows the Paris Climate Agreement is just a waste of resources.

A recent study demonstrates how today’s Russian media have become more and more prone to referring to various conspiracy theories. And this has an impact on the audience. A poll conducted in the summer of 2018 by the Russian state polling institute VTsIOM found that two thirds of the Russian population are convinced of the existence of a “World Government”. Most of the respondents are also convinced that this World Government is hostile towards Russia. Interestingly enough, the number of “believers” has risen from 45 to 67 percent since Russia’s illegal annexation of Crimea.


More on conspiracy theories in Russian TV

The charm of the Shadow Government theory is that it can be filled with anything you want: Catholics, bankers, Jews, feminists, Freemasons, “Big Pharma”, Muslims, the Gay Lobby, bureaucrats… all depending on your target audience. You never need to prove anything, and all the grievances of the audience can be explained: the Gay Lobby/Jews/bureaucrats/bankers have stolen our money, are developing germ weapons, instigating wars, installing presidents…

The goal of the Shadow Government narrative is to question the legitimacy of democracy and our institutions. What is the point of voting, if the Shadow Government already rules the world? What is the point of being elected, if the Deep State resists all attempts to reform? We, as voters, citizens and human beings, are disempowered through the Shadow Government narrative. Eventually, the narrative is designed to make us give up voting or practising our rights to express our views.

Promoters of the Shadow Government narrative depict themselves as “straight talkers”, dissidents, heroes of the public discourse… “Question everything!” they call. The irony is that, in reality, belief in the shadow conspiracy rests on blind trust in authority.

Further reading

EU Activities to Combat Anti-Semitism

Anti-Defamation League on "The Protocols of the Elders of Zion"

November Surprise: an Anti-Semitic Anniversary

Antisemitism and pro-Kremlin propaganda

Everyone against Russia

Read more

When Words Kill – from Moscow to Mariupol


For the first time, the world will observe the International Day for Countering Hate Speech on 18 June. This follows the UN General Assembly resolution from July 2021 on “promoting inter-religious and intercultural dialogue and tolerance in countering hate speech”. The UN invites governments, international organisations, civil society groups, and individuals to hold events and initiatives promoting strategies to identify, address and counter hate speech.

Hate Speech - What are we talking about?

In the words of the UN, hate speech is any kind of communication in speech, writing or behaviour that attacks or uses pejorative or discriminatory language with reference to a person or group on the basis of who they are, in other words, based on their religion, ethnicity or affiliation.

Hate speech is dangerous. Again, in the words of the UN:

“If left unchecked, hate speech can even harm peace and development, as it lays the ground for conflicts and tensions, wide-scale human rights violations.”

Hate speech and dehumanisation

Hate speech has two “travelling companions”: disinformation and media manipulation. Russia’s war against Ukraine demonstrates the deadly effect of hate speech, as it has served to dehumanise the opponent, in this case the legitimate, elected government in Kyiv and the wider Ukrainian population.

Once the foe is dehumanised, soldiers on the battlefield do not fight another person like you and me, but rather a member of a lower-ranking group.

Few would have expected that, by the middle of 2022, key Russian leaders as well as trend-setting media and opinion formers would openly embrace genocidal views or call for people “to disappear”. What dynamics propel a literate society, with a rich culture, platforms for information and exchange of views, social media and more, onto such a trajectory?

One obvious – but uncomfortable – answer: humans can be manipulated. Primal group dynamics kick in, especially when repeatedly exposed to emotionally charged disinformation. Our group versus ‘them’. (See our article on 5 common narratives.)

Sadly, European and world history is rich in such examples. Hate speech has been the precursor to atrocities in several wars. Examples from the past century include the Holocaust, the 1994 killings in Rwanda and the 1990s wars in the former Yugoslavia, where Vukovar, Srebrenica and Kosovo are just some reminders of how quickly erstwhile neighbours can turn on each other, ending in brutal killings.

How hate speech dominated Kremlin speak during the last 12 months

Since 2014, use of the term “Nazi” has been frequent as our database illustrates, but the general tone has also escalated.

July 2021: Putin’s long article. About a year ago, on 12 July 2021, the Kremlin published an article signed by Putin: ‘On the historical unity of Russians and Ukrainians’. In essence, Putin’s claims can be summarised as follows:

- Much of Ukraine is land robbed away from “Historic Russia”

- The Ukrainian nation is an artificial idea and Ukrainians are basically brainwashed Russians

- Ukraine is led by “radicals and neo-Nazis” who are “instruments” of the West [US, NATO, EU]

The “Nazi / neo-Nazi” slogan features in five different parts of the text. Nazi accusations have been used since the Maidan revolution in 2013 to de-humanise Ukrainians and capture all evil in one word.

Putin’s article has been distributed to soldiers in the Russian army in what appears to be a modern version of the political education of soldiers in the former Soviet army.

24 February 2022: War. In the weeks leading up to the new invasion on 24 February, Putin and the Russian government often spoke about Kyiv wanting to “attack and wage a genocide against Donbas Russian-speakers”. On 24 February, Putin presented the operation as “self-defence” to “de-nazify Ukraine”. “Nazi” becomes, in effect, an order to fire for the soldier holding the weapon.

16 March: To prevent the evil ones from using nukes. In Putin’s speech on 16 March, the denunciation went one notch up: the “neo-Nazis” in Kyiv are preparing attacks with chemical and biological weapons, anthrax or something similar; soon Kyiv will have nuclear weapons ready to use against Donbas and Russia. But Putin presents himself as the tamer of neo-Nazis.

It is during March that atrocities in areas under Russian control gain momentum, as documented by the OSCE.

3 April: RIA Novosti praising genocide Russia’s largest news service, state agency RIA Novosti, published a leading article by the Kremlin-affiliated, high-profile intellectual and film-maker Timofey Sergeytsev, where he calls for action in Ukraine which can only be considered genocidal: No mercy on the battlefield, mass repression, ethnic cleansing à la Stalin. The message then spread to the main Russian TV channels and online media.

The terms “Nazis” and “Nazism” are again used overwhelmingly in the article to label anything associated with the Ukrainian state, the Kyiv government or the Ukrainian authorities. The plan calls for the destruction and liquidation of all “Nazis”, and for mass repression against Ukrainians. Apart from strict censorship on any Ukrainian voice, and the introduction of Russian laws and culture, the aim is to ban even the name Ukraine and the term Ukrainian itself. To make Ukraine disappear.

7 June: Dmitry Medvedev wants “them to disappear”. RIA Novosti’s leading article is not far from the thinking of the former president, once a liberal reformer, Dmitry Medvedev, now deputy chair of the Russian Security Council. In a Telegram post, he outdid his earlier harsh, belligerent language, sending this clear message: “They are bastards who want death for Russia. I hate them and will do everything to make them disappear”. Although he did not explicitly mention Ukraine, it is a fair guess that Medvedev had Ukrainians in mind.

13 June: Ukraine, an “existential threat to Russia”

Medvedev’s words have been seconded by more explicit calls to violence by another top Russian official, Dmitry Rogozin, former Deputy Prime Minister, former Russian Ambassador to NATO and now head of Roskosmos, the Russian space agency. On his (verified) Twitter account he proposes to put an end to Ukrainians “once and forever”. Ukraine, or in Rogozin’s words “what appeared in the place of Ukraine”, represents “an existential threat to the Russian people, Russian history, Russian language and Russian civilisation”. (See also here)

“Existential threat” is also a key phrase in the Russian doctrine on the eventual use of nuclear weapons: such a threat against Russia is the stated condition for first use.

15 June: Dmitry Medvedev (again): “Ukraine may not exist in two years”

Medvedev made another Telegram post on 15 June, ridiculing talk of possible US liquefied natural gas (LNG) deliveries to Ukraine under a form of two-year lend-lease. Provocatively, he asked: “Who said that in two years Ukraine will even exist on the world map?” This continued maelstrom of “disappear / cease to exist / hate them” is one big green light for soldiers to go ahead with any action one could imagine.

Toxic TV

Russian state TV and pro-Kremlin outlets have long been a toxic voice with a very militaristic and revanchist agenda, from talk shows like the popular “60 Minutes”, where pundits outdo each other (follow highlights here), to regular TV news constantly drumming out the “Nazi” stories. This reinforces an already hateful atmosphere.

The young soldier on the battlefield – when hate speech gets applied

The hate speech flows directly down through the military hierarchy to the soldier at the front line, who consumes news and inputs like anyone else. Most of the Russian soldiers sent to war in Ukraine are young. They have known only one leader of Russia since their childhood: Putin. Education in Russian schools and later in the army has put him on a pedestal. Eight years have passed since 2014, the illegal annexation of the Crimean Peninsula and the start of the Donbas war. What goes on in Russia happens with his approval or thanks to him. This is close to a personality cult, perhaps best captured in the 2014 words of then Kremlin deputy chief of staff Vyacheslav Volodin, now Chairman of the Russian State Duma: “There is no Russia today if there is no Putin.”

After months of fierce fighting, after having lost battles and withdrawn from Kyiv, and having heard almost apocalyptic nuclear sabre-rattling every other day, the words “have them disappear” are likely perceived to mean that the enemy is every Ukrainian, to be killed and annihilated.

On the International Day for Countering Hate Speech, let us recall the words of the Duke of Wellington after the 1815 Battle of Waterloo: “Nothing except a battle lost can be half so melancholy as a battle won”, illustrating that even the most victorious war commanders were able to feel humility and regret for lost human lives.


Read more

Tactics, Techniques and Procedures of Disinformation

For years, work against disinformation used to revolve around a few central questions: is a piece of information true or false? If it is false, is it accidentally or intentionally so? If it is intentionally false or misleading, what is the purpose of its creator or amplifier? Let’s call this a content-based approach to the problem – a way of monitoring, detecting, and analysing disinformation that is largely focused on the content.

While content is and will remain an integral part of all information manipulation operations, focusing just on that most visible part does not give us a full picture. This is why we have been pivoting to an approach that also includes the analysis of behaviour in the information space. Central to the approach of detecting, analysing, and understanding foreign information manipulation and interference (FIMI), including disinformation, is an ever-evolving set of tactics, techniques and procedures (TTPs).

The logic of TTPs explained

Using TTPs to identify and analyse patterns of manipulative behaviour is far from new. Confronted with similar challenges, the cyber- and information-security community developed the ATT&CK framework back in 2013. It addressed a complex challenge by providing a structure for organising adversary TTPs that allows analysts to categorise adversary behaviours and communicate them in a way that is easily understandable and actionable by defenders.

Based on the ATT&CK framework, the transatlantic DISARM Foundation (in collaboration with the Cognitive Security Collaborative, Alliance 4 Europe, and many others) set up a similar framework for information manipulation – the DISARM Framework. It is a free and open resource for the global counter-disinformation community. It is not the only one out there, but it is currently one of the most advanced of its kind. In the simplest terms, it provides a single, standard language for describing information manipulation tactics, techniques, and procedures.

Examples of TTPs

The DISARM Framework organises all the TTPs that are known to be used in information manipulation operations – currently about 250 – into an easily comprehensible system. The framework spans 12 tactical steps of an information operation, from planning the strategy and objectives, to developing narratives and content, to delivering a final assessment.

The TTPs mapped in the framework cover everything from well-known techniques (e.g. creating fake accounts, building bot networks, amplifying conspiracy narratives, using fake experts etc.) to less often talked about ones (e.g. exploiting data voids, utilising butterfly attacks, spamouflaging etc.). The list of TTPs in the framework is far from final as malign actors keep innovating and the threat landscape keeps evolving. Thanks to the fact that the DISARM Framework is an open and joint effort, it is easily modifiable to keep up with the latest insights and trends in information manipulation.

Needless to say, not all information operations include all the phases laid out in the framework, let alone the 250 or so TTPs listed. The idea of the framework is to map out and present a complete picture that we can then use to analyse and systematise an information manipulation operation.
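To make the structure concrete, here is a minimal sketch of how such a tactic/technique catalogue could be represented in code. The tactic and technique names and IDs below are illustrative placeholders invented for this sketch, not the official DISARM entries.

```python
from dataclasses import dataclass, field
from typing import List, Set, Tuple

@dataclass
class Technique:
    tech_id: str   # placeholder ID, not an official DISARM code
    name: str

@dataclass
class Tactic:
    tactic_id: str
    name: str
    techniques: List[Technique] = field(default_factory=list)

# A tiny, hypothetical slice of a framework: each tactical step groups
# the techniques an operation may use at that stage.
FRAMEWORK = [
    Tactic("TA-X1", "Plan Strategy",
           [Technique("T-X1", "Define target audiences")]),
    Tactic("TA-X2", "Deliver Content",
           [Technique("T-X2", "Create fake accounts"),
            Technique("T-X3", "Amplify via bot network")]),
]

def map_observed(observed: Set[str]) -> List[Tuple[str, str]]:
    """Map observed technique IDs back to (tactic, technique) name pairs,
    giving analysts a shared vocabulary for describing an incident."""
    return [(tac.name, tech.name)
            for tac in FRAMEWORK
            for tech in tac.techniques
            if tech.tech_id in observed]

# Tagging an incident where fake accounts and a bot network were observed:
print(map_observed({"T-X2", "T-X3"}))
```

The point of such a shared schema is interoperability: two analysts who tag the same operation with the same IDs produce directly comparable, machine-readable reports.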

Analysing the behaviour of malign actors by no means implies that the content of information manipulation operations loses its relevance. Quite the contrary – looking at both the behaviour of malign actors and the content used in their operations gives us a much better understanding of the overall threat landscape. Furthermore, approaching the problem equipped with an organised set of TTPs makes information manipulation – an infamously elusive concept – much more measurable. The additional benefit of a common language of clearly defined TTPs is that the work of analysts worldwide becomes more comparable and interoperable.

Addressing information manipulation as a behavioural problem enables us to come up with responses that are targeted, scalable, more objective, and go beyond awareness raising and the debunking and prebunking of misleading or false narratives. A malign actor who wishes to manipulate the information environment needs to follow certain TTPs that we can now understand, detect, and make more costly even before reactive responses become necessary.

Bullshit, the noisy conqueror of the information space


In 2018, we wrote about a special kind of pro-Kremlin media content: infoshum.

The word infoshum comes from the Russian for “information noise”; compare the internationally known term “white noise”, i.e. random, meaningless noise. It sits in the grey zone between information and disinformation. And we have evidence that it is actively being pushed by pro-Kremlin media.

An important concept to help us understand infoshum is bullshit.

Interestingly, there actually exists a theory of bullshit and it has implications for disinformation.

This theory was presented in the book “On Bullshit”, by philosopher Harry Frankfurt. The very first lines are:

“One of the most salient features of our culture is that there is so much bullshit. Everyone knows this.”

According to Mr Frankfurt, bullshit is speech intended to persuade, without any concern for truth. This lack of concern distinguishes the bullshitter from the liar. The bullshitter is more radical.

The liar knows and cares about truth, and this is precisely why he tries to conceal it. The bullshitter, however, does not care whether he utters truth or lies. His focus is solely on persuasion.

Frankfurt does not claim that there is more bullshit in society than in the past. Instead, he explains that all forms of communication have increased, making bullshit more visible.

Think about this: Frankfurt published the book in 2005. Already then, he wrote about the amplification of bullshit – years before the spectacular rise of social media. In 2005, the most valuable companies in the world still managed oil and money instead of information, and Facebook had a meagre 6 million users, mostly US college students.

Fast forward to 2020, and bullshit has become a lot darker than even Frankfurt perhaps predicted. Sometimes, it seems our democracies are drowning in bullshit. Sadly, our database covers quite some bull.

A recent and extravagant example is the Danes being painted as zoophiles.

Aleksey Zhuravlyov, member of the Duma, stated that facilities for zoophiles have been opened in Denmark, where one can go and “rape a turtle.” This resonates with earlier narratives, in which the Danes were portrayed as zoophiles. It also fits a larger narrative of the moral decline of the West.

There are more examples. What should one make of “the time Russian TV claimed gay couples could buy an actual baby at a fair in Brussels?”, or when it was claimed that the “Council of Europe was trying to divide men and women of the Russian delegation into 6 sexes”?

Occupying the information space

Why would anyone spread bullshit? In the end, the goal is to occupy the information space.

As we flagged in February, the Russian Institute for Strategic Studies, a Kremlin-funded think tank, published an essay titled “Securing Information for Foreign Policy Purposes in the Context of Digital Reality”. The paper claimed that:

“A preventively shaped narrative, answering to the national interests of the state, can significantly diminish the impact of foreign forces’ activities in the information sphere, as they, as a rule, attempt to occupy “voids” [in the information flow].”

This strategy points to the ambition to divert attention from a certain truth. Therefore, one who applies this strategy is a liar, not a bullshitter.

However, the tactical liar and the bullshitter share an attitude: substance is secondary, and the primary goal is to flood the information system.

From this perspective, even false information that does not seem directly harmful is dangerous, because it occupies space and undermines the general conditions for establishing truth.

In this regard, it is in itself telling that in Russia, almost half of all political conversation on Twitter is conducted by bots.

Researchers have shown that this is also the case with regard to Covid-19. They scrutinised more than 200 million virus-related tweets worldwide and concluded that, since January, about 45% of the tweets were sent by accounts that behave more like computerised robots than humans.
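What does it mean for an account to “behave more like a robot than a human”? The study’s actual method is not described here, but a toy heuristic along lines common in bot-detection research – posting at machine-like regular intervals and at implausibly high volume – can illustrate the idea. The thresholds and feature choices below are invented for this sketch.

```python
from statistics import pstdev

def bot_likeness(post_times, volume_threshold=100):
    """Crude 0-2 score: +1 for near-constant gaps between posts
    (scheduled, machine-like posting), +1 for very high volume.
    Real systems combine dozens of such behavioural features."""
    gaps = [b - a for a, b in zip(post_times, post_times[1:])]
    score = 0
    if gaps and pstdev(gaps) < 1.0:        # almost identical intervals
        score += 1
    if len(post_times) >= volume_threshold:  # implausibly high output
        score += 1
    return score

human = [0, 40, 95, 300, 310, 900]   # irregular, bursty posting
bot = list(range(0, 600, 5))         # one post every 5 time units, 120 posts

print(bot_likeness(human), bot_likeness(bot))
```

The design point is that behaviour, not content, is scored: the same heuristic flags a bot network regardless of what narrative it pushes.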

In 2018, Facebook deleted 835 million fake accounts – roughly ten percent of the earth’s population.

Steve Bannon once notoriously said: "The Democrats don't matter. The real opposition is the media. And the way to deal with them is to flood the zone with shit."

The falsifiability of bullshit

If you want to flood the information system, bullshit is a fine instrument, because it might be harder to falsify than a lie.

The well-known philosopher Karl Popper introduced the concept of falsifiability: the capability of a claim to be contradicted by evidence.

For example, "all swans are white" is falsifiable. You need only one black swan to disprove it.

Whereas, "this human action is altruistic" is a non-falsifiable statement. This is because we have no instruments to decide, whether or not an action is driven by self-interest.

In the disinformation context, this works the same.

For example, if you push the narrative that the BBC claims the MH17 flight was downed by a Ukrainian fighter jet, this can easily be disproved: the documentary was clearly misrepresented.

It becomes more challenging, however, if you want to refute the following claim: George Soros is the driving force behind a secret society supporting colour revolutions, whose hidden intent is to overthrow all nation states to make space for a world government.

This can never be disproven completely. However, that does not make it true. Rather, it strongly suggests we are dealing with bullshit. And as we know bullshit is not innocent.

But wait a minute, isn’t it a strange coincidence that Soros was a student of Karl Popper?!?! Very strange indeed…

Read more

From dance videos to coordinated lip-syncing in support of war


Social media platforms are widely used to spread disinformation, including during conflicts. With over 1 billion users worldwide, TikTok is no exception to the rule. The platform has taken the world by storm, allowing people to follow parts of the war in real time. Since Russia’s invasion of Ukraine, TikTok has exploded with videos shot on the ground: columns of military vehicles, instructions on how to drive an abandoned tank, dance choreographies on the battlefield, cooking lessons in bomb shelters, etc. The other side of the coin is more ominous: TikTok has also seen an outbreak of disinformation videos praising the ‘special operation’. From self-proclaimed war experts to Kremlin-backed channels, TikTok has become a hotbed of war propaganda.

From disinformation to war propaganda

TikTok videos expressing support for the Russian invasion of Ukraine frequently use the toxic pro-war ‘Z’ symbol and feature demonstrations in favour of the ‘special operation’. Many videos justify the ‘special operation’ with the need to save Russian-speaking residents in Donbas from an alleged ‘genocide’. Russian servicemen are presented as the ‘saviours’ and ‘defenders’ that local people have been waiting for since 2014.

Well-known pro-Kremlin propagandists keep misleadingly claiming that the ‘special operation’ is only targeting Ukrainian military infrastructure and that US-funded labs in Ukraine are developing biological weapons to attack Russia. Meanwhile, other videos blame Ukraine for not respecting the Minsk agreements and accuse NATO, the EU and the US of arming Ukraine with atomic bombs, of expanding towards Russia and of interfering in other countries’ internal affairs.

In response to the near unanimous international condemnation of Russia’s invasion of Ukraine, the #мненестыдно [#Iamnotashamed] hashtag has been trending on Twitter and TikTok, while many TikTok compilations refer to Russia’s greatness, often compared to the USSR and Imperial Russia. The top figure in the pro-Kremlin disinformation ecosystem to push this narrative was Margarita Simonyan, RT’s editor-in-chief. She reacted to the #мненестыдно hashtag and narrative by lamenting that the only thing she is ashamed of is that they [i.e. Russians] ‘did nothing’ for Donbas for eight years.

The role of influencers

Recently, a blogger raised allegations that Russian influencers were being coordinated and instructed to recite the exact same script. In addition to the all too obvious overlap of messages, the fact that all the videos in question have since been deleted and that six of the 13 channels hosting them have changed their TikTok IDs lends credence to this claim.

But these were far from the only coordinated videos making the rounds on TikTok. A number of other nearly identical videos are being pushed by popular influencers, all using the same audio clips, hashtags and filters. These channels – with follower counts ranging from tens of thousands to millions – all look like ordinary TikTok accounts that people use to share pranks, dance videos and comedy content. However, as Russia invaded Ukraine, these accounts started publishing politically oriented videos.

In one set of videos, the scene starts with the protagonist on one knee, holding a sign reading ‘Russophobia’, ‘Donbas’, ‘hate speech’, ‘cancelling’, ‘Luhansk’, ‘sanctions,’ ‘info war’ and ‘nationalism’ in English. The person then stands up and flips the sign, revealing the other side, which reads ‘Russian lives matter’. All these videos use the same song – a remix of Katyusha – and the same filter, featuring The Motherland Calls statue in Volgograd. They are all accompanied by a ‘Russian Lives Matter’ caption, as well as the #RLM hashtag. More than twenty identical videos have been created by various Russian influencers (see here, here, here and here, for example) with some of them generating hundreds of thousands of views, likes and comments.

Could this trend be propelled by an actual grassroots initiative? Hardly. Another Russian influencer shared a story on her Instagram account (see below), showing a request for a video with clear instructions: ‘the blogger has to play a Donbass inhabitant who survived 2014 [...]. Now he is safe already, because the Russian forces help DNR and LNR’. The message also instructs what hashtag and audio clip should be used to do the ‘lip-sync’.

A quick search on TikTok shows that at least two videos (one here, another one now deleted) were created following those instructions. What is more, one of the videos was published by an influencer who was also spotted reciting the identical script mentioned above.

Sponsored or paid content is commonplace on TikTok – and on other social media platforms, for that matter. Dozens of Telegram channels offer money to TikTok and Instagram bloggers to create themed videos. One channel in particular, MM-Media, announced (note: this and other links to posts by MM-Media are visible to members of the channel only, as the channel went private recently) at the end of February that there were going to be a lot of requests on political themes for a ‘big project’, with a ‘better budget’ going ‘over 20,000 rubles’ for a video. The owner of the Telegram channels asks TikTok bloggers ‘with over 5M followers’ to get in touch with her if they want to do ‘Government advertisements’. Since then, the channel has shared several calls with political themes to TikTok bloggers with at least 1.5M, 5M and 10M followers. In some of those calls, an example was provided as a reference. Dozens of videos following these instructions have been circulating on TikTok (here, here and here, for example). In another call, influencers were asked to form a Z with their fingers. Again, many videos following this request were shared on TikTok (here, here and here, for example). Recently, MM-Media’s Telegram channel launched a call for English-speaking bloggers to create videos on the theme of ‘stupid sanctions’.

What lies ahead

On 4 March, TikTok announced that it would start applying labels to content from some state-controlled media accounts. Later, it announced that it was suspending all new posts and live streams from Russia. The company also announced that it was blocking all non-Russian content in Russia, which means a user with a Russian IP address can no longer access non-Russian TikTok content. These decisions have effectively isolated many Russian users from foreign content.

Still, TikTok has become the birthplace of thousands of videos reporting on the war in Ukraine. The quest for popularity on TikTok generates millions of reactions, and videos are massively reshared, with many of them ending up on other social networks, in traditional media outlets and even on TV. While there is no reason to doubt that most of the videos shared on TikTok are genuine and express personal opinions, it merits wider attention that there are also influencers who get paid for sharing videos supporting the ‘special operation’ and the Kremlin.

On TikTok, war conveniently fits in a virtual world where violence is trivialised. All of this for a few rubles. If anything, there is every reason to believe that TikTok’s importance in the battlefield of information will only keep growing in the foreseeable future.

Read more

The Dirty Side of Advertising: Forced Confessions from Belarus on YouTube


To convey a positive image of Belarus and promote a pro-government Telegram channel, the ‘Belarus - the country for life’ YouTube channel uses forced confessions of political prisoners as paid advertisements.


Main findings:

It all started with an alarming tweet. Tadeusz Giczan, the editor-in-chief of Belarus’s largest Telegram channel, NEXTA_TV, wrote that confession videos of people captured by the Belarusian state are used as paid ads on YouTube to advertise pro-government Telegram channels.

The man in the video screenshot is Raman Pratasevich, arrested on 23 May 2021 after the forced diversion to Minsk airport of a Ryanair flight from Athens to Vilnius.

Pratasevich has worked as a photographer and journalist and is currently the editor-in-chief of the opposition Telegram channels “Nexta” and “Nexta Live”. After the rigged presidential election in 2020, these channels collected and published videos of mass protests, making the content accessible worldwide. Now Pratasevich’s forced confession, titled ‘The terrorist confessed everything’, seems to be used to advertise the pro-government Telegram channel Yellow Plums (@zheltyeslivy).

It’s also important to bear in mind that this is not the first time such forced confession videos have been released.


The channel

The channel spreading the videos is called Беларусь - страна для жизни or ‘Belarus - the country for life’. It is worth noting that the official 2015 promotional video of Belarus (also available on YouTube) bears the same name.



Another link to the Belarusian regime is apparent in the channel’s choice of logo, as it resembles the Belarusian presidential flag.

The channel’s ‘About’ section does not reveal who is behind it. The only publicly available information shows that it has 1390 subscribers, and the view count exceeds a whopping 6 million.

Last, but not least, the channel was created on 5 August 2020. On the very same day, a video was uploaded, showing illegitimate president Alyaksandr Lukashenka delivering a speech. After the presidential elections on 9 August, the videos showed violent protesters with a question: “Is this peaceful?”. Another clip bashed the opposition.


The spread

To see how many people noticed this promotional campaign, we ran a Google search using the channel ID code. We found that the videos uploaded by the channel, and then promoted as advertisements, were shared in multiple Telegram channels. We also found a post describing how users who had previously seen advertisements for shampoo and English courses were later targeted with pro-government video clips. The user also wondered why YouTube’s moderators allowed such ads to run.

We also found a Telegram message with a screenshot of the forced confession video of Raman Pratasevich’s partner, Sofia Sapega, used as an advert. The video carries a Russian-language watermark reading Yellow Plums, the advertisement links to the Yellow Plums Telegram channel, and the very same channel has also published the same confession video.

The comments below the screenshot discuss how propagandists use the state budget for such advertisements and draw some historic parallels.

The cost and videos

As there is no publicly available information on whom the ads targeted, we tried to find out how much it would cost the channel to run such ads if they only focused on an audience in Belarus. Thus, we mimicked the process of creating a YouTube advertisement to obtain the suggested cost per view (CPV). We found that a view in Belarus costs 0.01 euro.

As for what counts as a view, Google Ads defines it as a user interacting with the ad, or watching it in its entirety or for at least 30 seconds. It is important to bear in mind that YouTube does not publicly share information about how many of the views were a result of paid promotion.


The videos

According to the information above and the publicly visible 139,000 views on Raman Pratasevich’s video, the promotion cost should be at least 1,390 euros. But that is only the minimum. As stated above, YouTube only counts a view if a user watches the video for at least 30 seconds, but it charges the channel owner 0.01 euros every time a user watches at least 4 seconds of the clip.
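
The minimum-cost arithmetic above can be reproduced in a few lines. This is a back-of-envelope sketch using only the figures cited in the text; the real cost would be higher, since chargeable 4-second views are not publicly counted:

```python
# Back-of-envelope estimate of the minimum promotion cost, using the
# figures cited above: 139,000 publicly visible views at the suggested
# cost per view (CPV) of 0.01 EUR for an audience in Belarus.
CPV_EUR = 0.01
views = 139_000

min_cost_eur = views * CPV_EUR
print(f"minimum promotion cost: {min_cost_eur:.0f} EUR")  # 1390 EUR
```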

As for the entire channel, by 2 June it had 55 videos. The most popular video features rapper Seryoga, who claims there is absolute peace and order in Belarus and that the infrastructure is working. The clip has 672,000 views, 4 shares and 12 likes on Facebook, and according to the media analysis tool BuzzSumo, it is an advertisement.

The channel also has multiple videos about Belarusian opposition leader Sviatlana Tsikhanouskaya. One of them is a 12-second compilation of different clips that make her say "Spring is close and we know that those guilty for violence against Belarusians... Me, Svetlana Tikhanovskaya, my team, my representatives will go to jail and it's not an exaggeration." The description of the video is simple: “Это конец Света” or “This is the end, Sveta”. The video has 196,000 views.

On the other side of the spectrum, we can find a two-minute video showing illegitimate Belarusian president Alyaksandr Lukashenka giving a speech, protesters harassing the police and some smiling babies. The video description calls it ‘powerful, bright in emotions and constructive’, and it has 171,000 views.


The numbers

The total view count of the channel is 6,279,000. Divided among the 55 uploaded videos, that is a whopping 114,164 views per video on average.

When the same approach is applied to the YouTube channel of the Belarusian Information Agency BelTA, the result is, on average, 7,909 views per video.
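
The averages above can be checked with a quick sketch. The figures are those cited in the text; BelTA’s average is quoted directly rather than recomputed from its own totals:

```python
# Average views per uploaded video for the 'Belarus - the country for
# life' channel, compared with BelTA's per-video average as cited above.
total_views = 6_279_000
video_count = 55

avg_views = round(total_views / video_count)
print(avg_views)                      # average views per video

belta_avg = 7_909                     # BelTA's average, cited in the text
print(round(avg_views / belta_avg))  # rough multiple over BelTA
```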

Moreover, a YouTube channel with only 1390 subscribers should normally receive fewer than 1,000 views on non-promoted content (a point borne out by the eight videos with fewer than 1,000 views). The numbers above and below show that the content was either viral or that more videos were used as adverts. The latter would mean that someone is paying YouTube tens of thousands of euros to attack the critics of the Belarusian regime, advertise a pro-government Telegram channel, show that everything is in order in Belarus and that the president is a trustworthy person.

Out of the 55 videos uploaded by the channel, BuzzSumo was able to gather information about 14. While it labelled only two of them as advertisements, some of the others (namely the forced confession of Pratasevich) fell very close to the threshold BuzzSumo uses to detect an advert. Note that while Sofia Sapega's video has 193,600 views, it has no shares, likes or comments on Facebook, Twitter, Reddit or Pinterest.


The Telegram and YouTube channels

Returning to the tweet of Tadeusz Giczan and the question of what kind of role the Belarusian Information Agency BelTA plays in using Raman Pratasevich’s video to promote the pro-government Telegram channel Yellow Plums, we can only rely on netizens discussing the matter and sharing screenshots of ads with hyperlinks to the channel visible.

Thus, on the surface, the Yellow Plums Telegram channel is not linked to BelTA. However, the channel regularly posts exclusive material and personal data that only the special services have access to. It also posts a ton of inhumane, highly toxic content, not just propaganda per se.

YouTube has said that it has removed the forced confession ads. But the damage has already been done: the videos of Raman Pratasevich and his partner Sofia Sapega have already received 139,000 and 191,000 views respectively. While the ads have been stopped, the videos are still there, alongside others that portray the opposition as criminals and the government as their victim. And by doing so, the owners of the channel seem able to circumvent YouTube’s Community Guidelines, which regulate what is not allowed on the platform.

For some, these videos might be a stark reminder of how their country treats independent media. For others, they are a reminder of an authoritarian regime using taxpayers’ money and the internet to lie to and manipulate its citizens. And for some, these videos are just advertising worth 0.01 euros per view. Times five, maybe six million? And that’s just one channel.

POST SCRIPTUM: On 3 June 2021, the Belarusian state-owned TV channel ONT aired another piece featuring Raman Pratasevich in what it called “an interview”, but the 1.5-hour piece should rather be considered yet another ‘hostage video’, showing a person put under great stress by the Belarusian state authorities.

*This article was edited for clarity on 7 June 2021.

Read more

How to get the European Parliament to read Russia Today


EP Today: A news website that doesn't write 99% of its articles, targets decision makers in the EU and misleads the public about the origin of most of its content: Russia Today.

"EP Today" (short for "European Parliament today") is a self-proclaimed “monthly news magazine for the European Parliament”. And sure enough, with a name including one of the major institutions of the European Union, together with a cursive logo in a ring of 12-stars on a blue background, you may get the impression that it is a serious news outlet. The 145 000 Facebook fans and an average output of 25 articles per day complete the narrative that it is an established, influential and informed part of EU politics.

Except: "EP Today" uses the name of the European Parliament in a misleading way and without any legal authorisation.

The website, Facebook page and Twitter account suggest that "EP Today" has a connection to the European Parliament, which is not the case. The European Parliament contacted "EP Today" in September of this year regarding the unauthorised use of the institution's name and suggested ways to rectify the situation and stop misleading the public.

On their about page, "EP Today" states it is “designed only for the MEPs to write article about issues which they think are currently important and need attention of all their colleagues and other policy makers.” [sic]

However, more than 99% of articles that appear on eptoday.com are not from "EP Today". Since October 24th 2018, 47% of the articles are word-for-word copies of articles from RT.com. Often, the entire homepage is filled with exact copies of RT articles. The same article can even appear multiple times on the same page.
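
Word-for-word copying of this kind is straightforward to detect at scale: hash a normalised version of each article body and look for identical hashes across the two sites. A minimal sketch, assuming access to both sites' article texts; the sample titles and texts here are purely illustrative:

```python
# Detect exact (word-for-word) copies between two article collections by
# comparing fingerprints of normalised text. Whitespace and case changes
# are ignored; any substantive edit produces a different hash.
import hashlib

def fingerprint(text: str) -> str:
    """Hash of the text with case and whitespace differences removed."""
    normalised = " ".join(text.lower().split())
    return hashlib.sha256(normalised.encode()).hexdigest()

# Illustrative stand-ins, not real articles from either site.
rt_articles = {"RT piece": "NATO expansion threatens stability."}
ep_articles = {"EP Today piece": "NATO  expansion threatens stability."}

rt_hashes = {fingerprint(body) for body in rt_articles.values()}
copies = [title for title, body in ep_articles.items()
          if fingerprint(body) in rt_hashes]
print(copies)  # ['EP Today piece'] – an exact copy despite whitespace noise
```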

We analysed all 17,697 available articles published by eptoday.com under their “news” section: a dataset spanning more than two years, from 6 April 2017 to 23 August 2019. Within this period, we found only 25 contributions from MEPs or other decision makers on a variety of issues. A mere 0.14%. "EP Today" clearly is not “only for the MEPs”.
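
The share quoted above works out as follows, using the figures cited in the text:

```python
# Share of genuine MEP/decision-maker contributions among all articles
# scraped from eptoday.com's 'news' section, using the figures above.
total_articles = 17_697
mep_contributions = 25

share_pct = mep_contributions / total_articles * 100
print(f"{share_pct:.2f}%")  # prints "0.14%"
```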

Articles published by day on eptoday.com. Publishing before 24 Oct 2018 shows a high level of automation (30 or 50 articles max per day)

The website seems to be a lobbying platform, presented as a serious news outlet, whose target audience is EU decision makers. The concept is simple: "EP Today" offers exposure (or the illusion thereof) in return for attention.

To make that offer, a website needs to appear like a serious news outlet providing relevant content on a regular basis that attracts many viewers. Given that “EP Today operates out of brussels with its core staff” [sic], with only a mailbox address in Brussels, none of their editors on LinkedIn or Twitter and all their 4 Facebook page admins situated in India, it is difficult to provide enough relevant content to appear like an attractive outlet for contributions by decision makers. The answer to appearing bigger and more active, rather than like a one-issue lobby group, is content syndication.

Content syndication makes content from one website available to other sites. Wikipedia explains the motivation: websites that republish other sites' content (subscribing sites) become more interesting to readers, while websites that provide content to republish (providing sites) benefit essentially from free advertisement.
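
Mechanically, syndication of this kind is often just automated feed ingestion. A minimal sketch, assuming a standard RSS 2.0 feed; the feed content, titles and URLs below are illustrative stand-ins, not real EP Today sources:

```python
# Minimal sketch of automated content syndication: a subscribing site
# periodically fetches a source outlet's RSS feed and republishes each
# item verbatim. The inline XML stands in for a feed fetched over HTTP.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<rss version="2.0"><channel>
  <title>Source Outlet</title>
  <item><title>Article A</title><link>https://example.org/a</link></item>
  <item><title>Article B</title><link>https://example.org/b</link></item>
</channel></rss>"""

def syndicate(feed_xml: str) -> list[dict]:
    """Extract items from a feed, ready to republish on the subscribing site."""
    root = ET.fromstring(feed_xml)
    return [
        {"title": item.findtext("title"), "link": item.findtext("link")}
        for item in root.iter("item")
    ]

for post in syndicate(SAMPLE_FEED):
    print(post["title"], "->", post["link"])
```

Run on a schedule, a script like this fills a homepage with dozens of verbatim articles a day with no editorial staff at all, which is consistent with the automated publishing pattern described above.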

Since 24 October 2018, EP Today has used a multitude of sources to automatically provide content on its website. Before 24 October, only “Voice of America” was syndicated; later, RT.com together with a variety of EU sources.

Before 24 October 2018, "EP Today" exclusively syndicated “Voice of America” (VOA). Up to 50 articles per day were automatically embedded, which resulted in more than 99% of "EP Today" content coming from VOA. Why the switch from one syndicated outlet to another in 2018, and why did they choose these two?

One reason could be that RT.com, global rank 299 on Alexa, has a higher readership internationally than voanews.com, which Alexa currently ranks in global position 2831. But there are many other news outlets that rank much higher and are more relevant to the EU audience that "EP Today" targets.

Both “Voice of America” and “Russia Today” have very permissive terms of use, explicitly allowing the reuse of their content on other websites. While bbc.com for example, with a global rank of 81, requires case-by-case permission to embed its content, “Voice of America”’s content is in the public domain, allowing reuse, with the exception of content not produced by VOA itself. RT.com even goes as far as to “encourage free re-use of its materials for non-commercial purposes”, on the condition of linking to the original page. VOA only requires subscribing sites to credit them.

Social Media Does Not Care About Your Crediting

"EP Today" owns at least one Twitter profile with almost 6000 followers and a Facebook page with 145 000 followers.

Below is the exact moment before and after the switch from VOA to RT on "EP Today"'s Twitter profile. Without announcement or transparency, the Twitter account suddenly promotes RT messages. Ironically, during the INF treaty debate. The last syndicated VOA article on @eptoday displays the two sides of the debate, while the first article by RT condemns the United States as seeking world domination. A remarkable shift in tone and perspective.

To the 6000 followers, this shift was invisible, as Twitter does not show the original source of the article, but only the host website: eptoday.com.

We have seen related “information laundering” of RT content in the US, Germany or Belarus. The result (and sometimes objective) is always to mislead readers.

Read more

Filmmaker Falsifies Fake Forgery


It’s fake, fake, fake! Filmmaker Nikita Mikhalkov did not leave his audience in any doubt when demonstrating that the vast crowds of protesters in Belarus were nothing but computer generated imagery. The pictures, allegedly showing “oceans of people” were doctored – “trust me”, Mikhalkov said, “I’m a professional”.

No one can doubt Mr Mikhalkov’s professionalism. He is certainly one of Russia’s most internationally acknowledged film professionals. He has an impressive oeuvre; since his mid-70s debut, his films have earned prizes at international film festivals in Cannes, Venice and Berlin. In 1995, he won an Oscar for Burned by the Sun, a story set in the Stalin era, in which he himself stars in the leading role as a Soviet officer falling from grace with the Kremlin rulers. A great work of art; a great piece of fiction.

The Saint

Mikhalkov continues to excel in the field of storytelling and is hosting a YouTube show called “Besogon” – the title alludes to driving out demons. On 28 September, the Russian Orthodox Church celebrates a 4th-century saint called Nikita Besogon (in the Western tradition, Nicetas the Goth). Mr Mikhalkov has registered the icon of Nikita Besogon as a trademark.

Mikhalkov’s TV show used to be aired on Russian state TV, but was cancelled when the host went a little too far, broadcasting anti-vaxxer conspiracy theories and claiming that billionaire Bill Gates might be seeking to implant humanity with microchips under the guise of vaccination. Bang, and the show went off the air. After all, Russia is investing a lot in promoting a COVID-19 vaccine to the world market. Lying is fine, spreading conspiracy theories – no problem… but hurting potential profits – that’s where the Kremlin draws the line.

The Saviour

Undeterred, Mikhalkov has continued with his show, now on YouTube, and its broadcast of 12 September was devoted to the situation in Belarus. Mr Mikhalkov’s position is staunchly pro-Lukashenka.

He saved the country. He really saved the country… He came, powerful, young, charismatic. I’ve seen Belarus and its villages with paved streets, with functioning electric light everywhere… A wonderful impression of enormous well-being…

Mikhalkov shows pictures of nice, tidy villages, hinting that only Alyaksandr Lukashenka is able to furnish the Belarusian countryside with paved streets and electricity. How can anyone be unhappy if there is electricity? Mikhalkov suggests that not many Belarusians are protesting, despite pictures of large crowds:

Look at those pictures. An ocean of people! But look, look at the distances between the people in the picture! It is impossible that such a crowd could have such distances between people’s legs. There should be no space at all. First row, second, third… This is obviously computer graphics. And here is an even more clear example: Independence Square in Minsk. This is how it looks in “peacetime”. And look here how full it is during a demonstration: See how even the glass cupola of the fountain is covered with people. This is impossible! It’s fake, fake! “Good enough, have it like that! The people will stomach it!”

The Sinner

Mikhalkov pedagogically demonstrates the doctored pictures, and he is not lying. The picture from Independence Square in Minsk is certainly doctored. And with a visible logotype from the opposition NEXTA channel. Fake! Fake!

The only issue: the NEXTA Telegram channel did not broadcast the fake picture; the Mikhalkov show did. And not very professionally, one must say. The forgery was revealed by Russian journalists comparing the original picture published by NEXTA and the one presented – quite correctly – as doctored by Oscar-winner Mikhalkov.

Fiction in art can be something beautiful, and fiction can convey truth. But, no Oscars will ever be won for lies and forgeries.

Read more

Pro-Kremlin Media Ecosystem

Russia’s attempts to disinform and manipulate in the information space are a global operation. It is an ecosystem of state-funded global messaging where regime representatives speak in unison with the media, organisations, offline and online proxies, and even the Orthodox church. It is an elaborate system using a wide array of techniques, tactics, and procedures, and speaking in dozens of languages – all with the aim of sowing discord, manipulating audiences, and undermining democracy.

EUvsDisinfo has been tracking Russia’s disinformation for years. We have become better at detecting and responding to manipulation attempts, and there is now a robust body of evidence of disinformation and manipulation. Nevertheless, Russia keeps attempting to manipulate and sow chaos, and other actors follow suit or, as in the case of China, develop their own playbook of information manipulation and interference, including disinformation.

The ecosystem consists of five main pillars:

  1. official government communications;
  2. state-funded global messaging;
  3. the cultivation of proxy sources;
  4. the weaponisation of social media;
  5. cyber-enabled information manipulation.

The ecosystem reflects both the sources of information manipulation and disinformation and the tactics that these channels use.

Source: Pillars of Russia's Disinformation and Propaganda Ecosystem, GEC

RT and Sputnik

The main instruments bringing the Kremlin’s disinformation to audiences outside of Russia are RT (available in over 100 countries plus online) and Sputnik (a ‘news’ website in over 30 languages). Both outlets are state-funded and state-directed. With an annual budget of hundreds of millions of dollars, RT and Sputnik’s basic role is to spread disinformation and propaganda narratives via their own channels, websites, and multiple social media accounts (now blocked in the EU due to sanctions connected to Russia’s aggression against Ukraine).

RT and Sputnik also interact with other pillars of the ecosystem. They amplify content from Kremlin and Kremlin-aligned proxy sites, exploit social media to reach as many audiences as possible, and promote cyber-enabled disinformation.

Both outlets attempt to equate themselves with major independent and professional international media outlets. They have been trying to increase both their reach and their credibility that way. This is also why they portray any criticism towards them as either Russophobia or a violation of media freedom. The same goes for the numerous penalties and EU sanctions, which RT tried, and failed, to fight in the European Court of Justice.

Moreover, these outlets do not have – and do not seek – any editorial independence, and are instructed what to report on and how by the Kremlin.

In reality, RT and Sputnik’s organisational set-ups and goals are fundamentally different from independent media. RT was included in an official list of core organisations of strategic importance for Russia. RT’s own editor-in-chief, Margarita Simonyan, defines the mission of the outlet in military terms, publicly equating the need for RT with the need for a Defence Ministry. Simonyan also made clear that RT’s mission is to serve the Russian state as an ‘information weapon’ in times of conflict.

Spreading its tentacles

The Kremlin’s tentacles in the information space go way beyond RT and Sputnik. RT is affiliated with Rossiya Segodnya through Margarita Simonyan, the editor-in-chief of both RT and Rossiya Segodnya. Moreover, RT’s parent company, TV-Novosti, was founded by RIA Novosti, and RIA Novosti’s founder’s rights were transferred to Rossiya Segodnya via a presidential executive order in 2013. By the way, the head of Rossiya Segodnya, Dmitry Kiselyov, was sanctioned by the EU back in 2014 for his role as a ‘central figure of the government propaganda supporting the deployment of Russian forces in Ukraine’.

Public records show that some employees work for both RT and Rossiya Segodnya despite the two organisations claiming to be separate. In some cases, staff working for Rossiya Segodnya have worked for other Kremlin-affiliated outlets at the same time.

This is only the tip of the iceberg. There is a well-documented relationship between RT and other pillars in the Russian disinformation ecosystem – a collection of official, proxy, and unattributed communication channels and platforms that Russia uses to create and amplify narratives. These include, among others:

… and many, many, many more.

A special place on this list is reserved for outlets connected to and directed by the Belarusian regime, which now act in coordination with the Russian ecosystem (examples here, here and here).

Image from Clint Watts

Russia’s ecosystem of disinformation and information manipulation is about shouting disinformation and propaganda from the rooftops and spreading disinformation as widely as possible using different tactics, techniques, and procedures.

The Kremlin’s weapons of deception: 7 things you need to know about RT and Sputnik


The European Union has adopted sanctions against the Kremlin's weapons of deception: Sputnik and RT. Their broadcasting activities were urgently suspended in the EU, until the Kremlin’s aggression against Ukraine is put to an end and until Russia ceases to conduct disinformation and information manipulation activities against the EU and its Member States.

These are exceptional, targeted and temporary measures, taken in a very specific, unprecedented context – Russia’s military aggression against Ukraine.

These measures are not “censorship” or a “general ban”, and they strike a careful balance: they respect fundamental rights and only restrict them in a proportionate manner as part of the legally based sanctions regime. This is part of the EU’s reaction to a clear threat to European peace and security.

The EU sanctions do not prevent the staff of these outlets from carrying out activities other than broadcasting. Notably though, following Russia’s invasion of Ukraine, a number of RT employees have resigned throughout the world.

The EU sanctions against RT and Sputnik result in the temporary suspension of broadcasting or emission rights to disarm the Kremlin’s instruments of information war at a time when it is waging an actual, brutal war against Ukraine.

RT and Sputnik may look and sound like international media outlets, but they are not. They function with one overarching goal: to facilitate and support the Kremlin’s policies abroad with means of disinformation and information manipulation. RT and Sputnik and their social media offshoots aim to exacerbate fragmentation and polarisation and undermine democracies, obfuscate facts of the Kremlin’s violations of international law, and manufacture support for Putin’s war.

7 things you need to know about RT & Sputnik:

1. RT and Sputnik are not media organizations, but the Kremlin’s weapons of deception.

RT has defined its mission in military terms. Its editor-in-chief, Margarita Simonyan, has publicly equated RT’s importance with Russia’s need for a Defence Ministry. According to her, RT is capable of “conducting information war against the whole Western world,” using “the information weapon”. RT’s strategic aim is to “conquer” and to “grow an audience” in order to make use of access to this audience in “critical moments”.

RT is included in an official list of core organizations of strategic importance for Russia. Sputnik was created by a presidential decree, with the aim to “report on the state policy of Russia abroad.” RT receives billions of roubles from the Russian state every year. For example, almost 23 billion roubles (around €325 million at the time) were designated for RT in the draft Russian federal budget for 2020. Russia’s overall budget for state media was nearly €1.3 billion in 2021.

2. RT and Sputnik are not objective, nor do they strive to be.

Independent journalistic investigations reveal that the chief editor of RT and Sputnik, Margarita Simonyan, plays a central role in the media control network, which instructs Russian state-controlled media outlets on what should and should not be covered. She has admitted having a direct secure line to the Kremlin via a “yellow phone.”

Simonyan has expressed strong support for limiting access to information in Russia, calling for the prohibition of foreign social media networks and welcoming a crackdown on Belarusian journalists by Lukashenka’s regime.

The head of “Rossiya Segodnya”, Dmitry Kiselyov, who was sanctioned by the EU in 2014, has told his employees that “the period of impartial journalism is over,” and that “objectivity is a myth.”

3. RT employs people with a background in journalism, but does NOT follow international journalistic standards.

RT has been serving the Kremlin’s agenda in the West under the guise of “alternative” journalism, aimed at counteracting the alleged bias of “mainstream media.” It has relied on appearances of Western politicians, academics, journalists, and other influential public personalities to boost its credibility, and uses the op-ed format prolifically to disguise disinformation as an opinion.

But appearances are exactly that – RT cannot disguise its fundamental lack of public accountability, transparency and editorial independence. A far cry from international media standards that international public broadcasters adhere to, RT is opaque about its budget, supervisory structure, and oversight mechanisms.

4. RT and Sputnik are a part of a larger disinformation ecosystem.

There is a well-documented relationship between RT and Sputnik and other pillars in the Russian disinformation ecosystem – a collection of official, proxy, and unattributed communication channels and platforms that the Kremlin uses to create and amplify misleading and manipulative narratives.

RT and Sputnik facilitate and engage in cyber-facilitated influence operations, including those that have been attributed to the Russian Military Intelligence GRU.

5. The real impact of RT lies in online and social media.

RT is best understood not as a TV channel with social media presence, but as a social media operation with professional TV capabilities. A 2018 study by UK media regulator Ofcom ranked RT's popularity second to last "among adults who use TV for news," with 2 percent of the total audience share.

The reach of RT is far greater on social media. At the beginning of 2020, RT boasted 10 billion views across its multiple YouTube channels. Yet RT’s mass appeal on social media has been contested by international media and Russian activists. Investigations revealed that RT has been artificially inflating online viewership figures to boost its notoriety and relevance as a media outlet.

RT has actively tried to occupy the information space within the EU with a huge volume of articles on issues of key importance to the Kremlin, like the obfuscation of facts about the poisoning of Alexei Navalny. RT’s use of sensationalist, misleading headlines to boost social media engagement combined with lax copyright policies created a dangerous ecosystem for RT content to be easily amplified by proxy websites – either unwittingly or purposefully.

Throughout the years, RT has created multiple spin-offs tailored specifically for social media propagation, where entertainment content comes spiced up with the “Russian perspective”. Hidden behind the smokescreen of multiple offshoots, RT tailors and targets messages to diverse audiences. RT and its satellite outlets have also gone to great lengths to disguise their connections with each other and with the Russian state.

6. RT has run into trouble before.

The EU sanctions against RT and Sputnik are exceptional, targeted and temporary measures, taken in a very specific, unprecedented context – Russia’s military aggression against Ukraine – but the Kremlin’s mouthpieces have run into trouble before.

In Ukraine, RT has been banned since 2014, and it has never managed to obtain a broadcasting licence in Germany. In 2019, the German Association of Journalists warned the state media authorities that “Russia Today is not an information medium for us, but a propaganda instrument of the Kremlin.” RT DE tried to sidestep Germany’s rules and obtain a broadcasting licence in Luxembourg, but the regulators in Luxembourg also rejected the request. After RT DE tried to go on air in Germany at the end of 2021 without a valid broadcasting licence, its broadcast was suspended by the German Commission on Licensing and Supervision.

RT has been banned in Latvia, Lithuania and Estonia. In the US, RT had to register as a foreign agent. RT’s output has been found “materially misleading” by UK media regulator Ofcom, which in 2019 imposed a fine of £200,000. In February 2022, Ofcom launched 15 new investigations into the impartiality of RT’s news channel programming. The French broadcasting regulator has also issued warnings to RT.

7. RT and Sputnik have been instrumental in bringing forward and supporting Russia’s military aggression against Ukraine.

RT and Sputnik continue to cultivate and amplify disinformation narratives denigrating Ukraine, in attempts to justify and mobilise support for Putin’s war. And they have been doing it for years – see some examples from 2016, 2017, 2018, 2019, 2020, 2021, 2022...

In 2021, RT’s editor-in-chief Margarita Simonyan attended a conference in Donetsk, where she called on “Mother Russia to bring Donbas home” – literally calling for the violation of Ukraine’s territorial integrity.

Just weeks before Putin’s orders to start the war, RT riled up its audiences by stating that “Ukrainians wanted Russian blood.”

As Russian missiles fall on Ukrainian cities, both RT and Sputnik continue to disinform and to obfuscate the true nature and extent of Putin’s war against Ukraine.

Read more

Chief Editor: RT is Like "a Defence Ministry"


The chief editor of RT (Russia Today), Margarita Simonyan, cannot be blamed for lack of openness about the nature of the outlet whose output she manages on behalf of the Russian government. In her own words, RT is needed "for about the same reason as why the country needs a Defense Ministry." RT is capable of "conducting information war against the whole Western world," using "the information weapon," Simonyan has explained. According to Simonyan, RT's strategic aim is to "conquer" and to "grow an audience" in order to make use of access to this audience in "critical moments".

Ofcom: RT is unfair and biased

Simonyan's statements were made available in English translation in a recent article from the Atlantic Council's Digital Forensic Research Lab. DFRLab identified two interviews Simonyan gave to the Russian daily Kommersant in 2012 and to the Russian online news portal Lenta in 2013. The chief editor's overall message – that RT's mission is to use information and communication for purposes traditionally handled by military forces – echoes statements made with a similar degree of transparency by high-ranking political and military leaders in Russia.

DFRLab's article also analyses the claims made by Simonyan and other representatives of Russian state media that RT is supposedly no better and no worse than other media outlets, such as the BBC. In its analysis, DFRLab includes a series of rulings from the telecoms regulator in the UK, Ofcom, which underline that RT systematically fails to meet the minimal standards for fair and unbiased reporting, thereby jeopardizing the privilege of calling its work journalism.

RT's chief editor Margarita Simonyan with President Putin at the celebration of RT's 10th anniversary in December 2015.

RT weaponised information before the conflict in Ukraine

It is worth noting that the two interviews with Simonyan were published in April 2012 and March 2013, i.e. before Russian state media began to escalate their messaging around the conflict in Ukraine. This shows that the idea of RT and similar outlets as the government's weapons cannot be seen as merely a reactive Russian response to the perception that the country's interests were being challenged in Ukraine; what Simonyan says about weaponising information and communication confirms that this proactive and strategic approach was in place well before the Ukrainian conflict began.

RT launched a channel in France in addition to its English, Spanish and Arabic channels. The fact that Ms Simonyan is close to the Russian president was confirmed on Friday when her name appeared on the list of 259 "trusted persons" registered with Russia's Central Election Committee as supporters of Vladimir Putin in his bid for a fourth term as Russia's president, as reported by Vedomosti.

Follow this link to read DFRLab's article in full.

Further reading:

Seven things you should know about RT and Sputnik

Figure of the Week: 20 Million

Inside RT’s world of alternative news

KT – Kremlin Today

RT goes undercover as In The Now

Honest about lying

The Big Birthday Burger Lie

(Images: Wikimedia Commons)

Read more

A Helping Hand: Pro-Kremlin Media Defend China’s Human Rights Record in Xinjiang


Pro-Kremlin outlets reproduce and amplify Beijing’s narratives denying human rights violations against the Uyghur population.

“Cherished dreams can only come true through honest work! […] the people of Xinjiang and all hardworking people deserve the respect of the whole world”, trumpeted Rossiyskaya Gazeta shortly after International Labour Day. The newspaper – an official outlet of the Russian government – used the occasion to declare that Western allegations of forced labour in Xinjiang were “absurd” and aimed at undermining the development of the region.

This was just the latest example of pro-Kremlin media defending China’s human rights record amid growing global concerns over the repression of the Uyghur population in China’s Xinjiang province.

Earlier this year, an op-ed in Russian state-owned RIA Novosti news agency claimed that the province was a battlefield of “global lie campaigns” against China aimed at discrediting the 2022 Beijing Winter Olympics. In the past, pro-Kremlin media have also falsely alleged that reports denouncing the repression of Uyghurs in China were exaggerated and a part of “propaganda”, and that “fakes” about slave labour in Xinjiang were refuted by the Uyghurs themselves.

Such messages downplaying human rights violations in China or dismissing them as outright Western propaganda follow the official narratives of the Chinese authorities, aiming to give them more prominence and a veil of international legitimacy.


Pro-Kremlin adherence to the China story

Throughout the years, pro-Kremlin disinformation has been portraying universal human rights as a “Western Trojan Horse”, a tool allegedly used to distract the public from internal problems in the West and eliminate Russia as a geopolitical actor. Such claims have found a sympathetic ear: earlier this year, Russia and China released a joint statement depicting their common vision of the world order and rejecting the West’s alleged use of human rights as an excuse to interfere in other countries’ internal affairs – which has been a recurring argument for Beijing.

To defend the repression of the Uyghurs and attack the West over human rights sanctions, all pro-Kremlin media have to do is relay the official position of Beijing. Pro-Kremlin outlets operating in multiple languages amplify the narratives of Chinese officials, who deny any allegation of genocide or forced labour in Xinjiang, regularly accuse critical voices of being “anti-China”, and refer to conspiracy theories blaming the CIA for allegedly stirring unrest in the region. In the most recent example, the Russian state-controlled Rossiya 24 TV channel dispatched its own journalist to Xinjiang to tour factories and schools in the region and assert that the local people “harmoniously combine prayer, communist spirit and national motives”. Such “findings” squarely contradict documentary evidence, including China’s own official documents and eyewitness accounts, which illustrate the extent of the crackdown on Uyghurs and other ethnic minorities in Xinjiang.

Both countries’ state-controlled channels are also aligned in their criticism of independent media investigating human rights abuses in Xinjiang. China’s English-language cable news service CGTN has accused BBC of misrepresenting “professional training institutes” in Xinjiang as camps with rampant rape and torture, and claimed that the “BBC and some Western media have become heralds for international anti-China campaigns”. Soon after, Sputnik News followed with an op-ed calling BBC the “British Black-ops Corp” and a “cog in the Western imperialist machine”, and accusing the British media outlet of attempting to destabilise China.

Researchers are a target, too. In March 2021, Chinese state-controlled media announced that companies in the Xinjiang region were suing German academic Adrian Zenz, who has been investigating and exposing human rights abuses in Xinjiang. The academic was also included in the Chinese sanctions list along with a German think tank and a number of European politicians, and he has been systematically denigrated by Chinese media and officials. Russian state-controlled media and their proxies pitched in too, republishing the content of Chinese state-controlled media and dismissing Zenz – a leading scholar and senior fellow of the US-based Victims of Communism Memorial Foundation – as a “far-right fundamentalist”.


What about… “whataboutism”?

In the past, China had mostly refrained from attacking Europe’s human rights track record, as this contradicts the golden principle of “non-interference” propagated by Beijing. However, faced with rising international scrutiny over Xinjiang, Chinese state-controlled outlets have turned to “whataboutism”. This is a disinformation tactic perfected by pro-Kremlin media, which attempt to discredit an opponent’s credibility by accusing them of hypocrisy, usually without any evidence.

In March 2021, after the EU sanctioned Chinese officials for human rights violations in Xinjiang, Chinese state-controlled Global Times published an infographic listing, under the EU flag, various EU human rights “misdeeds” ranging from the Holocaust to the gender pay gap in Luxembourg. Shortly after the publication, the infographic was surreptitiously edited to remove references to the EU and the Holocaust.

A screenshot of a tweet shared by Chinese state-controlled outlet "Global Times", 23 March 2021.

Soon after, the French edition of RT published a Chinese government report on human rights violations in the US focused on racial justice protests, and mentioned China’s “standoff with the West” regarding its human rights situation – all the while transmitting Beijing’s arguments of “intimidation” and “hypocrisy” without ever questioning China’s version.

Alignment, coordination, cooperation?

This is not the first instance of Russian and Chinese state-controlled media outlets converging on the same topic and engaging in similar disinformation narratives. Last year, we observed how China’s state-controlled CGTN amplified the pro-Kremlin disinformation trope about US “secret labs”.

Back in 2017, Sputnik signed a cooperation agreement with Global Times (one of the multiple cooperation agreements now linking Russian and Chinese state-controlled outlets), with the purpose of “showing to the international community China and Russia’s respective developments and their concerns and positions on some major international issues”.

The alignment of messaging in state-controlled media does not necessarily indicate systematic coordination of (dis)information efforts. After all, although both Russia and China have engaged in targeted influence operations and disinformation campaigns around COVID-19 – with the underlying claim that autocracies are better placed than democracies to deal with emergencies – both sides have avoided promoting each other’s vaccines (as they are commercial competitors in the vaccine market). Indeed, according to the Mercator Institute for China Studies (MERICS), the cooperation between Chinese and Russian state-controlled media is much less focused on creating a good image of one another than it is on building up the joint threats. It seems that when it comes to human rights protection, both sides have opted for the same convenient target.

Read more

Figure of the Week: 1.3 Billion


The Russian federal budget draft for 2020 is in, and it shows that state-owned media will receive 1.3 billion euros next year – a big jump from the current figure of almost 1 billion.

The single biggest beneficiary is the TV news channel RT (Russia Today), which will receive almost 23 billion roubles, or 325 million euros, according to the draft. Needless to say, RT's funding has been growing for years, and so has the number of cases connected to RT in our disinformation cases database.

Pervyi Kanal will receive 92 million euros, which will cover producing, acquiring and broadcasting content. The subsidy will be very welcome, as the channel’s losses last year reached 96 million euros.

Zvezda (“The Star”) is owned by Russia’s Ministry of Defence and broadcasts television and radio programmes and produces online stories. This successor to the official newspaper of the Soviet armed forces, Krasnaya Zvezda (“Red Star”), will receive 29 million euros.

The Russian Ministry of Finance also proposes to allocate 339 million euros to finance the VGTRK, which is short for the “All-Russia State Television and Radio Broadcasting Company”. Amongst many radio and TV channels, the state media giant owns Rossiya 1.

Another flagship in the media landscape under the Kremlin’s control is the “Rossiya Segodnya” holding which, even if the name translates as “Russia Today”, is not the same as RT. Rossiya Segodnya, whose CEO is the EU-sanctioned Dmitry Kiselyov, includes the news agency RIA Novosti, InoSMI and the multilingual international outlet Sputnik. Their subsidy will amount to 106 million euros.

Finally, TASS, a state media outlet which has survived with its Soviet name, will receive 41 million euros in support of its news agency services.

All in all, as Interfax also notes, the share of media subsidies in Russia’s new draft budget has grown by a dramatic third – a figure which confirms the central place that control over the media narrative holds in the Kremlin’s political priorities.

Read more

Lie, Manipulate, Spread, Change, Spread Again


Sputnik Polska as the tip of the iceberg of a disinformation campaign.

Penetration of the information space by pro-Kremlin actors goes far beyond the use of state-funded media or a troll factory. As shown in a new report by Info Ops Polska, disinformation messages can be spread by multiple actors using a variety of tools, so that at the end of the day their recipients can see neither the original source nor the artificial amplification behind them. The authors of the report take a look at one of the hot-button issues in Poland, frequently exploited by disinformation campaigns: namely, Ukrainian-Polish history and relations. The report traces the road that pro-Kremlin narratives travel across cyberspace.

In the Beginning Was the Website

Poland and Ukraine share a long, rich, and sometimes painful history; the nations have a lot in common. According to various statistics, there are approximately 900,000 to 1.2 million Ukrainian citizens living and working in Poland. The influx began in 2014, driven by the deteriorating situation in eastern Ukraine due to Russia's aggression and by the plentiful opportunities on offer in Poland's economy. Given this background, the issue remains high on the political agenda of both informing and disinforming actors.

Sputnik Polska has been actively pushing disinformation narratives related to this topic. One of them touches upon social and economic issues close to the hearts of many Poles. On the one hand, it tried to push manipulative messages blaming Ukrainians for a reduced standard of living in Poland (which, by the way, has been growing steadily, not decreasing, in recent years). On the other hand, it tried to paint a picture of “Ukrainian slave workers” (debunked numerous times by EUvsDisinfo). The narrative's aims were to portray Poland as a country which doesn't treat people fairly, to exaggerate Polish-Ukrainian tensions, and to show Ukraine as a country without any prospects, forcing people to leave to become “slaves of Poland”. Adding insult to disinformation, Sputnik Polska also covered the deceptive report of the Russian Federal Supervisory Service for the Protection of Human Rights, according to which Ukrainians were allegedly bringing an epidemiological threat to Poland. This narrative spread to the mainstream media as well, and its aim was not only to portray Ukrainians in a negative light, but also to question the EU's visa-free travel regime for Ukraine.

As the Info Ops Polska report points out, Sputnik Polska has also been exploiting difficult issues in Polish-Ukrainian history, trying to evoke fear in its audience by portraying Ukraine as “possessed by fascism and by the cult of Stepan Bandera” or even responsible for a new nuclear world war.

The Pattern of Distribution

However, once a manipulative or disinforming message appears in Sputnik Polska, it marks not the end, but the beginning of an information operation. Different actors and tools are applied to spread the messages as widely as possible and target the groups and individuals who provide the biggest chance of disseminating these messages effectively.

Specifically, content from Sputnik is duplicated (with slight changes in some cases) and disseminated through different websites, the choice of which depends on the particular narrative. The next stage consists of spreading the same narratives via numerous blogs – this is the point where the content is changed so that it reflects the familiar style of a blogger. Nevertheless, the thrust of the pro-Kremlin narratives stays the same. Next comes social media (chosen according to the message and the audience, its preferences, and vulnerabilities). Another type of interference used by pro-Kremlin actors is engaging in discussions, both on internet fora and in the comment sections of different websites. They comment and post links to either the websites (used to disseminate the original Sputnik message) or the blogs. They don't usually link to the original Sputnik Polska article, because hiding the connection to official pro-Kremlin Russian sources gives the message more credibility and makes it seem organic.

The final stage of the information operation is micro-targeting – where pro-Kremlin actors seek out particular small groups or even individuals in hopes of getting them to push their messages through their own channels and thus launder the origins of the messages even more effectively. The targets are identified in accordance with their preferences and vulnerabilities, which pro-Kremlin actors have recognised and analysed in depth. If the operation succeeds, a disinformation message is then spread to the mainstream – in social and traditional media. It also paves the way for sophisticated psychological operations and for many other gears from the Kremlin's toolbox.

The strategy described here is another example of the Kremlin’s broader practice of “information laundering”, already described by several studies in different contexts (see examples by GMF, DFRLab and Avaaz). The idea is to cover the tracks of the original pro-Kremlin message as much as possible in order to give it more legitimacy: by creating burner accounts, using “independent”, seemingly unrelated websites, and changing the content posted by social media accounts.

Apparently, laundering isn’t always about making things clean.

Read more

Disinformation and Philosophy

Other parts of ‘Understand’ focus on narratives, techniques, actors and technological aspects of foreign information manipulation and interference, including disinformation.

However, approaching the matter only from a technological angle ignores that disinformation is an idea. It also ignores that, when practiced, disinformation uses other ideas and is often based on old concepts and metaphors.

Therefore, in our 2021 series “Disinformation and Philosophy”, we explored the historical evolution of disinformation as an idea.

Broadly speaking, an idea is a thought, concept, sensation, or image that is or could be present in the mind. We asked what the greatest thinkers in the history of philosophy would make of disinformation.

Wisdom, Truth and Falsehoods: are we in Plato’s Cave?


Series on Disinformation and Philosophy

Where is disinformation? This probably feels like a somewhat strange question.

Of course, we could point at the websites of Sputnik or other perpetrators of disinformation. However, it feels like we are missing something.

Approaching the matter from a technological angle ignores that disinformation is an idea. It also ignores that disinformation in action uses other ideas too.

Broadly speaking, an idea is an entity (thought, concept, sensation, or image) actually or potentially present to consciousness, to the mind. Therefore, we asked what the greatest thinkers in the history of philosophy would make of disinformation.

This summer, we will publish a series of articles on this question. Of course, we cannot promise any definitive answers, but we hope that by raising better questions, our understanding of disinformation will nevertheless benefit.

Plato: Eros for Ideas

It is a cliché, but the acclaimed philosopher Whitehead famously said: “The safest general characterisation of the European philosophical tradition is that it consists of a series of footnotes to Plato.” Therefore, we assume it is proper to kick off our series with this giant of thought.

Plato, who lived from 429(?) until 347 BC, is, by any measure, one of the most captivating writers in the Western literary tradition. His thoughts, often voiced by Socrates, are recognised as among the deepest, most wide-ranging, and most influential in the history of philosophy.

The core of his thinking is the understanding of reality as having two dimensions: we can perceive the material dimension (which is ever-changing), but this consists of mere derivatives (shadows) of the higher dimension, which is eternal and immaterial and therefore cannot be perceived by our senses. Plato thought this second dimension was more valuable and should be the orientation of human life and activity.

Plato, an aristocratic citizen of Athens, witnessed the city’s turbulent times, and his works reflect the political events and intellectual movements of his time. The questions he raised, however, are so fundamental, and the way he tackled them so provocative and inspirational, that he has influenced philosophers, scientists, artists and ideologists of nearly every period after him.

Disinformation in Athens

Athens was a turbulent place in Plato’s times. Waves of technological automation led to periods of social and political ‘disadjustment’, when quick and fundamental change left institutions unable to ensure social cohesion. Sound familiar?

The technological automation of Athens was writing and reading itself.

Roughly until then, poems, science, drama – everything was mostly preserved in memory. The alphabet provided a simple and uncomplicated means of recording texts and making them accessible to others.

Some have argued that the works of Plato can be interpreted as reacting to an “infodemic” in fifth-century Athens, which was attributable, in turn, to the technological revolution of alphabetic writing.

Plato’s student and philosophical rival Aristotle was nicknamed, by Plato, The Reader for a reason. It was considered special and dangerous (!) that someone acquired knowledge purely from text.

Plato’s solution: stepping outside the Cave of Untruths

Plato was not entirely happy about the rise of readers. He feared that people would follow texts blindly.

In answer to this threat, he came up with something radical: the pure ideas. Behind the scenes, behind the phenomena we perceive, these ideas – metaphysical, semi-religious entities – structure the material world. These absolute truths are somewhere “out there”.

The famous allegory of the cave symbolises this. In short, we humans are all in a cave, and can only see shadows cast by the light of the sun, which is outside the cave and therefore cannot be perceived by us. The truth is behind the facts, behind the shadows, and can only be approached by the exercise of philosophy.

Crudely speaking, you could say Plato invented abstract ideas to emancipate people to engage with text critically – to create a duality between the text and the reader. In more modern words: to promote critical thinking. Perhaps this is also why he presented his ideas in the form of dialogues: to create a confrontation between different perspectives.

Problems of Plato’s logic: birth of the Conspiracy

As said, Plato’s influence is simply immeasurable. Sadly, his reach even extends to the current spreaders of disinformation. Many conspiracies draw from Platonic anti-materialism. A few examples:

  • You think you live in a democracy, but there is some form of agency governing behind the scenes: the global elite, Bill Gates, Satan, the Jews. The list goes on.
  • Empirically, there might be overwhelming evidence Russia was involved in downing flight MH17. However, facts are irrelevant if our abstract thinking tells us differently.

The philosopher Dugin misused Plato’s thinking to delegitimise the work of journalists as not contributing to truth, as we wrote in January.

Also, the idea of eternity, drawing from Plato’s eternal forms, is very powerful in disinformation. The eternal Russia, connecting Peter the Great with Putin, the eternal Russia to which Crimea belongs. Russia fighting its eternal enemies: the US, the West, the (eternal) Nazis.

Is there a Philosopher on Board?

Recently, the European Commission compared the current technological advancements with the invention of the steam engine. Others have compared it with the invention of the printing press. These comparisons underline technological progress, but also contain a warning.

The rise of text created chaos in Athens. Plato tried to give readers tools to think critically. Later in history, the rise of the printed text ignited Kant to come up with a framework for public debate. So the crucial question is: which philosopher will stand up now, in the times of digital text?

Read more

Aristotle: Disinformation, truth and practical wisdom


A Desire to Know

We kicked off our series with Plato. Today, we continue with Aristotle, his student, friend, intellectual rival, and his equal in terms of philosophical influence.

In an age of disinformation, it might be quite reassuring to hear that Aristotle reputedly claimed that humans by nature desire to know. According to him, we humans are, in principle, wired to reject untruths and lies. How did Aristotle come to this conclusion, and what conditions are required for truthful communication?

By any standard, Aristotle is recognised as one of the greatest philosophers. His texts shaped philosophy through Late Antiquity, the Middle Ages and the Renaissance. Even today, they are examined with eager, non-antiquarian attention.

Aristotle, who lived from 384 to 322 BC, was an industrious researcher and writer. He produced a great body of work – by some estimates around two hundred treatises, of which only around thirty-one survive. His thinking spans a wide range of disciplines, from logic, political theory, metaphysics and philosophy of mind, through ethics, aesthetics and rhetoric, and even moves into non-philosophical fields such as empirical biology, where he excelled at detailed descriptions of plants and animals.

One detail of Aristotle’s personal life that historians and filmmakers like to speculate on is that in 343 BC, at the request of Philip, king of Macedon, Aristotle tutored the king’s thirteen-year-old son Alexander, who was later to become Alexander the Great.

Platonic Relationship with Plato

Before Plato and Aristotle, the leading philosophers were the “Sophists”. They promoted Phronesis – practical truth. They taught how to make the stronger case by debating competing arguments. The Sophists saw truth as what a society of equals with diverse claims convinced one another to believe was true. Some would say they were the postmodernists of antiquity.

As discussed in our first article, Plato was heavily concerned about the “disinformation” of his days, which emerged with the rise of the alphabet.

Going against the ideas of the Sophists, Plato argued that in order to engage critically with written language, absolute ideas (Sophia) are necessary. Those ideas emancipate the reader in confrontation with the text. Philosophers can arrive at those ideas through a dialectic method – a process of questioning and testing. An important point that is often overlooked is that gaining insight into these absolute ideas does not lead to possessing the “truth”, but only to being aware of one’s own ignorance of it.

Because of his high ideals and introspective methods, Plato was no fan of democracy. For similar reasons, he did not care for rhetoric. He feared that people without knowledge of the Truth would use manipulation and “base rhetoric” to persuade audiences who were unable to tell the difference.

In a characteristic middle-ground move, Aristotle combined the thinking of Plato and the Sophists. According to Aristotle, rhetoric is the counterpart of dialectic. Both methods of truth-seeking are necessary to solve (political) problems and to know the truth. At the same time, both methods have their pitfalls as far as disinformation is concerned. Not employing abstract ideas might prevent critical engagement with the text, as Plato argued. On the other hand, too much of this medicine might lead to a fissure between empirical and abstract truth. Conspiracy theories are the ultimate example.

Perhaps the practical Aristotle would also have recognised that often the goal of disinformation is not persuasion, but rather compliance or loyalty. This means disinformation employs neither Sophia nor Phronesis, as it is not interested in truth-seeking. Conceptually, pro-Kremlin media operate as a “tribune”, characterised by top-down communication, loyalty to hierarchy and just two types of expression: praise or condemnation.

Classical rhetoric and disinformation

How would Aristotle try to make sense of disinformation in our time? He would probably have a practical perspective.

Aristotle had a lot to say about the practice of communication. His book on rhetoric even has its own entry in the Stanford Encyclopedia of Philosophy. His ideas are still relevant.

Rhetoric, according to Aristotle, ‘is the power to see, in each case, the possible ways to persuade’. Aristotle loved to draw distinctions: different contexts require different techniques. He famously said that speakers have at their disposal three main avenues of persuasion: ethos, pathos and logos. Ethos is about the character of the speaker. Pathos is about the emotional constitution of the audience. Logos is about the argument of the speech itself. This distinction implicitly breaks communication up into three dimensions: speaker, message and audience. Every communicator knows this is key.

Say it like you mean it

Pathos is about understanding – and using – the emotional make-up of the audience. It is about both invoking a particular emotion in the audience as well as evoking emotions from the audience. For example, a growing body of research confirms that the presence of moral-emotional language in political messages substantially increases their diffusion within (and less so between) ideological group boundaries. Just like romantic drama, feelings sell.

Pathos works for communication in general, and certainly for disinformation. A famous past example is the 1998 report by Dr. Wakefield, who claimed to have found a link between autism and vaccines. This was not true, but it created sufficient fear to fuel the anti-vax movement, eventually leading to the re-emergence of diseases like measles. The combination of Covid-19 and modern communications technology proved an excellent opportunity for disinformation to invoke and evoke fear at unprecedented levels.

The logic of the Lie

Logos is about the argument of a statement, separate from the speaker or audience. When applying this to disinformation one immediately recognises that some fakes are evidently… fake.

Think of: Bill Gates is actually (under the control of) Satan. Utter nonsense.

In other cases, it is more complicated.

For example, successful campaigns often “shield a forgery under the armour of a larger truth”, explains disinfo scholar Thomas Rid. His acclaimed book, Active Measures, showcases a spectacular example from World War II: the forged Tanaka Memorial. This document (allegedly from 1927) was instrumental in convincing many states that Japan had elaborated a military strategy to achieve world domination. It was not authentic though.

Why was this false narrative so effective? Because it was rooted in Japan’s actual assertive foreign policy of that time.

How do you apply this larger-truth method in pandemic times? Last year, we saw the pro-Kremlin media take part of the truth (the AstraZeneca vaccine was developed using a chimpanzee viral vector) and rebrand it as “the monkey vaccine”, in order to undermine the credibility of western-produced vaccines. This enabled the pro-Kremlin media to suggest that the British vaccine would turn people into monkeys, and also to tap into criticism from animal rights supporters and anti-vaxxers.

Eroding all credibility

Ethos is about the character of the speaker – their credibility. This dimension is fundamental to understanding disinformation.

Why are fake accounts created for spreading disinformation? It is not just to create a greater volume of disinformation spreaders. Foremost, these accounts aim to create a group of seemingly like-minded people – people who feel relatable, people whose opinions you can trust. For just one recent example, see research from Graphika on Russian actors posing as far-right Americans.

An important goal of disinformation is also hurting the ethos of other communicators, contributing to distrust in society.

Just a few examples:

This is why we see many attempts to undermine Navalny’s standing.

This is why we see claims Joe Biden is senile.

This is why we see the many different – and mutually exclusive – claims around Raman Pratasevich and the forced landing of his Ryanair flight in Minsk.

Another example is what Facebook calls ‘perception hacking’. Threat actors seek to capitalise on the public’s fear of influence operations (IOs) to create the false perception of widespread manipulation of electoral systems, even without evidence.

Disinformation aims to flood the information space with lies and manipulations. A philosopher from the fourth century BC shows us how pathos is employed for efficient dissemination, logos to conceal the lie within the truth, and ethos to damage people’s ability to trust one another.





Read more

David Hume: Disinformation, the Slave of the Passions


A Philosopher’s Philosopher

David Hume (1711–1776) was a Scottish philosopher and public intellectual. He is a philosopher’s philosopher. A few years back, thousands of philosophers selected him as the one with whom they most identified, ahead of Plato, Spinoza and Wittgenstein. His popularity is probably based on his highly influential philosophical empiricism and scepticism. Contra Plato, Hume denied the existence of innate ideas, claiming that human knowledge derives only from experience. This underscored the importance of the psychological basis of human nature, which Hume studied extensively. Perpetrators of disinformation have profited from his insights, but so can we.

Classical ideas smashed by science

Hume swept aside Aristotle and especially Plato. It is not that Hume disliked antiquity, but something major had happened that needed integration into the philosophical equation.


Firstly, the Polish astronomer Copernicus formulated a model of the universe that placed the sun rather than the Earth at its centre. Secondly, Galileo discovered evidence to support Copernicus' heliocentric theory, as he observed four moons in orbit around Jupiter. Thirdly, Isaac Newton contributed to this paradigm shift, as he discovered the laws of gravity and motion, and invented calculus.

As these ideas contradicted much of the thinking of Aristotle and Plato, philosophy needed a new start. Thinkers from Hobbes to Descartes had aimed to integrate the new ideas of the natural sciences into philosophy. This ignited the Enlightenment, a movement centred on the idea that reason is the primary source of authority and legitimacy. New ideals such as liberty, progress, tolerance, constitutional government, and the separation of church and state emerged.

Three sceptical views

David Hume was one of the Enlightenment’s leading philosophers.

Newton was Hume’s inspiration, and Hume followed the Newtonian maxim “Hypotheses non fingo” – crudely, “I do not do hypotheses”. He believed any scientific law must be established by observation and experiment; there is no space for traditional a priori metaphysics.

This approach led to three sceptical views that are widely known, amongst philosophers at least.

First, our confidence in cause and effect, on which all thinking about matters of fact rests, is not justified by either observation or by logical deduction. Actually, we only ever see one thing following another: we never perceive any power that makes one thing necessitate an outcome.

Second, Hume is known for his arguments against dogmatic aspects of religion, although he never promoted atheism.

Slave of the Passions

The third view is what makes Hume really stand out. Although he lived in the ‘Age of Reason’, he famously proclaimed that "reason is, and ought only to be, the slave of the passions."

Hume does not say, as some interpret him, that rationality is somehow distorted by the passions. Rather, he points out the emptiness of rationality. Rationality is nothing more than a tool to optimise a certain set of preferences, which are in the end based on emotions. He famously explained this emptiness with “Tis not contrary to reason to prefer the destruction of the whole world to the scratching of my finger.”

According to Hume, reason by itself provides no motivation to act, let alone principles that are able to carry morality. Consequently, to call someone “good” means we have a basic fellow-feeling, driving a sympathetic response to his suffering, or pleasure at the thought of his success. Meet anyone with different opinions? He is not irrational, but heartless.

As the Newton of morality, Hume tried to explain how perceptions of the mind come and go and merge into complex perceptions leading to human thought, belief, feeling and action. Ultimately, this view inspired many psychologists, including Freud.

Disinformation exploiting the passions

Hume experienced disinformation. In his own time, falsehoods clustered especially around dynastic succession. A big lie that concerned Hume was about a (fictitious) Jesuit conspiracy to assassinate the British King Charles II. This was part of a campaign by the Whig political party to exclude Charles’s Catholic brother from succession to the throne.

These days, it is pretty clear disinformation is also a slave of the passions. An MIT study found that false statements spread much more quickly and widely than true ones, probably because they’re more entertaining in that they target users’ emotions. Remarkably, Hume, in A Treatise of Human Nature (1739) wrote of what today we call “emotional contagion”, describing how affections pass from one to another.

As for our intellect, Hume demonstrated that what we call ‘rational’ is often hubris. As we wrote earlier, in the 20th century, the psychologists Daniel Kahneman and Amos Tversky showed that humans are wired to jump to conclusions (too) quickly. One example is confirmation bias: the tendency to search for, interpret, favour, and recall information in a way that confirms one’s prior beliefs or values. In 2019, we wrote about how confirmation bias can help make disinformation credible.

How should we deal with disinformation – and ourselves? Hume warned that we should always be suspicious of the assumptions favoured by our passions. The stronger we feel about something, the more likely our feelings are to influence and bias our reasoning. On a personal level, this means we need to do more than consult a variety of sources, since we can always find reasons to dismiss evidence from the “other” side. At the very least, before critical thinking can emerge, we need introspection and knowledge of our own emotional positions.

On a macro level, our democracies need not only media plurality, but perhaps even theatre, comedians, music, and spirituality to help us get to know and talk about our own deepest feelings and moral presumptions, so we can hopefully prevent disinformation from exploiting them.

Read more

Kant, the philosophy of Autonomy, Truth and Peace


Kant: question more?

It is fascinating that disinformation often flirts with the idea of thinking critically for oneself. The motto of the disinformation outlet RT is “Question more”. Also, think of the YouTube algorithm, which recommends videos to watch, including those relaying disinformation and misinformation, while still letting viewers believe they are independently selecting the videos themselves. Doing research, even. The roots of the idea that one should investigate the truth oneself, independent of any external authority, go back to the German philosopher Immanuel Kant. Human autonomy is central to his philosophy.

Still, if Kant were alive today, he would probably be wary of disinformation. According to him, the ability to establish the truth, or shared notions underpinning it, is essential for enduring peace. In other words, disinformation poses a grave risk; the stakes are high. Kant also absolutely condemned lying. How did he reach these conclusions?

What can I know? What should I do? What may I hope?

Kant contended that all philosophical doctrines, including his own, answered three questions: 1) What can I know? 2) What should I do? 3) What may I hope? Kant’s answers combined modern empirical science (for example Newton’s) with a certain trust in rationality beyond experience, allowing him to become the central figure in modern philosophy. His greatness lay in his ability to find an answer that was on the one hand systematic, rational and integral, yet, on the other, still left room for a certain degree of mystery. At the same time, his conclusions have wide-ranging consequences: from metaphysics to ethics to politics, even to disinformation.

If we want to understand Kant, it is important to note that he responded to Hume and his scepticism, which we discussed in an earlier article. This largely determined how Kant approached truth and, consequently, everything else. In short, Hume was impressed by the scientific breakthroughs of Copernicus and Newton. He therefore argued that we should practise philosophy following their empirical methods. At the same time, Hume believed that the Enlightenment had placed too much value on reason. Instead, in a manner sometimes compared with Buddhism, we need to focus on what we perceive, including our emotions.

Although Kant appreciated Hume’s work, he politely declined to accept his conclusions. Instead, Kant countered them in perhaps his most famous work, the Critique of Pure Reason. This book argues that “synthetic a priori knowledge” is possible: a certain rationality that precedes experience. For example, in theory, one could say “a square has four sides” without having ever seen a square. To put it crudely, Kant pointed out that we all shape our experience of things through the filter of our mind.

Kant, not modestly perhaps, compared his ideas to the Copernican revolution, in the sense that the objects of the senses necessarily conform to our spatial and temporal forms of understanding. This means we can have a priori cognition of the objects of the senses.

Image: Copernicus's heliocentric model (Source: Wikimedia Commons)

Enlightenment = Critical Thinking = Freedom

Kant wanted to find a philosophical base for human autonomy. However, paradoxically, he understood autonomy in a rather restricted way. For Kant, autonomy does not mean something like ‘do whatever you like’, but is more along the lines of being the authority of one’s own actions. Because of his notion that our thinking shapes how we experience the world, he developed his method of “critical philosophy” to underpin freedom. To become free, we need to understand both rationality and our thinking. And adjust accordingly.

Kant’s Enlightenment philosophy is known as ‘critical thinking’. In his famous essay What is Enlightenment?, Kant defines enlightenment as “humankind’s release from its self-incurred immaturity; immaturity is the inability to use one’s own understanding without the guidance of another.”

Kant began to critically examine our faculties of knowledge. He asked questions like: What can we know? What conditions make our knowledge possible? And where are the limits of our knowledge? All claims must then be critically investigated. They are only credible if they can be substantiated with arguments. Because all people should be able to recognise truth independently, Kant’s approach is often connected with freedom and equality for all.

Although people should be able to recognise truth independently, Kant’s “Copernican revolution” means that direct access to truth is no longer an option. To discover truth, therefore, we need an intersubjective method. To put it simply: we need public reason to compensate for our blind spots.

Very roughly, this critical thinking on a macro level is what Kant calls public reason. Without it, the state cannot be legitimate. We will explain this briefly. According to Kant, political authority requires the establishment of political institutions in the civil state. Rational individuals even have a moral obligation to do this. The state however derives its legitimacy from “any public law’s conformity with right”. This means, hypothetically, every law could have the consent of any rational person.

Consequently, the head of state, to be legitimate, must obey public reason and create such legislation. Obeying public reason presupposes facilitating the free flow of information, so Kant strongly defends the ‘freedom of the pen’ and the ability to make known one’s opinions. As free speech is recognised as the ultimate safeguard of the people’s rights, paradoxically, these rights can also be seen as an argument to limit free speech. Without a legitimate state, which requires public reason, there can be no protection of rights. So speech intended to overturn the legitimate state can be limited. Kant was thinking of revolutionary movements, but we can easily apply this reasoning to disinformation, which aims to stimulate polarisation and, ultimately, to destroy democracies.

Image: Kant lecturing (Source: Wikimedia Commons)

Never, Never, Never Lie!

One of the reasons why Kant is so interesting from a disinformation perspective is that he examined the moral prohibitions against intentional falsehoods: lies. Kant famously argued that all lies are harmful because they undermine the dignity of others. Lies prevent people from acting freely and rationally. When someone lies, he interferes with his audience’s right to receive correct information. Lies also distort the ability to make informed decisions. Kant goes further and argues that lies cause broader harm by undermining a speaker’s credibility, which, in turn, causes people to distrust each other’s contentions. Kant even goes so far as to say that lying is immoral under all conditions. In one famous example, Kant argues this is the case even if a murderer knocks on your door, looking for a man sheltering in your house. No-one is allowed to fabricate a story, even if it means saving a life!

Kant aimed to underpin human autonomy philosophically and, more practically, described the social conditions needed for freedom. According to Kant, this calls for a legitimate state, which obeys and protects public reason. It could mean that Kant would support policies to establish a diverse media landscape, and defend measures to curb antidemocratic movements, including measures targeting the spreaders of disinformation. Kant staunchly attacks lying, and thus disinformation: it is not just immoral because it manipulates others and undermines their dignity, it also erodes the foundations of lasting peace. When it comes to disinformation, Kant’s thinking is both a very thorough and a very serious warning.

In 1945, the Soviet Union annexed the old Prussian town of Königsberg, today’s Kaliningrad. It is ironic that Kant’s grave can be found in Russia, even as the Kremlin attempts to kill all forms of critical thinking.

Image: Monument for Immanuel Kant in Kaliningrad (Source: Wikimedia Commons)

Read more

Mill and the Virtuous Circle of Confidence through Self-Correction 


The reasoning of John Stuart Mill is often invoked in debates about disinformation, both by people wary of state intervention and by those who seek to protect the free flow of information. It makes sense to involve Mill, one of the great defenders of liberty in the canon of philosophy. The free flow of information is central to his scientific and practical philosophy. Mill explored the consequences of an outright empiricist perspective, but combined it with the Romantic movement of the nineteenth-century poets, fresh and new at the time, whose poetry had reignited his life. Mill was raised strictly, perhaps even tyrannically, by his father, who wanted to create a great mind. His book On Liberty, published in 1859, remains the definitive case for the liberal idea that individual freedom is the best path to a happy and just society. Its implications for disinformation are less clear-cut than many would expect.

Mill and Mill

John Stuart Mill was born in Pentonville, England, in 1806 and went on to become one of the fathers of modern-day liberalism. We know a great deal about his youth because he documented it extensively in an autobiography. John was the son of James Mill, who was, together with Jeremy Bentham, an influential proponent of the then-emerging movement of utilitarianism. Roughly, utilitarians hold that the morally right action is the one that produces the most good. They equated the good with pleasure, so, like Epicurus, they were hedonists about value.

Although James Mill propagated pleasure, he gave John an exceptionally strict upbringing. He explicitly aimed to create a genius who could further the cause of utilitarianism. His father did not allow him to play with other children. He was taught Greek from the age of three. By the age of eight, he had read a great deal of the Greek classics and knew history, arithmetic, physics and astronomy. Later, Mill suffered long periods of deep sadness and even contemplated suicide. As his recovery was inspired by the poetry of the Romantics, Mill realised that the philosophy of the Enlightenment contained only “one side of the truth”. From that moment, he aimed to bring together the culture of feeling and that of a just society.

Truth does not exist outside criticism

In contrast to Kant, Mill holds that a priori knowledge of objective facts is not possible. The mind has no exalted place in the order of things; instead, it is part of nature. Therefore, Mill argues that knowledge can be obtained only by empirical observation, and by the reasoning which takes place on the basis of such observations. For this reason, Mill is considered a naturalist, whose ideas on thinking are closer to Hume than to Kant or Plato.

Unlike Hume’s, however, Mill’s thinking has a certain optimism. Mill characterises the history of science as the compounding of our knowledge by inductive reason, but also as the growth of our knowledge of inductive reason itself. Knowledge compounds, and it does so increasingly efficiently. As man learns more about the universe, induction becomes more and more ingrained in our being. Gradually, man becomes more self-critical and systematic. Nevertheless, because induction always remains based on empirical observation, new evidence may always mean we need to correct our thinking. Truth is provisional, as we might realise tomorrow that we have been wrong all along!

This principled willingness to admit mistakes and correct views underpins the strength of modernity. If a certain position has survived many (intellectual) attacks, it must be solid, because we are not attached to it in principle. Paradoxically, one is more likely to be right precisely because one is insecure!

Freedom of speech - smart, beneficial and just

This provisional element links Mill’s theoretical philosophy with his practical philosophy. This openness is in fact the cornerstone of Mill’s political views, which are voiced in On Liberty. Central to the book is the so-called “harm principle”. It holds that the actions of individuals should only be limited to prevent harm to other individuals:

"The only purpose for which power can be rightfully exercised over any member of a civilised community, against his will, is to prevent harm to others."

On Liberty also contains a strong defence of freedom of speech. Mill offers essentially three arguments for freedom and against censorship. Firstly, an epistemological argument: one can never be completely sure a silenced individual might not be (partly) right. It is also useful to be confronted with erroneous thinking, as it forces one to re-examine one’s views and prevents them from becoming dogmas. Moreover, people are more likely to drop mistaken opinions when engaged in open conversation. Secondly, a utilitarian argument: according to Mill, it is beneficial for society to have competing ideas, as this will eventually lead to the best solutions for society. This is often connected to the idea of the ‘marketplace of ideas’: from competing ideas, the truth will emerge. Thirdly, there is a Romantic aspect to Mill’s thinking: censoring an individual is wrong, as he has a right to self-expression and self-development.

Liberty – yes, but how?

Mill was aware that liberty needs more than just institutions to guarantee freedom. For him, it was not just the state but also society that could harm the free flow of ideas. He rightly foresaw that in mass democratic societies the informal workings of social pressure and expectation could be tyrannical too. For this reason, Mill defended individual liberty from government intervention, but also from the dictates of society, tradition, and custom, which Mill feared would strangle individuality. Actually, for Mill the biggest danger was not government censorship, but social censorship. In other words, a culture of intolerance against people who dare to think differently.

What does this mean for the world of disinformation? Obviously, Mill did not write about it directly. He did, however, write that some ways of asserting an argument nonetheless “may justly incur severe censure”: speech that serves “to suppress facts or arguments, to misstate the elements of the case, or misrepresent the opposite opinion.” Deliberately false information might fall under these categories. So Mill was not categorically against censuring those who act in bad faith, although he warned that it is difficult to decide who really does. It is likely Mill would not be categorically opposed to cataloguing foreign interference through false information, executed, financed or supported by a state actor, as EUvsDisinfo does.

Another aspect of the threat of disinformation is technology, and who controls it. From a competition perspective, it is not a stretch to imagine that Mill would be worried about the concentration of power with the platforms.

In addition, in the domain of information technology, the application of Mill’s concepts of freedom and harm becomes slippery. Platforms create environments that aim (and succeed) at holding users’ attention for as long as possible, while mining data users may not fully be aware they are giving up for more effective ad targeting, and in some cases leading users to known malicious spreaders of false information. Under these conditions, the voluntary nature of users’ participation is rather doubtful.

It is a vain endeavour to project the thoughts of a great philosopher from the past onto today’s problems. Still, for many it is tempting to read On Liberty as a definitive argument against any government action in the face of disinformation. However, we cannot reduce Mill’s defence of an open process of searching for the truth, his warnings against government and social censorship, and his appreciation for the eccentric individual to such a narrow position. His ideas are just too important and interesting for that.

Read more

Nietzsche: Beyond the Age of Post-Truth?


Beyond the Age of Post-Truth?

Thus far, in our series we have presented five titans of thought in relation to disinformation. Now we will turn to one who also became a cultural superstar of sorts: Friedrich Nietzsche. For some, this tragic prophet of the age of post-truth is a liberator from all dogmatism; for others, a dangerous opener of the gates of nihilism. We will argue that his “perspectivism” might be less lethal for the survival of truth than some think.

Nietzsche - the Birth of the Tragedy

For Nietzsche, philosophy cannot be separated from the philosopher. Nietzsche’s own life was tough and sad, yet carries some tragic glory. He was born in 1844 in Röcken, Germany, the son of a pastor, and his family expected that he would become one as well. His father died when he was five, and the young Friedrich was sent to an extremely strict school. After becoming the youngest-ever professor of classical philology in Basel, he wrote The Birth of Tragedy. The book proposes that healthy cultures should find a balance between Apollonian (logical, harmonious) and Dionysian (chaotic, ecstatic) forces. In a time that was deeply Apollonian, it was not well received. Much later, it became influential in psychiatry (Jung), philosophy (Foucault), and art (Rothko). After that setback, however, Nietzsche gradually withdrew from academia; from 1879 to 1889, he lived without a fixed address, wandering in the Alps and Italy. As he was plagued by severe health problems most of the time, he wrote a string of books in the short periods he felt well.

Death and Disinformation leading to Brown Reception

In 1889, at just 44, Nietzsche collapsed in Turin, after which he completely lost his mental faculties. For the remaining years of his life – he died in 1900 – he lived in the care of his mother and of his sister Elisabeth, while his work gradually attracted more attention, something Nietzsche himself was not aware of.

A photograph of Nietzsche with his mother after his collapse. Source: Wikimedia

A large part of Nietzsche’s dark reputation dates to that period, when Elisabeth worked hard to popularise her brother’s work amongst circles of the emerging extreme right, which, conveniently, also lifted her social standing. In a move of disinformation avant la lettre, she even forged nearly 30 letters and rewrote many passages of his books. For example, one concept distorted in the wrong hands was that of the Übermensch. For Nietzsche, this term refers to a possible future, signalling that man as he currently exists is just a phase in an evolutionary process, moving from bacteria to Bach towards an unknown future. Thinking you already know the outcome of that process, let alone thinking that your race is the outcome, contradicts Nietzsche completely. Among philosophers, he was rehabilitated from the 1950s onwards, and in 2019 a new biography established that Nietzsche’s political thinking was, if anything, more European than nationalist, that he deeply regretted the violence of the Franco-Prussian War, that he had several Jewish friends, and that he was among the very few professors at Basel University to vote in favour of admitting women. Yet as the hijacking of his ideas can never be undone, and the hijackers consequently led the world into an unparalleled catastrophe, the shadow remains.

The Impossibility of Nietzscheanism

It is hard to give a brief overview of Nietzsche’s thought; Nietzsche never created a coherent “philosophy”, a system of thinking. He called himself the “philosopher of maybe”. An important element, instead, is the recognition of the chaotic and dynamic nature of reality. One has to affirm chaos, according to Nietzsche: “One has to be chaos to give birth to dancing stars”.

One could say that Nietzsche, in order to create more chaos, put gunpowder into the canon of philosophy, including the thinkers we covered earlier in our series.

Nietzsche called Plato himself a sophist. Moreover, Plato’s objective ideas behind the phenomena we perceive are life-denying, as they only reflect the psychological weakness of not being able to accept truth. Projecting this inability onto religion, he called Christianity “Plato for the masses”.

Although Nietzsche appreciated Aristotle more than Plato, he would still dismiss Aristotle’s teleological ideas (everything has its fixed function) as fairy tales.

The empirical tradition of Hume and Mill in Nietzsche’s eyes is philosophically naïve: “This is not a philosophical race – these Englishmen”.

Nietzsche is tough on Kant as well. In harsh terms, he called him intellectually cowardly for trying to save the idea of human autonomy from empirical science.

At the same time, his writing is so rich in different perspectives that Nietzsche cannot be reduced to one position, or to his apparent dismissal of other philosophers. There is always another side.

Although Nietzsche attacks Kant harshly, one could also say Nietzsche did not sweep away his thinking, but continued it, even radicalised it. How does this work? As we wrote before, Kant explained that we can never have direct access to the things themselves (Ding an sich), but only through a medium (our consciousness). Take this example: if you look up from your screen, you see the ceiling (if you are indoors). However, this is actually fictional, a story made inside your brain. Something else happened instead: the retinas in your eyes catch reflections of light bouncing off some object. Your brain turns this process into a meaningful, static image: the ceiling. It needs the concept of “ceiling” to be able to do this.

While Kant thought there was a certain rationality in the use of these concepts, Nietzsche, inspired by Arthur Schopenhauer, came up with a different interpretation. Very crudely, both Nietzsche and Schopenhauer would say that everything in the light-ceiling-retina-consciousness example behaves as it does because it wills. Nietzsche famously says in On the Genealogy of Morals that man would rather will nothing than not will at all. Nietzsche departs from Schopenhauer, however, when he argues that the will is directed and constantly striving, while Schopenhauer contended the will had no sense of purpose.

Reinventing the Sacred in Post-Religious Times

Based on his ideas on chaos and will, Nietzsche criticised modern culture. In The Gay Science, he presents a parable that illustrates how far ahead of his own time he was. The story is about a marketplace where a madman shows up with a lantern. He shouts that he is looking for God. The people in the market are amused and mock him: “Why! is he lost? said one. Has he strayed away like a child? said another. Or does he keep himself hidden? Is he afraid of us? Has he taken a sea-voyage? Has he emigrated?... The insane man jumped into their midst and transfixed them with his glances. ‘Where is God gone?’ he called out. ‘I mean to tell you! We have killed him,—you and I!’”

This tale tells us two things. First, God is dead, meaning we now live in the age of scepticism, of intellectual chaos. This, however, is not something Nietzsche applauds, because, second, he wants to show that the people do not understand the implications of what has happened. Modern humans have only replaced the image of God with something else (the market, the dignity of man, progress), but the way of thinking, the grammar of God, remains in place. According to Nietzsche, we have given ourselves freedom, but we lack the mental power to use it. He wants to create space for a modern sacred attitude in duality with modern scepticism.

"Portrait of Friedrich Nietzsche" by Edvard Munch. Source: Wikimedia

Beyond Post-Truths…are Truths!

To Nietzsche, everything that is part of reality is chaos, or struggle. This means that everyone making claims about the truth should realise that they never offer the truth, only an interpretation of it. Contrary to how some have interpreted this, it does not mean that Nietzsche does not care for truth. He cares so much about it that he warns against scientific hubris claiming to possess the Truth.

According to Nietzsche, this does not have to lead to a state of mind in which everything is futile or morally acceptable. His work suggests it is possible, and necessary, to overcome nihilism by establishing an attitude of strong scepticism. This refers to a truth practice based on perspectivism: not aiming to find static truths about reality, but aiming to become a truthful person.

Fake News and Disinformation

Because truthfulness is important for Nietzsche, he would despise spreaders of disinformation. Those who spread disinformation implicitly acknowledge that they are weak. Nietzsche also warns that one cannot separate a claim on the truth from the person making that claim. This does not mean we have to give in to total relativism; it does not mean that all arguments are equally valuable and thus none really is. Nietzsche is ruthlessly logical in his thinking. Rather, it means we have to be alert and always ask questions: who is making this claim? What are their interests? Perhaps it is generally healthy for a society to be somewhat sceptical towards those who have power. This is, in fact, what EUvsDisinfo does with state-sponsored media, which we have strong reasons to believe are designed to create or stimulate tensions in our democracies and advance Kremlin interests.

However, Nietzsche might be more relaxed about misinformation or “fake news”. Of course, there is a lot of wrong information, or claims that appear to be wrong or prove wrong in hindsight. This is inevitable given the chaotic nature of the world (plus all the noise machines at our disposal). That people get upset about it perhaps also betrays unreasonable expectations about truth and public debate; it seems to deny the reality that truth can be messy. Disinformation, on the other hand, often takes advantage of that messiness and of the disappointment about it. This does not imply we should not take the danger of spreading misinformation seriously. Yet we should realise the phenomenon itself is an inherent part of reality.

Anything Goes? Probably Not

So if we are experiencing Nietzsche’s predicted post-truth age, how worried should we be? According to him, it does not need to be as gloomy or hollow as one would imagine. If Nietzsche was right in claiming that our positions are just perspectives on truth, this does not mean we have to live within the boundaries of those perspectives. Alexis Papazoglou, who wrote about Nietzsche and post-truth, argues that if we increase our awareness of different views, it becomes more likely that we can reach something close to “objectivity”. Psychologically, this might be easier if we drop the notion of objectivity, which always seduces people into thinking that one is right and the other is wrong. In his book On the Genealogy of Morality, Nietzsche writes:

The more eyes, different eyes, we know how to bring to bear on one and the same matter, that much more complete will our “concept” of this matter, our “objectivity” be.

In these times, in which we are both more connected and more alone, sincerely listening to people with different views – integrating, like Nietzsche, as many views as possible – seems vitally important.

Read more

Hannah Arendt and the Fragility of Facts


The Loneliness Factor

These days, most people intuitively grasp that loneliness creates mental health challenges. When COVID-19 forced societies into lockdowns, many realised that we humans are mammals, too. Alone, we suffer.

A study by Princeton University shows that people are also more likely to believe disinformation when they feel excluded. Furthermore, loneliness is how some experts explain why older people are prone to share disinformation.

Hannah Arendt, one of the most original thinkers of the 20th century, was the first to suggest how loneliness fuels political extremism. She had personally experienced her share of both.

In 1933, as a young Jew, she had to flee Germany. After World War II, she wrote extensively on why lies, populism, and, eventually, totalitarianism were able to rise.

In her classic book The Origins of Totalitarianism, Arendt argues that totalitarianism was historically something new. It “differs essentially from other forms of political oppression” (think despotism, tyranny and dictatorship) as it applies terror to subjugate mass populations rather than just political adversaries. At the end of her book, she focuses on what seems to be a remarkable factor of its success: loneliness.

Down and Out in Berlin

Everyone knows the story. The New York stock market crashed in 1929, hitting the global economy and Germany mercilessly. Trade and the economy were left in tatters, with brutal unemployment and a fragmenting society. Even before the crash, the country had been struggling in the aftermath of the First World War, although the economic boom of the roaring twenties had made some very rich. These were the perfect conditions for the rise of “mass man”, who feels abandoned, isolated and redundant: lonely.

For Arendt, loneliness is not primarily the lack of social interaction with other people, but rather the lack of self-identity, which is a product of being together, of being part of a community: “The experience of not belonging to the world at all” hinders the capacity for forming an identity, leaving space for terror and totalitarianism. The ideal subject of totalitarian rule is not someone with ideological convictions, “but people for whom the distinction between fact and fiction… and the distinction between true and false… no longer exist.”

A Speaker of Truth Has No Friends

As Arendt experienced herself, speaking the truth in public debate can also be lonely. In writing about the Eichmann trial, she thought she was documenting her experience and sharing her views; in return, she was personally blamed. Nevertheless, this experience, though painful, emboldened her to take up the pen against lying in politics. According to Arendt, accepting lying, or denying someone the space to form their own opinion or their own experience of reality, could destroy the common fibre of humanity – the world we construct together.

Cover of "Eichmann in Jerusalem". Source: Wikimedia

The Inflation of Facts

Her 1967 essay, Truth and Politics, recognises that in modern times, truth is no longer seen as something that is a given, waiting to be discovered by man; instead, it is produced by the mind. Arendt discerns different degrees of truths: mathematical truths, scientific truths, philosophical truths and facts.

Facts and events are what Arendt calls “factual truth”, and they are the product of a community living and acting together. They are documented in collective memory and history, as collective narratives and traditions are challenged or upheld. Their function? To serve as common ground to stand on, giving every person the opportunity to share their experiences and create meaning. The difficulty with facts, however, is that compared to maths, science and philosophy they are more fragile, as Arendt writes:

“Facts and events are infinitely more fragile things than axioms, discoveries, theories – even the most wildly speculative ones – produced by the human mind; they occur in the field of the ever-changing affairs of men, in whose flux there is nothing more permanent than the admittedly relative permanence of the human mind’s structure. Once they are lost, no rational effort will ever bring them back.”

Facts, according to Arendt, are always in danger from politics, as politics is not primarily concerned with truth but with action. Beyond normal political practice, however, looms something more dangerous: what Arendt calls “organised lying”. Organised liars aim not just to replace a part of the truth with a falsehood; they go further, seeking to undermine the factual fabric of reality itself. In less philosophical terms: “to flood the zone with shit” (in the words of Steve Bannon).

Through organised lying, facts are gradually downgraded to mere matters of opinion. Reading Arendt, it becomes even clearer how the current concept of “alternative facts” is instrumental to achieve inflation of facts.

The many instances of revisionist disinformation we have spotted underscore this. Just look, for example, at narratives on WWII, the Katyn massacre, or false stories about the NATO promise never to “expand” eastwards.

Even Putin himself has shown remarkable interest in rewriting history and published several papers that distort history. For example, one blaming Poland for the outbreak of WWII, another claiming the historic unity of Russia and Ukraine, and yet another one on American hunger for power leading to the eastern “expansion” of NATO and the unconstitutional coup in Ukraine.

According to Arendt, once disinformation from those in power succeeds, it gives the organised liar three abilities. First, to say things and then claim they never did. Second, to rewrite history in order to serve their interests. Third, to call specific groups (minorities, political rivals) a threat, without the need to back it up by evidence. As the fragility of factual truth is not something that can ever be remedied permanently, modern democracies have to live with this inherent insecurity.

All this speaks to the enormous importance of public institutions such as libraries, museums, universities, courts and newspapers. These are places where facts are conserved and the public record is maintained. Without them, we would lose an important pillar protecting factual truth.

The Advantage of the Lie

Why do lies have an impact in the public space? One answer is that truth is often unclear and unstable, and people prefer stories that remove their uncertainty. When COVID-19 struck the world a year ago, some – since people fear change – really wanted to believe it was nothing serious. Disinformation actors exploited this. In the same period, other narratives did acknowledge the dangers, but then explained them in terms of already existing beliefs, like the threat from global elites, evil billionaires, or China.

Hannah Arendt explained this phenomenon in 1971 in another topical essay, Lying in Politics:

“It is this fragility that makes deception so very easy up to a point, and so tempting. It never comes into a conflict with reason, because things could indeed have been as the liar maintains they were. Lies are often much more plausible, more appealing to reason, than reality, since the liar has the great advantage of knowing beforehand what the audience wishes or expects to hear. He has prepared his story for public consumption with a careful eye to making it credible, whereas reality has the disconcerting habit of confronting us with the unexpected, for which we were not prepared.”

When it comes to the dangers of disinformation for society, Hannah Arendt is still an absolute authority. She recognised that loneliness makes people susceptible to disinformation and extremism and she explained why facts are so fragile and people prone to buy lies.

Hannah Arendt. Source: Wikimedia

Read more

Karl Popper and the Unfalsifiability of Bullshit


Falsification Popping Up

Karl Popper, born in 1902 in Vienna, is widely recognised as one of the greatest philosophers of science. He was also a social and political philosopher of significant influence, a self-proclaimed fighter against all forms of scepticism and relativism in science and in human affairs generally. He staunchly defended what he coined the “Open Society”. It is hard to overestimate Popper’s influence, captured by the well-known mathematician Hermann Bondi when he said: “There is no more to science than its method, and there is no more to its method than Popper has said”.

Popper’s own experiences had a strong influence on his thought. As a student in the 1920s, Popper heard Einstein lecture, which greatly inspired him. Earlier, as a teenager, he had flirted with Marxism and later with the psychoanalytical work of Freud. Listening to Einstein, however, he realised that Einstein’s theory was “risky” – and that this was a positive thing. Einstein, according to Popper, was daring, because it was possible to deduce consequences from his theory which, if they turned out to be false, would falsify the whole theory. This “critical spirit” in Einstein, completely absent in Marx (critical theory is something else entirely) and Freud, was crucially important for Popper. Freud and Marx, Popper thought, developed theories in terms that could only be confirmed; Einstein, by contrast, provided testable implications. This later became the core of Popper’s philosophy of science, as we will see.

Of course, the dramas of the 20th century influenced him deeply too. He was dismayed by the failure of the democratic parties to halt the rise of fascism, and by the Marxists effectively welcoming it, as they saw fascism as a dialectical step leading to the implosion of capitalism. Later, this led Popper to posit the “paradox of tolerance”. Very crudely, it means that democrats should be intolerant of intolerance – a strong argument for government intervention against processes and entities aiming to subvert democracy, such as spreaders of disinformation.

Popper, who was of Jewish descent, accepted a position as professor of philosophy in New Zealand in 1937, where he stayed during the Second World War. The annexation of Austria, his country of birth, in 1938 prompted Popper to focus more on political philosophy. In 1945, he published his critique of totalitarianism, The Open Society and Its Enemies.

We have written about the power and dangers of the story of “eternity”. Popper would call this story a form of “historicism”, which he saw as the belief that history develops inexorably and necessarily towards a fixed end, according to certain principles or rules. Popper had Plato particularly in mind as a philosopher who paved the way for these concepts. These ideas, Popper argues, have their origins in what he calls:

one of the oldest dreams of mankind—the dream of prophecy, the idea that we can know what the future has in store for us, and that we can profit from such knowledge by adjusting our policy to it.

A number of Popper’s works, especially The Logic of Scientific Discovery (1959), are now widely recognised as pioneering classics in the philosophy of science. The core of the book is quite straightforward: a universal claim is falsified by a single genuine counter-instance. It has greatly influenced the development of modern science and also has important implications for us, students of disinformation.

Infoshum = Bullshit

Now, back to our times. In 2018, EUvsDisinfo wrote about a special kind of pro-Kremlin media content: infoshum. It is more profound, more effective and darker than the concept of eternity.

The word infoshum is Russian for “info-noise”: random and meaningless noise, akin to the internationally known term “white noise”. It sits in the grey zone between information and disinformation, and we have evidence that it is actively being pushed by pro-Kremlin media.

An important concept to help us understand infoshum is bullshit.

Interestingly, there actually exists a theory of bullshit and it has implications for disinformation. This theory was presented in the book “On Bullshit” by philosopher Harry Frankfurt. The very first lines:

“One of the most salient features of our culture is that there is so much bullshit. Everyone knows this.”

According to Frankfurt, bullshit is speech intended to persuade without any concern for truth. This lack of concern distinguishes the bullshitter from the liar. The bullshitter is more radical. Liars know and care about truth, and this is precisely why they try to cover up the lie. Bullshitters, however, do not care whether they utter truths or lies. Their focus is solely on persuasion.

Frankfurt does not claim that there is more bullshit in society than before. Instead, he explains that all forms of communication have increased, leading to more visible bullshit. When Frankfurt published his book, in 2005, the world’s most valuable companies still managed oil and money instead of information; Facebook had a meagre 6 million users. We were years away from the spectacular rise of social media.

Fast forward to 2021, and bullshit has become a lot darker than even Frankfurt predicted. Sometimes, it seems like our democracies are drowning in bullshit. Sadly, our database covers quite some bull.

Occupying the information space

Why would anyone spread bullshit? In the end, the goal is to occupy the information space.

As we flagged before, a Kremlin-funded think tank published an essay, “Securing Information for Foreign Policy Purposes in the Context of Digital Reality”, which claimed that:

“A preventively shaped narrative, answering to the national interests of the state, can significantly diminish the impact of foreign forces’ activities in the information sphere, as they, as a rule, attempt to occupy “voids” [in the information flow].”

This strategy points to an ambition to divert attention from a certain truth. One who applies it is therefore a liar, not a bullshitter.

However, both the tactical liar and the bullshitter share an attitude: substance is secondary, and the primary goal is to flood the information system.

From this perspective, even false information that does not seem directly harmful is dangerous, because it occupies space, undermining the general conditions for establishing truth. It is telling in itself that in Russia, almost half of all political conversation on Twitter is conducted by bots.

Researchers have shown that this also holds for COVID-19. They scrutinised more than 200 million virus-related tweets worldwide and concluded that, since January, about 45 per cent of these tweets were sent by accounts that behave more like computerised robots than humans.

In 2018, Facebook deleted 835 million fake accounts – the equivalent of almost 10 per cent of the earth’s population.

Steve Bannon once notoriously said: “The Democrats don’t matter. The real opposition is the media. And the way to deal with them is to flood the zone with shit.”

The falsifiability of bullshit

If you want to flood the information system, bullshit is a fine instrument, because it might be harder to falsify than a lie. This is where we get back to Karl Popper, as this is exactly what he meant with falsifiability: the capability of a claim to be contradicted by evidence.

For example, “all swans are white” is falsifiable. You need only one black swan to disprove it. “This human action is altruistic”, by contrast, is a non-falsifiable statement. We have no instruments to decide whether or not an action is driven by self-interest.

In the disinformation context, things work the same.

For example, if you push the narrative that the BBC claims the MH17 flight was downed by a Ukrainian fighter jet, this can easily be disproved: the BBC documentary in question was clearly misrepresented. But try to refute the following hypothetical claim (made up by us): “George Soros is the driving force behind a secret society supporting colour revolutions, whose hidden intent is to overthrow all nation states to make space for a world government”.

This can never be completely disproven. However, Popper and Frankfurt showed that this does not make it true either: rather, it strongly suggests we are dealing with bullshit. And as we have seen, bullshit is not innocent.


Read more

Heidegger and the Age of the World View


The suppressed philosophers

Last summer, EUvsDisinfo spotted a case claiming that “liberalism is cancelling Western culture” and calling it symptomatic of “incipient totalitarianism” when the ideas of great philosophers are suppressed. This case calls to mind a typical metanarrative, impossible to falsify, suggesting that Western culture has cut itself off from its roots. Similar cases predict that Europe will collapse unless Russia comes to its rescue. Surely, one cannot falsify the claim that liberalism is cancelling Western culture. It is certain, however, that three of the five thinkers whose ideas were allegedly suppressed have already been discussed in our series on disinformation and philosophy (Plato, Aristotle, Nietzsche).

Today, we will focus on a fourth in the list: Heidegger, a major figure in philosophy, who himself lambasted Western culture. From Plato to Descartes, he perceived that Western thinking often equates knowledge with control. He warned that scientific, or technical, thinking has its limits, and argued how, paradoxically, scientific rigour could lead to subjectivism (and disinformation).

Sein und Zeit

Heidegger dedicated his magnum opus, Being and Time (Sein und Zeit), to his teacher Edmund Husserl. This philosopher and mathematician wanted to renew philosophy. In Husserl’s view, very crudely, the power of ideas hindered access to the things themselves, or phenomena. His philosophical project aimed to facilitate a return to the phenomena, which, in time, led to the (current-day) movement of “phenomenology”: the study of the appearances of things, or things as they appear in our experience.

In order to renew philosophy, Husserl swept aside Descartes’s work. As some might know, the Frenchman created a new foundation for philosophy, no longer depending on God, the almighty creator, but on the human subject imagining the world. The problem with Descartes’s approach, however, was that this subject, or consciousness, was closed. How do we know the outside world corresponds to our thinking?

Husserl settles this – although, being a true philosopher, he did not provide an answer. Instead, Husserl argues the question is based on wrong assumptions, as consciousness is not closed, but intentional. Experience is directed towards things in the outside world. To be conscious always means to be conscious of something. The new question becomes: how do we know there even is an inside?

Heidegger radicalises Husserl’s idea, applying it not only to the field of knowledge, but also to existence itself. Before Heidegger – for Kant, Descartes, and even Husserl – the question was: what can I know? Heidegger goes beyond knowing. For him, it is bizarre to state: I think, therefore I am. Rather, it is the other way round: I am, and thinking is just one of the modes of being.

To free himself from the connotations connected with old terms such as consciousness, humans, soul, Heidegger introduces a new notion to designate the being we are ourselves, or human experience: there-being (Dasein). It refers to a certain “openness”; a pre-intellectual attitude to Being that is required for us to encounter beings as beings in specific ways (practically, theoretically, aesthetically). In dramatic terms: you cannot love another being, if you do not know you are a being too.

Heidegger famously explains how anxiety is an important gateway to self-understanding because anxiety confronts us with our end, with nothingness. If we allow this painful confrontation to happen, we can find an authentic answer to this inescapable deficit of being.

The Age of the World View

According to Heidegger, the problem with modern culture is that it makes us forget that life is finite. In blunt terms, if we forget we are dying, we ignore the totality of being (Seinsvergessenheit). Heidegger creatively questions concepts we use every day, to demonstrate we normally overlook this.

“World view” is one such concept. In his 1938 text The Age of the World View (Die Zeit des Weltbildes), he questions modern times, modern culture and the way we value information. This has implications for our thinking on disinformation.

In short, Heidegger characterises the modern age as dominated and defined by science—whose model is experimental science: “the fundamental event of the modern age is the conquest of the world as picture”.

What does this mean? According to Heidegger, this conquest again originated with Descartes, who proposed that man is a subject looking at a separate object (the world). The result of this looking is a picture, or view, of the world. The wording “world view” implies a viewpoint in opposition to the world. According to Heidegger, however, this is never possible, as man looks at the world yet at the same time is part of it. What we think we do with our scientific method is leave behind our subjective viewpoint in order to understand the world, and then return to our subjectivity with the world view we have gained.

Heidegger’s line of thinking raises a few questions about disinformation.

First, in line with spreaders of disinformation, who sometimes use Heidegger to justify ignoring the evidence, we could ask: is Heidegger against science or technology? This is definitely not the case. Heidegger does not oppose science, but objects to the elevation of research results to absolute truths – in other words, scientism.

Second, Heidegger in his thinking often turns things upside down, forcing one to look in the mirror. Perhaps our fear of disinformation also shows how much we care about information. Perhaps we are even a bit addicted to it, like junk food. Just think about the growth in the amount of data that is stored. Information technology, as the logistics of information, gave disinformation momentum; therefore, it is unlikely that the problem of disinformation will be solved by technical means. Disinformation is, in the first place, a social problem, and the less willing people are to listen to each other and respect each other’s viewpoints, the easier it becomes for disinformation to exploit that. This also means that disinformation is not a phenomenon outside of modernity, but rather a consequence of it.

Third, Heidegger warns how in modernity, our objective methods, paradoxically, can lead to extreme subjectivism. This might sound counterintuitive. Heidegger thinks that if modern man takes scientifically informed views back into his subjective mind, there is a risk of endlessly building views upon views, leading to a situation where the views are completely detached from the reality as was perceived.

This could be compared to the biological concept of “runaway selection”, which originated in the study of peacock feathers. For some time, biologists struggled to explain this phenomenon: how could these long, extravagant tails, embellished with ornaments, be reconciled with the harsh process of natural selection? They resolved this with the concept of “runaway selection”: the evolution of exaggerated male ornamentation by persistent, directional female choice.

An example in disinformation is the widespread false story that Danes are “zoophiles”, part of a larger moral decline of the West.

“Zoophilia” is a beautiful example of this process in the world of disinformation. A colourful, extravagant story, which seems too unfounded to survive media selection. Nevertheless, sometimes it just runs away. It starts with something small, such as the claim that a Danish zoo was collecting unwanted house pets as food for carnivores, and then, story by story, it is blown extravagantly out of proportion.

Heidegger is a notoriously complicated scholar. Yet perhaps the most important lesson he has to offer is simple: when approaching disinformation, we should bear in mind that the individuals targeted by it are Beings. What about a real conversation over a cup of coffee?

Read more

Foucault: Did He Blaze a Trail for the Post-Truth Era?


Trailblazer for the post-truth age?

Ideas can be powerful. According to some serious thinkers, Foucault’s ideas facilitated the age of disinformation and misinformation, or the “post-truth era”.

Academic philosophers were warning about postmodernism as far back as the 1980s, well before our era of fakes. Some even tried in earnest to stop the movement's leading thinkers, such as Foucault and Jacques Derrida, from receiving formal recognition.

An example of this was a bitter letter sent to the Times in 1992, in which several philosophers protested against awarding Derrida an honorary degree from Cambridge.

The dismay continues today. In 2017, the highly esteemed philosopher Daniel Dennett called the postmodernists “truly evil”; they made it “respectable to be cynical about truth and facts.” Susan Neiman, another well-known philosopher, claimed that post-truth and identity politics gained momentum because “well-educated people who went to good liberal arts colleges, who are inclined to go into the media” read so much Foucault.

Ironically, this resistance ignited curiosity about Foucault. What did he say, and how damaging was it?

Deconstructing Michel

Michel Foucault was born in Poitiers, France, and studied philosophy and psychology. The Stanford Encyclopedia of Philosophy called him a brilliant but "psychologically tormented" student. In the later part of his life he was politically active, often protesting on behalf of marginalised groups. Foucault died in 1984, an early victim of AIDS.

Foucault, together with Jacques Derrida, is broadly recognised as one of the most influential postmodern thinkers. Both were interested in how structures (for example, institutions and language) are used to create order and meaning in human experience.

According to Foucault, knowledge, as a structure, should never be analysed separately from power. In this line of thinking, he was clearly influenced by Nietzsche, and, to a lesser extent, Heidegger.

Foucault adopted an approach to explaining systems that drew on Nietzsche's genealogy. He observed historical transformations and manifestations of power and what they mean for the individual. For power to be effective, he argued, it must be hidden; it can only be observed in the changes in social and political relations.

In his widely read Discipline and Punish, Foucault examines the evolution of the penal system. In the 18th century, it moved from corporal and capital punishment to the penitentiary system. Foucault stresses how this reform, making the system “gentler”, also became an instrument of tightened control: “to punish less, perhaps; but certainly to punish better”. Foucault also shows that this evolution is not the result of rationally inevitable trends, but the outcome of contingent twists of history.

Panopticon: Discipline through Technology

Technology strengthens discipline. Foucault used the concept of the panopticon to illustrate this.

The panopticon is philosopher Jeremy Bentham's design for the "optimal prison". It consists of a circular building. Each cell houses a single prisoner and has two windows: the first lets in light from outside; the second faces the centre of the circular building, where a tower stands in which a guard can be placed to observe the prisoners. Crucially, the prisoners never know whether the guard is present and whether they are being observed. Consequently, they internalise the disciplinary power and adjust their own behaviour.

It is not difficult to see that modern technology, including social media, has panoptic features.


At the same time, technology cultivates subjectivity. By providing the illusion that your online world is tailored entirely to your needs (which, of course, it is not), it reinforces the idea that the offline world can also be moulded to your subjective needs.

There is No Spoon

A classic movie that brilliantly plays with ideas about technology, truth and reality is, of course, The Matrix. It helps us to understand Foucault.

In our article on Heidegger, we described how, paradoxically, the scientific method can lead to images built upon other images, gradually walking away from reality until reality is abandoned. The French philosopher Baudrillard famously called that forgotten reality "the desert of the real".

This inspired the makers of The Matrix. In the movie, the reality humans perceive is actually a simulation, created by computers to control them. What they see is not reality but written (computer) text, although they do not recognise it as such.

At first glance, this seems quite postmodern: reality is made up of text. The movie suggests, however, that behind this constructed reality lies the real reality.

Foucault would deny the existence of a deeper reality behind the images or beneath our language. According to him, metaphysically, there is no reality outside our minds.

Navalny as a Parrhesiast Speaker

The postmodern world view is accompanied by uncertainty. How should we deal with it? Foucault's answer: read classical philosophy! Antiquity can provide inspiration for philosophy as a way of life committed to truthfulness, helping to develop an attitude of truthfulness amid chaos.

Foucault developed the concept of “parrhesia” as a mode of discourse in which one speaks openly and truthfully about one's opinions and ideas, without the use of rhetoric, manipulation, or generalisation.

What characterises this parrhesiastic truth-telling? The combination of truth, courage and criticism. A parrhesiast speaks the truth at the risk of his or her own life. The parrhesiast speaks 'truth to power': he or she criticises political rulers, consciously running the risk of social exclusion, exile and, in extreme cases, death. Socrates was a parrhesiast, as were Rosa Parks and Martin Luther King. The ultimate parrhesiast of our times? It might be Alexei Navalny.

Post-modernism ≠ post-truth

The problem with postmodernist philosophy is perhaps not the philosophy itself, but rather its followers. Some of them seem to think that since the world is a human construct, we can remake it as we please. This line of thinking does not follow from Foucault. Foucault would not discard the results of science, and his thinking implies neither universal relativism nor that "anything goes".

On the other side are those claiming that Foucault created a split between modernist and postmodernist thinking. They portray him as some sort of adolescent rebelling against philosophical tradition. This misrepresents both Foucault and that tradition. As we have seen, Kant, hardly a subjectivist, had already written about the problematic subject-object relationship and the difficulty for us, as subjects, to be sure we really know the object.

Deconstructing the concepts that underpin our society is a tricky business. That, however, should not lead to the exclusion of Foucault: the alternative might be worse, because it could entail abandoning critical thinking. As Arendt said, "thinking itself is dangerous."

Looking at the power relations behind public discourse should not necessarily lead to cynicism, disinformation and division by identity politics.

One who deconstructs certain truths, social norms or institutions does not have to end up as a hardened identity-politician. As the publicist John Gray pointed out, those really committed to deconstruction should also deconstruct their own identity.

With Foucault, we could say we can only know our own identity if we recognise that it changes constantly. This understanding of the relativity of one's own identity should make conversation with others (with similarly relative identities) a lot easier.

Nevertheless, in practice this is not easy for most of us. So maybe it is the other way round with Foucault: his thinking is perhaps not too cynical, but rather too idealistic in expecting people to be willing to engage in open discussion.

The question might not be whether Foucault was right or not, or plain evil, but rather:

Are we, as a society, able to engage in public discourse while also being willing to critically examine and deconstruct our own positions and identities?


Read more




  • September 17, 2023

    Think Before You Share

    No one wants to be the person who contaminates their friends’ social media feeds with conspiracy theories or disinformation. Use this list to make sure you stay ahead of disinformation!

  • September 17, 2023

    Vaccine Hesitancy

    How do you actually talk to someone who embraces vaccine-related conspiracy theories they encounter online? Here are a few tips that may be helpful.

  • September 17, 2023

    Propaganda Must be Opposed by the Language of Values

    How to oppose propaganda? Find out in an exclusive interview with journalist Andrei Arkhangelsky, one of Russia's most active commentators on the topic of disinformation and propaganda.

  • September 17, 2023

    How to impose costs on perpetrators of disinformation?

    Sanctions are one way to punish those who produce disinformation. What does this mean? How is it done? And, finally, are sanctions effective?

  • September 16, 2023

    Read Quality Media and Your World Will be Healthy

    We asked Ukrainian journalists about their recipe to avoid disinformation. Their recommendation? Read quality media and your world will be healthy!

  • September 16, 2023

    Check the Source. Check the Source. Check the Source…

    Tips from Moldovan journalists: in order to avoid falling for disinformation, always check the source and be careful with what you read.


Disinformation is tricky. Test your resilience with our quiz to see if you can identify disinformation, unreliable information and falsified content and if you know how to think before sharing things online.

Still Curious?

The following content is of an informative character and does not represent an official EU position. Unless otherwise stated it is not a product of the European Union.

  • Teaching Tools

    Better Internet for kids

    Resources for teachers, parents and children: lesson plans, courses, games and teaching resources to discover the online world safely. The platform is run by the Safer Internet Centres, the European network which informs, advises and assists children, parents, teachers and carers on digital questions and fights online child sexual abuse.
    Multiple languages

  • Teaching Tools

    Check or Cheat

    A collection of educational material for secondary education students and teachers to learn how to critically evaluate media content, fact check and build resilience to disinformation. Features teacher training, card games and educational material.
    Available in English, Greek, Lithuanian and Spanish

  • Teaching Tools

    Conspiracy Theories: what teachers need to know

    Guidebook on conspiracy theories prepared by UNESCO.
    Available in English

  • Teaching Tools

    Council of Europe

    Reference Framework of Competences for Democratic Culture. The Council of Europe has developed a set of learning activities that may be used in primary and secondary education, all of which are based directly on the disinformation challenges raised by the COVID-19 pandemic.
    Available in English and French

  • Teaching Tools

    Council of Europe Materials

    Materials on dealing with propaganda, misinformation and fake news: definitions and recommendations.
    Available in English and French

  • Teaching Tools

    Countering Disinformation Guidebook

    The guide is divided into three categories: the roles of specific stakeholder groups in building a democratic information space; legal, normative and research responses; and dimensions for addressing disinformation and other harmful content targeting women and marginalised groups.
    Available in Arabic, English, French, Russian, Spanish.

  • Teaching Tools


    Creative Audiovisual Lab

    The aim of the project is to enhance critical thinking and media literacy among young people aged 14-19, parents, and educational staff. Features an online training for teachers and trainers.
    Available in English

  • Teaching Tools

    Digital resistance handbook for teachers

    Handbook for teachers by the Council of Europe on how to support their students to recognise fake news and false information found in the online environment.
    Available in English

  • Teaching Tools

    Disarming Disinformation

    List of resources by the Global Engagement Center (GEC) of the U.S. Department of State. It includes infographics, taxonomy and literature, among others.
    Available in English

  • Teaching Tools

    DO’s and DON’Ts on Twitter: Defend Democracy

    A list of basic actions to deal with disinformation and propaganda on Twitter, suitable for all Twitter users and for exploring in classrooms. Prepared by the NGO Defend Democracy.

    Available in English

  • Teaching Tools

    E-learning: vaccination

    E-learning course on how to address online vaccination misinformation, offered by the European Centre for Disease Prevention and Control (ECDC).
    Available in English

  • Teaching Tools

    eSafety Commissioner

    Australian parent guide to mental health: webinars, resources and trainings to help young people develop strategies for their mental health while they are online.
    Available in English

  • Teaching Tools


    Teaching Media Literacy with eTwinning

    Best practices and project ideas.
    Available in English

  • Teaching Tools

    EU Guidelines for teachers and educators

    The Guidelines for teachers and educators on tackling disinformation and promoting digital literacy through education provide hands-on guidance, including practical tips, activity plans, insights on topics, and cautionary notes grounded in what works in digital literacy and in education and training.

    Available in English, Bulgarian, German, French

  • Teaching Tools

    EU Neighbours East

    Training opportunities for media and Civil Society Organizations across Eastern Partnership countries.
    Available in English, Armenian, Azeri, Georgian, Romanian, Russian and Ukrainian

  • Teaching Tools

    EuroGuide Toolkit

    The guide offers teachers and social workers practical tools to respond to socio-political or religious arguments in order to prevent radicalisation in the school. Guidance is offered on how to create resilient environments and safe spaces where vulnerable young people can open up, sharpen their social and emotional skills, and improve their self-esteem.
    Available in English, Dutch, French, Hungarian, Italian and Swedish

  • Teaching Tools


    European Schoolnet has published an online MOOC to foster critical thinking and tackle online disinformation through intergenerational collaboration and community engagement. The course is targeted at primary and secondary school teachers of any subject.
    Available in English

  • Teaching Tools


    Fakescape

    Fakescape is an escape game teaching media literacy to high-school students (14+). It can be played in the classroom during a 45-minute lesson. Run by a Czech NGO.

    Available in English and Czech

  • Teaching Tools

    Get Your Facts Straight

    The platform offers a 10-hour media literacy training course on disinformation on social media for 14–16-year-olds as well as their parents and grandparents. The course focuses on what disinformation is, why it is so prevalent on social media, and how to recognise and respond to it. The course can be implemented in schools as well as in non-formal educational settings such as youth clubs, libraries and NGOs. By ALL DIGITAL.
    Available in English, Bulgarian, Catalan, Croatian, German, Italian, Latvian, Romanian and Spanish

  • Teaching Tools

    Global Media and Information Literacy Week

    Resources, best practices, events across the globe marking the Global Media and Information Literacy Week.
    Available in English, French, Spanish, Russian, Arabic and Chinese

  • Teaching Tools

    Guide for Public Communicators

    Strategic Communications guide for public communicators with strategies development and response examples developed by EUvsDisinfo.
    Available in English

  • Teaching Tools

    ISD Explainers

    An overview of extremist narratives, movements and actors. Offers background and history of each term, related narratives and background reading. Terms include, among others, 'The Manosphere', 'The New World Order', 'Accelerationism', 'The Great Replacement' and 'The Order of Nine Angles'. By the Institute for Strategic Dialogue.
    Available in English

  • Teaching Tools


    Digital literacy project in the MENA region targeting young people and educators. It promotes media literacy with educational short videos filmed in the region and offers pedagogical material for educators.
    Available in English, French and Arabic

  • Teaching Tools

    News Literacy Project

    News Literacy Project provides programs and resources for educators and the public to teach, learn and share the abilities needed to be smart, active consumers of news and information, and equal and engaged participants in a democracy.
    Available in English

  • Teaching Tools

    OECD Assessment rubrics

    Assessment rubrics for critical thinking (largely to be used in formative feedback).
    Available in English

  • Teaching Tools

    Online Media Literacy Resources

    A list of online resources recommended by the UK government. The list includes tips on reporting inappropriate content, preventing online harassment, cyberbullying and much more.

    Available in English


  • Teaching Tools

    RAN Collection

    More than 200 inspiring practices on preventing radicalisation to terrorism and violent extremism.
    Available in English

  • Teaching Tools

    She Persisted

    A Digital Resilience Toolkit for Women in Politics with response and prevention techniques. Offered by She Persisted.
    Available in English

  • Teaching Tools

    The Wall of Beliefs

    Toolkit for understanding false beliefs and developing effective counter-disinformation strategies.
    Available in English

  • Teaching Tools

    The World Unplugged

    This activity asks a group (a classroom or a group of students) to avoid all screens, connections and media activity for 24 hours. Afterwards, a guided discussion follows about how they felt during that time, how dependent they are on technology, and the positive and negative outcomes of being disconnected.
    Available in English

  • Teaching Tools

    Toolkit for Teachers

    Spot and fight disinformation: presentation and introduction booklet including real life examples and group exercises for your classroom.
    Available in all EU languages

  • Teaching Tools

    UNESCO resources

    Media Information Literacy Tools for Teachers offered by UNESCO.
    Available in multiple languages

  • Teaching Tools

    UNESCO: Think critically, click wisely!

    A comprehensive guide by UNESCO. 'Media and information literate citizens: think critically, click wisely!' is a curriculum for educators and learners. 2021 edition.

    Available in English

  • Teaching Tools

    Very Verified

    Online course on media literacy developing critical thinking. It is organised in three learning modules and covers topics such as the media landscape, types of media, social media, and disinformation and manipulation. Developed by IREX.
    Available in English, Estonian, Latvian, Lithuanian and Russian

  • Teaching Tools

    Media Guide (Медіапутівник): How to Recognise Quality Information and Why It Matters

    A series of nine short video lectures explaining what a quality media product is, which standards truthful news reports must meet, and how the media can manipulate the minds of information consumers. The series was created by experts from the Pylyp Orlyk Institute for Democracy (POID), a respected Ukrainian organisation specialising in the development of independent media and the empowerment of civil society.

    Each of the nine episodes is devoted to a single topic and lasts around seven minutes. This media literacy series offers a range of useful tools for learning to distinguish quality information from low-quality information and to verify it independently.


  • Fact-checking Tools

    Anti-misinformation actions by Poynter

    A guide to anti-misinformation actions around the world. What is the focus of tackling disinformation per country? What is its current state of play?

  • Fact-checking Tools

    DFR Lab

    Read recent research and analysis exposing disinformation by Digital Forensic Research Lab and become a Digital Sherlock.

  • Fact-checking Tools

    DISARM Framework

    DISARM is an open-source master framework for those cooperating in the fight against disinformation. It provides a common playbook, language and approaches for diverse teams and organisations to coordinate their efforts and act in harmony.

  • Fact-checking Tools

    EU Disinfo Lab

    Tools for monitoring and analysis gathered by the EU Disinfo Lab think tank.

  • Fact-checking Tools

    European Digital Media Observatory

    Find your local fact-checker in the EU with EDMO. Check their site for the map of fact-checking initiatives across all EU member states, recent debunks, analysis and much more.

  • Fact-checking Tools

    International Fact-Checking Network

    Find your local fact-checker worldwide among the signatories of the code of principles of the International Fact-Checking Network.

  • Fact-checking Tools

    Mapping media policy and journalism

    The mapping by the Centre for Media Pluralism and Media Freedom offers, among other topics, information about media ownership, whistleblower protection and freedom of information across the EU.

  • Fact-checking Tools

    Open Source Tools by Bellingcat

    Collection of open source tools for online investigations and fact-checking. Trainings, analysis, investigations recommended by Bellingcat, an independent international collective of researchers, investigators and citizen journalists.

  • Fact-checking Tools

    OSINT Framework

    A comprehensive collection of open source intelligence (OSINT) tools and resources.

  • Fact-checking Tools

    WeVerify Verification Plugin

    An open source platform aiming to engage communities and citizen journalists alongside newsroom and freelance journalists for collaborative, decentralised content verification, tracking, and debunking. Accessible from your browser.

  • Films

    After Truth: Disinformation and the Cost of Fake News (2020)

    Must see. This HBO documentary investigates disinformation campaigns and frauds, including the Pizzagate hoax.

  • Films

    Agents of Chaos (2020)

    A two-part documentary from director Alex Gibney investigating Russia's interference in the 2016 U.S. presidential election.

  • Films

    Bob Roberts (1992)

    In the category “satirical mockumentary”, this film depicts the rise of Robert “Bob” Roberts Jr., a right-wing politician. He is a candidate for an upcoming United States Senate election. Roberts is rich and famous thanks to his folk music, which presents his conservative ideas with zest. This film might help to understand the current political context of populism.

  • Films

    Coded Bias (2020)

    Documentary about biases literally coded in the algorithms. When MIT Media Lab researcher Joy Buolamwini discovers that facial recognition does not see dark-skinned faces accurately, she embarks on a journey to push for the first-ever U.S. legislation against bias in algorithms that impact us all.

  • Films

    The Death of Stalin (2017)

    This comedy explores historical revisionism, and perhaps what the Kremlin fears most in historical remembrance. The film was eventually banned from screening in Russia. Pavel Pozhigaylo, a member of the Russian Culture Ministry’s advisory board said about it: “The film desecrates our historical symbols — the Soviet hymn, orders and medals, and Marshal Zhukov is portrayed as an idiot.”

  • Films

    Network (1976)

    A television network exploits a crazed former anchor's ravings and revelations about the news media for its own profit. Quite prophetic, considering this movie was made in the 1970s.

  • Films

    Operation InfeKtion: How Russia Perfected the Art of War (2018)

    Modern classic, produced by the New York Times. “Operation InfeKtion” reveals the ways in which one of the Soviets’ central tactics — the promulgation of lies about America — continues today, from Pizzagate to George Soros conspiracies. Big advantage: it is on YouTube!

  • Films

    The Great Hack (2019)

    Documentary by Netflix. It explores how a data company (Cambridge Analytica) came to symbolise the dark side of social media in the wake of the 2016 U.S. presidential election, as uncovered by journalist Carole Cadwalladr.

  • Films

    The Hater (2020)

    Netflix’s production exploring the dark world of social media smear tactics and its violent real-life consequences. A young man searches for purpose in a net of hatred and violence that he tries to control.

  • Films

    The Social Dilemma (2020)

    Netflix’s production is a great watch on the impact of social media on our societies and our mental health. Also interesting to crosscheck is Facebook’s responses to the creators.

  • Films

    The Truman Show (1998)

    An insurance salesman discovers his whole life is actually a reality TV show. While critically portraying the role of modern media (even before the rise of the Internet), the film plays with themes such as human autonomy and reality.

  • Films

    Wag the Dog (1997)

    This comedy is about a spin doctor and a Hollywood producer who, shortly before an election, join efforts to fabricate a war in order to cover up a presidential sex scandal. Starring Dustin Hoffman, Robert De Niro, Willie Nelson, Kirsten Dunst and Woody Harrelson.

  • Games

    BBC Reality Check

    Role-play game for young media professionals. Your role as a BBC journalist is to cover a breaking news story for the "BBC Live" site. It teaches how to balance accuracy, impact and speed in live reporting.
    Available in English

  • Games

    Cat Park

    The game simulates the tactics and techniques of media manipulation that are used in the real world to exploit social tensions for personal or political gain. For users 15+. Developed by the U.S. Department of State GEC.

    Available in English, Dutch, French and Russian

  • Games

    Cranky Uncle

    The Cranky Uncle game uses cartoons and critical thinking to fight misinformation. Available on iPhone and Android. A Teachers' Guide to Cranky Uncle is also available.
    Available in English

  • Games


    Learn how to deal with cyberbullying.
    Available in Armenian, Azeri, English and Georgian

  • Games

    Detect Fakes

    At Detect Political Fakes, you are shown a variety of media snippets (transcripts, audio and videos). Half are real statements made by Joseph Biden and Donald Trump; the other half are fabricated using deepfake technology. You are asked to rate how confident you are that each snippet is real or fabricated.
    Available in English

  • Games

    Disinformation Diaries

    The Disinformation Diaries is a game-based media literacy tool to help you better understand how disinformation and deepfakes can interfere with democratic elections.
    Available in Albanian, English, French and Georgian

  • Games

    Dr Fake

    Dr. Fake has invaded the Media Literacy City! The player has to confront four of his companions – Mr. Deepfake, Mr. Troll, Mr. Clone and Mr. Phisher – and answer their questions correctly.
    Available in Armenian, Azeri, English and Georgian

  • Games

    Factitious 2020

    A game intended to help players learn how to identify fake news stories. It shows actual news articles without revealing their publication source until the player clicks to see it. Are they true or false?
    Available in English

  • Games

    Fajnie, że wiesz

    A game teaching resilience to disinformation based on true/false answers. Developed by the Polish NGO Demagog.
    Available in Polish

  • Games

    Fake It To Make It

    Simulation-style social-impact game. Players take on the role of someone creating and distributing fake news for a profit.
    Available in English and German

  • Games

    Fake News Bingo

    Bingo game on disinformation to print out.
    Available in German

  • Games


    The game teaches media literacy through differentiation and analysis of news.
    Available in English

  • Games

    Fakt oder Fake: Das Handysektor Fake News Quiz

    Spot disinformation on social media. Can you tell the difference between facts and fakes?
    Available in German