• Russia’s election meddling toolkit consists of ten key methods that fall into four categories of interference: 1) information manipulation, 2) cyber disruption, 3) political grooming, and 4) extreme intervention.
• There is no universal template for the Kremlin’s election meddling operations – every case is different and involves a unique combination of methods based on the Kremlin’s objectives and the specific context and vulnerabilities of the target country or election process.
• Many of these methods overlap and complement each other, with the use of one indicating the likely use of another. For example, a phishing operation against a political campaign may suggest plans for a hack-and-leak operation. Social media influence campaigns typically involve both disinformation and sentiment amplification, sometimes combined with political advertising.
• Russian electoral interference is a long game: many of these methods are used, in varying degrees, far in advance of elections themselves, and only intensify during campaign periods. Influence efforts are now a persistent feature of our political landscape: for example, the Kremlin’s disinformation war against Europe shows no signs of abating, while reports of cyberattacks are ever more common.
• This long-term strategy of manipulation derives from the Soviet concept of “active measures”: a slow process of ideological subversion and psychological warfare that aims, over many years, to alter the target’s perception of reality and lead them to act in ways that benefit their opponent. The Kremlin’s current and ongoing influence efforts are a sophisticated adaptation of this strategy for the digital era.

 

 

CLASS OF METHOD: information manipulation

DEFINITION: The fabrication or deliberate distortion of news content aimed at deceiving an audience, polluting the information space to obscure fact-based reality, and manufacturing misleading narratives about key events or issues to manipulate public opinion. Disinformation is the most persistent and widespread form of the Kremlin’s interference efforts. Importantly, it is not limited to election cycles, but has become a viral feature of our information ecosystem.

OBJECTIVE: To paralyse the democratic process by fuelling social fragmentation and polarisation, sowing confusion and uncertainty about fact-based reality, and undermining trust in the integrity of democratic politics and institutions.

CASE REFERENCES: 2014 Ukrainian elections, 2016 Dutch referendum on the EU-Ukraine Association Agreement, 2016 Brexit referendum, 2016 US elections, 2017 Catalan independence referendum, 2017 German elections, 2017 French elections, 2018 Italian elections

 

CLASS OF METHOD: information manipulation

DEFINITION: Using a fake identity or non-attributable false-front account to purchase online political ads, primarily on social media sites, to propagate disinformation about certain political parties, candidates, issues, or public figures.

OBJECTIVE: To artificially inflate the popularity or unpopularity of certain political parties, candidates, issues, or public figures in order to influence an election outcome.

CASE REFERENCES: 2016 Brexit referendum, 2016 US elections, 2017 German elections

 

CLASS OF METHOD: information manipulation

DEFINITION: The use of fake accounts, trolls, and/or automated bots on social media and other online fora (e.g., the comments sections of newspapers) to spread disinformation and inflate the prominence of particular narratives. Sentiment amplification can occur both overtly (where the source is easily identifiable) and covertly (where the source is obscured or disguised to prevent correct attribution). Chain emails are another form of this method, commonly used in Central and Eastern Europe to target older, less digitally literate citizens.

OBJECTIVE: To increase the proliferation and visibility of disinformation and related tendentious narratives in order to fuel social fragmentation and polarisation, sow confusion and uncertainty about fact-based reality, and undermine trust in the integrity of democratic politics and institutions.

CASE REFERENCES: 2016 Brexit referendum, 2016 US elections, 2017 Catalan independence referendum, 2017 German elections, 2017 French elections, 2018 Italian elections

 

CLASS OF METHOD: information manipulation, cyber disruption

DEFINITION: The establishment of a fake online identity, either by an individual or a group, which is used for false-front interaction with target audiences. Identity falsification can take numerous forms, including the creation of fake social media accounts to spread disinformation or run political ads, or the impersonation of specific individuals to conduct a sophisticated spear-phishing operation.

OBJECTIVE: Different forms of identity falsification have different objectives. For instance, the purpose of creating a fake social media account may be to spread disinformation, organise an event, or incite a public reaction under the guise of an alternate identity in order to prevent attribution and create the illusion of authentic behaviour. The motive behind spear-phishing is the theft of user credentials in order to conduct a cyberattack.

CASE REFERENCES: 2016 Brexit referendum, 2016 US elections, 2017 French elections, 2017 German elections, 2018 Italian elections

 

CLASS OF METHOD: information manipulation, cyber disruption

DEFINITION: The theft of emails or documents through hacking or phishing operations, followed by their strategic public release, typically via proxy to prevent attribution. The stolen documents may be altered (or additional ones fabricated) to manufacture greater controversy and increase negative perceptions of the target.

OBJECTIVE: To expose, disgrace, or otherwise undermine a particular individual, campaign, or organisation in order to influence public opinion during an election cycle.

CASE REFERENCES: 2014 Ukrainian elections, 2016 US elections, 2017 French elections, Germany (2018)

 

CLASS OF METHOD: cyber disruption

DEFINITION: Hacking operations against state institutions or publicly influential organisations such as think tanks, NGOs, and media organisations.

OBJECTIVE: To collect intelligence about these institutions’ activities and research, to identify vulnerabilities for future exploitation, and to lay the groundwork for potential hack-and-leak operations.

CASE REFERENCES: 2016 US elections (both before and after the vote), 2018 Italian elections, 2018 US elections, 2019 European Parliament elections

 

CLASS OF METHOD: cyber disruption

DEFINITION: Infrastructure attacks encompass a variety of specific cyber tactics. Broadly, they involve any attempt to penetrate a country’s electronic voting system, voter databases, or related IT networks. Specifically, these tactics may include distributed denial-of-service (DDoS) attacks, hacking of voter databases (either to gather information or to modify data), and manipulation of electronic vote transmission or vote counts in order to alter the election results.

OBJECTIVE: Motives for attacks on electoral infrastructure vary. They may include collecting data for reconnaissance purposes, identifying vulnerabilities for future exploitation, distorting data (e.g., altering voter databases or manipulating votes), or disrupting the functionality of key IT systems or networks in order to weaken a particular party or candidate, or to undermine the legitimacy of the election results more broadly.

CASE REFERENCES: 2014 Ukrainian elections, 2015 German Bundestag attack, 2016 US elections, 2017 German elections

 

CLASS OF METHOD: political grooming

DEFINITION: The cultivation of favourable relationships with key public- and private-sector elites. This relationship-building may take a number of forms, including business or trade incentives, academic and institutional influence via pro-Kremlin expert networks, “cooperation agreements” between political parties (the ruling United Russia party has several such agreements with European parties), and the use of individual operatives to infiltrate target circles (e.g., the case of Maria Butina in the US).

OBJECTIVE: To influence national decision-making and public opinion in the target country.

 

CLASS OF METHOD: political grooming

DEFINITION: The overt or covert provision of funding to a particular party or election campaign, typically through a proxy institution without direct links to the Kremlin.

OBJECTIVE: To support and increase the chances of electoral success for a given party or candidate whose platform is judged to benefit the Kremlin’s agenda.

 

CLASS OF METHOD: extreme intervention

DEFINITION: The use of hard power to intervene in a country’s political developments and democratic process, typically via overt or covert military action embedded within a broader hybrid framework that violates the target country’s territorial sovereignty.

OBJECTIVE: To directly alter the course of political developments in a target country, typically when other influence efforts have failed to yield the desired results.

CASE REFERENCES: Georgia (2008, ongoing), Ukraine (2014, ongoing: annexation of Crimea, invasion of eastern Ukraine), Montenegro (2016 coup plot)

 

 

When evaluating and responding to a case of foreign election meddling, there are three levels of state involvement to consider: was the interference state-directed, state-sanctioned, or state-aligned?

STATE-DIRECTED INTERFERENCE: interference activities that have either been financed or directly carried out by the government or other state institutions (such as the military or intelligence services).

STATE-SANCTIONED INTERFERENCE: interference activities that are informally sanctioned or encouraged by the government or state organs, but not financed or directly carried out by the state.

STATE-ALIGNED INTERFERENCE: interference activities that are carried out by non-state actors, without any apparent coordination with a foreign government, in support of that foreign government’s agenda (e.g., independent hackers/hacktivists or homegrown disinformation sites that regurgitate pro-Kremlin narratives).

Distinguishing between these levels of state involvement in a given election meddling operation is vital for developing an effective response strategy. State-directed and state-sanctioned interference efforts require the most decisive action to penalise the offending state and deter future interference.

 

“Russian Election Meddling in the US and Beyond”. (2018). EUvsDisinfo.

“Election Interference in the Digital Age: Building Resilience to Cyber-Enabled Threats”. (2018). European Political Strategy Centre of the European Commission.

Galante, L. and Ee, S. (2018). “Defining Russian Election Interference: An Analysis of Select 2014 to 2018 Cyber Enabled Incidents”. Scowcroft Center for Strategy and Security, Atlantic Council.

Laurinavičius, M. (2018). “A Guide to the Russian Tool Box of Election Meddling”. International Elections Study Center.

Brattberg, E. and Maurer, T. (2018). “Russian Election Interference: Europe’s Counter to Fake News and Cyber Attacks”. Carnegie Endowment for International Peace.

Parks, M. (2018). “5 Ways Election Interference Could (and Probably Will) Worsen in 2018 and Beyond”. National Public Radio.

Greenberg, A. (2017). “Everything We Know About Russia’s Election-Hacking Playbook”. Wired.

