The Facebook perspective on information operations

May 16, 2017

Facebook recently put forward a paper recognising the existence of information operations on the popular social networking site. Much of its rather technical analysis confirms what we already know about pro-Kremlin disinformation.

The document acknowledges that governments or organised non-state actors take to Facebook to “distort domestic or foreign political sentiment, most frequently to achieve a strategic and/or geopolitical outcome.” There is more to information operations than intentionally false news, Facebook states. Such operations also include the deliberate spreading of disinformation on social media as well as outreach via traditional media.

Facebook says that information operations on their platform have three steps:

  • targeted collection of non-public data to expose information that can have an impact on public discourse;
  • creation of content reflecting this information, either by spreading stories to the press or by generating stories and images for online use;
  • coordinated activity by fake Facebook accounts to amplify content online, including the creation of fake grassroots groups as well as trolling, i.e. fake accounts spamming the comments sections on Facebook pages.

Sound familiar? Facebook states that it recognised these activity patterns multiple times during the 2016 US Presidential election campaign, though it asserts that the “reach of the content shared by false amplifiers was marginal compared to the overall volume of civic content during the US election”. This contrasts with the analysis made by Buzzfeed, which concluded that fake news received more engagement (comments, likes, shares) than news by established media in the final three months before the election in November 2016. When it comes to identifying the source of this information operation, Facebook says that “our data does not contradict the attribution provided by the US Director of National Intelligence”, referring to the assessment of Russian activities published in January 2017.

What does Facebook intend to do about the problem? As a first step, Facebook promises to help users protect their accounts better and recommends that election campaigns pay particular attention to cyber security. But the core of its response is action against fake accounts and the false amplification of compromised stories. The engineers have observed that “most false amplification in the context of information operations is not driven by automated processes but by coordinated people who are dedicated to operating inauthentic accounts”. This is because such online activity presupposes “people with language skills and a basic knowledge of the political situation in the target countries”. This recalls the reporting about a pro-Kremlin “troll factory” in St Petersburg, where employees are paid to leave comments in online fora and on social media. Facebook targets these accounts via the “authenticity of the accounts in question and their behaviours, not the content of the material created”, for example by detecting repeated posting of the same content and unusual increases in the volume of content creation. The organisation claims to have taken action against over 30,000 fake accounts in France. It also allows its users to report fake profiles.
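The two behavioural signals Facebook names, repeated posting of the same content and unusual spikes in posting volume, can be illustrated with a minimal sketch. This is not Facebook's actual system; the function name, thresholds and inputs are illustrative assumptions.

```python
from collections import Counter

def flag_suspicious(posts, daily_counts, dup_threshold=3, spike_factor=5):
    """Flag an account on two hypothetical signals:
    repeated identical content, and a sudden jump in posting volume.

    posts        -- list of post texts from one account
    daily_counts -- posts-per-day counts, most recent day last
    """
    reasons = []

    # Signal 1: repeated posting of the same content
    most_common = Counter(posts).most_common(1)
    if most_common and most_common[0][1] >= dup_threshold:
        reasons.append("repeated_content")

    # Signal 2: unusual increase in the volume of content creation,
    # measured against the account's own historical average
    if len(daily_counts) >= 2:
        history, latest = daily_counts[:-1], daily_counts[-1]
        baseline = sum(history) / len(history)
        if baseline > 0 and latest >= spike_factor * baseline:
            reasons.append("volume_spike")

    return reasons
```

For example, an account that posts the same text four times and jumps from roughly two posts a day to fifteen would trip both signals: `flag_suspicious(["Vote X"] * 4, [2, 1, 2, 15])` returns `["repeated_content", "volume_spike"]`. Note that both heuristics inspect behaviour only, matching Facebook's stated approach of judging accounts by authenticity and conduct rather than by the content's message.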

At the same time, the social media organisation says it supports wider efforts to promote quality journalism through its Facebook Journalism Project, which seems primarily to focus on helping quality news organisations increase their reach and revenues on the platform, and through collaboration with fact-checkers on doubtful Facebook content.

Read the full paper