If There’s No Sign Of Interference, It Must Be Fabricated

The RAND Corporation is setting new records in bias and partiality.

America’s political planners are well aware of the need for the timely release of specific information. Good timing is necessary both to influence public opinion and to formulate their own strategies, including engaging partners and satellites.

In this regard, the publications recently released by the RAND Corporation on the eve of the presidential election are revealing. One of the new reports deals with “Russian propaganda” and states that the authors experimentally tested the impact of both the propaganda itself and counter-interventions. Shortly before that, the RAND Corporation also published two reports on interference in US elections that send a clear message about Russia’s role in the process.

The authors of the report on propaganda write: “Our findings indicate that Russian content is particularly effective at achieving its goal of generating strong reactions along partisan lines, engendering stronger emotional responses than those brought on by real or false news and creating a starker partisan divide. Also of concern: Strongly positive emotional reactions to such social media content increase the chances that participants will self-report ‘liking’ and sharing it. These findings appear to be in line with our understanding of Russian goals and objectives for propaganda: to elicit strong reactions in partisans that, in turn, can facilitate the spread and potential influence of manipulated content. We also found that revealing the source of the Russian memes reduced the probability of a positive emotional response to content that aligned with a participant’s ideology. Compared with the emotional effects generated among participants for whom the source was hidden, participant willingness to engage with the content on Facebook by ‘liking’ or sharing material for which the source was exposed was weaker. In the overall sample, revealing the source reduced the likelihood that participants would ‘like’ pro-U.S. Russian content, but no other effects for ‘liking’ or sharing were statistically significant.”

The US target audience was broken down into “partisan left”, whose members usually read the New York Times, and “partisan right”, whose members watch Fox News and read conservative publications. Both groups typically reacted in some way to material associated with Russian memes.

However, the report contains something more interesting than the reference groups and the history of Cold War-era propaganda: how the experiment itself was conducted.

Participants aged 18 and over were targeted by ads on Facebook, resulting in 762,624 unique profiles being reached and 6,968 clicks being generated.

Short videos and memes were used, while the test itself included a mix of Russian propaganda (apparently as the authors themselves see it), false news headlines and factual content. In other words, wide-ranging information with a clear “Russian-made” tag was placed on an open social network using US taxpayers’ money. What’s more, many of the participants were unaware of the experiment. Could it be that previous “traces of Russian interference” were also experiments like the ones done as part of the RAND Corporation’s latest project?

When it comes to the study on interference in US elections, there are five key findings:

  1. Foreign interference in US politics has been a problem since the country was founded.
  2. Russian information efforts aim to provoke strong reactions and push people to extreme positions in order to reduce the chances of achieving consensus – the bedrock of US democracy.
  3. New technologies have made Russia’s information efforts easier compared with the USSR’s propaganda campaigns during the Cold War.
  4. Studies on how to defend against these efforts have focused on different units of analysis: some focus on the original content, others on how that content spreads within networks, and still others on protecting consumers.
  5. To respond to foreign interference, the authors recommend (a) taking a holistic approach that anticipates which groups of Americans may become targets, and (b) designing evidence-based preventive methods to protect them.

So, two findings relate exclusively to Russia! And not a word about interference by any other country. At the same time, the authors hypothesise that Russia uses reflexive control theory to manipulate people within the US.

Also of interest is the opinion of US authors who have identified 11 key areas of Russian propaganda:

  1. Tailored disinformation – based on pitting different groups against each other by identifying content and topics to which each targeted group might be most susceptible.
  2. Conspiracy theories – promoting or focusing on an issue, sowing distrust and spreading confusing information, rumours and leaks.
  3. Paid advertising – pushing people to like pages, follow accounts, take part in events, and visit websites.
  4. American asset development – focusing on reducing the likelihood of discovery by recruiting Americans to carry out tasks for handlers.
  5. Narrative laundering – shifting a narrative from its state-run origins to the wider media ecosystem through witting or unwitting participants.
  6. Hack and leak operations – getting hold of information illegally and sharing it through platforms such as WikiLeaks.
  7. False online personae – creating false personae, sometimes using information that belongs to real people, to hide real identities.
  8. Social media groups – exacerbating existing issues, gathering information, and recruiting for events by creating social media groups dedicated to divisive issues.
  9. Memes and symbols – using memes to create simple and easy-to-share fragments of information that could resonate emotionally with people.
  10. Secessionist support – undermining the US by establishing links with and supporting secessionist ideas and movements.
  11. Fringe movement support – building support for Russia’s values and society by establishing links with extremist groups.

Significantly, neither the names of such groups and organisations nor specific examples are provided.

A report on direct interference in the 2020 election states: “We found credible evidence of interference in the 2020 election on Twitter. This interference includes posts from troll accounts (fake personas spreading hyperpartisan themes) and superconnector accounts that appear designed to spread information. This interference effort intends to sow division and undermine confidence in American democracy. This interference serves Russia’s interests and matches Russia’s interference playbook. Our methods can help identify online interference by foreign adversaries, allowing for proactive measures.”

Although the title refers to the 2020 US election, the study uses work previously carried out for the UK Ministry of Defence, along with data from 2016. Evidently, the main clue now is the old story about the Internet Research Agency in Saint Petersburg, since no new data has been provided.

The report states that 630,000 Twitter accounts were analysed, but only 130 of these had high troll ratings. Can 0.02 per cent of the accounts analysed (130 out of 630,000) really be considered a factor in US election interference? Nor is there any evidence that even one of those accounts is associated with Russia. And in any case, if someone outside the US discusses the US election (from whatever position), is that really interference? Despite the beautiful infographics with clusters of electoral preferences and hashtags, such baseless claims just aren’t convincing. The list of sources also shows how weak the presented arguments are: it is made up of a small group of authors who share the same interests as the RAND Corporation’s own employees. And yet, in the near future, this report will be cited as a reliable and verified source, although it is not.

Of course, America’s problem is that it will regard any suggestion made by Russia to resolve the issue as some kind of trick, even if it is genuine. The US has mastered putting terrible labels on others but is unwilling to admit when it is doing something wrong itself.
