
Strife

The Academic Blog of the Department of War Studies, King's College London


Disinformation

Want to tackle disinformation? Stop using the same tactics.

March 2, 2021 by Sophia Rigby

By Sophia Rigby

General Sir Mark Carleton-Smith presents on how the British Army is adapting to new warfighting domains, including information operations (MOD, 2020)

Disinformation is nothing new. It seems to be a commonly held belief that disinformation is a new style of warfare and interference, put to perfect use in the 2014 Ukrainian Crisis, the 2016 US election, and the 2016 Brexit referendum. But disinformation has been used for centuries to spread malicious rumours and discredit rivals; what is new is the manner in which disinformation spreads and how quickly it can do so.

The advent of social media and technological advances means that we have a mass of information at our fingertips and expect to be able to find a concise answer to complex problems in seconds. Or 0.37 seconds, which is how long it took Google to find me results relating to the Internal Market Bill. However, unlike the encyclopaedias of old, few of these results come with verifiable and reliable evidence attached. Anyone can post on a blog or edit Wikipedia, and almost anyone can doctor a photograph or a video (to varying degrees of success), yet we do very little in the public sphere, especially in education, to teach people to evaluate sources of information and treat news critically.

The 2015 National Security Strategy and Strategic Defence and Security Review[1] failed to recognise disinformation as a significant threat to national security under its cyber section. But the recently published Russia Report[2] found that Russian disinformation was fomenting political extremism in the UK around Brexit and other divisive issues. This puts disinformation squarely in the domain of politics and national security, an area of life that for many people seems as remote from their daily lives as the countries in which the threats originate.

However, in the context of the growing anti-vax movement and alternative therapies for Covid-19, we can see how disinformation, coupled with public ignorance of the facts, is negatively impacting our everyday lives. Anti-vax and anti-lockdown conspiracy theorists have taken to the streets in European capitals (including London on 19 September) to protest against lockdown measures and the mandatory wearing of face masks, in attempts to discredit any future vaccine[3]. Anti-vax theories are gaining a greater following in the UK, but their impact can be seen most clearly in many American cities, which are experiencing an increase in cases of measles, mumps, and tuberculosis as vaccination levels decrease[4].

Despite the accumulated scientific evidence for the reliability of vaccines, not least the eradication in the UK of devastating diseases such as polio, and the discrediting of the scientists who first supported anti-vax theories, people are still inclined to believe some stranger on Facebook. This is made possible by disinformation methods that have become far more sophisticated: they appear in articles on websites and in videos on news sites, and rarely meet with vigorous debate. The anonymity of social media, and the courage (or bravado) it instils, means that reasonable voices are drowned out by those spouting vitriolic abuse at any dissent. Mainstream views are pushed out as extreme voices resort to threats and insults to press their point.

‘Knowledge is power’ (a phrase first written down in Thomas Hobbes’ political tome Leviathan in 1668) is perhaps not the most powerful argument, but how are we to make sure that the knowledge being distributed and circulated on social media networks is accurate? Firstly, and most importantly, we have to stop using the same tactics: the politician who purposely manipulates statistics to create a false impression of reality, the wordsmith who uses language to mask the truth, the politician who rebrands their party-political account to appear as an independent fact-checking organisation.

We know statistics can be manipulated, and it is done time and time again in debates on poverty. Relative poverty and absolute poverty are two different measures: relative poverty is set at 60% of median net household income in the year in question, so the threshold fluctuates from year to year, whereas absolute poverty is set at 60% of median net household income in 2010/11 and does not move over time. According to Institute for Fiscal Studies data[5], relative poverty rates have increased for children and for everyone overall, while for working-age non-parents and pensioners they have stayed fairly level. Absolute poverty rates, however, have decreased for pensioners and working-age non-parents, stayed fairly level overall, and increased for children. So the Government can claim to have reduced poverty and use statistics to back up that claim, the Opposition and charities can claim poverty has increased, and the public are none the wiser about the actual state of affairs.
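The divergence between the two measures can be illustrated with a minimal sketch. The income figures below are hypothetical, chosen only to show the mechanism, not real household income data:

```python
# Hypothetical base-year (2010/11) median weekly net household income.
BASE_YEAR_MEDIAN = 400.0


def relative_poverty_line(current_median: float) -> float:
    """Relative line: 60% of the current year's median income."""
    return 0.6 * current_median


def absolute_poverty_line() -> float:
    """Absolute line: 60% of the fixed 2010/11 median.

    (The official measure also adjusts for inflation; omitted here.)
    """
    return 0.6 * BASE_YEAR_MEDIAN


# As median incomes rise, the relative line rises with them while the
# absolute line stays fixed, so a household on an unchanged income can
# exit absolute poverty yet fall into relative poverty.
medians = {"2010/11": 400.0, "2015/16": 430.0, "2020/21": 460.0}
for year, median in medians.items():
    rel = relative_poverty_line(median)
    print(f"{year}: relative line {rel:.0f}, absolute line {absolute_poverty_line():.0f}")
```

With these invented figures, both thresholds start at the same level, but by 2020/21 the relative line sits well above the absolute one, which is exactly how two camps can cite "poverty" statistics pointing in opposite directions.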

Politicians will always use the best evidence to support their claims, and the opposition will always pull out another piece of evidence that seems to suggest otherwise; that’s just the way politics works. With elections and government at stake, it seems naïve as well as impossible to expect politicians to speak plainly and leave party-political rhetoric at the door. But journalists have a responsibility not just to support the politicians whose party their editor or paper backs, but to analyse claims and show their respective strengths and weaknesses. They also need to scrutinise the use of anonymous sources rather than treating them as factual. Without the opportunity to assess the reliability of sources, we are both failing to look critically at information and encouraging belief in faceless facts.

Ultimately, we need critical thinkers. Schools try to teach critical thinking through History and English Literature, but all subjects have a role to play in teaching us to look at the world more critically and analyse what we are being told. Maths is important in showing us how statistics can be manipulated; Science can show us the complex systems in place to develop vaccines, as well as the ethics of experimentation; Drama can teach us to look at the character behind the rhetoric and eloquent speeches. Above all, coursework and project work teach more than teamwork and presentation skills: they teach us how to research and balance competing claims, and how to look critically at who is writing and explaining, and what their motives are. This pedagogy is as important as the actual content, so that people learn to look past the emotive and sometimes shocking elements to the trustworthiness of the content.

We’ve seen the pernicious and deadly impact that disinformation can have on people’s lives. From the war in Ukraine to the Covid pandemic, disinformation is a threat to national security. But we are not taking it seriously and we are not taking adequate steps to tackle it. Social media platforms must be made responsible for the content on their sites, politicians must be made accountable for comments they make, “inside sources” must face greater scrutiny from journalists, and we must ensure that tackling disinformation is incorporated into the curriculum. National Service was used to prepare the nation when the threat of conventional war was present; education promoting critical thinking is our preparation for disinformation at present.

 

Sophia Rigby is a Doctoral Researcher in the Department of Defence Studies at King’s College London. Her research focuses on realist-constructivist theories of international relations and how they relate to Russian foreign policy in Europe. She holds a BA in Modern Languages and a Master’s focusing on Russia and Eastern Europe. Since graduating, she has worked in political strategy and communications.

Filed Under: Feature Tagged With: Covid, Disinformation, Fake News, Politics, theory

Feature — Winning the Disinformation War Against the West

May 12, 2019 by Andrzej Kozłowski

By Andrzej Kozłowski

The Ministry of Defence badge on a computer chip. Britain will build a dedicated capability to counter-attack in cyberspace and, if necessary, to strike in cyberspace. (Crown Copyright/Chris Roberts)

The rapid expansion of the Internet in the nineties encouraged the expectation among Western politicians and experts that liberal democracy would come to dominate the world and authoritarian regimes would slowly collapse. It was hoped that easy and fast access to uncensored information would strengthen civil society and opposition in authoritarian countries by empowering a free press and facilitating the planning and organisation of social and revolutionary movements that would overwhelm ruling governments. However, things took a different trajectory: Internet tools such as social media have become a double-edged sword, effectively employed against democratic countries to wreak informational havoc and spread propaganda that undermines democratic processes.

A more serious problem than we think

The key event that demonstrated the power of social media was the 2016 presidential election in the United States, when Russian hackers from the Main Directorate of the General Staff of the Armed Forces of the Russian Federation (GRU) and the Federal Security Service of the Russian Federation (FSB), along with trolls from the Internet Research Agency, engaged in a disinformation campaign to influence the outcome of the election. Their main aim, and their impact on Trump’s victory, are disputed. However, this incident showed the importance of the Internet and social media, and how easily public opinion can be manipulated by the newest technologies. Since then, policymakers and chiefs of intelligence and counterintelligence in NATO and EU countries have warned about the threat of external meddling in other elections in the West. Indeed, Russia attempted to meddle in the elections in France, the Netherlands and Germany, but did not achieve an outcome comparable to the American presidential election of 2016.

Not only has disinformation been used to influence elections; it has also been deployed to split societies by drawing attention to their most controversial issues. The waves of immigrants who came to Europe in recent years have divided the societies of Western countries, and this division has been deepened by fake stories of grave crimes committed by immigrants. Many people believed these stories and were angered by the alleged behaviour, leading some towards vengeance. Here, disinformation contributed to violent acts against immigrants, and it also increased distrust in the mainstream media and in politicians who seemed to have overlooked these events. Social media were also used to influence voter behaviour in important referendums in the United Kingdom and Spain, supporting Brexit and the secessionist movement in Catalonia respectively.

Last but not least, the anti-vaccination movement, strongly present on social media, poses a threat to the lives of citizens in the West. As a result, illnesses like measles, which had formerly been eliminated by vaccines, have reemerged. The latest research shows that this movement was not spontaneous but rather state-inspired and strongly promoted on the Internet.

Western institutions have identified Russia as the perpetrator of these campaigns, and facing up to the problem of disinformation has become one of the most crucial challenges. There is also a high probability that other countries could follow, or have already followed, in Russia’s footsteps. The West needs to prepare by building a resilient society, resistant to disinformation and propaganda and ready to deter potential foes.

Front page of European Commission’s “Final Report of the High Level Expert Group on Fake News and Online Disinformation” (European Commission)

Building an information-resilient society

Building an information-resilient society requires the close cooperation of four main entities: the government, civil society, social media platform owners, and the traditional media.

Governments

Despite the growing role of the private sector in cyberspace, the government ought to play a crucial role in initiating and coordinating actions to counter disinformation. First and foremost, the government needs to engage professionals in combating this phenomenon. Think tanks and non-profit organisations cannot resist information and psychological operations orchestrated by secret-service professionals, because they lack the financial resources, access to sensitive information, advanced early-warning systems, and staff numbers required. Some of these capabilities and tools, however, are in the hands of the counterintelligence agencies, which ought to assist such think tanks.

Moreover, the government needs to dominate the information sphere before elections and referendums. Constant warnings about potential interference and manipulation of public opinion ought to come from the heads of intelligence and counterintelligence. Some may claim that this would spread panic. Yet embedding a form of vigilance and awareness of potential disinformation, similar to that which occurs on, say, April Fools’ Day, is crucial and can strip potential assailants of their biggest advantage: surprise. The cases of France and Germany are particularly telling: before the elections in both countries, politicians and secret-service officers warned about potential manipulation, and during the elections themselves it was kept to a minimum.

The government also ought to prepare a clear legal framework to help social media enterprises eliminate detrimental content from their platforms. These laws ought to be effective and feasible, but also remain adaptable to technological reality and transparent, in order to avoid accusations of political bias. Internet users ought to be aware that they can be penalised for inappropriate behaviour, not for their political views.

The next task of the government is to prepare politicians and administrative staff for possible disinformation campaigns. This should be done on two levels: by organising training and exercises for politicians and civil servants on how to recognise disinformation on the Internet, and by ensuring that political parties are prepared, especially during election campaigns.

Governments should also not hesitate to ban certain media from official press conferences if it has been established that they act as propaganda instruments. For example, during the French election, Emmanuel Macron’s team banned Sputnik and Russia Today journalists, limiting those outlets’ freedom to spread disinformation.

The social media enterprises

Social media are used as tools to spread disinformation and influence democratic processes in many countries. This has become a significant problem for their executives, especially at Twitter and Facebook: after the 2016 presidential election in the United States, both companies came under considerable public criticism. In response, they invested heavily in eliminating fake content and accounts responsible for spreading disinformation. This policy should be continued in close cooperation with government entities, which should help by identifying hostile accounts. However, the decisions made by social media enterprises should be clearly explained, to avoid accusations of censorship.

Civil Society

The role of non-profit organisations should not be underestimated, but they ought not to play a central role in fighting disinformation. Instead, they ought to help government and social media enterprises identify propaganda and fake content, while their role remains advisory. Such organisations could effectively set up educational campaigns, teaching citizens how to avoid disinformation by fact-checking news and content on the Internet.

The Media

In the past, traditional media played the role of gatekeepers, filtering the flow of information and eliminating fake news. In the era of social media and direct access to information this role has changed, but traditional media still have a part to play. They ought to create dedicated roles in the editorial team to trace fake news and stories and expose them to the public, restoring their role as gatekeepers for the social media era. Furthermore, journalists are among the most followed figures on Twitter and Facebook and are often a source of news and information. If journalists spread fake news, intentionally or unintentionally, that fake news appears ever more credible. Media organisations should therefore run courses and training to raise awareness among journalists about appropriately sourcing information.

Last but not least, the government needs to coordinate the efforts of all entities engaged in fighting disinformation. If the government fails in this role, the system will not work as one cohesive whole; instead there will be a constellation of single, loosely related entities with overlapping tasks and a lack of resources.

Creating effective and reliable deterrence

Building a society resistant to disinformation is one part of an effective strategy against it. The remaining task is to deter potential agents of disinformation by establishing punishments. These penalties ought not to be limited to cyberspace, but may also include other measures, such as economic sanctions. In most cases it is difficult to respond to agents of disinformation with a proportional information campaign; the obstacle lies in the authoritarian nature of the aggressor.

Because the election process in authoritarian states is a mere formality, public opinion and society there cannot be effectively influenced through elections, so economic sanctions might be more effective in deterring such actions. Even so, and despite the authoritarian nature of the regime, online activity ought to be considered. Possible options include demonstrating to the home population the inherently corrupt nature of the regime, under which the average citizen lives in inadequate conditions. Another strategy follows the example of the Union of the Committees of Soldiers’ Mothers of Russia, a late-Soviet organisation that influenced the attitudes of Soviet society towards the Afghan war; a similar approach could be used to reveal the number of troops killed in the wars in Syria or Ukraine. Thirdly, a Russian version of the WikiLeaks service could be created to publish materials compromising the regime, hosted on a bulletproof website.

Economic sanctions are another powerful tool available to the West, and they have been surprisingly effective against Russia. Freezing oligarchs’ assets or introducing travel bans can hurt the closest circle of Moscow’s cronies and stop them from visiting their luxurious residences in Western Europe.

Another powerful way of punishing Russia for its aggressive behaviour in this domain would be to expel Russian diplomats from Western countries. At first glance this may look like standard retaliation in the international arena, but considering that in some countries, such as the UK, half of the Russian embassy staff have worked for the Russian intelligence services, expelling diplomats could effectively paralyse the Russian intelligence network.

Every kind of Russian interference in the Western infosphere should be met with one of these measures. This would also deter other countries from following Russia’s example. The West needs to demonstrate the willingness and determination to punish agents of disinformation who try to infiltrate its Internet sphere.

Conclusion and recommendations

The key to winning the disinformation war is, first and foremost, to treat it as an existential threat and a strategic priority. Significant financial resources therefore need to be invested in countering the problem. Success depends on the resilience of society and on reliable forms of deterrence. Both require effective cooperation among the government, traditional media, social media enterprises and civil society, with professional government agencies included in the fight against disinformation.

Effective cooperation among these entities allows us to create a warning system, which is crucial because opponents benefit from the element of surprise. Every user of the Internet, from government clerks to journalists, therefore has to be educated to raise awareness of information threats. There should be a transparent legal framework that helps eliminate disinformation from the public sphere without inviting accusations of political bias. However, building a resilient society is not enough; forms of deterrence are also required, consisting of a variety of measures extending beyond the information sphere.

Flagging certain media outlets as propaganda instruments and banning their journalists from attending press conferences is the next step.


Dr Andrzej Kozłowski is the editor-in-chief of CyberDefence24.pl, the biggest portal on cybersecurity and information warfare in Poland. Alongside his work as a journalist, he lectures at the University of Lodz, Collegium Civitas in Warsaw and the European Academy of Diplomacy (EAD). In 2016, Dr Kozłowski successfully defended his PhD dissertation, “The Security Policy of the United States in Cyberspace (1993-2012): A Comparative Analysis”. He is an expert at several Polish think tanks, including the Institute of Security and Strategy Foundation, the Warsaw Institute for Strategic Initiatives and the Casimir Pulaski Foundation.


Image source: Flickr

Filed Under: Blog Article, Feature Tagged With: Civil society, Disinformation, Facebook, Fake News, Hybrid warfare, Immigrants, Instagram, Russia, social media, Trump, Twitter, West
