
The transformation of propaganda: The continuities and discontinuities of information operations, from Soviet to Russian active measures



Introduction

The current surge in viral disinformation, fake news, and trolling is often referred to as “propaganda”, implying a direct continuation of the practices from the mass communication–dominated media ecology of the twentieth century to the world of Web 2.0. However, to what extent is this justified? Problematising this assumed continuity, in this article, we compare practices and techniques of Cold War propaganda with contemporary computational propaganda, with a particular focus on the Soviet and then Russian effort in the Nordic context (especially in Sweden). Russia under Vladimir Putin has not only consolidated and taken control of its internal media system (Lipman, 2014), but also actively intervenes in other countries’ media spheres, especially during elections (e.g., the 2016 American election and the UK’s Brexit referendum). This surge of activity since 2014 has not escaped scholarly attention (Bolin et al., 2016; Horbyk, 2015; Horbyk et al., 2021; Kragh & Åsberg, 2017), and it has further intensified with the Russian military invasion and all-out war against Ukraine since 24 February 2022. Therefore, the need to study exactly how Soviet practices continue in post-Soviet Russia is urgent, especially against the background of the difficulty of grasping and adequately representing what has really been happening in Russia, a difficulty apparent from the shock and unpreparedness experienced by both elites and broader international publics at the invasion’s beginning. Moreover, there has been limited effort to study computational propaganda empirically, let alone compare it with “traditional” Cold War propaganda.

In this study, we focus explicitly on the comparative perspective. Our aim is to analyse the continuities and discontinuities of the Soviet model of propaganda from the Cold War era in present-day Russian propaganda. To do so, we have chosen two cases: propaganda manuals [metodichki], published by the KGB for internal use, and a particular subset of texts identified as propagandistic and originating in contemporary Russia: the Secondary Infektion campaign. By comparing Soviet instructions with present-day practices (as we lack access to comparable contemporary documents), we can trace how the Soviet practitioners’ instructions and recommendations on propaganda techniques may be actualised in contemporary Kremlin-linked influence efforts.

This focus entails the following research questions: 1) How were propaganda practices prescribed in the Cold War KGB manuals? 2) To what extent can these practices be identified in specific contemporary Russian efforts described as coordinated propaganda campaigns? 3) What are the differences and how can they be explained?

We argue that, although the Soviet propaganda model was not static, continuities prevail, including the reliance on disinformation, the combination of propaganda with operative work, the integration of “open” and “closed” channels, and the use of techniques such as forgeries, staged leaks, “planted inferences”, and “source laundering”. At the same time, some discontinuities are also present, owing to the transformation of the Soviet model in today’s Russia. Furthermore, we link the Secondary Infektion campaign, Russia Today (a state-funded and state-controlled international news television network), and the Swedish edition of the Russian-owned foreign media outlet Sputnik.

What is propaganda? The conceptual foundation

Propaganda has been a subject of academic scrutiny since World War I, when mass influence techniques were systematically used (Lasswell, 1927/2015). A standard definition describes propaganda as “the deliberate, systematic attempt to shape perceptions, manipulate cognitions, and direct behaviour to achieve a response that furthers the desired intent of the propagandist” (Jowett & O’Donnell, 2018: 6). Although this definition relies on parameters that are difficult to determine and that often become accessible only in retrospect, such as intent or deliberate character, it facilitates the systematic categorisation of our material.

We acknowledge that there are many alternative definitions pointing to the complexity of the phenomenon, for instance regarding its organisation: Virtually all definitions underscore its systematic and centralised character, which would logically exclude amplified “participatory propaganda”. Even organisationally, propaganda is no longer merely a state effort run from government quarters. It is often dispersed and commercialised (see Bolin & Ståhlberg, 2023), to a large extent outsourced to privately owned organisations that make money from governmental propaganda commissions or to “people who claim to be Internet entrepreneurs” (Haigh & Haigh, 2020: 309). Such is the case of Yevgeni Prigozhyn, the owner of both PMC Wagner and the Internet Research Agency “troll farm”. Whereas these examples demonstrate the complexity of the conceptual debate, we bracket it for now and rely instead on the pragmatic working definition of propaganda, which helps us analyse our empirical cases and, in turn, opens prospects for refining the definition.

What do we really know about the Soviet roots of Russian propaganda?

Both Soviet and contemporary Russian propaganda have been studied from different perspectives. However, since our focus is explicitly on the continuities and discontinuities, we situate our study among previous research that explicitly addresses that problem. The question of continuities and discontinuities between Soviet and Russian propaganda is not entirely new, though it certainly lacks a definite answer and a thorough systematic examination. Still, different aspects have been noted.

According to the researchers who studied the 2014–2016 Russian “active measures” campaign in Sweden, “the continuity with Soviet mass communication themes is very strong” (Kragh & Åsberg, 2017: 782). Other scholars, proposing “the firehose of falsehood” model of the current Russian propaganda, suggested that it “builds on Soviet Cold War–era techniques […]. In other ways, it is completely new” (Paul & Matthews, 2016: 1). Conversely, Sanovich (2017: 5) proposed that “the digital elements of the Russian strategy […] had little Soviet foundation to build upon”. Given that previous work offers only general reflection on the role of media change in the evolution of propaganda techniques (Jowett & O’Donnell, 2018), with no empirical grounding and no explicit focus on the KGB manuals for information operations, there is an identifiable gap in the empirical research on how Soviet propaganda evolved: Three major studies disagree on the continuity from the Cold War to the present day, taking positions of strong, mixed, and little use of the Soviet legacy, respectively.

In a systematic and grounded effort, Fedchenko (2016) analysed 500 items of disinformation debunked by the Kyiv-based StopFake fact-checking agency and found that Russian disinformation directly continues the practices of active measures as first conceived in the 1950s, with the key innovation of seizing, “in a parasitic way”, Western liberal values and critical doubt, and of “a postmodernist denial of everything” (Fedchenko, 2016: 146). Eighteen major frames were identified in contemporary Russian propaganda, many resembling Soviet rhetoric from the Cold War. These frames include Ukraine as an American conspiracy, the fascism narrative, and an obsession with biolaboratories. Direct continuities have been noted by Fedchenko not only in themes or methods, but even in the individuals involved: Putin’s deputy chief of staff Alexei Gromov, who started his career as a Soviet diplomat (or more than a diplomat?) and who has held this position since 2012, is named the coordinator of mainstream media censorship in Russia (Fedchenko, 2016: 150). The Soviet mouthpiece on foreign policy, Valentin Zorin, contributed to developing the Russian takes on Ukraine before he died in 2016. In one case, a Western propagandist has been inherited: Michael Opperskalski was the founder of the “front” magazine Geheim in 1980s Germany and is now a contributor to Russia Today. Under Gorbachev, the apparatus was strengthened, and departments were given new names. As a result, Fedchenko argued, the current Russian forgery output is in fact far greater than in Soviet times.

Haigh and Haigh (2020) also located continuities between Cold War forgeries and present-day fake news, admitting, however, differences such as the rise of bots, trolls, and “peer-to-peer propaganda”. Modern fake news, they found, has been more amateurish but still effective due to social media’s levelling effect: On a smartphone screen, a spurious article looks the same as bona fide journalism.

More recently, Samoilenko and Karnysheva (2020) traced the origins of Soviet propaganda to Marxism–Leninism. However, they focused on only one tool (character assassination) and presented only a cursory review, while conveniently omitting that “character assassination” à la soviétique often involved actual assassinations of the characters, such as Ukrainian nationalist leaders Lev Rebet (in 1957) and Stepan Bandera (in 1959). The authors noted similarities with present-day Russia, but they admitted that “further comprehensive enquiry is needed to assess the degree to which the traditions of Soviet propaganda determine strategic and tactical governance decisions in contemporary Russia” (Samoilenko & Karnysheva, 2020: 201). This article is a first step towards answering this call.

Material and methods

In 2018, a total of eight 1970s KGB manuals, including some on propaganda, were published by The Interpreter (2018). This is just the tip of the iceberg, and many more manuals and textbooks are accessible in the former KGB archives in Kyiv (presently the SBU archive, particularly in files 17 and 24; and the former Communist Party archive, TsDAHOU, particularly in files 287 and 312). Our search in the Ukrainian archives revealed that the KGB of the Ukrainian SSR had a special library for the “extracurricular” study of its staff. It holds formerly classified textbooks, research, and manuals on how to conduct intelligence, counterintelligence, and information operations. These manuals have not been studied before.

In this article, the content of the KGB textbooks is analysed and mapped using qualitative content analysis (Schreier, 2012). We first devised a coding frame containing the categories based on Jowett and O’Donnell’s (2018: 268) ten divisions for analysis of propaganda (modified to eight to exclude the divisions irrelevant to our material). This enabled us to code the manuals and outline a normative model of Soviet propaganda as part of active measures. The manuals (published in 50–100 copies by the KGB for internal use and classified as “Secret”) include The Use of Opportunities Presented by the Soviet Committee on Cultural Ties with Compatriots Abroad in Intelligence Work by Colonels A. A. Fabrichnikov and I. A. Ovchinnikov (1968), the anonymous Principal Directions and Objects in Overseas Intelligence Work (1970), and Political Intelligence from the USSR Territory by General Major V. M. Vladimirov and Colonel Yu. A. Bondarenko (1989). One author whose identity we were able to decipher is General Major Viktor Vladimirov (1922–1995), who had an especially profound knowledge of the Nordic context, being a long-time spy in Finland (1955–1959; 1970–1972) and the coordinator of the Soviet network in Finland (1974–1984), each time under the legend of a Soviet embassy employee.

In the second stage, a case study of the information influence operation commonly known as Operation Secondary Infektion (EUvsDisinfo, 2019), mainly active in 2014–2020, was carried out on the database associated with the case (containing over 2,500 documents, including a subsample of content particularly aimed at Sweden and other Nordic countries). The database, published by Graphika (2020), is publicly available and consists of a spreadsheet with headlines and links to specific stories. We sampled materials related to the Nordic countries and used the links to access the original Secondary Infektion (SI) materials. The focus on the Nordic countries is motivated by the fact that Sweden (which dominates the SI Nordic materials) was clearly one of the campaign’s core targets, alongside Germany, the US, and the UK, much as in Soviet influence campaigns, which makes it a strong case for investigating continuities and discontinuities. Narrowing the sample to one region, and mostly to one country, allows us to study it closely in a qualitative framework.
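
For transparency about this sampling step, the following minimal Python sketch shows one way a first-pass keyword filter over such a spreadsheet could be implemented before manual review; the file name and the column names ("headline", "language") are hypothetical placeholders, not the actual fields of the Graphika file.

```python
import csv

# Hypothetical keyword list; manual review still decides final inclusion.
NORDIC_TERMS = [
    "sweden", "swedish", "denmark", "danish", "norway", "norwegian",
    "finland", "finnish", "iceland", "nordic", "scandinavia",
]

def looks_nordic(row: dict) -> bool:
    """First-pass check: does the headline or language field hint at a Nordic focus?"""
    text = (row.get("headline", "") + " " + row.get("language", "")).lower()
    return any(term in text for term in NORDIC_TERMS)

# "secondary_infektion.csv" stands in for the published spreadsheet.
with open("secondary_infektion.csv", newline="", encoding="utf-8") as f:
    candidates = [row for row in csv.DictReader(f) if looks_nordic(row)]

print(f"{len(candidates)} candidate items flagged for manual review")
```

Any such filter only narrows the pool; the decisive criteria (published in a Nordic country, in a Nordic language, or addressing a Nordic country) were applied manually.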

The SI database has not been studied academically (although Graphika and DFRLab have made solid expert contributions), let alone in comparison with Soviet propaganda. Our article thus makes a substantial new contribution by introducing to propaganda studies the KGB manuals, the SI campaign, and a comparative analysis of the two.

We applied the same coding framework based on Jowett and O’Donnell’s divisions of propaganda analysis, at this stage augmented with the KGB propaganda model we had reconstructed. Using the subcategories we had provisionally formulated, we applied the framework to the 155 items of the SI campaign that dealt with Nordic countries, that is, items either published in a Nordic country, published in a Nordic language, or addressing a Nordic country; alongside the core focus on Sweden, Denmark also appears at the campaign’s margin. This enabled us to compare how propaganda was prescribed and carried out during the Cold War with the extent to which the computational propaganda within the SI campaign reflects the same principles, perceptions, and prescriptions.
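
As a purely schematic illustration (not our actual coding instrument), a coded record for one item could be represented along the eight divisions as follows; the field names simply mirror the adapted Jowett and O’Donnell categories, and the example values are invented.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CodedItem:
    """One campaign item coded against the eight analytic divisions."""
    item_id: str
    ideology_and_purpose: List[str] = field(default_factory=list)
    context: List[str] = field(default_factory=list)
    propagandist: List[str] = field(default_factory=list)
    organisation_structure: List[str] = field(default_factory=list)
    target_audience: List[str] = field(default_factory=list)
    media_utilisation: List[str] = field(default_factory=list)
    special_techniques: List[str] = field(default_factory=list)
    audience_reaction: List[str] = field(default_factory=list)

# Invented example of how a single item might be coded.
example = CodedItem(
    item_id="SI-NORDIC-001",
    ideology_and_purpose=["zero-sum realism", "discredit Ukraine"],
    special_techniques=["forged document", "source laundering"],
)
print(example.special_techniques)
```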

The items were published between 5 February 2015 and 3 December 2018, most intensively during February–May 2016 (clustering thematically around Eurovision, Sweden’s integration and security policies, and relations with Poland and Ukraine) as well as in August–December 2018 (prompted by the Skripal poisoning and the Swedish election). There was a hiatus in the “Nordic” campaign between May 2016 and June 2018. The analysed items come in seven languages: Swedish, English, German, Russian, French, Spanish, and Ukrainian. Swedish was the only Scandinavian language used by the campaign.
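
The two waves of activity and the hiatus between them can be made visible simply by counting sampled items per month; the following is a minimal pandas sketch, assuming the sampled items have been saved with a parseable date column (a hypothetical layout, not the original Graphika schema).

```python
import pandas as pd

# "nordic_sample.csv" and its "date" column are assumed for illustration.
items = pd.read_csv("nordic_sample.csv", parse_dates=["date"])

# Count items per calendar month; the non-empty months should cluster
# around February-May 2016 and August-December 2018, with a gap between.
per_month = items.set_index("date").resample("MS").size()
print(per_month[per_month > 0])
```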

We identified twenty unique stories in the Nordic sample, three of which are related to Denmark (totalling 21 items), one to all Nordic countries (promoting the idea of “a Nordic republic” as a counterweight to NATO and the EU; 4 items), and the rest focused on or were directed at Sweden (130 items based on 16 unique stories).

We are aware of the differences between the two subsamples (historical manuals versus present-day propaganda practices). While we work on preparing relevant and comparable samples from each era for future research, our research questions here specifically concern the comparison between historical instructions (as expressions of a normative, theoretical model) and present-day practices.

The Soviet model of propaganda

Russian propaganda has roots older than the Soviet era, with similarities between the asymmetrical disarmament proposals made by Tsar Nicholas II and those of the Bolsheviks (Barghoorn, 1964: 103). Soviet and Russian skills in active measures and disinformation extend back to Tsarist times, with the traditions of the CheKa and the Okhrana traceable to the nineteenth century (Brantly, 2021: 27). Political surveillance in Muscovy and the Russian Empire was omnipresent, establishing a well-developed system of policing with isolation of foreigners and severe punishments. Testimonies of the Marquis de Custine in 1839 (Van Herpen, 2016: 3) and Paul of Aleppo in 1654 (Paul of Aleppo, 1836: 258–270) suggest long traditions behind active measures, even though the term itself was formulated only in the twentieth century.

Our focus is, however, on Soviet and post-Soviet security institutions, starting with the CheKa and culminating with the KGB, superseded in post-Soviet Russia by the SVR (Foreign Intelligence Service), the FSB (Federal Security Service), and the GRU (Chief Intelligence Directorate). The Bolsheviks’ consolidation of the means of communication in 1918 allowed them to develop a unique model of political communication that combined Marxist ideology, Lenin’s (1902/1961) doctrine on the press, and the experience of both the tsarist police and its nemesis: the criminal Bolshevik underground. The model’s emphasis was on organising, and the Department for Agitation and Propaganda was established within the CPSU’s Central Committee in the early 1920s (Samoilenko & Karnysheva, 2020: 193). The concept of “active measures”, however, was developed only after Stalin’s death to “include disinformation, propaganda, forgery and in some cases assassination” (Brantly, 2021: 27). The agency in charge was Department D of the KGB First Chief Directorate (founded in 1959 and later renamed Department A), which was already producing forgeries and disinformation by 1961 and employed about 300 staff by 1983. Active measures involved methods such as forged documents, statements from Soviet-sponsored nongovernmental organisations, and the dissemination of false or misleading reports in the foreign press to expand control and stimulate anti-Western movements, divide opponents, and disrupt their alliances (Brantly, 2021: 33).

One of the most thorough scholars of Soviet propaganda, Barghoorn (1964), defined all “Soviet political communication” as propaganda, with two tendencies observed since Lenin’s time: 1) the messianistic promotion of Marxism–Leninism and 2) the “expedient” manipulative element, appealing to a broader non-Marxist audience. Barghoorn grouped the techniques used in Soviet propaganda around the categories of organisational structures, campaign types, rhetorical stances, and argumentation. A classic example of their application is Operation Infektion: On 13 July 1983, Soviet operatives published an article in the Indian newspaper Patriot, alleging that HIV had been created in a secret American laboratory. This allegation, via a reference in a Soviet newspaper, was then included in a report by East German scholars at a conference in Zimbabwe (Brantly, 2021: 33–34). The case is thus a classic example of “information laundering” (Carrasco Rodríguez, 2020), distancing false information from its actual source to make it more believable, which we propose calling “source laundering”. Similar cases, such as allegations of children’s organ trafficking by the US, demonstrate the lasting impact of Soviet propaganda on public opinion in Russia and beyond (Brantly, 2021: 33–34).

How does this compare with prescriptions in the KGB propaganda manuals available today? To make our analysis systematic, we used the analytical schema devised by Jowett and O’Donnell:

The ideology and purpose of the propaganda campaign

The context in which the propaganda occurs

Identification of the propagandist

The structure of the propaganda organization

The target audience

Media utilisation techniques

Special techniques to maximize effect

Audience reaction to various techniques

(Jowett & O’Donnell, 2018: 270)

Ideology and purpose

The ideology of Soviet propaganda is clearly Marxist–Leninist, as documented in previous research. However, one clear yet often ignored aspect is the realist perspective on international politics as a zero-sum game: “Weakening the capitalist countries and strengthening the international position of the countries of the socialist commonwealth and progressive forces in the world” (Anonymous, 1970: 11). Furthermore, propaganda was not treated separately, but in combination with other methods such as active measures, initially defined as “offensive actions of the intelligence agencies of socialist states in all areas of their activities”, and later, simply as “all-sided support for bringing about the foreign policy objectives of the Soviet government” (Vladimirov & Bondarenko, 1989: 86).

The local context

The manuals recommend planning and undertaking any actions only after a thorough study of the local situation, providing examples of how an operation can be disrupted if carried out without proper knowledge. The local context’s role is obvious in the importance of tip-offs from the KGB operatives in their home country (Vladimirov & Bondarenko, 1989: 28). Other information is gathered from media monitoring and attending conferences and events.

Identification of the propagandist

Propaganda as part of active measures was carried out by KGB personnel but also by recruited agents and trusted persons (engaged through informal ties). Sometimes, locals could be used to spread the (dis)information unknowingly. By 1989, Service A was in charge of active measures and coordinated the work of other departments involved, such as the First Chief Directorate (PGU) and the Homeland Foreign Intelligence Department (the RT; Vladimirov & Bondarenko, 1989: 86). These departments then used legal Soviet organisations as a platform for their overseas work, such as the Soviet Committee on Cultural Relations with Compatriots Abroad. Therefore, the official position of the propagandist was crucial, as it defined his ability to carry out propaganda under the “flag” of his institution. It was also essential to be able to use the cover of “fake” organisations. A 1971 manual openly dealt with podstava [a trap], sometimes created through establishing “a fake [fiktivnyi] agency or a publisher” (Anonymous, 1971: 15–16).

The ideal propagandist is described as having a strong character, being adaptable to any situation, and being able to convince people, subordinate them to his influence, and lead them. He must be unconditionally loyal and disciplined, and follow the Soviet secret services’ instructions. Although the gender aspect appears interesting for further research, it is obvious that the main stake was placed on men. Soviet and foreign researchers and academics are particularly identified as a resource for intelligence, active measures, and propaganda.

The structure of the propaganda organisation

Complex active measures are recommended as a combination of different methods and forms united by a single objective. Such campaigns should start with a specific problem definition which leads to the formulation of the aim and a selection of relevant methods. When it comes to information and media, the work is organised in what is called “the closed channel” (where the KGB acts under a false flag) and “the open channel” (where legal institutions work openly under their name). Soviet organisations and media can be incorporated into the KGB campaigns: “Part of the institutions, according to their functions, work on external propaganda on the same problems where the KGB carries out active measures” (Vladimirov & Bondarenko, 1989: 94). They are used in the KGB campaigns, yet the KGB does not repeat what they do; rather, it amplifies and extends the message through its closed channel.

The target audience

The 1989 manual defined the US, its allies, and China as the key directions of interest. Recruiters were advised to target students, researchers, media professionals, members of the military, and officials who had spent time in the Soviet Union, not only because they could be a source of information, but also because they could act as agents of influence in their home countries. Additionally, the focus was often on people linked with the Soviet Union (resembling the aims of the Okhrana). As of 1968, according to KGB estimates, there were twelve million natives of the USSR abroad, most of whom lived in capitalist countries and did not participate in political life. They were to be the basis for the formation of a fifth column.

Even in 1971, active measures, compromising materials, and disinformation were proposed as key methods against the anti-Soviet émigrés, while the pro-Soviet groups and individuals had to apply active measures targeting the host countries more broadly (Fabrichnikov & Ovchinnikov, 1968: 40–41). Teachers with Soviet roots at Harvard University, Columbia University, and what is described simply as “Massachusetts University” were approached as gatekeepers to alumni appointed to governmental and security jobs (Fabrichnikov & Ovchinnikov, 1968: 38).

The political emigration from Ukraine and the Baltic republics (Lithuania, Latvia, and Estonia) was especially active. Neutralising their influence through discreditation was among the tasks of Soviet propaganda. Émigré centres in the US, Canada, and Latin America were of primary interest; the KGB identified 150 relatively large anti-Soviet émigré centres globally that published a total of 78 newspapers and magazines. The manuals promote using the propaganda capabilities of the so-called progressive, pro-Soviet émigré organisations to neutralise the anti-Soviet diaspora centres. Their total number in the West was estimated to be smaller, only 55, printing a total of 27 publications. The instructions maintain that the solution to propaganda problems depended on the correlation of these two “camps” in each country.

Media utilisation techniques

There is no notion of an independent press in the KGB documents; when foreign media are discussed, the concepts used are “bourgeois propaganda” or “the press and propaganda”. The core tenet is that there can be no independent media, only propaganda, which perfectly aligns with Lenin’s original press theory. In practice, the manuals cite cases where the Soviets managed to establish new influential organisations and parties by starting seemingly independent, critical newspapers that exposed corruption but were fully subsidised and controlled by the KGB.

The Soviet Committee for Relations with Compatriots Abroad organised radio broadcasting and exported many media materials, such as books, newspapers, magazines, brochures, and so on. Likewise, cultural products, such as exported Soviet music and films, served as tools of propaganda, as did visits by artists.

The authors characterised press conferences as “a sharp measure” and showed a great deal of media awareness, praising the authenticity of voice as a medium:

The advantage of this form of active measures is in the urgency of using the openings it provides. Besides, press-conferences are an ordinary matter for people living in capitalist countries and are considered there a sign of “democracy”, the statements by press conference participants are trusted especially if it is taped and broadcast on the radio and the listeners recognize a familiar voice. (Fabrichnikov & Ovchinnikov, 1968: 74)

As an example, the high-profile defection of Kazymyr Dzhuhalo, the treasurer of the Organization of Ukrainian Nationalists (OUN), was amplified by a press conference that was taped, broadcast, and published in the Committee media as well as provided to the Western media. The support campaign included a series of articles and a book, Behind the Curtains of the OUN Puppet Show. As a result, the reputation of the OUN suffered, and it lost the exposed members (Fabrichnikov & Ovchinnikov, 1968: 75).

Within the closed channel, directed information stood in contrast to disinformation (Vladimirov & Bondarenko, 1989: 82–83). Directed information can be understood as often true, though possibly incomplete or distorted, information that is delivered to a specific person, institution, or group with a specific purpose, in expectation that it will prompt a desirable action.

Disinformation, by contrast, was defined as “covert propagation [prodvizhenie] to the adversary of invented [vymyshlennykh] information, specially prepared documents and materials in order to deceive him and motivate to such decisions and actions that would be meeting the interests of the Soviet state” (Vladimirov & Bondarenko, 1989: 86–87). The conceptual development is evident when compared with an earlier, more general definition of disinformation as “measures elaborated in advance and aimed at misleading the adversary on certain issues in a favourable direction for socialist countries” (Anonymous, 1970: 74). There is also a notable evolution from a more limited understanding and use of disinformation targeting small groups of decision-makers (1960s–1970s) towards disinformation directed at entire societies (late 1980s). Disinformation was aimed at encouraging division between and within societies, disrupting their efforts against the Soviet Union, and influencing decisions (an overlooked aspect is the use of economic disinformation to procure lucrative trade deals). Contacts and trips could also be used to spread disinformation.

Disinformation is another major opportunity, even for the Soviet Committee:

For example, it is well known that the adversary’s special agencies and anti-Soviet émigré centers in their service very closely analyze articles in the newspapers published by the Soviet Committee and the content of its radio broadcasts. The adversary is aware that these newspapers publish information on life in the USSR more easily than the central press. They readily answer requests by the emigres to tell about this or that area in the Soviet Union. This creates rather favorable conditions so as to conduct disinformation measures through the Soviet Committee regarding the obscuring of the location of crucial installations on the USSR territory, distracting the adversary’s attention etc. (Fabrichnikov & Ovchinnikov, 1968: 65–66)

On a par with disinformation, active measures included exposing the crimes and secret plans of the foe to compromise and undermine the enemy state, political, and cultural institutions, particularly the émigré centres. Delivery methods could include virtually all available channels: personal conversations with decision-makers; propagation of directed information and disinformation; providing state, political, and public leaders with curated documents; publishing newspaper articles, books, brochures, and leaflets by foreign authors; radio and television broadcasts and interviews with notable figures, including scholars, to publicise theses prepared in advance by Service A; and the inspiration of rallies, protests, and open calls on governments. Inauthentic compromising materials “unmasking” immorality and corruption, however, had to be believable.

Active measures were carried out through agents recruited among foreigners and Soviet citizens, as well as informally affiliated trusted persons. They also required technical means, altogether termed “active measures realization channels, […] the correct choice [of which] largely determines the overall success of the enterprise” (Vladimirov & Bondarenko, 1989: 88). Agents of influence were recruited among officials, politicians, public leaders, businesspeople, the military, cultural figures, and academics. Journalists and publishers in particular were singled out as effective in influencing heads of state; others included religious figures, relatives, and children of foreign leaders who had studied in the USSR. Embassies, including foreign embassies in the USSR, constituted “the official channel” for podstava [fake leaks].

One particular direction was the disorganisation and discreditation of émigré circles and organisations, particularly of the Independence Day celebrations of Ukraine, Belarus, and Lithuania. A curious example is the publication of a laudatory obituary for an old émigré, intended to create the impression that he had been a KGB agent. Discreditation simultaneously had to support the positive promotion of the Soviet Union. The methods comprised infiltration to create division and open discreditation through critique in the associated media, especially through accusations of ties to Western intelligence and of open corruption. Fake documents had to be “legalised” (i.e., laundered) in the West beforehand. Exposures of corruption and embezzlement were combined with operative and information work, for example, by arranging a high-profile defection combined with document leaks and disinformation.

An exemplary case is the discreditation of Arvo Horm and Johannes Mihkelson, leading figures of the Estonian National Council in Sweden. The KGB published a brochure detailing Horm’s alleged collaboration with American intelligence, based on allegations by a Soviet Estonian whom Horm was said to have attempted to recruit. A similar brochure was published on Mihkelson. From the brochures, the allegations spread to Estonian newspapers and television. An article about Horm, “Dead Souls from CIA”, was published in Komsomolskaya Pravda and republished by Finnish and Swedish newspapers (Fabrichnikov & Ovchinnikov, 1968: 80–81). However, the effect was probably limited, as Horm and Mihkelson remained prominent figures in leading positions of the diaspora.

Special techniques to maximise effect

The KGB manuals recommend a range of particular techniques to maximise the effects of propaganda. One method suggests planting information with members of foreign delegations so that the prepared theses are communicated to the target separately, and in an increasingly nuanced way, through a variety of individuals, both formally connected with the Soviet state and not, individually and in groups. This enables planting the thesis in a conversation between seemingly independent agents so that it appears to be the target’s “own” observation. We propose calling this technique “planted inference”. The media materials are used only as support for these theses, as a warm-up and a follow-up. The key effect was to lead the target to the conclusions that the KGB wanted as if through their own thought process. When planting directed information in this way, the agents must foresee questions about its source and contextualise it in local media publications, open-source information, and so on, to increase the semblance of their expertise and authority (Vladimirov & Bondarenko, 1989: 95).

Likewise, techniques of disinformation included, besides mass media publications, staged “confidential” conversations held in front of discovered enemy wiretapping devices or, similarly, the deliberate “loss” of fake documents (exposing them to theft or leakage). Both increased the believability of the disinformation.

The plausibility of disinformation and directed information was enhanced by creating multiple sources (compare with Operation Infektion above), “legalising” a forgery or false information. We propose calling this technique “source laundering” (compare with the less specific “information laundering”; Carrasco Rodríguez, 2020).

The authors were also concerned with counterpropaganda, which was mainly aimed at exposing the plans and agents of foreign intelligence services. The exposing documents could be leaked as an abridged abstract, as the full text of the document, as a published photocopy, or as a full article.

Audience reaction

The aspect of audience reaction is discussed only in passing in the material. The main effort was to create a “smoking gun” effect and to make the opposite side justify itself, that is, to seize the strategic initiative. “A necessary work element in active measures is to determine the reaction to the performed operation” (Vladimirov & Bondarenko, 1989: 95), and here, the use of “operative-technical measures” (wiretapping, searches, etc.) and personal conversations was recommended.

Based on these, one can provisionally formulate the Soviet model of propaganda as media-specific active measures, as illustrated in Figure 1.

FIGURE 1

The Soviet model of propaganda

Secondary Infektion: The neo-Soviet model of propaganda in use?

Secondary Infektion (SI) is a provisional name given to a centralised propaganda campaign discovered by security teams at Facebook, and later Twitter, and studied from a practical perspective by the OSINT groups DFRLab and Graphika. On 6 May 2019, Facebook announced that it had shut down 16 accounts, four pages, and one Instagram profile as elements of a “small network originating in Russia” (Nimmo et al., 2019). The name hints at the original Operation Infektion due to a similar mode of operation (the difference being that the original Soviet operation spread only one story, while the Secondary Infektion campaign spread many; see Nimmo et al., 2019). Still, its original name (if there is one) is unknown, as is the identity of the actor or actors behind it. However, evidence such as the digital forensic trace, the content (repeating narratives by Russian officials and state media), and the characteristic language “accent” points to its origins in Russia. The earliest campaign materials, reacting to Euromaidan and the Russian opposition, can be traced to early 2014. SI remained active until at least 2020, with publishing intensity falling around 2019 (when the operation was discovered). However, a Telegram channel associated with SI, Krit SBU (“SSU mole”, or “Security Service of Ukraine mole”), is still active and posting even today. Altogether, SI includes at least 2,642 items (articles, videos, and blog posts) published by the associated network of mostly single-use burner accounts connected to the campaign (Graphika, 2020). The key targeted countries appear to be Germany, the US, the UK, Francophone and Spanish-speaking countries, Sweden, Ukraine, and Russia (or Russian-speaking populations worldwide). Table 1 shows the materials dealing with the Nordic countries.

TABLE 1

Associated items in the SI Nordic subcampaign

Story | No. of items | Date of publication | Language | Countries involved | Link to original item
Flight SK 1755: who needs false sensation? | 4 | 11 Feb. 2015 | Swedish, German, English | Denmark, Sweden | indymedia.org.uk/en/2015/02/519460.html
Sweden is demanded that expiation of collaboration with the Third Reich should b [sic] | 3 | 24 Feb. 2015 | English | Sweden | indymedia.org.uk/en/regions/world/2015/02/519609.html
Europe’s share of the Arctic Pie | 8 | 19 May 2015 | English, Russian | Sweden | indymedia.org.uk/en/2015/05/520617.html
Valkampen börjar i Ukraina [The election campaign begins in Ukraine] | 1 | 19 Aug. 2015 | Swedish | Sweden | pressbladet.se/articles/view/valkampen-borjar-i-ukraina
Schweden will nicht über Verbrechen der Söldner in der Ukraine erzählen [Sweden does not want to tell about crimes committed by mercenaries in Ukraine] | 4 | 4 Sep. 2015 | German, English, Russian | Sweden | de.indymedia.org/node/5699
#European Integration without the #EU. Nordic dream of Swedish nationalists | 4 | 15 Oct. 2015 | English | Denmark, Finland, Iceland, Norway, Sweden | beforeitsnews.com/eu/2015/10/uropean-integration-without-the-eu-nordic-dream-of-swedish-nationalists-2592024.html
Ärftlig sjukdom hos Kaczynski [Hereditary disease of Kaczynski] | 1 | 26 Feb. 2016 | Swedish | Sweden | svenskpress.se/papers/pressbladet/articles/view/arftlig-sjukdom-hos-kaczy-ski
Carl Bildt: hovering between prison and the Ukrainian Premiership | 15 | 11–14 Mar. 2016 | Swedish, English, Russian | Sweden | beforeitsnews.com/politics/2016/03/carl-bildt-hovering-between-prison-and-the-ukrainian-premiership-2785737.html
Sweden could have prevented terrorist attacks in Brussels! | 16 | 23–28 Mar. 2016 | Swedish, English, German, Russian | Sweden | beforeitsnews.com/politics/2016/03/weden-could-have-prevented-terrorist-attacks-in-brussels-2790278.html
Ukraina struntar i Nederländernas opinion [Ukraine ignores Dutch opinion] | 1 | 1 Apr. 2016 | Swedish | Sweden | svenskpress.se/papers/pressbladet/articles/view/ukraina-struntar-i-nederlandernas-opinion
Arctic: The Ecological Mercenaries | 11 | 6 Apr. 2016 | English, Russian | Denmark | homment.com/mstbfRAebq
Terrorist threats at Eurovision 2016 | 8 | 27 Apr.–3 May 2016 | English, German | Sweden | beforeitsnews.com/terrorism/2016/04/terrorist-threats-at-eurovision-2016-2458464.html
Eurovision 2016 offended Ukraine | 2 | 23 May 2016 | English | Sweden | indymedia.org.uk/en/2016/05/525191.html
Silence commanded. People in Sweden are incensed by London’s pressure on scientists | 10 | 4 Jun. 2018 | English, Spanish, Ukrainian | Sweden | defendingthetruth.com/t/silence-commanded-people-in-sweden-are-incensed-by-londons-pressu-re-on-scientists.66513/
Marine Le Pen and Russian hackers improve The Sweden Democrats approval rating | 15 | 21 Aug. 2018 | English, Ukrainian, French, Swedish | Sweden | thestudentroom.co.uk/showth-read.php?t=5536072
The Sweden Democrats is suspected of DDoS attacks against opponent websites | 11 | 4 Sep. 2018 | English, Spanish | Sweden | homment.com/3ymztMtVBO8svO2KVvxs#!
Ukraine ‘drowns’ The Sweden Democrats? | 22 | 30 Aug.–5 Sep. 2018 | English, Spanish, French, Russian | Sweden | playbuzz.com/item/a3546aad-4fec-4d19-ad6e-9a837f8b0b4b
En Suecia proponen crear su propia “Comisión de Mueller” [In Sweden they propose to create their own “Mueller Commission”] | 16 | 1–16 Nov. 2018 | Spanish, French, Ukrainian, Russian | Sweden | globedia.com/suecia-proponen-crear-propia-comision-mueller
Greenland. How Much Does a Deal with the Devil Cost? | 2 | 5 Nov. 2018 | English | Denmark | indybay.org/newsitems/2019/11/05/18827857.php
Ukraina tillverkar en bomb för Europa [Ukraine is making a bomb for Europe] | 1 | 3 Dec. 2018 | Swedish | Sweden | facebook.com/Motpol/posts/2186943094649097
Ideology and purpose

In the sampled items, there are traces of Marxist–Leninist Soviet ideology, for example, in the occasional accusations of NATO and American imperialism or references to social inequality in the West. Still, these are rather marginal compared with the realist zero-sum game perspective, which implicitly seeks to weaken the target countries through divisions and infighting. The texts very often seek to undermine sympathy and support for Ukraine, portraying it as extremely corrupt, criminal, a threat owing to its negligence of nuclear security, and meddling in the normal state of affairs in Europe (by politicising Eurovision or even interfering in elections). They also depict situations that can sow mistrust between Sweden and Poland (a false story on Polish demands that Sweden atone for its WWII neutrality), the UK (a false story on London pressuring the Swedish government to conceal that the Novichok poison was made in Sweden), and the US (false stories on the near-collision of a SAS flight with an unidentified plane, alleged to be from NATO). Similarly, they seek to discredit the Swedish state (such as falsely claiming that Sweden could have prevented the terror attacks in Brussels because a known suspect lived in Stockholm) or fan discord within Sweden (e.g., made-up stories of integration failure, crimes by migrants, and election fraud).

The local context

The materials are somewhat attentive to the local context, as advised by the KGB manuals. Items narrating five of the 20 core stories contain links to local media (e.g., Dagens Nyheter) and repurpose actual journalism, recontextualising it within a disinformation narrative. Two of the stories display familiarity with core local institutions (e.g., the Tax Agency, Skatteverket, and its folkbokföring, or civil registration) and values (e.g., Swedish neutrality). Seven stories display a profound familiarity with the national political system; for example, the story alleging that Carl Bildt was going to be the next prime minister of Ukraine went into minute details of his biography, such as his activity during the formation of the conservative government in the 1970s and his publicly known contacts with the CIA around that time. Moreover, the situation and business activities of Bildt’s wife, Anna Maria Corazza Bildt, were correctly explained. Likewise, the story insinuating that Denmark was ready to sell Greenland to Trump demonstrated good knowledge of the local authorities in Greenland, in addition to presenting a doctored document presumably leaked from one of them. These examples indicate that media monitoring still plays a very strong role, but also that insider information, such as tip-offs from experts and operatives, and possibly embassy staff, is used. This is in perfect sync with the Soviet-era KGB instructions.

The identification of the propagandist

The identity of the people behind SI remains undetermined, but the campaign is likely connected to either a state institution, such as an intelligence agency, or a closely related (quasi-)private entity working on an outsourced basis, such as Yevgeni Prigozhyn’s Internet Research Agency or, later, the Patriot media holding. It is worth noting that one of the Russian websites most frequently used in the 2014–2020 campaign now (in 2023) openly advocates for Prigozhyn as a political figure, raising the question of whether Prigozhyn’s assets played a role. The overall Russian attribution is based primarily on Facebook’s own digital forensics, and, while it is impossible to verify without access to the backend data, there is little reason to doubt this conclusion, especially in light of contextual evidence, such as the coincidence with Russian strategic narratives and typical linguistic mistakes (struggles with articles, a lack of idiomatic writing, direct translations of Russian idioms, etc.). Mistakes abound in all the languages used, including French, Spanish, Swedish, and Ukrainian, but not in Russian.

The structure of the campaign

SI involved the use of the closed channel (given the secrecy, the illegality, and the use of fake personas to seed disinformation), which strengthens its identification with a Russian intelligence agency. Domestic and foreign operatives were likely involved, but the materials were published using “alternative media”, blogs, and citizen journalism. Typical platforms used to plant disinformation included the citizen-journalism websites Medium (www.medium.com) and Nyhetsverket (www.nyhetsverket.se), the alternative media platform Indymedia UK (www.indymedia.org.uk), and the right-wing alternative platform Before It’s News (www.beforeitsnews.com). At the same time, the operatives used the opportunities of commercial self-publication platforms, such as Pressbladet (www.svenskpress.se), where anyone can publish for a fee. Another interesting alternative was Homment (www.homment.com), a free German tool that allows the creation of anonymous web pages. These original “seeds” were then cross-referenced by several personal blogs based on WordPress (such as the German-language Politgraben) and LiveJournal (e.g., the Russian bloger_nasralny), creating a hall of mirrors of sources, and then shared by social media accounts on Facebook, Twitter, and Instagram. Many items were published on Reddit and even as questions-and-answers on Quora; others were posted to general-interest web forums. This clearly deviates from the KGB manuals and likely represents an attempt to adapt to a hybrid media system. At the same time, we are dealing with partial material. These acts were likely complemented by simultaneous operations through the open channel, such as publicly unsuspected agents of influence who could seed similar narratives in the independent press, and through legal Russian organisations such as the media outlets Sputnik and Russia Today.

We have noted similarities in narrative between five out of twenty unique SI stories and reports by Russia Today published at the same time. Especially persistent is the interest in Arctic security and energy resources, even though direct parallels are hard to find there. At the same time, there are clear parallels between SI and Russia Today in Carl Bildt’s speculated Ukrainian prime minister job, the false “Swedish lead” in the Skripal poisonings, and the story on the petition to strip Ukraine of its victory in the 2016 Eurovision Song Contest (the latter seems to originate in the same source). Additionally, Kragh and Åsberg (2017) analysed the presence of forgeries in the now-defunct Swedish edition of the Russian Sputnik news agency. Compared with the list of 26 forgeries published by the website in 2015–2016 that they identified, 9 of the 17 forgeries in the SI Nordic sample were also reported by Sputnik. Therefore, we can be certain that the SI operations via the closed channel were complemented by open-channel operations (especially via Sputnik), in accordance with the KGB advice. The overlap is summarised in Table 2.

TABLE 2

The overlap between SI, Sputnik, and Russia Today

SI forgeries (date published) | Sputnik forgeries* (date published) | Russia Today publications (date published)
– | Party of Swedes interview with Right Sector leader Dmytro Yarosh (17 Dec. 2014) | –
Flight SK 1755: who needs false sensation? (11 Feb. 2015) | Near SAS plane collision in airspace was NATO’s fault, not Russia’s (13 Feb. 2015) | –
– | Sweden’s secret military aid to Ukraine (21 Feb. 2015) | –
Sweden is demanded that expiation of collaboration with the Third Reich should b [sic] (24 Feb. 2015) | Poland demands explanation regarding Swedish–German cooperation in World War II (25 Feb. 2015) | –
Europe’s share of the Arctic Pie (19 May 2016) | – | –
– | Ukraine seeks nuclear weapons (18 Jul. 2015) | –
– | Pro-EU party in Ukraine is extremist (24 Jul. 2015) | –
Valkampen börjar i Ukraina [The election campaign begins in Ukraine] (19 Aug. 2015) | Swedish PR-firm Kreab supports Poroshenko (19 Aug. 2015) | –
Schweden will nicht über Verbrechen der Söldner in der Ukraine erzählen [Sweden does not want to tell about crimes committed by mercenaries in Ukraine] (4 Sep. 2015) | Swedish cover-up of Ukrainian war crimes (2 Sep. 2015) | –
– | Sionist Kyiv seeks European far right’s support (7 Oct. 2015) | –
– | New Maidan in Moldova (21 Oct. 2015) | ‘Right wing practicing violence in Ukraine no mystery to anyone’ (1 Sep. 2015)
– | OSCE hides Ukrainian corruption problem (25 Oct. 2015) | –
#European Integration without the #EU. Nordic dream of Swedish nationalists (15 Oct. 2015) | Swedish nationalists seek EU alternative (15 Oct. 2015) | –
– | Terrorism needs to be defeated (17 Nov. 2015) | –
– | Germany supports Kurds against Turkey (25 Dec. 2015) | Thousands in Germany protest Turkish crackdown on Kurds [Video] (27 Dec. 2015)
– | Money is more important than security (30 Dec. 2015) | –
– | Indonesia supports the Islamic State (15 Jan. 2016) | –
– | Migrants and the end of Europe (26 Jan. 2016) | –
– | UK ferments unrest in Middle East (29 Jan. 2016) | –
Ärftlig sjukdom hos Kaczynski [Hereditary disease of Kaczynski] (26 Feb. 2016) | Kaszynski is mentally ill (24 Feb. 2016) | –
– | Sweden supports the Islamic State (3 Mar. 2016) | –
Carl Bildt: hovering between prison and the Ukrainian Premiership (11–14 Mar. 2016) | – | Carl Bildt’s ominous ‘advice’ on Ukraine (8 Feb. 2016)
Sweden could have prevented terrorist attacks in Brussels! (23–28 Mar. 2016) | Sweden did not prevent terror attack in Brussels (29 Mar. 2016) | Alleged Brussels & Paris attacks accomplice was integration poster boy in documentary (16 Apr. 2016)
Ukraina struntar i Nederländernas opinion [Ukraine ignores Dutch opinion] (1 Apr. 2016) | Ukraine disregards the Dutch EU referendum (1 Apr. 2016) | Ukraine’s EU bid: Dutch PM says ‘ratification can’t go ahead’ as Kiev says ‘nothing has changed’ (7 Apr. 2016)
– | Dutch referendum on Ukrainian EU association agreement (11 Apr. 2016) | Referendums not part of parliamentary democracy, Luxembourg FM says after Dutch vote on EU/Kiev deal (11 Apr. 2016)
Terrorist threats at Eurovision 2016 (27 Apr.–3 May 2016) | Swedish security threat at Eurovision 2016 (28 Apr. 2016) | –
Eurovision 2016 offended Ukraine (23 May 2016) | – | Euro-revision? Over 300,000 sign petition demanding recount for Eurovision 2016 Song Contest (17 May 2016)
– | NATO to undermine the UN (10 June 2016) | –
– | Savchenko next UN General Secretary (19 Jul. 2016) | –
Silence commanded. People in Sweden are incensed by London’s pressure on scientists (4 Jun. 2018) | – | Still multiple leads in Skripal poisoning case, says Scotland Yard (5 Jun. 2016)
Marine Le Pen and Russian hackers improve The Sweden Democrats approval rating (21 Aug. 2018) | – | –
The Sweden Democrats is suspected of DDoS attacks against opponent websites (4 Sep. 2018) | – | –
Ukraine ‘drowns’ The Sweden Democrats? (30 Aug.–5 Sep. 2018) | – | –
En Suecia proponen crear su propia “Comisión de Mueller” (1–16 Nov. 2018) | – | –
Greenland. How Much Does a Deal with the Devil Cost? (5 Nov. 2018) | – | –

* based on Kragh & Åsberg, 2017

Media utilisation and special techniques

The SI campaign heavily leaned towards disinformation. The materials were adapted for the online environment, yet the text form dominated, often following the model of a journalistic article. In two cases, the core story was written as a blog, and in one case, it was in video form. The method provided for planting the initial information, often with a forged document, via pseudonymous accounts always used only once, to be referenced by other publications (often in other languages) on the same or the next day, and then shared by the associated social media accounts in the span of one to three days. Due to the high operational secrecy (unique accounts used only once without a possibility to develop any following), the content almost never went viral and remained obscure. Very few stories went beyond the Russian networks, but at least one was seeded with the far right in Germany and continued to be referred to on social media (Nimmo et al., 2019).

Out of the 20 core stories in the Nordic sample, 13 featured a forged document, often an internal government memo or a formal letter on letterhead, for example, an epistle from the then President of the European Commission, Jean-Claude Juncker, to the Swedish minister for foreign affairs at the time, Margot Wallström, strong-arming Sweden into supporting the inclusion of Brussels in the Arctic Council. The argument structure was typically built around a “revelation”, exposing and unmasking a certain conspiracy by powerful and egocentric actors, often the US, NATO, and the European Union. Fourteen unique stories contained this “exposing” tool (identified as a key argument by Barghoorn, 1964). Some of these “revelations” concern the alleged enemy’s (the US’s or NATO’s) operative network, disinformation, or propaganda operations and can count as “counterpropaganda”. Even blog posts could be forged, such as a 2018 fake blog post appearing to have been posted by Carl Bildt, which called for setting up an equivalent of the Mueller commission in Sweden to investigate Russian election meddling. The fake text was photoshopped onto a screenshot of a real blog post that Bildt had published about Syria, although the makers negligently forgot to change the original headline (see Mogano, 2018).

The most interesting case concerns Ukraine’s alleged meddling in the 2018 Swedish general election. It started unusually, when a series of 13 articles was published on 21 August 2018 by English-language, French, and Ukrainian SI assets. They all alleged that the Sweden Democrats political party was obtaining help from Russian hackers and Marine Le Pen to win the forthcoming election through fraud. The articles were very critical of Russia, Le Pen, and the Sweden Democrats. They were followed by eleven articles in English and Spanish telling the core story of the Sweden Democrats launching major DDoS attacks against rival parties. The articles stated that the Swedish Civil Contingencies Agency (MSB) was investigating the party figure Linus Bylund; again, these materials were hypercritical of Russia and the Sweden Democrats. However, on 30 August, a new series of 22 articles was launched, presenting a staged “leak” of a forged memo from the office of Ukrainian Prime Minister Volodymyr Groysman. The “memo” detailed a plan to help “our allies”, the Swedish Social Democratic government, by launching an online disinformation campaign against the Sweden Democrats and Russia. Thus, the main purpose of the first two batches of Russia-critical articles was to create false “evidence” of Ukraine’s disinformation meddling in the Swedish election, while also contributing to general suspicion and confrontation in society and mistrust of Ukraine, and possibly whitewashing the Russian meddling scandal in the US by “normalising” it or deflecting from it (“but Ukraine is meddling in Sweden!”).

We have considered the SI campaign, finding major continuities with the Soviet model but also certain differences. How should these continuities and discontinuities be understood?

Discussion: From the art of propaganda to the mass market?

The Soviet model of propaganda was not static: It developed from its roots in the tsarist repressive organs and the Marxist–Leninist theory of the press to the modus operandi of “active measures” by the 1960s, taking a highly detailed and sophisticated form by the late 1980s. It was media-conscious and concerned with choosing the right channel; its main focus was achieving impact by leveraging both directed information and disinformation via different channels, particularly defined as open (legal) and closed (illegal). The information work was implicitly divided into passive (obtaining information) and active (applying information), hence “active measures”, while the emphasis rested on the seamless blending of propaganda and operative work. For the KGB, propaganda was not thought of separately from intelligence and active measures: Personal communication, press conferences, broadcasts, articles, books, and brochures were all used simultaneously with academic exchanges and cultural events to promote the Soviet image while discrediting and dividing the enemies.

The Soviet model of propaganda as an instrument in the active measures toolkit continues in post-Soviet Russian propaganda as practised in the 2010s and until now. Even on a personal level, there are continuities, with old Soviet professionals spearheading Russian influence campaigns in the 2010s. Of all the core elements we identified, most remain little changed in practice, especially when it comes to adapting materials to the local context (through media monitoring, insider observations, and tip-offs) and delivering disinformation and counterpropaganda, such as exposing the alleged enemy’s networks, through a complex closed channel. Special techniques include staged leaks, forgeries, planted inference, and source laundering. Setting up fake personas (impersonators or organisations) is still very relevant. All these major principles were developed by the KGB and only slightly adapted to new realities; for example, planted inference can be organised online by seeding different pieces of the narrative separately, allowing the audience to “make their own conclusion” by assembling the bits of the puzzle. Most stories are constructed so as to create tensions between Western countries; fuel racial, religious, and ethnic hatred; and discredit Ukraine, which the KGB toolkit also emphasised. Composite stories blending forgery with some facts (often quite a lot of them) are equally typical of both Soviet and contemporary Russian propaganda, enhancing their plausibility.

At the same time, some aspects did not adapt so well, making the assumption that today’s Russian propaganda is “simply a continuation” of Soviet policies less tenable (Fedchenko, 2016: 156). Marxism–Leninism has only a very niche appeal today, although traces of certain aspects may remain. At the same time, the pessimistic perspective on international relations, “combined with an extremely self-centered world outlook, which views any accretion of strength to a rival power as a grave threat to Soviet security” (Barghoorn, 1964: 84), deeply characterises the perspective of the Kremlin regime and its propagandists, just as it did in the Soviet era. A sense of moral superiority is derived from the perceived defensiveness of Russian disinformation, whereas in the Soviet period, it was justified as deceiving the enemies of the working class (Fedchenko, 2016). This defensiveness, however, was already present in the KGB manuals.

A striking dissimilarity seems to be the target groups. While Soviet propaganda prioritised decision-makers and high-profile experts, researchers, and journalists, the SI material seems to target adherents of fringe ideologies and conspiracy theorists. However, since the part of the SI campaign we analysed is certainly only one component of a bigger operation, we may be limited by the character of the material here. This targeting is in line with what the contemporary Russian propagandist Darya Dugina, who was killed in a car bombing in August 2022, wrote on “mental mapping” as a new model of “non-classical” “network warfare”, whereby “mind control” becomes the decisive factor. Such control is possible only through apprehending the audience’s reception: “therefore different population groups are translated different information flows that should produce a synergetic output”; this model is applied equally to one’s own society, the enemy’s society, and neutral societies (Dugina, 2022: n.p.).

Further differences have previously been noted in the generous budgets and the abuse of the relative openness of Western media, unlike in the Cold War era (see Van Herpen, 2016: 3). Organisationally, the SI campaign yielded limited evidence of the use of directed information or influence agents. However, this may be an artefact of the available material, in which personal communication or networking is not recorded. The Graphika database, while extensive, may be incomplete, and the fact that books, brochures, and press conferences (all recommended by the KGB) were out of this specific campaign’s scope does not mean that they remained unused. On the contrary, one must remember that the KGB instructed its personnel in the integrated use of different channels and media, united by a single objective in each active measures campaign. It is plausible that the SI campaign, like an iceberg, has an invisible part extending well beyond the limits of this article.

There is, however, strong evidence that the SI campaign aimed at the Nordic countries was amplified through the open channel. One of our significant findings is that most of the SI forgeries (9 out of 13) were cited by the Swedish edition of Sputnik (Kragh & Åsberg, 2017), and there are also parallels with the material on Russia Today. Owing to this “legalisation”, some forgeries even found their way to the mainstream Swedish media, for instance, when allegations of Carl Bildt becoming the prime minister of Ukraine were cited in Dagens Nyheter (Holender & Carlsson, 2016). Thus, we can link the SI campaign with open Russian operations under its own flag, and we can consider Sputnik, Russia Today, and the SI forgeries and disinformation as parts of a single, centralised, and coordinated operation.

Contextually, other forms of open channel work are inherited from the USSR. For example, the Soviet Committee on Cultural Relations with Compatriots Abroad continues in the form of Rossotrudnichestvo, supporting such organisations as the Union of Russian Associations in Sweden [Ryska riksförbundet i Sverige]. Local “trusted persons” are identified and recruited via Russian institutions and cultural initiatives abroad and at home, such as the Valdai Discussion Club. Soft power, redefined as “hard power in a velvet glove” (Van Herpen, 2016: 10), is practised through Western public relations firms, or even the purchase of Western media such as the Independent and France Soir, spearheaded by political corruption and lucrative energy deals.

There remains the question of audience feedback analysis, which we could not study directly here and which is also related to the issue of the SI campaign’s effectiveness and overall purpose. The discoverers of the SI campaign noticed its lack of virality and overall impact despite its industrial scale: It operated regularly, consistently, and systematically over several years in several languages. Admittedly, Soviet propaganda was also often ineffective (see the case of Horm and Mihkelson above). The working standards may have deteriorated (evident in the sloppy stylistic quality and mistakes in the texts), hinting at imitation rather than continuation of the Soviet model. However, it is equally possible that impact on mainstream media or public opinion was not the main purpose; as earlier researchers noted, the campaign may instead have been used to identify and recruit new agents.

Whatever its actual purpose, the areas of uncertainty outlined here remain key directions for future research. As of now, there can be several explanations for the campaign’s lack of reach:

it was a poorly executed attempt at viral influence;

the project itself was a sham by an organisation like Prigozhyn’s media holding, aimed at using up the “active measures” budget;

its purpose was to create the semblance of a disinformation campaign to frighten Western policy-makers;

rather than aiming for “big impact”, its objective was “small impact” – influencing and revealing a small number of individuals who could be recruited as potential operatives;

or it was used to communicate with already recruited agents conveying talking points or even coded messages (as the KGB did in the press).

One of the sharper contrasts between the KGB recommendations and contemporary Russian practice lies in the meticulous work that the Soviet generals and colonels insisted must go into active measures versus the mass, almost “spamming” approach of the SI practitioners. Whereas successful propaganda, as the KGB saw it, was an individual work of art by operatives with profound expertise and a sophisticated and creative approach, the SI mill churning out new forgeries in a dozen languages marks a transition to a mass market. Its poor style and linguistic mistakes (e.g., leaving mismatching headlines when photoshopping new text) can therefore be explained by the fragmented environment. In a fragmented and hybrid media system, it makes much less sense to invest great effort in a small number of high-quality artefacts. Unlike in the mass media society, where the strong gatekeeping and agenda-setting functions of a few dominant news organisations guaranteed exposure for a well-made forgery, today any such product is more likely to be buried under petabytes of constant updates, no matter its quality. For the propagandist, this is not without advantages: There is less need than before to recruit influencers among experts and journalists to gain access. Social media, alternative media, and citizen journalism have lowered the entry cost – but also decreased the expected effectiveness. Fragmentation therefore motivates the propagandist to invest in a greater number of cruder interventions that, as a critical mass, can have a chance at impact. The poorly predictable laws of virality provide an incentive to put out a lot of content, hoping that perhaps one story will take off. The propagandist must limit their aspirations, being content with influencing small groups rather than societies at large, hoping that these groups will shape divisions, emotions, and behaviours in the mainstream.

Finally, what can we say about propaganda as a concept and its transformation from the 1960s to the 2010s? The core elements of most definitions certainly remain in the Soviet–Russian case: centralised organisation, strategic character, and the quest for impact. We would like to draw attention to an interesting formulation used in the KGB manuals: prodvizheniye informatsii (literally, “advancing the information”, used with the dative komu, “to whom”). It can be translated as “propagating information to [someone]”, “promoting it with [someone]”, or, more idiomatically, “feeding the [directed or dis]information to the enemy” (compare with Dugina’s “mental mapping” and “mind control”). Such concepts suggest a practice in which certain information, such as a fact, allegation, interpretation, or narrative, is continuously “fed” to the target until it is fully advanced into the target’s perception and accepted, possibly as the target’s own (compare with “planted inference”). Further conceptual nuances concern specific features of the Russian style of propaganda, such as its covertness, its negativity and destructiveness (aiming to divide, discredit, and weaken), and its artificiality. Intertwined with operative work, both Soviet Cold War and contemporary Russian propaganda have been conceived and designed as inorganic, inauthentic interventions. This element of artificial intervention remains a largely overlooked component of propaganda.
