Propaganda and the Web 3.0: Truth and ideology in the digital age

Introduction

In this contribution, I set out to refine our understanding of contemporary propaganda as a concept and an empirical phenomenon by combining existing knowledge with additional insights and identifying significant and often overlooked aspects. While propaganda is “taken quite seriously by governments, the military and foreign service apparatus […], it has occupied a tenuous conceptual place” in academia (Anderson, 2021: 2). The aim is to bring together building blocks of a conceptual model that can contribute to the study of propaganda. Propaganda is far from a new phenomenon, yet its return to prominence comes at a time of artificial intelligence (AI)-based advances in the digital communication age, characterised by emergent Web 3.0 technologies (Hendler, 2009). It also coincides with a wave of polarised political tension and an institutional crisis of distrust and profound scepticism towards official sources of information and objective fact-finding (Cassam, 2019; Flew, 2021). Reflecting on the Spanish Civil War, Orwell (1968: 252) observed that “atrocities are believed in or disbelieved in solely on grounds of political predilection. Everyone believes in the atrocities of the enemy and disbelieves in those of his own side, without ever bothering to examine the evidence”. This thought continues to apply to violent conflict today; however, it now also has bearing on debates regarding medical science, climate change, economics, gender, abortion rights, AI, and so on, leaving these issues to be adjudicated by ideological predilection rather than thorough examination of evidence, argument, or reflection.

These conditions provide prime opportunities for North’s (1981) “ideological entrepreneurs”, a market analogy describing an actor who is both an innovator and a pusher of ideological views. These individuals “play a key role in ideological change and thrive in times of upheaval and alienation”, which they exploit to propagate their ideological positions (Hyzen & Van den Bulck, 2021a: 182), and “work to capture the ideological marketplace, competing fiercely against other ideological entrepreneurs as well as against the weight of existing public opinion” (Storr, 2011: 107–108). This ethos of epistemological upheaval permeates the so-called post-truth, alternative-facts paradigm (McIntyre, 2018), in which, to reformulate Orwell, “everyone” believes their respective antagonists are merely engaging in deception, fake news, and mis- and disinformation in the form of manipulative propaganda.

In contrast to definitional-conceptual efforts around “post-truth”, “misinformation”, “disinformation”, “malinformation”, and “conspiracy theory” (Wardle, 2018), and around AI and algorithmic and computational targeting (Zuboff, 2019), recent scholarly work has focused less on propaganda as a concept. As a result, little attention has been given to how “true” information contributes to propaganda and to how developments in digitalisation create propaganda opportunities, for significant decentralised non-state actors as well as for central institutionalised forces, enabling both new forms of audience participation in the creation and dissemination of propaganda and quick repurposing of content for counter-propaganda. Once introduced into the media landscape to spread ideas, propaganda can have a dynamic existence beyond a limited or original target.

To capture these aspects, I start from existing conceptualisations of propaganda, including its formative academic roots (Bernays, 1928/2005; Ellul, 1965; Lasswell, 1927; Lippmann, 1957) to build my definition of propaganda as “a sustained campaign of communication to enforce ideological goals, manage opinion and codify loyalties of target groups, whether specific social sets or broad populations” (Hyzen, 2021: 3482). While propaganda is inherently manipulative, its multidimensional process allows for different sites to realise ideological goals through intentional distortions. The digital age has amplified this capability, expanding the propagandists’ toolkit to target and recruit individuals. I elaborate on the relationship of propaganda to ideology, the diverse nature of the propagandist, and the paradox of causality in understanding propaganda.

Next, I examine several constitutive aspects of propaganda, representing fields of enquiry and explanation that are grouped into three sections. The first focuses on the relationship between propaganda and facts, “true” and “false” information and truth, crucially arguing for the importance of a conceptualisation of propaganda that includes true information. Inspired by Billig (1995), I also suggest a notion of “hot” and “banal” propaganda to address near-term and long-term ideological goals.

In the second section, I elaborate on distribution, dissemination, and repetition; the latter, I argue, is fundamental to propaganda and stereotyping. I maintain that repetition is not just a rhetorical device but related to means of distribution and dissemination. I situate this historically in media and communication evolutions before examining contemporary developments in computational propaganda and the digital landscape.

The third section addresses audiences’ relationship to propaganda. I refute old but lingering views of propaganda audiences as passive and gullible “targets” while accepting the limits of actively “decoding” audiences. Following Livingstone (2015), I argue for the cyclic nature of consumption, production, and dissemination and, like Bolin (2012), I distinguish two types of audience activity: interpretation and participation. I elaborate on how an audience can be both an actively decoding recipient and a participant in the propaganda process as consumer and propagator in the contemporary media ecology. Moreover, I show how propaganda reaches both intended and unintended audiences, with both intended and unintended consequences.

Lastly, I introduce a visual model to identify and clarify the constitutive parts of propaganda. I conclude by reiterating key points and discussing shortcomings. Throughout, I illustrate my arguments with contemporary examples of propaganda relating to the Russian invasion of Ukraine, the “Stop the Steal”-related January 6 riots and aftermath in the US, conspiracy theories including those propagated by QAnon and US-based alt-right media personality Alex Jones, as well as religion-based movements like Islamic State.

Propaganda defined

Definitions of propaganda tend to rely on persuasion and manipulation as the fundamental properties (Wanless & Berk, 2017; Jowett & O’Donnell, 2019). Bakir and colleagues (2019) place propaganda inside the larger framework of “Organized Persuasive Communication”, which my proposed definition could work within. While persuasion and manipulation are part of propaganda, they are also wider concepts, making them less useful to narrow down propaganda’s specifics.

Stanley (2015: xiii, 5) considers “propaganda as the employment of a political ideal against itself” which “rests on a theory of flawed ideology”. I argue against the idea that propaganda is always about turning a political ideal against itself and about flawed ideology, but I agree that ideology is a primary constituent of propaganda. Likewise, Jowett and O’Donnell (2019) identified ideology as the first step in propaganda analysis. They further stated that “the purpose of propaganda is to achieve acceptance of the propagandist’s ideology” and “successful propaganda campaigns tend to originate from [a] strong, centralized, decision making authority” (Jowett & O’Donnell, 2019: 269, 270). I argue that these points do not capture relevant developments, particularly regarding Web 3.0. First, propaganda is used cynically, not necessarily to instil the propagandist’s own ideology, but any that ensures the desired ideological effect. Second, like Benkler and colleagues’ (2018: 29) view of propaganda as “intentional communications […] targeted at a population, excluding interpersonal manipulation or very small-scale efforts [emphasis original]”, Jowett and O’Donnell’s definition fails to capture movements like QAnon, conspiracy theories, and ISIS, which lack “decision making authority” and grow from “small-scale efforts” into relevant organisations by spreading propaganda. The Web 3.0 and social media allow both large states or institutions and small movements to run propaganda campaigns at an “interpersonal” level, for example, through bots. Moreover, content can quickly be further propagated by audiences, as I argue below. While I agree that too small a communicative interaction hurts a conception of propaganda, I aim to build upon their work to encompass these contingencies, so that the concept captures non-state actors of various sizes utilising propaganda effectively alongside dominant ones.

Propaganda is about establishing or manipulating ideological viewpoints and belief formation to “organize a cartel of imagery and identity” (Price, 2002: 667), and a campaign of this nature must be flexible and recurring in complicated ways, not simply persuasion by rhetorical device. Revisiting the academic roots of propaganda studies (Bernays, 1928/2005; Ellul, 1965; Lasswell, 1927, 1935; Lippmann, 1957) allows us to sharpen and clarify its contemporary manifestations. Closely examining some of the classic problems and paradoxes can help flesh out ways to operationalise the contemporary study of propaganda, by combining Lasswell’s (1927: 627) notion that propaganda is the “manipulation of significant symbols” and Ellul’s (1965: 63) argument that successful propaganda represents “the penetration of an ideology” and that all propaganda “implies an attempt to spread an ideology”. Furthermore, Bernays (1928/2005: 20) argued that propaganda was the “organized effort to spread a particular belief or doctrine”.

In this light, I situate propaganda as a tangible expression of ideology: a mode of communication to spread ideas, achieve ideological goals, and exercise power (Hyzen, 2021). This definition also follows from the intuitive observation that removing ideological language, symbols, and icons from messages removes the “propaganda quality”, leaving behind more “neutral” information. Martin’s (2015) “simple” programme of ideology as values + beliefs = opinion conceives of propaganda as tactically employed to spread or usurp values and beliefs towards and about “X”. If such ideological (values + beliefs = opinions) positions are accepted or established, logical conclusions and behaviours are then expected towards or about “X”. So, if the 2020 American presidential election (X) is accepted as having been stolen by a deep state or Democratic conspiracy (ideological position), and propaganda (tactic) enforces that conclusion (beliefs), alongside the moral obligation (values) to resist a stolen or illegal election, then the January 6 events appear logical and predictable as a course of action – thus realising a goal (strategy) by way of the propagandist.

One could say propaganda campaigns are ideological redescriptions of events. The January 6 events can be propagandistically (re)described as a rally, protest, riot, or revolt. Through the sustained repetition of such ideological redescriptions, values + beliefs can be codified into an opinion favourable to the ideological goals of a larger strategy or complex doctrine. Furthermore, ideological goals are larger than one propaganda message. We must grapple with this to understand later-stage propaganda where views have previously been established, what Lippmann (1957) called stereotypes, and to understand how immediate or reflexive propaganda messages fit larger political-social goals and longer-term strategies. Audience participation, both interpreting and further spreading propaganda, becomes more relevant over the long term. Older ideological symbols can be revitalised or renew their significance in a contemporary context – deliberately or otherwise. This definition and conceptualisation of propaganda finds its value in explaining propaganda in relation to a wide range of ideological conflicts, contexts, or competitions, whether classic political rivalries, war between countries, flat-earth conspiracy theories, religious evangelism, identity politics, or others, which can be analysed primarily by evaluating the ideological goals, advanced or challenged.

At the most fundamental level, anyone consistently communicating mediated messages to spread ideas is a propagandist (see Figure 1: 1). The propagandist’s role is to establish ideological views that create conditions favourable to furthering or accepting actions, policy, or doctrine. This is often a functional role fulfilled by intellectuals and institutional structures; following Bernays (1928/2005), I argue that propaganda is what the public relations or communications department or ministry does. The title “Propagandist” (capital-”P”) is generally reserved for state or other large, institutionally backed public relations professionals; yet, the prominence and political importance of conspiracy theorists and terrorists remind us that, just like a typical cult leader versus an institutional religious cleric or priest, they are not different as ideological leaders, apart from our familiarity with them or their relative size. We could juxtapose the institutionalised Propagandists versus the decentralised, non-state propagandists, but these are not hard categories. Both follow the same principles of propaganda, as per my definition. Propaganda campaigns’ effectiveness undoubtedly correlates to the powerful interests that support them, but grassroots movements regularly find ways to spread effectively where passion and persistence substitute for resources. Indoctrination introduces ideologues (and vice versa): individuals more committed to a doctrine or their institutions than to the “truth of what happened”. Intellectuals of this sort fit both the Propagandist and propagandist moulds and present a consistent stream of propaganda to shoehorn reality into their doctrines. I examine this more closely in the following sections.

FIGURE 1

Model of propaganda

Comments: The figure shows the model built from the components discussed in this article, with the following corresponding numbers: Propagandist (1); Truth, misinformation, disinformation (2); Towards the facts or events, away from the facts or events (3); Contextual environment (4); Dissemination, distribution (5); Repetition (6); Audience reception or interpretation (7); Active audience redistribution (8); Intended and unintended audiences (9).

Propaganda usually creates a paradox about causation, and the manipulative and flexible nature of propaganda makes unmasking intentions confusing, especially in the context of analysis. Country A requires oil. Country B has oil. Country A’s policy is to obtain oil from Country B; however, there are political barriers. Is it the propagandist’s goal to take the oil from Country B? Following my definition, no. It is the propagandist’s goal to create ideological conditions such that it is politically or socially acceptable for Country A to acquire oil from Country B, whether that means through bilateral trade or military aggression. This can be achieved by propagandising Country A, Country B, or both – whatever the task requires. Applied to Russia’s oil sales after its invasion of Ukraine: To now purchase Russian oil is unacceptable for the US, problematic and declining for EU countries, and largely acceptable for India and China, as neither officially recognises the war (Xu & Verma, 2022). Propaganda does not necessarily shape the behaviour of the countries, governments, or institutions; typically, propaganda shapes public opinion and belief formation to permit or justify existing or imminent behaviours, policy, strategy, or tactics. Often, in propaganda studies, there is confusion between strategy, tactics, and causation; this conception of propaganda treats it as the tactic of enforcing ideological goals. Similarly, Herman and Chomsky’s (1988) five-filter model describes the tactics of censoring or removing undesirable information and ideological competition in the propaganda process, within Price’s (1994) “market for loyalties”, where power interests generate collective identity or indoctrination. Ultimately, this must be evaluated on a case-by-case basis to make necessary distinctions.

In what follows, I discuss in more detail the constitutive components of propaganda, treating it as a programme for analysis. It culminates in a visual model. The explanations of the model’s various components are indicated in the text with the numbers corresponding to their inclusion in Figure 1.

Towards the truth, away from the truth: Hot and banal ideological goals

Having established propaganda as a form of communication embedded in ideology, several key points need elaboration. For one, I argue that the notion that “propaganda brings the target towards the facts/events or away from the facts/events [emphasis original]” (see Figure 1: 3) (Hyzen, 2021: 3483, 3488), ultimately to serve broader ideological goals, is a crucial, though not singular, part of propaganda analysis. Ideological goals are larger than one propaganda message or one campaign: They are about establishing an entire viewpoint. Once established, propaganda can maintain and adjust opinion further according to current demands. Borrowing from Billig’s (1995) conceptualisation of “hot” and “banal” nationalism, I argue there is a slower-evolving, “banal” propaganda of symbols and ideologies, which can be called upon or revived to serve as “hot” propaganda at moments of political and social dispute. The long-term management of public opinion calls for both. Illustrating the concern of banal symbols becoming hot propaganda, Estonia recently moved Soviet-era monuments from public sites to museums and restricted public access to others in the wake of Russia’s invasion of Ukraine. Estonian Prime Minister Kallas stated that her government would not “afford Russia the opportunity to use the past to disturb the peace in Estonia. [Due to] the increasing tensions and confusion around memorials in Narva [Russian-speaking Estonian city], we must act quickly to ensure public order and internal security” (BBC News, 2022: para. 8–9). As the present situation “heats up”, previously irrelevant symbols become powerful, and so the tactic of limiting ideological competition is used to manoeuvre unwanted ideological symbols away.

Continuing from the Estonian example, one site included a historical World War II grave to be relocated with “a neutral grave marker” (BBC News, 2022). It was not a false or untrue symbol, nor mis- or disinformation; in fact, its truth rather than its falsity is what made it concerning. Propaganda is largely about enforcing and realising ideological goals, and to this end, there is no reason propaganda must be false information (see Figure 1: 2). Similarly, Wardle (2018: 45) noted that “propaganda is true or false information spread” to an audience, and Stanley (2015: 42) concurred: “a true claim, uttered with sincerity, can be propaganda”. In fact, outright lies, if or when exposed, are damaging to the efforts of governments, institutions, organisations, or individuals who deploy such propaganda. Importantly, while the Internet is often seen as an important space for mis- and disinformation (Wardle & Derakhshan, 2018), it has also allowed the development of sites such as Wikileaks for information dumps. Alongside national laws declassifying information, leaks have arguably made lying riskier (Hyzen, 2023).

“Truth” is explicitly and non-controversially postulated in mis- and disinformation. False information, conveyed mistakenly (misinformation) or deliberately (disinformation), implies the premise that true or accurate information could exist. Conspiracy theory also holds this property: Behind the “conspiracy to deceive”, there is a truth. There is some agreement on this point – for example, the notion of “Induced Misperceptions” (Benkler et al., 2018: 34) – yet it is often overlooked or taken for granted in propaganda analysis. The simple point that truth is relevant and that propaganda can bring us towards the truth or away from it accepts a value-neutral conception of propaganda, whereas others argue to retain “the negative connotation of propaganda” (Benkler et al., 2018: 28). Moreover, I argue that any notion of propaganda requires a conception of truth and a minimal commitment to the facts – that events did or did not happen. Nor is a deep epistemological theory about truth required to hold such a commitment in ordinary life. Though the term truth is a more complicated formation built upon facts, propaganda can rely on facts to distort or manipulate. It also follows that there is no greater friend to the propagandist than a relativist view of truth: A propaganda campaign of ideological redescription, repetition, and manoeuvring of relevant frames and symbols does not answer to what events in fact happened separate from our opinion towards them.

For example, Ukrainian officials claimed responsibility for sinking the Russian warship Moskva, while Russian authorities claim that a fire broke out, exploding munitions, and that ultimately the Moskva sunk while being towed to port (El-Bawab & Theodorou, 2022). By playing with the event’s causation, both accounts could be true and serve as propaganda: Ukraine did attack the ship, and a fire did break out which reached ammunition that did explode and caused Moskva to sink. Propagandists chose the most advantageous descriptions and omissions to frame the event, but did not outright lie. Moreover, propaganda about the Moskva is but a moment in a much larger ideological battle involving legitimacy, morale, and perceptions of who is winning a military conflict.

A third point that requires examination is when events do or do not accord with some doctrine, principle, or law. Propaganda is used for the ideological goal of making acts and events permissible, acceptable, and justified, despite breaking professed doctrines or even acknowledging that doctrines have been broken. Following Searle (1995), I characterise these events as in the realm of epistemically objective facts and claims of ontologically subjective domains (social constructions). These facts, though relative to our interpretation, are events in the world (speech acts), which our actions conform to or not. This involves laws being abided, principles being adhered to, or rules of the game being followed. These doctrine-related events are less complicated than the jargon implies. This is a powerful angle for propaganda: laws, theology, political doctrine, institutional procedure, conspiracy theories, and so on can easily be argued ad nauseam. It is more difficult to see how something like “monetary policy” is an event in the world because it exists bureaucratically or administratively as speech acts. Though more details of the Moskva may emerge, it presents a clearer case of an event compared with questions like the following: Is Russia at war with Ukraine without declaring war? Was the 2020 American election free and fair? Does a small group (cabal) of powerful (globalist) elites (illuminati) organise (rule) the world with enormous decision-making authority? Is this violence justified, considering some law or moral code, and why? These questions have true or accurate answers, yet they require a depth and breadth of knowledge that propaganda typically, and specifically, exploits. Propaganda is most effective where there is a lack of knowledge or moral intuitions are pushed to exhaustion. Hashing out details of moral behaviour, law, theology, political economy, social theory, and geo-politics creates openings for propagandists to project ideology and manage opinion because these disputes are subject to far more observer-relative interpretations than a sinking boat.

Propagating and Web 3.0: Distribution, dissemination, and repetition

A main goal of the propagandist is to spread the message to target audiences and to do so effectively. Therefore, distribution, dissemination, and repetition are inherent to a campaign’s success. The most elusive and arguably most important factor is repetition. Repetition (see Figure 1: 6) in propaganda has been studied in various forms, from a rhetorical technique within given (sets of) texts like speeches (Al-Ameedi & Khudhier, 2015) to symbolic ritualisation across media and cultural forms (e.g., Ellul, 1979; Monger, 2015), but mostly in the development and maintenance of stereotypes as ideological constructs. Lippmann (1957: 26) argued that reinforced stereotypes work within and to construct ideologies, a “way of realizing the world”, and that propaganda alters “the picture to which men respond, to substitute one social pattern for another”. Repetitive, mediated representations function to instil certain stereotypes and alter or manipulate values, beliefs, and attitudes, thus constructing opinion and ideology. Neale (1993: 42) argued that stereotypes themselves work to “reduce the complexity and heterogeneity inherent in a process and its relations to a single, homogeneous (and repetitive) function”. I argue that stereotypes, but also symbols and slogans, are small, discrete units of ideology that function in their own repetitive process, making them easier to propagate and harder to resist, especially regarding groups, events, or ideas like the January 6 riots, ISIS, Marxism, ANTIFA, or Vladimir Putin, which most people (targets of propaganda) have little, if any, direct contact with but access through mediated representations. This barrier or censorship between target and subject creates the space for a propagandist to do their work. This can range from effecting specific responses to ideas or events through repeating symbols, slogans, or stereotypes, to displacing an entire “social pattern” or ideology in times of, for example, revolutionary change or religious conversion. Repeating messages, symbols, or slogans fosters familiarity and ideological norms through banal propaganda, while more immediate problems find response via stereotyping and hot propaganda.

Repetition is, of course, impossible without the “endless” redistribution and dissemination of the propaganda message and symbols (see Figure 1: 5). Scholars have studied distribution and dissemination of propaganda dating from Neolithic cave paintings depicting battle victories (Sample et al., 2019) to the explosion of media and communication networks of the twentieth and twenty-first centuries (Jowett & O’Donnell, 2019). Current propaganda studies (Bakir et al., 2019; Howard & Kollanyi, 2016; Wardle & Derakhshan, 2018; Woolley & Howard, 2016, 2017; Zuboff, 2019) focus on developments in Big Data, AI bots, and algorithms within communication networks, alongside the fact that linking social media accounts allows any user to simultaneously disseminate content across multiple platforms easily and cheaply, potentially reaching large, diverse audiences, instantly repeating and reproducing the message. The considerable impact of these digital developments on distribution and dissemination of propaganda notwithstanding, it is worth noting that other media, like television, and symbols like hats (in the case of Trump), paintings (in the case of Obama), and written Z’s (in the case of Russia) remain significant when disseminated effectively. Herman and Chomsky (1988) developed a filter model describing the censorship of information and perspectives that run contrary to powerful interests, sharply limiting the discourse that is distributed through the mass media, using political economy as an explanatory mechanism. The model has been updated for the contemporary media ecology (Boyd-Barrett, 2017; Hyzen, 2023; Pedro-Carañana et al., 2018; Robinson, 2015).

Less discussed is how distribution and dissemination represent another site of manipulation in the process of propagandising: the contextual environment (see Figure 1: 4). In reaction to Ellul’s (1965) book Propaganda, McLuhan adjusted his famous slogan “the medium is the message” to “the medium is the massage”, stating: “The environment as a processor of information is propaganda” (as cited in Allan, 2014: 37). As such, the manipulative power of the contextual environment in which messages exist can be distorted, or massaged, towards an ideological goal as much as the message itself. This is a crucial method by which true or accurate information can be unduly contextualised, reproduced, and amplified to shape discourse and ideological beliefs or dominate echo chambers in a propaganda campaign. Controlling the contextual environment, while simultaneously limiting dissenting accounts or marginalising competing topics (Herman & Chomsky, 1988), can be combined with robust repetition to push specific, or chosen, ideological agendas to centre stage to be exploited.

Currently, the “vast majority of political speech acts [occur] over digital platforms” (Woolley & Howard, 2016: 4882), and the advent of computational propaganda has given propagandists new and effective tools in the management of public opinion. Particularly, it is possible to repeat and reproduce symbols and messages cheaply and automatically through bots, which produce automated content and appear as “real” users, and algorithms, which dictate the order and amount of content that appears to social media users. Howard and Kollanyi (2016: 5), analysing the role of bots in the run-up to the UK’s vote to exit the EU, found that bots have been deployed “around the world to attack opponents, choke off hashtags, and promote political platforms [and] for amplifying messages”, continuing that “less than 1 percent of sampled accounts generate almost a third of all the messages”. This development gives potency to propaganda in a most fundamental way, as bots can generate repetition, confirmation, and the appearance of hegemony:

The volume of these messages that reinforce values can effectively alter the target’s perception. When the targeted user seeks to verify the content of a message, a large number of similar messages are returned, and the target now knows that other people share the same values and beliefs. (Sample et al., 2019: 177)

Bots can also effectively be used to heat up conflict: Vaccine debates (pre-Covid-19) were instigated and propagated by Russian bots acting as trolls, taking both sides of debates, using mis- and disinformation “with the intention of promoting discord” (Broniatowski et al., 2018: 1378). Despite these striking technological developments, it is worth noting that propaganda, in essence, remains the same. Bots remain a programmable tool that a propagandist uses and directs in the same manner as pamphlets, television interviews, and the like.

Global digital networks allow individuals or groups disseminating unorthodox ideologies, like extremist religion or conspiracy theories, to shape, maintain, insulate, and propagate their views to nearly anyone in diverse ways. Conspiracy theorist Alex Jones’s messages take a leap into the irrational: the globalist cabal’s world domination via interdimensional aliens, animal-human hybrids, and chemicals that “turn the frogs gay”. However, his online Infowars programme is nearly identical to cable news production, following a highly conventional format: Jones anchors the programme, goes live on location, has ticker headlines, hosts regular reporters, and interviews experts. Before Infowars was removed from mainstream social media, it reached a massive worldwide audience through Internet distribution, garnering “two million weekly listeners to his syndicated and streamed radio show, up to 1.3 billion views to his YouTube channels, and twenty million monthly visits to Infowars.com” (Hyzen & Van den Bulck, 2021a: 180; see also Hyzen & Van den Bulck, 2021b). In contrast, QAnon’s support of Trump amounts to the most conventional ideological goal and use of propaganda but with an innovative, interactive approach. Dissemination started with “Q” posting cryptic messages claiming to be a top US military insider through “anonymous imageboard 4chan, later extending activities to 8chan/8kun” (Hyzen & Van den Bulck, 2021a: 181), before moving to mainstream message boards, social media, and even reaching the nightly news. These allowed followers to engage with Q, the “evidence”, and each other, mirroring a roleplay game, to find ways to resist and expose the deep state. This may mark a truly novel approach to propaganda, yet it resonates with McLuhan’s assertion “that propaganda is directed toward the masses via the media in such a way that individuals feel the message is directed solely toward themselves” (as cited in Allan, 2014: 37). In the case of QAnon, this was done with the familiar message of political intrigue, coup d’état, and conflict between factions of the ruling class, disseminated in a unique and politically focused package, and in the case of Jones, with the unfamiliar, endlessly sceptical conspiracy theories disseminated in the most orthodox package, mimicking familiar legacy media in its fundamental format. Both Jones and QAnon represent grassroots efforts with great impact afforded by recent technological developments but conforming to the premise of disseminating propaganda to spread values + beliefs to achieve ideological goals.

Propaganda audiences: Patsies, propagators, or both?
Transmission, (trans)action, cycle

There is no propaganda without an audience, yet the exact relationship and impact remains difficult to capture both theoretically and empirically. Given the mediated nature of propaganda, understanding the relationship between propaganda and audiences is caught in a longstanding debate between effects models (what do media do to audiences) and audience theory models (what do audiences do with media). Early authors of propaganda studies often postulated a passive, malleable “mass audience” subjected to domineering mass media. These and similar views were reflective of a “position emphasizing the structural constraints […] often ascribed to the effects research within the mainstream of mass communication research or critical theory in the wake of the Frankfurt School” (Bolin, 2012: 798), emphasising the social cement that media provide in a world of atomised masses. Such transmission frameworks have been fine-tuned by generations of theories and “thousands of empirical studies, into the cognitive, emotional, attitudinal and behavioural effects of media” (Valkenburg & Peter, 2013: 221), also regarding propaganda, suggesting at best consistent but small, and often contradicting, effects. What is more, the overall paradigm has been criticised by various interpretations of an active audience, from the psychological uses-and-gratifications seeker, via cultural studies’ active audience as a sociological being, to the digital participatory audience (Bolin, 2012).

Nevertheless, Lippmann’s sentiment is often echoed in current literature on post-truth, social media echo chambers, and computational propaganda, particularly towards conspiracy theories and Trumpists – that these phenomena and techniques simply dominate a passive, gullible audience. It includes notions that an average social media user lacks the ability to “critically assess the forcefulness of an argument” (Howard & Kollanyi, 2016: 5), that conspiracy theorists are “amateur sleuths and Internet detectives” (Cassam, 2019: 23–24), and that bots and trolls take advantage of an “inability to rapidly discern truth”, relying on “a mixture of ‘true believers’ (also known as ‘useful idiots’) acting as trolls, paid trolls, and AI-controlled chatbots” (Sample et al., 2019: 177). These statements seem to confirm Livingstone’s (2015) observation that with each new wave of socio-technological developments (and new generation of researchers), the media-audience relationship is rearticulated, despite earlier insights. Nevertheless, there are empirical indications of the apparent success of propaganda campaigns spreading new, old, and unpredictable ideological views, like Pew data showing that the conspiracy theory that the Covid-19 pandemic was deliberately planned by powerful people is considered definitely true by 5 per cent and probably true by 20 per cent of the American population – roughly 82 million people (Schaeffer, 2020).

To understand the relationship between propaganda and audiences then and now, I employ Livingstone’s (2015: 441–442) notion of a “cyclic and transactional model [that] recognizes that audiences are vital to completing the ‘circuit of culture’”, or, in the case of propaganda, the circuit of ideology “to shape the forms and flows of meanings in society”, with an emphasis on the dynamic interaction between the various steps in the cycle. To account for the historical and current context of propaganda, I follow Bolin’s (2012) distinction between two types of “active audience”, or audience production-consumption circuits: audience activities that “produce social difference (identities and cultural meaning) in a social and cultural economy” (Bolin, 2012: 796) and how these become the “object of productive consumption” (Bolin, 2012: 796) or audience labour for the propagandist and related media. While Bolin emphasised economic profit as the end goal, I argue that the propagandist’s primary goal is ideological “profit” of proliferation (cf., Price’s “market for loyalties”).

Varied, not limitless, interpretations

The first interpretation of the active audience refers to the active role of audiences in decoding messages and symbols, what Fiske called semiotic productivity and Hall (1985) identified as modes of decoding (see Figure 1: 7) (Bolin, 2012). It is important to recognise that what is normative to one audience is decoded as propaganda and confrontational by another and, of course, propagandists exploit this. As such, “significant symbols” have different meanings at different times due to interpretation. For example, subtle political and cultural reference points are amplified by social media platforms. An expression like “Let’s Go Brandon” – a slur towards President Biden – has no literal meaning outside the propaganda slogan of the small in-crowd of his opposition. The tendency to develop, code, and propagate deeply meaningful symbols and expressions on Internet forums, uninterpretable to outsiders, is even more prevalent among conspiracy theorists – like considering all globes heliocentric propaganda or holding up three fingers as an illuminati sign. What is relevant and persistent is audiences engaging with the core ideological meaning that is being repeatedly propagated.

Just as propaganda is mostly not about engaging audiences with rigorous epistemology or newsroom practices, so audiences engage in their own “methods” of information gathering. When Lippmann (1957: 25) argued that the masses do not act on “direct and certain knowledge” but rather on “pictures made by himself or given to him”, continuing, “if his atlas tells him that the world is flat he will not sail near what he believes is the edge”, he was wrong. Today, Flat Earthers hold conferences to plot expeditions to Antarctica to obtain “direct and certain knowledge”, specifically to locate the edge of the Earth (Dobson, 2019). In other words, when scepticism takes hold, often the result of relentless repetition of particular propaganda messages like flat Earth or a stolen election, the picture of events “made by himself or given to him” becomes a more dynamic, negotiated process that hinges upon ideological foundations and upon Cassam’s (2019: 96) point that such audiences reject “officially sanctioned experts or sources of information”, not their own trusted sources of information. This point holds for expert opinion regarding economics, gender politics, medical science, and so on – not just geology. Uncertainty is an excellent opportunity for a propagandist to introduce new pictures, but audiences do not simply accept these carte blanche or without requiring some explanation or epistemology, the way Lippmann’s passive characterisation insinuates.

Producing propaganda: The cyclic nature of consumption, production, dissemination

Beyond semiotic or interpretative productivity, audiences can also manifest textual productivity (see Figure 1: 8), which results in semi-permeable boundaries between the audience and the propagandist. This is not as new as some research into propaganda and social media would suggest. As messages move from creator to audience, we recall Lasswell’s (1935: 187) observation that “the propagandist is more object than subject of propaganda”, which applies to peer-to-peer content production and distribution that can snowball over and across platforms: “whether modified or not by the consumer, the core message often remains intact, acquiring a ‘new life’ in each new wave of content dissemination” (Wanless & Berk, 2017: 113). As such, we can distinguish between Propagandists like President Zelensky or Alex Jones and propagandists, or audiences of a Propagandist who consume and embrace the message and then take part in its further dissemination.

In this regard, Wanless and Berk’s (2017: 112) notion of participatory propaganda incorporates an active audience into the propagation aspect, particularly in the study of computational propaganda, echo chambers, and social media. They noted that, due to developments in digital media, “the traditional separation between ‘the propagandist’ and ‘target audience’ is rapidly blurring, with the latter beginning to play a more significant role in spreading propagandistic content and influencing others through personal networks” (Wanless & Berk, 2017: 112). Their study accounts for the digital forums and social media that amplify peer-to-peer propagation of information, important for the consolidation of imagery and identity within an engaged community like QAnon, alongside dissemination generally. Alongside the blurring of propagandist and audience, there is a blurring of mass, social media, and personal networks: forums like those used by QAnon are accessible to both Propagandists and propagandists. Likewise, memetic content, derivative work including memes, remix, mashups, and so on, highlight “the internet as a communications medium characterized chiefly by its participatory creative culture” (Sharbaugh & Nguyen, 2014: 141), fruitful ground for propaganda and repetitive dissemination. The audience can use new or existing images, upload these to meme generators that remix content, and then post them on social media, taking part in both the creation and distribution of propaganda. Moreover, memes allow messages, even extremist content, to “hide in plain sight”, cloaked in ironic comedy and obscuring intent with cynicism while being repeated (Hyzen & Van den Bulck, 2021a: 183). The autonomy of such a participant audience should not be overestimated or taken for granted, as this type of engagement or “circuit” can quickly result in the audience performing free labour, that is, becoming a form of “free” ideological dissemination for the original Propagandist.

Intended and unintended audiences

Finally, propaganda reaches both intended and unintended audiences (see Figure 1: 9), possibly more often today due to rapid and widespread online communication networks and dissemination opportunities provided by bots and other advanced Web 3.0 means. For instance, President Zelensky posted on Instagram a photo of a Ukrainian soldier wearing an SS Totenkopf patch to commemorate Victory Day: The post was quickly deleted but reached unintended audiences when Russia seized upon it to illustrate their de-Nazification claims (Norton, 2022). In this regard, baiting an audience with counter-propaganda, made possible by the flexibility of propaganda, can be purposefully done to energise a group or incite conflict. Propaganda that focuses on a specific issue makes its “content” effective in multiple ideological directions, specifically towards an indoctrinated audience that is unlikely to change views in response to a single propaganda message yet responds strongly due to entrenched beliefs. As such, the propaganda message of Propagandist A intended to rally support with target audience B can, in turn, be used by Propagandist B to send their own message, whether to a different audience C or repackaged and reframed as counter-propaganda for the original audience B. For example, the American January 6 Committee investigating the Capitol Hill riots replayed messages and footage of Trump’s speech and tweets. Trump tweeted “Statistically impossible to have lost the 2020 Election”, after his top advisers and legal team informed him that he had, in fact, lost (Dreisbach, 2022), continuing in his January 6 broadcast speech: “We fight like hell. And if you don’t fight like hell, you’re not going to have a country anymore” (Naylor, 2021). Both propaganda messages, containing clear ideological goals, were intended to rally a target audience of MAGA supporters. The January 6 crowd ranged from mainstream Republicans to ex-police officers, from Proud Boy men’s rights activists to QAnon followers, to conspiracy theorists hoping to break up paedophile rings or ascend to higher dimensions. All of these groups must have decoded the “Stop the Steal” campaign somewhat differently, yet coherently placed Trump’s political propaganda inside unique ideological views. Furthermore, “Stop the Steal” propaganda reached a far wider audience, a large part of whom decoded it radically differently: that Trump, rather than being the victim of a coup d’état, was perpetrating a coup d’état. It was reproduced as neutral information on news outlets that stated the fact that he had lost the election with no evidence of fraud. But it was also used as propaganda by the January 6 Committee as part of their public relations efforts to frame Trump’s actions as at the least unethical and at the most seditious.

The notion of unintended and intended audiences introduces several paradoxes. We cannot pretend to have a god’s eye view into the exact intentions of a propagandist, whose goals might be multidimensional or highly manipulative disinformation campaigns. Propagandists typically do not openly explain the thinking behind their efforts, while others sincerely believe their intention is spreading the “truth”. Furthermore, it reintroduces the issue of causation: Would a rebroadcast to a new audience be “the original” propaganda or two distinct propaganda efforts, the second being a further ideological redescription, thus new propaganda? Either way, the point remains that propaganda quickly becomes a double-edged sword. This is apparent in conflict and terrorism studies. For example, the highly publicised ISIS “Beatles” (four English-speaking British citizens recruited to fight for ISIS) produced and distributed video, overlaid with verbal and symbolic propaganda messages, of killings of four American hostages in Iraq and Syria (Singh, 2022). These messages, in a new contextual environment, served as propaganda in Western media to frame Middle East policy and justify drone strikes against ISIS and similar groups: not by altering the content, but “by constantly discussing horrific videos on the news, as was done with the ISIS beheading video”, which ultimately “inspire[s] fear and draw[s] individuals’ attention” (Redmond et al., 2019: 566). As discussed above, this argument turns on whether we wish to view the propagandist’s intended audience as the audience intended at the time of the message’s creation or at its dissemination, since the message could, for example, be repeated as part of a broadcast to a different audience without ideological intentions.

Conclusion

Maintaining that propaganda constitutes the expression of ideology and power in the form of mediated communication, I have sought to elaborate on various underexplored points to further propaganda studies and analysis. Specifically, propaganda can be, and often is, true information and, like mis- and disinformation, which are answerable to the facts, propaganda can be answerable to the larger truth. I have further argued that the repetitive nature of propaganda is fundamental to successful campaigns. Tied to distribution and dissemination, repetition enables propaganda to impose stereotypes upon individuals through repeated mediated content, which they then absorb as beliefs and values – or not. As such, the contextual environment, both as an information processor itself and as the structural forces that have bearing on it, plays a key role in propagandising and is a prime location to “distort” true information. Lastly, I have argued that the audience not only receives the message in various ways but can also play an important role in spreading propaganda: by actively interpreting the message, engaging social media platforms, and participating in the remix culture of the Web 3.0, they become propagandists in their own right or ideological labour for the original propagandist. Following Livingstone and Bolin, I have argued that a cyclic transactional model is the best fit for propaganda to account for the audience as both interpreters and participants.

While I aim to help define a programme for analysis of propaganda, I have not discussed methods, for the main reason that the various aspects of propaganda identified above can be studied using the entire qualitative and quantitative methodological toolkit at a scholar’s disposal: from ethnographic research of audiences actively engaged in propaganda dissemination through meme culture, to semiotic analysis of propaganda messages, to Big Data analysis of the role of bots in dissemination. Soon, researchers will be able to apply AI to extract visual content, linguistic content, and metadata – for example, of a meme or TikTok video – to study propaganda. Crucial is the attention to validity: A qualitative analysis of propaganda messages can gauge the relationship to the facts and truth, but it will not say much about intent. Quantitative analysis of a propaganda message’s dissemination can track repetition but cannot account for impact. The complexity of propaganda, in my view, makes it a sum greater than its parts, and our understanding of it would benefit from a multi-perspective, multi-method, and interdisciplinary approach.

While I have put forth arguments to better conceptualise and operationalise propaganda, there are inevitable shortcomings. I have tried to capture propaganda as a more general phenomenon, rather than adopt a reductionist approach, in contrast to some other scholarship. Likewise, my defence of the possibility of “truth” may attract criticism from sceptics. As argued, causation and intention both remain problems for future theoretical work. Ultimately, I hope this contribution will be the beginning of such questions, rather than the end.
