Open Access

A Network Analysis of Twitter's Crackdown on the QAnon Conversation

Even casual observers of the raid on the U.S. Capitol on January 6, 2021, could not help but notice the large numbers of individuals wearing clothes and other paraphernalia declaring their allegiance to ‘Q’ and the QAnon movement. Their presence was somewhat unsurprising since QAnon followers have appeared at Trump rallies since 2018 (Bank, Stack and Victor 2018). QAnon is a conspiracy theory whose followers believe that former President Trump is (or was) fighting a ‘deep-state’ cabal of Satan-worshipping, cannibalistic pedophiles running a global child sex-trafficking ring,

Cabal members reportedly kill and eat their victims to extract a life-extending chemical from their blood.

as well as controlling just about everything that matters: politicians, the media, Hollywood, the world (Rozsa 2019). Conspirators allegedly include liberal Hollywood actors (e.g., Tom Hanks), Democratic politicians (e.g., Hillary Clinton), financial elites (e.g., George Soros), and even some religious leaders (e.g., the Dalai Lama). QAnon holds that Trump is (or was) planning a day of reckoning known as the ‘Storm’ when he would first unmask the conspiracy and then arrest, prosecute, and execute its leaders for their crimes. Accompanying the Storm will be the ‘Great Awakening,’ when Americans finally see the corruptness of the U.S. political system that has enslaved them for some time. Followers of Q just need to ‘trust in the plan.’

Many followers believed that the raid on the U.S. Capitol was, in fact, the advent of the Storm, and many were disappointed when it did not materialize. For some, disappointment turned into disillusionment two weeks later with Joe Biden's inauguration, an event many QAnon followers believed would not happen (Sardarizadeh and Robinson 2021). For example, Ron Watkins, a former administrator of 8chan, the site where many of Q's posts appeared, and whom some believe to be Q, suggested it was time to ‘go back to our lives as best we are able’ (Collins and Zadrozny 2021a). Others, however, did not abandon the faith, believing that Trump and Q were still controlling events and that Biden's inauguration was ‘all part of the plan’ (Harwell 2021, Sardarizadeh and Robinson 2021).

Beginning with a post on the 4chan imageboard in October 2017,

Imageboards are online forums where users post images, often with accompanying text and discussion.

QAnon narratives diffused across popular social media platforms, such as Twitter and Facebook, highlighting the challenges that cross-platform social media networks pose for understanding conspiracy theories, as well as countering misinformation contained therein. The movement has attracted a wide range of followers and admirers, including Republican members of Congress, individuals involved in former President Trump's attempts to overturn the 2020 election, and perhaps most notably, retired Lieutenant General Michael Flynn, who pledged an oath to QAnon on July 4, 2020 (Cohen 2020, Rosenberg 2021). Although Trump plays a central role in the conspiracy, he has pleaded ignorance about it, except for remarking that he heard it was working against pedophilia (Berman 2020). He has, however, helped publicize QAnon by retweeting or mentioning 152 QAnon-affiliated Twitter accounts, sometimes several times a day (Nguyen 2020). Some followers refer to him as ‘Q+’ (LaFrance 2020).

In May of 2019, the FBI issued an intelligence bulletin identifying QAnon as a potential security threat (Federal Bureau of Investigation 2019). A little over a year later, Twitter, claiming that QAnon messaging could lead to violence, took steps to counter the diffusion of QAnon content on its platform and prevent it from spreading to others. It banned thousands of QAnon-affiliated accounts and attempted to block QAnon-related URLs from being shared on its platform. It also altered its algorithms to no longer recommend content from other accounts associated with QAnon or highlight them in search and conversations (Conger 2020). Other platforms soon followed suit. Facebook first restricted QAnon activity and then banned it altogether (Collins and Zadrozny 2020, Heilweil 2020). YouTube and TikTok took similar steps. Then, in January 2021, shortly after the Capitol raid, Twitter took additional steps by removing more than 70,000 accounts promoting QAnon (Savov 2021). This appears to have led followers to migrate to other social media outlets, such as 8chan (now 8kun), Gab, Telegram, Parler, and so on (Bond 2021, Rahn and Patterson 2021).

This paper draws on insights from diffusion/contagion studies and uses social network analysis to examine online aspects of the QAnon movement, in particular, networks created by Twitter users sharing URL links to other social media platforms. Specifically, it analyzes users’ URL sharing patterns before and after Twitter's attempt to crack down on the movement in the summer of 2020. While there are several ways to expose people to materials (e.g., videos) on other social media platforms (e.g., through colleagues and family), sharing URLs is an efficient means for doing so in near real-time (Singh et al. 2020). URLs can offer new pathways into extant social networks on other social media platforms and forms of media (Fu and Shumate 2017), especially when one platform becomes increasingly hostile. However, extant literature has paid little attention to multiplatform diffusion, including the spread of conspiracy theories from one platform to others. Thus, our analysis is novel in that it focuses on between-platform diffusion of QAnon-related information during critical periods of its existence.

The remainder of our paper proceeds as follows. In the next section, we provide an overview of conspiracy theories and an abbreviated history of QAnon. We then draw on diffusion/contagion theory to help place its spread in context. Next, we describe the data we use in our analysis. For reasons discussed below, we focus primarily on tweets and retweets sharing QAnon-related URLs during specific periods before (peak) and after (post) Twitter's banning of QAnon-related accounts in July 2020. Our analysis first considers the distribution of links to other social media platforms and then examines the peak and post retweet networks using standard social network analysis measures and algorithms. We follow this with a discussion of potential strategies for impeding the spread of QAnon (and other conspiracies). The paper concludes by considering opportunities for future research.

Conspiracy Theories and QAnon

Conspiracy theories contend that lying behind significant political and social events is a small group of powerful people secretly acting for their benefit and (usually) against the common good (Douglas et al. 2019, Sunstein and Vermeule 2009, Uscinski 2020). Because they attribute exceptional powers to conspirators, they are notoriously difficult to refute. Followers are unlikely to accept any evidence that purports to refute a conspiracy since they most likely believe that those attempting to do so are either agents or dupes of the conspirators (Rosenblum and Muirhead 2019, Sunstein and Vermeule 2009, Uscinski 2020). Moreover, many of today's conspiracists use innuendo and make bare assertions to spread false claims and shape narratives that can fuel conspiracy theories, including those related to QAnon (Rosenblum and Muirhead, 2019).

True to form, QAnon attributes extraordinary powers to a group of people who it claims have successfully duped Americans for some time. Although numerous studies have discredited it, and many, if not most, of Q's predictions have failed to come true, this has had little effect on QAnon followers. This pattern aligns with work on how followers often continue to believe even in the face of evidence to the contrary (Festinger, Rieken and Schachter 1956, Uscinski 2020). One difficulty in countering QAnon lies in its followers’ distrust of mainstream media. They see outlets such as CNN, The New York Times, NPR, and The Washington Post as dishonest and (either wittingly or unwittingly) in league with the conspiracy. Thus, they have turned to alternative outlets that, while not necessarily promoting QAnon, have not actively sought to undermine it either.

QAnon is built, in part, on the Pizzagate conspiracy, which claimed that the personal emails of John Podesta, Hillary Clinton's campaign manager, contained coded messages tying several high-ranking Democratic Party officials and U.S. restaurants to a human trafficking and child sex-trafficking ring (Martineau 2017, Tangherlini et al. 2020). Comet Ping Pong pizzeria in Washington, D.C., was reportedly one such restaurant. This led a North Carolina man to travel to D.C., where he entered the restaurant, fired shots, and then opened a door he believed led to the basement where he would find evidence of child trafficking. It was, however, just a broom closet (Comet pizzeria has no basement). Although this particular event undermined Pizzagate's credibility, Q successfully adapted aspects of it (Uscinski 2020, Wikipedia 2021b).

Q's first post (aka, ‘Q drop,’ ‘intelligence drop’) occurred on October 28, 2017, on 4chan, an anonymous English-language imageboard.

Anonymous imageboards deliberately obscure their posters’ identities. Those who wish to maintain a consistent identity but remain anonymous use a tripcode, which associates a post with a unique digital signature for anyone who uses the password associated with that tripcode (Wikipedia 2021a).

Q attracted an audience largely thanks to two 4chan moderators and a video producer. Together, they publicized Q, including creating a ‘subreddit’ to reach a larger audience and encourage discussion about Q's posts (Zadrozny and Collins 2018).

Reddit is a news aggregation, discussion, and web content rating website. Registered members can submit content (e.g., links, text posts, images, and videos), which are then voted on by others. Posts are organized by subject into user-created discussion groups called ‘subreddits.’

Reddit eventually shut down the subreddit, fearing that the discussions could incite violence (Zadrozny and Collins 2018). Q's posts eventually moved to 8chan because Q believed 4chan might have been infiltrated (Zadrozny and Collins 2018). Once there, Ron Watkins, a site administrator who many believe is Q, helped publicize Q's posts further, as did a Falun Gong YouTube channel (Rothschild 2019). Q also received a boost from Russian-backed Twitter accounts; beginning in November 2017, #QAnon was the most frequent hashtag they tweeted (Menn 2020).

Hashtags are tags prefaced by the hash symbol, #. They are widely used on social media platforms because they allow users to cross-reference particular subjects or themes.

Q initially claimed to be a high-level government official with access to highly classified information. Many QAnon followers now believe that Q is actually a team of high-ranking individuals executing a military intelligence operation that has been in the works for some time. Some believe the military specifically chose Donald Trump to help them carry out this operation. Many outside observers also believe that Q is more than one person. A machine-learning stylometric analysis of 4,952 posts from October 28, 2017, to November 13, 2020, concluded that two authors wrote Q's posts at two different points in time (OrphAnalytics 2020).

Q's posts tend to be cryptic. Followers claim this reflects the classified nature of much of the information that Q seeks to share. As such, Q only drops ‘breadcrumbs’ that followers then interpret to uncover ‘the truth,’ which they then share with the QAnon community through their own posts. Followers also parse Trump's speeches, believing they contain hidden clues. For instance, QAnon's belief in the coming Storm originates from a remark made by Trump at a gathering of military officials in October 2017: ‘You guys know what this represents? Maybe it's the calm before the storm. Could be the calm, the calm before the storm’ (quoted in Hartman 2017). Poring over and reflecting upon Q's posts and Trump's speeches has led to a proliferation of beliefs,

In late 2017 or early 2018, Dylan Louis Monroe, reportedly a QAnon follower, mapped QAnon's numerous beliefs.

such as: the government is controlled by Jews; Hillary Clinton is in jail; JFK, Jr. is still alive and will soon return to help Trump bring down the deep state; Covid-19 is a hoax; and Joe Biden's election was a fraud.

Not everyone embraces all of the beliefs associated with the movement, but most believe Trump will bring down the deep-state cabal of elites who worship Satan, abuse children, and run the world for their own benefit.

A popular slogan among QAnon followers is the acronym WWG1WGA (“Where we go one, we go all”), which Q first used in April 2018 and which soon appeared on t-shirts and bumper stickers and as a hashtag.

Diffusion/Contagion of the QAnon Conversation - Hypotheses

See Table 7 for a summary of the hypotheses and the results of the analysis.

Diffusion is a theoretical perspective that explains how ideas and information spread across populations (Rogers 2003). From a social network perspective, numerous factors can affect the speed and reach with which a belief system such as QAnon spreads (Valente 1995). Often, central actors embrace an idea or practice sooner (e.g., Banerjee et al. 2013, Christakis and Fowler 2010, Valente and Vega Yon 2020). However, there is evidence that they can be reluctant to adopt new ideas or practices that are ‘costly, risky, or controversial’ (Centola and Macy 2007:703), suggesting that in some instances the earliest adopters may lie on the margins (Hassanpour 2017, Steinert-Threlkeld 2017), where ‘they are freed from system norms’ (Valente and Vega Yon 2020:5). The QAnon conversation probably started on the fringes of Twitter and other social media outlets. As it became more popular and more socially acceptable, perhaps as early as 2018 (Bank, Stack and Victor 2018), it likely spread to more popular Twitter accounts that could have been seen as relatively ‘reliable sources’ of information (e.g., accounts with numerous followers, accounts frequently retweeted for all their tweets, not just QAnon-related ones, and accounts whose tweets other users have ‘liked’ (favorites)), which then helped spread it across Twitter and potentially exposed others to relevant materials on other platforms. Since the period we analyze here lies in this later stage, we expect the tweets and retweets of a handful of actors to account for most of the sharing of links to external social media sites, which is likely to produce clustering and increased centralization. Restating these observations more formally as hypotheses:

H1: A handful of actors will account for most of the link sharing to external social media sites.

H2: “Reliable” actors will be central before and after Twitter's crackdown in July 2020, even if they are different users in each period.

H3: Because of the presence of a handful of reliable actors, we expect relatively high levels of clustering among networked accounts in each period.

H4: Because of the presence of a handful of reliable actors, we expect moderate-to-high levels of centralization in both periods.

Closely related (but not identical) to network centralization is whether the QAnon retweet networks will exhibit the characteristics of a scale-free network. In scale-free networks, most nodes have very few ties, but a few have numerous ones, such that the distribution of ties across nodes follows a power law: the number of nodes with degree (number of ties) k is proportional to k^(−a), where a > 1 for ‘weakly’ scale-free networks and 2 < a < 3 for ‘strongly’ scale-free networks (Barabási and Albert 1999, Broido and Clauset 2019). In evaluating whether a network is scale-free, Broido and Clauset (2019) apply this power-law ‘requirement’ to only the distribution's upper tail, although they require the upper tail to have at least 50 nodes. Scale-free networks often develop through preferential attachment, which occurs when new actors joining a network tend to form ties to existing actors with numerous ties rather than those with very few (Barabási and Albert 1999). Because we believe that as users joined the QAnon conversation, they were more likely to retweet posts from ‘reliable sources,’ we expect the peak and post networks to be scale-free:

H5: The QAnon peak and post retweet networks will be scale-free.

Some have speculated that the distribution of followers on Twitter is probably scale-free and caused by a preferential attachment process (see e.g., Borgatti, Everett and Johnson 2018:302, Klarreich 2018).
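The power-law test behind H5 can be made concrete with a small sketch (ours, for illustration; not the authors' code). It grows a network by preferential attachment and estimates the tail exponent a with the continuous maximum-likelihood estimator commonly used in this literature; the function names and the k_min cutoff are illustrative assumptions.

```python
import math
import random

def preferential_attachment(n, m, seed=7):
    """Grow a network of n nodes: each new node forms up to m ties,
    choosing targets with probability proportional to current degree
    (in the spirit of Barabasi & Albert 1999)."""
    rng = random.Random(seed)
    edges = []
    pool = list(range(m))              # a node appears once per tie it holds
    for new in range(m, n):
        targets = {rng.choice(pool) for _ in range(m)}
        for t in targets:
            edges.append((new, t))
            pool.extend([new, t])      # both endpoints gain degree
    return edges

def degrees(edges):
    """Degree of every node in an undirected edge list."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return list(deg.values())

def tail_exponent(degs, k_min=5):
    """Continuous MLE of the power-law exponent a, fit only to the
    upper tail (degrees >= k_min), as in tests for scale-free structure."""
    tail = [k for k in degs if k >= k_min]
    return 1 + len(tail) / sum(math.log(k / (k_min - 0.5)) for k in tail)
```

Run on a preferential-attachment network, the estimate typically lands near 3, inside the ‘strongly’ scale-free band; applying the same estimator to the peak and post retweet degree distributions is the idea behind H5 and H9.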

Tie strength can also play an important role in diffusion. Mark Granovetter (1973) has argued that weak ties are ‘strong’ because they are more likely than strong ties to form bridges between otherwise unconnected social circles. More recently, Damon Centola and Michael Macy (2007) have shown that although weak ties can facilitate the spread of some ideas or innovations, the adoption of a behavior, idea, or information that is ‘costly, risky, or controversial,’ what they call complex contagion, requires numerous, strong ties to prior adopters because individuals often ‘require independent affirmation or reinforcement from multiple sources’ (Centola and Macy 2007:703). Consistent with their analysis, studies have found that the process by which hashtags spread across Twitter is a function of whether they are controversial or noncontroversial (Romero, Meeder and Kleinberg 2011). For example, the diffusion of controversial political hashtags exhibits the characteristics of complex contagion because most users need to have ‘contact with up to five or six adopters before adopting a new political hashtag.’ In contrast, the diffusion of less controversial hashtags exhibits the characteristics of simple contagion in that they ‘typically spread from person to person with only a single contact’ (Centola 2018:89).
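The simple/complex distinction can be illustrated with a toy threshold model (an illustrative sketch, not part of the paper's analysis): a contagion requiring a single exposure crosses a lone bridging tie between two clusters, while one requiring reinforcement from two adopters stalls at that bridge.

```python
def spread(adj, seeds, threshold):
    """Deterministic threshold contagion: a node adopts once at least
    `threshold` of its neighbors have adopted; iterate to a fixed point."""
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, nbrs in adj.items():
            if node not in adopted and sum(n in adopted for n in nbrs) >= threshold:
                adopted.add(node)
                changed = True
    return adopted

def two_rings(n):
    """Two ring lattices of n nodes each (ties to the 2 nearest neighbors
    on each side, so adjacent nodes share neighbors), joined by a single
    bridge tie between node 0 and node n."""
    adj = {i: set() for i in range(2 * n)}
    for base in (0, n):
        for i in range(n):
            for d in (1, 2):
                a, b = base + i, base + (i + d) % n
                adj[a].add(b)
                adj[b].add(a)
    adj[0].add(n)
    adj[n].add(0)                      # the lone 'weak-tie' bridge
    return adj
```

With threshold 1 (simple contagion) a single seed reaches both rings; with threshold 2 (complex contagion) even two adjacent seeds saturate only their own ring, because the bridge supplies just one adopting neighbor.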

Given that the cost or riskiness for sharing QAnon-related content on Twitter during the period we analyze here was probably low (although it may have become riskier since the raid on the Capitol), our peak and post retweet networks are more likely examples of simple rather than complex contagion. As such, we expect them to exhibit three broad topological characteristics: low levels of reciprocity, sparseness, and transitivity. Stated formally:

H6: We expect the peak and post retweet networks to display low levels of reciprocity (i.e., the proportion of tied node pairs whose ties run in both directions, as when account A retweets account B and vice versa), which we interpret as evidence of an absence of strong ties between user accounts.

H7: Because the relative lack of strong ties lowers the probability of transitivity (Granovetter 1973), we expect the peak and post retweet networks to be relatively sparse and exhibit low levels of transitivity.

Aside from the post network being smaller and less interconnected than the peak network, the effect of Twitter's crackdown in terms of the network metrics outlined above is not necessarily obvious. Assuming Twitter targeted more prominent accounts, we should see a decline in centralization from the peak to the post networks. The crackdown could have similarly affected the scale-free nature of the two networks in that the post network could be ‘less’ scale-free than the peak. Stated formally:

H8: The network will become less centralized after Twitter's crackdown in July 2020.

H9: The network will become less scale-free after Twitter's crackdown in July 2020.

Data and Methods

Using the R package rtweet (Kearney 2019), we collected QAnon-related Twitter data (tweets) daily from April 2020 to November 2020 via Twitter's REST API, which allowed access to 18,000 tweets every 15 minutes going back seven days.

In January of 2021, Twitter amended its policy, allowing researchers to access Twitter data beyond seven days.

It is important to note that these data do not constitute a random sample, unlike data drawn from a streaming API. For this paper, we focus on tweets containing the hashtag #QAnon.

If we include several other commonly used QAnon-related hashtags in this count (e.g., #thegreatawakening, #pizzagate, #deepstate, #savethechildren), then the percentage of users sharing QAnon-based hashtags along with #QAnon rises to 42%, and the percentage of tweets to 39%. Since we examined only twenty commonly referenced QAnon-related hashtags, it is highly likely these statistics underestimate the extent to which users tweeted QAnon-related hashtags.

To be included in our analysis, a tweet had to contain the hashtag #QAnon and a link to an external social media platform (i.e., a social media platform other than Twitter). This approach presented tradeoffs for understanding URL sharing patterns among accounts interested in QAnon.

Account-based approaches—those that collect data starting from QAnon-based accounts identified by some sort of criteria—can offer insight into the movement's followers. For instance, Benigni, Joseph, and Carley (2017) identified five known ISIS accounts and used a two-step snowball sample of followers to analyze the organization's support network on Twitter. By focusing on URL sharing behavior, we may have excluded data about those who are not ‘card carrying’ members of the movement but still expose others to QAnon materials. A combination of hashtag and account-based approaches can also offer benefits (Lorentzen and Nolin 2015).

For example, not all accounts in the data adhere to QAnon beliefs. Users can tweet #QAnon for many reasons, including ridiculing its followers or offering counter-narratives. However, 37% of all accounts in the data set tweeted #WWG1WGA at least once during the April–November timeframe, and 32% of tweets contained the hashtag. That Twitter suspended 71% of the ‘peak’ network's most central accounts,

The ‘peak’ accounts (n = 17) are those that scored in the top ten for indegree, authority, and betweenness centrality. Although these measures correlate highly with one another, they give us an idea of which accounts were influential in sharing URLs.

many of whom tweeted other QAnon-related hashtags

For instance, 76% of the central accounts retweeted #WWG1WGA at least once. Overall, 82% of the most central accounts either tweeted this popular hashtag along with #QAnon, Twitter suspended them, or both. All but one of the accounts that were neither suspended nor tweeted #WWG1WGA were in Japanese.

with #QAnon, suggests most of the highly retweeted accounts held QAnon beliefs to some extent. An advantage of our approach is that it excludes accounts like ‘@JoeBiden’ and ‘@HillaryClinton,’ whom many Twitter users engaged with or mentioned frequently in their tweets but who did not tweet #QAnon and are considered enemies of the movement. Moreover, while #QAnon captures only a portion of the broader conversation about QAnon and does not address ‘conversational drift’ (Lorentzen and Nolin 2015), it is an obvious hashtag for users to locate and become exposed to QAnon-related materials. Similarly, our emphasis on tweets with links to external platforms permits us to focus only on the network engaged in exposing others to external resources (e.g., videos). A few other points are worth noting. The tweets did not have to be original; they included retweets as well (see the description of different types of tweets below). Also, we included links to all social media platforms. Most of these were easy to identify (e.g., YouTube, Facebook), but we did have to look more carefully for some social media platforms, such as Bitchute.
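Identifying external platforms from shared links amounts to mapping each URL's domain to a platform label and counting qualifying tweets. A minimal sketch (the domain list, function names, and tweet format are our illustrative assumptions, not the paper's code):

```python
from urllib.parse import urlparse

# Hypothetical domain-to-platform mapping; the paper's actual list is longer.
PLATFORMS = {
    "youtube.com": "YouTube", "youtu.be": "YouTube",
    "facebook.com": "Facebook", "bitchute.com": "Bitchute",
    "instagram.com": "Instagram", "pscp.tv": "PeriscopeTV",
}

def platform_of(url):
    """Return the platform label for a URL, or None if its domain is not
    a recognized external social media site."""
    host = urlparse(url).netloc.lower()
    host = host[4:] if host.startswith("www.") else host
    return PLATFORMS.get(host)

def external_share_counts(tweets):
    """Count tweets per platform among tweets that contain #QAnon and at
    least one link to an external social media site. Each tweet is a
    (text, list_of_urls) pair."""
    counts = {}
    for text, urls in tweets:
        if "#qanon" not in text.lower():
            continue
        hits = {platform_of(u) for u in urls} - {None}
        for p in hits:
            counts[p] = counts.get(p, 0) + 1
    return counts
```

Domains with many URL shapes (e.g., youtu.be short links) are why some platforms require a closer look than a single-domain match.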

Figure 1 plots our data. The vertical line indicates July 21, 2020, when Twitter cracked down on QAnon-affiliated accounts and tweets by removing accounts, blocking trends related to QAnon, and attempting to prevent users from posting QAnon-associated URLs (Conger 2020). These restrictions affected approximately 150,000 accounts (Conger 2020). As the figure shows, there was a substantial decline in #QAnon-related tweets after the crackdown.

Figure 1

Tweets About #QAnon: Daily Aggregated #QAnon Status Counts

Our analysis primarily examines a peak period before the crackdown (June 11–14, 2020—highlighted in yellow) and a post period after the crackdown (July 21–August 11, 2020—highlighted in purple). In particular, we compare data before and after Twitter's ban to explore any shifts in URL sharing behavior. The peak period coincides with civil rights protests in Seattle and Atlanta and with COVID cases exceeding 2 million in the U.S., and it fell shortly after Washington, D.C., renamed a section of 16th Street ‘Black Lives Matter Plaza’ and Trump held a photo-op grasping a Bible in front of St. John's Episcopal Church after protestors were forcefully removed from Lafayette Square in D.C. The post period coincides with Trump's ‘surge,’ in which he sent federal police to “Democrat cities,” his suggestion to delay the November election, wildfires in California, the Sturgis Motorcycle Rally, and Joe Biden announcing Kamala Harris as his running mate.

Table 1 summarizes our data (far right column). It also compares our data to the total number of tweets in each of the peak, post, and entire periods (#Tweets), the number and percentage of tweets that contained a URL in each period, the number and percentage of all tweets that had a social media URL in each period, and the number and percentage of tweets that included a URL to an external social media platform in each time frame. As it indicates, from April to November, approximately 19% of all tweets shared a URL, 7% shared a social media URL, and 3% shared an external social media URL. The percentages change somewhat when looking at the peak and post periods. During the peak period, approximately 25% shared a URL, 10% a social media URL, and 5% an external social media URL, while during the post, about 15% shared a URL, 6% a social media URL, and 4% an external social media URL. The tweets that shared an external social media URL interest us here. Notably, in the peak network, of the tweets that contained a link to a social media URL, 48% (5,938/12,374) were to external social media sites; however, in the post network, the percentage jumped to 60% (4,420/7,355), evidence consistent with our expectation that Twitter's crackdown enticed followers to migrate to other social media outlets.

Data Summary

Period Tweets Tweets with URL Tweets with Social Media URL Tweets with External Social Media URL
April – November 2,330,581 451,490 (19.4%) 166,392 (7.1%) 78,439 (3.4%)
Peak (June 11th – 14th) 124,670 30,966 (24.8%) 12,374 (9.9%) 5,938 (4.8%)
Post (July 21st – August 11th) 115,882 17,459 (15.1%) 7,355 (6.3%) 4,420 (3.8%)

Apart from examining shared URLs, we focus our network analysis on the retweet networks. Twitter data includes six types of relations: (1) mentions, which are tweets in which users mention (usually with the @ symbol) another user; (2) replies, which are tweets that answer another user and are often used to initiate conversations; (3) retweets, which are re-distributed or re-posted tweets; (4) quotes, which are retweets that include a comment; (5) friends, which are those that a specific user follows; and (6) followers, which are those that follow a specific user (Giachanou and Crestani 2016, Twitter 2021). The data also include account attributes (characteristics), several of which we include in our analysis: (1) number of retweets, which indicates the number of times an account has been retweeted; (2) tweets, which is the number of tweets an account tweeted during each time frame; (3) favorites, which shows the number of times other users ‘liked’ an account's tweets; (4) friends, which indicates the number of accounts each user follows; and (5) followers, which is the number of users that follow each user.

We focus on retweet networks because they are excellent for capturing flows (Kane et al. 2014) and URL sharing networks. Moreover, other types of tweets (e.g., mentions) include accounts not actively engaged in sharing URLs. For instance, some #QAnon mentions include ‘@realDonaldTrump,’ although Trump never shared any URLs (at least in our data set); thus, although Trump was a topic of the conversation, he did not participate in the conversation. Finally, it is essential to note that connected pairs include only ‘source’ tweet accounts (i.e., crafters of original tweets) and accounts that retweeted them, regardless of how the retweeting account was exposed to the original tweet. For instance, suppose user A retweets one of user B's initial tweets, and then user C, as a ‘friend’ of A, sees user B's tweet and subsequently retweets it. In this scenario, retweet ties only exist between users A and B and between B and C.

As with social media data collection in general, this is a result of the API we leveraged to collect this data set. It is possible this impacts some of our statistics, such as driving down reciprocity. In this example, one could argue that A and C could have a retweet tie (C→A). If A had already retweeted C on another occasion (C←A), then we would miss out on the former arc (C→A), whereby C retweeted A's redistribution of B's original tweet. Nonetheless, the API output does not provide this information. As we discuss below, it still holds that the well-connected, prominent accounts did not reciprocate (i.e., retweet) others who retweeted them.
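Constructing the retweet network from such output reduces to collecting (retweeter, source author) arcs. A minimal sketch under the constraint described above (field names are illustrative, not rtweet's actual column names):

```python
def retweet_edges(tweets):
    """Build the directed retweet network described in the text: an arc
    runs from the retweeting account to the author of the source tweet,
    regardless of how the retweeter was exposed to it. Each tweet is a
    dict with hypothetical 'user' and 'retweeted_user' fields."""
    edges = set()
    for t in tweets:
        if t.get("retweeted_user"):
            edges.add((t["user"], t["retweeted_user"]))
    return edges
```

Applied to the A/B/C scenario in the text, the output contains exactly the arcs A→B and C→B; the exposure path through A's retweet is invisible to the data.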

Network Measures

Our analysis of the retweet networks begins by calculating commonly used topological measures related to the hypotheses outlined above: density, average degree, transitivity, reciprocity, centralization, diameter, average path length, and size. Density, average degree, and transitivity help gauge a network's interconnectedness. Density is the standard measure of interconnectedness, but because it is inversely associated with network size (de Nooy, Mrvar and Batagelj 2018), we use average degree as an alternative measure. Transitivity equals the ratio of triangles (closed triads) to the total number of connected triads in a network. We use degree and betweenness centralization, diameter, and average path length to estimate network spread. Centralization is the standard measure and is based on the variation in actor centrality within the network (i.e., each actor's score is compared to the highest score). More variation yields higher scores, while less yields lower ones. Diameter equals the longest shortest path (geodesic) between all connected pairs of nodes, while average path length equals the average length of all geodesics between all connected pairs of nodes. Larger diameters and average path lengths indicate greater spread. Finally, reciprocity equals the proportion of tied node pairs whose ties are reciprocated, and size equals the number of actors in a network.
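Several of these measures can be computed directly from an arc list. A sketch (ours, for illustration) for a directed network with n nodes:

```python
def describe(edges, n):
    """Basic topological measures for a directed network given as a set
    of (source, target) arcs over n nodes."""
    density = len(edges) / (n * (n - 1))          # arcs present / arcs possible
    avg_degree = len(edges) / n                   # arcs per node
    recip = sum((v, u) in edges for u, v in edges) / len(edges)
    # transitivity: closed directed two-paths / all directed two-paths
    out = {}
    for u, v in edges:
        out.setdefault(u, set()).add(v)
    two_paths = closed = 0
    for u, vs in out.items():
        for v in vs:
            for w in out.get(v, ()):
                if w != u:
                    two_paths += 1
                    closed += w in out.get(u, set())
    return {"density": density, "avg_degree": avg_degree,
            "reciprocity": recip,
            "transitivity": closed / two_paths if two_paths else 0.0}
```

A pure ‘broadcast’ star (everyone retweeting one source) scores zero on both reciprocity and transitivity, the signature H6 and H7 anticipate for the retweet networks.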

Numerous algorithms exist to detect subgroups within a larger network. Here, we draw on community detection algorithms, which seek to detect clustering levels greater than one would expect purely by chance (Clauset, Newman and Moore 2004, Girvan and Newman 2002, Newman 2006). Next, we calculate four centrality measures—indegree, betweenness, hubs, and authorities—which we correlate with various user account attributes. Indegree indicates the number of users in each network who have retweeted a particular user account's tweets. Betweenness measures the extent to which users lie on the shortest path between all other pairs of users and is often seen as capturing a node's brokerage potential. Finally, hubs and authorities (Kleinberg 1999) identify relatively well-connected actors tied to other well-connected actors. They are similar to eigenvector centrality, but they differ in that they take into account tie direction. Hubs are associated with outgoing ties (outdegree), while authorities are associated with incoming ties (indegree).
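Hubs and authorities can be computed by Kleinberg's iterative procedure: an account's authority score sums the hub scores of the accounts that retweet it, and its hub score sums the authority scores of the accounts it retweets, with normalization each round. A compact sketch (ours, not the paper's implementation):

```python
def hits(edges, iters=50):
    """Kleinberg-style hub and authority scores by power iteration on a
    directed arc list (source, target); here an arc means 'source
    retweeted target'."""
    nodes = {n for e in edges for n in e}
    hub = {n: 1.0 for n in nodes}
    auth = {n: 1.0 for n in nodes}
    for _ in range(iters):
        auth = {n: sum(hub[u] for u, v in edges if v == n) for n in nodes}
        norm = sum(x * x for x in auth.values()) ** 0.5 or 1.0
        auth = {n: x / norm for n, x in auth.items()}
        hub = {n: sum(auth[v] for u, v in edges if u == n) for n in nodes}
        norm = sum(x * x for x in hub.values()) ** 0.5 or 1.0
        hub = {n: x / norm for n, x in hub.items()}
    return hub, auth
```

In a retweet network, high authorities are heavily retweeted sources, while high hubs are accounts that retweet many well-retweeted sources, exactly the indegree/outdegree distinction drawn in the text.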

Results and Analysis
URL Sharing

Table 2 presents the shared URL statistics for the peak period, broken down by social media platform. These data are broader than the network data we examine below in that they include retweets, mentions, quotes, and replies. The third column indicates the number of unique URLs per platform, the fourth the number of tweets that shared the domain, the fifth the number of unique accounts that shared the domain, and the sixth the percentage of all users (13,908) who shared any type of URL. The table indicates that users linked to multiple social media platforms, although YouTube clearly dominates. Notably, neither Gab nor Parler, which launched in 2016 and 2018, respectively, ranked in the top ten during either time period despite evidence that numerous followers migrated to them (Bond 2021, Daly and Fischer 2021, Jasser 2020, Yurieff 2020).

Peak Period URL Statistics

Rank Platform Unique Links Tweets Users % Users
1 YouTube 1,311 4,980 3,408 24.5
2 Facebook 66 85 62 0.4
3 PeriscopeTV 62 39 31 0.2
4 Instagram 56 29 24 0.2
5 Bitchute 31 42 27 0.2
6 Pastebin 13 21 8 0.0
7 TikTok 8 8 5 0.0
8 Spotify 7 8 6 0.0
9 Telegram 6 4 4 0.0
10 Soundcloud 6 4 4 0.0

Note: Percentage of users calculated based on all users (13,908) who shared any type of URL
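A tally like those in Tables 2 and 3 can be produced by extracting each shared link's domain, mapping it to a platform, and counting unique links, tweets, and users. A minimal sketch follows; the tweet records and the domain-to-platform map are hypothetical:

```python
# Minimal sketch of the URL tallying behind Tables 2-3. The tweets and
# the PLATFORMS map below are hypothetical examples.
from collections import defaultdict
from urllib.parse import urlparse

PLATFORMS = {"youtube.com": "YouTube", "youtu.be": "YouTube",
             "facebook.com": "Facebook", "instagram.com": "Instagram"}

tweets = [  # (user, [URLs shared in that tweet])
    ("u1", ["https://www.youtube.com/watch?v=abc"]),
    ("u2", ["https://youtu.be/abc"]),
    ("u2", ["https://www.instagram.com/p/xyz/"]),
]

links, tweet_ct, users = defaultdict(set), defaultdict(int), defaultdict(set)
for user, urls in tweets:
    for url in urls:
        domain = urlparse(url).netloc.lower().removeprefix("www.")
        platform = PLATFORMS.get(domain)
        if platform:
            links[platform].add(url)        # unique links
            tweet_ct[platform] += 1         # tweets sharing the domain
            users[platform].add(user)       # unique accounts

n_users = len({u for u, _ in tweets})       # all users who shared any URL
for p in sorted(links, key=lambda p: len(links[p]), reverse=True):
    print(p, len(links[p]), tweet_ct[p], len(users[p]),
          round(100 * len(users[p]) / n_users, 1))
```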

Table 3 presents the post period shared URL statistics. YouTube continued to dominate but not to the extent that it did in the peak period. Whereas 25% of users shared links to YouTube in the peak period, only 21% did in the post. More revealing, however, is the increase in shared URLs to other platforms, such as Instagram, PeriscopeTV, Facebook, and Bitchute. There is also some reshuffling in the rankings, with Instagram and PeriscopeTV jumping ahead of Facebook and Reddit.

Post Period URL Statistics

Rank Platform Unique Links Tweets Users % Users
1 YouTube 1,550 3,162 2,122 21.0
2 Instagram 362 208 139 1.4
3 PeriscopeTV 138 88 49 0.5
4 Facebook 97 117 97 1.0
5 Bitchute 39 343 324 3.2
6 TikTok 24 31 24 0.2
7 Etsy 15 30 24 0.2
8 Tumblr 13 12 8 0.1
9 Reddit 13 11 10 0.1
10 Soundcloud 12 16 14 0.1

Note: Percentage of users calculated based on all users (10,128) who shared any type of URL


We now turn to our topographical analysis of the retweet networks. Figure 2 presents the main components of both networks (i.e., the largest set of directly or indirectly connected accounts), while Table 4 reports key topographical and community detection metrics. The post network is much smaller than the peak, even though its data collection period was far longer (21 days vs. 4). Notably, the two networks consist of very different user accounts: only 155 accounts from the peak network remain in the post. This could be due to accounts being suspended and/or ‘self-selecting’ out (e.g., going inactive, switching to a different conversation, migrating to another social media platform).

Figure 2

Main (Weak) Components of Peak (Left) and Post (Right) Retweet Networks

Topographical and Girvan-Newman Clustering Metrics of Peak and Post Networks

Metric Peak Post
Size 3,130 1,985
Size (Largest Component) 2,566 988
Density < 0.001 < 0.001
Average Degree 1.988 1.747
Transitivity < 0.001 < 0.001
Reciprocity 0.000 0.000
Degree Centralization 0.341 0.159
Betweenness Centralization 0.586 0.149
Diameter 15 14
Average Path Length 4.416 4.670

Number of Groups 217 301
Modularity 0.821 0.905
Normalized Modularity 0.824 0.908

Reciprocity in both the peak and post retweet networks is quite low, lending support to H6; in fact, it is nonexistent, indicating that users did not exchange URL content in either time frame, perhaps suggesting an absence of ‘strong ties.’ Because such an absence lowers the probability of triadic closure, the two networks are, as expected in H7, relatively sparse and exhibit low levels of transitivity. For both networks, density (< 0.001, < 0.001) and average degree (1.988, 1.747) are quite low, as is transitivity (< 0.001, < 0.001). The decline in average degree indicates that the post network is somewhat less interconnected (i.e., sparser) than the peak.

The diameter for both networks is quite large, indicating that both networks are quite dispersed. However, the moderate average path length suggests that a handful of nodes lie fairly close to all others. This combination probably reflects, at least in part, social media's unique nature (Kane et al. 2014). Although users did not exchange URL content (reciprocity = 0.00), they did retweet URL links from a handful of key accounts, supporting H1. Some of this likely reflects how Twitter's API provides retweet data, but in both time frames, prominent central accounts appear to serve as ‘authorities’ because they offer links, mainly to YouTube videos, that many others retweet.

For instance, 65% of the peak network's most central accounts (n=17, see note 14) shared links to YouTube videos on at least one occasion.

For instance, the peak network's most central accounts, which hold 69% of all network ties, offer URLs to other social media platforms but do not retweet others' URLs. The substantially higher degree (0.341) and betweenness centralization (0.586) scores for the peak network as compared to the post (0.159, 0.149) suggest that this was especially true before the crackdown (H8). These results partially support our expectation that the two networks would display moderate-to-high levels of centralization (H4): this was true of the peak network but not the post.

There is some evidence that, although centralization scores can range from 0.0 to 1.0, they seldom climb much higher than 0.60 (Cunningham, Everton and Murphy 2016, Oliver et al. 2014).

We, of course, did expect the post network to exhibit lower levels of centralization than the peak, but we did not anticipate the crackdown to have such an impact.

Community Detection

We applied the Girvan-Newman (2002) community detection algorithm to both networks to identify subgroups.

We also applied Spinglass (Reichardt and Bornholdt 2006), but it yielded a poorer measure of fit (i.e., modularity score).

As Table 4 indicates, the algorithm detected 217 groups in the peak network and 301 in the post. Both networks have high modularity scores, suggesting they are highly clustered around a few central accounts, supporting H3. However, the increase in groups and modularity scores from the peak to the post indicates a higher level of clustering after Twitter's crackdown.
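The detection step can be sketched with networkx's Girvan-Newman implementation, keeping the partition with the highest modularity (the fit measure reported in Table 4). The toy graph below, two triangles joined by a single bridge, is illustrative only:

```python
# Sketch of the community-detection step: run Girvan-Newman edge removal
# and keep the partition with the highest modularity. Toy graph only.
import itertools
import networkx as nx
from networkx.algorithms.community import girvan_newman, modularity

G = nx.Graph([("a", "b"), ("b", "c"), ("a", "c"),   # cluster 1
              ("x", "y"), ("y", "z"), ("x", "z"),   # cluster 2
              ("c", "x")])                          # single bridge

best_q, best_parts = -1.0, None
for parts in itertools.islice(girvan_newman(G), 4):  # first few splits
    q = modularity(G, parts)
    if q > best_q:
        best_q, best_parts = q, parts

# The bridge c-x carries the highest edge betweenness, so the first
# split should recover the two triangles.
print(len(best_parts), round(best_q, 3))
```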

We next collapsed the main components of both networks by the Girvan-Newman subgroups. Figure 3 presents the results, where each node's pie chart indicates the proportion of ‘external’ domain-specific URLs in each subgroup. The results again highlight YouTube's popularity, as it is distributed across clusters in both periods, especially in the peak network. We can also see that other social media platforms, such as Instagram and Bitchute, gained purchase in some of the post network's subgroups.

Figure 3

Main (Weak) Component Collapsed by Girvan-Newman Subgroups

Node Level Patterns

Our next question considers the patterns of prominent accounts during each time frame. As we have seen, the peak network was centralized around a few accounts whose tweets were frequently retweeted by others. This helped create a highly clustered network, as captured in Figure 2 and reflected in the modularity scores reported in Table 4. By contrast, the post period network was less centralized. Still, its numerous subgroups and high modularity scores suggest that a handful of user accounts were retweeted far more often than the typical user account. Figure 4, which presents the indegree distribution for both networks, captures this.

Figure 4

Indegree Distribution, Peak and Post Retweets Network

Both distributions are highly skewed, indicating that most users are seldom retweeted, while the tweets of a few are retweeted frequently. Are the distributions scale-free? Broido and Clauset's (2019) tests for whether networks exhibit a power-law distribution indicate that the peak network is ‘weakly’ scale-free, with an alpha greater than 1 but less than 2 (α = 1.91), while the post network is ‘strongly’ scale-free, with an alpha greater than 2 (α = 2.74). The results support H5 but not H9. Importantly, the power-law region for both includes 50 or more nodes.

Additionally, Kolmogorov-Smirnov tests indicate that the power-law distributions cannot be rejected (p > 0.1).
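The exponent estimation behind such tests can be sketched with the standard Clauset-Shalizi-Newman maximum-likelihood approximation for discrete data, alpha = 1 + n / Σ ln(x_i / (xmin − 0.5)). The degree sequence below is synthetic, not the study's data:

```python
# Sketch of estimating the power-law exponent alpha from a degree
# sequence, using the Clauset-Shalizi-Newman discrete MLE approximation.
# The degree sequence below is synthetic, not the authors' data.
import math

def alpha_mle(degrees, xmin=1):
    """MLE estimate of the power-law exponent for x >= xmin."""
    xs = [x for x in degrees if x >= xmin]
    s = sum(math.log(x / (xmin - 0.5)) for x in xs)
    return 1 + len(xs) / s

# Heavy-tailed synthetic sequence: many low-indegree nodes, a few hubs.
degrees = [1] * 60 + [2] * 20 + [3] * 8 + [5] * 4 + [12, 25, 40]
alpha = alpha_mle(degrees)
print(round(alpha, 2))
```

In practice one would also estimate xmin and run goodness-of-fit (e.g., Kolmogorov-Smirnov) tests, as the article's cited procedure does.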

Table 5 presents peak network correlations between four centrality metrics (indegree, betweenness (undirected), hubs, and authorities) and several user account attributes: number of retweets, tweets, URLs shared, favorites, friends, and followers. The coefficients support H2 and indicate a positive and statistically significant association between ‘reliable sources’ and standard centrality metrics.

Since network data violate standard statistical assumptions (Borgatti, Everett and Johnson 2018), we used permutation tests as implemented in UCINET to determine statistical significance (Borgatti, Everett and Freeman 2002).
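The logic of such a permutation test can be sketched in a few lines: shuffle one attribute vector many times and compare the observed correlation to the permuted distribution. The data below are synthetic, and this stdlib sketch stands in for UCINET's implementation:

```python
# Sketch of a permutation test for a correlation's significance: the
# observed Pearson r is compared against correlations computed on many
# random shuffles of one vector. Synthetic data, illustrative only.
import random
import statistics

def pearson(x, y):
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def permutation_p(x, y, n_perm=2000, seed=1):
    rng = random.Random(seed)
    observed = pearson(x, y)
    y = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(y)
        if abs(pearson(x, y)) >= abs(observed):
            hits += 1
    return observed, hits / n_perm  # two-sided p-value

# Synthetic example: indegree strongly tracks follower count.
indegree  = [0, 0, 1, 1, 2, 3, 5, 8, 13, 40]
followers = [5, 2, 9, 14, 30, 35, 60, 90, 150, 700]
r, p = permutation_p(indegree, followers)
print(round(r, 2), p)
```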

In particular, positive correlations exist between accounts that are repeatedly retweeted, have many followers, and are frequently ‘liked’ (favorites) and three of the four centrality measures: indegree, betweenness, and authorities. The one exception is the marginally negative correlations between hubs and retweets, favorites, and followers. Recall that hubs capture the eigenvector equivalent of outdegree, that is, the extent to which users retweet frequently-retweeted users. As such, users scoring high in terms of hubs are those doing the retweeting, not necessarily those being retweeted. Thus, the negative correlations suggest that other users do not see them as ‘reliable sources.’

Peak Network Correlations Between Key Centrality Metrics and User Account Attributes

Indegree Betweenness Hubs Authorities
Retweets 0.61*** 0.59*** −0.06*** 0.53***
Tweets 0.05* 0.07** −0.09*** 0.02*
URLs 0.06** 0.08** −0.04** 0.03*
Favorites 0.90*** 0.88*** −0.04* 0.90***
Friends 0.13** 0.13** −0.08*** 0.04*
Followers 0.78*** 0.76*** −0.06*** 0.76***




Table 6 presents post network correlations for the same centrality and account attributes. Here, the correlations differ somewhat, but the same patterns hold for retweets and favorites. Notably, the correlations between the three centrality metrics and ‘followers’ are much lower in the post than in the peak network, suggesting that after Twitter's crackdown, which eliminated several prominent QAnon-affiliated accounts, users began searching for others they should follow (and rely on) in the QAnon Twitter network.

Post Network Correlations Between Key Centrality Metrics and User Account Attributes

Indegree Betweenness Hubs Authorities
Retweets 0.76*** 0.63*** −0.02 0.95***
Tweets 0.21*** 0.26*** −0.00 0.25***
URLs 0.31*** 0.31*** 0.05* 0.38***
Favorites 0.73*** 0.60*** −0.02 0.99***
Friends 0.06** 0.04* −0.04 0.07**
Followers 0.15*** 0.11*** −0.05* 0.17**




Discussion and Conclusion

Table 7 presents a summary of the results of our analysis.

Because our analysis is of a single case study, we cannot directly generalize our results to other conspiracy conversations occurring on Twitter. Generalizability occurs at the abstract level and the theoretical principles being tested. Theory is what bridges our analysis to other settings.

In brief, we found that both the peak and post networks exhibited low levels of reciprocity and transitivity, suggesting that the diffusion of QAnon on Twitter was more akin to simple rather than complex contagion. And as expected, there was a strong positive association between accounts that QAnon followers could see as ‘reliable sources’ and key centrality metrics, indicating that a handful of users played an essential role in connecting Twitter users to the QAnon conversation. The central role these actors played also contributed to the high levels of clustering we saw in both the peak and post networks and the moderate-to-high level of centralization in the peak (but not the post) network, which, in turn, probably facilitated the diffusion of the conversation. Both networks also exhibited the characteristics of a scale-free network, although the peak network did so only weakly. Clearly, Twitter's crackdown impacted the QAnon conversation. The post network was less centralized, smaller, and sparser than the peak network, and there was a relative increase in multiplatform URL sharing from the peak to the post network.

Summary of Hypotheses and Results

Hypotheses Result Comments
H1. A handful of actors will account for most of the link-sharing of external social media sites. + New central actors emerged after the crackdown to diffuse external content.
H2. Reliable actors will be central before and after Twitter's July 2020 crackdown, even if they are different users in each period. +
H3. Both networks will exhibit relatively high levels of clustering. +
H4: Both networks will be moderately to highly centralized. +/− See H8 below.
H5: The retweet networks will be scale-free. + See H9 below.
H6: Both networks will exhibit low levels of reciprocity. +
H7: Both networks will be sparse and exhibit low levels of transitivity. + Unsurprisingly, the post network was smaller and sparser than the peak network.
H8: The network will become less centralized after Twitter's crackdown in July 2020. +
H9: The network will become less scale-free after Twitter's crackdown in July 2020. − The post network was more strongly scale-free than the peak network.

As noted earlier, one reaction to the proliferation of QAnon beliefs, including through URL sharing, has been for social media companies, such as Twitter, to deplatform QAnon-related accounts: suspending, banning, or restricting access to particular users. Deplatforming can target everyone expressing (in this case) QAnon sympathies, or it can focus on a few key accounts, such as those scoring high in terms of various centrality and brokerage metrics. Although there is a rationale for deplatforming, doing so could further validate the perception that one side of the political spectrum is being isolated and misrepresented by the other (Singer 2021). As we saw, it can also point followers to other social media platforms, and it can increase offline, face-to-face interactions that foster informational and/or social isolation, which, in turn, increases the likelihood of radicalization and possibly violence (Everton 2016, Foot 2018, Sunstein 2002, Sunstein 2009). Targeting key accounts is probably preferable since it may be less likely to provoke a backlash that leads users to switch platforms. Moreover, given the scale-free nature of the two networks, removing a handful of key nodes can substantially fragment them (Barabási and Albert 1999).

An alternative strategy is to ‘infiltrate’ conspiracy theory discussion groups and introduce counter-narratives that would increase informational diversity and decrease the likelihood of radicalization. Sunstein and Vermeule (2009), for instance, suggest that state and federal governments could ‘cognitively infiltrate’ conspiratorial groups, but this raises ethical issues that institutional review boards might find unacceptable, not to mention that doing so probably violates civil liberties and is unconstitutional in the U.S. (Marantz 2017):

Their proposal for a government-sponsored conspiracy to combat conspiracy recalls the efforts by J. Edgar Hoover, the founding director of the FBI, who ran a counterintelligence program from 1956 to 1971 that, among other illegal and improper techniques, infiltrated civil rights groups. Infiltrating conspiracist groups is a similar offense against civil liberties—especially when the group is not a terrorist cell or some other violent association.

(Rosenblum and Muirhead 2019:154)

It is also questionable whether such an approach would be effective. It assumes that users would retweet posts containing the counter-narratives, but as our analysis has shown, the tweets of most users are not retweeted.

A third alternative is for social media platforms to adjust their algorithms to expose users to multiple points of view and potentially form ties with ‘outsiders.’ Twitter has done this, and although it may seem like a somewhat benign strategy, it is easy to underestimate the power of social media feeds to influence behavior (Bond et al. 2012, DiResta 2018, Kramer, Guillory and Hancock 2014, Lanier 2018). Consider, for example, the story of Ashley Vanderbilt, who, after the November 2020 election, spent much of her time consuming QAnon-related social media. By “inauguration day, she was convinced that if… Joe Biden took office the United States would literally turn into a communist country” (O’Sullivan 2021). This occurred because she kept clicking on videos suggested by TikTok:

It's there, she says, that she was first introduced to QAnon… She mostly followed entertainment accounts on the platform, but as the election neared she began interacting with pro-Trump and anti-Biden TikTok videos. Soon, she says, TikTok's “For You” page, an algorithmically determined feed in the app that suggests videos a user might like, was showing her video after video of conspiracy theories. (O’Sullivan 2021)

Adjusting such algorithms can help alter the underlying structure of users’ informational networks, potentially transforming the central narratives of those networks (Fuchs 2001) and thus reducing the likelihood of radicalization (Yeaton 2021).

Twitter recently introduced ‘Birdwatch,’ ‘a feature… meant to… combat misinformation and disinformation by tapping users in a fashion similar to Wikipedia to flag potentially misleading tweets’ (Collins and Zadrozny 2021b).

Because users likely migrated to other platforms after Twitter's crackdown, future studies may want to consider the multiplatform nature of QAnon and other conspiracy theories. One possibility is analyzing these multiplatform networks as multilayer networks (Boccaletti et al. 2014, Dickison, Magnani and Rossi 2016, Kivelä et al. 2014), although this poses many analytical and technical challenges (e.g., collecting data from platforms with varying access levels, including historical data). Researchers may also want to consider how other hashtags account for variation in link-sharing behavior among Twitter users. While our analysis captures some of the critical characteristics of QAnon URL-sharing, many QAnon adherents may have believed other hashtags (e.g., #savethechildren) contained more relevant or ‘insider’ information. Finally, scholars could leverage the power of exponential random graph models (ERGMs) to identify underlying patterns of interaction before and after deplatforming. In fact, they could combine such an analysis with text mining techniques to see if (and how) changes in the underlying structure of the QAnon conversation impacted the stories they told about Q and themselves.

Our analysis has only scratched the surface of the rich amount of data available. Clearly, much is left to be explored, not just about QAnon but about online conspiracies in general. We would like to think that the ground we have broken helps contribute to that effort.
