Canadian Journal of Communication Vol 44 (2019) 211–237 ©2019 Canadian Journal of Communication Corporation http://doi.org/10.22230/cjc.2019v44n2a3329
Scott S.D. Mitchell, Carleton University
Scott S.D. Mitchell is a PhD student in the School of Journalism and Communication at Carleton University. Email: firstname.lastname@example.org.
Background Disease outbreaks are often accompanied by sensationalist news media coverage, social media panic, and a barrage of conspiracy theories and misinformation. The Zika virus outbreak of 2015–2016 followed this pattern.
Analysis Drawing on frame analysis, this article examines the construction and circulation of a conspiracy theory concerning the 2015–2016 Zika outbreak, analyzing the flow of misinformation across online platforms including “conspiracy” websites, online discussion threads, and Twitter.
Conclusion and implications Conspiracy theories produced and shared on social and digital media platforms have the power to discursively construct contagious diseases such as Zika, which may fuel misguided public perceptions and impact health policy.
Keywords Conspiracy theories; Frame analysis; Social network analysis; Zika; Twitter
Contexte Les pics épidémiques suscitent souvent une couverture médiatique sensationnaliste, la panique dans les médias sociaux et une panoplie de théories du complot et de désinformation. La flambée du virus Zika en 2015–2016 en est un exemple.
Analyse Cet article se fonde sur une analyse des cadres pour examiner la construction et la circulation de théories du complot relatives à la flambée du Zika en 2015–2016, analysant la désinformation sur diverses plateformes en ligne, y compris des sites complotistes, des fils de discussion et Twitter.
Conclusions et implications Les plateformes en ligne développent et partagent des théories du complot qui ont le pouvoir de décrire des maladies contagieuses telles que le Zika de manière à entraîner des perceptions publiques erronées et à influencer les politiques sur la santé.
Mots clés Théories du complot; Analyse des cadres; Analyse des réseaux sociaux; Zika; Twitter
“This seems like a case to me where mankind’s arrogance may have backfired on us.”

“They’re always coming up with new ways to wipe Africans off the map.”

“The solution ‘stop having kids’ fits the global population control agenda, and the warnings that the virus is moving north tells me that even the US might have a time come where public health mandates a freeze on procreation nationwide.”
These are comments from a January 2016 discussion thread on the social media platform Reddit, speculating that there is a link between genetically modified (GM) mosquitoes that were released in Brazil and the 2015–2016 Zika virus outbreak. As of January 2018, more than 80 countries and territories had reported mosquito-borne transmission of the virus, with over half a million suspected cases, and there had been over 500 travel-related cases reported in Canada (Pan American Health Organization, 2018). Although the U.S. Centers for Disease Control and Prevention (CDC) deactivated its emergency response in September 2017 since spread of the virus had declined sharply by that point, concerns remain about Zika continuing to spread to new areas. Moreover, combatting the virus throughout Central America and the Caribbean depends on international collaboration. The virus can be transmitted from a pregnant woman to her fetus, through sexual contact, or by needle, but is most widely spread through the bite of an infected Aedes species mosquito. Typical symptoms include mild fever, rash, joint pain, and red eyes, which may last for several days to a week after infection; most people who are infected do not become sick enough to require hospital care, and many do not even realize they have been infected (Bateman, 2016; Grennell, 2018; “Zika Virus Infection Fast Facts,” 2016).
Throughout 2015 and 2016, health officials in Brazil—which experienced the most significant outbreak of the virus—began reporting an increase in the number of babies born with microcephaly, a birth defect or condition in which a baby’s head is smaller than expected and which may involve improper brain development. Babies born with the condition may suffer from seizures, developmental delays, intellectual disabilities, and other problems. These cases of microcephaly were linked to the babies’ pregnant mothers being infected with Zika. In rare cases the virus also caused Guillain-Barré syndrome; this leads to a (typically temporary) paralysis (Specter, 2016; World Health Organization, 2016).
Although most health officials maintain that the virus is not linked to GM mosquitoes in any way, at the height of the outbreak in 2016 a poll found that more than a third of U.S. adults believed that GM mosquitoes caused the spread of Zika (Annenberg Public Policy Center, 2016). A British biotechnology company genetically engineered mosquitoes to produce short-lived offspring, a move that has been shown to reduce mosquito populations by 95 percent in some areas (Schipani, 2016). Because suppressing mosquito populations in this way significantly lowers disease transmission, why did so many people believe that this effective response to the outbreak had somehow caused it? Further, the majority of people polled were concerned that the Zika virus would spread to where they live, even though health authorities assured the public that the primary vectors of the disease do not reach the southern and eastern United States—an assessment that was ultimately proved accurate (Annenberg Public Policy Center, 2016). As one commentator in The New Yorker observed:
The logic of the genetically-modified-mosquitoes conspiracy theory is hard to grasp: it has been four years since they were first released in Brazil. Human pregnancies last nine months. Surely some babies must have been born in the intervening three years. Why were they spared the microcephaly if the genetically modified mosquitoes are to blame? Moreover, the altered mosquitoes had previously been released in the Cayman Islands, Malaysia, and Panama without causing problems. (Specter, 2016, para. 9)
Women and people living in poverty have been the most affected by the spread of the disease, and a coordinated global health response is necessary to mitigate its damage. However, as exemplified by the 2014 Ebola crisis, if the public in Canada and the U.S. is mobilized by misplaced fears of a domestic outbreak, this can result in harmful measures being taken; support for a travel ban to and from regions affected by the Ebola crisis remained high in the U.S. and Canada, presumably because people believed such measures would increase their safety. However, travel bans are more likely to slow medical help and increase economic burdens on affected countries. Such restrictions ultimately obstructed efforts to contain the virus where the outbreak was most acute, and thus placed the international community at greater risk (Centers for Disease Control and Prevention, 2014; York, 2014).
Misinformation spread through social media was listed by the World Economic Forum (WEF) as a major threat to society (Howell, 2013). In the wake of the 2016 presidential election in the United States, there has been much commentary and debate surrounding the apparent rise of misinformation and “fake news” online (Carson, 2019; Daudin, 2018; Waterson, 2018). A report from the Data & Society Research Institute found that certain internet subcultures, such as white nationalists and conspiracy theorists, have developed social media strategies to increase the visibility of their beliefs and messages (Marwick & Lewis, 2017). In response to this seemingly widespread concern about the spread of misinformation through digital media, social media and technology companies such as Facebook and Google have announced or implemented various changes to their algorithms and policies, though there is notably limited information about the effectiveness of these efforts (Anderson & Rainie, 2017; Levin, 2018, 2019).
Research on social media and health communication often calls for the use of social media platforms to disseminate accurate information and challenge problematic beliefs (Chou, Hunt, Beckjord, Moser, & Hesse, 2009; Gesser-Edelsburg, Diamant, Hijazi, & Mesch, 2018; Korda & Itani, 2013; Scanfeld, Scanfeld, & Larson, 2010). Social media conversations may reflect public concerns or reveal underlying misunderstandings (Signorini, Segre, & Polgreen, 2011), yet attempting to harness social media to address and educate the public may prove difficult (Bhattacharya, Srinivasan, & Polgreen, 2014; Heldman, Schindelar, & Weaver, 2013; Neiger, Thackeray, Burton, Thackeray, & Reese, 2013). In contrast to the conceptualization of online communication as a public sphere conducive to debate, discourse, and varied interaction (Uldam & Askanius, 2013), others have described so-called filter bubbles: intellectual isolation caused by social media algorithms controlling the content that people encounter online, which serves to segregate users according to pre-existing beliefs and tendencies (Bozdag & van den Hoven, 2015; Pariser, 2011).
Further, combatting misunderstandings through exposure to accurate information is reminiscent of the “deficit approach” to science communication, which has been criticized for its apparent lack of effectiveness and oversimplification of the dynamics between publics, “experts,” and the media (Simis, Madden, Cacciatore, & Yeo, 2016). This criticism is supported by work on how people assess information about science and health issues. Corner, Whitmarsh, and Xenias (2012), for example, found that newspaper editorials supporting climate change were viewed as less persuasive and less reliable by climate change sceptics, compared to non-sceptics. Similarly, in a study on myths about the influenza vaccine, exposure to information adapted from the Centers for Disease Control and Prevention (CDC) generally reduced mistaken beliefs, yet among those with high levels of concern about vaccine side effects, this corrective information significantly reduced intent to vaccinate (Nyhan & Reifler, 2015).
This article examines the construction and circulation of the Zika GM mosquito conspiracy theory, analyzing the origins and flow of this misinformation across different online platforms and examining how this conspiratorial narrative was framed by the users and platforms that spread it. The methodology was guided by three main goals:
Conspiracy theories produced and shared on social and digital media platforms have the power to discursively construct contagious diseases such as Zika, which may fuel misguided public perceptions and impact health policy. Past work on the spread of misinformation and conspiracies online has focused on single sites or social media platforms, despite often noting the interactive and “shareable” nature of the digital information landscape, which results in wide availability and circulation of diverse content (Briones, Nan, Madden, & Waks, 2012; Meylakhs, Rykov, Koltsova, & Koltsov, 2014; Del Vicario, Bessi, Zollo, Petroni, Scala, Caldarelli, Stanley, & Quattrociocchi, 2016). Other studies have analyzed the language, framing, or discursive construction of health-related misinformation (Faasse, Chatman, & Martin, 2016; Kata, 2012; Ma & Stahl, 2017). This article characterizes the tropes and devices that were used to frame the Zika conspiracy narrative in the Reddit thread and in widely shared articles from conspiracy sites; it builds on previous work by investigating the ways in which conspiracy theories and misinformation travel from site to site, potentially impacting wider public perceptions about a significant health crisis.
Conspiracy theories have been described as having “serious consequences for public health and environmental policies” and being notably “easy to propagate and difficult to refute” (Goertzel, 2010, p. 493). Some of these theories resonate with an already mistrustful public, reflecting fears of “big government” and supposedly unchecked scientific research, such as fears that the HIV virus is not the true cause of AIDS, that vaccines or genetically modified organisms are not safe, or that anthropogenic (human-caused) global warming is a political sham based on spurious or manipulated data (Goertzel, 2010).
Typically used with a derogatory connotation, a “conspiracy” is a contested rhetorical notion applied to certain events or phenomena according to someone’s perceptions, point of view, and belief system (Gallie, 1964). Conspiracy theories have been characterized as the attribution of events “to secret manipulative actions by powerful individuals” (Grimes, 2016, para. 1), or an attempt to ascribe events or practices to the actions of powerful groups or people, who conceal their machinations (Sunstein & Vermeule, 2009).
Many polls have found that large portions of the public believe in various conspiracy theories (Goertzel, 1994; Sunstein & Vermeule, 2009). When these beliefs concern scientific and medical issues such as climate change, vaccinations, or genetically modified organisms, this can serve to obstruct policy development, exacerbate health crises, and slow down scientific progress. Perhaps one of the most pervasive examples is the anti-vaccination movement, which received much support after The Lancet published a now-debunked study that reported a link between the measles-mumps-rubella (MMR) vaccine and autism. The news media covered the study, often without contextualizing the alleged findings (the study had a notably small sample size). In the aftermath of the study, there was “a decline in … parents having their children vaccinated and a subsequent increase in disease” (Goertzel, 2010, p. 495). Similarly, public support for climate change scepticism contributes to legislative inaction (Grimes, 2016).
In the case of the anti-vaccination movement, many people were not reassured by health authorities citing large epidemiological studies (Goertzel, 2010). Indeed, the construction—and maintenance—of conspiracy theories is often characterized by logical flaws, inconsistent narratives, contradictory rationales, and a lack of evidence. To explain how such beliefs can become pervasive in the public consciousness, Goertzel (2010) suggested that we think of conspiracy theories as a type of “meme.” Coined by evolutionary biologist Richard Dawkins, a “meme” originally referred to a unit of cultural transmission (such as a fashion trend or phrase), but is now typically used to describe a digital artifact (e.g., a music video, commercial, image, GIF, or joke) that gains a high level of popularity and acquires influence or cultural currency through online transmission (Davison, 2012). The “meaning” or success of a meme depends on users’ shared understanding of its cultural context.
Conspiracy theories operate similarly: they are an easily shareable sentiment or concept that is undergirded by a shared understanding (which often takes the form of distrust in the “establishment,” such as governments or scientific authorities). Understanding the construction and circulation of such theories as dependent on underlying cultural beliefs, shared sentiments, and emotional appeals—such as institutional distrust and the suggestion of imminent threat—can help illustrate why a conspiracy theory, despite its implausibility, can still “be used as a rhetorical device to appeal to the emotions of a significant public” (Goertzel, 2010, p. 494). Indeed, conspiracy theories, rumours, and the “fringe” positions in scientific debates often involve rhetorical and argumentative strategies that seemingly mirror the rigour and validity of scientific inquiry, yet have no basis in scientific fact. It is also common for such theories to distort and decontextualize information to frame the “accepted” science as suspect, or to frame their denialism in a manner that appears credible, authentic, and evidence based (Ceccarelli, 2011).
Conspiracy theories are often difficult to dismiss and subsequently factor into mainstream discourses, often shaping large segments of the public consciousness and ultimately affecting the course of whatever issue or event is at stake (Goertzel, 2010; Grimes, 2016; Sunstein & Vermeule, 2009). Ceccarelli (2011) describes the difference between a scientific controversy that devolves into a “wild” theory with no clear motives or grounding and the more resilient—and potentially damaging—conspiracy theories that can emerge “when a rhetor invokes a closed-minded orthodoxy rather than a nefarious cabal, and when he points to institutional structures that reinforce that orthodoxy, such as peer review in publication and funding decisions” (p. 203). Further, this rhetorical move lends credence to any subsequent claims that the “truth” is being “marginalised by a dominant and well-funded consensus” (p. 203). Dispelling the conspiracy theory becomes increasingly difficult, as the “alternative ‘dissident’ paradigm” has a low burden of proof and must “only claim that inquiry is being unfairly stifled, and then wait as outraged defenders of the orthodoxy unwittingly confirm that claim through their response” (p. 203).
Misinformation and ignorance concerning contagious diseases, such as Zika, is of course not solely constructed and propagated by communities of conspiracy theorists; the news media, popular culture depictions of disease, and even health authorities can all be implicated in public (mis)understandings. Indeed, misinformation and conspiracy theories concerning Zika are merely a recent example of how prominent contagious disease outbreaks are frequently constructed in the public mind: in the past, alarmist predictions, sensationalist rhetoric, rumours, and incorrect information surrounded the severe acute respiratory syndrome (SARS) outbreak (Washer, 2004), swine flu epidemic (Fitzpatrick, 2009), and Ebola crisis (SteelFisher, Blendon, & Lasala-Blanco, 2015; Towers, Afzal, Bernal, Bliss, Brown, Espinoza, … Castillo-Chavez, 2015). Further, as discussed previously, the broader significance of public understandings—or misunderstandings—of contagious diseases is that public perceptions can impact policy (Leach & Dry, 2010; Wald, 2008).
Research on risk perception and the communication of disease information has shown that the public conceives of risk in accordance with social and cultural factors that can lead to conflicts and confusion (Kasperson, Kasperson, Pidgeon, & Slovic, 2003; Slovic, 1987). Past research has examined how news coverage of contagious diseases is often characterized by alarmist predictions, “fear-mongering,” and sensationalist rhetoric (Washer, 2004, p. 2565), demonstrating the potential for mass media to impact public perceptions of contagious diseases (Boyd, Jardine, & Driedger, 2009). Indeed, much of the blame for public anxiety has often been placed on traditional news media coverage that supposedly amplifies or decontextualizes scientific information (SteelFisher et al., 2015; Towers et al., 2015), with much attention focusing on how diseases are constructed in the news according to certain frames (Boyd et al., 2009; Washer, 2004).
Yet this kind of explanation, which focuses on the mainstream news media as the sole or primary culprit, overlooks other potential sources of misinformation. For example, some work has shown that understandings of real-world diseases often resonate affectively with horror and science fiction films (Joffe & Haarhoff, 2002) or apocalyptic narratives (Gerlach & Hamilton, 2014; Ironstone-Catterall, 2011). Further, social media must also be considered as a source of disease (mis)information. Some work has simply quantified the volume of activity on social media concerning a particular disease or health issue, while other research has attempted to describe how instantaneous, unlimited access to a multitude of views and opinions can impact public beliefs. This research has typically argued that being exposed to other social media users’ expressions of fear can lead to misguided or amplified concerns (see Fung, Tse, Cheung, Miu, & Fu, 2014; Nagpal, Karimianpour, Mukhija, & Mohan, 2015). Other research on social media has attempted to connect patterns of activity to external events (see Dyar, Castro-Sánchez, & Holmes, 2014), often examining how prominent events during public health crises, media hype, or increased news coverage lead to flurries of activity online (see Mollema, Harmsen, Broekhuizen, Clijnk, De Melker, Paulussen, Kok, Ruiter, & Das, 2015; Rodriguez-Morales, Castañeda-Hernández, & McGregor, 2015; Towers et al., 2015).
Sumiala, Tikka, Huhtamäki, and Valaskivi (2016) argue that a multi-method approach, incorporating social network analysis and digital ethnography, is necessary to understand so-called hybrid media events, which are characterized by a “complex intermedia dynamic” with “circulations between messages and actors and the recombination of media on a variety of media platforms” (p. 98). Specifically, social network analysis is used to explore the larger communication structures and practices, while qualitative analysis provides more in-depth understanding of the event under study. The methods used in this article are inspired by Sumiala and colleagues (2016), performing quantitative analysis to trace the flow of information across platforms and networks, and a qualitative analysis of the “conspiracy narrative” that was constructed through these information flows.
The origin of misinformation concerning GM mosquitoes can be traced back to a January 2016 Reddit thread, titled “Genetically modified mosquitoes released in Brazil in 2015 linked to the current Zika epidemic?” (Schipani, 2016; for original thread, see Redditsucksatbanning, 2016). Mainstream media outlets, including the Daily Mail, The Ecologist, the Wall Street Journal, and Fox News proceeded to report on the theory and further spread the misinformation, with notably increased Twitter activity concerning Zika and GM mosquitoes (Schipani, 2016).
Reddit is a news aggregator and social media platform, the self-described “front page of the Internet,” where users submit links and posts that other users can “upvote” or “downvote,” affecting the visibility (and prominence) of the content. Content is submitted to specific subforums, smaller communities or “subreddits” that exist within the infrastructure of the main site. Reddit receives almost 250 million unique visitors each month, from more than 200 countries, collectively viewing more than eight billion pages across the entire site (Reddit, 2016). Twitter, with over 328 million monthly users, remains one of the most popular social networks (Statista, 2017). Past work has analyzed Twitter as a platform that allows audiences to co-produce (and not simply consume) media events, or as a site that subverts traditional power relations and communication dynamics by connecting people to information networks that bypass journalists, governments, and other traditional information gatekeepers (Girginova, 2015; Procter, Vis, & Voss, 2013). Because it is a platform for sharing breaking news, and because much of the communication through the site is publicly available, Twitter provides invaluable data for the study of media events.
Two data sets of tweets containing the keywords “GM,” “mosquitoes,” and “Zika” were gathered. Due to limitations imposed by the Twitter API, the most commonly used social analytics programs do not allow researchers to gather historical data, only a smaller sample of more recent tweets (Gruzd, Wellman, & Takhteyev, 2011; Himelboim, Smith, & Shneiderman, 2013). To collect tweets from the desired time period, I used a Python script to scrape the Web for all publicly available tweets containing the three keywords and posted during the specified date range. The output from this search was converted into a .csv file that for each tweet specified the username; number of likes, replies, and retweets; content of the tweet; and date it was published.
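The author’s collection script is not reproduced in the article, but the output format it describes can be sketched minimally. The following Python sketch shows only the .csv-writing step; the column names and the sample row are illustrative assumptions, not the study’s actual headers or data.

```python
import csv

# Illustrative column names: the article lists these attributes for each
# tweet, but the actual headers of the original .csv are not specified.
FIELDS = ["username", "likes", "replies", "retweets", "content", "date"]

def write_tweets_csv(tweets, path):
    """Write one row per collected tweet to a .csv file."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        for tweet in tweets:
            # Missing attributes are left blank rather than raising an error
            writer.writerow({k: tweet.get(k, "") for k in FIELDS})

# Hypothetical example row
sample = [{"username": "example_user", "likes": 3, "replies": 0,
           "retweets": 12, "content": "GM mosquitoes and Zika", "date": "2016-01-30"}]
write_tweets_csv(sample, "zika_tweets.csv")
```

A file in this shape can then be imported directly into analysis tools such as Netlytic, as described below.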
The first data set was built from tweets including the keywords “Zika,” “GM,” and “mosquitoes” that were posted in the month following the Reddit post (of course, some tweets about this topic would not necessarily have included all three of these keywords, and many tweets may have featured alternative spellings). This data set of 2,803 tweets allowed me to trace the flow of the information, noting which actors (Twitter users and media organizations) helped shape the conversation. The .csv file was imported into Netlytic, a free Web-based program that analyzes and visualizes text and social network data (Gruzd, 2016), which was used to create a network visualization.
A second data set was built from tweets including the three keywords that were posted in the week before and the week after the date of the original Reddit thread (January 18 to February 1, 2016), providing a data set of 1,093 tweets. This data set was constructed to examine how the discourse around using GM mosquitoes to combat Zika might have changed. Each tweet was hand coded as conveying a “positive,” “negative,” or “unidentified” sentiment based on the language, context, and tone of the tweet, and also coded as to whether it provided links to articles that supported the use of GM mosquitoes or suggested conspiratorial belief. For example, the tweet “GM mosquitoes fighting Dengue fever and Zika virus? I call that humanity kicking ass and taking names!” was coded as positive; the tweet “GM Mosquitoes May Be Responsible For Zika Virus Outbreak” was coded as negative. “Unidentified” tweets could not be confidently classified as either positive or negative. Often, they did not link to an article and instead merely commented on the fact that GM mosquitoes were being deployed, with no indication of the user’s sentiment; other times, they linked to an article that was not directly addressing the efficacy (or supposed risks) of GM mosquitoes, such as an article by @TechReview that speculated about whether such GM organisms would be in higher demand in the future.
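Once each tweet has been hand coded, the before/after comparison reported later amounts to tallying label proportions on either side of the thread’s date. A minimal sketch, assuming hypothetical coded data (the date/label pairs below are invented for illustration, not drawn from the study’s data set):

```python
from collections import Counter
from datetime import date

# Hypothetical hand-coded data: (date posted, assigned sentiment label),
# following the three-way coding scheme described in the text.
coded = [
    (date(2016, 1, 20), "positive"),
    (date(2016, 1, 27), "negative"),
    (date(2016, 1, 29), "negative"),
    (date(2016, 1, 30), "unidentified"),
]

def sentiment_shares(coded, cutoff):
    """Return sentiment proportions for tweets before and after a cutoff date."""
    def shares(labels):
        counts = Counter(labels)
        total = sum(counts.values())
        return {k: v / total for k, v in counts.items()} if total else {}
    before = shares(s for d, s in coded if d < cutoff)
    after = shares(s for d, s in coded if d >= cutoff)
    return before, after

before, after = sentiment_shares(coded, date(2016, 1, 25))
```

With the real data set, a calculation of this form yields the shift in negative-sentiment share reported in the findings.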
To explore how these websites, tweets, and Reddit posts constructed, legitimated, and circulated this conspiracy theory, and to identify the rhetorical and argumentative strategies and framing devices used, a frame analysis of a sample of these sources was conducted. The focus of this effort was on the Reddit thread and the alternative/conspiracy media articles, as these texts were the source of the conspiracy theory.
As defined by Ryan, Carragee, and Meinhofer (2001), a frame is an “organization of news stories and other discourses by their patterns of selection, emphasis, interpretation, and exclusion” (p. 216). A frame may be constructed by taking information out of context, sensationalizing it, or conveying a particular narrative. The texts were analyzed thematically for the presence of conspiratorial discourses and tropes, and they were coded and categorized according to whether they contributed to a conspiracy frame. By identifying how these constructs and themes were deployed, I characterized how the dominant narrative frame was constructed. Examining this content qualitatively, taking note of how various elements of the texts (their titles, content, imagery, and more) conveyed meaning, I recorded salient features and coded my notes, to identify dominant framing devices. As described by Sumiala and colleagues (2016), this entailed a process of mapping a field (through a larger-scale social network analysis) and then following “in and through” different media platforms in more depth, which they characterize as a kind of “digital ethnography” (p. 105). By reading the texts, clicking on hyperlinks, and engaging with the content on an experiential level, it is possible to uncover the “hidden representations, discourses, actors and symbols and related communicative practices” that produce meaning (p. 105).
The top 30 most-mentioned Twitter usernames (not including retweets or likes) from the first data set were determined using Netlytic’s name explorer list (see Table 1). This list includes, for example, @AntiMedia (214 mentions), @ActivistPost (113), @FT (65), @RT_com (46), @rfreeuk_listen (40), @SciDevNet (39), @Oxitec (34), @YouTube (34), and @RealAlexJones (28).
Table 2 displays the top 20 most-retweeted tweets from the first data set, alongside the content of the tweets and the number of likes and replies. This information concerning the most high-profile interactions on Twitter—the accounts that were mentioned most frequently, and the tweets that were the most highly retweeted—showed which “nodes” were the most influential in terms of spreading information and potentially shaping the discourse around GM mosquitoes and Zika; the external links and images provided in these tweets were examined.
Figure 1 is a network analysis visualization from Netlytic. The network is shown with five main nodes; the prominence of each node in Netlytic is calculated using Total Degree, which combines indegree and outdegree counts. Indegree centrality is calculated according to the number of ties directed toward a node (mentions from other users), while outdegree is determined by the number of ties directed from a node to others (tweets that mention other Twitter users frequently). The five main nodes are @SciDevNet, @RT_com, @FT, @ActivistPost, and @AntiMedia. Notably, the node @SciDevNet encompasses users such as @WashingtonPost, @CNN, @Guardian, @Oxitec, and @CNNMoney. This points to an unsurprising divide in the network, through which traditional news sources and alternative/conspiracy news sources are mostly isolated from one another. Figure 2 depicts the network analysis with only the five main nodes shown and labelled.
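The degree measures described above are straightforward to compute once the mention network is represented as a list of directed edges. A minimal sketch (the edges below are hypothetical, not drawn from the study’s network; Netlytic performs this calculation internally):

```python
from collections import defaultdict

# Illustrative directed mention edges: (source, target) means that
# source mentioned target in a tweet. These user pairs are invented.
edges = [
    ("userA", "AntiMedia"),
    ("userB", "AntiMedia"),
    ("AntiMedia", "RT_com"),
]

def degree_centrality(edges):
    """Compute indegree, outdegree, and total degree for each node."""
    indeg, outdeg = defaultdict(int), defaultdict(int)
    for src, dst in edges:
        outdeg[src] += 1   # tie directed from the source node
        indeg[dst] += 1    # tie directed toward the target node
    nodes = set(indeg) | set(outdeg)
    return {n: {"in": indeg[n], "out": outdeg[n],
                "total": indeg[n] + outdeg[n]} for n in nodes}

degrees = degree_centrality(edges)
```

Nodes with high total degree, such as the five labelled in Figures 1 and 2, are the most structurally prominent in the mention network.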
Along with the most-tweeted articles highlighted in Table 3, other external links from the first data set were recorded to trace the flow of (mis)information across and between networks. Reddit, the original source of the conspiracy theory, was directly linked to by two of the most-retweeted articles sharing a negative sentiment (the articles by AntiMedia and RT_com), as well as an article from the conspiracy site YourNewsWire. The YourNewsWire article did not appear among the most-retweeted, and the site’s Twitter account did not appear in the most-mentioned list: the article was shared by a large number of users (who did not mention the site’s Twitter name), but none of these individual tweets sharing the article were frequently retweeted. The majority of articles sharing a negative sentiment that were linked to by tweets from the data set, in turn, cited these three articles (from AntiMedia, RT_com, and YourNewsWire) within the body of their text. Figure 3 shows this flow of information: starting from the Reddit post, which was linked to by these three articles, and then spreading out to a network of other sites.
From Figure 3, the flow of (mis)information is clear: several days after the Reddit thread was posted, AntiMedia, a popular conspiracy news site, repeated the allegation that GM mosquitoes were responsible for the spread of Zika; the article cited the original Reddit thread and featured a map with a large red arrow pointing out how the GM mosquito release site was conspicuously close to the central Zika-affected areas. Yet as science blogger Christie Wilcox (2016) demonstrated, there are two cities named Juazeiro in Brazil, and the red arrow on AntiMedia’s map was pointing at the wrong one: the other Juazeiro, where the GM mosquitoes had actually been released, was over 300 kilometres away. Further, the timeline was flawed, with Zika first being reported in Brazil in 2015, and the Oxitec GM mosquito releases beginning four years earlier. The 2015–2016 Zika outbreak in Brazil has been tentatively traced back to a 2013 outbreak in French Polynesia, which has been linked to a 2007 outbreak in Micronesia. In other words, the GM mosquitoes were being condemned for a disease outbreak that was emerging several thousand miles away and that began years before they were released (Lynas, 2016; Wilcox, 2016). Anti-Media describes itself as a source of “independent journalism” that is “against the current mainstream paradigm … [which is] influenced by the industrial complex, [and] is a top-down authoritarian system of distribution” (The Anti-Media, 2016). Anti-Media is followed by more than 1.5 million people on Facebook and has a presence on other social media platforms, such as Twitter, Instagram, and Google Plus.
Alongside Anti-Media, the popular alternative news site YourNewsWire also linked to the Reddit post in a widely shared article. The tagline of YourNewsWire is “News. Truth. Unfiltered.” Alongside typical news categories such as “Health,” “Technology,” and “Entertainment,” there is a separate section labelled “Conspiracies,” which regularly features articles about alleged covert CIA operations, secretive government actions, and cover-ups in the scientific community. Finally, RT is a broadcaster that was created in 2005 by Russian president Vladimir Putin; the site has been noted for its “insufficient impartiality on stories of interest to Moscow, like Ukraine, Syria and Turkey” (Erlanger, 2017).
The second data set, tweets containing the keywords “Zika,” “GM,” and “mosquitoes” published in the week before and after the Reddit post, comprises, as previously mentioned, 1,093 tweets. Notably, in the week before, there were 101 tweets, compared with 992 in the following week. Tweets expressing a negative sentiment toward GM mosquitoes went from 1.3 percent in the week before up to 79.7 percent in the week after, as users began sharing links from sources such as @AntiMedia, @RT_com, and other users that circulated the conspiracy theories.
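The before/after comparison can be illustrated with a small sketch. The tweet records below are synthetic, constructed only to mirror the reported volumes (101 versus 992) and approximate sentiment shares; the cutoff date is an assumption standing in for the date of the Reddit post, and the sentiment labels are illustrative rather than the study’s actual coding:

```python
from datetime import date

# Synthetic records: (date, sentiment) pairs for tweets matching the
# keywords. Counts are chosen to roughly mirror the reported figures.
tweets = (
    [(date(2016, 1, 20), "neutral")] * 100 +
    [(date(2016, 1, 20), "negative")] * 1 +     # ~1% negative before
    [(date(2016, 1, 28), "negative")] * 790 +   # mostly negative after
    [(date(2016, 1, 28), "neutral")] * 202
)

cutoff = date(2016, 1, 25)  # illustrative date of the Reddit post

before = [s for d, s in tweets if d < cutoff]
after = [s for d, s in tweets if d >= cutoff]

def pct_negative(labels):
    """Share of tweets coded negative, as a percentage."""
    return 100 * labels.count("negative") / len(labels)

print(len(before), len(after))         # volume before vs. after the post
print(round(pct_negative(before), 1))  # 1.0
print(round(pct_negative(after), 1))   # 79.6
```

Splitting a keyword-matched corpus at an event date and comparing volume and sentiment shares on each side is the basic shape of the comparison reported above.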
My frame analysis uncovered the following four main framing devices: institutional distrust, suggestion of generalized or specific threat, references to historical precedent, and appeals to an alternative knowledge community. This section analyzes each framing device in more detail.
“Institutional distrust” refers to the framing of governments, agencies, the scientific method, or any other “establishment” or “powerful structure” as deceptive, nefarious, or fundamentally flawed, suggesting systematic abuse, incompetence, or organizational corruption. Kata (2012) examined the tropes used by the anti-vaccination movement online and found a similar “disillusionment and suspicion” toward the broader enterprise of science and notion of expertise, which was ascribed to a “postmodern medical paradigm” in which the legitimacy and credibility of science and authority are questioned (p. 3779). Institutional distrust describes a pervasive sense that the very mechanisms by which scientific knowledge is acquired are corrupted, possibly through corporate interference, attempts by the government to seize greater control or power, or corruption in the scientific method or publishing process that incentivizes the fabrication or exaggeration of results.
This institutional distrust was typically a central point of discussion, as seen in the posts presented in the following elements:
This is where I find the situation a bit too apparent. The solution “stop having kids” fits the global population control agenda, and the warnings that the virus is moving north tells me that even the US might have a time come where public health mandates a freeze on procreation nationwide. (Reddit comment)
How exactly would you miss a tenfold increase in children born with most of their brain missing? (YourNewsWire article)
The Olympic games will create the highest concentration of worldwide foreign travelers to spread the zika virus when they go all around the world upon their return home. You literally probably could not conceive of a better human spread plan worldwide than to introduce a human infection vector at the Olympic games if evil was your intent. (Reddit comment)
Reference was often made on the sites to “agendas,” both explicitly and implicitly. For example, in Element 1 (a comment on the Reddit thread), an allusion to a “population control agenda” is made. In Element 2 (from the main body of the YourNewsWire article), the fact that government and health authorities “miss[ed] a tenfold increase in children born with most of their brain missing” is called into question, implying that authorities were not responding to the situation due to ineptitude or deliberate oversight. Element 3 (another comment from the Reddit thread) attempts to connect the outbreak to the upcoming Olympic Games, contending that the virus was deliberately released to coincide with a high “concentration of worldwide foreign travelers to spread the zika virus.”
Oreskes and Conway (2008) discussed how climate change scepticism was manufactured by fostering institutional distrust. They noted that political “agendas” and secretive ulterior motives were often ascribed to climate scientists and their supporters, echoing some of the elements found in this article’s analysis, such as beliefs that past viral outbreaks were government engineered or that pharmaceutical companies and health agencies intentionally spread disease for financial or political gain. These ascriptions included suggestions that
the threat of global warming had been manufactured by environmentalists based on a “hidden political” agenda against “business, the free market, and the capitalistic system.” The true goal of those involved in the global warming issue was not so much to stop global warming … but rather to foster “international action, preferably with lots of treaties and protocols.” (p. 77)
Rather than critically engaging with the claims and evidence of official accounts, this kind of argument fosters distrust in whatever governing body or authority has provided the official account (or that has some kind of alleged interest—whether it be political or economic—in the issue or event at hand). Goertzel (2010) recounted how a retired physicist inspired distrust in the 1996 Intergovernmental Panel on Climate Change report, not by making a scientific argument concerning the report’s information or conclusions, but rather by attacking the editing procedure of the committee and accusing them of a “major deception on global warming,” which “proved remarkably effective in providing a rallying point for opponents of the report’s conclusions” (p. 497).
Institutional distrust may be particularly common among conspiracy theories involving large industries, which are accused of pursuing profits to the detriment of the public’s interest. Such conspiratorial narratives have surrounded the HIV/AIDS pandemic for decades, including claims that pharmaceutical companies are covering up “natural” treatments such as vitamins or that the virus was produced in government laboratories (Sunstein & Vermeule, 2009). A similar narrative emerged during the H1N1 outbreak in 2009, when it was suggested that pharmaceutical companies were conspiring with the mainstream news media to amplify fear and increase profits (Wagner-Egger, Bangerter, Gilles, Green, Rigaud, Krings, Staerklé, & Clémence, 2011).
A sense of urgency was evident throughout the Reddit thread and Anti-Media and YourNewsWire articles. In the Reddit thread, many commentators asserted that scientists, by creating GM mosquitoes, had created an inevitable crisis with unforeseen implications (see Element 4). Generally, this theme or argument appeals to affective understandings of risk, suggesting imminent or eventual catastrophe (see Element 5). The conspiracy at hand is implicated in some kind of dangerous situation, a menace that has either been deliberately created or unwittingly unleashed; in either imagined scenario, the responsible authorities are taking steps to cover up the evidence to further their goals or hide their culpability. Reassurances from public health authorities and governments are perceived as either conscious misdirection or simply insufficient: it is impossible to discount the alleged danger of the situation.
I also feared that those GM mosquitoes would lead to something unforeseen. … What humans have done is cheat nature and evolution, by suddenly releasing a mutation into the wild, which had no chance to be ‘tested’ by nature in order to see what impact it will have. (Reddit comment)
The World Health Organization announced it will convene an Emergency Committee under International Health Regulations on Monday, February 1, concerning the Zika virus ‘explosive’ spread throughout the Americas. The virus reportedly has the potential to reach pandemic proportions—possibly around the globe. (AntiMedia article)
David Magnus (2008) described how the so-called precautionary principle was originally a resource to “assist in science-based risk assessment, one that would allow regulation in the face of uncertainty,” yet it evolved into “an epistemological hurdle that led to an agnotological strategy” (p. 264). Magnus’ description is related to the form of argument described in this section. In the sources analyzed, the suggestion of a generalized or specific threat was typically predicated on the “uncertainty” of the situation. There is a burden of proof on the authorities to demonstrate that there is no risk, and even when evidence is provided, it is either characterized as dubious or deemed inadequate. Magnus (2008) outlined a similar situation with regard to the debate around genetically engineered organisms (GEOs), describing the strategy by which scientific progress is mired in “political battles” by both calls for more scientific evidence and attempts to discredit whatever evidence is provided.
Emotions and affective understandings of risk may play a central role in this particular framing device. Disasters and existential threats trigger intense feelings of fear and anger, affective factors that make rumours and misinformation more likely to spread and less likely to be critically analyzed (Huang, Starbird, Orand, Stanek, & Pedersen, 2015). To explain this dynamic, Sunstein and Vermeule (2009) argued that although there are many potential sources of anxiety in the media landscape, moral panics may lead to “emotional selection,” focusing attention on “relatively small sources of risk” that then undergo “emotional snowballing,” amplifying the sense of danger and threat (p. 216). Conspiracies and misinformation about climate change and GMOs similarly present dramatic, disastrous scenarios—collapsing economies resulting from misplaced environmental efforts, or the threat of runaway, irresponsible scientific activity leading to deadly viruses being mistakenly unleashed, or corporations such as Monsanto engaging in plots to topple the agricultural sector and create poisonous foods (Uscinski, Douglas, & Lewandowsky, 2017). Likewise, dramatic, emotional narratives about serious vaccine side effects have been found to impact risk perception and behaviours, “demonstrating the power of emotional appeals and anecdotes” (Kata, 2012, p. 3784).
Conspiracist narratives have been characterized as frequently appealing to historical precedent, “for example arguing that the existence of ‘false flag’ attacks in history makes their existence in the present more likely” (Robertson, 2016, p. 48). Bricker (2013) examined “Climategate” as a case study in the rhetorical construction of a conspiracy theory. The so-called Climategate conspiracy occurred in 2009, when the Climatic Research Unit had over 1,000 emails stolen and leaked to the public, with several of the emails widely reported in the media as alleged proof of conspiracy among scientists supporting anthropogenic climate change. Even though independent scientific investigations found that no scientific misconduct had transpired, public opinion of climate science was strongly influenced. Historical precedent played a central role in the lasting and serious effect of Climategate, as the event was framed to the public as part of an ongoing conspiracy, providing a kind of “resonance” to the narrative.
This resonance—a theme or argument that I describe as “references to historical precedent”—is an illusion of credibility fostered through the creation of a historical continuity. Previous events, real or imagined, are drawn into the current conspiracy to trace a seemingly logical or causal progression from a past occurrence to the immediate situation. Elements 6 through 9 all make references to other contagious diseases and illnesses, including AIDS, influenza, Ebola, and H1N1. Element 8 is an external link to a documentary titled The Origin of AIDS, which alleges that the cause of the virus “remains a mystery about which the scientific community has long kept silent. Lack of interest, or fear of knowing?”
It makes sense. South America, and Africa are where most of the population growth is happening. The people in these places just won’t use contraceptives it seems. And AIDS kills too slowly. I have been watching the situation, and wondering when something was going to be done. Looks like this is it or at least part of it. (Reddit comment)
Additionally if you think large scale mutations can result in pathogens that severely decrease our numbers; it is a possibility (which is why I don’t support that new controversial research where scientists engineer new flu variants to make new vaccines—airborne pathogens are just way too transmissible to mess around with IMO) but there are also “natural” pathogens that have high lethality, like ebola, as a fairly successful virulence strategy. (Reddit comment)
The Origin of AIDS (Reddit external link)
This article makes perfect sense. So once again the medical mucky mucks are “baffled” by the same “elephant in the room” … that caused similar misery during the whole H1N1 scare and the rush to vaccinate pregnant women which resulted is a 4000% increase in fetal deaths. (YourNewsWire comment)
The theme or argument making references to historical precedent is related to the previous framing device, suggestion of generalized or specific threat, since it draws on understandings of previous disease outbreaks to position the conspiracy at hand as more threatening and concrete by association. Elements 7 and 9 both refer to alleged past examples of irresponsible medical research leading to dangerous vaccines, a scenario in which GM mosquitoes causing a health crisis is seemingly unsurprising because such an event has played out repeatedly in the past. Thus, the dramatic occurrence of a biotech company causing a major epidemic is framed to not seem so dramatic or unbelievable—it has happened before, and so it could be happening now.
Indeed, anti-vaccination messages often take the form of a so-called Galileo gambit—invoking the names of past scientists who were challenged by the scientific dogma of the time—to suggest that because there are historical precedents of the “establishment” shutting down legitimate inquiry, it is reasonable to believe that anti-vax figures are being persecuted or challenged because they do not conform to current dogma (Kata, 2012, p. 3783). Similarly, anti-vaxxers often cite instances where scientific bodies were initially mistaken about the safety of something that was later found to be hazardous, such as Thalidomide or cigarettes. “The implication is that because of previous errors, the science supporting vaccination is also in error” (p. 3783).
This fourth theme or argument takes two general forms. First, it may present “alternative” information that often masquerades as rigorous scientific inquiry, positioning itself as a viable alternative to the mainstream, accepted viewpoint. Appeals to journalistic objectivity or critical thinking demand that such alternative viewpoints be considered in the name of fairness, free speech, or scientific investigation, perhaps relying on the “well-established rhetorical meme … of the courageous independent scientist resisting orthodoxy” (Goertzel, 2010, p. 496). Elements 10 through 13 are examples of this framing device.
These genes have sequences called promoters which bind due to some stimuli, and this binding allows for gene readout to begin after it. In viral techniques, there is a growing concern since it seems that in some cases active viral genes are also transferred to the organism. Sometimes it’s because we thought they were inert, other times it’s just carelessness. (Reddit comment)
The particular strain of Oxitec GM mosquitoes, OX513A, are genetically altered so the vast majority of their offspring will die before they mature—though Dr. Ricarda Steinbrecher published concerns in a report in September 2010 that a known survival rate of 3–4 percent warranted further study before the release of the GM insects. Her concerns, which were echoed by several other scientists both at the time and since, appear to have been ignored—though they should not have been. (AntiMedia article)
When examining a rapidly expanding potential pandemic, it’s necessary to leave no stone unturned so possible solutions, as well as future prevention, will be as effective as possible. (AntiMedia article)
At the time, concerns were raised about the release of GMMs without further studies into possible side effects. “It’s a very experimental approach which has not yet been successful and may cause more harm than good,” Dr Helen Wallace, director of GeneWatch, told the Guardian in 2012. (RT article)
Element 11, which is text from the main body of the Anti-Media article, makes reference to a study that it frames as a dissenting scientific opinion. In the cited study, it was noted that a very small portion of the offspring of the GM mosquitoes released in Brazil could survive (despite being genetically altered so that they would die before maturing). This finding is positioned as proof that GM mosquitoes could cause Zika, which authorities overlooked (or perhaps wilfully ignored). This kind of rhetorical move—the theme here characterized as an “appeal to an alternative knowledge community”—often involves the appropriation of legitimate scientific information, misinterpreting or decontextualizing it in a manner that lends credence to the proposed conspiracy theory. Similarly, proponents of Intelligent Design—a religiously motivated “alternative theory” to evolution—use a veneer of scientific rigour to position their theory as a legitimate alternative (Ceccarelli, 2011).
In addition to the presentation of alternative information, the second form this theme or argument takes is inviting readers to join a community of truth seekers or critical thinkers. Throughout the Reddit thread, and the Anti-Media and YourNewsWire articles and comment sections, there are calls to evaluate the information and question the mainstream explanations and narratives; commenters characterize themselves as critical thinkers who know better than to accept the official accounts from governments and other authorities.
By invoking an alternative knowledge community, conspiratorial narratives are reframed from fringe beliefs into “another way of knowing” (Kata, 2012, p. 3783). Subscribing to the narrative is presented as being critical and reflective, perhaps by identifying apparent gaps in scientific knowledge or claiming that it is necessary to take a balanced approach that assesses “the other side.” In the words of Goertzel (2010), “Dissenters from mainstream science often invoke a meme that there are two sides to every question and each side is entitled to equal time to present its case” (p. 496), a rhetorical move that reframes the dissent so it is not being “anti-science” or rejecting the facts, but rather presenting an alternative scientific viewpoint.
This article examined both the construction and circulation of a conspiracy theory, characterizing the framing devices present and tracing the flow of the (mis)information across networks—a dynamic that serves to spread misinformation, ignorance, and ultimately, in many cases, undue fear and panic that negatively affects public health policy. Through the network analysis of Twitter activity surrounding GM mosquitoes and Zika, it was found that there is a sharp divide in the network: traditional news sources and alternative/conspiracy news sources are mostly isolated from one another. This suggests that people viewing and sharing the conspiratorial narratives are unlikely to encounter or engage with information supporting the release of GM mosquitoes, such as an article by the Guardian that was widely shared and engaged with by supporters yet not by sceptics. The flow of the misinformation was documented, starting with the original Reddit post, spreading to three popular articles from different conspiracy sites (Anti-Media, RT.com, and YourNewsWire), and then spreading out to a network of other sites. Notably, the volume and sentiment of tweets concerning GM mosquitoes and Zika changed drastically once this misinformation began spreading, with 101 tweets the week before compared to 992 the week after, and 1.3 percent expressing a negative sentiment before compared to 79.7 percent after. These findings support the idea that due to self-selecting behaviours and social media algorithms, the networks that spread misinformation—and the ones that attempt to counter it—may be isolated from one another (Bode & Vraga, 2015; Diresta, 2018; Svalastog, Allgaier, & Gajović, 2015).
Finally, the frame analysis uncovered four main framing devices: institutional distrust, suggestion of generalized or specific threat, references to historical precedent, and appeals to an alternative knowledge community. The prevalent institutional distrust is in line with previous work on anti-vaxxer beliefs and climate change scepticism, which suggests that such conspiratorial beliefs are not based on the rejection of specific scientific evidence, but rather a broader distrust of institutions such as governments, health agencies, corporations, and scientific research in general (Attwell, Leask, Meyer, Rokkas, & Ward, 2017; Goertzel, 2010; Uscinski et al., 2017). This in turn suggests that in terms of effectively countering such conspiratorial narratives and misinformation, “more evidence, statistics and debunking strategies are the least likely to work” (Greenberg, Dubé, & Driedger, 2017, para. 32; also see: Peter & Koch, 2016). In the case of anti-vaxxer beliefs, it is often mistakenly assumed that this stance is a result of lacking knowledge and information—rather than underlying distrust toward the institutions that generate and propagate scientific knowledge, which results in anti-vaxxers doubting the veracity of the data and intentions of the people and organizations involved in producing it (Greenberg et al., 2017).
The other frames uncovered in this study similarly suggest that an information deficit is not the source of the conspiratorial beliefs, and that health authorities merely sharing accurate information would not be an effective means of countering these beliefs. For example, the suggestion of generalized or specific threat indicates that such distrust is motivated by affective understandings of risk, a situation in which people are “invested in their belief systems” and “it is unlikely that ‘the facts’ alone will ever sway the truly committed” (Kata, 2012, p. 3784). Yang and Chu (2018) found that U.S. public perceptions of the 2014 Ebola outbreak were connected to emotions such as fear, anger, and anxiety. They posited that such emotions and concerns about a domestic outbreak might have inhibited how people processed information and affected support for “institutional mitigation measures such as sending more health professionals to help countries in West Africa deal with the Ebola outbreak” (p. 834). For the conspiratorial narratives concerning Zika, framing that involved historical precedents and appeals to alternative knowledge communities indicate that people may attempt to reframe their beliefs, presenting their viewpoint as something based on established history and alternative scientific facts rather than an anti-science stance.
If providing accurate information is not an effective counter to conspiratorial narratives, how can the spread of health and science misinformation be combatted? Some work has suggested the use of “inoculating messages” and pre-emptive attempts to raise awareness of the flawed argumentation techniques prevalent in misinformation (Cook, Lewandowsky, & Ecker, 2017). Roozenbeek and van der Linden (2018) contend that such methods are particularly useful given the rapid spread of misinformation through social and digital media, which makes it difficult to offer a corrective after the fact. Further, although pre-emptive engagement with material that addresses and counters conspiracy theories appears to offer a “protective” effect, such conspiratorial beliefs are difficult to change once they are established (Jolley & Douglas, 2017).
Health- and science-related conspiracy theories and misinformation can result in behaviours that are detrimental to public health, such as forgoing vaccinations in the case of anti-vaxxer beliefs (Greenberg et al., 2017; Jolley & Douglas, 2017); pushing for harmful border closures and travel restrictions during epidemics (SteelFisher et al., 2015; Towers et al., 2015; Yang & Chu, 2018; York, 2014); or opposing beneficial policies and interventions such as the release of GM mosquitoes to reduce the spread of disease. Investigating how such conspiratorial narratives are discursively framed and circulated can help us understand how to more effectively address or pre-empt this misinformation.
Anderson, Janna, & Rainie, Lee. (2017, October 19). The future of truth and misinformation online. Pew Research Center. URL: http://www.pewinternet.org/2017/10/19/the-future-of-truth-and-misinformation-online [November 19, 2017].
Annenberg Public Policy Center. (2016). ZIKA February 12–16, 2016 survey (Week 1). URL: http://www.annenbergpublicpolicycenter.org/half-of-americans-concerned-zika-will-spread-to-their-neighborhoods [February 12, 2016].
The Anti-Media. (2016). About us. URL: http://theantimedia.org/about [March 2, 2016].
Attwell, Katie, Leask, Julie, Meyer, Samantha B., Rokkas, Philippa, & Ward, Paul. (2017). Vaccine rejecting parents’ engagement with expert systems that inform vaccination programs. Journal of Bioethical Inquiry, 14(1), 65–76.
Bateman, Tiffany. (2016, January 28). Zika virus: 6 things to know about the growing outbreak. CBC News. URL: https://www.cbc.ca/news/health/zika-virus-explained-1.3422663 [December 3, 2016].
Bernish, Claire. (2016, January 28). Zika outbreak epicenter in same area where GM mosquitoes were released in 2015. Anti-Media. URL: http://theantimedia.org/zika-outbreak-epicenter-in-same-area-where-gm-mosquitoes-were-released-in-2015 [February 16, 2016].
Bhattacharya, S., Srinivasan, P., & Polgreen, P. (2014). Engagement with health agencies on Twitter. PloS One, 9(11), e112235.
Bode, Leticia, & Vraga, Emily K. (2015). In related news, that was wrong: The correction of misinformation through related stories functionality in social media. Journal of Communication, 65(4), 619–638.
Boyd, Amanda D., Jardine, Cynthia G., & Driedger, Michelle S. (2009). Canadian media representations of mad cow disease. Journal of Toxicology and Environmental Health, Part A, 72(17–18), 1096–1105.
Bozdag, Engin, & van den Hoven, Jeroen. (2015). Breaking the filter bubble: Democracy and design. Ethics and Information Technology, 17(4), 249–265.
Bricker, Brett J. (2013). Climategate: A case study in the intersection of facticity and conspiracy theory. Communication Studies, 64(2), 218–239.
Briones, R., Nan, X., Madden, K., & Waks, L. (2012). When vaccines go viral: An analysis of HPV vaccine coverage on YouTube. Health Communication, 27(5), 478–485.
Carson, James. (2019, February 18). Fake news: What exactly is it—and how can you spot it? The Telegraph. URL: https://www.telegraph.co.uk/technology/0/fake-news-exactly-has-really-had-influence [February 26, 2019].
Ceccarelli, Leah. (2011). Manufactured scientific controversy: Science, rhetoric, and public debate. Rhetoric and Public Affairs, 14(2), 195–228.
Centers for Disease Control and Prevention. (2014). CDC telebriefing: CDC update on first Ebola case diagnosed in the United States. URL: http://www.cdc.gov/media/releases/2014/t1008-ebola-confirmed-case.html [February 5, 2016].
Chou, Wen-ying Sylvia, Hunt, Yvonne M., Beckjord, Ellen Burke, Moser, Richard P., & Hesse, Bradford W. (2009). Social media use in the United States: Implications for health communication. Journal of Medical Internet Research, 11(4), e48.
Cook, J., Lewandowsky, S., & Ecker, U.K.H. (2017). Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence. PLoS One, 12(5), e0175799.
Corner, Adam J., Whitmarsh, Lorraine E., & Xenias, Dimitrios. (2012). Uncertainty, scepticism and attitudes towards climate change: Biased assimilation and attitude polarisation. Climatic Change, 114(3), 463–478.
Daudin, Guillaume. (2018, December 15). Fake news vs fact in online battle for truth. Phys.org. URL: https://phys.org/news/2018-12-fake-news-fact-online-truth.html [February 26, 2019].
Davison, Patrick. (2012). The language of internet memes. In Michael Mandiberg (Ed.), The Social Media Reader (pp. 120–134). New York, NY: New York University Press.
Diresta, Renee. (2018, November 13). Online conspiracy groups are a lot like cults. Wired. URL: https://www.wired.com/story/online-conspiracy-groups-qanon-cults [February 26, 2019].
Dyar, Oliver J., Castro-Sánchez, Enrique, & Holmes, Alison H. (2014). What makes people talk about antibiotics on social media? A retrospective analysis of Twitter use. Journal of Antimicrobial Chemotherapy, 69(9), 2568–2572.
Erlanger, Steven. (2017, March 8). What is RT? The New York Times. URL: https://www.nytimes.com/2017/03/08/world/europe/what-is-rt.html [June 4, 2017].
Faasse, Kate, Chatman, Casey J., & Martin, Leslie R. (2016). A comparison of language use in pro- and anti-vaccination comments in response to a high profile Facebook post. Vaccine, 34(47), 5808–5814.
Fitzpatrick, Mike. (2009). Swine flu panic. British Journal of General Practice, 59(563), 457.
Fung, Isaac C., Tse, Zion T., Cheung, Chi-Ngai, Miu, Adriana S., & Fu, King-Wa. (2014). Ebola and the social media [Correspondence]. Lancet, 384(9961), 2207.
Gallie, Walter B. (1964). Essentially contested concepts. In Walter B. Gallie (Ed.), Philosophy and the Historical Understanding (pp. 157–191). London: Chatto & Windus.
Gerlach, Neil, & Hamilton, Sheryl N. (2014). Trafficking in the zombie: The CDC zombie apocalypse campaign, diseaseability and pandemic culture. Refractory: A Journal of Entertainment Media. URL: http://refractory.unimelb.edu.au/2014/06/26/cdc-zombie-apocalypse-gerlach-hamilton [June 3, 2017].
Gesser-Edelsburg, Anat, Diamant, Alon, Hijazi, Rana, & Mesch, Gustavo S. (2018). Correcting misinformation by health organizations during measles outbreaks: A controlled experiment. PloS One, 13(12), e0209505.
Girginova, Katerina. (2015). New media, creativity, and the Olympics: A case study into the use of #NBCFail during the Sochi Winter Games. Communication & Sport, 4(3), 243–260.
Goertzel, Ted. (1994). Belief in conspiracy theories. Political Psychology, 15(4), 731–742.
Goertzel, Ted. (2010). Conspiracy theories in science. EMBO Reports, 11(7), 493–499.
Greenberg, Josh, Dubé, Ève, & Driedger, Michelle. (2017). Vaccine hesitancy: In search of the risk communication comfort zone. PLoS One. URL: http://currents.plos.org/outbreaks/index.html%3Fp=70808.html [January 2, 2017].
Grennell, Amanda. (2018, July 6). What happened to Zika? PBS News Hour. URL: https://www.pbs.org/newshour/science/what-happened-to-zika [January 2, 2017].
Grimes, David R. (2016). On the viability of conspiratorial beliefs. PLoS One, 11(1), e0147905.
Gruzd, Anatoliy. (2016). Netlytic: Software for automated text and social network analysis. URL: http://Netlytic.org [January 2, 2017].
Gruzd, Anatoliy, Wellman, Barry, & Takhteyev, Yuri. (2011). Imagining Twitter as an imagined community. American Behavioral Scientist, 55(10), 1294–1318.
Heldman, Amy B., Schindelar, Jessica, & Weaver III, James B. (2013). Social media engagement and public health communication: Implications for public health organizations being truly “social.” Public Health Reviews, 35(13), 1–18.
Himelboim, Itai, Smith, Marc, & Shneiderman, Ben. (2013). Tweeting apart: Applying network analysis to detect selective exposure clusters in Twitter. Communication Methods and Measures, 7(3), 169–223.
Howell, Lee. (2013). Digital wildfires in a hyperconnected world. World Economic Forum report. URL: http://reports.weforum.org/global-risks-2013/risk-case-1/digital-wildfires-in-a-hyperconnected-world [January 24, 2017].
Huang, Linlin Y., Starbird, Kate, Orand, Mania, Stanek, Stephanie A., & Pedersen, Heather T. (2015). Connected through crisis: Emotional proximity and the spread of misinformation online. In Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing (pp. 969–980). New York, NY: ACM.
Ironstone-Catterall, Penelope. (2011). Narrating the coming pandemic: Pandemic influenza, anticipatory anxiety, and neurotic citizenship. In Paul Crosthwaite (Ed.), Criticism, crises, and contemporary narrative: Textual horizons in an age of global risk (pp. 81–94). London: Routledge.
Joffe, Hélène, & Haarhoff, Georgina. (2002). Representations of far-flung illnesses: The case of Ebola in Britain. Social Science and Medicine, 54(6), 955–969.
Jolley, Daniel, & Douglas, Karen M. (2017). Prevention is better than cure: Addressing anti-vaccine conspiracy theories. Journal of Applied Social Psychology, 47(8), 459–469.
Kasperson, Jeanne X., Kasperson, Roger E., Pidgeon, Nick, & Slovic, Paul. (2003). The social amplification of risk: Assessing fifteen years of research and theory. In Nick Pidgeon & Roger E. Kasperson (Eds.), The social amplification of risk (pp. 13–46). New York, NY: Cambridge University Press.
Kata, Anna. (2012). Anti-vaccine activists, Web 2.0, and the postmodern paradigm—An overview of tactics and tropes used online by the anti-vaccination movement. Vaccine, 30(25), 3778–3789.
Korda, Holly, & Itani, Zena. (2013). Harnessing social media for health promotion and behavior change. Health Promotion Practice, 14(1), 15–23.
Leach, Melissa, & Dry, Sarah. (2010). Epidemic narratives. In Sarah Dry & Melissa Leach (Eds.), Epidemics: Science, governance and social justice (pp. 1–22). New York, NY: Routledge.
Levin, Sam. (2018, October 18). Facebook has a fake news ‘war room’—but is it really working? The Guardian. URL: https://www.theguardian.com/technology/2018/oct/18/facebook-war-room-social-media-fake-news-politics [February 26, 2019].
Levin, Sam. (2019, February 1). Snopes quits Facebook’s factchecking program amid questions over its impact. The Guardian. URL: https://www.theguardian.com/technology/2019/feb/01/snopes-facebook-factchecking-program-false-news [February 26, 2019].
Lynas, Mark. (2016, February 4). Alert! There’s a dangerous new viral outbreak: Zika conspiracy theories. The Guardian. URL: http://www.theguardian.com/world/2016/feb/04/alert-theres-a-dangerous-new-viral-outbreak-zika-conspiracy-theories [February 18, 2016].
Ma, Jinxuan, & Stahl, Lynne. (2017). A multimodal critical discourse analysis of anti-vaccination information on Facebook. Library and Information Science Research, 39(4), 303–310.
Magnus, David. (2008). Risk management versus the precautionary principle. In Robert N. Proctor & Londa Schiebinger (Eds.), Agnotology: The making and unmaking of ignorance (pp. 250–265). Stanford, CA: Stanford University Press.
Marwick, Alice, & Lewis, Rebecca. (2017). Media manipulation and disinformation online. New York, NY: Data & Society Research Institute.
Meylakhs, Peter, Rykov, Yuri, Koltsova, Olessia, & Koltsov, Sergey. (2014). An AIDS-denialist online community on a Russian social networking service: Patterns of interactions with newcomers and rhetorical strategies of persuasion. Journal of Medical Internet Research, 16(11), e261.
Mollema, L., Harmsen, I.A., Broekhuizen, E., Clijnk, R., De Melker, H., Paulussen, T., Kok, G., Ruiter, R., & Das, E. (2015). Disease detection or public opinion reflection? Content analysis of tweets, other social media, and online newspapers during the measles outbreak in the Netherlands in 2013. Journal of Medical Internet Research, 17(5), e128.
Nagpal, Sajan J.S., Karimianpour, Ahmadreza, Mukhija, Dhruvika, & Mohan, Diwakar. (2015). Dissemination of ‘misleading’ information on social media during the 2014 Ebola epidemic: An area of concern [Correspondence]. Travel Medicine and Infectious Disease, 13(4), 338–339.
Neiger, Brad L., Thackeray, Rosemary, Burton, Scott H., Thackeray, Callie R., & Reese, Jennifer. (2013). Use of Twitter among local health departments: An analysis of information sharing, engagement, and action. Journal of Medical Internet Research, 15(8), e177.
Nyhan, Brendan, & Reifler, Jason. (2015). Does correcting myths about the flu vaccine work? An experimental evaluation of the effects of corrective information. Vaccine, 33(3), 459–464.
Oreskes, Naomi, & Conway, Erik M. (2008). Challenging knowledge: How climate science became a victim of the cold war. In Robert N. Proctor & Londa Schiebinger (Eds.), Agnotology: The making and unmaking of ignorance (pp. 55–89). Stanford, CA: Stanford University Press.
Pan American Health Organization. (2018, January 4). Zika cases and congenital syndrome associated with Zika virus reported by countries and territories in the Americas, 2015–2018: Cumulative cases. URL: https://www.paho.org/hq/index.php?option=com_docman&view=download&category_slug=cumulative-cases-pdf-8865&alias=43296-zika-cumulative-cases-4-january-2018-296&Itemid=270&lang=en [February 26, 2019].
Pariser, Eli. (2011). The filter bubble: How the new personalized web is changing what we read and how we think. New York, NY: Penguin Press.
Peter, Christina, & Koch, Thomas. (2016). When debunking scientific myths fails (and when it does not): The backfire effect in the context of journalistic coverage and immediate judgments as prevention strategy. Science Communication, 38(1), 3–25.
Procter, Rob, Vis, Farida, & Voss, Alex. (2013). Reading the riots on Twitter: Methodological innovation for the analysis of big data. International Journal of Social Research Methodology, 16(3), 197–214.
Reddit. (2016). About reddit. URL: https://www.reddit.com/about/ [March 21, 2016].
Redditsucksatbanning. (2016, January 25). Genetically modified mosquitoes released in Brazil in 2015 linked to the current Zika epidemic? [Posted message]. URL: https://www.reddit.com/r/conspiracy/comments/42mhii/genetically_modified_mosquitoes_released_in [February 14, 2016].
Robertson, David G. (2016). UFOs, conspiracy theories and the New Age: Millennial conspiracism. New York, NY: Bloomsbury Publishing.
Rodriguez-Morales, Alfonso J., Castañeda-Hernández, Diana M., & McGregor, Alastair. (2015). What makes people talk about Ebola on social media? A retrospective analysis of Twitter use. Travel Medicine and Infectious Disease, 13(1), 100–101.
Roozenbeek, Jon, & van der Linden, Sander. (2018). The fake news game: Actively inoculating against the risk of misinformation. Journal of Risk Research. doi:10.1080/13669877.2018.1443491.
Ryan, Charlotte, Carragee, Kevin M., & Meinhofer, William. (2001). Theory into practice: Framing, the news media, and collective action. Journal of Broadcasting & Electronic Media, 45(1), 175–182.
Scanfeld, Daniel, Scanfeld, Vanessa, & Larson, Elaine L. (2010). Dissemination of health information through social networks: Twitter and antibiotics. AJIC: American Journal of Infection Control, 38(3), 182–188.
Schipani, Vanessa. (2016, February 23). A conspiracy theory links the Gates Foundation to the spread of Zika virus. Don’t believe it. Huffington Post. URL: http://www.huffingtonpost.com/entry/no-genetically-modified-mosquitoes-did-not-cause-zika-virus_us_56cc7b26e4b041136f184f64 [March 2, 2016].
Signorini, Alessio, Segre, Alberto M., & Polgreen, Philip M. (2011). The use of Twitter to track levels of disease activity and public concern in the U.S. during the influenza A H1N1 pandemic. PLoS One, 6(5), e19467.
Simis, Molly J., Madden, Haley, Cacciatore, Michael A., & Yeo, Sara K. (2016). The lure of rationality: Why does the deficit model persist in science communication? Public Understanding of Science, 25(4), 400–414.
Slovic, Paul. (1987). Perception of risk. Science, 236(4799), 280–285.
Specter, Michael. (2016, February 25). The dangerous conspiracy theories about the Zika virus. The New Yorker. URL: http://www.newyorker.com/news/daily-comment/the-dangerous-conspiracy-theories-about-the-zika-virus [February 27, 2016].
Statista. (2017). Leading social networks worldwide as of August 2017, ranked by number of active users (in millions). Statista.com. URL: https://www.statista.com/statistics/272014/global-social-networks-ranked-by-number-of-users [February 26, 2019].
SteelFisher, Gillian K., Blendon, Robert J., & Lasala-Blanco, Narayani. (2015). Ebola in the United States: Public reactions and implications. New England Journal of Medicine, 373, 789–791.
Sumiala, Johanna, Tikka, Minttu, Huhtamäki, Jutta, & Valaskivi, Katja. (2016). #JeSuisCharlie: Towards a multi-method study of hybrid media events. Media and Communication, 4(4), 97–108.
Sunstein, Cass R., & Vermeule, Adrian. (2009). Conspiracy theories: Causes and cures. Journal of Political Philosophy, 17(2), 202–227.
Svalastog, Anna L., Allgaier, Joachim, & Gajović, Srećko. (2015). Navigating knowledge landscapes: On health, science, communication, media, and society. Croatian Medical Journal, 56(4), 321–323.
Towers, Sherry, Afzal, Shehzad, Bernal, Gilbert, Bliss, Nadya, Brown, Shala, Espinoza, Baltazar, … Castillo-Chavez, Carlos. (2015). Mass media and the contagion of fear: The case of Ebola in America. PLoS One, 10(6). doi: 10.1371/journal.pone.0129179
Uldam, Julie, & Askanius, Tina. (2013). Online civic cultures? Debating climate change activism on YouTube. International Journal of Communication, 7(2013), 1185–1204.
Uscinski, Joseph E., Douglas, Karen, & Lewandowsky, Stephan. (2017, September 26). Climate change conspiracy theories. Oxford Research Encyclopedia of Climate Science. URL: http://oxfordre.com/climatescience/view/10.1093/acrefore/9780190228620.001.0001/acrefore-9780190228620-e-328 [December 3, 2017].
Del Vicario, Michela, Bessi, Alessandro, Zollo, Fabiana, Petroni, Fabio, Scala, Antonio, Caldarelli, Guido, Stanley, H. Eugene, & Quattrociocchi, Walter. (2016). The spreading of misinformation online. Proceedings of the National Academy of Sciences of the United States of America, 113(3), 554–559.
Wagner-Egger, Pascal, Bangerter, Adrian, Gilles, Ingrid, Green, Eva, Rigaud, David, Krings, Franciska, Staerklé, Christian, & Clémence, Alain. (2011). Lay perceptions of collectives at the outbreak of the H1N1 epidemic: Heroes, villains and victims. Public Understanding of Science, 20, 461–476.
Wald, Priscilla. (2008). Contagious: Cultures, carriers, and the outbreak narrative. Durham, NC: Duke University Press.
Washer, Peter. (2004). Representations of SARS in the British newspapers. Social Science and Medicine, 59(12), 2561–2571.
Waterson, Jim. (2018, July 27). Democracy at risk due to fake news and data misuse, MPs conclude. The Guardian. URL: https://www.theguardian.com/technology/2018/jul/27/fake-news-inquiry-data-misuse-deomcracy-at-risk-mps-conclude [February 26, 2019].
Wilcox, Christie. (2016, January 31). No, GM mosquitoes didn’t start the Zika outbreak. Discover. URL: http://blogs.discovermagazine.com/science-sushi/2016/01/31/genetically-modified-mosquitoes-didnt-start-zika-ourbreak/#.VwQgRfkrKUl [February 13, 2016].
World Health Organization. (2016). Zika situation report. URL: http://apps.who.int/iris/bitstream/10665/204348/1/zikasitrep_5Feb2016_eng.pdf?ua=1 [February 17, 2016].
Yang, Janet Z., & Chu, Haoran. (2018). Who is afraid of the Ebola outbreak? The influence of discrete emotions on risk perception. Journal of Risk Research, 21(7), 834–853.
York, Geoffrey. (2014, August 14). World moves to cut off West Africa as Ebola panic intensifies. The Globe and Mail. URL: http://www.theglobeandmail.com/life/health-and-fitness/health/ebola-hysteria-sweeps-africa/article20072025 [February 19, 2016].
Zika virus infection fast facts. (2016, July 18). CNN.com. URL: https://www.cnn.com/2016/07/18/health/zika-virus-infection-fast-facts/index.html [June 6, 2017].