20 April 2020

As the Coronavirus Spreads, Conspiracy Theories Are Going Viral Too

BY ELISE THOMAS

Long dismissed as absurd, conspiracy theories on social media increasingly pose a genuine global threat, and they have become an asset for states looking to disrupt geopolitical narratives and spread disinformation.

In the context of the global COVID-19 crisis, conspiracy theories have exploded across digital news sites and social media. While propaganda campaigns amid pandemics are nothing new, what is new in the current crisis is the global information environment in which it is playing out. The all-too-real impacts and stresses of the pandemic feed into the preexisting dynamics of the online information ecosystem, amplifying rumors, misinformation, conspiracies, and outright lies. For governments seeking to build trust and communicate clearly, it’s a nightmare. For those looking to sow chaos and doubt, it’s an opportunity.

There is a concept in social media studies known as “context collapse.” Usually attributed to the researcher Danah Boyd, it refers to the way in which social media platforms take messages that the sender intended to be seen by one audience in a given context and serve them up to others who were not the intended targets.

Chances are you’ve experienced it yourself. Many of us have felt the awkwardness of posting a joke on Facebook intended for our friends only to have a grandmother reply, or of making less-than-professional comments on a personal Twitter account only to have the boss bring them up on Monday. Social media platforms have a way of smashing social contexts into one another, so that messages tailored for one audience end up hitting others as well and being interpreted in unanticipated ways.

What the COVID-19 crisis is demonstrating is that this dynamic does not just apply to individual social media users managing personal and professional relationships; it also applies to the cacophony of conspiracy theories raging across the screens and through the minds of social media users around the world.

In the past, pandemic-related conspiracy theories and rumors in London, Tehran, Kinshasa, Shenzhen, and Moscow would have remained largely distinct from one another. In an era of global social media platforms, however, the dynamics of context collapse mean that conspiracy theories promoted by users in one place are colliding with those promoted by users in others. The fragmented nature of social media chops conspiracies into little pieces—a factoid here, a false claim there—creating a kind of information petri dish for conspiracy cross-propagation, allowing half-true facts, decontextualized narratives, and false beliefs to flow and fold into one another and spread rapidly across the world.

One of the key ways in which this happens is through hashtags. Common conspiracy hashtags such as #coronahoax or #covid19hoax are used by multiple groups of conspiracists in various countries and serve as vectors of transmission between them: 5G and anti-vaccination conspiracy theorists scrolling through the hashtag encounter far-right or QAnon content, and vice versa. Conspiracy theorists also actively use hashtags to push their message around the world, for example by adopting multiple region-specific hashtags.

[Screenshot of the #covid19hoax hashtag showing multiple conspiracies, including anti-5G, Agenda 21, climate change denial, and pro-Trump content. Captured April 7.]

This conspiracy theory contagion has the overall effect of amplifying and strengthening conspiracy theories, partly due to the nature of social media algorithms, which are designed to optimize for engagement. On a basic level, the more conspiracists you have believing in a particular untruth, the more content they generate promoting that untruth and the more they engage with that content.

For example, there could be five different conspiracy theories, but if they all contain the untruth that the coronavirus was created in the Fort Detrick lab in Maryland, the result overall is far more content connecting Fort Detrick to COVID-19 than would have been the case if the untruth were confined to one conspiracy. Algorithms designed to optimize for engagement will factor in the high level of engagement on content that connects Fort Detrick to COVID-19 and start actively recommending the conspiracy to other users. For example, as of April 8, Google’s top recommended related searches for “Fort Detrick” included “Fort Detrick coronavirus” and “Fort Detrick bioweapon.”
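
To make that pooling effect concrete, here is a minimal Python sketch. It is purely illustrative, not any platform’s actual algorithm; the communities, claims, and engagement numbers are invented assumptions. It shows how a ranker that optimizes for engagement per claim, rather than per community, ends up boosting whichever untruth the most groups happen to share.

```python
# A toy model of engagement-optimized ranking. Everything here is an
# invented assumption for illustration; it is not any platform's real code.
from collections import defaultdict

# Posts from five hypothetical conspiracy communities. Each post tags the
# claims it repeats and carries an engagement count (likes + shares + replies).
posts = [
    {"community": "anti-5g",   "claims": ["fort-detrick-lab"], "engagement": 120},
    {"community": "anti-vax",  "claims": ["fort-detrick-lab"], "engagement": 95},
    {"community": "qanon",     "claims": ["fort-detrick-lab"], "engagement": 210},
    {"community": "agenda-21", "claims": ["5g-radiation"], "engagement": 40},
    {"community": "far-right", "claims": ["fort-detrick-lab", "5g-radiation"], "engagement": 180},
]

# An engagement-optimizing ranker sees only total engagement per claim,
# not which community produced it, so overlapping claims pool their signal.
claim_engagement = defaultdict(int)
for post in posts:
    for claim in post["claims"]:
        claim_engagement[claim] += post["engagement"]

# The claim shared across four communities dominates the recommendation
# queue, even though no single community supplied that much engagement alone.
for claim, score in sorted(claim_engagement.items(), key=lambda kv: -kv[1]):
    print(f"{claim}: {score}")
# Output:
# fort-detrick-lab: 605
# 5g-radiation: 220
```

In this toy example, no single community generates the winning signal on its own; it is the overlap between them that pushes the shared claim to the top, which is the same dynamic that surfaces recommendations like “Fort Detrick coronavirus” to users who never sought them out.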

These dynamics are having real-world consequences. For example, recent attacks on telecommunications infrastructure in the United Kingdom have been directly linked to conspiracy theories in which the COVID-19 crisis has been folded into existing anti-vaccination and anti-5G narratives. Conspiracy theories about the supposed health effects of 5G have spread like wildfire across social media in recent years, in part because they run through long-established anti-vaccination groups, many of which now believe either that 5G causes illness directly or that it is a deliberate effort to use radiation to weaken immune systems and force everyone to accept vaccinations.

The specifics of how COVID-19 is grafted onto these preexisting conspiracies vary; some say the coronavirus crisis is a cover to fast-track the implementation of 5G networks, while others believe 5G trials in Wuhan, China, damaged the immune systems of residents as part of a wider plan to impose forced vaccinations. Some claim that maps of 5G hot spots match up with COVID-19 outbreaks or think it has something to do with interfering with atmospheric oxygen (or maybe it’s a plan to turn humanity into cyborgs, which is also in there). Some strains combine all of the above, spinning a nonsensical narrative about an effort by Microsoft co-founder Bill Gates to depopulate the planet using vaccines, 5G, and the coronavirus.

Google has banned advertising on search terms related to the 5G coronavirus conspiracy, and Twitter and Facebook are stepping up efforts to crack down on 5G coronavirus conspiracies on their platforms. It is not clear that this will succeed in preventing the conspiracy’s spread, however. Removing conspiracy content can itself fuel conspiracies by creating a sense of victimization and censorship (“Here’s what they don’t want you to know!”), and the widespread mainstream media coverage that followed the attacks and highlighted the conspiracy will inevitably drive more users to search for information about it, thereby both spreading the theory and feeding algorithmic feedback loops.

Acts of alleged domestic terrorism have also been linked to coronavirus-related conspiracies in recent weeks. In the United States, on March 31, 44-year-old Eduardo Moreno deliberately derailed a train in the Port of Los Angeles near the USNS Mercy, a naval ship being used in COVID-19 response efforts.

In his interview with the police, Moreno said his goal was to draw attention to the Mercy, which he believed “had an alternate purpose related to COVID-19 or a government takeover.” The case is being investigated by both the Los Angeles Port Police and the FBI’s Joint Terrorism Task Force. It is not yet clear which of the several conspiracy theories circulating about the Mercy motivated Moreno, a fact that speaks volumes in itself.

[Screenshot of more #covid19hoax disinformation, showing 5G conspiracies and the hoax claim that the coronavirus originated in a U.S. lab. Captured April 7.]

Beyond domestic terrorism concerns, and broader concerns about the erosion of truth and trust in basic facts, the dynamics of conspiracy collapse matter geopolitically because they make conspiracy theories a tremendously resource-efficient way for state actors and others to contest or undermine basic facts.

Russia has used conspiracy theories as a weapon against the West for decades. In a fascinating parallel to today, during the 1980s the Kremlin engaged in a yearslong disinformation campaign known as Operation Infektion to spread the conspiracy theory that HIV was a bioweapon created by the United States—also at Fort Detrick, which has since become a staple of bioweapons conspiracy theories, featuring in conspiracies about Ebola and anthrax as well. History doesn’t repeat itself, but it can be digitally remastered.

Operation Infektion was a long-term, resource-intensive campaign involving multiple Soviet-funded print and radio outlets, and it took months and in some cases years to disseminate its narrative across the world. Today’s conspiracies, by contrast, are launched into a global information infrastructure optimized for virality.

In 2020, conspiracy theories can reach almost anywhere, almost instantly, and at incredibly low cost. The internet has also helped erode the gatekeeping powers of traditional media, allowing all sorts of convenient fellow travelers—domestic conspiracy theorists, political pundits, concerned and confused ordinary social media users—to become witting or unwitting vectors, producing content and amplifying the narrative independently at no cost at all. The evolution of the information environment in recent decades has served to make conspiracy theories a vastly quicker, cheaper, and more effective tool for spreading distrust and disinformation.

Russia has been quick to grasp the opportunities presented by the new information environment. Increasingly, it appears China is also coming to see the appeal. Chinese state media outlets and diplomats on Twitter have fostered multiple conspiracy theories about the coronavirus, whether by amplifying fringe Western media outlets to promote a U.S. origin conspiracy narrative or through twisting the words of the Italian doctor Giuseppe Remuzzi to imply the virus might have originated in Italy.

What this reaffirms is that the actual details of a conspiracy don’t matter much, so long as the theory sows confusion and doubt. Instead of the laborious planting of narratives involved in other forms of information operations, it’s more like tossing a handful of dandelion seeds into the wind; all you have to do is wait and see how far they spread.

You don’t even have to supply the seeds yourself; it’s easier and more effective to harvest what’s already growing. Take the case of Maatje Benassi, a 52-year-old cyclist who races for the U.S. Military Endurance Sports Team. On March 23, the U.S. conspiracy theorist George Webb published a video on YouTube, labeling her as the “patient zero” who transmitted the COVID-19 bioweapon from Fort Detrick to China when she competed at the Military World Games in Wuhan in October 2019. (It should be emphasized that there is no evidence whatsoever for this conspiracy theory.) Webb is a well-established YouTuber with a long back catalogue of largely U.S.-centric conspiracy promotion, and his video about Benassi appears to have been aimed squarely at his usual audience of conspiracy and right-wing viewers. But this time, others were watching too.

Chinese state media outlets picked up the conspiracy, amplifying it across multiple Chinese-language outlets and on WeChat. Google Trends data shows a surge in interest in Benassi, particularly from China and other parts of Asia, on March 24. Enough users have been searching for the conspiracy that Google recommends “Maatje Benassi coronavirus” and “Maatje Benassi patient zero” as related searches for Benassi’s name. Within days, the conspiracy was running riot across Facebook, Twitter, and YouTube in a range of languages and had been picked up by digital media around the world from Indonesia to Iran and Kashmir to Cuba.

Benassi’s case is far from unique. Similar dynamics are playing out across multiple conspiracy narratives. For example, the Czech research scientist Sona Pekova commented in an interview with a Slovak TV channel that she believed some mutations in the virus did not appear to be of natural origin (although she later clarified that she was not claiming they were definitely artificial either, only atypical).

In an eerie parallel to what happened to the Italian doctor, Pekova’s comments were picked up by a Hong Kong-based pro-China media outlet under the headline “Czech molecular biologist, Dr. Soňa Peková explains in layman terms that COVID-19 virus originates from a lab in U.S. and not China.” (She didn’t.) The story claimed that Pekova had “said that it is not China that needs to refute anything about this theory [that the virus was created in a lab].” Pekova in fact said nothing of the kind in the interview, but this has not stopped the article from being widely promoted across social media as evidence for the U.S. origin conspiracy.

Another case is that of Huang Yanling, a Chinese research scientist whom yet another YouTube conspiracy video blamed for having created COVID-19 in a lab in Wuhan. The video was picked up by U.S. pro-Trump media, and Huang’s name is now being splashed around social media.

The global response to COVID-19 has been marked by critical shortages—of ventilators, masks, and protective equipment. There is, however, another resource with a rapidly dwindling supply: public trust. Trust in governments and medical authorities is absolutely vital to achieve the kind of mass behavioral change needed to bring the world through this crisis. The corrosive effects of conspiracy theories on social media, combined with nation-states all too willing to exploit them, imperil that response and could prove fatal in more ways than one.
