REVIEW COORDINATOR: Taylor Mills
The Practice of Social Media in COVID-19 Pandemic
Employing Bruno Latour’s Actor-Network Theory (ANT), this paper focuses on the practice of social media during the COVID-19 pandemic. Social media, as a mediator, has been profoundly politicized and weaponized to facilitate the spread of misinformation and disinformation. Moreover, the network constructed by governmental agencies, social media, and users has evolved into a new strategy for manipulating public discourse and nudging public opinion. Hence, in addition to discussing the nature of the mediator in Latour’s sense, this paper will also explore how the involved actors’ power is further reinforced by fostering the circulation of misleading narratives and by shifting the debate from the pandemic itself to our prejudices, biases, and extreme emotions.
While we spent years trying to detect the real prejudices hidden behind the appearance of objective statements, do we now have to reveal the real objective and incontrovertible facts hidden behind the illusion of prejudices?(1)
Since December 2019, the COVID-19 pandemic has swept the world, imposing a significant burden and causing an increasing number of hospitalizations and deaths. While healthcare officials worldwide rush to control the spread of the virus, unbridled misinformation of all kinds is rapidly spreading over social media. If we look closer at the range and route of misinformation delivery, a network framed by governmental agencies, social media, and the public emerges. Given that all three elements are involved in this process, who, in times of public tension, should be held responsible for the spread of false and misleading narratives? What roles do they respectively play in shaping public discourse online and offline? And what is the purpose of amplifying misinformation? Moreover, as the pandemic worsens, the controversy gradually shifts from the pandemic itself in a rather bizarre direction: governmental agencies and people in different countries start blaming each other, and the pandemic escalates into a social issue and a political game. Are these merely side effects of the pandemic? What is the metaphor behind the blurred focus? And which actors caused this transition?
To decode these puzzles, this paper applies the actor-network theory (ANT) developed by Bruno Latour to analyze the mediated relationship between governmental agencies, social media, and the public. If these three elements are treated as mediators capable of translating, transforming, and distorting the meanings they are supposed to carry, how should we reexamine the significant roles they play in the spread of misinformation? Hence, the first section explains concepts from actor-network theory and new media studies, which not only reveal the symmetrical relationship between social media and human users but also unfold their function as mediators. The second section discusses a series of cases to depict a more concrete picture of how these three mediators interact with each other. The third section introduces another concept established by Latour: how does the transition from matters of fact to matters of concern impact our understanding of reshaped discourse, and why does the transition matter? Finally, the last section takes the COVID-19 pandemic as a case study to demonstrate how misinformation has circulated from the top down, how public discourse has been reconfigured and radicalized, and, more importantly, how the network constructed by governmental agencies, social media, and the public is exploited to reproduce and confirm the authorities’ power.
2. ANT: Human Users and Social Media as Mediators
The core concept of ANT is that human and non-human actors mutually shape and transform each other in the practice of networking. In ANT, Latour abandons the simplistic assumption that only human beings possess the capacity for agency. As actors, non-humans can also “authorize, allow, afford, encourage, permit, suggest, influence, block, render possible, forbid, and so on”(2). In other words, they are not “black boxes”—passive containers of information. As he suggests, the symmetrical relationship between human and non-human actors allows anyone or anything to modify a state of affairs. Thus, ANT as a whole is built upon a heterogeneous reality in which humans and non-humans are equally involved in layers of connections.
In addition, the emergence of a new hybrid network, as Latour points out, is achieved “not by transporting a force that would remain the same throughout as some sort of faithful intermediary, but by generating transformations manifested by the many unexpected events triggered in the other mediators that follow them along the line,”(3) which leads to another recurring concept in ANT—intermediaries and mediators. In Reassembling the Social (2005), Latour distinguishes the notion of mediators from that of intermediaries. As he notes:
An intermediary, in my vocabulary, is what transports meaning or force without transformation: defining its inputs is enough to define its outputs. For all practical purposes, an intermediary can be taken not only as a black box, but also as a black box counting for one, even if it is internally made of many parts. Mediators, on the other hand, cannot be counted as just one; they might count for one, for nothing, for several, or for infinity. Their input is never a good predictor of their output; their specificity has to be taken into account every time. Mediators transform, translate, distort, and modify the meaning of the elements they are supposed to carry(4).
In other words, intermediaries connect cause and effect seamlessly, accounting for predictable outcomes in which objects perform repetitive tasks, whereas mediators are unpredictable. For Latour, a network is always a concatenation of mediators which “does not trace the same connections and does not require the same type of explanations as a retinue of intermediaries transporting a case”(5). Thus, instead of transporting meaning in a predictable and routine way, mediators translate the meanings they are supposed to deliver. For instance, by processing information in binary form, a calculator, as an intermediary, transforms an input into an anticipated output. A dating app, however, uses mathematical models to predict a user’s romantic desires or preferences. During this process, the personal information the user inputs is transformed into an unpredictable result: as long as the app is being used, the recommendation list will be ceaselessly updated and adjusted for the user. In this sense, a dating app, as a mediator, produces different outputs for the same input.
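To make the contrast above concrete, the calculator and the dating app can be sketched in code (a toy illustration only; the class and method names are invented for this example and appear nowhere in Latour):

```python
def calculator(a: int, b: int) -> int:
    """Intermediary: the input alone determines the output.
    Defining its inputs is enough to define its outputs."""
    return a + b


class DatingApp:
    """Mediator: the output depends on accumulated internal state,
    so the same input never guarantees the same output."""

    def __init__(self) -> None:
        self.history: list[str] = []

    def recommend(self, liked: str) -> str:
        # Every interaction reshapes the state that drives the next result.
        self.history.append(liked)
        return f"{liked}-match-{len(self.history)}"


# The intermediary is perfectly repeatable...
assert calculator(2, 3) == calculator(2, 3) == 5

# ...while the mediator returns a different output for the same input.
app = DatingApp()
assert app.recommend("hiking") != app.recommend("hiking")
```

The point is not the implementation but the shape of the behavior: for the intermediary, input predicts output; for the mediator, “their input is never a good predictor of their output.”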
Moreover, given that actors keep enrolling other actors into programs of action in which they previously did not participate, the actor-network is continuously mobilized and generated. Hence, the actor-network is hybrid, and building an actor-network means making associations and establishing webs of relations between as many heterogeneous actors as possible. Meanwhile, for Latour, meaning is simultaneously constructed in the process of networking. As he notes, “existence and meaning are synonymous. As long as they act, agents have meaning. This is why such a meaning may be continued, pursued, captured, translated, morphed into speech.”(6) In other words, meaning is constantly being extended and reinterpreted as networking carries on, which, to a certain extent, reiterates the significant role of the mediator in this process.
Following Latour’s definition, the term actor should comprise both human users and social media. How, then, should we define human users and social media? Are they mediators or intermediaries? According to Krieger, the digital media revolution has opened up a new possibility of expression to humans and non-humans, which “did away with the private as well as the traditional public spheres and has merged them into a new field of communicative action that we have termed the socio-sphere”(7). Meanwhile, when the isolated subject merges with information and becomes a participant in this socio-sphere, the voiceless objects become participants as well(8). In this sense, both subjects (human users) and objects (social media) are active in the process of communication, and they can no longer be treated merely as platforms or accounts that passively present or transmit information.
In Interpreting Networks (2014), Krieger also introduces the concepts of connectivity and flow from new media studies. Connectivity, according to him, means “being connected via digital media with other people and with various sources of information,” while flow refers to “the movements of contents through the various connections within the network”(9). During this process, the more complex connectivity becomes, the more difficult flow is to predict, control, and steer. The association of connectivity and flow, therefore, is a double-edged sword that uncovers the dark side of new digital media, as viral communication and the loss of privacy demonstrate. Hence, considering that connectivity coexists with the flow of actors continuously making associations, Krieger, combining ANT and new media studies, concludes that flow “opens up black boxes and transforms purely functional intermediaries into actors, participants, and mediators”(10). In other words, not only human users but also new digital media such as social media should be regarded as mediators.
For ANT, the activity of mutual translation is not the prerogative of humans alone. The knife in the hand can make someone into a criminal. By the same token, social media plays an indispensable role in the actions of human users. In the context of ANT, humans and social media become entwined with each other. It is through these associations that actors are configured and reconfigured as other actors, meanwhile extending or generating new actor-networks. More importantly, identifying the roles of human users and social media is a significant step toward delineating their function and power in the digital landscape and in real life. When the Chinese government banned Facebook, Twitter, and YouTube to avert the collapse of its socialist ideology, or when established democracies such as the United Kingdom considered shutting down Facebook and Twitter in response to the social unrest of 2011, social media and its users went beyond the role of intermediary and became a structuring power able either to cultivate or to destabilize the principles of society. Hence, the next section focuses on how governmental agencies, social media, and its users empower each other as mediators, which not only transforms information in the process of circulation but also impacts our society to varying degrees.
3. Governmental Agencies, Social Media, and Users
As an important platform, new digital media has gradually replaced traditional media in the spread of news and information. Although digital media is, to a certain extent, changing and molding today’s public discourse, we must realize that when its function and importance are overemphasized, other mediators, owing to the openness of digital media, also participate in this network to foster the circulation of misleading narratives in both the virtual and real worlds. As Latour suggests, “…an actor-network is what is made to act by a large star-shaped web of mediators flowing in and out of it. It is made to exist by its main ties: attachments are first, actors are second”(11). In other words, when both human and non-human mediators organize themselves into networks, connections, associations, and relations emerge through this communicative process. Hence, to examine how the network constructed by governmental agencies, social media, and users distributes false information, attention should be paid to the interactions between them: when they transport and translate misinformation as mediators, who plays the leading role in the program of action? Or do they play equal roles in challenging social discourse and reshaping social cognition? More importantly, when all three mediators are involved, how, and to what degree, can information be modified and distorted?
3.1 Governmental Agencies
It is not hard to understand why today’s governmental agencies are willing to use social media as a tool to get closer to the public: it is quicker, more effective, and reaches a wider target group. However, when the public turns against social media for spreading misleading rhetoric, they tend to overlook the governmental agencies that hid behind it from the very beginning to govern, control, and manipulate the national discourse.
President Trump is certainly not the first politician to use social media, but he is one of its most prominent users. In “How Trump Reshaped the Presidency in Over 11,000 Tweets,” The New York Times reporter Shear states, “When Mr. Trump entered office, Twitter was a political tool that had helped get him elected and a digital howitzer that he relished firing. In the years since, he has fully integrated Twitter into the very fabric of his administration, reshaping the nature of the presidency and presidential power”(12). Moreover, as Shear highlights, Twitter is President Trump’s instrument of foreign policy, and more than half of his tweets attack his perceived enemies. Meanwhile, he also uses Twitter as a vehicle for “alternative facts” to “spread conspiracy theories, fake information and extremist content, including material that energizes some of his base”(13). His influence on Twitter, however, keeps ballooning: he has more than 77.7 million followers, and the tweets he posts now average more than 20,000 retweets and 75,000 likes each.
In addition to the United States, socialist countries such as China are also adept at manipulating social media to marginalize dissent and unify domestic ideology. Although most foreign social media platforms are blocked in China, the Chinese government still uses them to promote overseas propaganda. In August 2019, China was accused of waging a disinformation war against the Hong Kong protests. On Facebook and Twitter, the protesters were portrayed as mere paid provocateurs, and the demonstrations were branded as a prelude to terrorism. In Hong Kong, it was a large-scale protest movement; in mainland China and abroad, however, the story was replaced with an alternate version: a small, violent group of protesters, provoked by foreign forces and unsupported by Hong Kong residents, was attempting to tear China apart and win Hong Kong’s independence(14). Although Facebook and Twitter both claimed that they had deleted the related accounts, aggressive nationalist and anti-Western sentiment was rapidly stirred up in mainland China within a few weeks. And this is only one instance of disinformation released by the Chinese government in recent years.
The cases of President Trump and the 2019 Hong Kong protests make the leading role of governmental agencies in distributing disinformation obvious. As actors, they enroll social media and the public as other actors in the program of action to achieve certain political goals. Meanwhile, as mediators, they deliberately distort information and publicize it. Without a doubt, spreading disinformation on social media has become a powerful and effective political strategy abused by governmental agencies. However, when social media is blamed for all the consequences, we must realize that, in some cases, governments, or the representatives of governments, should bear the major responsibility for distributing misleading narratives.
3.2 Social Media
In The Filter Bubble (2011), Pariser highlights that companies such as Facebook and Twitter exploit algorithms and abundant flows of data to achieve “personalization.” By collecting data from users and social media, algorithms can accurately predict users’ preferences and behavior, and thus select the most suitable information for each user to receive. During this process, the way information circulates on social media changes, which leads to the “filter bubble,” a mechanism that limits “what we are exposed to and therefore affects the way we think and learn”(15). As Pariser notes, personalized filters can “upset the delicate cognitive balance that help us to make good decisions and come up with new ideas”(16). For instance, in 2018, The New York Times revealed that a company called Cambridge Analytica harvested the data of millions of Facebook users to map out psychological profiles and craft personalized messages to influence the behavior and even the decisions of voters(17). Although there is no proven connection between personalized filters and the result of the 2016 U.S. elections, the impact of social media is palpable. On the one hand, through these technologies, information is only partially presented to users; on the other hand, the scope and target of information are manipulated by social media. In other words, social media becomes an invisible mediator that deliberately misleads users, so that they can only access the content chosen for them. In addition, in “Disinformation, dystopia and post-reality in social media,” Guarda mentions that in 2017, Facebook and Twitter admitted that they had sold their services to Russian operators in order to “spread false information and promote polarization within the North-American society”(18). Hence, it is no wonder that social media is one of the major sources of fake news and false information.
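The self-reinforcing loop behind the filter bubble can be sketched as a toy simulation (illustrative only; the topics, ranking rule, and user model here are invented for this example, not drawn from Pariser):

```python
import random

def personalized_feed(items, click_counts):
    """Rank items so that topics the user clicked most appear first."""
    return sorted(items, key=lambda item: click_counts[item["topic"]], reverse=True)

def simulate(rounds=50, seed=0):
    rng = random.Random(seed)
    topics = ["politics", "sports", "science"]
    items = [{"id": i, "topic": t} for i, t in enumerate(topics * 5)]
    click_counts = {t: 0 for t in topics}
    # One early click seeds a slight preference...
    click_counts[rng.choice(topics)] += 1
    seen = []
    for _ in range(rounds):
        feed = personalized_feed(items, click_counts)
        clicked = feed[0]                      # the user clicks the top item
        click_counts[clicked["topic"]] += 1    # ...which the filter reinforces
        seen.append(clicked["topic"])
    return seen

history = simulate()
```

In this sketch the “algorithm” never lies; it simply optimizes for past engagement, yet the user’s exposure still collapses onto a single topic as the initial preference compounds, which is the narrowing of the cognitive field that Pariser describes.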
While social media has profoundly transformed our means of communication, we should also realize that it is never a passive container of information. Social media, as a mediator, can directly act upon the meaning of information as well as the way we access and engage with it. While people tend to focus on the convenience it brings, we forget that this is only one side of the coin.
3.3 The Users
Since the end of the 2016 EU Referendum, the result of the Brexit vote has been attributed to the “cooperation” of social media and its users. In “Tweeting for Brexit,” Hänska and Bauchowitz collected more than 7.5 million Brexit-related tweets in the month preceding the Referendum and found that “Twitter users who supported leaving the EU were much more active and motivated in advancing their cause, than Remainers were in advocating continued EU membership…meaning both online and offline citizens were more likely to encounter Eurosceptic voices”(19). Therefore, while an avid Eurosceptic was fed fake stories and false information about how inimical the EU was to British democracy, or how millions of pounds could be saved for the NHS by leaving the EU, the Remainers, in Hänska and Bauchowitz’s words, “were scaremongering”(20). Consequently, as the misinformation spread, different versions of the stories appeared. The users of social media, on the one hand, were intimidated by all the unverified sources of information; on the other hand, they never stopped sharing, transforming, and reinterpreting them. According to a BBC news report, Facebook groups were set up and operated to circulate disinformation and misleading news. In the “Brexit Party Supporters group,” more than 6,000 members “engage with and share content linked to far-right and potentially fake accounts, including some apparently US and Russian-oriented profiles.”(21) Another study reveals that, around the day of the EU Referendum, there was a significant increase in the number of tweets created by human users(22). Meanwhile, “individuals are more active in interacting with similar-minded Twitter users”(23). In this sense, when a given sentiment or message feeds the “echo chamber,” users tend to interact with others who share similar political beliefs, which not only reinforces their own beliefs but also affects public opinion by marginalizing outside voices.
Consequently, when “leave” was more popular than “remain” on Twitter, and more pro-leave misinformation was continuously posted, shared, and reposted by users, public discourse became easier to control and change. The case of the Brexit vote shows us how easily public discourse can be manipulated by users who are both consumers and producers of misinformation, and how powerfully the mainstream discourse can influence and convert dissenters.
Although we all live in a world where diverse viewpoints coexist, we have never learned to respect opposing opinions. Under such circumstances, with the help of social media, marginal voices have been further suppressed or silenced. Worse still, by creating an illusion that convinces the public we have reached a consensus, manipulated content can continue to grow. Meanwhile, in the process of digesting information, our emotions and beliefs seem to prevail over verified facts, which gives irresponsible and inaccurate messages a chance to penetrate every corner of not only social media but also our minds. In this sense, we cannot ignore the seamless connection between social media and users: when social media attempts to tell the public how they should think and what they should think about, we should always be aware that our use of and reliance upon social media make us susceptible to this new form of manipulation. In other words, the malleability and manipulability of the human mind have become one of the most dangerous weaknesses, one that can bring devastating outcomes to our society.
Hence, in none of these cases should governmental agencies, social media, or the users be simplified as intermediaries; instead, as mediators, all three actors are capable of modifying, distorting, and transforming the meaning of information. However, it is far from enough to conclude that the nature of mediators and actor-networks is responsible for everything. To a certain extent, they create an external condition that facilitates the process of circulation, yet whether misinformation can be widely spread also depends on the receivers themselves; the internal factor that convinces them to accept misinformation is equally crucial. The questions, therefore, remain: as fake news emerges periodically in recent years, why does our judgment keep failing when we try to distinguish the authenticity of unverified sources? Why is the public so easily attracted by misinformation, even to the point of firmly believing it? And besides mediators and the actor-network, what drives these misleading narratives to spread at such speed?
4. From Matters of Fact to Matters of Concern
In “Post-Truth Politics: Art of the Lie,” an article published in The Economist, the concept of “post-truth” is reinterpreted. According to the article, “post-truth” refers to assertions that “feel true” but have no basis in fact. Given that the truth itself has become of secondary importance, the purpose of “post-truth” is to further reinforce the public’s prejudices(24). In late 2016, Oxford Dictionaries selected “post-truth” as its word of the year, defining it as “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief”(25). In another report, Truth Decay (2018), Kavanagh and Rich highlight four related trends of “truth decay”: “increasing disagreement about facts and analytical interpretations of facts and data; a blurring of the line between opinion and fact; an increase in the relative volume, and resulting influence, of opinion and personal experience over fact; and declining trust in formerly respected sources of factual information”(26). Whether we call it “post-truth” or “truth decay,” both terms unfold a cognitive confusion: what is truth? Does it matter? And do we still care about it? Fortunately, Latour, as a sociologist, asked the very same question in 2004.
Latour wrote “Why Has Critique Run Out of Steam? From Matters of Fact to Matters of Concern” after the spread of misinformation about “weapons of mass destruction in Iraq.” In the essay, he asks, “what were we really after when we were so intent on showing the social construction of scientific facts?”(27). According to his argument, objects are associated with thinginess—a series of extraneous attributes that make objectivity impossible. If an object cannot be disconnected from its thinginess, there can be no objectivity. In other words, the object is wrapped in layers of preconceptions that prevent us from looking straight at it, which not only damages scientific and objective critique but also stops the public from revealing the truth. As Latour states, “Reality is not defined by matters of fact. Matters of fact are not all that is given in experience. Matters of fact are only very partial and, I would argue, very polemical, very political renderings of matters of concern and only a subset of what could also be called states of affairs.”(28) In this sense, given that objectivity is disrupted by thinginess, reality is, in essence, constituted not by matters of fact but by matters of concern. In that case, without objectivity, we should seriously question the existence of fact and truth in reality.
Although people claim to hold the fact or the truth, in Latour’s account, both stem from ideological bias and prejudice rather than incontrovertible evidence. As he suggests, critique paradigmatically consists of two “debunking” moves that show how “naïve believers” are blind to the truth. First, the critics demonstrate that the believers’ naïve faith in objective facts is misplaced: these objects or facts are nothing but manifestations of their own “fetishes.” Second, the critics come to show that those believers’ interests and desires are nothing but projections of deeper structural forces such as history, sociology, and geography. If we contextualize Latour’s argument within the circulation of fake stories and misinformation, the very same tendency can be perceived. First, the outlets of misinformation constantly produce “objective facts” that match the public’s “fetishes.” Second, it is not that people are obsessed with misleading information; they are simply responding to a series of unfinished and unconfronted debates embedded in deeper structural forces such as unequal power, racism, or ethnocentrism. Hence, “objective facts,” as products created to satisfy the public, on the one hand cater to our desires and interests; on the other hand, going beyond fetishes, we project onto them the manifestations, answers, and solutions of other issues, which eventually distracts us from the truth itself and leads us to further reinforce our preferences and prejudices. In other words, the public selectively chooses unverified stories as objective facts on the basis of ideological bias rather than incontrovertible evidence. As Latour points out, “there is no such thing as natural, unmediated, unbiased access to truth, that we are always prisoners of language, that we always speak from a particular standpoint”(29).
The real danger, therefore, no longer comes from excessive confidence in ideological arguments posturing as matters of fact, but from bad ideological biases disguised as good matters of fact.
As two prerequisites, the lack of objectivity and excessive ideological bias not only make conspiracy theories and misinformation more accessible and available, but also allow our personal experiences, prejudices, and emotions to be easily provoked. From the perspective of ANT, as long as these two prerequisites exist, there will always be space for mediators to translate information further, and there will always be growing numbers of “naïve believers” who blindly follow the manipulated “truth.” In the meantime, given that the actor-network continuously enrolls other actors and generates new networks, the spread of disinformation accelerates and its reach is ceaselessly enlarged.
The next section focuses on the information and disinformation issued by both the Chinese and American governments on social media, drawing on ANT to analyze how governmental agencies, social media, and the public work together to translate, distort, and modify the meaning of the messages.
5. COVID-19 Pandemic: Social Media as the Battlefield
As of Monday, April 20, 2020, more than 2.4 million coronavirus cases had been confirmed worldwide, including more than 166,000 deaths. Wuhan, a major city in central China, is where the first cases of unusual pneumonia reported to the WHO emerged. On January 23, Wuhan was placed under lockdown, which ultimately affected a total of 56 million people. On January 30, the WHO officially declared the outbreak of the newly named disease, COVID-19, a global emergency. After that, the worldwide situation worsened, with the pandemic raging in one country after another. And as people around the world were trapped in fear, panic, and anxiety, the conversations on social media changed—discussions shifted from concern for safety and health to resentment, anger, and even hatred. Why? What caused the change in public attitude and opinion?
One of the most controversial debates over the COVID-19 pandemic concerns where it came from. Since almost all the initial cases stemmed from Wuhan, the public tended to believe that Wuhan was the origin. Hence, some American right-wing news outlets began to claim that COVID-19 originated in a biosafety level 4 research laboratory in Wuhan and that the virus had been engineered by humans as a bioweapon. Although prestigious scientific journals such as Nature and The Lancet published studies debunking these conspiracies, the rumors never died, and the strange stories were immediately posted and shared on social media.
After criticizing American officials and media for spreading misinformation, the Chinese government rapidly counterattacked. On official news outlets and social media, it pushed a new theory about the origins of the coronavirus, proclaiming that it had been introduced by members of the United States Army who visited Wuhan in October 2019. Soon after, the insinuation was posted on Twitter by Zhao Lijian, a Chinese foreign ministry spokesman: “When did patient zero begin in US? How many people are infected. What are the names of the hospitals? It might be US army who brought the epidemic to Wuhan. Be transparent! Make public your data! US owes us an explanation.” At a moment when the Chinese government was being criticized by both domestic and international voices for its nontransparent administration, rotten bureaucracy, and poor control of the pandemic, the theory Zhao came up with, by bluntly pointing at the American army, completed the transition from matters of fact to matters of concern: it not only shifted the focus from Wuhan to the United States but also redirected people’s anger from their own government toward an imaginary enemy. The truth about COVID-19 was no longer important; his remarks were quickly disseminated on Chinese social media such as Weibo and WeChat. Within a week, screenshots of the original Twitter posts and the related topics had been viewed more than 150 million times, and in the comment sections, aggressive nationalism and anti-American sentiment soared.
Then, on March 17, President Trump tweeted, “The United States will be powerfully supporting those industries, like Airlines and others, that are particularly affected by the Chinese Virus. We will be stronger than ever before!” The tweet received more than 325,000 likes and over 71,000 comments, roughly half of which defended the naming of the virus and half of which defended President Trump. By replacing “Corona” with “Chinese,” his thread not only drew a growing chorus of criticism but also led to increased discrimination and racism toward Chinese and Asian Americans on social media and in real life. As Gilbert Gee, a professor at UCLA’s Fielding School of Public Health, said in an interview with The Washington Post, “Those statements are, in my mind, a game changer…Now, they’ve basically made it okay to have anti-Asian bias”(30).
Both cases reveal that when the public is fed disinformation and deliberately misled by governments, sentiment eventually prevails over reason. Although the U.S. Army did attend the 2019 Military World Games in Wuhan in October, and China was indeed the first country to alert the WHO, governmental agencies, acting as mediators, rephrased these stories in insinuating ways that distorted the information to varying degrees. Moreover, by pushing notifications, predicting interests, and recommending similar content, social media, as another mediator, attracted and assembled a large number of users who unconsciously became mediators themselves, spreading the misinformation further. Meanwhile, the public’s desires, interests, biases, and prejudices were projected onto these misleading narratives and even lunatic conspiracies. Consequently, matters of fact were downplayed while people were gradually swallowed by their sentiments.
Fortunately, many people have recognized the nature of this combative public diplomacy; the misinformation that has been rewritten and recirculated, however, cannot easily be stopped. Given that social media is where misinformation continues to penetrate, halting its unrelenting spread there has become urgent. Companies such as Facebook, Twitter, and YouTube claim to have worked harder to eliminate malicious accounts and messages; nevertheless, social media remains the engine driving inaccurate messages.
According to a report released by the Reboot Foundation in April 2020, a large amount of misinformation about COVID-19 remains in circulation online; more importantly, “social media is playing a role, promoting a lot of COVID-19 myths as well as a lackadaisical attitude toward the pandemic in general”(31). Another Reboot Foundation study shows that on Twitter, “25 percent of virus-related tweets contained information that was simply wrong. Another 17 percent of tweets disseminated information that could not be verified”(32). Similarly, on Facebook, 28 percent of all health-related posts were inaccurate. However, should we simply blame social media for futile control and regulation? Although social media is responsible for the rampancy of misinformation, it still needs the “cooperation” of its users. In the same study, the Reboot team also found that in March, more than 1,000 tweets about the virus were posted on Twitter almost every minute. The majority of these posts aimed at gaining new information, and at least 9 percent of messages about COVID-19 were asking or answering virus-related questions. In other words, people exchanged information with one another via social media, and ironically, not all of this information was accurate. Hence, the report concludes that while social media appears to be a weak source of information on COVID-19, it did not stop people from posting virus-related content. As long as social media and users remain entangled, coronavirus-related posts will keep booming. As the report highlights, “Social media use appears to drive misinformation around COVID-19. The more time people spend on social media, the more they believe in COVID-19 myths. This pattern was clear in the survey. An increase in social media use correlated with an increase in people being misinformed about the virus.”(33)
If we follow actor-network theory, the mediated relationship among governmental agencies, social media, and users becomes palpable. Each undertakes a distinct role in shaping and translating information. Governmental agencies, as the dominant party, stand at the top of the pyramid, rewriting, distorting, and transforming coronavirus-related messages. Once information is posted on selected social media, the platforms make it more accessible to users: since they seek to capture ever more attention, they routinely suggest materials and groups that users might find interesting. Through this deliberate guidance, users are more likely to gather together, re-modifying and spreading the distorted messages as they do. Furthermore, the more coronavirus-related posts are created, the more actor-networks are formed; as long as users are ceaselessly inscribed into new discussions and debates, actor-networks will never stop being generated. Users thus play the roles of both producer and consumer within a network of many-to-many communication. In short, neither governmental agencies, social media, nor users are instrumentalized into the role of intermediary: they are all mediators.
More importantly, unlike governmental agencies and social media platforms, users are enormous in number, and their potential for growth is promising. At the same time, the nature of social media ensures that users face no excessively high bar to enter public discourse; on social media, free speech entitles everyone, even in relatively closed societies, to express themselves. In this sense, the real purpose of governmental agencies, as mediators, goes beyond inscribing other mediators such as social media and users to spread misinformation: through the rapidly growing number of related posts and users, public discourse itself can be rewritten and manipulated. In “Discourse and Manipulation,” Van Dijk points out that “manipulation is one of the discursive social practices of dominant groups geared towards the reproduction of their power”(34), and that by providing information, dominant groups aim at “influencing the knowledge, beliefs and (indirectly) the actions of the recipients”(35). Hence, once COVID-19 has been politicized, governmental agencies, by delivering disinformation, deliberately blur the boundary between matters of fact and matters of concern, leading the public to shift from concern for the pandemic to a swirl of racism, aggressive nationalism, xenophobia, and more: the public stops seeking the truth about COVID-19 itself and stops questioning the accuracy of the news and information it receives from social media. Guided simply by bias, prejudice, and emotion, people seek out and select the narratives that align with their views.
Whether in China or in the United States, so many stories about the COVID-19 pandemic are covered up: the stories about the struggles of ordinary people, nurses, doctors…yet what strikes us most are usually the most appalling ones. As a series of international and domestic issues intensifies, instead of actively resolving these conflicts, our governments deepen them by feeding misleading rhetoric and spurious information to the public. Hence, we unknowingly slip into a dangerous web woven from lies, hatred, and every other negativity. To the authorities, the untold stories are merely numbers to be buried underground. Why do we need these? They might ask. To maximize political interests, governmental agencies painstakingly decide which stories can be unfolded and which parts need to be repackaged. In other words, the COVID-19 pandemic is a political game disguised as a public health issue and played to achieve diplomatic goals, and this time, social media and the public are the weapons. Consequently, in the process of manipulating public discourse, as the public is exposed to well-designed disinformation, convinced and provoked over and over again, the power of governmental agencies is reproduced and confirmed.
Unfortunately, as the pandemic spreads, the public diplomatic combat does not end. New waves of misinformation continue to be circulated and reinterpreted, and COVID-19 has been translated into a political and social metaphor placed within an ever more complicated framework of public discourse. When the discussion shifts from “what should we do?” to “who should be held responsible?” the game changes: the whole society begins to look for its own scapegoat to blame. After all, it is a “reasonable” vent to unleash resentment, to criticize democracy, to censure opaque bureaucratic systems, or to call for liberalism during quarantine. As Latour suggests, it is never about the matter of fact but about the matter of concern. And if you have fallen into any one of these traps, then either your government wins or mine does.
In 1928, Edward Bernays, the “father of public relations,” wrote the following passage in his book Propaganda:
The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country.
We are governed, our minds are molded, our tastes formed, our ideas suggested, largely by men we have never heard of.(36)
However, in less than a century, the attitude has changed. As technology advances, the power to manipulate and control individuals grows and evolves, and concern over manipulation by governmental agencies resurfaces periodically. That concern is entirely legitimate. The network constructed by governmental agencies, social media, and the public is now treated as a new tool to model our personalities and identify our vulnerabilities without our awareness, allowing them to effectively nudge and shape our emotions, ideas, and expectations. More importantly, rather than out of fear, coercion, or anything from Discipline and Punish, we are now “voluntarily” being controlled and manipulated. In this sense, the new technology is more invisible, more effective, and more powerful.
Unfortunately, when the power of manipulating public discourse is abused by our governments, the public is always the one who swallows the bitter pill. And if we turn to Latour for help, we find, disappointingly, that he never taught us how to escape such power: what should we do to disconnect from the actor-network? How can we stay vigilant against unwanted actor-networks? How do we guard against the distortions of mediators? In my view, if there is no way to weaken the power of governmental agencies, the other actors should at least try to destabilize the network. As potential mediators, members of the public need to recognize that something has been, and will always be, manipulated on social media; and companies such as Facebook and Twitter should be more transparent when cooperating with governmental agencies, while taking strict steps to make fake accounts harder to register and operate and, above all, to keep spurious information from being amplified.
The COVID-19 pandemic is merely the latest example of how social media is exploited to sway the public; we witnessed very similar situations in the 2016 U.S. presidential election and the United Kingdom’s EU membership referendum. But this time, as the pandemic spreads, more and more countries have waged such wars against one another. Social media, as a crucial mediator, is weaponized by governmental agencies to radicalize the public toward extremism. The manipulators’ ultimate goal is to estrange the truth from both the virtual and real worlds, where strong sentiments and values are more likely to prevail. When this collective revelry sweeps across the world, the focus of the discourse changes: it is no longer humans against COVID-19, but humans against humans. In this sense, understanding both the vulnerabilities and the power that manipulators find in this actor-network can help us prepare for the next trap. After all, only by recognizing that we are being manipulated can we hope to avoid taking the bait.
Bernays, Edward L., and Mark Crispin Miller. Propaganda. Brooklyn, N.Y.: Ig Pub., 2005. Print.
Bouygues, Helen Lee. “Going Viral: How Social Media Is Making the Spread of the Coronavirus Worse.” Reboot Foundation, April 2020. https://reboot-foundation.org/going-viral/.
Chiu, Allyson. “Trump has no qualms about calling coronavirus the ‘Chinese Virus.’ That’s a dangerous attitude, experts say.” The Washington Post, March 20, 2020. https://www.washingtonpost.com/nation/2020/03/20/coronavirus-trump-chinese-virus/.
Gorodnichenko, Yuriy, et al. “Social Media, Sentiment and Public Opinions: Evidence from #Brexit and #USElection.” 2018. EBSCOhost, http://www.nber.org.proxy-remote.galib.uga.edu/papers/w24631.pdf.
Guarda, Rebeka F., et al. “Disinformation, Dystopia and Post-Reality in Social Media: A Semiotic-Cognitive Perspective.” Education for Information, vol. 34, no. 3, July 2018, p. 185.
Hänska, Max, and Stefan Bauchowitz. “Tweeting for Brexit: How Social Media Influenced the Referendum.” Brexit, Trump and the Media (Mair, John, et al., eds). Bury St Edmunds, UK, 2017, pp. 31-35. Print.
Kavanagh, Jennifer, and Michael D. Rich. Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life. RAND, 2018. Print.
Krieger, David J., and Andréa Belliger. Interpreting Networks: Hermeneutics, Actor-Network Theory & New Media. Bielefeld: transcript Verlag, 2014. Print.
Latour, Bruno. “Why Has Critique Run out of Steam? From Matters of Fact to Matters of Concern.” Critical Inquiry, vol. 30, no. 2, 2004, pp. 225–248.
Latour, Bruno. Reassembling the Social: An Introduction to Actor-Network-Theory. New York: Oxford University Press, 2005. Print.
Latour, Bruno. “Agency at the time of the Anthropocene.” New Literary History, vol. 45, pp. 1-18, 2014.
Myers, Steven Lee, and Paul Mozur. “China Is Waging a Disinformation War Against Hong Kong Protesters.” The New York Times, August 13, 2019. https://www.nytimes.com/2019/08/13/world/asia/hong-kong-protests-china.html.
Oxford Languages, “Word of the Year 2016.” Oxford University Press, 2016. https://languages.oup.com/word-of-the-year/2016/.
Pariser, Eli. The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think. Penguin Books, 2011. Print.
“Post-truth Politics: Art of the Lie.” The Economist, September 10, 2016. https://www.economist.com/leaders/2016/09/10/art-of-the-lie.
Shear, Michael D., et al. “How Trump Reshaped the Presidency in Over 11,000 Tweets.” The New York Times, November 2, 2019. https://www.nytimes.com/interactive/2019/11/02/us/politics/trump-twitter-presidency.html.
Spring, Marianna, and Lucy Webster. “European Elections: How Disinformation Spread in Facebook Groups.” BBC Newsnight, May 30, 2019. https://www.bbc.com/news/blogs-trending-48356351.
Van Dijk, Teun A. “Discourse and Manipulation.” Discourse & Society, vol. 17, no. 3, 2006, p. 359. Print.
- Latour, Bruno. “Why Has Critique Run out of Steam? From Matters of Fact to Matters of Concern.” Critical Inquiry, vol. 30, no. 2, 2004, 227.
- Latour, Bruno. Reassembling the Social: An Introduction to Actor-Network-Theory. (New York: Oxford University Press, 2005), 72.
- Ibid, 107.
- Ibid, 39.
- Ibid, 107.
- Latour, Bruno. “Agency at the time of the Anthropocene.” New Literary History, vol. 45, 2014, 14.
- Krieger, David J., and Andréa Belliger. Interpreting Networks: Hermeneutics, Actor-Network Theory & New Media (Bielefeld: transcript Verlag, 2014), 187.
- Ibid, 187.
- Ibid, 142.
- Ibid, 143.
- Latour, Bruno. Reassembling the Social: An Introduction to Actor-Network-Theory. (New York: Oxford University Press, 2005), 217.
- Shear, Michael D., et al. “How Trump Reshaped the Presidency in Over 11,000 Tweets.” The New York Times, November 2, 2019. https://www.nytimes.com/interactive/2019/11/02/us/politics/trump-twitter-presidency.html.
- Myers, Steven Lee, and Paul Mozur. “China Is Waging a Disinformation War Against Hong Kong Protesters.” The New York Times, August 13, 2019. https://www.nytimes.com/2019/08/13/world/asia/hong-kong-protests-china.html.
- Pariser, Eli. The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think (Penguin Books, 2011), 83.
- Ibid, 83.
- Guarda, Rebeka F., et al. “Disinformation, Dystopia and Post-Reality in Social Media: A Semiotic-Cognitive Perspective.” Education for Information, vol. 34, no. 3, July 2018, 192.
- Ibid, 191.
- Hänska, Max, and Stefan Bauchowitz. “Tweeting for Brexit: How Social Media Influenced the Referendum.” Brexit, Trump and the Media (Mair, John, et al., eds) (Bury St Edmunds, UK, 2017), 30.
- Ibid, 28.
- Spring, Marianna and Webster, Lucy. “European elections: How disinformation spread in Facebook groups.” BBC Newsnight. May 30, 2019. https://www.bbc.com/news/blogs-trending-48356351
- Gorodnichenko, Yuriy, et al. “Social Media, Sentiment and Public Opinions: Evidence from #Brexit and #USElection.” 2018, 13. EBSCOhost, http://www.nber.org.proxy-remote.galib.uga.edu/papers/w24631.pdf.
- Ibid, 28.
- “Post-truth Politics: Art of the Lie.” The Economist, September 10, 2016. https://www.economist.com/leaders/2016/09/10/art-of-the-lie.
- Oxford Languages, “Word of the Year 2016.” Oxford University Press, 2016. https://languages.oup.com/word-of-the-year/2016/.
- Kavanagh, Jennifer, and Rich Michael D. Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life (RAND, 2018), 21-38.
- Latour, Bruno. “Why Has Critique Run out of Steam? From Matters of Fact to Matters of Concern.” Critical Inquiry, vol. 30, no. 2, 2004, 227.
- Ibid, 232.
- Ibid, 227.
- Chiu, Allyson. “Trump has no qualms about calling coronavirus the ‘Chinese Virus.’ That’s a dangerous attitude, experts say.” The Washington Post, March 20, 2020. https://www.washingtonpost.com/nation/2020/03/20/coronavirus-trump-chinese-virus/.
- Bouygues, Helen Lee. “Going Viral: How Social Media is Making the Spread of the Coronavirus Worse”. Reboot Foundation, April 2020. https://reboot-foundation.org/going-viral/.
- Teun A. Van Dijk, “Discourse and Manipulation,” Discourse & Society, vol. 17, no. 3, 2006, 363.
- Ibid, 363.
- Bernays, Edward L., and Mark Crispin Miller. Propaganda, (Brooklyn, N.Y.: Ig Pub., 2005), 37.