Whether we like it or not, disinformation, or false information designed to mislead us, is everywhere on social media. In fact, according to a recent report by the Oxford Internet Institute, more than 70 countries have used disinformation campaigns to influence national and international opinion and/or to drown out the points of view of their critics. While these governmental efforts to affect our understanding of political issues are often reported on by news outlets (just see the recent reporting on China’s efforts to discredit protesters in Hong Kong), it is harder to sort fact from fiction during crises because the typical channels of information are disrupted. During and immediately following a crisis, it is an information free-for-all, and it is up to us to discern the facts from the falsehoods.
This is no less true of school shootings. During and immediately after a shooting, people flood to social media to find out what’s going on, to check in on loved ones, and to share what they’ve learned with others. However, not all of the information shared during these moments is true, and, occasionally, disinformation shapes how social media users and the broader public understand and talk about the causes of – and solutions to – gun violence.
Who Spreads Disinformation?
My research with two former students (Cynthia Williams and Mackenzie Teek), recently published in New Media & Society, examines one factor that potentially affects the spread of disinformation: opinion leaders, or, in this case, those whom Twitter users deem to have access to credible information after a shooting. In some cases, an opinion leader may be a journalist who has access to law enforcement and is able to share correct information quickly, or it may be someone involved in the incident reporting what they see and know from the scene. In other cases, it may be a troll pretending, for their own amusement, to be on the scene or to have credible information about a crisis.
To explore who emerges as an opinion leader and how opinion leaders influence the quality of the information spread, we analyzed tweets from the week following two crises: the shooting at Florida State University in 2014 and the stabbing at Ohio State University in 2016.
On 20 November 2014 at 12:25 a.m., Myron May, a 31-year-old FSU graduate, went into the university’s library and opened fire with a .380 handgun. May injured three students before he was shot dead by police. In the days following the shooting, it became clear that May was mentally ill; he believed that his behavior was being managed by the US government via mind control. The second “shooting” occurred at OSU. (The incident was initially reported as a shooting, and the hashtag stuck.) On 28 November 2016 at 9:52 a.m., Abdul Razak Ali Artan, an 18-year-old Somali refugee, drove his Honda Civic into a crowd of people, deliberately striking pedestrians. Artan then crashed the Civic and attacked students with a butcher knife, injuring 11. OSU police fatally shot Artan. The motive for Artan’s attack was not clear, and the incident was framed as a potential act of terrorism.
We found that the type of opinion leader shaping the information environment varied dramatically across the two cases (see the table below). In the FSU case, local and national journalists drove the public conversation. This was not true of the OSU case, where trolls, or individuals intentionally circulating disinformation, tweeted and were retweeted the most. This was true despite the fact that trolls accounted for only 1.1% of the 1,662 total users tweeting about the incident.
What Kinds of Information Do Opinion Leaders Share?
In the FSU case, journalists played a critical role in shaping the information environment. Specifically, journalists shared correct information about the incident, which effectively quelled the spread of disinformation. Notice in the table above that four of the top seven most (re)tweeted accounts belong to journalists and a news outlet. Andrew Perez (@PerezLocal10 in the table above), a local journalist who had recently moved from North Florida to Miami, was particularly influential in narrative construction because he was the first journalist to share information about the shooting. His initial reporting included pictures and videos from inside the library. This reporting was critical, particularly in the first hour after the incident, because a conspiracy theorist immediately posted that the shooting was a hoax. An hour-by-hour analysis of the tweets reveals that 22.6% of the 239 tweets sent in the first hour after the shooting contained disinformation about the incident, mostly claims that the shooting was an attempt to strip citizens of their guns, which would make it easier for the “New World Order” to seize global control and exercise authoritarian rule. By the midpoint of the second hour, Perez’s tweets (and those of another journalist) had displaced the disinformation.
In contrast, trolls dominated the discourse in the aftermath of the OSU incident. In fact, four of the top seven most often (re)tweeted accounts belonged to trolls spreading a meme identifying Sam Hyde as the shooter, and all four of these accounts began posting shortly after the incident (see the table above). Sam Hyde is a self-identified comedian who claims credit for shootings on college campuses. The meme is considered humorous among trolls and spreads after school shootings.
Notcurveme and roscoeSBJones, trolls posing as students, identified Sam Hyde as the OSU shooter and included a picture of Hyde holding an AR-15. Ess4emily and rex_caerulus upped the ante by offering political motivations for the attack. Ess4emily attributed the shooting to neo-Nazis, tweeting, “#OSUShooting hearing rumors that neo nazi leader, Sam Hyde is behind this atrocity, please stay safe.” Rex_caerulus instead blamed antifa, posting, “#OSUShooter identified as Hillary supporter and Antifa activist Samuel Hydestein #osushooting.” These were not the only trolls in our sample (we found 18 different accounts posting the Sam Hyde meme), only the most popular.
Does Disinformation Matter?
Yes, it matters. In our research, we look at the relationship between information quality and civility. In the table below, you can see the percentage of tweets that contained correct information, disinformation, misinformation (or unintentionally incorrect information), personal narratives and stories, and polemics. Not only are there dramatic differences in the rates of correct information and disinformation between the FSU and OSU cases, but there is also a difference in the civility of the discussion on Twitter. As the table shows, the largely fact-based conversations associated with the FSU incident were far more civil than the OSU discussions, which were dominated by disinformation and polemics that typically included insults about “libtards” attempting to undermine gun rights.
All of this has implications for democracy. While disinformation and polemics may stimulate a broader public conversation about social concerns such as gun violence, the relative incivility of these narratives, which included polemics and insults, is unlikely to increase users’ tolerance of individuals championing opposing perspectives, which is an important precursor to consensus-building. Conversely, fact-based narratives, particularly those discussing May’s mental health, could assist in consensus-building regarding health care in America. Even the personal narratives shared by students may help those holding opposing points of view on issues such as gun control better understand one another, insofar as these stories can help individuals find areas of unanticipated agreement. Disinformation, in short, is bad for political conversation, political debate, and deliberative processes.
Dr. Deana Rohlinger is a professor of Sociology. She is also a member of the National Institute for Civil Discourse’s Research Network and is grateful to the NICD for supporting this project.