Twitter CEO Jack Dorsey was scrolling through his feed in March of 2016 when he stumbled upon a seemingly innocuous and uplifting tweet: “Rihanna collects her Humanitarian of the Year award from Harvard. She kicked off #WomensHistoryMonth with a bang!”
The tweet came from @Crystal1Johnson, an account claiming to be a young African-American woman from Virginia. Dorsey quickly retweeted this and one other tweet from the account to his over 4 million followers without a second thought.
What he didn’t know at the time was that the tweet and “Crystal Johnson” herself had a much more devious agenda than they let on.
According to a recent UW study, @Crystal1Johnson was a left-leaning Russian troll spreading misinformation to further polarize online debates. Kate Starbird, an assistant professor of Human Centered Design and Engineering, and her research team, including Leo Stewart and Ahmer Arif, showed how several recently suspended Twitter accounts relating to the #BlackLivesMatter movement and the gun debate conspired to deepen political divides in the United States.
According to the study, “Russian trolls not only took advantage of the polarized nature of the information space, but did so in the context of a domestic conversation surrounding gun violence and race relations.”
These accounts were linked to the Russian troll farm known as the Internet Research Agency (IRA), whose intention is to stir controversy by spreading propaganda through fake social media accounts, which would then be retweeted by unsuspecting Americans.
“They’re trying to be caricatures to make one side angry and then make the other side more angry,” Stewart said.
Upon review of @Crystal1Johnson’s account, it became clear that scattered among its more positive posts were deliberate pieces of misinformation regarding the Clinton campaign. In May 2016, the account tweeted: “Clinton’s True Face.. KKK leader claims he gave $20K to Hillary Clinton campaign”
The researchers collected nearly 60 million tweets relating to the gun debate that contained words like “shooting,” “shooter,” “gun shot,” and “gun man.” They then filtered these tweets to those that also included “#BlackLivesMatter,” “#BlueLivesMatter,” or “#AllLivesMatter,” and filtered them again to include only accounts with high retweet activity and reach. Their analysis yielded two distinct clusters of accounts, one left-leaning and one right-leaning.
After establishing these two communities, the last step was to separate the real accounts from the IRA-run troll accounts. The end result showed how often real accounts retweeted content from fake accounts, allowing propaganda to infiltrate online communities with little effort on the part of the trolls.
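The filtering steps the researchers describe can be sketched in a few lines of code. The sketch below is purely illustrative, not the study’s actual pipeline: it assumes tweets arrive as simple dictionaries with hypothetical `text` and `retweet_count` fields, and the retweet threshold is an invented placeholder. Only the keyword and hashtag lists come from the article itself.

```python
# Illustrative sketch of the study's filtering steps (not the actual code).
# Field names ("text", "retweet_count") and the min_retweets threshold
# are assumptions made for this example.

GUN_KEYWORDS = ("shooting", "shooter", "gun shot", "gun man")
FRAME_HASHTAGS = ("#blacklivesmatter", "#bluelivesmatter", "#alllivesmatter")

def filter_tweets(tweets, min_retweets=100):
    """Keep tweets that mention a gun-debate keyword, carry one of the
    three framing hashtags, and show high retweet activity."""
    kept = []
    for tweet in tweets:
        text = tweet["text"].lower()
        if not any(keyword in text for keyword in GUN_KEYWORDS):
            continue  # step 1: must relate to the gun debate
        if not any(hashtag in text for hashtag in FRAME_HASHTAGS):
            continue  # step 2: must use one of the framing hashtags
        if tweet["retweet_count"] < min_retweets:
            continue  # step 3: must have high retweet activity
        kept.append(tweet)
    return kept
```

The remaining analysis, clustering the surviving accounts into left- and right-leaning communities and matching them against the list of known IRA accounts, would build on top of a filter like this one.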
Online trolling has proven itself to be incredibly effective at spreading misinformation. Propaganda is nothing new, but social media, the internet’s global reach, and the anonymity of online accounts have given governments a whole new weapon to disrupt democracy.
“It’s cheap, it doesn’t involve manufacturing weapons or major investment in personnel,” Scott Radnitz, associate professor at the Jackson School of International Studies, said. “Russia has a large number of highly educated people who are skilled in the use of computers including hacking techniques and it turns out, because American politics is very polarized, it’s very easy to stoke and aggravate the existing divisions in the political system.”
Late last week, the United States Department of Justice indicted 13 Russian nationals and three companies associated with the IRA for illegally meddling in the 2016 American presidential election. These efforts continue to impact American politics, with experts like Radnitz fully expecting a similar disinformation campaign to disrupt the 2018 congressional elections.
“The Trump administration is deliberately not taking any measures to stop this,” Radnitz said. “My sense is that because this is perceived to have benefitted Republicans and Republicans perceive that they will benefit again through Russian meddling, they don’t have a political interest in putting a stop to it.”
Radnitz also says tech companies may have some responsibility to address the issue, but he’s not confident they will act either. Russian trolls on Twitter and Facebook were incredibly successful, often garnering thousands of followers and likes, which amplified their divisive messages.
“Twitter, if they wanted to, could put a stop to the bots,” Radnitz said. “They have economic incentives to allow the most information to circulate. The more members they have, the more things that are posted to their sites, the more tweets and likes, the more money they can make from advertisers.”
But while larger entities may take time to address the issue of Russian propaganda, there are steps individuals can take on their own. When consuming information on the internet, Stewart encourages introspection and critically thinking through questions like, “How is this making me feel? Is it productive? Is it informing me or is it just making me angry?”
Radnitz believes the key to the intelligent consumption of information is skepticism. He advises readers to double- and triple-check the news they read and be cognizant that people often believe what they want to believe, choosing to live inside echo chambers of thought.
Reach writer Manisha Jha at firstname.lastname@example.org. Twitter: @manishajha_