Work in progress

Disinformation and censorship

I am currently working on a paper that measures the effectiveness of online censorship in Ukraine as a countermeasure against foreign propaganda and surveillance. In addition, my co-authors and I are working on a separate project that uses digital traces from social media to study gender inequality in diplomacy.

Recent publications

Measuring the scope of pro-Kremlin disinformation on Twitter

(2020) Humanities and Social Sciences Communications, 7(1), 1-11.

“This article examines the scope of pro-Kremlin disinformation about Crimea. I deploy content analysis and a social network approach to analyze tweets related to the region. I find that pro-Kremlin disinformation partially penetrated the Twitter debates about Crimea. However, these disinformation narratives are accompanied by a much larger wave of information that disputes them, and are thus less prevalent in relative terms. The impact of Russian state-controlled news outlets—which are frequent sources of pro-Kremlin disinformation—is concentrated in one highly popular news outlet, RT. The few popular Russian news media have to compete with many popular Western media outlets. As a result, the combined impact of Russian state-controlled outlets is relatively low when compared with its Western alternatives.”

Cross-Platform State Propaganda: Russian Trolls on Twitter and YouTube during the 2016 U.S. Presidential Election

(Co-authored with Cody Buntain, Gregory Eady, Megan A. Brown and Joshua A. Tucker)
(2020) International Journal of Press/Politics, 25(3), 357-389.

“This paper investigates online propaganda strategies of the Internet Research Agency (IRA)—Russian “trolls”—during the 2016 U.S. presidential election. We assess claims that the IRA sought either to (1) support Donald Trump or (2) sow discord among the U.S. public by analyzing hyperlinks contained in 108,781 IRA tweets. Our results show that although IRA accounts promoted links to both sides of the ideological spectrum, “conservative” trolls were more active than “liberal” ones. The IRA also shared content across social media platforms, particularly YouTube—the second-most linked destination among IRA tweets. Although overall news content shared by trolls leaned moderate to conservative, we find troll accounts on both sides of the ideological spectrum, and these accounts maintain their political alignment. Links to YouTube videos were decidedly conservative, however. While mixed, this evidence is consistent with the IRA’s supporting the Republican campaign, but the IRA’s strategy was multifaceted, with an ideological division of labor among accounts. We contextualize these results as consistent with a pre-propaganda strategy. This work demonstrates the need to view political communication in the context of the broader media ecology, as governments exploit the interconnected information ecosystem to pursue covert propaganda strategies.”

Mapping (Dis-) Information Flow about the MH17 Plane Crash

(Co-authored with Mareike Hartmann and Isabelle Augenstein)
(2019) Paper published in the proceedings of the Workshop on NLP for Internet Freedom (NLP4IF) at the Conference on Empirical Methods in Natural Language Processing (EMNLP).

“Digital media enables not only fast sharing of information, but also disinformation. One prominent case of an event leading to circulation of disinformation on social media is the MH17 plane crash. Studies analysing the spread of information about this event on Twitter have focused on small, manually annotated datasets, or used proxies for data annotation. In this work, we examine to what extent text classifiers can be used to label data for subsequent content analysis; in particular, we focus on predicting pro-Russian and pro-Ukrainian Twitter content related to the MH17 plane crash. Even though we find that a neural classifier improves over a hashtag-based baseline, labeling pro-Russian and pro-Ukrainian content with high precision remains a challenging problem. We provide an error analysis underlining the difficulty of the task and identify factors that might help improve classification in future work. Finally, we show how the classifier can facilitate the annotation task for human annotators.”

State, media and civil society in the information warfare over Ukraine: citizen curators of digital disinformation

(Co-authored with Mareike Hartmann and Rebecca Adler-Nissen)
(2018) International Affairs, 94(5), 975-994.

“This article explores the dynamics of digital (dis)information in the conflict between Russia and Ukraine. International Relations scholars have presented the online debate in terms of ‘information warfare’—that is, a number of strategic campaigns to win over local and global public opinion, largely orchestrated by the Kremlin and pro-western authorities. However, this way of describing the online debate reduces civil society to a mere target for manipulation. This article presents a different understanding of the debate. By examining the social media engagement generated by one of the conflict's most important events—the downing of the Malaysian Airlines Flight 17 (MH17) over Ukraine—we explore how competing claims about the cause of the plane crash are disseminated by the state, media and civil society. By analysing approximately 950,000 tweets, the article demonstrates how individual citizens are more than purveyors of government messages; they are the most active drivers of both disinformation and attempts to counter such information. These citizen curators actively shape competing narratives about why MH17 crashed and citizens, as a group, are four times more likely to be retweeted than any other type of user. Our findings challenge conceptualizations of a state-orchestrated information war over Ukraine, and point to the importance of citizen activity in the struggle over truths during international conflicts.”