Disinformation and Democratic Transition: A Kenyan Case Study

Technological developments have strengthened atrocity prevention efforts; however, these technologies can also be harnessed to exploit vulnerabilities and grievances during election periods.

By Gillian McKay


Executive Summary

Election-related disinformation is a widespread phenomenon with the potential to contribute to mass atrocity crimes globally. Such threats to peaceful political transitions have become more common in recent years owing to the weaponization of online media platforms to stoke division and polarize communities during these tense voting periods. Social media in particular has been used by dictators, demagogues, and democratic leaders alike who wish to influence or control political narratives and electoral outcomes through increasingly sophisticated disinformation campaigns. This is concerning given that political transitions already present a particular challenge in democracies where the risk of mass atrocity is high, and where both foreign and domestic actors have used social media to shape or sway public opinion and influence political decisions. In this context, the instant and widespread reach of election-related disinformation can act as a threat multiplier and ultimately strain preventative efforts, and little is known about how to mitigate the offline harms associated with this online phenomenon, particularly in fragile and emerging democracies.

Far from being a new phenomenon, records of political disinformation date back centuries to ancient Rome when Mark Antony’s rival, Octavian, launched a smear campaign with ‘Twitter-worthy slogans etched onto coins’.1  Note: Posetti, Julie, and Alice Matthews. “A Short Guide to the History of ‘Fake News’ and Disinformation.” International Center for Journalists 7 (2018).    The advent of the print press, and later broadcast radio and television, saw the proliferation of ‘fake news’ stories the world over, and rapid technological developments in more recent decades have altered the information environment in which electorates access critical news and updates. In 2016, social media was infamously exploited for political and financial gain in the British referendum to leave the European Union and later the presidential election of Donald J. Trump in the United States. Although these events have been central to discussions of political disinformation since, the problem of election-related disinformation infects at least 60 other countries worldwide (see Appendix 1); many of which are reported to be at grave risk of mass atrocity crimes.

 Kenya provides an important case study, given its history of ethnic and election-related violence, as well as a record of foreign and domestic interference in presidential votes. Data mining company Cambridge Analytica used the Kenyan elections in 2013 as a testing ground of sorts for its influential campaigns elsewhere in the years that followed, and again in 2017 to secure the ruling Jubilee Party’s grip on power. Moreover, one third of the country’s total population2  Note: Communications Authority of Kenya, “Second Quarter Sector Statistics Report for the Financial Year 2021/2022,” accessed May 5, 2022, https://www.ca.go.ke/wp-content/uploads/2022/03/Sector-Statistics-Report-Q2-2021-2022-.pdf.    now use their devices to access political news and updates, with social media platform Twitter having been described as the “preferred space for political discourse online.”3  Note: Nanjala Nyabola, Digital Democracy, Analogue Politics: How the Internet Era Is Transforming Politics in Kenya (London: Bloomsbury Publishing, 2018), 89.    Twitter has also been at the center of a growing “disinformation industry” in the country, whereby social media influencers and ordinary Kenyan citizens are being paid to share false or misleading narratives intended to sway public opinion and influence political decisions. It is within this context that the integrity of the August 2022 vote and a peaceful political transition are under threat.

This issue brief provides some practical recommendations that could pave the way for more collaborative approaches to preventing election-related atrocity crimes fueled by online disinformation campaigns. In particular, there is a need to improve how we understand and manage the online information environment around elections, specifically in how we educate digital citizens and regulate the digital space. These recommendations should be considered as a starting point for discussions on how stakeholders can better mitigate the problem of election-related disinformation and prevent related offline harms in contexts where the risk of mass atrocity is high.

Introduction

Mass atrocity crimes are fast becoming the “new normal”4  Note: Philippe Bolopion, “Atrocities as the New Normal: Time to Re-Energize the ‘Never Again’ Movement,” Human Rights Watch, 2018, accessed May 5, 2022, https://www.hrw.org/news/2018/12/10/atrocities-new-normal.     in a world characterized by creeping authoritarianism, rising temperatures, and unprecedented digital transformation. Where they have not already been displaced or otherwise affected by such crimes, populations in Europe, South America, Africa, and Asia currently face imminent or serious risk of widespread and systematic human rights abuses from state and non-state actors. While atrocity risk factors and threat multipliers may change over time, the pathology of mass violence generally does not. But it can — and must — be prevented.

Rapid technological developments go some way toward strengthening early warning systems for preventative action, including where hate speech can be detected and targeted counter-speech campaigns deployed to mitigate offline harms.5  Note: For example, see the work of HateLab: https://hatelab.net/publications/.     But, as some malign actors have shown, these technologies can also be harnessed to exploit vulnerabilities and exacerbate underlying grievances. The use of social media to stoke division and polarize communities is not, however, unique to the toolbox of dictators and demagogues; it is also a means by which seemingly democratic leaders can influence or control political narratives and outcomes through increasingly sophisticated disinformation campaigns. This is concerning given that political transitions already present a particular challenge in democracies where the risk of mass atrocity is high. The United Nations Framework of Analysis for Atrocity Crimes in fact identifies elections as one atrocity risk indicator: a trigger event that “even if seemingly unrelated to atrocity crimes, may seriously exacerbate existing conditions or may spark their onset.”6  Note: United Nations, “Framework of Analysis for Atrocity Crimes,” accessed May 5, 2022, https://www.un.org/en/genocideprevention/documents/about-us/Doc.3_Framework%20of%20Analysis%20for%20Atrocity%20Crimes_EN.pdf.     

Through the lens of a single case study, this issue brief seeks to unpack the role that states, civil society, and social media can play in preventing mass atrocity crimes during electoral periods. Given the presence of various atrocity risk indicators (see Appendix 2),7  Note: An assessment by the author in May 2022 using the UN Framework found 26 risk indicators across 7 risk factors (see Appendix 2). In February 2022, Kenya’s NCIC identified 23 counties at risk of election-related violence (National Cohesion and Integration Commission, “The Inauguration of State of Peace and Political Decency in the Run Up to the 2022 General Election,” 2022, accessed  May 5, 2022, https://cohesion.or.ke/index.php/media-center/press-releases-speeches/396-media-brief-the-inauguration-of-state-of-peace-and-political-decency-in-the-run-up-to-the-2022-general-elections-friday-18th-february-2022).    including a history of and impunity for election-related violence, as well as existing ethnic and land grievances and growing class divisions,8  Note: See Sentinel Project:  https://thesentinelproject.org/2021/03/29/the-2022-kenyan-general-election-an-analysis-of-new-and-enduring-violence-risk-factors/.     Kenya will be the primary focus of this brief. As will be discussed, Kenyan elections over the last decade have suffered from domestic and foreign interference, and the rise of a coordinated “disinformation industry” in the country now poses a particular threat to a peaceful political transition in August 2022. Moreover, as Twitter in Kenya has been described as the “preferred space for political discourse online,”9  Note: Nyabola, Digital Democracy, 89.   the brief will focus on the role that this platform plays in exacerbating and mitigating atrocity risks specifically.

This paper draws largely on desk-based research as well as semi-structured interviews with four subject experts, including nationals of Kenya and the United Kingdom as well as experts based in North America. Interviews were conducted virtually on the condition of anonymity, with some participants agreeing to speak wholly off the record. Due to the limited time available for this research, interviews could not be conducted with social media companies or government officials, potentially limiting the findings in this paper. Many of the recommendations in this brief have been informed by perspectives shared from within the United Kingdom’s civil service and national and international non-governmental organizations.

The first section examines relationships between disinformation, elections, and mass atrocity crimes, while the second provides an overview of the Kenyan context, and the third explores the political use of Twitter in the country. The final section concludes the brief and offers some practical recommendations for stakeholders in both the short and long term.

Election-Related Disinformation and the Risk of Mass Atrocity

Although this issue brief focuses only on disinformation, it is important to understand the overlaps and distinctions between each type of “information disorder” that might be encountered online — disinformation, misinformation, and malinformation. Drawing on the definitions proposed by Claire Wardle and Hossein Derakhshan,10  Note: Claire Wardle and Hossein Derakhshan, “Information Disorder: Toward an Interdisciplinary Framework for Research and Policymaking,” Council of Europe, 2018.    disinformation here is characterized by the spread of deliberately false or misleading information primarily for political or financial gain, while misinformation may be false or misleading information that has been shared inadvertently. Malinformation, on the other hand, refers to information that is shared with a deliberate intent to cause harm, like hate speech. Thus, what could begin as a coordinated disinformation campaign might also quickly evolve into the spread of mis- or malinformation.

The information environment that exists online today, while it presents many distinct challenges, still relies on a fundamentally human capacity to genuinely accept false information as fact. Indeed, some of the twentieth century’s most infamous atrocities were also preceded by sophisticated disinformation campaigns, albeit through print and radio rather than the internet, which now plays a more central role. The Nazis, for instance, used print press to portray Jews as a threat to the German Reich,11  Note: United States Holocaust Memorial Museum, “Holocaust Encyclopedia: Nazi Propaganda,” accessed May 5, 2022, https://encyclopedia.ushmm.org/content/en/article/nazi-propaganda#the-role-of-newspapers-2.    while the Rwandan government used the radio to warn Hutu citizens that their Tutsi neighbors were mobilizing against them.12  Note: Christine L. Kellow and H. Leslie Steeves, “The Role of Radio in the Rwandan Genocide,” Journal of Communication 48, no. 3 (1998): 107–128.    While the relationship is not necessarily causal, these examples and others highlight an important link between the spread of disinformation and the commission of mass atrocity crimes. Social media is just the latest vehicle by which such disinformation can be shared unchecked, and the nature of these platforms means that this information can spread further and faster than was possible before the age of instant online communication. Moreover, the ability to network with individuals outside our proximal spheres of social influence — what Jamie Bartlett has referred to as “ever-smaller units of like-minded people”13  Note: Jamie Bartlett, The People vs Tech: How the Internet Is Killing Democracy (and How We Save It) (London: Random House, 2018), 47.    — only amplifies the risk that what we encounter in the digital world entrenches our (often harmful) preexisting beliefs and biases.

At the core of this problem is the human vulnerability that makes us susceptible to disinformation. Research suggests that it is the emotional response elicited by content that tends to shape what we share online, rather than how closely aligned it might be to the truth, particularly when it comes to political news.14  Note: Soroush Vosoughi, Deb Roy, and Sinan Aral, “The Spread of True and False News Online,” Science 359, no. 6380 (2018): 1146–1151.    This includes content that might be presented as a “careful one-sided selection of the truth” to explain real or perceived injustices experienced by far-right groups in the United Kingdom, for example.15  Note: Bartlett, The People vs Tech, 56.    The 2016 election campaigns in the United Kingdom and the United States also both drew heavily on anti-immigration and other divisive rhetoric, “activating” what academics have described as “polarized social identities (political, ethnic, national, racial and religious, etc.),” as well as “exploiting real or perceived political, economic, religious, or cultural wrongs and/or leveraging low institutional trust.”16  Note: Vosoughi, Roy, and Aral, “The Spread of True and False News Online.”    Recognizing the threat that disinformation campaigns could therefore pose to democratic integrity, these governments went on to develop a greater understanding of the disinformation “tradecraft” (i.e., perpetrator “tactics, techniques and procedures”) and therefore how to better counter it, with one civil servant describing the relative non-interference in the 2019 UK and EU elections as evidence that lessons have been learned and integrated since the 2016 experience.17  Note: Author interview, March 2022.     
Preparation is said to have played a central role in deterring external interference in these European elections, but as the US Capitol riots in January 2021 showed, it is domestic political leaders themselves that often drive election-related disinformation campaigns.18  Note: Nate Ranner, “Trump’s Election Lies Were Among His Most Popular Tweets,” CNBC, accessed May 5, 2022, https://www.cnbc.com/2021/01/13/trump-tweets-legacy-of-lies-misinformation-distrust.html.    

There is also little understanding of how lessons for countering election-related disinformation might be exported to emerging or comparatively fragile democracies, or where a country might be considered only “partly free.” Indeed, although democratic societies are overall less likely to experience mass atrocities,19  Note: Stephen McLoughlin, “Understanding Mass Atrocity Prevention During Periods of Democratic Transition,” Politics and Governance 3, no. 3 (2015): 27–41.    it is in fact elections that often prompt the kind of widespread and systematic human rights abuses that constitute these crimes. Academics have suggested that this has been the case in Myanmar and the Democratic Republic of Congo, where an overemphasis on democratization to bring about peaceful solutions to complex local problems has failed to acknowledge the very risk that political transitions represent in themselves.20  Note: Kate Ferguson, “For the Wind Is in the Palm-Trees: The 2017 Rohingya Crisis and an Emergent UK Approach to Atrocity Prevention,” Global Responsibility to Protect 1, no. aop (2021): 1–28; Séverine Autesserre, The Frontlines of Peace: An Insider’s Guide to Changing the World (New York: Oxford University Press, 2021).     Given the tendency of elections to politicize identity, polarize communities, and exacerbate underlying grievances, political disinformation can act as a threat multiplier for mass atrocity crimes during these very tense voting periods.

Kenyan Elections

Kenya is a relatively stable country in the East and Horn of Africa, sharing borders with Somalia, where the insurgent group al-Shabaab conducts frequent and deadly terror attacks throughout the country and often the wider region; with Ethiopia, where the Abiy administration stands accused of crimes against humanity and genocide in the northern Tigray region; and with Sudan and South Sudan, where various state and non-state actors have been implicated in mass atrocity crimes for more than a decade. This is not to say that Kenya is without its own problems. Although it has been a democratic state since gaining independence from Britain in 1963, Freedom House considers Kenya to be only “partly free” — having assigned a score of 48/100 compared to the United Kingdom’s 93/10021  Note: Freedom House, “Countries and Territories,” accessed May 5, 2022, https://freedomhouse.org/countries/freedom-world/scores.    — due at least in part to questions over the extent to which the country’s presidential elections are free and fair.22  Note: Freedom House, “Kenya: Freedom in the World 2021,” accessed May 5, 2022, https://freedomhouse.org/country/kenya/freedom-world/2021.  

Elections in Kenya have historically run along ethnic lines, but the introduction of multiparty politics in 1991 saw leaders rely more on violence to exert and protect their power in certain ethnic strongholds.23  Note: Philip Waki, Gavin McFadyen, and Pascal Kambale, “Commission of Inquiry into the Post-Election Violence (CIPEV),” Nairobi, Kenya, 2008.    Kalenjin President Daniel arap Moi (1978–2002), for example, used violence to suppress Kikuyu and Luo opposition during the 1992 and 1997 elections, while Kikuyu President Mwai Kibaki (2002–2013) was accused of stirring up ethnic tensions against Luo communities in the early 2000s.24  Note: Waki et al., CIPEV, 24, 28.    A buildup of tensions over this period is said to have contributed to the post-election violence that engulfed the country after the contested 2007 vote, in which Kibaki was accused by Luo opposition candidate Raila Odinga of having “stolen” the election.25  Note: Waki et al., CIPEV, 196.    In the first few months of 2008, an estimated 1,500 people were killed, at least 900 men, women, and children were treated for sexual violence, and hundreds of thousands more were displaced until a power-sharing agreement, later entrenched by a new constitution, brought the violence to an end.26  Note: Waki et al., CIPEV.     Although it is disputed whether the Responsibility to Protect (R2P) was a driving force behind the international response to the crisis,27  Note: Graham Harrison, “Onwards and Sidewards? The Curious Case of the Responsibility to Protect and Mass Violence in Africa,” Journal of Intervention and Statebuilding 10, no. 2 (2016): 143–161.    experts in the field have argued that the case is exemplary of a “successful” intervention under the doctrine’s third pillar for “timely and decisive response”: “prevention, but hardly of an ideal sort.”28  Note: Edward Luck, “The Responsibility to Protect: Growing Pains or Early Promise?,” Ethics & International Affairs 24, no. 4 (2010): 352.  

During the post-election crisis, violence was mostly incited using inflammatory language broadcast through vernacular radio and other local media. A month-long ban on live reporting was then announced,29  Note: Abdullahi Boru Halakhe, “ ‘R2P in Practice’: Ethnic Violence, Elections and Atrocity Prevention in Kenya,” Global Centre for the Responsibility to Protect, Occasional Paper Series 4 (2013).    prompting many Kenyans to turn to the web to promote peace or spread malice where they were unable to do so using more traditional outlets. In response, the government’s newly formed National Cohesion and Integration Commission (NCIC) promised to prioritize the mitigation of online hate speech in future elections.30  Note: Halakhe, “R2P in Practice.”    Nevertheless, online forums like Mashada and Concerned Kenyan Writers became hotbeds of debate, and as social media platforms also gained popularity in the years that followed, along with increasingly available mobile internet, unverified content became a key source of information for many Kenyans throughout the country.31  Note: Nyabola, Digital Democracy, 27.  

While the 2013 elections — in which Kikuyu President Uhuru Kenyatta was voted into power — saw comparatively less violence than the 2007 vote, many of the same drivers of violence, such as land grievances and impunity for past crimes (including Kenyatta’s own indictment at the International Criminal Court), remained largely unaddressed. Social media platforms were harnessed in the 2013 campaign strategies, with both presidential candidates — Kenyatta and Odinga — having already established a notable presence on Twitter at the time.32  Note: David Smith, “Africa’s Top 10 Tweeting Politicians,” The Guardian, 2012, accessed May 5, 2022, https://www.theguardian.com/world/2012/oct/30/africa-twitter-blogs-politicians.    Moreover, as Nanjala Nyabola has noted, it was in the run-up to this election that British firm Cambridge Analytica surveyed a selection of the Kenyan electorate to profile their voting behavior and understand their primary needs and fears.33  Note: Nyabola, Digital Democracy, 160.    The findings were then used to inform Kenyatta’s campaign strategy, and his Jubilee Party went on to sign another multimillion-dollar contract to influence the 2017 outcome, drawing on additional lessons learned from the Trump and Brexit votes. During this period, one poll found that 87 percent of 2,000 Kenyans surveyed had encountered disinformation in the lead-up to the August election, including 35 percent who felt they were subsequently unable to make an informed voting decision.34  Note: GeoPoll and Portland, “The Reality of Fake News in Kenya,” 2018, accessed May 5, 2022, https://www.geopoll.com/blog/geopoll-and-portland-launch-a-survey-report-on-fake-news-in-kenya/.     
The August results were nullified due to allegations of fraud, and a fresh vote was scheduled for October, during which time the proportion of Kenyan politicians influencing conversations on Twitter doubled (from 3% to 7%),35  Note: Portland, “How Africa Tweets 2018,” accessed May 5, 2022, https://portland-communications.com/pdf/How-Africa-Tweets-2018.pdf.     suggesting a growing emphasis on the platform as a space for political organization and as a tool for setting political agendas.36  Note: Nyabola, Digital Democracy, 95.    

More recently, the NCIC identified six key “roadblocks” to peaceful elections in 2022: namely, structural inequalities, delayed and uncoordinated conflict response mechanisms, a lack of trust in institutions and between communities, self-interested leadership, subcultures of violence, and ethnic polarization.37  Note: National Cohesion and Integration Commission, “A Violence Free 2022: Roadmap to Peaceful 2022 General Elections,” 2020, accessed May 5, 2022, https://cohesion.or.ke/images/docs/downloads/NCIC_Roadmap_to_Peaceful_Elections_in_Kenya.pdf.    Although the Commission’s report acknowledges the role that social media can play in exacerbating these threats, it is limited to the effects of mis- or malinformation on ethnic polarization. While these issues are important in the context of mass atrocity prevention, the roadmap and its subsequent updates38  Note: National Cohesion and Integration Commission, “The Third State of Peace and Political Decency in the Run Up to the 2022 General Elections,” 2022, accessed May 5, 2022, https://cohesion.or.ke/index.php/media-center/press-releases-speeches/406-the-third-state-of-peace-and-political-decency-in-the-run-up-to-the-2022-general-elections; National Cohesion and Integration Commission, “The Second State of Peace and Political Decency in the Run Up to the 2022 General Election,” 2022, accessed May 5, 2022, https://cohesion.or.ke/index.php/media-center/press-releases-speeches/403-the-second-state-of-peace-and-political-decency-in-the-run-up-to-the-2022-general-elections; National Cohesion and Integration Commission, “The Inauguration of State of Peace.”    ultimately fail to address concerns raised by election-related disinformation specifically, despite evidence of a growing disinformation industry in the country that poses a critical threat to the integrity of the upcoming elections.

Kenyans on Twitter

Although Facebook remains the most popular social media app in Kenya after the closed messaging platform WhatsApp (also owned by Meta, Facebook’s parent company), the number of Facebook users has declined in recent years as Twitter gains popularity, particularly among urban dwellers aged 26–35.39  Note: Patrick Kanyi Wamuyu, “The Kenyan Social Media Landscape: Trends and Emerging Narratives, 2020,” SIMELab, 2020, accessed May 5, 2022, https://www.usiu.ac.ke/assets/image/Kenya_Social_Media_Lanscape_Report_2020.pdf.     Twitter has been identified as a key platform for political discussion in Kenya,40  Note: Nyabola, Digital Democracy, 89.    while one study found in 2016 that Africans in general tend to be more politically active on the app than users in the Northern Hemisphere.41  Note: Nyabola, Digital Democracy, 5.      This is particularly noteworthy given that Twitter users are more likely to interact with people they don’t know personally, compared to Facebook users, who tend to engage more with those they do.42  Note: Nyabola, Digital Democracy, 94. The study in question was undertaken in the United States by the Pew Research Center: available from https://www.pewresearch.org/internet/2016/10/25/political-content-on-social-media/.    In this way, Twitter users are likely to gravitate toward content and other users that align more closely with their preexisting beliefs and biases, further amplifying real or perceived differences and also reducing any potential corrective impact of fact-checking on false or misleading information.43  Note: Erik Nisbet and Olga Kamenchuk, “The Psychology of State-Sponsored Disinformation Campaigns and Implications for Public Diplomacy,” Hague Journal of Diplomacy 14, no. 1–2 (2019): 65–82.  

Twitter is a useful tool for those seeking to influence or control political narratives specifically because of the widespread reach that it offers its users. Moreover, the platform provides one of the most instant means by which users can keep up to date with unfolding events or crises from sources close to the ground — like when al-Shabaab attacked Nairobi’s Westgate shopping center in 2013 or Garissa University in 2015.44  Note: Nyabola, Digital Democracy, 91–93.    In the volatile context of the 2017 elections, fake polls and other manipulated content — which often tapped into ethnic and other grievances, and which was created, produced, and distributed online by both political camps — reached millions of voters in a matter of minutes, polluting the information environment with unchecked claims and confusing the electorate ahead of the August vote.45  Note: Jacinta Mwende Maweu. “ ‘Fake Elections’? Cyber Propaganda, Disinformation and the 2017 General Elections in Kenya,” African Journalism Studies 40, no. 4 (2019): 62–76.    The ruling Jubilee Party in particular sought to build an illusion of consensus around President Kenyatta and heighten political tensions in opposition strongholds through fake news headlines and videos, as well as fake articles and interviews attributed to reputable national and international news organizations.46  Note: Maweu, “Fake Elections?”; see also Patrick Mutahi and Brian Kimari, “Fake News and the 2017 Kenyan Elections,” South African Journal of Communication Theory and Research 46, no. 4 (2020): 37.  

One Twitter feature that has helped politically motivated actors to amplify the reach of their campaigns is the hashtag, which plays a key role in connecting otherwise unconnected communities who are commenting on a certain topic. The #KOT (Kenyans on Twitter) hashtag, for instance, has over the years established itself as an online community with “a notable reputation for scaling up various political and social issues and stories . . . [that] routinely make ‘news’ in the mainstream legacy press.”47  Note: George Ogola, “#Whatwouldmagufulido? Kenya’s Digital ‘Practices’ and ‘Individuation’ as a (Non)Political Act,” Journal of Eastern African Studies 13, no. 1 (2019): 127.    Other hashtags tap into ethnic tensions that underscore the political spheres of influence in Kenya. #Madoadoa, for example, refers to the term “spot” or “blemish” that was recently uttered by Senator Mithika Linturi at a campaign rally in Eldoret for Kalenjin candidate (current Deputy President) William Ruto.48  Note: Author interview, March 2022; see also Victor Abuso, “Kenya: Politicians on Notice after Dozens of Political Dialects Banned,” Africa Report, 2022, accessed May 5, 2022, https://www.theafricareport.com/193658/kenya-politicians-on-notice-after-dozens-of-political-dialects-banned/.     The term is particularly sensitive in the region given its association with the Kalenjin-led atrocities that saw scores of ethnic Kikuyus burned alive on January 1, 2008 while seeking refuge in Kiambaa church49  Note: Waki et al., CIPEV.     — an incident that Ruto later described as “an accidental kitchen fire.”50  Note: WikiLeaks, “A/S Carson and NSC Senior Director Gavin’s Meeting with Agricultural Minister Ruto,” 2009, accessed May 5, 2022, https://wikileaks.org/plusd/cables/09NAIROBI1083_a.html.  

Another feature worth noting is the platform’s “trending” algorithm. Twitter’s algorithms generally prioritize content for a user based on their activity relative to certain topics. Given that online users tend to engage more with content that elicits a strong emotional response,51  Note: Vosoughi, Roy, and Aral, “The Spread of True and False News Online.”    algorithms are able to exploit this vulnerability and influence users by prioritizing content that appears in our individual online “feeds.” The type of ads that we see on social media platforms, whether encouraging consumers to buy a certain product or electorates to vote a certain political party into power, is largely underpinned by data about our location, search history, and the type of content that we have previously engaged with. When it comes to disinformation campaigns, social media companies’ structures and policies currently allow perpetrators to target specific audiences with specific interests, and thereby reinforce preexisting beliefs or biases among a target community.52  Note: Bartlett, The People vs Tech.     Twitter’s “trending” feature then amplifies the reach of this engagement further by increasing the visibility of the content, making it a key performance indicator for disinformation influencers who make up Kenya’s growing disinformation industry.53  Note: Odanga Madung and Brian Obilo, “Inside the Shadowy World of Disinformation-for-Hire in Kenya,” Mozilla, 2021, accessed May 5, 2022, https://foundation.mozilla.org/en/blog/fellow-research-inside-the-shadowy-world-of-disinformation-for-hire-in-kenya/.     
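The engagement-driven dynamic described above can be made concrete with a toy sketch. The weights, field names, and scoring formula below are assumptions for illustration only; they are not Twitter’s actual trending algorithm, which is proprietary. The sketch simply shows how a ranking that rewards raw engagement, with reshares weighted most heavily, will surface emotionally charged and heavily amplified content regardless of its accuracy.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class Post:
    """A single post, reduced to its hashtag and engagement counts."""
    hashtag: str
    likes: int
    reshares: int
    replies: int


def trending_scores(posts, weights=(1.0, 2.0, 1.5)):
    """Rank hashtags by a naive engagement score.

    Reshares are weighted most heavily (hypothetical weights),
    reflecting research that emotionally charged content is
    shared more readily than accurate content.
    """
    w_like, w_reshare, w_reply = weights
    scores = Counter()
    for p in posts:
        scores[p.hashtag] += (
            w_like * p.likes + w_reshare * p.reshares + w_reply * p.replies
        )
    return scores.most_common()  # highest-scoring hashtag first


# Hypothetical posts: a heavily reshared campaign hashtag outranks
# ordinary news content despite receiving fewer likes.
posts = [
    Post("#ElectionNews", likes=10, reshares=2, replies=1),
    Post("#JusticeForSale", likes=4, reshares=30, replies=12),
    Post("#ElectionNews", likes=8, reshares=1, replies=0),
]
ranking = trending_scores(posts)
```

Under this scoring, the coordinated, reshare-heavy hashtag tops the ranking, which is precisely why getting content to “trend” is treated as a key performance indicator by paid influencers.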

A series of studies published by the Mozilla Foundation recently provided insight into the tradecraft behind this industry. In early 2021, for example, many Kenyans were being paid — mostly through their mobile phones (using the payment platform M-Pesa) — to promote predefined hashtags using multiple “sock puppet” (fake) accounts, creating an illusion of popular support for proposed constitutional reforms.54  Note: Madung and Obilo, “Inside the Shadowy World of Disinformation-for-Hire in Kenya.”    The initiative in question, proposed by President Kenyatta in partnership with his former political rival Odinga, was seen as controversial because it could have secured Kenyatta a prime ministerial role at a point when he was unable to run for a third presidential term. While the Supreme Court of Kenya dismissed the initiative as unconstitutional,55  Note: Al Jazeera, “Kenya: Supreme Court Blocks Controversial Constitutional Reforms,” 2022, accessed May 5, 2022, https://www.aljazeera.com/news/2022/3/31/kenyas-top-court-to-rule-on-contested-constitutional-reforms.     many of those who opposed the reforms were targeted on Twitter by political disinformation campaigns (e.g., using #JusticeForSale) designed to discredit the independence of the judiciary and promote the Kenyatta-Odinga initiative.56  Note: Madung and Obilo, “Inside the Shadowy World of Disinformation-for-Hire in Kenya,” 10.

Given that paid, successful influencers can reportedly make more than the average monthly salary in Kenya if they are able to get content trending for just a few hours, it is perhaps unsurprising that a recent poll found seven in 10 Twitter users were willing to share potentially false or misleading information in exchange for payment.57  Note: Sigomba Ramadhan and Brian Murimi, “Social Media Role in Political Disinformation, Smear Campaigns,” Star, 2022, accessed May 5, 2022, https://www.the-star.co.ke/news/big-read/2022-01-18-social-media-role-in-political-disinformation-smear-campaigns/.     One influencer also told researchers that they have worked for both sides of the political divide simultaneously, with one set of accounts used to promote pro-Kenyatta content and another reserved for pro-Ruto content.58  Note: Odanga Madung and Brian Obilo, “How to Manipulate Twitter and Influence People: Propaganda and the Pandora Papers in Kenya,” Mozilla, 2021, accessed May 5, 2022, https://foundation.mozilla.org/en/blog/new-research-in-kenya-disinformation-campaigns-seek-to-discredit-pandora-papers/.     Whether using fake accounts operated by humans or software (“bots”), the core tactic here seems to be to overwhelm the information environment through the repetition of hashtags, phrases, or imagery and to build an illusion of popular support. This kind of “information gaslighting”59  Note: Nisbet and Kamenchuk, “The Psychology of State-Sponsored Disinformation Campaigns and Implications for Public Diplomacy,” 72.    has been used recently to target politicians and activists who supported two reproductive rights bills passing through Parliament.60  Note: Odanga Madung, “Exporting Disinformation: How Foreign Groups Peddle Influence in Kenya through Twitter,” Mozilla, 2022, accessed May 5, 2022, https://foundation.mozilla.org/en/campaigns/exporting-disinformation-how-foreign-groups-peddle-influence-in-kenya-through-twitter/.    
Behind this particular disinformation campaign was CitizenGO — a right-wing organization based in Spain — pointing to the role that foreign interference may play in upcoming political debates.

Looking ahead to the August vote, it is worth noting that while these political disinformation campaigns tend to originate on Twitter, the content is also often then shared on other platforms like Facebook and WhatsApp.61  Note: Bravin Yuri in Ramadhan and Murimi, “Social Media Role in Political Disinformation, Smear Campaigns.”    The potential reach of election-related disinformation in the coming months is therefore significant, posing a threat to the integrity of the election itself as well as wider voter participation.62  Note: Ramadhan and Murimi, “Social Media Role in Political Disinformation, Smear Campaigns.”    Indeed, poor registration turnout for young Kenyans aged 18–24 — a group that has arguably grown up on broken promises and political gaslighting63  Note: Patrick Gathara, “Kenyan Youth Are Not to Blame for Their Election Apathy,” Al Jazeera, 2022, accessed May 5, 2022, https://www.aljazeera.com/opinions/2022/2/6/why-are-kenyan-youth-apathetic-about-voting.     — speaks to the latter point in particular. More broadly, however, these campaigns risk both “scaling” and “normalizing” issues64  Note: Terms used in author interview, April 2022.    that exacerbate atrocity risks, particularly in societies like Kenya where rumors frequently fuel cycles of violence.65  Note: Author interview, March 2022.  

Conclusion and Recommendations

The art of disinformation has been described as “an old story, fueled by new technology,”66  Note: UNESCO, “Journalism, ‘Fake News,’ & Disinformation,” 2018, accessed May 5, 2022, https://en.unesco.org/sites/default/files/journalism_fake_news_disinformation_print_friendly_0_0.pdf.    and the Kenyan case exemplifies how existing grievances are now playing out in the digital space. During particularly divisive periods like elections, social media has been weaponized by foreign and domestic actors in order to shape or sway public opinion and influence political decisions. Where the risk of mass atrocity is high, the instant and widespread reach of election-related disinformation can act as a threat multiplier and strain preventative efforts. This problem extends beyond Kenya, with various European and Russian entities launching increasingly sophisticated disinformation campaigns to interfere with elections across the African continent and beyond.67  Note: On Europe in Kenya and Africa, see Madung, “Exporting Disinformation”; on Russia in Africa, see Anton Shekhovtsov, “Fake Election Observation as Russia’s Tool of Election Interference: The Case of AFRIC,” European Platform for Democratic Elections, 2020, accessed May 5, 2022, https://www.epde.org/en/documents/details/fake-election-observation-as-russias-tool-of-election-interference-the-case-of-afric.html; Shelby Grossman, Daniel Bush, and Renee DiResta, “Evidence of Russia-Linked Influence Operations in Africa,” Stanford Internet Observatory, 2019, accessed May 5, 2022, https://fsi-live.s3.us-west-1.amazonaws.com/s3fs-public/29oct2019_sio_-_russia_linked_influence_operations_in_africa.final_.pdf.     There is therefore a need to improve how we understand and manage the online information environment around elections, especially in how we educate digital citizens and regulate the digital space.

As was the case in Europe in 2019, preparation could be one necessary — albeit insufficient — condition for successfully countering election-related disinformation campaigns. Because election timeframes are known in advance, stakeholders have the time to prepare adequately. There are some short- and long-term recommendations worth noting here, and the following have been informed by desk-based research and by qualitative interviews with practitioners (civil service and civil society) with expertise in atrocity prevention and/or countering disinformation. It is worth noting that the author received no response from the policymakers and social media practitioners approached to participate, and the recommendations for these stakeholders (i.e., states and social media companies) ought to be interpreted with this in mind. An overview of the recommendations is provided below, followed by key recommendations for specific stakeholders: namely, states, civil society, and social media platforms.

In the immediate term, stakeholders should work more collaboratively and at the grassroots level to build individuals’ confidence and skills to distinguish fact from fiction online. This would ultimately help to slow the spread of disinformation and therefore its potential to result in offline harm. In the context of the upcoming Kenyan elections, this might mean bolstering the capacity of groups already promoting peaceful narratives online (e.g., UNESCO’s Social Media 4 Peace program)68  Note: UNESCO, “UNESCO Launches Social Media for Peace Project in Kenya,” 2021, accessed May 5, 2022, https://en.unesco.org/news/unesco-launches-social-media-peace-project-kenya.    as well as building individuals’ capacity to identify and reject false or misleading content they encounter in the digital space.

Relatedly, there is a need to develop a local-level understanding of how election-related disinformation campaigns can lead to offline violence in different contexts. A deeper appreciation of the link between the spread of rumors and the outbreak of violence is said to have helped prevent the recurrence of massacres in some regions of Kenya.69  Note: Author interview, March 2022.     Where humanitarian organizations like the Red Cross might provide more tangible support to affected populations (e.g., food and blankets), there can often be confusion over the relevance or importance of measures to counter different types of information disorder where the risk of mass violence is high. In this way, working to deepen an understanding of how election-related disinformation can act as a threat multiplier for atrocity crimes might also be one means through which stakeholders can help to mitigate relevant risks.

Given that several African countries are expected to hold elections in 2022 (see Appendix 3),70  Note: African Arguments, “Africa Elections 2022: All the Upcoming Votes,” 2022, accessed May 5, 2022, https://africanarguments.org/2022/04/africa-elections-all-upcoming-votes/; see also ISS Africa, “African Elections to Watch in 2022,” 2021, accessed May 5, 2022, https://issafrica.org/pscreport/psc-insights/african-elections-to-watch-in-2022.    amidst increasing internet penetration, growing political discourse on platforms like Twitter, and increasing foreign interference, stakeholders need to ensure that robust measures are in place to verify content in real time and minimize the spread of election-related disinformation. This ultimately requires greater investment in local capabilities and expertise71  Note: Author interview, April 2022.    (e.g., in vernacular languages) to moderate trends and content72  Note: Madung and Obilo, “Inside the Shadowy World of Disinformation-for-Hire in Kenya,” 13.    where elections are scheduled and the risk of mass atrocity is high.

Although fact-checking and detecting inauthentic activity — including inauthentic activity promoted by authentic accounts — go some way toward containing the spread of disinformation, these cannot be the only steps that stakeholders take to successfully mitigate risk.73  Note: Author interview, April 2022.      Without the proper channels through which early warning can be translated into early action, there is a risk that election-related disinformation campaigns will result in mass atrocity where threats have not been communicated.74  Note: Author interviews, March and April 2022.
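One simple family of inauthentic-activity signals can be sketched as follows. This is a hypothetical heuristic for illustration only — real platform detection systems are far more sophisticated — but it captures the basic idea that many distinct accounts posting identical text suggests coordinated amplification rather than organic discussion (all names and thresholds here are assumptions):

```python
from collections import defaultdict

def flag_coordinated_accounts(posts, min_cluster=5):
    """Group posts by normalized text; flag any message pushed verbatim by
    at least `min_cluster` distinct accounts as possible coordination."""
    clusters = defaultdict(set)
    for account, text in posts:
        normalized = " ".join(text.lower().split())  # collapse case/whitespace
        clusters[normalized].add(account)
    return {text: sorted(accounts)
            for text, accounts in clusters.items()
            if len(accounts) >= min_cluster}

# Six "sock puppet" accounts repeating one slogan, plus organic chatter.
posts = [(f"bot{i}", "Vote YES  on the reforms #JusticeForSale") for i in range(6)]
posts += [("alice", "Interesting debate today"), ("bob", "Interesting debate today")]
print(flag_coordinated_accounts(posts))
```

Even this toy version shows why such detection alone is insufficient: a flagged cluster is only an early-warning signal, and without channels to act on it the underlying campaign continues.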

In the longer term, stakeholders should invest in tackling some of the root causes that are driving the spread of political disinformation. In Kenya, this might include efforts to address the worsening economic crisis and widespread unemployment that is ultimately pushing many young people to work in the disinformation industry.75  Note: Author interview, March 2022.    More broadly, as new technology emerges and the disinformation tradecraft continues to evolve, there is a need to acknowledge the human vulnerability that facilitates the spread of false or misleading information, and therefore a need to promote information literacy and critical thinking in digital education programs.

Finally, one interviewee suggested that there is a need to move away from (reactive) “content-based” regulation to a more (preventative) “systems-based” approach. While Twitter is to be commended for suspending hundreds of accounts implicated in Kenyan disinformation campaigns,76  Note: Madung, “Exporting Disinformation: How Foreign Groups Peddle Influence in Kenya through Twitter,” 14; Madung and Obilo, “Inside the Shadowy World of Disinformation-for-Hire in Kenya,” 12.     there is still an inherent risk of over-moderation in the current approach, which, in turn, threatens to undermine freedom of expression and other digital rights.77  Note: Author interview, April 2022.    Indeed, from Kenya78  Note: Maweu, “Fake Elections?,” 64; see also Reporters Without Borders, “Kenyan Election Campaign Hits Journalists and Media Freedom,” 2017, accessed May 5, 2022,  https://rsf.org/en/news/kenyan-election-campaign-hits-journalists-and-media-freedom.    to Cameroon79  Note: Author interview, April 2022.    to Russia,80  Note: Victor Jack, “Russia Expands Laws Criminalizing ‘Fake News,’ ” Politico, 2022, accessed May 5, 2022, https://www.politico.eu/article/russia-expand-laws-criminalize-fake-news/.   “fake news” legislation is commonly manipulated to target individuals for content deemed politically sensitive or critical of government. A systems-based approach, by contrast (the EU’s new Digital Services Act was raised as one promising example),81  Note: Quote from author interview, April 2022; see also EU DisinfoLab, “Tackling Disinformation Online: The Digital Services Act Opens the Era of Accountability,” 2022, accessed May 5, 2022, https://www.disinfo.eu/advocacy/tackling-disinformation-online-the-digital-services-act-opens-the-era-of-accountability/.     would “tackle harms at source” by targeting the business model at the heart of platforms that prioritize profit over protection.82  Note: Author interview, April 2022.

Key recommendations for:

States

  • Governments with existing capacity and experience in countering online disinformation could share resources and personnel through bilateral or multilateral arrangements. The United Kingdom, the United States, and Finland, among others, have amassed a wealth of expertise on the Russian tradecraft in particular that could help other states to increase their own resilience against electoral interference. One initial area worth exploring in Kenya concerns the tools that the National Cohesion and Integration Commission (NCIC) might be able to use for monitoring election-related disinformation on social media, as this monitoring is currently being done manually (and for hate speech only).83  Note: National Cohesion and Integration Commission, “The Third State of Peace and Political Decency.”
  • Financially, states should establish rapid-release funding mechanisms through which stakeholders can respond to imminent or ongoing election-related atrocities. In the longer term, multi-year funding can also contribute to more upstream, structural prevention to ensure that the key drivers of election-related disinformation and its offline impacts can be addressed.84  Note: Author interview, March 2022.    This support should feed into a broader strategy for mass atrocity prevention.

Civil Society

  • Civil society (including academia, the media, non-governmental organizations, and faith-based institutions) should (continue to) engage at the grassroots level — described by one interviewee as the “most important” recommendation they would make85  Note: Author interview, March 2022.    — and ensure that the “average person” is “the number one priority” when consulting for and designing interventions86  Note: Author interview, March 2022.    to mitigate election-related disinformation and relevant atrocity risks. Rather than relying on input from only a handful of “village champions”87  Note: Author interview, March 2022.    meeting in generally inaccessible locations (e.g., “fancy hotels”),88  Note: Author interview, March 2022.    a broader local perspective could facilitate a more comprehensive understanding of the challenges and opportunities for prevention on the ground.
  • Building individual confidence and capacity to identify and reject election-related disinformation should involve working directly with platforms on educational programs and to promote critical thinking in the digital space.89  Note: Author interview, March 2022.    Engaging with initiatives like Meta’s “trusted partner” program — through which civil society can contribute to preventing online and offline harm by providing expert recommendations90  Note: Meta, “Bringing Local Context to Our Global Standards,” 2022, accessed May 5, 2022, https://transparency.fb.com/policies/improving/bringing-local-context.     — could be a starting point for such discussions.

Social Media Platforms

  • In the periods before, during, and after elections, social media companies (including Twitter and Big Tech firms like Meta) should invest generously in local capability and expertise, particularly given that not all disinformation may be detected by artificial intelligence or moderators unfamiliar with the nuances on the ground. As one interviewee described it, “coded language” often clouds judgment of what constitutes harm in an online information environment, and the inherent “Western skew” with which moderation is generally undertaken requires better engagement with civil society to understand, communicate, and address risk.91  Note: Author interview, April 2022; see also Wilson Center, “Malign Creativity: How Gender, Sex, and Lies Are Weaponized Against Women Online,” 2022, accessed May 5, 2022, https://www.wilsoncenter.org/publication/malign-creativity-how-gender-sex-and-lies-are-weaponized-against-women-online.     Suspending the “trending” feature may also be one step that Twitter can take specifically, as it has done in Ethiopia amidst growing disinformation and hate speech.92  Note: Ranking Digital Rights, “Big Tech Scorecard,” 2022, accessed May 5, 2022,  https://rankingdigitalrights.org/index2022/companies/Twitter.   
  • In the longer term, social media companies should (continue to) increase meaningful standards of transparency and data accessibility (perhaps even “mandated information sharing”)93  Note: Author interview, April 2022.    to inform systems-based change. Although Twitter was highlighted by one interviewee as one of the more transparent platforms, there still exists a “knowledge asymmetry” between stakeholders.94  Note: Author interview, April 2022.    Better information-sharing on risks and patterns specifically — particularly as technologies evolve and new challenges emerge — would support more proactive measures to mitigate offline harms associated with online disinformation campaigns, and without infringing upon other digital rights like freedom of expression.

Acknowledgements

The author would like to thank those who agreed to be interviewed as part of her wider research into mass atrocity prevention, and whose inputs have contributed to this issue brief and shaped many of the recommendations. Thank you also to the reviewers whose thoughtful comments and feedback have helped to strengthen the paper significantly, and to the Stimson Center for affording the opportunity to deliver on this timely brief.

About the Author

Gillian McKay is a PhD candidate in the School of Politics and International Studies at the University of Leeds. Her research is focused on the United Kingdom’s approach to mass atrocity prevention and is being undertaken in collaboration with the Aegis Trust, supervised within the European Centre for the Responsibility to Protect, and funded by the Economic and Social Research Council. Her academic background is in psychology and human rights law, and she has previously worked in the humanitarian and development sectors.

Appendix I

Countries where election-related disinformation was documented in 2020:95  Note: Samantha Bradshaw, Hannah Bailey, and Philip N. Howard, Industrialized Disinformation: 2020 Global Inventory of Organized Social Media Manipulation, Computational Propaganda Research Project, accessed May 5, 2022, https://demtech.oii.ox.ac.uk/wp-content/uploads/sites/127/2021/01/CyberTroop-Report-2020-v.2.pdf.

Angola, Hungary, Philippines
Argentina, India, Poland
Armenia, Indonesia, Russia
Australia, Iran, Rwanda
Austria, Iraq, Saudi Arabia
Azerbaijan, Israel, Serbia
Brazil, Italy, South Africa
Cambodia, Kazakhstan, South Korea
Colombia, Kenya, Spain
Costa Rica, Kyrgyzstan, Sri Lanka
Croatia, Kuwait, Sudan
Czech Republic, Lebanon, Sweden
Ecuador, Libya, Taiwan
Egypt, Macedonia, Tunisia
El Salvador, Malaysia, Turkey
Ethiopia, Malta, Ukraine
Georgia, Mexico, United Kingdom
Germany, Moldova, United States
Ghana, Netherlands, Venezuela
Greece, Nigeria, Yemen
Guatemala, Pakistan, Zimbabwe

Appendix II

United Nations Framework of Analysis for Atrocity Crimes:

Risk Factors and Risk Indicators (selected)
Situations of armed conflict or other instability  

Situations that place a State under stress and generate an environment conducive to atrocity crimes. 
1.2 Security crisis caused by, among other factors, defection from peace agreements, armed conflict in neighboring countries, threats of external interventions or acts of terrorism.
1.3 Humanitarian crisis or emergency, including those caused by natural disasters or epidemics. 
1.7 Economic instability caused by scarcity of resources or disputes over their use or exploitation.
1.9 Economic instability caused by acute poverty, mass unemployment or deep horizontal inequalities. 
Record of serious violations of international human rights and humanitarian law 

Past or current serious violations of international human rights and humanitarian law, particularly if assuming an early pattern of conduct, and including those amounting to atrocity crimes, that have not been prevented, punished or adequately addressed and, as a result, create a risk of further violations.
 
2.1 Past or present serious restrictions to or violations of international human rights and humanitarian law, particularly if assuming an early pattern of conduct and if targeting protected groups, populations or individuals.
2.2 Past acts of genocide, crimes against humanity, war crimes or their incitement.
2.3 Policy or practice of impunity for or tolerance of serious violations of international human rights and humanitarian law, of atrocity crimes, or of their incitement.
2.5 Continuation of support to groups accused of involvement in serious violations of international human rights and humanitarian law, including atrocity crimes, or failure to condemn their actions.
2.6 Justification, biased accounts or denial of serious violations of international human rights and humanitarian law or atrocity crimes.
2.7 Politicization or absence of reconciliation or transitional justice processes following conflict.
2.8 Widespread mistrust in State institutions or among different groups as a result of impunity. 
Weakness of State structures 

Circumstances that negatively affect the capacity of a State to prevent or halt atrocity crimes.
 
3.5 High levels of corruption or poor governance.
Motives or incentives  

Reasons, aims or drivers that justify the use of violence against protected groups, populations or individuals, including by actors outside of State borders.
 
4.1 Political motives, particularly those aimed at the attainment or consolidation of power.
4.2 Economic interests, including those based on the safeguard and well-being of elites or identity groups, or control over the distribution of resources.
4.9 Social trauma caused by past incidents of violence not adequately addressed and that produced feelings of loss, displacement, injustice and a possible desire for revenge. 
Capacity to commit atrocity crimes  

Conditions that indicate the ability of relevant actors to commit atrocity crimes.
 
5.1 Availability of personnel and of arms and ammunition, or of the financial resources, public or private, for their procurement.
5.2 Capacity to transport and deploy personnel and to transport and distribute arms and ammunition.
5.3 Capacity to encourage or recruit large numbers of supporters from populations or groups, and availability of the means to mobilize them.
5.5 Presence of or links with other armed forces or with non-State armed groups.
5.6 Presence of commercial actors or companies that can serve as enablers by providing goods, services, or other forms of practical or technical support that help sustain perpetrators.
5.7 Financial, political or other support of influential or wealthy national actors.
5.8 Armed, financial, logistic, training or other support of external actors, including States, international or regional organizations, private companies, or others.  
Enabling circumstances or preparatory action  

Events or measures, whether gradual or sudden, which provide an environment conducive to the commission of atrocity crimes, or which suggest a trajectory towards their perpetration.
 
7.14 Increased inflammatory rhetoric, propaganda campaigns or hate speech targeting protected groups, populations or individuals. 
Triggering factors 

Events or circumstances that, even if seemingly unrelated to atrocity crimes, may seriously exacerbate existing conditions or may spark their onset.
 
8.8 Census, elections, pivotal activities related to those processes, or measures that destabilize them.
8.12 Acts related to accountability processes, particularly when perceived as unfair. 

Appendix III

African elections scheduled in 2022:

Date Country
18 June 2022 Nigeria 
16 July 2022 Nigeria 
25 July 2022 Tunisia 
By July 2022 Senegal 
By July 2022 Republic of Congo 
9 August 2022 Kenya 
August 2022 Angola 
By September 2022 Chad 
13 November 2022 Somaliland 
17 December 2022 Tunisia 
By December 2022 Sudan 
2022 Comoros 
2022 Djibouti 
2022 Equatorial Guinea 
2022 Lesotho 
2022 Mauritius 
2022 Sierra Leone 

References

Notes

  • 1
      Note: Posetti, Julie, and Alice Matthews. “A Short Guide to the History of ‘Fake News’ and Disinformation.” International Center for Journalists 7 (2018).  
  • 2
      Note: Communications Authority of Kenya, “Second Quarter Sector Statistics Report for the Financial Year 2021/2022,” accessed May 5, 2022, https://www.ca.go.ke/wp-content/uploads/2022/03/Sector-Statistics-Report-Q2-2021-2022-.pdf.  
  • 3
      Note: Nanjala Nyabola, Digital Democracy, Analogue Politics: How the Internet Era Is Transforming Politics in Kenya (London: Bloomsbury Publishing, 2018), 89.  
  • 4
      Note: Philippe Bolopion, “Atrocities as the New Normal: Time to e-Energize the ‘Never Again’ Movement,” Human Rights Watch, 2018, accessed May 5 2022, https://www.hrw.org/news/2018/12/10/atrocities-new-normal.   
  • 5
      Note: For example, see the work of HateLab: https://hatelab.net/publications/.   
  • 6
      Note: United Nations, “Framework for Analysis of Atrocity Crimes,” accessed May 5, 2022, https://www.un.org/en/genocideprevention/documents/about-us/Doc.3_Framework%20of%20Analysis%20for%20Atrocity%20Crimes_EN.pdf.     
  • 7
      Note: An assessment by the author in May 2022 using the UN Framework found 26 risk indicators across 7 risk factors (see Appendix 2). In February 2022, Kenya’s NCIC identified 23 counties at risk of election-related violence (National Cohesion and Integration Commission, “The Inauguration of State of Peace and Political Decency in the Run Up to the 2022 General Election,” 2022, accessed  May 5, 2022, https://cohesion.or.ke/index.php/media-center/press-releases-speeches/396-media-brief-the-inauguration-of-state-of-peace-and-political-decency-in-the-run-up-to-the-2022-general-elections-friday-18th-february-2022).  
  • 8
      Note: See Sentinel Project:  https://thesentinelproject.org/2021/03/29/the-2022-kenyan-general-election-an-analysis-of-new-and-enduring-violence-risk-factors/.   
  • 9
      Note: Nyabola, Digital Democracy, 89.  
  • 10
      Note: Claire Wardle and Hossein Derakhshan, “Information Disorder: Toward an Interdisciplinary Framework for Research and Policymaking,” Council of Europe, 2018.  
  • 11
      Note: United States Holocaust Memorial Museum, “Holocaust Encyclopedia: Nazi Propaganda,” accessed May 5, 2022, https://encyclopedia.ushmm.org/content/en/article/nazi-propaganda#the-role-of-newspapers-2.  
  • 12
      Note: Christine L. Kellow and H. Leslie Steeves, “The Role of Radio in the Rwandan Genocide,” Journal of Communication 48, no. 3 (1998): 107–128.  
  • 13
      Note: Jamie Bartlett, The People vs Tech: How the Internet Is Killing Democracy (and How We Save It) (London: Random House, 2018), 47.  
  • 14
      Note: Soroush Vosoughi, Deb Roy, and Sinan Aral, “The Spread of True and False News Online,” Science 359, no. 6380 (2018): 1146–1151.  
  • 15
      Note: Bartlett, The People vs Tech, 56.  
  • 16
      Note: Vosoughi, Roy, and Aral, “The Spread of True and False News Online.”  
  • 17
      Note: Author interview, March 2022.   
  • 18
      Note: Nate Ranner, “Trump’s Election Lies Were Among His Most Popular Tweets,” CNBC, accessed May 5, 2022, https://www.cnbc.com/2021/01/13/trump-tweets-legacy-of-lies-misinformation-distrust.html.    
  • 19
      Note: Stephen McLoughlin, “Understanding Mass Atrocity Prevention During Periods of Democratic Transition,” Politics and Governance 3, no. 3 (2015): 27–41.  
  • 20
      Note: Kate Ferguson, “For the Wind Is in the Palm-Trees: The 2017 Rohingya Crisis and an Emergent UK Approach to Atrocity Prevention,” Global Responsibility to Protect 1, no. aop (2021): 1–28; Séverine Autesserre, The Frontlines of Peace: An Insider’s Guide to Changing the World (New York: Oxford University Press, 2021).   
  • 21
      Note: Freedom House, “Countries and Territories,” accessed May 5, 2022, https://freedomhouse.org/countries/freedom-world/scores.  
  • 22
      Note: Freedom House, “Kenya: Freedom in the World 2021,” accessed May 5, 2022, https://freedomhouse.org/country/kenya/freedom-world/2021.  
  • 23
      Note: Philip Waki, Gavin McFadyen, and Pascal Kambale, “Commission of Inquiry into the Post-Election Violence (CIPEV),” Nairobi, Kenya, 2008.  
  • 24
      Note: Waki et al., CIPEV, 24, 28.  
  • 25
      Note: Waki et al., CIPEV, 196.  
  • 26
      Note: Waki et al., CIPEV.   
  • 27
      Note: Graham Harrison, “Onwards and Sidewards? The Curious Case of the Responsibility to Protect and Mass Violence in Africa,” Journal of Intervention and Statebuilding 10, no. 2 (2016): 143–161.  
  • 28
      Note: Edward Luck, “The Responsibility to Protect: Growing Pains or Early Promise?,” Ethics & International Affairs 24, no. 4 (2010): 352.  
29. Abdullahi Boru Halakhe, “ ‘R2P in Practice’: Ethnic Violence, Elections and Atrocity Prevention in Kenya,” Global Centre for the Responsibility to Protect, Occasional Paper Series 4 (2013).
30. Halakhe, “R2P in Practice.”
31. Nyabola, Digital Democracy, 27.
32. David Smith, “Africa’s Top 10 Tweeting Politicians,” The Guardian, 2012, accessed May 5, 2022, https://www.theguardian.com/world/2012/oct/30/africa-twitter-blogs-politicians.
33. Nyabola, Digital Democracy, 160.
34. GeoPoll and Portland, “The Reality of Fake News in Kenya,” 2018, accessed May 5, 2022, https://www.geopoll.com/blog/geopoll-and-portland-launch-a-survey-report-on-fake-news-in-kenya/.
35. Portland, “How Africa Tweets 2018,” accessed May 5, 2022, https://portland-communications.com/pdf/How-Africa-Tweets-2018.pdf.
36. Nyabola, Digital Democracy, 95.
37. National Cohesion and Integration Commission, “A Violence Free 2022: Roadmap to Peaceful 2022 General Elections,” 2020, accessed May 5, 2022, https://cohesion.or.ke/images/docs/downloads/NCIC_Roadmap_to_Peaceful_Elections_in_Kenya.pdf.
38. National Cohesion and Integration Commission, “The Third State of Peace and Political Decency in the Run Up to the 2022 General Elections,” 2022, accessed May 5, 2022, https://cohesion.or.ke/index.php/media-center/press-releases-speeches/406-the-third-state-of-peace-and-political-decency-in-the-run-up-to-the-2022-general-elections; National Cohesion and Integration Commission, “The Second State of Peace and Political Decency in the Run Up to the 2022 General Election,” 2022, accessed May 5, 2022, https://cohesion.or.ke/index.php/media-center/press-releases-speeches/403-the-second-state-of-peace-and-political-decency-in-the-run-up-to-the-2022-general-election; National Cohesion and Integration Commission, “The Inauguration of State of Peace.”
39. Patrick Kanyi Wamuyu, “The Kenyan Social Media Landscape: Trends and Emerging Narratives, 2020,” SIMELab, 2020, accessed May 5, 2022, https://www.usiu.ac.ke/assets/image/Kenya_Social_Media_Lanscape_Report_2020.pdf.
40. Nyabola, Digital Democracy, 89.
41. Nyabola, Digital Democracy, 5.
42. Nyabola, Digital Democracy, 94. The study in question was undertaken in the United States by the Pew Research Center: available from https://www.pewresearch.org/internet/2016/10/25/political-content-on-social-media/.
43. Erik Nisbet and Olga Kamenchuk, “The Psychology of State-Sponsored Disinformation Campaigns and Implications for Public Diplomacy,” Hague Journal of Diplomacy 14, no. 1–2 (2019): 65–82.
44. Nyabola, Digital Democracy, 91–93.
45. Jacinta Mwende Maweu, “ ‘Fake Elections’? Cyber Propaganda, Disinformation and the 2017 General Elections in Kenya,” African Journalism Studies 40, no. 4 (2019): 62–76.
46. Maweu, “Fake Elections?”; see also Patrick Mutahi and Brian Kimari, “Fake News and the 2017 Kenyan Elections,” South African Journal of Communication Theory and Research 46, no. 4 (2020): 37.
47. George Ogola, “#Whatwouldmagufulido? Kenya’s Digital ‘Practices’ and ‘Individuation’ as a (Non)Political Act,” Journal of Eastern African Studies 13, no. 1 (2019): 127.
48. Author interview, March 2022; see also Victor Abuso, “Kenya: Politicians on Notice after Dozens of Political Dialects Banned,” Africa Report, 2022, accessed May 5, 2022, https://www.theafricareport.com/193658/kenya-politicians-on-notice-after-dozens-of-political-dialects-banned/.
49. Waki et al., CIPEV.
50. WikiLeaks, “A/S Carson and NSC Senior Director Gavin’s Meeting with Agricultural Minister Ruto,” 2009, accessed May 5, 2022, https://wikileaks.org/plusd/cables/09NAIROBI1083_a.html.
51. Vosoughi, Roy, and Aral, “The Spread of True and False News Online.”
52. Bartlett, The People vs Tech.
53. Odanga Madung and Brian Obilo, “Inside the Shadowy World of Disinformation-for-Hire in Kenya,” Mozilla, 2021, accessed May 5, 2022, https://foundation.mozilla.org/en/blog/fellow-research-inside-the-shadowy-world-of-disinformation-for-hire-in-kenya/.
54. Madung and Obilo, “Inside the Shadowy World of Disinformation-for-Hire in Kenya.”
55. Al Jazeera, “Kenya: Supreme Court Blocks Controversial Constitutional Reforms,” 2022, accessed May 5, 2022, https://www.aljazeera.com/news/2022/3/31/kenyas-top-court-to-rule-on-contested-constitutional-reforms.
56. Madung and Obilo, “Inside the Shadowy World of Disinformation-for-Hire in Kenya,” 10.
57. Sigomba Ramadhan and Brian Murimi, “Social Media Role in Political Disinformation, Smear Campaigns,” Star, 2022, accessed May 5, 2022, https://www.the-star.co.ke/news/big-read/2022-01-18-social-media-role-in-political-disinformation-smear-campaigns/.
58. Odanga Madung and Brian Obilo, “How to Manipulate Twitter and Influence People: Propaganda and the Pandora Papers in Kenya,” Mozilla, 2021, accessed May 5, 2022, https://foundation.mozilla.org/en/blog/new-research-in-kenya-disinformation-campaigns-seek-to-discredit-pandora-papers/.
59. Nisbet and Kamenchuk, “The Psychology of State-Sponsored Disinformation Campaigns and Implications for Public Diplomacy,” 72.
60. Odanga Madung, “Exporting Disinformation: How Foreign Groups Peddle Influence in Kenya through Twitter,” Mozilla, 2022, accessed May 5, 2022, https://foundation.mozilla.org/en/campaigns/exporting-disinformation-how-foreign-groups-peddle-influence-in-kenya-through-twitter/.
61. Bravin Yuri in Ramadhan and Murimi, “Social Media Role in Political Disinformation, Smear Campaigns.”
62. Ramadhan and Murimi, “Social Media Role in Political Disinformation, Smear Campaigns.”
63. Patrick Gathara, “Kenyan Youth Are Not to Blame for Their Election Apathy,” Al Jazeera, 2022, accessed May 5, 2022, https://www.aljazeera.com/opinions/2022/2/6/why-are-kenyan-youth-apathetic-about-voting.
64. Terms used in author interview, April 2022.
65. Author interview, March 2022.
66. UNESCO, “Journalism, ‘Fake News,’ & Disinformation,” 2018, accessed May 5, 2022, https://en.unesco.org/sites/default/files/journalism_fake_news_disinformation_print_friendly_0_0.pdf.
67. On Europe in Kenya and Africa, see Madung, “Exporting Disinformation”; on Russia in Africa, see Anton Shekhovtsov, “Fake Election Observation as Russia’s Tool of Election Interference: The Case of AFRIC,” European Platform for Democratic Elections, 2020, accessed May 5, 2022, https://www.epde.org/en/documents/details/fake-election-observation-as-russias-tool-of-election-interference-the-case-of-afric.html; Shelby Grossman, Daniel Bush, and Renee DiResta, “Evidence of Russia-Linked Influence Operations in Africa,” Stanford Internet Observatory, 2019, accessed May 5, 2022, https://fsi-live.s3.us-west-1.amazonaws.com/s3fs-public/29oct2019_sio_-_russia_linked_influence_operations_in_africa.final_.pdf.
68. UNESCO, “UNESCO Launches Social Media for Peace Project in Kenya,” 2021, accessed May 5, 2022, https://en.unesco.org/news/unesco-launches-social-media-peace-project-kenya.
69. Author interview, March 2022.
70. African Arguments, “Africa Elections 2022: All the Upcoming Votes,” 2022, accessed May 5, 2022, https://africanarguments.org/2022/04/africa-elections-all-upcoming-votes/; see also ISS Africa, “African Elections to Watch in 2022,” 2021, accessed May 5, 2022, https://issafrica.org/pscreport/psc-insights/african-elections-to-watch-in-2022.
71. Author interview, April 2022.
72. Madung and Obilo, “Inside the Shadowy World of Disinformation-for-Hire in Kenya,” 13.
73. Author interview, April 2022.
74. Author interviews, March and April 2022.
75. Author interview, March 2022.
76. Madung, “Exporting Disinformation: How Foreign Groups Peddle Influence in Kenya through Twitter,” 14; Madung and Obilo, “Inside the Shadowy World of Disinformation-for-Hire in Kenya,” 12.
77. Author interview, April 2022.
78. Maweu, “Fake Elections?,” 64; see also Reporters Without Borders, “Kenyan Election Campaign Hits Journalists and Media Freedom,” 2017, accessed May 5, 2022, https://rsf.org/en/news/kenyan-election-campaign-hits-journalists-and-media-freedom.
79. Author interview, April 2022.
80. Victor Jack, “Russia Expands Laws Criminalizing ‘Fake News,’ ” Politico, 2022, accessed May 5, 2022, https://www.politico.eu/article/russia-expand-laws-criminalize-fake-news/.
81. Quote from author interview, April 2022; see also EU DisinfoLab, “Tackling Disinformation Online: The Digital Services Act Opens the Era of Accountability,” 2022, accessed May 5, 2022, https://www.disinfo.eu/advocacy/tackling-disinformation-online-the-digital-services-act-opens-the-era-of-accountability/.
82. Author interview, April 2022.
83. National Cohesion and Integration Commission, “The Third State of Peace and Political Decency.”
84. Author interview, March 2022.
85. Author interview, March 2022.
86. Author interview, March 2022.
87. Author interview, March 2022.
88. Author interview, March 2022.
89. Author interview, March 2022.
90. Meta, “Bringing Local Context to Our Global Standards,” 2022, accessed May 5, 2022, https://transparency.fb.com/policies/improving/bringing-local-context.
91. Author interview, April 2022; see also Wilson Center, “Malign Creativity: How Gender, Sex, and Lies Are Weaponized Against Women Online,” 2022, accessed May 5, 2022, https://www.wilsoncenter.org/publication/malign-creativity-how-gender-sex-and-lies-are-weaponized-against-women-online.
92. Ranking Digital Rights, “Big Tech Scorecard,” 2022, accessed May 5, 2022, https://rankingdigitalrights.org/index2022/companies/Twitter.
93. Author interview, April 2022.
94. Author interview, April 2022.
95. Samantha Bradshaw, Hannah Bailey, and Philip N. Howard, “Industrialized Disinformation: 2020 Global Inventory of Organized Social Media Manipulation,” Computational Propaganda Research Project, accessed May 5, 2022, https://demtech.oii.ox.ac.uk/wp-content/uploads/sites/127/2021/01/CyberTroop-Report-2020-v.2.pdf.
