Authors
Asfandyar Mir
Ramzy Mardini
News Type
Commentary

On Wednesday night, U.S.-led coalition forces based at Camp Taji, north of Baghdad, came under intense rocket fire. The attack killed three coalition personnel, two American and one British, and injured nearly a dozen more.

While rocket fire on U.S. military bases in Iraq is not new, this attack is the first time U.S. personnel have been killed by suspected Iranian-backed Iraqi groups since the United States killed Iranian Gen. Qasem Soleimani in early January. The attack is likely to anger the Trump administration, which has pursued an aggressive strategy against Iran. It also catches the White House in the middle of another ballooning international crisis — the coronavirus pandemic.

Why did this attack happen now? And will this incident spark more hostilities in the Middle East?

Read the rest at The Washington Post

-

Abstract:

Ahmet T. Kuru will talk about his new book Islam, Authoritarianism, and Underdevelopment: A Global and Historical Comparison (Cambridge University Press, 2019). Why do Muslim-majority countries have high levels of authoritarianism and low levels of socio-economic development compared to world averages? Kuru elaborates an argument about the ulema-state alliance as the cause of these problems in the Muslim world from the eleventh century to the present. Criticizing essentialist, post-colonialist, and new institutionalist alternative explanations, Kuru focuses on the relations between intellectual, economic, religious, and political classes in his own explanation.

 

Speaker Bio:

Photo: Ahmet T. Kuru
Ahmet T. Kuru is Professor of Political Science at San Diego State University. Kuru received his PhD from the University of Washington and held a postdoctoral position at Columbia University. He is the author of the award-winning Secularism and State Policies toward Religion: The United States, France, and Turkey (Cambridge University Press) and the co-editor (with Alfred Stepan) of Democracy, Islam, and Secularism in Turkey (Columbia University Press). Kuru’s works have been translated into Arabic, Chinese, French, Indonesian, and Turkish.

Ahmet Kuru, Professor of Political Science at San Diego State University
Authors
Shelby Grossman
Khadeja Ramali
Renee DiResta
News Type
Blogs

The Stanford Internet Observatory has been investigating new facets of the manipulation of the local media environment in Libya: Russian actors known to have previously created and sponsored online news media fronts and associated Facebook Pages now appear to be expanding into similar activities in broadcast media. By surreptitiously financing a well-established media brand, these actors are taking a Cold War-era strategy of supporting local media outlets and updating it for the digital age.

Over the past year, Russia has become increasingly involved in the conflict in Libya. Some of this involvement is kinetic: Russian mercenary soldiers employed by firms linked to Yevgeny Prigozhin, a Russian businessman with close ties to Vladimir Putin, are fighting alongside Khalifa Haftar’s self-styled Libyan National Army (LNA) forces. Modern Russian weapons have been found on battlefields. Alongside the kinetic, the relationship includes media and information operations support for political candidates, and social media influence operations: Stanford Internet Observatory research previously found that Prigozhin-linked firms had created Facebook Pages bolstering not only Haftar but also Saif al-Islam Gaddafi, one of Muammar Gaddafi’s surviving sons. Prigozhin may be trying to bring Gaddafi supporters to Haftar’s camp, or simply playing multiple sides of the local power game by bolstering two likely presidential contenders. While the motivation remains a matter of state strategy, it is clear that Russian actors are exerting influence via traditional as well as social media channels.

This involvement takes the form of both direct involvement in content creation and financial support for local creators, which presents a challenge for evaluating authenticity in the Libyan media ecosystem: when does foreign support for local media cross the line into facilitating inauthentic behavior?

In November the Dossier Center, a London-based investigative organization, shared an appendix from an internal Prigozhin-linked group document with the Stanford Internet Observatory team. The leaked document, dated March 20, 2019, describes three media interventions in Libya:

  1. entering into a financial arrangement in which a Prigozhin-linked firm would own 50% of the former state-run TV station under Muammar Gaddafi (now supportive of Saif al-Islam Gaddafi); 
  2. creating a physical pro-LNA newspaper, Voice of the People; and 
  3. consulting on Alhadath, a Haftar-aligned TV station.

In this post, we discuss the social media and online presence of these television channels and the Voice of the People print newspaper. Key findings include:

  • By secretly investing in a long-standing TV channel, Prigozhin is refining his ability to blur the lines of media authenticity. 
  • The TV channel and its related social media entities have historically been pro-Gaddafi; in the months since the investment, they have additionally become supportive of Haftar. This backfired, with social media users mocking the obvious shift in tone and calling out what they perceived to be the channel’s foreign backers. 
  • A real political party, the Civil Democratic Party, posts PDFs of the Prigozhin-funded newspaper on its Facebook Page, with the party’s logo on the paper’s header. The newspaper is vigorously anti-GNA and pro-Haftar.


Internal document from a Prigozhin-linked group. Source: The Dossier Center.

Aljamahiria TV station and Jana News Agency

The Aljamahiria TV channel was the former Libyan state-run broadcasting organization under Gaddafi. Anti-Gaddafi rebel forces removed it from the air in 2011. It appeared again in 2014, and is now on Nilesat, an Egyptian communications satellite.

The Dossier Center document describes “the company” (the name for the Prigozhin-linked group) providing technical, financial, and advisory support for a TV station, Aljamahiria TV, since January 2019. The memo goes on to say that “the channel criticizes the activities of Khalifa Haftar [LNA], Khalid Al-Mishri [head of the Tripoli-based High Council of State] and Western countries” and supports Saif al-Islam Gaddafi, and notes that “50% of the channel (in a joint venture) belongs to the Russian side” (translated).

The memo highlights the transformative effects of the Prigozhin investment, saying that the TV channel used to be:

chaotic, regularly interrupted for 2-3 months. Currently, the channel broadcasts on a regular basis and is popular with supporters of Saif al-Islam Gaddafi. The channel’s monthly audience exceeds 6 million views in the Middle East and North Africa. Moreover, the company’s employees created a unified information service for the Jamahiria TV channel and the Jana news agency. In March 2019, the company’s specialists launched 6 new regular broadcasts and resumed work [...]

The memo then shows before and after images of the studio, illustrating that the Prigozhin investment helped modernize the studio.


Heading reads: "On-air studio: Before and after working with the company"

Aljamahiria and Jana News Agency have an extensive presence on social media platforms, sometimes with substantial followings and frequent posting schedules:


Aljamahiriya TV station and Jana News Agency social media presence.

Content on social media accounts associated with this TV station indicates that it shares Muammar Gaddafi nostalgia content (e.g., “Muammar Gaddafi, I wish that you would come back.”) and content supportive of Saif al-Islam Gaddafi. For several months the other posts on Aljamahiria’s Facebook Page were typically neutral news statements -- noting that there were clashes south of Tripoli, or posts about the weather.

In December 2019, however, the tone changed. Posts began to appear that were critical of Turkish military support to the UN-recognized Tripoli-based Government of National Accord, and more supportive of the LNA. For example, a post on January 6, 2020 said قوات الشعب المسلح تحرر مدينة سرت (The forces of the armed population free the city of Sirte). The phrase “forces of the armed population” originates from the Gaddafi regime, which used it to describe the official Libyan armed forces. Here the channel is using the term interchangeably to describe the LNA advance on Sirte.


Posts on al-Jamahiriya became more anti-GNA over time. The anti-GNA slant measure comes from a dictionary of 37 words and phrases like “liberation” (as in “Haftar will liberate Tripoli”) and “Turkey” (as in “report reveals the number of Syrian mercenaries arriving from Turkey to Libya”) and “Qatar”/”terrorism”/”Muslim Brotherhood” (as in “Doha, funded by `Hamas and the Muslim Brotherhood’ to spread terrorism in Libya”).
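A dictionary-based slant measure like the one described above can be sketched in a few lines of code. The term list and sample posts below are illustrative stand-ins, not the actual 37-entry dictionary used in the analysis:

```python
# Sketch of a dictionary-based slant measure: count how many terms from a
# keyword dictionary appear in each post, then compute the share of posts
# containing at least one term. The dictionary and posts are hypothetical.

ANTI_GNA_TERMS = {"liberation", "turkey", "qatar", "terrorism", "muslim brotherhood"}

def slant_score(post: str) -> int:
    """Number of dictionary terms that appear in a post (case-insensitive)."""
    text = post.lower()
    return sum(term in text for term in ANTI_GNA_TERMS)

def share_slanted(posts: list[str]) -> float:
    """Fraction of posts containing at least one dictionary term."""
    if not posts:
        return 0.0
    return sum(slant_score(p) > 0 for p in posts) / len(posts)

posts = [
    "Report reveals mercenaries arriving from Turkey to Libya",  # flagged
    "Clashes reported south of Tripoli",                         # neutral
    "Posts about the weather",                                   # neutral
]
print(share_slanted(posts))
```

Tracking this share per month over a Page's post history yields the kind of over-time slant trend described above; in practice Arabic-language terms would require tokenization and normalization beyond simple substring matching.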

Beginning in December 2019, social media users noticed this trend; they called it “حفترة قناة الخضراء” (Hafterization of the Green Channel). On December 19 one Facebook user commented in a group that Aljamahiria had shifted from referring to “Haftar’s militias” to “the Armed Forces,” language that aligns Gaddafi-era terms with Haftar’s LNA. One Page mocked Aljamahiria’s dramatic shift in tone, suggesting satirically that even the word “prayers” needs to be renamed “Haftar’s prayers.” In one comment thread a user said “the channel is now with Haftar,” and another responded saying “no, Haftar is now with the channel.”

Authentic Gaddafi supporters took to Facebook to express their displeasure at how the now-Hafterized Aljamahiria was misrepresenting them. One commenter even wrote a few poetic verses to describe his anger:

Yes, he really hafterised it
from Ghat to Sebha ..
and He ruined it ..
The green channel, he hafterized it.

Yes, rats hafterize yourselves.
The zero hour always equals zero if it's according to the local time of the Karama leader's watch.


A user posted Aljamahiria content using laughing emojis after noting that Aljamahiria encouraged people to fight with Haftar.


A Facebook user commenting on Aljamahiria’s shift from referencing “Haftar’s militias” to “the Armed Forces.”

Aljamahiria then backed off the pro-LNA language; in one post they called Haftar a war criminal. But social media users noticed this shift in tone as well. In February 2020 a pro-GNA Page posted an Aljamahiria video, saying that the channel was suddenly criticizing Haftar after having encouraged the youth to fight with him. In response to an Aljamahiria video that was critical of Haftar and posted in February 2020, one user wrote: “why did u turn on the army?” (translated). Two users posted 1,500-word tomes theorizing about the shifts in Aljamahiria’s tone, with one directing remarks toward what he perceived to be Aljamahiria’s foreign backers: he accused Jamahiriya of accepting money from foreign countries and said that Jamahiriya had become a channel of “propaganda and distorted ideas.” A Twitter user commented on the tone shift as well. 

Libya Facts, a pro-Gaddafi Page, defended the Aljamahiria Page, showing screenshots of anti-Haftar posts on Aljamahiria to allay suspicions. Libya Facts also noted that the channel is based out of Egypt and the Egyptian government carefully monitors who receives foreign funding, implying that Aljamahiria could not possibly be tied to any foreign entity. 

Aljamahiria has a professional, polished Instagram account, created in October 2019, which shares original Muammar Gaddafi nostalgia memes and pro-Saif al Islam Gaddafi memes. 


Upper left: A post from facebook.com/libyanfacts.ly attempting to prove the neutrality of Aljamahiria TV. The image caption says “Jamahiriya attacked Haftar.” Upper right: A post on the Aljamahiria Facebook Page. The text says, in part, Muammar Gaddafi, I wish that you would come back. Lower left: An ad run by the Aljamahiria Facebook Page. It reads, “The official account of the Jamahiriya (green) channel.” The channel was called The Green Channel under Muammar Gaddafi. Lower right: A meme bolstering Saif al-Islam Gaddafi on the Aljamahiria Instagram account.

The Jana News Agency, which is explicitly part of the Aljamahiria network (its logo says Aljamahiria News Agency) and is mentioned in the leaked document, has a website, jana-ly.co, that was created in January 2017. Its original Facebook Page had 5 admins in Egypt, 1 admin in the UK, and 3 admins whose locations are hidden. This pattern is similar to the administrator ownership pattern of Facebook Pages involved in Libyan influence operations that we identified in previous research, where Pages typically had 5 Egyptian Page admins and 1 other admin in a European country. Interestingly, jana-ly.co has an article from November 2, 2019, reposted from Russian state media outlet Sputnik, about the Facebook takedown of Prigozhin-linked influence operations targeting Libya. It claims that Facebook removed those Pages in anger over the success of the Russia-Africa Sochi summit. In February 2020 its Page came down, and days later it respawned as facebook.com/jana2.ly with 3 admins in Egypt. 

We also found a Facebook Page called الجبهة الشعبية لتحرير ليبيا (Popular Front for the Liberation of Libya) that lists as its “media platforms” facebook.com/aljamahiriytv and facebook.com/janaly.co. Like the TV channel and Jana News, the Popular Front for the Liberation of Libya Page has pro-Gaddafi content. It has also run anti-Sarraj and anti-Turkish ads. The Popular Front Page also lists facebook.com/safalbonyan, facebook.com/libya24tv, and facebook.com/libyamandela as additional “media platforms”, which similarly have pro-Gaddafi posts. All of these Pages have a majority of administrators in Egypt.  

Consistent with the trends we observe on these social platforms, New York Times reporting suggests that Haftar is welcoming support from former Gaddafi supporters.

Voice of the People Newspaper

Another entity mentioned in the Dossier Center memo is the Voice of the People newspaper. The memo notes that “since January 2019, the Company’s specialists began publishing the Voice of the People newspaper. The newspaper is distributed in the territory controlled by the LNA. The general content of the newspaper is criticism of the new draft Constitution, the policies of Al-Misri and Sarraj, support for the activities of the LNA and the image of Khalifa Haftar. The circulation of the newspaper is 300,000 copies. At the moment, 2 circulations of the newspaper have been prepared and distributed.”

The creation of a print newspaper is noteworthy. New York Times reporting suggests Russian entities created a newspaper in Madagascar in 2018 as well, saying: “Russians published their own newspaper in the local language and hired students to write fawning articles about the president to help him win another term.”

We found the two issues of the newspaper here and here -- as of February 2020, there don’t appear to be more. The newspaper is vehemently against the new draft constitution, encouraging citizens to vote “no” on it. The Constitutional Drafting Assembly voted to allow military personnel to be eligible for president only if they renounce their military positions two years before elections, a policy that pro-Haftar groups were against given that he would likely run for the position in the future. The latest (fourth) draft of the constitution also says that presidential candidates must relinquish any foreign nationalities at least one year before elections; Haftar has American citizenship. The Civil Democratic Party appears to be close to Haftar, and supports his offensive on Tripoli.  Accounts on Twitter have said that the newspaper is being distributed for free; one posted a photo of it.


Photo of the newspaper, Voice of the People. The Tweet says “Read the Voice of the People newspaper.”

The paper is branded with the logo of the Civil Democratic Party, and issues have been posted as PDFs on the Civil Democratic Party Facebook Page. The Party is made up of former members of Libya’s Transitional Council and former ambassadors. Its Facebook Page was created in September 2017, and has 3 admins in Libya. The CDP appears to have its own video recording capability, occasionally posting videos reminiscent of news broadcasts.


Cover page of the March 2019 issue of the Voice of the People newspaper.

The first issue of the newspaper, published in January 2019, included an article introducing the paper, written by the president of the party. The party leader expressed allegiances toward the LNA. The issue focused on criticizing the constitutional drafting project. Articles implied the new draft constitution was undemocratic and “succumbed to political Islam.” An opinion piece urged citizens to vote against the constitution.


A cartoon in the first issue says that the constitution project does not meet the demands of the people; another cartoon criticizes the constitution for preventing Haftar from competing in elections.


The first issue of the newspaper told Libyans not to be fooled, and to vote “no” on the draft constitution.

The second issue of the newspaper, published in March 2019, led with an article called “The Hidden Lebanese Government” (translated). The article alleged that the Government of National Accord -- the internationally-recognized Tripoli-based government -- is letting international actors like the UN and Lebanon take over the government. Another article on the cover page noted that the average Libyan is suffering, with queues at banks and corruption. The GNA should be replaced, it claimed, and the state should regain its monopoly on force and bring back the rule of law. A cartoon on the cover page shows the GNA Prime Minister bringing foreign allies a pie of Libya. Each foreign actor vies for a piece of Libya, and the Special Representative for the UN in Libya says there will be enough for everyone.


A cartoon of the UN serving a pie of Libya in the second issue of the Voice of the People newspaper.

Alhadath TV Station

The third aspect of this memo references the LNA-aligned TV channel: “In February 2019, the company’s specialists conducted an external audit of the activities of Alhadath TV channel (LNA channel), on the basis of which they prepared and presented their recommendations for the optimization of broadcasting to the [Haftar] team.” 

The Facebook Page associated with this TV channel has about 875,000 followers. It was created in 2016, and has seven Page administrators in Libya. The Page is generally pro-Haftar and often reposts statements from Haftar’s spokesman, Ahmed al Mismari. There is a Twitter account, @libyaalhadathtv, created in 2015, which has 89,100 followers. The only account that the Twitter account follows is @news9ly, which was created in July 2019 and shared LibyaAlhadathTV content. There is also a YouTube channel, created in 2016, which notes that the station (like Aljamahiria) is on Nilesat. Its associated website is libyaalhadath.net. The current nature of Prigozhin’s involvement in this TV channel is unclear, though we note that Facebook lists the Alhadath Page as a “Related Page” to the new Jana News Agency Page.


Pages discussed in this post, with a vertical line noting the start of January 2019. The internal document suggests some Prigozhin activities in Libya began in January.

Our prior research showed that Russian actors created Facebook Pages supportive of Saif Gaddafi and Haftar. This leaked document suggests that Russian actors are supporting these two figures in Libya’s traditional print and television media space as well. In our earlier research, we found that Russian actors had franchised out management of their Facebook Pages to content creators in Egypt. This made it harder for Libyans to detect the involvement of Russian actors. Similarly, here we see foreign actors inserting themselves into the legitimate Libyan media environment by way of financial support. While Libyans noticed the change in tone on the Pages, attributing that involvement to specific actors is a significant challenge. Going forward, the combination of franchising and virtually undetectable financial support will make gauging the independence and authenticity of media outlets online and offline even more difficult for disinformation researchers. These tactics will also create difficult decisions for platforms about whether the behavior violates their terms of service. 

-

The Shorenstein Asia-Pacific Research Center cordially invites its faculty, scholars, staff, affiliates, and their families to join APARC's first International Potluck Day! Join us to celebrate the diversity of APARC through a multicultural smorgasbord of food. Bring a dish from your home country or family heritage to share with the APARC community as we take the time to mix, mingle, and celebrate the diversity that makes APARC special.

Due to current circumstances, we will be postponing this event until further notice. Thank you for your understanding.

-

The research on misinformation generally and fake news specifically is vast, as is coverage in media outlets. Two questions run throughout both the academic and public discourse: what explains the spread of fake news online, and what can be done about it? While there is substantial literature on who is likely to be exposed to and share fake news, these behaviors might not signal belief or effect. Conversely, there is far less work on who is able to differentiate between true and false stories and, as a result, who is most likely to believe fake news (or, conversely, not believe true news), a question that speaks directly to Facebook’s recent “community review” approach to combating the spread of fake news on its platform.

In his talk, Professor Tucker will report on initial findings from a new collaborative project between NYU’s Center for Social Media and Politics and Stanford’s Program on Democracy and the Internet designed to fill these gaps in the scholarly literature and inform the types of policy decisions being made by Facebook. The project has enlisted both professional fact checkers and random “crowds” of close to 100 people to fact check five “fresh” articles (that have appeared in the past 24 hours) per day, four days a week, for eight weeks, using an innovative, transparent, and replicable algorithm for selecting the articles for fact checking. He will report on initial observations regarding (a) individual determinants of fact checking proficiency; (b) the viability of using the “wisdom of the crowds” for fact checking, including examining the tradeoffs between crafting a more accurate crowd vs. a more representative crowd; and (c) results from experiments designed to assess potential policy interventions to improve crowdsourcing accuracy.
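The core intuition behind crowdsourced fact checking can be illustrated with a small simulation: each rater independently labels an article, and the crowd's verdict is the majority vote. The crowd size, rater accuracy, and labels below are illustrative assumptions, not parameters from the NYU-Stanford project:

```python
import random
from collections import Counter

# Illustrative "wisdom of the crowds" simulation: raters who are each only
# modestly accurate can, in aggregate, produce a highly reliable majority
# verdict (Condorcet's jury theorem). All parameters here are hypothetical.

def crowd_verdict(ratings: list[str]) -> str:
    """Majority vote over individual true/false ratings."""
    return Counter(ratings).most_common(1)[0][0]

def simulate(truth: str, n_raters: int, accuracy: float, rng: random.Random) -> str:
    """Simulate n_raters, each labeling correctly with probability `accuracy`."""
    other = "false" if truth == "true" else "true"
    ratings = [truth if rng.random() < accuracy else other for _ in range(n_raters)]
    return crowd_verdict(ratings)

rng = random.Random(0)
# With 100 raters who are each right 60% of the time, the majority verdict
# is correct in the large majority of trials.
trials = [simulate("true", 100, 0.6, rng) for _ in range(200)]
print(trials.count("true") / len(trials))
```

The accuracy-vs-representativeness tradeoff mentioned in the abstract maps onto the `accuracy` parameter here: a screened crowd raises per-rater accuracy, while a representative crowd may lower it but better reflect the population.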

About the Speaker:

Photo: Joshua Tucker
Joshua A. Tucker is Professor of Politics, affiliated Professor of Russian and Slavic Studies, and affiliated Professor of Data Science at New York University. He is the Director of NYU’s Jordan Center for Advanced Study of Russia, a co-Director of the NYU Social Media and Political Participation (SMaPP) laboratory, a co-Director of the new NYU Center for Social Media and Politics, and a co-author/editor of the award-winning politics and policy blog The Monkey Cage at The Washington Post. He serves on the advisory boards of the American National Election Study, the Comparative Study of Electoral Systems, and numerous academic journals. Originally a scholar of post-communist politics, he has more recently studied social media and politics. His research in this area has included studies on the effects of network diversity on tolerance, partisan echo chambers, online hate speech, the effects of exposure to social media on political knowledge, online networks and protest, disinformation and fake news, how authoritarian regimes respond to online opposition, and Russian bots and trolls. His research has been funded by over $8 million in grants in the past three years, including a 2019 Knight Foundation “Research on the Future of an Informed Society” grant. His most recent book is the co-authored Communism’s Shadow: Historical Legacies and Contemporary Political Attitudes (Princeton University Press, 2017), and he is the co-editor of the forthcoming edited volume Social Media and Democracy (Cambridge University Press, 2020). 

News Type
Q&As
Date
Paragraphs

A Q&A with Professor Stephen Stedman, who serves as the Secretary General of the Kofi Annan Commission on Elections and Democracy in the Digital Age.

Photo: Stephen Stedman
Stephen Stedman, a Senior Fellow at the Freeman Spogli Institute for International Studies (FSI) at Stanford, is the director of the Kofi Annan Commission on Elections and Democracy in the Digital Age, an initiative of the Kofi Annan Foundation. The Commission is focused on studying the effects of social media on electoral integrity and the measures needed to safeguard the democratic process.  

At the World Economic Forum in Davos, Switzerland, the Commission, which includes FSI’s Nathaniel Persily, Alex Stamos, and Toomas Ilves, launched a new report, Protecting Electoral Integrity in the Digital Age. The report takes an in-depth look at the challenges faced by democracy today and makes a number of recommendations as to how best to tackle the threats posed by social media to free and fair elections. On Tuesday, February 25, professors Stedman and Persily will discuss the report’s findings and recommendations during a lunch seminar from 12-1:15 PM. To learn more and to RSVP, visit the event page.

Q: What are some of the major findings of the report? Are digital technologies a threat to democracy?

Steve Stedman: Our report suggests that social media and the Internet pose an acute threat to democracy, but probably not in the way that most people assume. Many people believe that the problem is a diffuse one based on excess disinformation and a decline in the ability of citizens to agree on facts. We too would like the quality of deliberation in our democracy to improve and we worry about how social media might degrade democratic debate, but if we are talking about existential threats to democracy the problem is that digital technologies can be weaponized to undermine the integrity of elections.

When we started our work, we were struck by how many pathologies of democracy are said to be caused by social media: political polarization; distrust in fellow citizens, government institutions, and traditional media; the decline of political parties; the degradation of democratic deliberation; and on and on. Social media is said to lessen the quality of democracy because it encourages echo chambers and filter bubbles where we only interact with those who share our political beliefs. Some platforms are said to encourage extremism through their algorithms.

What we found, instead, is a much more complex problem. Many of the pathologies that social media is said to create – for instance, polarization, distrust, and political sorting – begin their trendlines before the invention of the Internet, let alone the smartphone. Some of the most prominent claims are unsupported by evidence, or are confounded by conflicting evidence. In fact, we say that some assertions simply cannot be judged without access to data held by the tech platforms.

Instead, we rely on the work of scholars like Yochai Benkler and Edda Humphries to argue that not all democracies are equally vulnerable to network propaganda and disinformation. It is precisely where you have high pre-existing affective polarization, low trust, and hyperpartisan media, that digital technologies can intensify and amplify polarization.

Elections and toxic polarization are a volatile mix. Weaponized disinformation and hate speech can wreak havoc on elections, even if they don’t alter the vote tallies. This is because democracies require a system of mutual security. In established democracies political candidates and followers take it for granted that if they lose an election, they will be free to organize and contest future elections. They are confident that the winners will not use their power to eliminate them or disenfranchise them. Winners have the expectation that they hold power temporarily, and accept that they cannot change the rules of competition to stay in power forever. In short, mutual security is a set of beliefs and norms that turn elections from being a one-shot game into a repeated game with a long shadow of the future.

In a situation already marred by toxic polarization, we fear that weaponized disinformation and hate speech can cause parties and followers to believe that the other side doesn’t believe in the rules of mutual security. The stakes become higher. Followers begin to believe that losing an election means losing forever. The temptation to cheat and use violence increases dramatically. 

Q: As far as political advertising, the report encourages platforms to provide more transparency about who is funding that advertising. But it also asks that platforms require candidates to make a pledge that they will avoid deceptive campaign practices when purchasing ads. It also goes as far as to recommend financial penalties for a platform if, for example, a bot spreading information is not labelled as such. Some platforms might argue that this puts an unfair onus on them. How might platforms be encouraged to participate in this effort?

SS: The platforms have a choice: they can contribute to toxic levels of political polarization and the degradation of democratic deliberation, or they can protect electoral integrity and democracy. There are a lot of employees of the platforms who are alarmed at the state of polarization in this country and don’t want their products to be conduits of weaponized disinformation and hate speech. You saw this in the letter signed by Facebook employees objecting to Mark Zuckerberg’s decision that Facebook would treat political advertising as largely exempt from its community standards. If ever there were a moment in this country when we should demand that our political parties and candidates live up to a higher ethical standard, it is now. Instead, Facebook decided to allow political candidates to pay to run ads even if the ads use disinformation, tell bald-faced lies, engage in hate speech, and use doctored video and audio. Their rationale is that this is all part of “the rough and tumble of politics.” In doing so, Facebook is in the contradictory position that it has hundreds of employees working to stop disinformation and hate speech in elections in Brazil and India, but is going to allow politicians and parties in the United States to buy ads that can use disinformation and hate speech.

Our recommendation gives Facebook an option that allows political advertisement in a way that need not enflame polarization and destroy mutual security among candidates and followers: 1.) Require that candidates, groups or parties who want to pay for political advertising on Facebook sign a pledge of ethical digital practices; 2.) Then use the standards to determine if an ad meets the pledge or not. If an ad uses deep fakes, if an ad grotesquely distorts the facts, if an ad out and out lies about what an opponent said or did, then Facebook would not accept the ad. Facebook can either help us raise our electoral politics out of the sewer or it can ensure that our politics drowns in it.

It’s worth pointing out that the platforms are only one actor in a many-sided problem. Weaponized disinformation is actively spread by unscrupulous politicians and parties; it is used by foreign countries to undermine electoral integrity; and it is often spread and amplified by irresponsible partisan traditional media. Fox News, for example, ran the crazy conspiracy story about Hillary Clinton running a pedophile ring out of a pizza parlor in DC. Individuals around the president, including the son of the first National Security Adviser, tweeted the story. 

Q: While many of the recommendations focus on the role of platforms and governments, the report also proposes that public authorities promote digital and media literacy in schools as well as public interest programming for the general population. What might that look like? And how would that type of literacy help protect democracy? 

SS: Our report recommends digital literacy programs as a means to help build democratic resilience against weaponized disinformation. Having said that, however, the details matter tremendously. Sam Wineburg at Stanford, whom we cite, has extremely insightful ideas for how to teach citizens to evaluate the information they see on the Internet, but even he puts forward warnings: if done poorly, digital literacy could simply increase citizen distrust of all media, good and bad; and digital literacy in a highly polarized context raises the question of who will decide what is good and bad media. We say in passing that in addition to digital literacy we need to train citizens to understand biased assimilation of information. Digital literacy trains citizens to understand who is behind a piece of information and who benefits from it. But we also need to teach citizens to stand back and ask, “why am I predisposed to want to believe this piece of information?”

Q: Obviously access to data is critical for researchers and commissioners to do their work, analysis and reporting. One of the recommendations asks that public authorities compel major internet platforms to share meaningful data with academic institutions. Why is it so important for platforms and academia to share information?

SS: Some of the most important claims about the effects of social media can’t be evaluated without access to the data. One example we cite in the report is the controversy about whether YouTube’s algorithms radicalize individuals and send them down a rabbit hole of racist, nationalist content. This is a common claim and has appeared on the front pages of the New York Times. The research supporting the claim, however, is extremely thin, and other research disputes it. What we say is that we can’t adjudicate this argument unless YouTube were to share its data, so that researchers can see what the algorithm is doing. There are similar debates concerning the effects of Facebook. One of our commissioners, Nate Persily, has been at the forefront of working with Facebook to provide certified researchers with privacy protected data – Social Science One. Progress has been so slow that the researchers have lost patience. We hope that governments can step in and compel the platforms to share the data.

Q: This is one of the first reports to look at this problem in the Global South. Is the problem more or less critical there?

SS: Kofi Annan was very concerned that the debate about digital technologies and democracy was far too focused on Europe and the United States. Before Cambridge Analytica’s involvement in the 2016 United States election and the Brexit referendum, its predecessor company had manipulated elections in Asia, Africa and the Caribbean. There is now a transnational industry in election manipulation.

What we found does not bode well for democracies in the rest of the world. The factors that make democracies vulnerable to network propaganda and weaponized disinformation are often present in the Global South: pre-existing polarization, low trust, and hyperpartisan traditional media. Many of these democracies already have a repertoire of electoral violence. 

On the other hand, we did find innovative partnerships in Indonesia and Mexico where Election Management Bodies, civil society organizations, and traditional media cooperated to fight disinformation during elections, often with success. An important recommendation of the report is that greater attention and resources are needed for such efforts to protect electoral integrity in the Global South. 

About the Commission on Elections and Democracy in the Digital Age

As one of his last major initiatives, in 2018 Kofi Annan convened the Commission on Elections and Democracy in the Digital Age. The Commission includes members from civil society and government, the technology sector, academia and media; across 2019 they examined and reviewed the opportunities and challenges for electoral integrity created by technological innovations. Assisted by a small secretariat at Stanford University and the Kofi Annan Foundation, the Commission has undertaken extensive consultations and issued recommendations as to how new technologies, social media platforms and communication tools can be harnessed to engage, empower and educate voters, and to strengthen the integrity of elections. Visit the Kofi Annan Foundation and the Commission on Elections and Democracy in the Digital Age for more on their work.


This event is co-sponsored by the European Security Initiative

* Please note all CISAC events are scheduled using the Pacific Time Zone

 

Seminar Recording: https://youtu.be/1rkTwxnf2Fg

 

About this Event: Russia has employed the semi-state Wagner Group security company in Ukraine, Syria, the Central African Republic, Libya, Mozambique, and Mali (so far). Wagner is tightly connected to Russia's military intelligence organization (the GRU) and partially funded by one of Vladimir Putin's cronies, Evgeny Prigozhin, who also uses it for private purposes. So why does Russia rely on Wagner when such companies are technically illegal (and even unconstitutional) in Russia? Its use is less costly in budgetary and political terms than deploying the uniformed military, and it provides (limited) plausible deniability for Russian actions. But it is also unclear what Russia wants from impoverished sub-Saharan Africa. Using the best available evidence, this presentation explores these mysteries.

 

About the Speaker: Kimberly Marten is a professor of political science (and the department chair) at Barnard College, Columbia University, and a faculty member of Columbia’s Harriman Institute and Saltzman Institute. She has written four books, including Warlords: Strong-Arm Brokers in Weak States (Cornell, 2012) and Engaging the Enemy: Organization Theory and Soviet Military Innovation (Princeton, 1993), which received the Marshall Shulman Prize. The Council on Foreign Relations (where she is a member) published her special report, Reducing Tensions between Russia and NATO (2017). She is a frequent media commentator and has appeared on “The Daily Show” with Jon Stewart. She earned her A.B. at Harvard and Ph.D. at Stanford, and was a CISAC post-doc.

Virtual Seminar

Kimberly Marten, Professor of Political Science and Department Chair, Barnard College, Columbia University
Seminars

Join Stephen Stedman, Nathaniel Persily, the Cyber Policy Center, and the Center on Democracy, Development and the Rule of Law (CDDRL) in an enlightening exploration of the recent report, Protecting Electoral Integrity in the Digital Age, put out by the Kofi Annan Commission on Elections and Democracy in the Digital Age. Moderated by Kelly Born, Executive Director of the Cyber Policy Center.

More on the report:

 

Abstract:

New information and communication technologies (ICTs) pose difficult challenges for electoral integrity. In recent years foreign governments have used social media and the Internet to interfere in elections around the globe. Disinformation has been weaponized to discredit democratic institutions, sow societal distrust, and attack political candidates. Social media has proved a useful tool for extremist groups to send messages of hate and to incite violence. Democratic governments strain to respond to a revolution in political advertising brought about by ICTs. Electoral integrity has been at risk from attacks on the electoral process, and on the quality of democratic deliberation.

The relationship between the Internet, social media, elections, and democracy is complex, systemic, and unfolding. Our ability to assess some of the most important claims about social media is constrained by the unwillingness of the major platforms to share data with researchers. Nonetheless, we are confident about several important findings.

About the Speakers

Stephen Stedman is a senior fellow at the Freeman Spogli Institute for International Studies, professor, by courtesy, of political science, and deputy director of the Center on Democracy, Development and the Rule of Law. Professor Stedman currently serves as the Secretary General of the Kofi Annan Commission on Elections and Democracy in the Digital Age, and is the principal drafter of the Commission’s report, “Protecting Electoral Integrity in the Digital Age.”

Professor Stedman served as a special adviser and assistant secretary general of the United Nations, where he helped to create the United Nations Peacebuilding Commission, the UN’s Peacebuilding Support Office, the UN’s Mediation Support Office, the Secretary-General’s Policy Committee, and the UN’s counterterrorism strategy. During 2005 his office successfully negotiated General Assembly approval of the Responsibility to Protect. From 2010 to 2012, he directed the Global Commission on Elections, Democracy, and Security, an international body mandated to promote and protect the integrity of elections worldwide. Professor Stedman served as Chair of the Stanford Faculty Senate in 2018-2019. He and his wife Corinne Thomas are the Resident Fellows in Crothers, Stanford’s academic theme house for Global Citizenship. In 2018, Professor Stedman was awarded the Lloyd B. Dinkelspiel Award for outstanding service to undergraduate education at Stanford.


Nathaniel Persily is the James B. McClatchy Professor of Law at Stanford Law School, with appointments in the departments of Political Science, Communication and FSI.  Prior to joining Stanford, Professor Persily taught at Columbia and the University of Pennsylvania Law School, and as a visiting professor at Harvard, NYU, Princeton, the University of Amsterdam, and the University of Melbourne. Professor Persily’s scholarship and legal practice focus on American election law or what is sometimes called the “law of democracy,” which addresses issues such as voting rights, political parties, campaign finance, redistricting, and election administration. He has served as a special master or court-appointed expert to craft congressional or legislative districting plans for Georgia, Maryland, Connecticut, and New York, and as the Senior Research Director for the Presidential Commission on Election Administration.

Also among the commissioners of the report were FSI's Alex Stamos and Toomas Ilves.

 

 

Stephen Stedman

Abstract:

China’s cyberspace and technology regime is going through a period of change—but it’s taking a while. The U.S.–China economic and tech competition both influences Chinese government developments and awaits their outcomes, and the 2017 Cybersecurity Law set up a host of still-unresolved questions. Data governance, security standards, market access, compliance, and other questions saw only modest new clarity in 2019. But 2020 promises new laws on personal information protection and data security, and the Stanford-based DigiChina Project, in the Program on Geopolitics, Technology, and Governance, is devoted to monitoring, translating, and explaining these developments. From AI governance to the nexus of cybersecurity and supply chains, this talk will summarize recent Chinese policymaking and lay out expectations for the year to come.

About the Speaker:

Graham Webster is editor in chief of the Stanford–New America DigiChina Project at the Stanford University Cyber Policy Center and a China digital economy fellow at New America. He was previously a senior fellow and lecturer at Yale Law School, where he was responsible for the Paul Tsai China Center’s U.S.–China Track 2 and Track 1.5 dialogues for five years before leading programming on cyberspace and technology issues. In the past, he wrote a CNET News blog on technology and society from Beijing, worked at the Center for American Progress, and taught East Asian politics at NYU's Center for Global Affairs. Webster holds a master's degree in East Asian studies from Harvard University and a bachelor's degree in journalism from Northwestern University. Webster also writes the independent Transpacifica e-mail newsletter.

Research Scholar
Graham Webster

Graham Webster is a research scholar and editor in chief of the DigiChina Project at the Stanford University Cyber Policy Center and a China digital economy fellow at New America. Based at Stanford, he leads an inter-organization network of specialists to produce analysis and translation on China’s digital policy developments. He researches, publishes, and speaks to diverse audiences on the intersection of U.S.–China relations and advanced technology.

From 2012 to 2017, Webster worked for Yale Law School as a senior fellow and lecturer responsible for the Paul Tsai China Center’s Track 2 dialogues between the United States and China, co-teaching seminars on contemporary China and Chinese law and policy, leading programming on cyberspace in U.S.–China relations, and writing extensively on the South China Sea and the law of the sea. While with Yale, he was a Yale affiliated fellow with the Yale Information Society Project, a visiting scholar at China Foreign Affairs University, and a Transatlantic Digital Debates fellow with the Global Public Policy Institute and New America.

He was previously an adjunct instructor teaching East Asian politics at New York University, a public policy and communications officer at the EastWest Institute, a Beijing-based journalist writing on technology in China for CNET News and other outlets, and an editor at the Center for American Progress. He has worked as a consultant to Privacy International, the National Bureau of Asian Research, the Clinton Global Initiative, and the Natural Resources Defense Council’s China Program.

Webster writes for both specialist and general audiences, including for the MIT Technology Review, Foreign Affairs, Slate, The Washington Post’s Monkey Cage, BBC Chinese, Lawfare, ChinaFile, The Diplomat, Fortune, ArtAsiaPacific, and Logic magazine. He has been quoted by The Wall Street Journal, The Washington Post, Reuters, Bloomberg News, Wired, Caixin, and Quartz; spoken to NPR and BBC World Service radio; and appeared on BBC World News, CBSN, Channel News Asia, and Deutsche Welle television. Webster has testified before the U.S.–China Economic and Security Review Commission and speaks regularly at universities and conferences in North America, East Asia, and Europe.

Webster holds a B.S. in journalism and international studies from Northwestern University and an A.M. in East Asian studies from Harvard University. He took Ph.D. coursework in political science at the University of Washington and language training at Tsinghua University, Peking University, Stanford University, and Kanda University of International Studies.

Editor-in-Chief, DigiChina

Multilateral Negotiations on ICTs (information and communications technologies) and International Security: Process and Prospects for the UN Group of Government Experts and the UN Open-Ended Working Group

Abstract: The intent of this seminar is to provide an update on recent events at the UN relevant to international discussions of cybersecurity (and a primer of sorts on current UN processes for addressing this topic).

In 2018, UN Member States decided to establish two concurrent negotiations with nearly identical mandates on the international security dimension of ICTs—a sixth limited-membership UN Group of Governmental Experts (GGE) and an Open-Ended Working Group (OEWG) open to all governments. How did this happen? Are they competing or complementary endeavors? Is it likely that one will be able to bridge the longstanding divides on how international law applies to cyberspace or agree by consensus to additional norms of responsible State behavior? What would be a good outcome of each process? And how do these negotiations fit into the wider UN ecosystem, including the follow-up to the Secretary-General’s High-Level Panel on Digital Cooperation?

Image
Kerstin Vignard
About the Speaker: Kerstin Vignard is an international security policy professional with nearly 25 years’ experience at the United Nations, with a particular interest in the nexus of international security policy and technology. Vignard is Deputy to the Director at UNIDIR, currently on temporary assignment leading UNIDIR’s team supporting the Chairs of the latest Group of Governmental Experts (GGE) on Cyber Security and the Open-Ended Working Group. She has led UNIDIR’s team supporting four previous cyber GGEs. From 2013 to 2018, she initiated and led UNIDIR’s work on the weaponization of increasingly autonomous technologies, and is the co-Principal Investigator of a CIFAR AI & Society grant examining potential regulatory approaches for security and defence applications of AI.
