POSTPONED: Due to concerns over the spread of COVID-19, Stanford CDDRL will be postponing The Ukrainian Nexus Conference until a later date. Please check back for updates on our rescheduled conference date.
Please join the Ukrainian Emerging Leaders Program at Stanford's Center on Democracy, Development and the Rule of Law for the conference, "The Ukrainian Nexus: Politics, Technology, Creativity". This is the Center's second annual conference on Ukraine, created by the program's fellows.
After the events of 2019, everyone in the U.S. has heard of Ukraine, but few actually know what Ukraine is. The mission of the conference is to introduce Ukraine and its role in the world order to the Stanford and broader Silicon Valley communities. There will be three streams: Government, Innovation/Business, and Creativity.
The goal of this year's conference is to bring together major stakeholders from the new Ukrainian leadership (government, parliament, civil society and business), Silicon Valley's technology and innovation sector, and the Stanford academic community.
In keeping with Stanford University's March 3 message to the campus community on COVID-19 and current recommendations of the CDC, the Asia-Pacific Research Center is electing to postpone this event until further notice. We apologize for any inconvenience this may cause, and appreciate your understanding and cooperation as we do our best to keep our community healthy and well.
Data-intensive technologies such as AI may reshape the modern world. We propose that two features of data interact to shape innovation in data-intensive economies: first, states are key collectors and repositories of data; second, data is a non-rival input in innovation. We document the importance of state-collected data for innovation using comprehensive data on Chinese facial recognition AI firms and government contracts. Firms produce more commercial software and patents, particularly data-intensive ones, after receiving government public security contracts. Moreover, effects are largest when contracts provide more data. We then build a directed technical change model to study the state's role in three applications: autocracies demanding AI for surveillance purposes, data-driven industrial policy, and data regulation due to privacy concerns. When the degree of non-rivalry is as strong as our empirical evidence suggests, the state's collection and processing of data can shape the direction of innovation and growth of data-intensive economies.
David Yang’s research focuses on political economy, behavioral and experimental economics, economic history, and cultural economics. In particular, David studies the forces of stability and the forces of change in authoritarian regimes, drawing lessons from historical and contemporary China. David received a B.A. in Statistics and a B.S. in Business Administration from the University of California, Berkeley, and a Ph.D. in Economics from Stanford. David is currently a Prize Fellow in Economics, History, and Politics at Harvard and a Postdoctoral Fellow at J-PAL at MIT. He also joined Harvard’s Economics Department as an Assistant Professor as of 2020.
David Yang
Prize Fellow in Economics, History, and Politics; Department of Economics, Harvard University
Democratic consolidation around the world currently faces major challenges. Threats to democracy have become more insidious, especially through the manipulation of legal and constitutional procedures originally designed to guard democracy against arbitrary action and abuse. Free and fair elections, the cornerstone of democratic legitimacy, are under considerable stress from populist and post-truth movements, which abuse new digital communication technologies to confuse and mislead citizens. Today, free and fair elections, the primary expression of democratic will for collective government, are far from guaranteed in many countries around the world. Protecting them will require a new set of policies and actions from technology platforms, governments, and citizens.
Once hailed as a great force for human empowerment and liberation, social media and related digital tools have rapidly come to be regarded as a major threat to democratic stability and human freedom. Based on a deeply problematic business model, social-media platforms are showing the potential to exacerbate hazards that range from authoritarian privacy violations to partisan echo chambers to the spread of malign disinformation. Authoritarian forces are also profiting from a series of other advances in digital technology, notably including the revolution in artificial intelligence (AI). These developments have the potential to fuel a “postmodern totalitarianism” vividly illustrated by China’s rapidly expanding projects of digital surveillance and social control. They also pose a series of challenges for contemporary democracies.
WE HAVE REACHED VENUE CAPACITY AND ARE NO LONGER ACCEPTING RSVPS
Authoritarian governments around the world are developing increasingly sophisticated technologies for controlling information. In the digital age, many see these efforts as futile, as they are easily thwarted by savvy Internet users who quickly find ways to evade and circumvent them. In this talk, Professor Roberts demonstrates that even censorship that is easy to circumvent is enormously effective. Censorship acts like a tax on information, requiring those seeking information to spend more time and money if they want access. By creating small inconveniences that are easy to explain away, censorship powerfully influences the spread of information and, in turn, what people know about politics. Through analysis of Chinese social media data, online experiments, nationally representative surveys, and leaks from China’s Propaganda Department, Professor Roberts finds that when Internet users notice blatant censorship, they are willing to compensate for it to gain access. But subtler censorship, such as burying search results or introducing distracting information on the web, is more effective because users are less aware of it. Roberts challenges the conventional wisdom that online censorship is undermined when it is incomplete and shows instead how censorship’s porous nature is used strategically to divide the public and target influencers.
Margaret E. Roberts is an Associate Professor in the Department of Political Science at UC San Diego. Her research interests lie at the intersection of political methodology and the politics of information, with a focus on automated text analysis and on censorship and propaganda in China. Her work has appeared in venues such as the American Journal of Political Science, American Political Science Review, Political Analysis, and Science. Her recent book Censored: Distraction and Diversion Inside China’s Great Firewall was listed as one of the Foreign Affairs Best Books of 2018, was honored with the Goldsmith Book Award, and received the Best Book Award from both the Human Rights Section and the Information Technology and Politics Section of the American Political Science Association. She received her Ph.D. from Harvard University, an M.S. in Statistics from Stanford University, and a B.A. in Economics and International Relations from Stanford.
Advisory on Novel Coronavirus (COVID-19)
In accordance with university guidelines, if you (or a spouse/housemate) have returned from travel to mainland China in the last 14 days, we ask that you DO NOT come to campus until 14 days have passed since your return date and you remain symptom-free. For more information and updates, please refer to the Stanford Environmental Health & Safety website: https://ehs.stanford.edu/news/novel-coronavirus-covid-19
As a U.S.-China trade deal hangs in the balance and the world’s two largest economies are locked in a race for technological supremacy, concerns have arisen about China’s counterintelligence threat to the United States. In July 2019, FBI Director Christopher Wray told members of the U.S. Senate Judiciary Committee that China poses a more severe counterintelligence threat to the United States than any other country, and described that national security and economic espionage threat as “deep and diverse and wide and vexing.” He noted that the FBI has to contend not only with Chinese officials but also with “nontraditional collectors,” including Chinese scientists and students who are looking to steal American innovation. There are currently multiple legislative proposals in Congress, all of which, in one way or another, are aimed at limiting university collaboration with Chinese nationals and the education of Chinese nationals in “strategic” research fields by U.S. higher education institutions.
These legislative endeavors, however, argues Arthur Bienenstock, co-chair of the American Academy of Arts and Sciences’ Committee on International Scientific Partnerships, may endanger the U.S. science and technology workforce and limit the effectiveness of U.S. academic research, thus weakening the very fields the nation is most anxious to protect.
Bienenstock is also a member of the National Science Board, the governing body of the National Science Foundation, and former associate director for science of the White House Office of Science and Technology Policy. At Stanford, he is special assistant to the President for federal research policy, associate director of the Wallenberg Research Link, and professor emeritus of photon science. At a recent lecture hosted by APARC’s China Program, Bienenstock discussed some of the proposed legislation and federal acts regarding international scientific collaboration with China and their implications for the U.S. scientific workforce. He cautioned U.S. policymakers against an expansive interpretation of what constitutes “sensitive research” in strategic areas, such as artificial intelligence and quantum science, and offered a framework for determining when scientific research should be subject to greater control.
Indeed, said Bienenstock, “China is the only nation in the world that can and plans to challenge U.S. economic, military and ideological leadership” – a challenge that is partly based on its becoming a major scientific and technological power. He agreed that the concerns of FBI Director Wray and others are valid and must be considered carefully, but noted, based on his observations at informative sessions and a meeting with an FBI officer, that the overall number of documented misdeeds involving Chinese nationals is over 100 – far from a deep and wide threat – and that he has not seen evidence of significant student participation in those misdeeds.
We must come to terms with reality, claimed Bienenstock, presenting evidence that the United States is no longer the dominant funder of science and technology research; that Chinese nationals constitute a very significant portion of the U.S. workforce in computer science, engineering, and mathematics; and that the U.S. science and technology workforce is highly dependent on Chinese graduate students.
The United States must maintain and strengthen its scientific and technological efforts if it is to maintain a leadership position, Bienenstock said. To do so, he emphasized, U.S. universities must maintain their openness, and lawmakers, in turn, must thoughtfully understand the benefits of collaboration with Chinese scientists and engineers as well as keep the country attractive for Chinese students.
Today the Stanford Internet Observatory published a white paper on GRU online influence operations from 2014 to 2019. The authors conducted this research at the request of the United States Senate Select Committee on Intelligence (SSCI) and began with a data set consisting of social media posts provided to the Committee by Facebook. Facebook attributed the Pages and posts in this data set to the Main Directorate of the General Staff of the Armed Forces of the Russian Federation (Главное управление Генерального штаба Вооружённых сил Российской Федерации), known as the GU, or by its prior acronym GRU. It removed the content in or before 2018. The data provided by Facebook to SSCI consisted of 28 folders, each corresponding to at least one unique Facebook Page. These Pages were in turn tied to discrete GRU-attributed operations. Some of these Pages and operations were significant; others were so minor they scarcely had any data associated with them at all.
While some content related to these operations has been unearthed by investigative journalists, a substantial amount has not been seen by the public in the context of GRU attribution. The SIO white paper is intended to provide an overview of the GRU tactics used in these operations and to offer key takeaways about the distinct operational clusters observed in the data. Although the initial leads were provided by the Facebook data set, many of these Pages have ties to material that remains accessible on the broader internet, and we have attempted to aggregate and archive that broader expanse of data for public viewing and in service to further academic research.
Several key takeaways appear in the analysis:
Traditional narrative laundering operations updated for the internet age. Narrative laundering – the technique of moving a certain narrative from its state-run origins to the wider media ecosystem through the use of aligned publications, “useful idiots,” and, perhaps, witting participants – is an "active-measures" tactic with a long history. In this white paper we show how narrative laundering has been updated for the social-media era. The GRU created think tanks and media outlets to serve as initial content drops, and fabricated personas — fake online identities — to serve as authors. A network of accounts additionally served as distributors, posting the content to platforms such as Twitter and Reddit. In this way, GRU-created content could make its way from a GRU media property to an ideologically aligned real independent media website to Facebook to Reddit — a process designed to reduce skepticism in the original unknown blog.
The website for NBene Group, a GRU-attributed think tank. In one striking example of how this content can spread, an NBene Group piece about the annexation of Crimea was cited in an American military law journal article.
The emergence of a two-pronged approach: narrative and memetic propaganda by different entities belonging to a single state actor. The GRU aimed to achieve influence by feeding its narratives into the wider mass-media ecosystem with the help of think tanks, affiliated websites, and fake personas. This strategy is distinct from that of the Internet Research Agency, which invested primarily in a social-first memetic (i.e., meme-based) approach to achieve influence, including ad purchases, direct engagement with users on social media, and content crafted specifically with virality in mind. Although the GRU conducted operations on Facebook, it either did not view maximizing social audience engagement as a priority or did not have the wherewithal to do so. To the contrary, it appears to have designed its operation to achieve influence in other ways.
A deeper understanding of hack-and-leak operations. GRU hack-and-leak operations are well known. This tactic — which has been described in detail in the Mueller Report — had a particularly remarkable impact on the 2016 U.S. Election, but the GRU conducted other hack-and-leak operations between 2014 and 2019 as well. One of the salient characteristics of this tactic is the need for a second party (such as Wikileaks, for example) to spread the results of a hack-and-leak operation, since it is not effective to leak hacked documents without having an audience. In this white paper we analyze the GRU’s methods for disseminating the results of its hack-and-leak operations. While its attempts to do so through its own social media accounts were generally ineffective, it did have success in generating media attention (including on RT), which led in turn to wider coverage of the results of these operations. Fancy Bear’s own Facebook posts about its hack-and-leak attack on the World Anti-Doping Agency (WADA), for example, received relatively little engagement, but write-ups in Wired and The Guardian ensured that its operations got wider attention.
Some of the most noteworthy operations we analyze in this white paper include:
Inside Syria Media Center (ISMC), a media entity that was created as part of the Russian government’s multifarious influence operation in support of Syrian President Bashar al-Assad. Although ISMC claimed to be “[c]ollecting information about the Syrian conflict from ground-level sources,” its actual function was to boost Assad and discredit Western forces and allies, including the White Helmets. Our analysis of the ISMC Facebook Page shows exceptionally low engagement — across 5,367 posts the average engagement was 0.1 Likes per post — but ISMC articles achieved wider attention when its numerous author personas (there were six) reposted them on other sites. We counted 142 unique domains that reposted ISMC articles. This process happened quickly; a single article could be reposted on many alternative media sites within days of initial publication on the ISMC website. We observe that, while both Internet Research Agency (IRA) and GRU operations covered Syria, the IRA only rarely linked to the ISMC website.
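The engagement figures reported above (average Likes per post, unique reposting domains) reduce to simple aggregations over post and URL records. The sketch below illustrates the kind of computation involved, using invented field names and toy data rather than the actual SIO data set:

```python
# Hedged illustration of the engagement metrics described above: mean Likes
# per post on a Page, and the count of distinct domains that republished its
# articles. All data and field names here are hypothetical.
from urllib.parse import urlparse

def average_likes(posts):
    """Mean Likes per post; `posts` is a list of dicts with a 'likes' key."""
    if not posts:
        return 0.0
    return sum(p["likes"] for p in posts) / len(posts)

def unique_repost_domains(repost_urls):
    """Count distinct hostnames among the URLs that republished an article."""
    return len({urlparse(u).netloc for u in repost_urls})

# Toy sample mirroring the sparse engagement pattern described (not real data).
posts = [{"likes": 0}] * 9 + [{"likes": 1}]
print(average_likes(posts))  # 0.1 Likes per post on this toy sample

urls = [
    "https://example-alt-media-1.com/article-a",
    "https://example-alt-media-2.net/article-b",
    "https://example-alt-media-1.com/article-c",
]
print(unique_repost_domains(urls))  # 2 unique domains
```

The actual analysis in the white paper was of course run over the full Facebook-provided records; this sketch only shows the shape of the two metrics.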
The Quora profile for Sophie Mangal, one of the personas that authored and distributed ISMC content.
APT-28, also known as Fancy Bear, is a cyber-espionage group identified by the Special Counsel Investigation as GRU Units 26165 and 74455. This entity has conducted cyber attacks in connection with a number of Russian strategic objectives, including, most famously, the DNC hack of 2016. The Facebook data set provided to SSCI included multiple Pages related to hacking operations, including DCLeaks and Fancy Bears Hack Team, a sports-related Page. This activity included a hack-and-leak attack on WADA, almost certainly in retaliation for WADA’s recommendation that the International Olympic Committee ban the Russian team from the 2016 Olympics in Rio de Janeiro. The documents leaked (and, according to WADA, altered) by Fancy Bears purported to show that athletes from EU countries and the US were cheating by receiving spurious therapeutic use exemptions. Our analysis of these Pages looks at their sparse engagement on social platforms and the stark contrast to the substantial coverage in mainstream press. It also notes the boosting of such operations by Russian state-linked Twitter accounts, RT, and Sputnik.
CyberBerkut, Committee of Soldiers’ Mothers of Ukraine, and “For an Exit from Ukraine,” a network of Pages targeting Ukraine, which has been subject to an aggressive disinformation campaign by the Russian government since the Euromaidan revolution in 2014. Our investigation of these Pages highlights the degree to which apparently conflicting messages can be harnessed together in support of a single overarching objective. (This also suggests a parallel with the tactics of the IRA, which frequently boosted groups on opposite sides of contentious issues.) Among the multiple, diverging operational vectors we analyzed were attempts to sow disinformation intended to delegitimize the government in Kyiv; to leverage a Ukrainian civil-society group to undermine public confidence in the army; and to convince Ukrainians that their country was “without a future” and that they were better off emigrating to Poland. While the Pages we analyzed worked with disparate themes, their content was consistently aimed at undermining the government in Kyiv and aggravating tensions between Eastern and Western Ukraine.
Considered as a whole, the data provided by Facebook — along with the larger online network of websites and accounts that these Pages are connected to — reveal a large, multifaceted operation set up with the aim of artificially boosting narratives favorable to the Russian state and disparaging Russia’s rivals. Over a period when Russia was engaged in a wide range of geopolitical and cultural conflicts, including Ukraine, MH17, Syria, the Skripal Affair, the Olympics ban, and NATO expansion, the GRU turned to active measures to try to make the narrative playing field more favorable. These active measures included social-media tactics that were repetitively deployed but seldom successful when executed by the GRU. When the tactics were successful, it was typically because they exploited mainstream media outlets; leveraged purportedly independent alternative media that acts, at best, as an uncritical recipient of contributed pieces; and used fake authors and fake grassroots amplifiers to articulate and distribute the state’s point of view. Given that many of these tactics are analogs of those used in Cold-War influence operations, it seems certain that they will continue to be refined and updated for the internet era, and are likely to be used to greater effect.
The linked white paper and its conclusions are in part based on the analysis of social-media content that was provided to the authors by the Senate Select Committee on Intelligence under the auspices of the Committee’s Technical Advisory Group, whose Members serve to provide substantive technical and expert advice on topics of importance to ongoing Committee activity and oversight. The findings, interpretations, and conclusions presented herein are those of the authors, and do not necessarily represent the views of the Senate Select Committee on Intelligence or its Membership.
“Our dystopian present is your dystopian future if nothing significant is done,” cautioned Ressa, urging the Stanford community to pressure technology platforms and social media to stop disinformation spread.
“This is an existential moment for global power structures, turned upside down by technology. When journalists globally are under attack, democracy is under attack.” With these words, the internationally esteemed investigative journalist and press freedom champion Maria Ressa, winner of the 2019 Shorenstein Journalism Award, opened her keynote address at a lunchtime ceremony held at Stanford on October 21.
Ressa knows first-hand the terrifying reality of continuously being subject to online attacks and politically motivated attempts by the government to silence and intimidate. As CEO and executive editor of Rappler, she has led the Philippine independent news platform in shining critical light on the Duterte administration's policies and actions. President Duterte in turn has made no secret of his dislike for Ressa and Rappler, accusing the platform of carrying "fake news." Ressa has been arrested twice this year, accused of corporate tax evasion and of violating securities laws, and slapped with charges of cyber libel for a report that was published before the libel law came into effect. Since Duterte’s election in summer 2016, the Philippine government has filed at least 11 cases and investigations against Ressa and Rappler.
“And all because I’m a journalist,” she said.
Speaking at the Shorenstein Award’s eighteenth annual panel discussion, Ressa detailed the devastating effects that disinformation has had on democracy and societal cohesion in the Philippines. She vividly explained why each and every one of us should be gravely concerned about the breaking down of the information ecosystem in a country halfway around the world. The Philippines, she said, is a case study of how attacks on truth and facts rip the heart out of civic engagement and gradually kill democracy, “a death by a thousand cuts.”
Ressa was joined on the panel by Stanford’s Larry Diamond, senior fellow at the Hoover Institution and the Freeman Spogli Institute for International Studies, and Raju Narisetti, director of the Knight-Bagehot Fellowship in Economics and Business Journalism and professor of professional practice at Columbia Journalism School, who also serves on the selection committee for the Shorenstein Journalism Award. Shorenstein APARC’s Southeast Asia Program Director Donald K. Emmerson chaired the discussion.
Modern authoritarians follow a familiar playbook, noted Ressa, for they know well that “If you can make people believe lies are the facts, then you can control them.” Their first step is to lie all the time. The second is to claim that their opponents and the journalists are the ones who lie. Eventually, everyone looks around and asks, "What is truth?" And when there is no truth, resistance is impossible.
Ressa went on to describe detailed examples of patriotic trolling in the Philippines, that is, how state-sponsored online hate and harassment campaigns silence and intimidate journalists and others who voice criticism of the Duterte administration. Instead of censoring, she said, state agents now flood the information ecosystem with lies, blurring the line between fact and fiction. These information operations are conducted through the weaponization of technology and social media platforms, first and foremost Facebook. Ressa’s team at Rappler uses network analysis methods to unveil the flow and spread of online disinformation and harassment campaigns on Facebook and from there to other platforms as well as traditional and state media.
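The network analysis Rappler describes can be pictured as building a directed "who shares whose content" graph and ranking accounts by how many distinct sources they amplify. The following is a minimal sketch of that idea; all account names and share records are invented, and Rappler's actual methodology is certainly more elaborate:

```python
# A hedged sketch of share-network analysis: build a directed graph from
# (sharer, original_poster) records and surface accounts that amplify the
# most distinct sources. Account names below are hypothetical.
from collections import defaultdict

def build_share_graph(shares):
    """Map each sharer to the set of sources whose content it reshared."""
    graph = defaultdict(set)
    for sharer, source in shares:
        graph[sharer].add(source)
    return graph

def top_amplifiers(graph, n=3):
    """Rank accounts by how many distinct sources they amplify."""
    return sorted(graph, key=lambda a: len(graph[a]), reverse=True)[:n]

# Toy share records (invented): one account reshares many distinct sources.
shares = [
    ("troll_A", "state_page"), ("troll_A", "blog_1"), ("troll_A", "blog_2"),
    ("troll_B", "state_page"), ("user_C", "blog_1"),
]
graph = build_share_graph(shares)
print(top_amplifiers(graph, n=1))  # ['troll_A']
```

In practice this kind of analysis would also weight edges by share volume and timing; the out-degree ranking here is only the simplest signal for spotting coordinated amplification.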
Ressa urged the packed audience of campus and community members to remember that “Without facts you cannot have truth, without truth you cannot have trust, and without any of these three democracy as we know it is dead. The public sphere is dead […] our Philippine dystopian present is your dystopian future, if nothing significant is done.”
She closed her keynote by pleading: “Please push tech platforms and social media to do something to stop the lies from spreading. Lies laced with anger and hate spread faster than facts. Fight for your rights.”
No Ministry of Truth
Is the attack on truth a technological problem, and can it have a technological solution? It's naïve, said Diamond, to think that there is a purely technological solution or that we can rein in the alarming developments in the Philippines and elsewhere without addressing their technological elements and the economic incentives underlying these elements. “There has to be a macro political element of response,” argued Diamond, “which obviously has to involve advanced liberal democracies condemning and drawing boundaries around the murderous authoritarianism of Rodrigo Duterte.”
Left to right: Donald K. Emmerson, Maria Ressa, Raju Narisetti, Larry Diamond.
Narisetti emphasized the need to look at the problem and its potential solutions holistically and bear in mind that solutions must come from multiple areas. “We must remember that technology has value, but it has no values. It's a matter of who is using it and how they're using it.” And while we certainly don't want Facebook to be the Ministry of Truth, continued Narisetti, by no means do we want Congress to take on that role. He pointed to specific possible regulatory solutions, such as insisting Facebook enable its users to port their complete data outside of the platform if they wish to do so, or establishing a system of data and privacy courts.
Commitment to Journalism that Courageously Seeks Accuracy
The Shorenstein Journalism Award, which is sponsored by APARC, was presented to Ressa at a private evening ceremony. “You would be hard pressed to find a person whose work more fully embodies the ideals that define journalism than Maria Ressa,” said James Hamilton, Stanford’s Hearst Professor of Communication, Chair of the Department of Communication, and Director of the Stanford Journalism Program, who also serves on the selection committee for the award. Shorenstein APARC Director Gi-Wook Shin joined Hamilton in co-presenting Ressa the award.
The Shorenstein Award, which carries a cash prize of $10,000, recognizes accomplished journalists committed to critical reporting on and exploring the complexities of Asia through their writing. It alternates between honoring recipients from the West, who mainly address American audiences, and recipients from Asia, who often work on the frontline of the battle for freedom of the press in their countries. Established in 2002, the award honors the legacy of APARC benefactor Mr. Walter H. Shorenstein, who was passionate about promoting both excellence in journalism and a deeper understanding of Asia.
If America frames its response to Russia and China as one of “civilizational struggle,” Diamond says, Vladimir Putin and Xi Jinping will only get stronger. Listen and read here.