Authors: Stanford Internet Observatory
News Type: Blogs

On November 5, 2020, Facebook announced the takedown of two networks:

  • 25 Pages, 31 profiles, and 2 Instagram accounts affiliated with the Muslim Brotherhood. According to Facebook, the operation originated in Egypt, Turkey, and Morocco. The network targeted audiences both in Egypt directly and across the Middle East and North Africa.

  • 11 Pages, 6 Groups, 33 profiles, and 47 Instagram accounts that originated in Afghanistan and Iran and targeted Farsi/Dari speakers in Afghanistan.

Facebook suspended these networks not due to the content of their posts, but for coordinated inauthentic behavior: the Facebook Pages and Groups were managed by fake accounts. A summary of the two networks is below, and full reports are linked at the top of this post. Facebook's announcement is here.

Muslim Brotherhood-Linked Takedown

Many social media disinformation campaigns—and associated takedowns—have been linked to Saudi Arabia, the UAE, and Egypt. But we believe this is the first takedown linked to actors on the opposing, pro-Muslim Brotherhood side. Interestingly, this network appears markedly similar to networks from anti-Muslim Brotherhood disinformation campaigns on Facebook: both sides create professional branding for Pages and share polished, original videos. We conjecture that, like the anti-Muslim Brotherhood operations, this network may be linked to a digital marketing firm in Egypt; such firms have a particular signature.

Key takeaways:

  • This was a complex cross-platform operation with a substantial audience. The Facebook Pages the Stanford Internet Observatory analyzed had nearly 1.5 million followers. The operation was also linked to many Twitter accounts, YouTube channels, and Telegram channels, many of which had large followings. 
  • The network created and shared hundreds of original videos and dozens of original songs. 
  • While most of the profiles linked to this operation were stub accounts, one of the profiles ran a social media advertising agency in Egypt.
  • Central messages included:
    • Praise for the Muslim Brotherhood-supporting governments of Turkey and Qatar.
    • Criticism of the Saudi Arabian, Egyptian, and UAE governments.
    • Accusations that the Egyptian government had imprisoned and killed Muslim Brotherhood supporters, and that supporters were being detained in multiple countries.
  • The Facebook Page names were direct and unsubtle. Examples include Tunisia Against the UAE, Hearts with Qatar, and YemenAgainstKSAUEA [sic].

 

The People and Hearts with Qatar Facebook Page

Takedown of Accounts Originating in Afghanistan and Iran

 

This operation produced content oriented towards women, including promoting women's rights. It also promoted the narrative that Iran is a good ally for Afghanistan, highlighted the brutality of the Taliban, and criticized Pakistani and American intervention in Afghanistan. 

Key takeaways:

  • The network aimed to appeal to women. Fifty-three percent of the Instagram accounts had profile photos of women (compared to 11% with photos of men), and the network shared stories about the educational success of women. It is possible the intent was to undermine the peace negotiations between the Afghan government and the Taliban; the Taliban is known for restricting women’s rights.  

  • The network shared messaging that criticized Pakistan, the Taliban, and the U.S. Content about the U.S. criticized U.S. President Donald Trump in general, and specifically claimed that Trump was colluding with the Taliban. The network praised the role Iran could play in Afghan peace negotiations.

  • Posts from accounts purporting to be in Afghanistan used the term Farsi to describe their language, instead of Dari, often explicitly saying they were proud to use the term Farsi. The two languages are very similar; Iran uses the term Farsi and Afghanistan uses the term Dari.

  • The Facebook profiles and Instagram accounts were as actively involved in pushing particular narratives as the Pages and Groups, and in many cases had larger followings.

  • We identified five Telegram channels linked to this Facebook/Instagram operation. 

 

A post from the Afghanistan My Passion Instagram account using a fabricated photo. The Taliban are shown praying for their “partner” Trump.

Authors: Jody Berger, Graham Webster
News Type: Q&As

Graham Webster leads the DigiChina Project, which translates and explains Chinese technology policy for an English-language audience so that debates and decisions regarding cyber policy are factual and based on primary sources of information.  

Housed within Stanford’s Program on Geopolitics, Technology, and Governance (GTG) and in partnership with New America, DigiChina and its community of experts have already published more than 80 translations and analyses of public policy documents, laws, regulations and political speeches and are creating an open-access knowledge base for policy-makers, academics, and members of the tech industry who need insight into the choices China makes regarding technology.

Q. Why is this work important?

A lot of tech is produced in China, so it’s important to understand its policies. And in Washington, D.C., you hear a lot of people say, “Well, you can’t know what China’s doing on tech policy. It’s all a secret.” But while China’s political system is often opaque, if you happen to read Chinese, there’s a lot that’s publicly available and can explain what the Chinese government is thinking and planning.

With our network of experts, DigiChina works at the intersection of two policy challenges. One is how we deal with high technology and the questions around economic competitiveness, personal autonomy, and the security risks that our dependence on tech creates.

The other challenge is, from a US government, business or values perspective, what needs to be done about the increased prominence and power of the Chinese government and its economic, technological and military capabilities.

These questions cut across tech sectors from IT infrastructure to data-driven automation, and cutting-edge developments in quantum technology, biotech, and other fields of research.

Q: How was DigiChina started?

A number of us were working at different organizations, think tanks, consultancies and universities and we all had an interest in explaining the laws and the bureaucratic language to others who aren’t Chinese policy specialists or don’t have the language skills.  

We started working informally at first and then reached out to New America, which is an innovative type of think tank combining research, innovation, and policy thinking on challenges arising during this time of rapid technological and social change. Under the New America umbrella, and through partnerships with the Leiden Asia Centre, a leading European research center based at the University of Leiden, and the Ethics and Governance of Artificial Intelligence Initiative at Harvard and MIT, we were able to build out the program and increase the number of experts in our network.

Q: Who is involved in DigiChina and what types of expertise do you and others bring to the project?

More than 40 people have contributed to DigiChina publications so far, and it’s a pretty diverse group. There are professors and think tank scholars, students and early-career professionals, and experienced government and industry analysts. Everyone has a different part of the picture they can contribute, and we reach out to other experts both in China and around the world when we need more context.

As for me, I was working at Yale Law School’s China Center when I was roped into what would become DigiChina and had spent several years in Beijing and New Haven working more generally on US-China relations and Track 2 dialogues, where experts and former officials from the two countries meet to take on tough problems. As a journalist and graduate student, I had long studied technology and politics in China, and I took on a coordinating role with DigiChina as I turned back to that pursuit full time.

Stanford is an ideal home because the university is a powerhouse in Chinese studies and an epicenter of global digital development.
Graham Webster
Editor in Chief, DigiChina Project

Q. Are there other organizations involved as well?

We have a strong tie to the Leiden Asia Centre at the University of Leiden in the Netherlands, where one of DigiChina’s cofounders, Rogier Creemers, is a professor, and where staff and student researchers have contributed to existing and forthcoming work. We coordinate with a number of other groups on translations, and the project benefits greatly from the time and knowledge contributed by employees of various institutions. I hope that network will increasingly be a resource for contributors and their colleagues.

The project is currently supported by the Ford Foundation, which works to strengthen democratic values, promote international cooperation and advance human achievement around the world. A generous grant from Ford will keep the lights on for two years, giving us the ability to build our open-access resource and, with further fundraising, the potential to bring on more in-house editorial and research staff.

We hope researchers and policy thinkers, regardless of their approaches or ideologies, can use our translations to engage with the real and messy evolution of Chinese tech policy.
Graham Webster
Editor in Chief, DigiChina Project

Q. Do you have plans to grow the project?

We are working to build an accessible online database so researchers and scholars can review primary source documents in both the original Chinese and in English. And we are working toward a knowledge base with background entries on key institutions, legal concepts, and phrases so that a broader audience can situate things like Chinese legal language in their actual context. Providing access to this information is especially important now and in the near future, whether we have a second Trump Administration or a Biden Administration in the United States.

On any number of policy challenges, effective measures are going to depend on going beyond caricatures like an “AI arms race,” “cyber authoritarianism,” or “decoupling,” which provide useful frameworks for debate but can tend to prejudge the outcomes of a huge number of developments. We hope researchers and policy thinkers, regardless of their approaches or ideologies, can use this work to engage with the real and messy evolution of Chinese tech policy.

Subtitle: Webster explains how DigiChina makes Chinese tech policy accessible for English speakers

News Type: News

On September 29, the APARC China Program hosted Thomas Fingar and Stephen Stedman for the program “Rebuilding International Institutions.” The program, which was moderated by China Program Director Jean Oi, examined the future of international institutions such as the United Nations (UN), World Trade Organization (WTO), and World Health Organization (WHO) in our evolving global political landscape. While Fingar and Stedman acknowledged that such institutions facilitated attainment of unprecedented peace and prosperity after WWII, they also asked difficult questions: Are these institutions still adequate? And if not, how will we change them?

Shorenstein APARC Fellow Thomas Fingar kicked off the session by asking whether US-China tensions would impede cooperation on major global challenges, or whether those challenges were so serious as to render such rivalries immaterial. Perhaps the most obvious example of such a crisis is the current COVID-19 pandemic. The efforts to curb the virus’s spread, not only by individual countries but also by international organizations like the WHO, have proven largely inadequate. According to Fingar, our existing institutions need to be reformed or supplemented to deal with these types of threats. However, such an overhaul of our international systems will be difficult, he says.

How, then, will we go about such a massive project? Stephen Stedman, Deputy Director at Stanford’s Center on Democracy, Development, and the Rule of Law (CDDRL), responded by explaining that the current failure of international cooperation makes such undertakings tough. Globalization has been a double-edged sword: On one hand, more contact, perhaps inherently, leads to increased tension. The resurgence of traditional notions of sovereignty in 2010, kickstarted by the opposition of countries like Russia and China to what was seen as UN overreaching, has led to a reduction of international cooperation overall. On the other hand, Fingar posits that our interconnectedness may force us toward cooperation despite rivalries as we face more and more transnational threats. International institutions create rules to organize and manage our many interconnected relationships so that we can deal with our problems effectively and reduce friction.

Stedman also pointed to the upcoming US elections and the major impact their outcome will have on how these problems are addressed—or not. In the last four years, the United States has pulled back significantly from international institutions and agreements, leaving a gap that China has started to fill. Furthermore, despite the US’s retreat from international responsibility, the country remains a critical actor in global initiatives. China’s embrace of a global leadership role is not inherently negative, but its future relationship with the US will need to be “managed in a way that you get greater cooperation and not just paralysis.” Stedman says it is likely that progress will need to be made on a bilateral front in order to have productive conversations about international issues with China.

Concluding on an optimistic note, Fingar voiced his hope that the current tensions and negative perceptions between rivals might ultimately “be mitigated by success in dealing with a common problem,” because “experience does shape perceptions.”

A video recording of this program is available upon request. Please contact Callista Wells, China Program Coordinator at cvwells@stanford.edu with any inquiries.


This fourth workshop of the Stanford Internet Observatory's End-to-End Encryption series will focus on civil society concerns in an encrypted world, examining strategies and tradeoffs for making encrypted platforms safer without compromising security.

Login details will be provided to registered participants.
This event was originally scheduled for November 16 and has been moved to December 7. Registration link forthcoming.

Webinar

Workshops

THE EMERGENCE OF A DIGITAL SPHERE where public debate takes place raises profound questions about the connection between online information and polarization, echo chambers, and filter bubbles. Does the information ecosystem created by social media companies support the conditions necessary for a healthy democracy? Is it different from other media? These are particularly urgent questions as the United States approaches a contentious 2020 election during the COVID-19 pandemic.

The influence of technology and AI-curated information on America’s democratic process is being examined in the eight-week Stanford University course, “Technology and the 2020 Election: How Silicon Valley Technologies Affect Elections and Shape Democracy.” This issue brief focuses on the class session on “Echo Chambers, Filter Bubbles, and Polarization,” with guest experts Joan Donovan and Joshua Tucker.

Publication Type: Policy Briefs
Authors: Marietje Schaake, Rob Reich

This event is open to Stanford undergraduate students only. 


The Center on Democracy, Development, and the Rule of Law (CDDRL) will be accepting applications from eligible juniors in any university department who are interested in writing their senior thesis on a subject touching upon democracy, economic development, and rule of law (DDRL). The application period opens on January 11, 2021, and runs through February 12, 2021. CDDRL faculty and current honors students will be present to discuss the program and answer any questions.

For more information on the Fisher Family CDDRL Honors Program, please click here.

Please note that all CDDRL events are scheduled in the Pacific Time Zone.

 Online, via Zoom: REGISTER

CDDRL
Encina Hall, C152
616 Jane Stanford Way
Stanford, CA 94305-6055

(650) 725-2705 (650) 724-2996
Senior Fellow at the Freeman Spogli Institute for International Studies
Professor, by courtesy, of Political Science
PhD

Stephen Stedman is a Freeman Spogli senior fellow at the Center on Democracy, Development, and the Rule of Law and FSI, an affiliated faculty member at CISAC, and professor of political science (by courtesy) at Stanford University. 

In 2011-12 Professor Stedman served as the Director for the Global Commission on Elections, Democracy, and Security, a body of eminent persons tasked with developing recommendations on promoting and protecting the integrity of elections and international electoral assistance. The Commission is a joint project of the Kofi Annan Foundation and International IDEA, an intergovernmental organization that works on international democracy and electoral assistance. In 2003-04 Professor Stedman was Research Director of the United Nations High-level Panel on Threats, Challenges and Change and was a principal drafter of the Panel’s report, A More Secure World: Our Shared Responsibility. In 2005 he served as Assistant Secretary-General and Special Advisor to the Secretary-General of the United Nations, with responsibility for working with governments to adopt the Panel’s recommendations for strengthening collective security and for implementing changes within the United Nations Secretariat, including the creation of a Peacebuilding Support Office, a Counter Terrorism Task Force, and a Policy Committee to act as a cabinet to the Secretary-General. His most recent book, with Bruce Jones and Carlos Pascual, is Power and Responsibility: Creating International Order in an Era of Transnational Threats (Washington DC: Brookings Institution, 2009).

Affiliated faculty at the Center for International Security and Cooperation
Deputy Director, CDDRL

Encina Hall, C150
616 Jane Stanford Way
Stanford, CA 94305

Center Fellow, Freeman Spogli Institute for International Studies

Didi Kuo is a Center Fellow at the Freeman Spogli Institute for International Studies (FSI) at Stanford University. She is a scholar of comparative politics with a focus on democratization, corruption and clientelism, political parties and institutions, and political reform. She is the author of The Great Retreat: How Political Parties Should Behave and Why They Don’t (Oxford University Press, forthcoming) and Clientelism, Capitalism, and Democracy: the rise of programmatic politics in the United States and Britain (Cambridge University Press, 2018).

She has been at Stanford since 2013 as the manager of the Program on American Democracy in Comparative Perspective and is co-director of the Fisher Family Honors Program at CDDRL. She was an Eric and Wendy Schmidt Fellow at New America and is a non-resident fellow with the Carnegie Endowment for International Peace. She received a PhD in political science from Harvard University, an MSc in Economic and Social History from Oxford University, where she studied as a Marshall Scholar, and a BA from Emory University.

Associate Director for Research, CDDRL
Authors: Carly Miller, Tara Kheradpir, Renee DiResta, Abuzar Royesh
News Type: Blogs

Summary

On October 8, 2020, Twitter announced the takedown of an operation it attributed to Iran. The actors compromised real accounts to tweet Black Lives Matter content, and additionally created fake accounts with bios stolen from other real accounts on Twitter. The Stanford Internet Observatory analyzed the accounts’ behaviors, tweets, and images related to this relatively small operation. The activity observed in this dataset — compromising Twitter accounts, then leveraging them to disseminate messaging — appears to be a departure from prior Iran-linked activity. As we will discuss, the effort reflected in this dataset involved unrefined messaging and ineffective dissemination. Other Iran-linked malign actors involved in prior influence operations appear to have been far more adept at creating fake social media personas for the purpose of disseminating propaganda. Topically, as SIO has previously noted, verified accounts run by Iranian regime leaders and state media have previously waded into the Black Lives Matter conversation, posting support for protestors, portraying American police as fascists, and declaring that the US government is guilty of human rights violations and racism.

Key Takeaways

  • In total, 104 accounts were utilized in the Iran-linked operation. Of this dataset, 81 accounts were real accounts that had been hacked for the purposes of the operation. The remaining 23 accounts were fake accounts that Twitter assessed were created by the malign actor, and incorporated elements of theft such as bios stolen from real accounts. 

  • The compromised accounts were hacked to tweet content about Black Lives Matter, using the hashtag #black_lives_matter. These tweets contained images or memes to advance a pro-BLM narrative. 

  • Tweets from the fake accounts were broader in focus and covered multiple topics. These accounts tweeted in English and Arabic. A subset of English tweets by accounts claiming to be journalists shared news articles critical of Donald Trump, though the accounts also retweeted the US President’s account. Tweets in Arabic focused on two individuals critical of the Kuwaiti government, alleging they abused or trafficked drugs.

Tactics, Techniques, and Procedures

There were two distinct tactics observed in the dataset: hacking real accounts, and creating fake personas with stolen biographies. 

The majority of accounts, 81, fell into the first category: account theft. The compromised accounts sent all but two of their tweets on June 3, 2020; these tweets consisted of the hashtag #black_lives_matter and an image, such as a photo of an alleged Black Lives Matter protest. The hacked accounts were primarily located in the United States and came from a variety of communities: there were DJs, gamers, and accounts that role-played as vampires and werewolves. We observed tweets, Facebook posts and website updates from accounts that were compromised, some of which noted that they had been hacked, and others stating that they had regained control of their accounts. We do not name these accounts for privacy reasons.

The second category centered around 23 fake personas created by the malign actor. This subset of accounts followed similar naming conventions: a first name followed by a last name or last initial and a series of numbers. Twenty-two of the 23 accounts were created on one of three dates in January 2020 — January 8, January 11, and January 25 — and each batch had similarities in its listed location and profession. For example, eight accounts created on January 25, 2020, had bios claiming to be journalists. The final account, which shared the same naming convention, was created on January 22. Unlike the other accounts, this account’s bio was in Arabic and it tweeted mostly in Arabic.
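The naming and creation-date patterns described above (a first name plus trailing digits, and accounts created in a few tight batches) are the kind of signal an analyst can flag programmatically. The sketch below is illustrative only; the account list, the second handle, and the regular expression are assumptions rather than SIO's actual data or detection logic.

```python
# Illustrative sketch only: flag handles that fit a "FirstnameLastname12345"-style
# convention and find creation dates shared by several accounts (batch creation).
# The account list and regex are hypothetical, not the takedown dataset.
import re
from collections import Counter
from datetime import date

NAME_PLUS_DIGITS = re.compile(r"^[A-Z][a-z]+[A-Za-z]*\d{3,}$")

accounts = [
    ("Jennife55580973", date(2020, 1, 25)),   # persona named in the dataset
    ("JohnD48211377", date(2020, 1, 25)),     # hypothetical example
    ("arts_and_cats", date(2014, 6, 2)),      # hypothetical unrelated account
]

# Handles matching the first-name-plus-digits naming convention.
suspicious_handles = [handle for handle, _ in accounts if NAME_PLUS_DIGITS.match(handle)]

# Creation dates shared by two or more accounts, suggesting batch creation.
date_counts = Counter(created for _, created in accounts)
batch_dates = [d for d, n in date_counts.items() if n >= 2]

print(suspicious_handles)  # ['Jennife55580973', 'JohnD48211377']
print(batch_dates)         # [datetime.date(2020, 1, 25)]
```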

 

Figure 1: Account creation dates (aggregated by month) for all users in the dataset, both hacked and fake. The graph shows a spike in user creations in January 2020, when the adversary created its fake persona accounts.

 

The majority of the fake accounts stole their bios from real accounts on Twitter, most of which belonged to users located in the United Kingdom. The stolen bios ranged from those of government officials, to a primary school teacher, to TV presenters and journalists. A majority of the real accounts whose bios were stolen had large followings (the largest had 508,800 followers), though some were relatively small (101 followers). It is unclear why these individuals were selected.

The most active fake persona, Jennife55580973, and other accounts to a lesser extent, tweeted extensively at other accounts asking them to “please follow me back.” This behavior suggests this cluster was in the early phase of network building. The accounts mentioned each other in their tweets, creating a retweet ring of fake journalist personas that we discuss in more detail below.

Figure 2: The network of interactions (retweets, replies and mentions) initiated by the fake persona accounts (filtered to remove single-instance activity). This graph demonstrates the interconnectedness of the fake account network, while also showing that the accounts branched out to prominent figures such as @realdonaldtrump.
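An interaction graph like the one in Figure 2 can be rebuilt from a tweet dataset with standard network tooling. The sketch below is a minimal illustration, not SIO's actual pipeline: it assumes the interactions have already been extracted into (source, target, kind) tuples, builds a directed graph weighted by how often each pair interacted, and then drops single-instance edges, mirroring the filtering described in the caption.

```python
# Minimal sketch, assuming `interactions` holds (source, target, kind) tuples
# extracted from the tweets (retweets, replies, mentions). Not SIO's actual code.
import networkx as nx

def build_interaction_graph(interactions):
    """Directed graph weighted by interaction counts, filtered to repeated activity."""
    g = nx.DiGraph()
    for source, target, kind in interactions:
        if g.has_edge(source, target):
            g[source][target]["weight"] += 1
            g[source][target]["kinds"].add(kind)
        else:
            g.add_edge(source, target, weight=1, kinds={kind})

    # Remove single-instance edges, then any accounts left with no connections.
    single = [(u, v) for u, v, w in g.edges(data="weight") if w < 2]
    g.remove_edges_from(single)
    g.remove_nodes_from(list(nx.isolates(g)))
    return g

# Hypothetical usage: the repeated interaction survives, the one-off retweet does not.
sample = [
    ("Jennife55580973", "fake_journalist_2", "mention"),
    ("Jennife55580973", "fake_journalist_2", "retweet"),
    ("Jennife55580973", "realdonaldtrump", "retweet"),
]
graph = build_interaction_graph(sample)
print(graph.number_of_nodes(), graph.number_of_edges())  # 2 1
```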

Themes

#black_lives_matter

The 81 hacked accounts in the dataset were very minimally utilized: they tweeted the hashtag “#black_lives_matter,” along with an image. There were several variants of George Floyd's face edited to include an overlay of Joaquin Phoenix-style Joker makeup, which we have elected not to include. This analogy may have been meant to show support for protesters, or to encourage a more violent revolution as was depicted in the 2019 movie Joker; the purpose was somewhat unclear given the limited text. Other images shared in the tweets suggested a pro-BLM narrative, such as an image of Martin Luther King Jr. with the text, “even viruses know we are all made the same: STOP RACISM”. 

Figure 3: Examples of additional content sent through the compromised accounts. Left: An image of what seems to be a Black Lives Matter protest, with the phrase “I can not breathe,” a slight variant of the more commonly used phrase “I can’t breathe.” Right: An image of Martin Luther King Jr. and “even viruses know we are all made the same,” an anti-racism and pro-Black Lives Matter message.

A Ring of Fake Journalists

Among the fake accounts created by the Iran-affiliated entity, eight personas claimed to be journalists. The narrative focus of the journalist ring differed based on the language of the tweets. The majority of tweets from this network were in English; a small number were in Arabic and Urdu.

English tweets

The tweets in English shared links to articles from CNN, the New York Times and the Wall Street Journal; the text of the tweets was the opening of the article (or copied from the news outlet’s own tweet about the article). The English tweets from this journalist ring were substantively different from the tweets from the “non-journalist” fake accounts, which didn’t share article links or focus primarily on events in the news. 

English tweets from the fake journalist accounts did not seem to center around a single dominant narrative; the accounts tweeted about global political events, COVID-19, President Donald Trump and George Floyd. The articles shared by the fake accounts were usually critical of Trump. For example, one account shared an article from CNN about how the governor of Illinois had labeled President Trump “a miserable failure.” At the same time, the accounts also retweeted a small but notable number of tweets from sources that tend to be more favorable to Donald Trump, such as Candace Owens, Charlie Kirk and Donald Trump himself.

The narratives incorporating Black Lives Matter and George Floyd were pro-BLM. For example, one account shared a CNN article and quote from Michelle Obama about the George Floyd protests: 

“Race and racism is a reality that so many of us grow up learning to just deal with. But if we ever hope to move past it, it can’t just be on people of color to deal with it,” former first lady Michelle Obama said while speaking out on George Floyd’s death https://t.co/B3ZVUa0fL3

Another account copied CNN’s tweet that claimed GOP senators had asked the President to take a “far more compassionate approach amid the deep unrest” after George Floyd’s death. 

Arabic tweets

The content in Arabic from the fake journalists was mostly retweets of tweets critical of two individuals: Hani Hussein (هاني حسين), a former Kuwaiti oil minister who resigned in 2013 due to tensions with Parliament, and Abdul Hamid Dashti (عبدالحميد دشتي), a Shiite former Kuwaiti MP who was sentenced in absentia in 2016 and 2017 for remarks and tweets insulting Saudi Arabia and Bahrain. The targeting of these two individuals was not exclusive to the fake journalist accounts — all 23 fake accounts posted tweets in Arabic about the two individuals. The tweets aimed to paint Hussein and Dashti in a negative light. For example, the tweets spread rumors that Hani Hussein was abusing drugs, and referred to Abdul Hamid Dashti as a mercenary and a degenerate thief. Similarly, there was a subset of copypasta tweets — tweets that shared verbatim text — from the fake accounts in this dataset responding to real accounts on Twitter with tweets critical of Dashti, as seen in the tweet below.

Figure 4: Tweet exchange between a fake account and a real Twitter account.

Tweet Reply Translation: “The first residency dealer in Kuwait is the mercenary Abdul Hamid Dashti and his son Talal ‘Al-Nibras.’ This is a letter from the Iranian embassy to the Kuwaiti Ministry of Interior complaining about his trafficking in residences. The funny thing is that this degenerate thief, at the behest of the son of the tanker thief Khalifa, who stole Kuwait during the invasion, looks up to us! https://t.co/wyACCtdkUo

The image in the tweet is allegedly a letter sent from the Iranian Embassy to the Kuwaiti Ministry of the Interior complaining that Dashti was ‘trafficking’ in government-funded residences, though the tweet did not specify what he was trafficking. Some of these tweets were retweets of politicians in Pakistan and Kuwait, such as Dr. Basel Al-Sabah, Kuwait’s Minister of Health, who tweeted comments defending the state’s response to the pandemic in the face of “malicious rumors and propaganda.”

Given their scattered focus, it is unclear from the content what the adversary’s intended purpose was for the fake accounts. 

Conclusion

Overall, this was a relatively small network in the early stages of its activity that was detected and removed before it had a chance to have a significant impact. Given Iranian-affiliated actors’ prior willingness to overtly leverage #BLM hashtags to denigrate American society and political leaders, it is somewhat surprising to see an Iran-linked adversary doing the work to compromise accounts to simply use them to send out a handful of #black_lives_matter tweets. While the narratives may not have been singularly focused across both the hacked and fake accounts, this operation provides researchers more insight into the different tactics and strategies leveraged to weigh in on political conversations and narratives on Twitter.   

Authors: Stanford Internet Observatory
News Type: Blogs

In this post and in the attached report we investigate a U.S. domestic astroturfing operation that Facebook attributed to social media consultancy Rally Forge. The use of marketing agencies and social media consultancies to carry out influence operations has become increasingly common worldwide; hiring an agency may afford the client plausible deniability in the event of discovery. Rally Forge served a range of clients including Turning Point Action and Inclusive Conservation Group. In September 2020 it was implicated in an operation uncovered by the Washington Post, in which teenagers appeared to be posting comments using fake accounts. Twitter and Facebook each took down a subset of the accounts immediately, and Facebook opened an investigation. This report provides an assessment of content taken down as a result of that investigation.

Key takeaways

  • Rally Forge-linked accounts engaged in astroturfing operations on multiple platforms, posting “vox populi” comments about hunting or politics that appeared grassroots but were in fact paid commentary, much of it from people who did not exist.

  • The fake accounts were operated over a period of several years, with a period of dormancy that appeared to coincide with the end of the 2018 election cycle. These fake accounts occasionally pivoted in their expressed political beliefs and topical focus. 

  • Most of the Rally Forge-linked Page audiences were small, and comments that its personas left did not appear to generate much response. However, several of its Pages did achieve significant reach at their peak.

Examples of content and replies from the hunting-advocacy astroturfing operation carried out by the network.

While there are bright lines when it comes to foreign influence operations, policies are fuzzier when considering U.S.-based actors, particularly as networked activism tactics are used by an increasing variety of domestic political and issue-based advocacy groups. In this case, the vast majority of the content that Facebook attributed to the Rally Forge network consisted of fairly standard political and issue-based advocacy work. However, there was additionally extensive inauthenticity in the form of fake accounts, which attempted to manipulate the public by way of astroturfed comment activity.

The Rally Forge network across Facebook, Instagram, and Twitter. Twitter is the upper left, Instagram lower right, and Facebook the smaller cluster between them. Large nodes are individual actors in the network, and the small nodes surrounding them are “interests”—Pages and accounts that they follow. Accounts are increasingly likely to be “real” as they stray from the center of the clusters and have additional diverse interests. Subcommunities of the three major social networks, represented with different colors, are inferred by modularity. For example, the two darkest colored clusters on the lower right are of Inclusive Conservation Group leadership.
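The modularity-based subcommunities mentioned in the caption can be approximated with off-the-shelf community-detection algorithms. The snippet below is a hedged sketch rather than the analysis behind the figure: the edge list is hypothetical, and it uses networkx's greedy modularity maximization to partition an undirected account/interest graph into clusters.

```python
# Sketch of modularity-based community detection on an account/"interest" graph.
# The edge list is hypothetical; the real Rally Forge graph is not reproduced here.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

edges = [
    ("actor_a", "hunting_page"), ("actor_b", "hunting_page"), ("actor_a", "actor_b"),
    ("actor_c", "politics_page"), ("actor_d", "politics_page"), ("actor_c", "actor_d"),
]
g = nx.Graph(edges)

# Partition nodes into subcommunities that maximize modularity, analogous to
# the colored clusters in the network figure above.
for i, community in enumerate(greedy_modularity_communities(g)):
    print(f"community {i}: {sorted(community)}")
```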

Subtitle: An astroturfing operation involving fake accounts (some with AI-generated images) that left thousands of comments on Facebook, Twitter, and Instagram. Clients included Turning Point Action and Inclusive Conservation Group, a pro-hunting organization.


THE 2020 ELECTION IN THE UNITED STATES will take place on November 3 in the midst of a global pandemic, economic downturn, social unrest, political polarization, and a sudden shift in the balance of power in the U.S Supreme Court. On top of these issues, the technological layer impacting the public debate, as well as the electoral process itself, may well determine the election outcome. The eight-week Stanford University course, “Technology and the 2020 Election: How Silicon Valley Technologies Affect Elections and Shape Democracy,” examines the influence of technology on America’s democratic process, revealing how digital technologies are shaping the public debate and the election.


 

Publication Type: Policy Briefs
Authors: Marietje Schaake, Rob Reich


The US 2020 elections have been fraught with challenges, including the rise of "fake news” and threats of foreign intervention emerging after 2016, ongoing concerns of racially-targeted disinformation, and new threats related to the COVID-19 pandemic. Digital technologies will have played a more important role in the 2020 elections than ever before.

On November 4th at 10am PST, join the team at the Stanford Cyber Policy Center, in collaboration with the Freeman Spogli Institute, Stanford’s Institute for Human-Centered Artificial Intelligence, and the Stanford Center on Philanthropy and Civil Society, for a day-after discussion of the role of digital technologies in the 2020 Elections. Speakers will include Nathaniel Persily, faculty co-director of the Cyber Policy Center and Director of the Program on Democracy and the Internet; Marietje Schaake, the Center’s International Policy Director and International Policy Fellow at Stanford’s Institute for Human-Centered Artificial Intelligence; Alex Stamos, Director of the Cyber Center’s Internet Observatory and former Chief Security Officer at Facebook and Yahoo; Renee DiResta, Research Manager at the Internet Observatory; Andrew Grotto, Director of the Center’s Program on Geopolitics, Technology, and Governance; and Rob Reich, Faculty Director of the Center for Ethics in Society, in conversation with Kelly Born, the Center’s Executive Director.

Please note that we will also have a YouTube livestream available for potential overflow or for anyone having issues connecting via Zoom: https://youtu.be/H2k62-JCAgE

 


Renée DiResta is the former Research Manager at the Stanford Internet Observatory. She investigates the spread of malign narratives across social networks, and assists policymakers in understanding and responding to the problem. She has advised Congress, the State Department, and other academic, civic, and business organizations, and has studied disinformation and computational propaganda in the context of pseudoscience conspiracies, terrorism, and state-sponsored information warfare.

You can see a full list of Renée's writing and speeches on her website: www.reneediresta.com or follow her @noupside.

 

Former Research Manager, Stanford Internet Observatory

CISAC
Stanford University
Encina Hall, C428

Stanford, CA 94305-6165

(650) 723-9866
Andrew Grotto

Andrew J. Grotto is a research scholar at the Center for International Security and Cooperation at Stanford University.

Grotto’s research interests center on the national security and international economic dimensions of America’s global leadership in information technology innovation, and its growing reliance on this innovation for its economic and social life. He is particularly interested in the allocation of responsibility between the government and the private sector for defending against cyber threats, especially as it pertains to critical infrastructure; cyber-enabled information operations as both a threat to, and a tool of statecraft for, liberal democracies; opportunities and constraints facing offensive cyber operations as a tool of statecraft, especially those relating to norms of sovereignty in a digitally connected world; and governance of global trade in information technologies.

Before coming to Stanford, Grotto was the Senior Director for Cybersecurity Policy at the White House in both the Obama and Trump Administrations. His portfolio spanned a range of cyber policy issues, including defense of the financial services, energy, communications, transportation, health care, electoral infrastructure, and other vital critical infrastructure sectors; cybersecurity risk management policies for federal networks; consumer cybersecurity; and cyber incident response policy and incident management. He also coordinated development and execution of technology policy topics with a nexus to cyber policy, such as encryption, surveillance, privacy, and the national security dimensions of artificial intelligence and machine learning. 

At the White House, he played a key role in shaping President Obama’s Cybersecurity National Action Plan and driving its implementation. He was also the principal architect of President Trump’s cybersecurity executive order, “Strengthening the Cybersecurity of Federal Networks and Critical Infrastructure.”

Grotto joined the White House after serving as Senior Advisor for Technology Policy to Commerce Secretary Penny Pritzker, advising Pritzker on all aspects of technology policy, including Internet of Things, net neutrality, privacy, national security reviews of foreign investment in the U.S. technology sector, and international developments affecting the competitiveness of the U.S. technology sector.

Grotto worked on Capitol Hill prior to the Executive Branch, as a member of the professional staff of the Senate Select Committee on Intelligence. He served as then-Chairman Dianne Feinstein’s lead staff overseeing cyber-related activities of the intelligence community and all aspects of NSA’s mission. He led the negotiation and drafting of the information sharing title of the Cybersecurity Act of 2012, which later served as the foundation for the Cybersecurity Information Sharing Act that President Obama signed in 2015. He also served as committee designee first for Senator Sheldon Whitehouse and later for Senator Kent Conrad, advising the senators on oversight of the intelligence community, including of covert action programs, and was a contributing author of the “Committee Study of the Central Intelligence Agency’s Detention and Interrogation Program.”

Before his time on Capitol Hill, Grotto was a Senior National Security Analyst at the Center for American Progress, where his research and writing focused on U.S. policy toward nuclear weapons: how to prevent their spread, and their role in U.S. national security strategy.

Grotto received his JD from the University of California at Berkeley, his MPA from Harvard University, and his BA from the University of Kentucky.

Research Scholar, Center for International Security and Cooperation
Former Director, Program on Geopolitics, Technology, and Governance
Stanford Law School Neukom Building, Room N230 Stanford, CA 94305
650-725-9875
James B. McClatchy Professor of Law at Stanford Law School
Senior Fellow, Freeman Spogli Institute
Professor, by courtesy, Political Science
Professor, by courtesy, Communication

Nathaniel Persily is the James B. McClatchy Professor of Law at Stanford Law School, with appointments in the departments of Political Science, Communication, and FSI.  Prior to joining Stanford, Professor Persily taught at Columbia and the University of Pennsylvania Law School, and as a visiting professor at Harvard, NYU, Princeton, the University of Amsterdam, and the University of Melbourne. Professor Persily’s scholarship and legal practice focus on American election law or what is sometimes called the “law of democracy,” which addresses issues such as voting rights, political parties, campaign finance, redistricting, and election administration. He has served as a special master or court-appointed expert to craft congressional or legislative districting plans for Georgia, Maryland, Connecticut, New York, North Carolina, and Pennsylvania.  He also served as the Senior Research Director for the Presidential Commission on Election Administration. In addition to dozens of articles (many of which have been cited by the Supreme Court) on the legal regulation of political parties, issues surrounding the census and redistricting process, voting rights, and campaign finance reform, Professor Persily is coauthor of the leading election law casebook, The Law of Democracy (Foundation Press, 5th ed., 2016), with Samuel Issacharoff, Pamela Karlan, and Richard Pildes. His current work, for which he has been honored as a Guggenheim Fellow, Andrew Carnegie Fellow, and a Fellow at the Center for Advanced Study in the Behavioral Sciences, examines the impact of changing technology on political communication, campaigns, and election administration.  He is codirector of the Stanford Program on Democracy and the Internet, and Social Science One, a project to make available to the world’s research community privacy-protected Facebook data to study the impact of social media on democracy.  He is also a member of the American Academy of Arts and Sciences, and a commissioner on the Kofi Annan Commission on Elections and Democracy in the Digital Age.  Along with Professor Charles Stewart III, he recently founded HealthyElections.Org (the Stanford-MIT Healthy Elections Project) which aims to support local election officials in taking the necessary steps during the COVID-19 pandemic to provide safe voting options for the 2020 election. He received a B.A. and M.A. in political science from Yale (1992); a J.D. from Stanford (1998) where he was President of the Stanford Law Review, and a Ph.D. in political science from U.C. Berkeley in 2002.   

CV
Rob Reich

Marietje Schaake is a non-resident Fellow at Stanford’s Cyber Policy Center and at the Institute for Human-Centered AI. She is a columnist for the Financial Times and serves on a number of not-for-profit boards as well as the UN's High Level Advisory Body on AI. Between 2009 and 2019 she served as a Member of the European Parliament, where she worked on trade, foreign, and tech policy. She is the author of The Tech Coup.


 

Non-Resident Fellow, Cyber Policy Center
Fellow, Institute for Human-Centered Artificial Intelligence