-

* Please note: all CISAC events are scheduled in the Pacific Time Zone.

 

Register in advance for this webinar: https://stanford.zoom.us/webinar/register/8416226562432/WN_WLYcdRa6T5Cs1MMdmM0Mug

 

About the Event: Is there a place for illegal or nonconsensual evidence in security studies research, such as leaked classified documents? What is at stake, and who bears the responsibility, for determining source legitimacy? Although massive unauthorized disclosures by WikiLeaks and its kindred may excite qualitative scholars with policy revelations, and quantitative researchers with big-data suitability, they are fraught with methodological and ethical dilemmas that the discipline has yet to resolve. I argue that the hazards from this research—from national security harms, to eroding human-subjects protections, to scholarly complicity with rogue actors—generally outweigh the benefits, and that exceptions and justifications need to be articulated much more explicitly and forcefully than is customary in existing work. This paper demonstrates that the use of apparently leaked documents has proliferated over the past decade, and appeared in every leading journal, without being explicitly disclosed and defended in research design and citation practices. The paper critiques incomplete and inconsistent guidance from leading political science and international relations journals and associations; considers how other disciplines from journalism to statistics to paleontology address the origins of their sources; and elaborates a set of normative and evidentiary criteria for researchers and readers to assess documentary source legitimacy and utility. Fundamentally, it contends that the scholarly community (researchers, peer reviewers, editors, thesis advisors, professional associations, and institutions) needs to practice deeper reflection on sources’ provenance, greater humility about whether to access leaked materials and what inferences to draw from them, and more transparency in citation and research strategies.

View Written Draft Paper

 

About the Speaker: Christopher Darnton is a CISAC affiliate and an associate professor of national security affairs at the Naval Postgraduate School. He previously taught at Reed College and the Catholic University of America, and holds a Ph.D. in Politics from Princeton University. He is the author of Rivalry and Alliance Politics in Cold War Latin America (Johns Hopkins, 2014) and of journal articles on US foreign policy, Latin American security, and qualitative research methods. His International Security article, “Archives and Inference: Documentary Evidence in Case Study Research and the Debate over U.S. Entry into World War II,” won the 2019 APSA International History and Politics Section Outstanding Article Award. He is writing a book on the history of US security cooperation in Latin America, based on declassified military documents.

Virtual Seminar

Christopher Darnton, Associate Professor of National Security Affairs, Naval Postgraduate School
Seminars
Riana Pfefferkorn
Blogs

When we’re faced with a video recording of an event—such as an incident of police brutality—we can generally trust that the event happened as shown in the video. But that may soon change, thanks to the advent of so-called “deepfake” videos that use machine learning technology to show a real person saying and doing things they haven’t.

This technology poses a particular threat to marginalized communities. If deepfakes cause society to move away from the current “seeing is believing” paradigm for video footage, that shift may negatively impact individuals whose stories society is already less likely to believe. The proliferation of video recording technology, from bystander cell phones to police body cameras, has fueled a reckoning with police violence in the United States. But in a world of pervasive, compelling deepfakes, the burden of proving a video’s authenticity may shift onto the videographer, a development that would further undermine attempts to seek justice for police violence. To counter deepfakes, high-tech tools meant to increase trust in videos are in development, but these technologies, though well-intentioned, could end up being used to discredit already marginalized voices.

(Content Note: Some of the links in this piece lead to graphic videos of incidents of police violence. Those links are denoted in bold.)

Recent police killings of Black Americans caught on camera have inspired massive protests that have filled U.S. streets in the past year. Those protests endured for months in Minneapolis, where former police officer Derek Chauvin was convicted this week in the murder of George Floyd, a Black man. During Chauvin’s trial, another police officer killed Daunte Wright just outside Minneapolis, prompting additional protests as well as the officer’s resignation and arrest on second-degree manslaughter charges. She supposedly mistook her gun for her Taser—the same mistake alleged in the fatal shooting of Oscar Grant in 2009, by an officer whom a jury later found guilty of involuntary manslaughter (but not guilty of a more serious charge). All three of these tragic deaths—George Floyd, Daunte Wright, Oscar Grant—were documented in videos that were later used (or, in Wright’s case, seem likely to be used) as evidence at the trials of the police officers responsible. Both Floyd’s and Wright’s deaths were captured by the respective officers’ body-worn cameras, and multiple bystanders with cell phones recorded the Floyd and Grant incidents. Some commentators credit a 17-year-old Black girl’s video recording of Floyd’s death for making Chauvin’s trial happen at all.

The growth of the movement for Black lives in the years since Grant’s death in 2009 owes much to the rise in the availability, quality, and virality of bystander videos documenting police violence, but this video evidence hasn’t always been enough to secure convictions. From Rodney King’s assailants in 1992 to Philando Castile’s shooter 25 years later, juries have often declined to convict police officers even in cases where wanton police violence or killings are documented on video. Despite their growing prevalence, police bodycams have had mixed results in deterring excessive force or impelling accountability. That said, bodycam videos do sometimes make a difference, helping to convict officers in the killings of Jordan Edwards in Texas and Laquan McDonald in Chicago. Chauvin’s defense team pitted bodycam footage against the bystander videos employed by the prosecution, and lost.

What makes video so powerful? Why does it spur crowds to take to the streets and lawyers to showcase it in trials? It’s because seeing is believing. Shot from angles that differ from the officers’ point of view, bystander footage paints a fuller picture of what happened. Two people (on a jury, say, or watching a viral video online) might interpret a video in two different ways. But they’ve generally been able to take for granted that the footage is a true, accurate record of something that really happened.

That might not be the case for much longer. It’s now possible to use artificial intelligence to generate highly realistic “deepfake” videos showing real people saying and doing things they never said or did, such as the recent viral TikTok videos depicting an ersatz Tom Cruise. You can also find realistic headshots of people who don’t exist at all on the creatively-named website thispersondoesnotexist.com. (There’s even a cat version.) 

While using deepfake technology to invent cats or impersonate movie stars might be cute, the technology has more sinister uses as well. In March, the Federal Bureau of Investigation issued a warning that malicious actors are “almost certain” to use “synthetic content” in disinformation campaigns against the American public and in criminal schemes to defraud U.S. businesses. The breakneck pace of deepfake technology’s development has prompted concerns that techniques for detecting such imagery will be unable to keep up. If so, the high-tech cat-and-mouse game between creators and debunkers might end in a stalemate at best. 

If it becomes impossible to reliably prove that a fake video isn’t real, a more feasible alternative might be to focus instead on proving that a real video isn’t fake. So-called “verified at capture” or “controlled-capture” technologies attach additional metadata to imagery at the moment it’s taken, to verify when and where the footage was recorded and reveal any attempt to tamper with the data. The goal of these technologies, which are still in their infancy, is to ensure that an image’s integrity will stand up to scrutiny. 
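To make the idea concrete, here is a minimal, purely illustrative sketch of a verified-at-capture flow. It assumes a hypothetical device-held Ed25519 signing key and the third-party Python "cryptography" package; the function names and metadata fields are invented for illustration, and real systems rely on hardware-backed keys, certificates, and emerging standards such as C2PA rather than this simplified scheme.

```python
# Illustrative sketch only, not any vendor's actual protocol.
import hashlib
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey


def capture(image_bytes, timestamp, location, device_key):
    """At capture time: hash the image, bundle it with metadata, sign the bundle."""
    record = {
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "timestamp": timestamp,
        "location": location,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    return record, device_key.sign(payload)


def verify(image_bytes, record, signature, device_public_key):
    """Later: re-hash the image and check both the hash and the signature."""
    if hashlib.sha256(image_bytes).hexdigest() != record["image_sha256"]:
        return False  # the image itself was altered after capture
    payload = json.dumps(record, sort_keys=True).encode()
    try:
        device_public_key.verify(signature, payload)
        return True
    except InvalidSignature:
        return False  # the metadata was altered, or the signature is not genuine


device_key = Ed25519PrivateKey.generate()
frame = b"...raw image or video-frame bytes..."
record, sig = capture(frame, "2021-04-22T18:30:00Z", "44.98,-93.26", device_key)

print(verify(frame, record, sig, device_key.public_key()))                # True
print(verify(frame + b"tampered", record, sig, device_key.public_key()))  # False
```

In a scheme like this, any edit to the pixels or the metadata breaks the check. But the check says nothing about footage from devices that never carried such a key, which is exactly the equity concern discussed below.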

Photo and video verification technology holds promise for confirming what’s real in the age of “fake news.” But it’s also cause for concern. In a society where guilty verdicts for police officers remain elusive despite ample video evidence, is even more technology the answer? Or will it simply reinforce existing inequities? 

The “ambitious goal” of adding verification technology to smartphone chipsets necessarily entails increasing the cost of production. Once such phones start to come onto the market, they will be more expensive than lower-end devices that lack this functionality. And not everyone will be able to afford them. Black Americans and poor Americans have lower rates of smartphone ownership than whites and high earners, and are more likely to own a “dumb” cell phone. (The same pattern holds true with regard to educational attainment and urban versus rural residence.) Unless and until verification technology is baked into even the most affordable phones, it risks replicating existing disparities in digital access. 

That has implications for police accountability, and, by extension, for Black lives. Primed by societal concerns about deepfakes and “fake news,” juries may start expecting high-tech proof that a video is real. That might lead them to doubt the veracity of bystander videos of police brutality if they were captured on lower-end phones that lack verification technology. Extrapolating from current trends in phone ownership, such bystanders are more likely to be members of marginalized racial and socioeconomic groups. Those are the very people who, as witnesses in court, face an uphill battle in being afforded credibility by juries. That bias, which reared its ugly head again in the Chauvin trial, has long outlived the 19th-century rules that explicitly barred Black (and other non-white) people from testifying for or against white people on the grounds that their race rendered them inherently unreliable witnesses. 

In short, skepticism of “unverified” phone videos may compound existing prejudices against the owners of those phones. That may matter less in situations where a diverse group of numerous eyewitnesses record a police brutality incident on a range of devices. But if there is only a single bystander witness to the scene, the kind of phone they own could prove significant.

The advent of mobile devices empowered Black Americans to force a national reckoning with police brutality. Ubiquitous, pocket-sized video recorders allow average bystanders to document the pandemic of police violence. And because seeing is believing, those videos make it harder for others to continue denying the problem exists. Even with the evidence thrust under their noses, juries keep acquitting police officers who kill Black people. Chauvin’s conviction this week represents an exception to recent history: Between 2005 and 2019, of the 104 law enforcement officers charged with murder or manslaughter in connection with a shooting while on duty, 35 were convicted.

The fight against fake videos will complicate the fight for Black lives. Unless it is equally available to everyone, video verification technology may not help the movement for police accountability, and could even set it back. Technological guarantees of videos’ trustworthiness will make little difference if they are accessible only to the privileged, whose stories society already tends to believe. We might be able to tech our way out of the deepfakes threat, but we can’t tech our way out of America’s systemic racism. 

Riana Pfefferkorn is a research scholar at the Stanford Internet Observatory.

Daphne Keller
Blogs

I am a huge fan of transparency about platform content moderation. I’ve considered it a top policy priority for years, and written about it in detail (with Paddy Leerssen, who also wrote this great piece about recommendation algorithms and transparency). I sincerely believe that without it, we are unlikely to correctly diagnose current problems or arrive at wise legal solutions.

So it pains me to admit that I don’t really know what “transparency” I’m asking for. I don’t think many other people do, either. Researchers and public interest advocates around the world can agree that more transparency is better. But, aside from people with very particular areas of interest (like political advertising), almost no one has a clear wish list. What information is really important? What information is merely nice to have? What are the trade-offs involved?

That imprecision is about to become a problem, though it’s a good kind of problem to have. A moment of real political opportunity is at hand. Lawmakers in the US, Europe, and elsewhere are ready to make some form of transparency mandatory. Whatever specific legal requirements they create will have huge consequences. The data, content, or explanations they require platforms to produce will shape our future understanding of platform operations, and our ability to respond — as consumers, as advocates, or as democracies. Whatever disclosures the laws don’t require may never happen.

It’s easy to respond to this by saying “platforms should track all the possible data, we’ll see what’s useful later!” Some version of this approach might be justified for the very biggest “gatekeeper” or “systemically important” platforms. Of course, making Facebook or Google save all that data would be somewhat ironic, given the trouble they’ve landed in by storing similar not-clearly-needed data about their users in the past. (And the more detailed data we store about particular takedowns, the likelier it is to be personally identifiable.)

For any platform, though, we should recognize that the new practices required for transparency reporting come at a cost. That cost might include driving platforms to adopt simpler, blunter content rules in their Terms of Service. That would reduce their expenses in classifying or explaining decisions, but presumably lead to overly broad or narrow content prohibitions. It might raise the cost of adding “social features” like user comments enough that some online businesses, like retailers or news sites, just give up on them. That would reduce some forms of innovation, and eliminate useful information for Internet users. For small and midsized platforms, transparency obligations (like other expenses related to content moderation) might add yet another reason to give up on competing with today’s giants, and accept an acquisition offer from an incumbent that already has moderation and transparency tools. Highly prescriptive transparency obligations might also drive de facto standardization and homogeneity in platform rules, moderation practices, and features.

None of these costs provides a reason to give up on transparency — or even to greatly reduce our expectations. But all of them are reasons to be thoughtful about what we ask for. It would be helpful if we could better quantify these costs, or get a handle on which kinds of transparency reporting are easier and which are harder to do in practice.

I’ve made a (very in-the-weeds) list of operational questions about transparency reporting, to illustrate some issues that are likely to arise in practice. I think detailed examples like these are helpful in thinking through both which kinds of data matter most, and how much precision we need within particular categories. For example, I personally want to know with great precision how many government orders a platform received, how it responded, and whether any orders led to later judicial review. But to me it seems OK to allow some margin of error for platforms that don’t have standardized tracking and queuing tools, and that as a result might modestly miscount TOS takedowns (whether in absolute numbers or as a percentage).

I’ll list that and some other recommendations below. But these “recommendations” are very tentative. I don’t know enough to have a really clear set of preferences yet. There are things I wish I could learn from technologists, activists, and researchers first. The venues where those conversations would ordinarily happen — and, importantly, where observers from very different backgrounds and perspectives could have compared the issues they see, and the data they most want — have been sadly reduced for the past year.

So here is my very preliminary list:

  • Transparency mandates should be flexible enough to accommodate widely varying platform practices and policies. Any de facto push toward standardization should be limited to the very most essential data.
  • The most important categories of data are probably the main ones listed in the DSA: number of takedowns, number of appeals, number of successful appeals. But as my list demonstrates, those all can become complicated in practice.
  • It’s worth taking the time to get legal transparency mandates right. That may mean delegating exact transparency rules to regulatory agencies in some countries, or conducting studies prior to lawmaking in others.
  • Once rules are set, lawmakers should be very reluctant to move the goalposts. If a platform (especially a smaller one) invests in rebuilding its content moderation tools to track certain categories of data, it should not have to overhaul those tools soon because of changed legal requirements.
  • We should insist on precise data in some cases, and tolerate more imprecision in others (based on the importance of the issue, platform capacity, etc.). And we should take the time to figure out which is which.
  • Numbers aren’t everything. Aggregate data in transparency reports ultimately just tell us what platforms themselves think is going on. To understand what mistakes they make, or what biases they may exhibit, independent researchers need to see the actual content involved in takedown decisions. (This in turn raises a slew of issues about storing potentially unlawful content, user privacy and data protection, and more.)

It’s time to prioritize. Researchers and civil society should assume we are operating with a limited transparency “budget,” which we must spend wisely — asking for the information we can best put to use, and factoring in the cost. We need better understanding of both research needs and platform capabilities to do this cost-benefit analysis well. I hope that the window of political opportunity does not close before we manage to do that.

Daphne Keller

Director of the Program on Platform Regulation


In a new blog post, Daphne Keller, Director of the Program on Platform Regulation at the Cyber Policy Center, looks at the need for transparency when it comes to content moderation and asks, what kind of transparency do we really want?

-

End-to-end encrypted (E2EE) communications have been around for decades, but the deployment of default E2EE on billion-user platforms has new implications for user privacy and safety. The deployment brings benefits to both individuals and society, but it also creates new risks, as long-existing patterns of messenger abuse can now flourish beyond the reach of automated or human review. New E2EE products raise the prospect of less understood risks by adding discoverability to encrypted platforms, allowing contact from strangers and increasing the risk of certain types of abuse. This workshop will place particular emphasis on the platform benefits and risks that affect civil society organizations, especially in the Global South. Through a series of workshops and policy papers, the Stanford Internet Observatory is facilitating open and productive dialogue on this contentious topic to find common ground.

An important defining principle behind this workshop series is the explicit assumption that E2EE is here to stay. To that end, our workshops have set aside any discussion of exceptional-access (aka backdoor) designs. That debate has raged among industry, academic cryptographers, and law enforcement for decades, with little progress. We focus instead on less widely explored or implemented interventions that can reduce the harms of E2EE communication products.

Submissions for working papers and requests to attend will be accepted up to 10 days before the event. Accepted submitters will be invited to present or attend our upcoming workshops. 

SUBMIT HERE

Webinar

Workshops
-

Please note: the start time for this event has been moved from 3:00pm to 3:15pm.

Join FSI Director Michael McFaul in conversation with Richard Stengel, Under Secretary of State for Public Diplomacy and Public Affairs. They will address the role of entrepreneurship in creating stable, prosperous societies around the world.

Richard Stengel, Under Secretary of State for Public Diplomacy and Public Affairs, United States Department of State (Special Guest)

Encina Hall
616 Jane Stanford Way
Stanford, CA 94305-6055

Director, Freeman Spogli Institute for International Studies
Ken Olivier and Angela Nomellini Professor of International Studies, Department of Political Science
Peter and Helen Bing Senior Fellow, Hoover Institution

Michael McFaul is the Director of the Freeman Spogli Institute for International Studies, the Ken Olivier and Angela Nomellini Professor of International Studies in the Department of Political Science, and the Peter and Helen Bing Senior Fellow at the Hoover Institution. He joined the Stanford faculty in 1995. Dr. McFaul also serves as an International Affairs Analyst for NBC News and a columnist for The Washington Post. He served for five years in the Obama administration, first as Special Assistant to the President and Senior Director for Russian and Eurasian Affairs at the National Security Council at the White House (2009-2012), and then as U.S. Ambassador to the Russian Federation (2012-2014).

He has authored several books, most recently the New York Times bestseller From Cold War to Hot Peace: An American Ambassador in Putin’s Russia. Earlier books include Advancing Democracy Abroad: Why We Should, How We Can; Transitions To Democracy: A Comparative Perspective (eds. with Kathryn Stoner); Power and Purpose: American Policy toward Russia after the Cold War (with James Goldgeier); and Russia’s Unfinished Revolution: Political Change from Gorbachev to Putin. He is currently writing a book called Autocrats versus Democrats: Lessons from the Cold War for Competing with China and Russia Today.

He teaches courses on great power relations, democratization, comparative foreign policy decision-making, and revolutions.

Dr. McFaul was born and raised in Montana. He received his B.A. in International Relations and Slavic Languages and his M.A. in Soviet and East European Studies from Stanford University in 1986. As a Rhodes Scholar, he completed his D.Phil. in International Relations at Oxford University in 1991. His D.Phil. thesis was Southern African Liberation and Great Power Intervention: Towards a Theory of Revolution in an International Context.

Moderator
Panel Discussions
Steve Fyffe
News

The United States has a growing inventory of spent nuclear fuel from commercial power plants that continues to accumulate at reactor sites around the country.

In addition, legacy waste from U.S. defense programs remains at Department of Energy sites, mainly Hanford in Washington, the Savannah River Site in South Carolina, and Idaho National Laboratory.

But now the U.S. nuclear waste storage program is “frozen in place,” according to Rod Ewing, the Frank Stanton Professor in Nuclear Security at Stanford’s Center for International Security and Cooperation.

“The processing and handling of waste is slow to stopped and in this environment the pressure has become very great to do something.”

Currently, more than seventy thousand metric tons of spent nuclear fuel from civilian reactors is sitting in temporary aboveground storage facilities spread across 35 states, with many of the reactors that produced it shut down.  And U.S. taxpayers are paying the utilities billions of dollars to keep it there.

Meanwhile, the deep geologic repository where all that waste was supposed to go, at Yucca Mountain in Nevada, is now permanently on hold, after strong resistance from Nevada residents and politicians led by U.S. Senator Harry Reid.

The Waste Isolation Pilot Plant in Carlsbad, New Mexico, the world’s first geologic repository for transuranic waste, has been closed for over a year due to a release of radioactivity.

And other parts of the system, such as the vitrification plant at Hanford and the mixed oxide fuel plant at the Savannah River Site, are far behind schedule and over budget.

It’s a growing problem that’s unlikely to change this political season.

“The chances of dealing with it in the current Congress are pretty much nil, in my view,” said former U.S. Senator Jeff Bingaman (D-NM).

“We’re not going to see a solution to this problem this year or next year.”

The issue in Congress is generally divided along political lines, with Republicans wanting to move forward with the original plan to build a repository at Yucca Mountain, while Democrats support the recommendations of the Blue Ribbon Commission on America’s Nuclear Future to create a new organization to manage nuclear waste in the U.S. and start looking for a new repository location using an inclusive, consent-based process.

“One of the big worries that I have with momentum loss is loss of nuclear competency,” said David Clark, a Fellow at Los Alamos National Laboratory.

“So we have a whole set of workers who have been trained, and have been working on these programs for a number of years. When you put a program on hold, people go find something else to do.”

Meanwhile, other countries are moving ahead with plans for their own repositories, with Finland and Sweden leading the pack, leaving the U.S. lagging behind.

So Ewing decided to convene a series of high-level conferences, where leading academics and nuclear experts from around the world can discuss the issues in a respectful environment with a diverse range of stakeholders, including former politicians and policy makers, scientists, and representatives of Indian tribes and other affected communities.

“For many of these people and many of these constituencies, I’ve seen them argue at length, and it’s usually in a situation where a lot seems to be at stake and it’s very adversarial,” said Ewing.

“So by having the meeting at Stanford, we’ve all taken a deep breath, the program is frozen in place, nothing’s going to go anywhere tomorrow, we have the opportunity to sit and discuss things. And I think that may help.”

Former Senator Bingaman said he hoped the multidisciplinary meetings, known as the “Reset of Nuclear Waste Management Strategy and Policy” series, would help spur progress on this pressing problem.

“There is a high level of frustration by people who are trying to find a solution to this problem of nuclear waste, and there’s no question that the actions that we’ve taken thus far have not gotten us very far,” Bingaman said.

“I think that’s why this conference that is occurring is a good thing, trying to think through what are the problems that got us into the mess we’re in, and how do we avoid them in the future.”

The latest conference, held earlier this month, considered the question of how to structure a new nuclear waste management organization in the U.S.

Speakers from Sweden, Canada, and France brought an international perspective and shared lessons learned from their countries’ nuclear waste storage programs.

“The other…major programs, France, Switzerland, United Kingdom, Canada, they all reached a crisis point, not too different from our own,” said Ewing.

“And at this crisis point they had to reevaluate how they would go forward. They each chose a slightly different path, but having thought about it, and having selected a new path, one can also observe that their programs are moving forward.”

France has chosen to adopt a closed nuclear cycle to recycle spent fuel and reuse it to generate more electricity.

“It means that the amount of waste that we have to dispose of is only four percent of the total volume of spent nuclear fuel which comes out of the reactor,” said Christophe Poinssot of the French Atomic and Alternative Energy Commission.

“We also reduce the toxicity because…we are removing the plutonium. And finally, we are conditioning the final waste under the form of nuclear glass, the lifetime of which is very long, in the range of a million years in repository conditions.”

Clark said that Stanford was the perfect place to convene a multidisciplinary group of thought leaders in the field who could have a real impact on the future of nuclear waste storage policy.

“The beauty of a conference like this, and holding it at a place like Stanford University and CISAC, is that all the right people are here,” he said.

“All the people who are here have the ability to influence, through some level of authority and scholarship, and they’ll be able to take the ideas that they’ve heard back to their different offices and different organizations.  I think it will make a difference, and I’m really happy to be part of it.”

Ewing said it was also important to include students in the conversation.

“There’s a next generation of researchers coming online, and I want to save them the time that it took me to realize what the problems are,” Ewing said.

“By mixing students into this meeting, letting them interact with all the parties, including the distinguished scientists and engineers, I’m hoping it speeds up the process.”

Professor Ewing is already planning the next conference in the series, to be held next March, which will focus on the consent-based process that will be used to identify a new repository location within the U.S.

News

Russ Feingold, the former U.S. senator perhaps best known for pushing campaign finance reform, will spend the spring quarter at Stanford lecturing and teaching.

Feingold will be the Payne Distinguished Lecturer and will be in residence at the Freeman Spogli Institute for International Studies while teaching and mentoring graduate students in the Ford Dorsey Program in International Policy Studies and the Stanford Law School.

Feingold was recently the State Department’s special envoy to the Great Lakes Region of Africa and the Democratic Republic of Congo. He will bring his knowledge and longstanding interest in one of the most challenging, yet promising, places in Africa to campus with the cross-listed IPS and Law School course, “The Great Lakes Region of Africa and American Foreign Relations: Policy and Legal Implications of the Post-1994 Era.”

Feingold, a Wisconsin Democrat who served three terms in the Senate between 1993 and 2011, co-sponsored the Bipartisan Campaign Reform Act of 2002. Better known as the McCain-Feingold Act, the legislation regulated the roles of soft money contributions and issue ads in national elections.

-
Michael Albertus seminar

For millennia, land has been a symbol of wealth and privilege. But the true power of land ownership is even greater than we might think. Who owns the land determines whether a society will be equal or unequal, whether it will develop or decline, and whether it will safeguard or sacrifice its environment. Modern history has been defined by land reallocation on a massive scale. From the 1500s on, European colonial powers and new nation-states shifted indigenous lands into the hands of settlers. The 1900s brought new waves of land appropriation, from Soviet and Maoist collectivization to initiatives turning large estates over to family farmers. The shuffle continues today as governments vie for power and prosperity by choosing who should get land. Drawing on a career’s worth of original research and on-the-ground fieldwork, Land Power shows that choices about who owns the land have locked in poverty, sexism, racism, and climate crisis—and that what we do with the land today can change our collective fate.

ABOUT THE SPEAKER

Michael Albertus is a Professor of Political Science at the University of Chicago and the author of five books. His research examines democracy and dictatorship, inequality and redistribution, property rights, and civil conflict. His newest book, Land Power: Who Has It, Who Doesn't, and How That Determines the Fate of Societies, was published by Basic Books in January 2025. In addition to his books, Albertus is the author of nearly 30 peer-reviewed journal articles, including in flagship journals such as the American Journal of Political Science, Journal of Politics, and World Politics. The defining features of Albertus' work are his engagement with big questions and puzzles and his ability to join big data and cutting-edge research methods with original, deep on-the-ground fieldwork everywhere from government offices to archives and farm fields. He has conducted fieldwork throughout the Americas, southern Europe, South Africa, and elsewhere. His books and articles have won numerous awards and shifted conventional understandings of democracy, authoritarianism, and the consequences of how humans occupy and relate to the land.
 

Virtual and open to the public. Only those with an active Stanford ID and access to Room E008 in Encina Hall may attend in person.

Hesham Sallam

Michael Albertus, Professor of Political Science, University of Chicago
Seminars
-
Didi Kuo book launch

Once a centralizing force of the democratic process, political parties have eroded over the past fifty years. In her forthcoming book, The Great Retreat: How Political Parties Should Behave and Why They Don't, Didi Kuo explores the development of political parties as democracy expanded across the West in the nineteenth century. While parties have become professionalized and nationalized, they have lost the robust organizational density that made them effective representatives. After the Cold War, the combination of a neoliberal economic consensus, changes to campaign finance, and shifting party priorities weakened the party systems of Western democracies. In order for democracy to adapt to a new era of global capitalism, The Great Retreat makes the case for stronger parties in the form of socially embedded institutions with deep connections to communities and citizens.

Kuo will give a brief talk about the book before being joined by Jake Grumbach, Julia Azari, and Bruce Cain for a panel discussion.

SPEAKERS

Didi Kuo

Center Fellow, FSI

Didi Kuo is a Center Fellow at the Freeman Spogli Institute for International Studies at Stanford University. Her research interests include democratization, political parties, state-building, and the political economy of representation. She is the author of The Great Retreat: How Political Parties Should Behave and Why They Don't (Oxford University Press) and Clientelism, Capitalism, and Democracy: The Rise of Programmatic Politics in the United States and Britain (Cambridge University Press, 2018). She was an Eric and Wendy Schmidt Fellow at New America, is a non-resident scholar at the Carnegie Endowment for International Peace, and is an adjunct fellow at the Niskanen Center.
 


Jake Grumbach

Associate Professor, Goldman School of Public Policy at UC Berkeley
Panelist

Jake Grumbach is an associate professor at the Goldman School of Public Policy at UC Berkeley. He was previously associate professor of political science at the University of Washington and a postdoctoral fellow at the Center for the Study of Democratic Politics at Princeton.

He studies the political economy of the United States, with interests in democratic institutions, labor, federalism, racial and economic inequality, and statistical methods. His book, Laboratories Against Democracy (Princeton University Press 2022), investigates the causes and consequences of the nationalization of state politics.

Before graduate school, he earned a B.A. from Columbia University and worked as a public health researcher. Outside of academia, he's a nerd for 70s funk/soul and 90s hip hop, as well as a Warriors fan.
 

Julia Azari

Professor of Political Science, Marquette University
Panelist

Julia Azari is Professor of Political Science at Marquette University. An active public-facing scholar, she has published commentary on presidential and party politics in FiveThirtyEight, Politico, Vox, The New York Times, The Washington Post, MSNBC, and The Guardian.

Her scholarly work has appeared in journals such as The Forum, Perspectives on Politics, The Annals of the American Academy of Political and Social Science, Foreign Affairs, and Social Science History. She has contributed invited chapters to books published by the University Press of Kansas, University of Pennsylvania Press, Cambridge University Press, and University of Edinburgh Press. Azari is the author of Delivering the People’s Message: The Changing Politics of the Presidential Mandate (Cornell, 2014), co-editor of The Presidential Leadership Dilemma (SUNY, 2013), and co-editor of The Trump Legacy (under contract, University Press of Kansas).
 

Bruce Cain

Charles Louis Ducommun Professor, Humanities and Sciences; Director, Bill Lane Center for the American West; and Professor, Political Science
Moderator

Bruce E. Cain is a Professor of Political Science at Stanford University and Director of the Bill Lane Center for the American West. He received a B.A. from Bowdoin College (1970), a B.Phil. from Oxford University (1972) as a Rhodes Scholar, and a Ph.D. from Harvard University (1976). He taught at Caltech (1976-89) and UC Berkeley (1989-2012) before coming to Stanford. Professor Cain was Director of the Institute of Governmental Studies at UC Berkeley from 1990-2007 and Executive Director of the UC Washington Center from 2005-2012. He was elected to the American Academy of Arts and Sciences in 2000 and has won awards for his research (Richard F. Fenno Prize, 1988), teaching (Caltech, 1988 and UC Berkeley, 2003), and public service (Zale Award for Outstanding Achievement in Policy Research and Public Service, 2000). His areas of expertise include political regulation, applied democratic theory, representation, and state politics. Some of Professor Cain’s most recent publications include “Malleable Constitutions: Reflections on State Constitutional Design,” coauthored with Roger Noll in University of Texas Law Review, volume 2, 2009; “More or Less: Searching for Regulatory Balance,” in Race, Reform and the Political Process, edited by Heather Gerken, Guy Charles and Michael Kang, CUP, 2011; and “Redistricting Commissions: A Better Political Buffer?” in The Yale Law Journal, volume 121, 2012. He is currently working on a book about political reform in the US.
 


In-person: William J. Perry Conference Room (Encina Hall, 2nd floor, 616 Jane Stanford Way, Stanford)

Online: Via Zoom

Encina Hall, C150
616 Jane Stanford Way
Stanford, CA 94305

Center Fellow, Freeman Spogli Institute for International Studies

Didi Kuo is a Center Fellow at the Freeman Spogli Institute for International Studies (FSI) at Stanford University. She is a scholar of comparative politics with a focus on democratization, corruption and clientelism, political parties and institutions, and political reform. She is the author of The Great Retreat: How Political Parties Should Behave and Why They Don’t (Oxford University Press, forthcoming) and Clientelism, Capitalism, and Democracy: The Rise of Programmatic Politics in the United States and Britain (Cambridge University Press, 2018).

She has been at Stanford since 2013 as the manager of the Program on American Democracy in Comparative Perspective and is co-director of the Fisher Family Honors Program at CDDRL. She was an Eric and Wendy Schmidt Fellow at New America and is a non-resident fellow with the Carnegie Endowment for International Peace. She received a PhD in political science from Harvard University, an MSc in Economic and Social History from Oxford University, where she studied as a Marshall Scholar, and a BA from Emory University.

Didi Kuo
Lectures
-
Jasmine English seminar

Multiracial populations increased faster than any single race in the most recent U.S. census. However, we know little about how this shift might impact the political attitudes of monoracial Americans. Drawing on literatures in racial politics and social psychology, this article addresses this question by examining how learning about African Americans with Irish ancestry impacts Irish Americans' racial attitudes. Findings from a survey of Irish Americans show that identifying as Irish American predicts implicit and explicit prejudice toward Black Americans. However, a survey experiment reveals that correcting Irish Americans’ underestimations of the proportion of African Americans with Irish ancestry decreases racial prejudice among Irish Americans who identify as Irish American. In other words, learning about a shared ancestry with African Americans reduces racial prejudice among those Irish Americans most predisposed to prejudice. Open-ended survey responses offer two possible explanations: the potential for multiracialism to reduce racial essentialism and to increase perceptions of closeness across race.

ABOUT THE SPEAKER

Jasmine English is a Postdoctoral Fellow at Stanford’s Center on Democracy, Development, and the Rule of Law and will be an Assistant Professor of Political Science at Reed College (2025-).

Jasmine’s research focuses on political behavior, interracial solidarity, and the carceral state in American politics. Ongoing projects on interracial solidarity examine how learning about African Americans with Irish ancestry impacts Irish Americans’ racial attitudes. Her work on the carceral state includes projects on “carceral political discussion” and the impact of militarized policing on perceptions of the Black Lives Matter movement. Jasmine’s research has been published in the American Political Science Review and in Politics, Groups, and Identities, and is forthcoming in Perspectives on Politics. She has received several awards for her research, including best paper awards from the American Political Science Association (APSA) sections on Interpretive Methods and Qualitative and Multi-Method Research.

Jasmine received her PhD in Political Science from MIT in 2024. She received her BA from UCLA, where she graduated with degrees in Political Science and Economics. Jasmine is originally from Belfast, Northern Ireland.

Virtual and open to the public. Only those with an active Stanford ID and access to Room E008 in Encina Hall may attend in person.

Hesham Sallam

CDDRL Postdoctoral Fellow, 2024-25

Jasmine is a Postdoctoral Fellow at Stanford’s Center on Democracy, Development and the Rule of Law (2024-25) and will be an Assistant Professor of Political Science at Reed College (2025).

Her research focuses on political behavior, interracial solidarity, and the carceral state in American politics. Ongoing projects include “Dilemmas of Accommodation,” which explores the barriers to deliberation and political action in racially diverse churches, and a series of projects on “carceral political discussion.” Across her research, Jasmine uses ethnographic methods, in-depth interviews, original surveys, and experiments. Her work has been published in the American Political Science Review and Politics, Groups, and Identities.

Jasmine received her PhD in Political Science from MIT in 2024 and graduated with degrees in Political Science and Economics from UCLA in 2018. She is originally from Belfast, Northern Ireland.

Jasmine English
Seminars