Biosecurity

Abstract: Biotechnology is rapidly diffusing globally. Efficient methods for reading, writing, and editing genetic information, producing genetic diversity, and selecting for traits are becoming widely available. New communities of practice are gaining the power to act on timescales and geographies that fall outside current systems of oversight. Governments and scientific communities alike are struggling to respond appropriately.

Recent controversies have brought these issues to light: “gain-of-function” research may risk causing the very pandemics it aims to help mitigate; the development of “gene drives” may drastically alter ecosystems; and crowd-funded “CRISPR kits” are giving decentralized communities access to powerful new tools. Meanwhile, a series of accidents at the nation’s premier biological labs and recent struggles in responding to Ebola and Zika are raising concerns about the capacity to respond to biological threats regardless of their cause – accidental, deliberate or naturally occurring. The lack of mechanisms to assess the benefits and risks of advances in biotechnology has prompted reactive and blunt policy solutions, including scientific and government-led research moratoriums.

This presentation will review recent developments and discuss improved strategies for preparing for emerging biological risks. It will highlight key needs and opportunities in leadership, oversight and learning that can help our institutions mature to tackle long-term governance challenges.

About the Speaker: Dr. Megan J. Palmer is a Senior Research Scholar and William J. Perry Fellow in International Security at the Center for International Security and Cooperation (CISAC) at Stanford University. She leads a research program focused on risk governance in biotechnology and other emerging technologies. Dr. Palmer is also an investigator of the multi-university Synthetic Biology Engineering Research Center (Synberc), where for the last five years she has served as Deputy Director of its policy-related research program and led projects in safety and security, property rights, and community organization and governance. She was previously a research scientist at the California Center for Quantitative Bioscience at UC Berkeley, and an affiliate of Lawrence Berkeley National Labs.

Dr. Palmer has created and led many programs aimed at developing and promoting best practices and policies for the responsible development of biotechnology. She founded and serves as Executive Director of the Synthetic Biology Leadership Excellence Accelerator Program (LEAP), an international fellowship program in responsible biotechnology leadership. She also leads programs in safety and responsible innovation for the international Genetically Engineered Machine (iGEM) competition. Dr. Palmer advises a range of organizations on their approach to policy issues in biotechnology, including serving on the board of the synthetic biology program of the Joint Genome Institute (JGI).

Dr. Palmer holds a Ph.D. in Biological Engineering from MIT, and was a postdoctoral scholar in the Bioengineering Department at Stanford University, when she first became a CISAC affiliate. She received a B.Sc.E. in Engineering Chemistry from Queen’s University, Canada.

Megan J. Palmer, PhD
Senior Director of Public Impact at Ginkgo Bioworks
CISAC Affiliate
616 Jane Stanford Way, Suite C238, Stanford, CA 94305-6165
(650) 725-8929

Dr. Megan J. Palmer is the Executive Director of Bio Policy & Leadership Initiatives at Stanford University (Bio-polis). In this role, Dr. Palmer leads integrated research, teaching and engagement programs to explore how biological science and engineering is shaping our societies, and to guide innovation to serve public interests. Based in the Department of Bioengineering, she works closely both with groups across the university and with stakeholders in academia, government, industry and civil society around the world.

In addition to fostering broader efforts, Dr. Palmer leads a focus area in biosecurity in partnership with the Freeman Spogli Institute for International Studies (FSI) at Stanford. Projects in this area examine how security is conceived and managed as biotechnology becomes increasingly accessible. Her current projects include assessing strategies for governing dual use research, analyzing the diffusion of safety and security norms and practices, and understanding the security implications of alternative technology design decisions.

Dr. Palmer has created and led many programs aimed at developing and promoting best practices and policies for the responsible development of bioengineering. For the last ten years she has led programs in safety, security and social responsibility for the international Genetically Engineered Machine (iGEM) competition, which last year involved over 6000 students in 353 teams from 48 countries. She also founded and serves as Executive Director of the Synthetic Biology Leadership Excellence Accelerator Program (LEAP), an international fellowship program in biotechnology leadership. She advises and works with many other organizations on their strategies for the responsible development of bioengineering, including serving on the board of directors of Revive & Restore, a nonprofit organization advancing biotechnologies for conservation.

Previously, Megan was a Senior Research Scholar and William J. Perry Fellow in International Security at the Center for International Security and Cooperation (CISAC), part of FSI, where she is now an affiliated researcher. She also spent five years as Deputy Director of Policy and Practices for the multi-university NSF Synthetic Biology Engineering Research Center (Synberc). She has previously held positions as a project scientist at the California Center for Quantitative Bioscience at the University of California Berkeley (where she was an affiliate of Lawrence Berkeley National Labs), and a postdoctoral scholar in the Bioengineering Department at Stanford University. Dr. Palmer received her Ph.D. in Biological Engineering from M.I.T. and a B.Sc.E. in Engineering Chemistry from Queen’s University, Canada.


Megan J. Palmer, Senior Research Scholar, CISAC
Seminars

Abstract: Industry, medical centers, academics and patient advocates have come together in the Global Alliance for Genomics and Health (GA4GH) to create common standards for the representation and exchange of genomic information for both research and clinical use. GA4GH now involves hundreds of organizations and individuals worldwide. The open-source projects of our Data Working Group welcome participation by all individuals and organizations.

About the Speaker: David Haussler develops new statistical and algorithmic methods to explore molecular function, evolution, and disease processes in the human genome, integrating comparative and high-throughput genomics data to study gene structure, function, and regulation. As a collaborator on the international Human Genome Project, he and his team posted the first publicly available computational assembly of the human genome sequence. His team subsequently developed the UCSC Genome Browser, a web-based tool that is used extensively in biomedical research. He built the CGHub database to hold NCI’s cancer genome data, co-founded the Genome 10K project so that science can learn from other vertebrate genomes, co-founded the Treehouse Childhood Cancer Project to enable international comparison of childhood cancer genomes, and is a co-founder of the Global Alliance for Genomics and Health (GA4GH), a coalition of top research, health care, and disease advocacy organizations.

Haussler is a member of the National Academy of Sciences and the American Academy of Arts and Sciences and a fellow of AAAS and AAAI. He has won a number of awards, including the 2014 Dan David Prize, the 2011 Weldon Memorial Prize for application of mathematics and statistics to biology, the 2009 ASHG Curt Stern Award in Human Genetics, the 2008 Senior Scientist Accomplishment Award from the International Society for Computational Biology, the 2006 Dickson Prize for Science from Carnegie Mellon University, and the 2003 ACM/AAAI Allen Newell Award in Artificial Intelligence.

David Haussler, Distinguished Professor of Biomolecular Engineering, UC Santa Cruz Genomics Institute, University of California, Santa Cruz
Seminars

Management of emerging risks in life science and technology requires new leadership and a sober assessment of the legacy of Asilomar.

Publication Type: Journal Article
Journal Publisher: Science
Authors: Megan Palmer, David Relman, Francis Fukuyama

Abstract: Technological and social forces are allowing the growth of science outside a professionalized context. The Internet, mobile devices, and the public availability of big data are technological facilitators of “citizen science,” in part by enabling data sharing, analysis, and communication. These changes now allow the entire scientific process – from funding and the development of a research agenda, to the conduct and analysis of research, to the dissemination and application of findings – to take place without the involvement of any science professionals or research-related institutions. However, most ethical and regulatory frameworks for biomedical science arose from concepts of the obligations of professionals and (largely not-for-profit) institutions. We will discuss current examples of “citizen science” in biology and clinical research and their ethical and policy implications.

About the Speaker: Mildred Cho is a Professor in the Division of Medical Genetics of the Department of Pediatrics at Stanford University, Associate Director of the Stanford Center for Biomedical Ethics, and Director of the Center for Integration of Research on Genetics and Ethics. She received her B.S. in Biology in 1984 from the Massachusetts Institute of Technology and her Ph.D. in 1992 from the Stanford University Department of Pharmacology.  Her post-doctoral training was in Health Policy as a Pew Fellow at the Institute for Health Policy Studies at the University of California, San Francisco and at the Palo Alto VA Center for Health Care Evaluation.  She is a member of international and national advisory boards, including for Genome Canada, the March of Dimes, and the Board of Reviewing Editors of Science magazine.  Her current research projects examine ethical and social issues in research on the human genome and microbiome, synthetic biology and genome editing, and the ethics at the intersection of clinical practice and research.  


Mildred Cho, Professor, Division of Medical Genetics, Department of Pediatrics, and Associate Director, Stanford Center for Biomedical Ethics, Stanford University
Seminars
News

The United States needs to build a better governance regime for oversight of risky biological research to reduce the likelihood of a bioengineered super virus escaping from the lab or being deliberately unleashed, according to an article by three Stanford scholars published today in the journal Science.

"We've got an increasing number of unusually risky experiments, and we need to be more thoughtful and deliberate in how we oversee this work," said co-author David Relman, a professor of infectious diseases and co-director of Stanford's Center for International Security and Cooperation (CISAC).

Relman said that cutting-edge bioscience and technology research has yielded tremendous benefits, such as cheap and effective ways of developing new drugs, vaccines, fuels and food. But he said he was concerned about the growing number of labs that are developing novel pathogens with pandemic potential.

For instance, researchers at the Memorial Sloan Kettering Cancer Center, in their quest to create a better model for studying human disease, recently deployed a gene editing technique known as CRISPR-Cas9 on a respiratory virus so that it was able to edit the mouse genome and cause cancer in infected mice.

"They ended up creating, in my mind, a very dangerous virus and showed others how they too could make similar kinds of dangerous viruses," Relman said.


Scientists in the United States and the Netherlands, conducting so-called "gain-of-function" experiments, have also created much more contagious versions of the deadly H5N1 bird flu in the lab.

Publicly available information from published experiments like these, such as genomic sequence data, could allow scientists to reverse engineer a virus that would be difficult to contain and highly harmful were it to spread.

And a recent spate of high-profile accidents at U.S. government labs – including the mishandling of anthrax, bird flu, smallpox and Ebola samples – has raised the specter of a dangerous pathogen escaping from the lab and causing an outbreak or even a global pandemic.

"These kinds of accidents can have severe consequences," said Megan Palmer, CISAC senior research scholar and a co-author on the paper. "But we lack adequate processes and public information to assess the significance of the benefits and risks. Unless we address this fundamental issue, then we're going to continue to be reactive and make ourselves more vulnerable to mistakes and accidents in the long term."

Centralizing leadership

Leadership on risk management in biotechnology has not evolved much since the mid-1970s, when pioneering scientists gathered at the Asilomar Conference on Recombinant DNA and established guidelines that are still in use today.

Palmer said that although scientific self-governance is an essential element of oversight, left unchecked, it could lead to a "culture of invincibility over time."

"There's reliance on really a narrow set of technical experts to assess risks, and we need to broaden that leadership to be able to account for the new types of opportunities and challenges that emerging science and technology bring," she said.

Relman described the current system as "piecemeal, ad hoc and uncoordinated," and said that a more "holistic" approach that included academia, industry and all levels of government was needed to tackle the problem.

"It's time for us as a set of communities to step back and think more strategically," Relman said.

The governance of "dual use" technologies, which can be used for both peaceful and offensive purposes, poses significant challenges in the life sciences, said Stanford political scientist Francis Fukuyama, who also contributed to the paper.

"Unlike nuclear weapons, it doesn't take large-scale labs," Fukuyama said. "It doesn't take a lot of capacity to do dangerous research on biology."

The co-authors recommend appointing a top-ranking government official, such as a special assistant to the president, and a supporting committee, to oversee safety and security in the life sciences and associated technologies. They would coordinate the management of risk, including regulatory authorities needed to ensure accountability and information sharing.

"Although many agencies right now are tasked with worrying about safety, they have got conflicting interests that make them not ideal for being the single point of vigilance in this area," Fukuyama said.

"The National Institutes of Health is trying to promote research but also stop dangerous research. Sometimes those two aims run at cross-purposes.

"It's a big step to call for a new regulator, because in general we have too much regulation, but we felt there were a lot of dangers that were not being responded to in an appropriate way."

Improving cooperation

Strong cooperative international mechanisms are also needed to encourage other countries to support responsible research, Fukuyama said.

"What we want to avoid is a kind of arms race phenomenon, where countries are trying to compete with each other doing risky research in this area, and not wanting to mitigate risks because of fears that other countries are going to get ahead of them," he said.

The co-authors also recommended investing in research centers as a strategic way to build critical perspective and analysis of oversight challenges as biotechnology becomes increasingly accessible.



Abstract: Faster-evolving technologies, new peer adversaries, and the increased role of non-government entities change how we think about decisions to develop and adopt new technology. Uncertainties about technology “shelf life,” adversary intentions, and dual uses of technology complicate these decisions. This seminar will discuss the use of mathematical models and optimization methods to provide insight on technology policy issues, including balancing risk and affordability during technology research and development, timing technology adoption, and understanding adversary responses to new technologies. Examples will be drawn from offensive cyber operations and synthetic biology. We will conclude by discussing implications for how policy analysts and policy makers think about technology and security.
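The abstract describes a decision-analytic framing rather than a specific model, so the following is only a minimal, hypothetical sketch of what such a framing can look like: a toy two-period expected-value comparison of adopting a technology now versus waiting for more information about its “shelf life.” The function names and all parameter values are invented for illustration and are not drawn from the seminar.

# Hypothetical sketch (not from the seminar): toy "adopt now vs. wait"
# comparison under uncertainty about technology shelf life.
# All numbers below are illustrative assumptions.

def value_adopt_now(benefit, p_obsolete, counter_cost):
    """Expected value of adopting immediately: capture the benefit unless the
    technology becomes obsolete, and pay the cost of an adversary counter."""
    return benefit * (1.0 - p_obsolete) - counter_cost

def value_wait(benefit, p_obsolete, info_gain, delay_cost):
    """Expected value of waiting one period: delay is costly, but new
    information is modeled as reducing the obsolescence risk."""
    return benefit * (1.0 - max(p_obsolete - info_gain, 0.0)) - delay_cost

if __name__ == "__main__":
    benefit, p_obsolete = 100.0, 0.3              # payoff and obsolescence risk
    counter_cost, info_gain, delay_cost = 15.0, 0.1, 20.0
    now = value_adopt_now(benefit, p_obsolete, counter_cost)
    wait = value_wait(benefit, p_obsolete, info_gain, delay_cost)
    print(f"adopt now: {now:.1f}  wait: {wait:.1f}")
    print("preferred option:", "adopt now" if now >= wait else "wait")

Under these assumed numbers, adopting now yields 100 × 0.7 − 15 = 55 while waiting yields 100 × 0.8 − 20 = 60, so the toy model favors waiting; changing the assumptions flips the answer, which is the kind of sensitivity a fuller optimization model would explore.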


About the Speaker: Philip Keller is a National Defense Science and Engineering Graduate Fellow at Stanford, where he is completing his PhD in Management Science & Engineering. He studies policy problems posed by new technologies. His research is highly interdisciplinary, drawing on methods from engineering risk and decision analysis, game theory, and operations research. His professional experience includes conducting studies and analysis for the Department of Defense and the Department of Homeland Security at RAND and the Homeland Security Studies and Analysis Institute. Previous study topics include unmanned aircraft operations, nuclear terrorism, offensive cyber operations, and military force structure. Philip holds a BS in Mathematics and an MS in Defense and Strategic Studies.

Philip Keller, Predoctoral Fellow, CISAC
Seminars

Abstract: Biotechnology is in a transition from artisanal tools and methods to computer-controlled, high-throughput systems that allow research and development at industrial scale. This digitization is also radically reducing technical and economic barriers, empowering a new generation of young designers to do bioengineering on par with major companies but at a fraction of the cost, and prompting a re-think of the entire industry, including business models, intellectual property, ethics and biosecurity. This shift has the potential to disrupt R&D on a global scale. This lecture provides an overview of the issues and opportunities.

About the Speaker: Autodesk Distinguished Researcher Andrew Hessel is spearheading the development of tools and processes that facilitate the computer-aided design and computer-aided manufacture of living creatures and systems. As a 2015-2016 AAAS-Lemelson Invention Ambassador, he also encourages others to explore invention and innovation in biological engineering. Andrew is active in the iGEM and DIYbio (do-it-yourself biology) communities and frequently works with students and young entrepreneurs to guide their career and business development efforts. He has given hundreds of invited talks on synthetic biology to groups that include Hollywood movie producers, the United Nations, and the FBI.

Andrew Hessel, Distinguished Researcher, Autodesk Inc. (Bio/Nano Programmable Matter group)
Seminars

In an article published by the Council on Foreign Relations' Foreign Affairs magazine, David Relman and Marc Lipsitch examine recent advances in biological engineering as well as lapses in laboratory security in the context of biosafety and biosecurity concerns. The authors argue that current oversight is ill-equipped to handle the potential risks that can result from this type of research, and call for improved oversight mechanisms that involve diverse stakeholders to better govern these fields.

Publication Type: Journal Article
Journal Publisher: Foreign Affairs
Authors: David Relman, Marc Lipsitch
News
By Steve Fyffe

The H5N1 strain of the bird flu is a deadly virus that kills more than half of the people who catch it.

Fortunately, it’s not easily spread from person to person, and is usually contracted through close contact with infected birds.

But scientists in the Netherlands have genetically engineered a much more contagious airborne version of the virus that quickly spread among the ferrets they use as an experimental model for how the disease might be transmitted among humans.

And researchers from the University of Wisconsin-Madison used samples from the corpses of birds frozen in the Arctic to recreate a version of the virus similar to the one that killed an estimated 40 million people in the 1918 flu pandemic.

It’s experiments like these that make David Relman, a Stanford microbiologist and co-director of the Center for International Security and Cooperation, say it's time to create a better system for oversight of risky research before a man-made super virus escapes from the lab and causes the next global pandemic.

“The stakes are the health and welfare of much of the earth’s ecosystem,” said Relman.

“We need greater awareness of risk and a greater number of different kinds of tools for regulating the few experiments that are going to pose major risks to large populations of humans and animals and plants.”

Terrorists, rogue states or conventional military powers could also use the published results of experiments like these to create a deadly bioweapon.

“This is an issue of biosecurity, not just biosafety,” he said.

“It’s not simply the production of a new infectious agent, it’s the production of a blueprint for a new infectious agent that’s just as risky as the agent itself.”

Image: H5N1 bird flu seen under an electron microscope. The virus is colored gold. Photo credit: CDC
Scientists who conduct this kind of research argue that their labs, which follow a set of safety procedures known as Biosafety Level 3, are highly secure and that the chances of a genetically engineered virus being released into the general population are almost zero.

But Relman cited a series of recent lapses at laboratories in the United States as evidence that accidents can and do happen.

“There have been a frightening number of accidents at the best laboratories in the United States with mishandling and escape of dangerous pathogens,” Relman said.

“There is no laboratory, there is no investigator, there is no system that is foolproof, and our best laboratories are not as safe as one would have thought.”

The Centers for Disease Control and Prevention (CDC) admitted last year that it had mishandled samples of Ebola during the recent outbreak, potentially exposing lab workers to the deadly disease.

In the same year, a CDC lab accidentally contaminated a mild strain of the bird flu virus with deadly H5N1 and mailed it to unsuspecting researchers.

And a 60-year-old vial of smallpox (the contagious virus that was effectively eradicated by a worldwide vaccination program) was discovered sitting in an unused storage room at a U.S. Food and Drug Administration lab.

Earlier this year, the U.S. Army accidentally shipped samples of live anthrax to hundreds of labs around the world.

Similar problems have been reported abroad: the United Kingdom has had more than 100 mishaps in its high-containment labs in recent years.

It’s difficult to judge the full scope of the problem, because many lab accidents are underreported.

Studying viruses in the lab does bring important potential benefits, such as the promise of universal vaccines, as well as cheap and effective ways of developing new drugs and other kinds of alternative defenses against naturally occurring diseases.

“It’s a very tricky balancing act,” Relman said.

“We don’t want to simply shut down the work or impede it unnecessarily.”

However, there are safer ways to conduct research, such as using harmless “avirulent” versions of the virus that would not cause widespread death and injury if they infected the general public, Relman said.

Developing better tools for risk-benefit analysis to identify and mitigate potential dangers in the early stages of research would be another important step towards making biological experiments safer.

Closer cooperation among diverse stakeholders (including domain experts, government agencies, funding groups, governing organizations of scientists and the general public) is also needed in order to develop effective rules for oversight and regulation of dangerous experiments, both domestically and abroad.

“We believe that the solutions are going to have to involve a diverse group of actors that has not yet been brought together,” Relman said.

“We need new approaches for governance in the life sciences that allow for these kinds of considerations across the science community and the policy community.”

You can read more about Relman’s views on how to limit the risks of biological engineering in this article he wrote for Foreign Affairs with co-author Marc Lipsitch, director of Harvard’s Center for Communicable Disease Dynamics.


Abstract: The threat of biological attack on the people of the United States and the world, whether intentional, natural or accidental, is of growing concern, both in spite of and because of significant technological advances over the past four decades. As a global leader, the United States needs a comprehensive policy approach for managing future attacks, one that incorporates technological elements from rapid detection through appropriate response. American and international responses to recent infectious disease outbreaks such as anthrax (intentional, accidental), H5N1 influenza (natural) and Ebola (natural) have managed to contain these events – with the paradoxical effect of producing relief among political and administrative policy makers (“missed that bullet,” “we must be doing this right”) rather than serving as wake-up calls. A challenge in merging technological solutions into policy lies in the rapid advances across the multiple sciences. Translating these ongoing technological advances for policy leaders is an essential element of effective policy development. Incorporating technological solutions into the construction of biosecurity policy, combined with motivated leadership, has the potential to enhance future national and global responses to unprecedented biological attacks.

About the Speaker: Patrick J. Scannon, M.D., Ph.D., is XOMA's founder, Executive Vice President, Chief Scientific Officer and a member of its Board of Directors. Since 1980, Dr. Scannon has directed the company's product identification, evaluation and clinical testing programs for novel therapeutic monoclonal antibodies and proteins against infectious, oncologic, metabolic and immunologic diseases. As Chief Scientific Officer, he leads evaluations for new therapeutic antibody identification and discovery programs.

Dr. Scannon holds a Ph.D. in organic chemistry from the University of California, Berkeley and an M.D. from the Medical College of Georgia. He completed his medical internship and residency in internal medicine at the Letterman Army Medical Center in San Francisco. A board-certified internist, Dr. Scannon is also a member of the American College of Physicians. He is the inventor or co-inventor of several issued U.S. patents, and has published numerous scientific abstracts and papers.

Dr. Scannon has served as a member of the Research Committee of the Infectious Diseases Society of America (IDSA), a member of the National Biodefense Science Board (NBSB, a federal advisory board for the Department of Health and Human Services), chair of the Chem/Bio Warfare Defense Panel for the Defense Threat Reduction Agency (DTRA), and a member of the Defense Sciences Research Council (DSRC, a research board for the Defense Advanced Research Projects Agency, DARPA). He has served as a Trustee of the University of California Berkeley Foundation and as a member of the University of California Berkeley Chancellor's Community Advisory Board. Dr. Scannon is currently on the Board of Directors of Pain Therapeutics, Inc.

Technology Impact on Biosecurity Policy and Practice
Patrick J. Scannon, Founder, Executive Vice President and Chief Scientific Officer, XOMA
Seminars