Cybersecurity

Attribution of malicious cyber activities is a deep issue about which confusion and disquiet can be found in abundance. Attribution has many aspects—technical, political, legal, policy, and so on. A number of well-researched and executed papers cover one or more of these aspects, but integration of these aspects is usually left as an exercise for the analyst. This paper distinguishes between attribution of malicious cyber activity to a machine, to a specific human being pressing the keys that initiate that activity, and to a party that is deemed ultimately responsible for that activity. Which type of attribution is relevant depends on the goals of the relevant decision maker. Further, attribution is a multi-dimensional issue that draws on all sources of information available, including technical forensics, human intelligence, signals intelligence, history, and geopolitics, among others. From the perspective of the victim, some degree of factual uncertainty attaches to any of these types of attribution, although the last type—attribution to an ultimately responsible party—also implicates to a very large degree legal, policy, and political questions. But from the perspective of the adversary, the ability to conceal its identity from the victim with high confidence is also uncertain. It is the very existence of such risk that underpins the possibility of deterring hostile actions in cyberspace.

Publication Type: Working Papers
Journal Publisher: Social Science Research Network
Author: Herbert Lin

This book discusses issues in large-scale systems in the United States and around the world. The authors examine challenges in education, energy, healthcare, national security, and urban resilience. In education, the book covers America's use of educational funds, standardized testing, and the use of classroom technology. On energy, it examines debates on climate, current and future developments in the nuclear power industry, the benefits and declining cost of natural gas, and the promise of renewable energy. The authors also discuss national security, focusing on nuclear weapons, terrorism, and cybersecurity. Urban resilience is addressed in the context of natural threats such as hurricanes and floods.

Publication Type: Books
Journal Publisher: Wiley (1st edition)
Author: Elisabeth Paté-Cornell
News Type: Q&As

A real possibility exists that foreign hackers could throw a monkey wrench into the outcome of the U.S. presidential election in the fall, a Stanford expert says.

Herbert Lin, senior research scholar for cyberpolicy and security at Stanford’s Center for International Security and Cooperation and a research fellow at the Hoover Institution, said that electronic voting could be affected by hackers in the presidential race, especially if a candidate claims tampering. In recent months, hackers from outside the country reportedly infiltrated the Democratic National Committee and Hillary Clinton campaign computer networks, leading to data breaches that made headlines worldwide.

The Stanford News Service interviewed Lin on this subject:

How worried are you about possible cyberattacks that could influence the outcome of the November elections in the U.S.?

There are two kinds of things to worry about. One is an actual cyberattack that, for example, alters vote counts in a way that tilts the election away from the will of the voters. That kind of attack is hard to pull off, and I’m not very worried about that – though I worry about it some.

A second worry – much more serious in my opinion – is the possibility that an election loser might challenge the outcome of the election, alleging that the results were altered by a cyberattack, especially if the election were close. How would anyone ever prove that ballots, electronically cast with no permanent and auditable record, were accurately counted?

If the evidence that Russians hacked the Democratic National Committee and the Hillary Clinton campaign proves to be legitimate, how should President Obama respond to Russia and Vladimir Putin?

Herbert Lin (Image credit: Rod Searcey/CISAC)

The U.S. has many response options, ranging from private diplomatic conversations to military action and everything in between. There are many things we could do to exact a price. But some of these things may be wise and others may be unwise. For example, an unwise option would be to threaten overt military action and otherwise do saber-rattling in response. The balancing act is calibrating a response that exacts a penalty but does not provoke a response that is unacceptable to us – and that’s a hard thing to do.

Would the U.S. ever hack back at Russia in some way?

I would be utterly amazed if the U.S. were not hacking Russia, and every other major power in the world for that matter. And I would be amazed if every other major power in the world were not hacking the U.S. There’s a baseline level of hacking that is going on all the time by everyone.

So, the question isn’t hacking or not hacking, the question is hacking back versus hacking. And on that point, I suspect it would be really hard for the recipient – in this case, Russia – to distinguish between hacking that almost surely is going on already and hacking that was conducted in response to any putative Russian involvement in the Democratic National Committee hack.

Is the hacking symbolic of a poor relationship between the U.S. and Russian governments?

I would not say symbolic – but it’s entirely consistent with a poor relationship.

In this 2015 video, Herb Lin talks about how U.S. policy on offensive cyber operations should be declassified.

 

Clifton Parker is a writer for the Stanford News Service.


Lunch will be served. Please RSVP to allow for an accurate headcount.

Abstract: Dr. Johnston will present a preliminary analysis of some of the tensions between inter-state crisis management principles (as accepted by many Chinese crisis management experts) and concepts for the use of cyber weapons in military conflicts being developed by the Chinese military.

About the Speaker: Alastair Iain Johnston is The Governor James Albert Noe and Linda Noe Laine Professor of China in World Affairs at Harvard University and a visiting fellow at the Hoover Institution in summer 2016. He has written on socialization theory, identity and foreign policy, and strategic culture, mostly with application to the study of China’s foreign policy and East Asian international relations.

Alastair Iain Johnston, Harvard University
Seminars

This post is a review of a five-day NERC Critical Infrastructure Protection (CIP) training course offered by the SANS Institute.

Publication Type: Commentary
News Type: News

Despite growing consensus about the magnitude of cyber security threats, a clear strategy for securing the United States’ critical digital infrastructure has yet to emerge. This is due partly to the complexity of cyber security issues, which intersect computer science, law, policy, economics, public opinion, and ethics. In recent years, however, the Hoover Institution has helped move scholarship and dialogue on cyber security forward by channeling the expertise of Hoover fellows, Stanford University, and Silicon Valley, and by extending these resources to policy makers and the media.

Hoover’s Cyber Security Boot Camps, led by Hoover fellows Amy Zegart and Herbert Lin in partnership with Stanford University’s Cyber Policy Program and the Center for International Security and Cooperation (CISAC), are key components of these efforts. Past boot camps have assembled senior congressional staff from both sides of the aisle for expert briefings and discussions about the law, policy, and technology pertaining to cyber security. This year, Zegart and Lin shifted the program’s focus toward national media, partnering with Hoover’s public affairs team to host a cyber security-themed Media Roundtable.

Following the format of previous Media Roundtables, Hoover brought dozens of reporters from leading outlets such as the Wall Street Journal, Washington Post, and New York Times together with cyber policy and technology experts on May 16, 2016. The program featured presentations, interactive discussion, and thought-provoking exercises designed to aid reporters in understanding and communicating cyber security news and debates. The interactive atmosphere also helped strengthen lines of communication between the reporters, technology experts, and strategists tasked with making sense of the changing cyber security landscape.

Amy Zegart, Davies Family Senior Fellow at Hoover, introduced attendees to the unique challenges of crafting cyber security policy. Zegart discussed the exceptional vulnerability of powerful countries to cyber threats, consumer driven connectivity as a factor that increases cyber risks, and the obstacles to protecting privately held cyber infrastructure at a time of acute mistrust of government.

John Villasenor, a professor of electrical engineering, public policy, and management; visiting professor of law at UCLA; and a national fellow at the Hoover Institution, introduced the technical challenges associated with cyber security. Villasenor discussed the irreversible growth of cyberspace as mobile connectivity proliferates and data storage costs plummet, the overwhelming complexity of cyber systems, and the startling capabilities of hackers in identifying and exploiting security weaknesses.

Herbert Lin, Hoover research fellow and senior research scholar for cyber security and policy at CISAC, applied his expertise to an often-overlooked topic in cyber security: the role of offensive cyber tactics. Where passive defenses such as network security or law enforcement fail, offensive measures can prove critical in disrupting or identifying the source of cyber security breaches. Lin also discussed the potential use of offensive cyber tactics against our adversaries without waiting for incoming attacks, which he likens to “punching” in cyberspace, rather than “punching back.”

Carey Nachenberg, a vice president and fellow at Symantec Corporation and prolific developer of cyber security technology, delivered a technical primer on cyber exploitation. Nachenberg described ways that design flaws, human error, and the sheer complexity of cyber systems create potential vulnerabilities. He also provided a step-by-step walkthrough of various tactics hackers use to exploit these weaknesses, including denial of service attacks, computer worms, and manipulating human agents.
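
The summary above lists the tactics Nachenberg covered but not his examples. As a purely illustrative sketch (not drawn from the talk), the C fragment below shows the kind of design flaw he described, an unchecked buffer copy, alongside the bounded alternative that removes it.

    #include <stdio.h>
    #include <string.h>

    /* Illustrative only: the classic unchecked-copy design flaw.
       If name is longer than 15 characters plus the terminator,
       strcpy writes past the end of buf and corrupts adjacent
       memory, which is the raw material of many exploits. */
    void greet_unsafe(const char *name) {
        char buf[16];
        strcpy(buf, name);              /* no length check */
        printf("Hello, %s\n", buf);
    }

    /* A bounded copy removes the flaw: overly long input is
       truncated instead of overflowing the buffer. */
    void greet_safe(const char *name) {
        char buf[16];
        snprintf(buf, sizeof buf, "%s", name);
        printf("Hello, %s\n", buf);
    }

    int main(void) {
        const char *input = "a deliberately long visitor name";
        greet_safe(input);              /* safe: truncated copy */
        /* greet_unsafe(input) would overflow buf; it is shown
           only to make the flaw concrete. */
        return 0;
    }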

Jack Goldsmith, senior fellow at Hoover and the Henry L. Shattuck Professor of Law at Harvard, discussed the complications of applying international law designed to address traditional uses of force to cyber hostilities. Goldsmith highlighted the problematic distinction between cyber attacks, which constitute illegal acts of international aggression, and exploitations, which constitute legal acts of espionage.

Elaine Korzak, a W. Glenn Campbell and Rita Ricardo-Campbell National Fellow at Hoover, reported on the evolving UN response to cyber security concerns. After decades of review, UN action on cyber law gained traction in 2014 with a milestone report recognizing the applicability of international law to cyberspace. A subsequent 2015 report recommended several cooperative steps on cyber security, although the proposed rules and norms rely on voluntary implementation.

The roundtable also featured interactive exercises to expand media perspectives on cyber issues, including a detailed simulation of a cyber security breach at a major web services company. Participants formed groups to address technical, legal, public relations, and other concerns related to the breach and presented their strategies to real-world private-sector cyber security experts. Hoover invited four other cyber security leaders to discuss what the media is getting right and wrong on cyber coverage and how reporters can develop stronger relationships with private sector sources.

The 2016 Cyber Media Roundtable covered a wide range of complex topics, and the engagement of participants signaled strong interest in internalizing the material. Discussion periods spilled into breaks, and participants asked penetrating questions characteristic of good reporting.

Reflecting on the outcomes of the event, Amy Zegart stated:

The media cyber boot camp was a great success—giving some of the nation’s top national security reporters a fast and deep dive into key cyber issues, developing broader networks of experts to help inform the public debate, and enabling candid conversation with industry leaders about what the press can do to improve coverage of cyber issues.  Our vision is to hold a boot camp every year to educate a wide range of key policymakers and influencers—including congressional staff, federal judges, and the press.

Moving cyber policy forward will require continued attention to issues raised in the Media Roundtable. How can tensions between government and the private sector be eased to allow for greater cooperation? Can current international rules and norms be applied to cyber issues? To what extent do legal and ethical considerations permit “hacking back” or even hacking first? Where should reasonable expectations for cyber security be set in light of the overwhelming complexity of cyber systems?

As the larger policy community expands its focus on these and other key cyber security questions, Hoover’s ongoing research and outreach will help inform the answers.


Abstract: The NERC-CIP standards are the only federally mandated cybersecurity standards for critical infrastructure in the United States.  Targeting the electric system, the standards have been developed to ensure the reliability and the resilience of the electric grid and prevent catastrophic failures.  Although the standards have been around for almost a decade, their role in building the resilience of the electric grid is fiercely contested, with critics claiming the standards represent little more than a ‘check box’ exercise that directs attention and resources away from achieving real security.  This talk will present evidence on the effectiveness of the standards in addressing risk and offer suggestions as to how the standards might be improved to enhance resilience.

About the Speaker: Aaron Clark-Ginsberg is a U.S. Department of Homeland Security Cybersecurity Postdoctoral Scholar at CISAC.  His research interests center on the theory and practice of disaster risk governance, particularly resilience and disaster risk reduction approaches.  He is currently researching how government regulations designed to improve the resilience of the power grid to cyber-threats are affecting utility companies.

Aaron holds a PhD and MSc in Humanitarian Action from University College Dublin and a BA in American Studies with a concentration in Environmental Studies from Kenyon College. Aaron's doctoral research examined how international NGOs interacted with national stakeholders to reduce disaster risk in developing countries. As part of this work, Aaron traveled to ten countries in Asia, Africa, and the Caribbean to review risk reduction and resilience building approaches addressing a variety of hazards including flooding, drought, price shocks, cyclones, landslides, erosion, disease, and conflict.

Aaron has extensive experience in the real-world application of risk management principles. Aaron’s PhD was conducted in conjunction with Concern Worldwide, an Irish international humanitarian organization. While at Concern, Aaron produced a series of reports on risk management in different countries and contexts designed to improve the effectiveness of Concern’s approach to risk reduction. He has also conducted policy-focused research on humanitarian reform for the World Humanitarian Summit Irish Consultative Process, the results of which were used to help develop the Irish position on humanitarian action. Aaron also spent four seasons working as a wildland firefighter for various governmental and private sector organizations across the western United States.

 
Cybersecurity Regulations and Power Grid Resilience (preliminary findings)
Cybersecurity Postdoctoral Scholar, CISAC
Seminars
To ensure an accurate headcount for lunch, RSVPs are required.

[[{"fid":"222969","view_mode":"crop_870xauto","fields":{"format":"crop_870xauto","field_file_image_description[und][0][value]":"","field_file_image_alt_text[und][0][value]":false,"field_file_image_title_text[und][0][value]":false,"field_credit[und][0][value]":"","field_caption[und][0][value]":"","field_related_image_aspect[und][0][value]":"","thumbnails":"crop_870xauto"},"type":"media","field_deltas":{"1":{"format":"crop_870xauto","field_file_image_description[und][0][value]":"","field_file_image_alt_text[und][0][value]":false,"field_file_image_title_text[und][0][value]":false,"field_credit[und][0][value]":"","field_caption[und][0][value]":"","field_related_image_aspect[und][0][value]":"","thumbnails":"crop_870xauto"}},"link_text":null,"attributes":{"width":"870","class":"media-element file-crop-870xauto","data-delta":"1"}}]]

 

CISAC Honors Student, Class of 2016

Ben Mittelberger is a senior in computer science concentrating in information systems design and implementation. He is a current student in the CISAC Honors Program. His thesis, titled "In Data We Trust?: The Big Data Capabilities of the National Counterterrorism Center," focuses on the increasing size and complexity of intelligence datasets and whether the center is structured properly to leverage them. He is advised by Dr. Martha Crenshaw.

News Type: News

It’s a quintessential Silicon Valley scene. A group of tech-savvy Stanford students are delivering a passionate pitch about a product they hope is going to change the world, while a room full of venture capitalists, angel investors and entrepreneurs peppers them with questions.

But there’s a twist. This Stanford classroom is also packed with decorated military veterans and active duty officers. And a group of analysts from the U.S. intelligence community is monitoring the proceedings live via an iPad propped up on a nearby desk.

These Stanford students aren’t just working on the latest “Uber for X” app. They’re searching for solutions to some of the toughest technological problems facing America’s military and intelligence agencies, as part of a new class called Hacking for Defense.

A student team briefs the class on a wearable sensor they're developing for an elite unit of U.S. Navy SEALs – a product they're pitching as "fitbit for America's divers."
“There’s no problems quite like the kind of problems that the defense establishment faces, so from an engineering standpoint, it has the most powerful ‘cool factor’ of anything in the world,” said Nitish Kulkarni, a senior in mechanical engineering.

Kulkarni’s team is working with an organization within the U.S. Department of Defense to devise a system that will provide virtual assistance to Afghan and Iraqi coalition forces as they defuse deadly improvised explosive devices.

“At Stanford there’s a lot of opportunities for you to build things and go out and learn new stuff, but this was one of the first few opportunities I’ve seen where as a Stanford student and as an engineer, I can go and work on problems that will actually make a difference and save lives,” said Kulkarni.

A 21st century tech ROTC

That’s exactly the kind of “21st century tech ROTC” model of national service that Steve Blank, a consulting associate professor at Stanford’s Department of Management Science and Engineering, said he had in mind when he developed the class.

“The nation is facing a set of national security threats it’s never faced before, and Silicon Valley has not only the technology resources to help, but knows how to move at the speed that these threats are moving at,” said Blank.

MBA student Rachel Moore presents for Team Sentinel, which is working with the U.S. 7th Fleet to find better ways to analyze drone and satellite imagery.
The students’ primary mission will be to produce products that can help keep Americans and our allies safe, at home and abroad, according to Blank.

Former U.S. Army Special Forces Colonel Joe Felter, who helped create the class and co-teaches it with Blank, said the American military needs to find new ways to maintain its technological advantage on the battlefield.

“Groups like ISIS, al–Qaeda and other adversaries have access to cutting edge technologies and are aggressively using them to do us harm around the world,” said Felter, who served in Iraq and Afghanistan and is currently a senior research scholar at Stanford’s Center for International Security and Cooperation (CISAC) and research fellow at the Hoover Institution.

“The stakes are high – this is literally life and death for our young men and women deployed in harm’s way. We’re in a great position here at Stanford and in Silicon Valley to help make the connections and develop the common language needed to bring innovation into the process, in support of the Department of Defense and other government agencies’ missions.”

Startup guru Steve Blank shares a light moment with a group of students.
The class is an interdisciplinary mix of undergraduate and graduate students, from freshmen to fifth-year PhD students.

“It’s like a smorgasbord of all these people coming together from different parts and different schools of Stanford, and so I think that’s just a really cool environment to be in,” said Rachel Moore, a first-year MBA student.

Moore’s team includes electrical and mechanical engineering students, and they’re working together to develop a system to enable the Navy’s Pacific Fleet to automatically identify enemy ships using images from drones and satellites.

Tough technological challenges

Months before the course start date, class organizers asked U.S. military and intelligence organizations to identify some of their toughest technological challenges.

Class co-teacher Pete Newell throws his hands up to celebrate a student breakthrough.
U.S. Army Cyber Command wanted to know if emerging data mining, machine learning and data science capabilities could be used to understand, disrupt and counter adversaries' use of social media.

The Navy Special Warfare Group asked students to design wearable sensors for Navy SEALs, so they could monitor their physiological conditions in real-time during underwater missions.

Intelligence and law enforcement agencies were interested in software that could help identify accounts tied to malicious “catfishing” attempts from hackers trying to steal confidential information.

And those were just a few of the 24 problems submitted by 14 government agencies.

Developing solutions

The class gives eight teams of four students 10 weeks to learn about the problem they are addressing from the stakeholders and end users most familiar with it, and to iteratively develop possible solutions, or a “minimum viable product,” using a modified version of Steve Blank’s “lean launchpad methodology,” which has become a revered how-to guide in the Silicon Valley startup community.

Rachel Olney, a graduate student in mechanical engineering, tries on a military-grade dry suit on a visit to the 129th Rescue Wing at Moffett Field.
A key tenet of Blank’s methodology is what he calls the “customer discovery process.”

“If you’re not crawling in the dirt with these guys, then you don’t understand their problem,” Blank told the class.

One student team, which was working on real-time biofeedback sensors and geo-location devices for an elite team of Navy SEALs (a project they were initially pitching as “fitbit for America’s divers”), earned a round of applause from the class when they showed a slide featuring photos from a field trip they took to the 129th Rescue Wing at Moffett Field to find out what it felt like to wear a military-grade dry suit.

Rachel Olney, a graduate student in mechanical engineering, said the experience of squeezing into the tight suit and wearing the heavy dive gear gave her a better appreciation for the physical demands that Navy SEALs have to deal with during a mission.

“They’re diving down to like 200 feet for up to six to eight hours…and during that time they can’t eat, they can’t hydrate, they’re physically exerting a lot, because they’re swimming miles and miles and miles at depth and they can’t see and they can’t talk to each other,” Olney said.

“It’s probably one of the most extreme things that humans do right now.”

Another group came in for some heavy criticism from the teaching team for failing to identify and interview enough end users.

But the next week, they were back in front of the class showing a video from a team member’s visit to an Air Force base in Fresno, where he logged some time inside the 90-pound bomb suit that explosive ordnance disposal units wear in the field.

“You can’t address a customer issue unless and until you really step into the shoes of the customer,” said Gaurav Sharma, who’s a student at Stanford's Graduate School of Business.

“That was the exact reason why I went to Fresno and wore the bomb suit, to get into the shoes of the end customer.”

Navigating the defense bureaucracy

Active duty military officers from CISAC’s Senior Military Fellows program and the Hoover Institution’s National Security Affairs Fellows program act as military liaisons for the class and help students navigate the complex defense bureaucracy.

Colonel John Cogbill, U.S. Army
"[The students] have really just jumped in with both feet and immersed themselves in this Department of Defense world that for so many civilians is just very foreign to them," said U.S. Army Colonel John Cogbill, who has spent the last year as a senior military fellow at CISAC.

“I think they will come away from this experience with a much better appreciation of what we do inside the Department of Defense and Intelligence community, and where there are opportunities for helping us do our jobs better.”

Cogbill said he hoped that some of the inventions from the class, like an autonomous drone designed to improve situational awareness for Special Forces teams, could help the troops on his next combat deployment, where he will serve as the Deputy Commanding Officer of the U.S. Army's elite 75th Ranger Regiment.

“It’s not just about making them more lethal, it’s also about how to keep them alive on the battlefield,” said Cogbill.

Students also get support from their project sponsors and personnel at the newly established Defense Innovation Unit Experimental (DIUx) stationed at Moffett Field.

Tech saves lives on the battlefield

Another key member of the teaching team is Pete Newell, who was awarded the Silver Star Medal (America’s third-highest military combat decoration), for leading a U.S. Army battalion into the Battle of Fallujah, where he survived an ambush and left the protection of his armored vehicle in an attempt to save a mortally wounded officer.

Class co-teacher and Silver Star Medal recipient Pete Newell explains some of the classic reasons why military products fail in the field.
Newell said he saw first-hand the difference that technology can make on the battlefield in his next job, when he served as director of the U.S. Army’s Rapid Equipping Force, which was tasked with delivering technological solutions to the troops fighting in Afghanistan.

“What I realized is that the guys on the front edge of the battlefield who were actually fighting the fight, don’t have time to figure out what the problem is that they have to solve,” Newell said.

“They’re so involved in just surviving day to day, that they really don’t have time to step back from it and see those problems coming, and what they needed was somebody to look over their shoulder and look a little deeper and anticipate their needs.”

One of the first and most urgent problems Newell faced on the job was responding to the sudden spike in IED attacks on dismounted infantry.

The Army was still using metal-detector technology from the 1950s to find mines, but the new breed of IEDs, often hidden inside buried milk jugs, was virtually undetectable with the outdated equipment.

Former U.S. Army Colonel Pete Newell demystifies some military jargon for the class.
“They could create an improvised explosive device and a pressure plate trigger…by using almost zero metal content,” Newell said. “It was almost impossible to find.”

Newell’s solution was a handheld gradiometer, the kind of technology used to find small wires in your backyard during a construction project, paired with a ground-penetrating radar that can see objects underground.

But by the time the new technology reached the field last summer, more than 4,000 troops had been wounded or killed in IED attacks.

Newell said he hoped the class would help get life-saving technology deployed throughout the military faster.

“I think it’s important to enable this younger generation of technologists to actually connect with some of the national security issues we face and give them an opportunity to take part in making the world a safer place,” Newell said.

Tom Byers, an entrepreneurship professor in Management Science and Engineering and faculty director of the Stanford Technology Ventures Program, rounds out the teaching team and brings his experience in innovation education and entrepreneurship to the classroom.

Inspiring the next generation

Students said the opportunity to find solutions to consequential problems was their primary inspiration for joining the class.

“When I first came to Stanford, the hype around entrepreneurship was very much around, ‘go out, make an app, do something really fun and cool, and get rich’,” said Darren Hau, a junior in electrical engineering.

Students share a laugh during a class break.
“In Hacking for Defense, I think you’re seeing a lot of people bring that same entrepreneurial mindset into a problem statement that seems a lot more impactful.”

Felter said he was humbled that so many students were willing to serve in this way.

“It’s encouraging to find out that students at one of our top universities are very interested and highly motivated to work very hard and use their skills and expertise and talent and focus it on these pressing national security problems,” said Felter.

The teaching team said they planned to expand the class to other universities across the country in the coming years, to create a kind of open-source network for solving unclassified national security problems.

For military officers like Cogbill, who will likely soon be leading U.S. soldiers into combat, that’s welcome news.

“Every time you run a course, that’s eight more problems,” Cogbill said.

“If this scales across 10, 20, 30, 40 more universities, you can imagine how many more problems can be solved, and how many more lives can potentially be saved.”

 


Abstract: The disclosure of software vulnerabilities has stirred controversy for decades among security researchers and software vendors, and more recently governments. Despite increasing interdependency of software and systems (e.g., the Internet of Things) and resulting complexity in vulnerability disclosure and coordination, no unified norms have yet emerged.

This talk addresses the development of norms that (attempt to) govern the disclosure of software security flaws in relation to structural changes in the software industry and the Internet. These developments include new forms of private but monetarily rewarded disclosure through markets and bug bounty programs, as well as government efforts to prohibit the proliferation of knowledge and technology through export controls. Recently, governments have acknowledged withholding vulnerability information on the grounds of national security and law enforcement needs, trading this off against the need for defensive security of civilian computers and networks.

The talk outlines pressing policy issues and connects them to recent developments (e.g., Apple vs. FBI). It concludes by making the case for why norms on vulnerability disclosure are an essential component in shaping cybersecurity governance.

About the Speaker: Andreas Kuehn is a Ph.D. candidate in Information Science and Technology at Syracuse University. He joined CISAC as a Zukerman Cybersecurity Predoctoral Fellow in October 2014. Before that, he was a visiting graduate student at Cornell University’s Department of Science & Technology Studies. He holds an M.Sc. in Information Systems from the University of Zurich, Switzerland.

In his dissertation, Andreas examined the historical, organizational, and institutional developments of software vulnerability and exploit markets as they are shaped by the perennial controversy on vulnerability disclosure. His qualitative, empirical research on emerging technologies and governance is informed by Science and Technology Studies and Institutional Theory.

Cybersecurity Predoctoral Fellow, CISAC, Stanford University
Seminars