SELECT Magazine's contributing editor talks to Rafiq Dossani about outsourcing, one of the hottest and most controversial topics in the global IT industry.

SELECT: What is the current size of the outsourcing market? What percent of U.S. software development, call centers, etc., have already moved to developing nations? Is the amount of outsourcing still increasing?

Dossani: To provide some perspective: although the off-shoring of services has been going on for many years, technology, led by the widespread use of the Internet, has changed things. The resulting new twist in the provision of services is that the required interaction between the seller and the consumer has been substantially reduced. Advances in information technology made it possible to parse the provision of certain services into components requiring different levels of skill and interactivity. As a result, certain portions of the service activity that might or might not be skill-intensive, but required low levels of face-to-face interaction, could be relocated offshore. The sequence of events that enabled this process is the following:

    First, the digital age allowed (or, at least, revolutionized) the conversion of service flows into stocks of information, making it possible to store a service. For example, a legal opinion that earlier had to be delivered to the client in person could now be prepared as a computer document and transmitted to the client via e-mail or, better yet, encoded into software. Easy storage and transmission allowed for the physical separation of the client and vendor, as well as their separation in time. It also induced the separation of services into components that were standardized and could be prepared in advance (such as a template for a legal opinion) and other components that were customized for the client (such as the opinion itself) or remained non-storable. Taking advantage of the possibility of subdividing tasks, and of the economies that come with the division of labor, this reduced costs by offering the possibility of preparing the standardized components with lower-cost labor and, possibly, at another location.

    The second fundamental impact of digitization was the conversion of non-information service flows into information service flows. For example, the sampling of tangible goods by a buyer through visiting a showroom is increasingly being replaced by virtual samples delivered over the Internet. Once converted to an information flow, the service may also be converted into a stock of information and subjected to the above-mentioned forces of cost reduction through standardization of components and remote production.

    Third, the low-cost transmission of digitized material accelerated the off-shoring of services. Services such as writing software programs, which were off-shored to India in the early 1970s, were enabled by digitized storage and, in the 1980s, by the standardization of programming languages. Still later, as digital transmission costs fell in the 1990s (just as digital storage costs had fallen earlier), even non-storable services, such as customer care, could be handled offshore.

The offshore services outsourcing market (excluding software development) is still small and will probably be approximately $10 billion for 2005. It employs about 500,000 people, two-thirds of whom are located in India. The rest are widely distributed, with developing Asia and Ireland accounting for most of the remaining employment. About 60% of the employment is in call centers. The U.S. and U.K. call-center industries together employ about five million people, so the percentage of offshore jobs is still small. It is even smaller for other services.

Offshore software development employs approximately another 500,000 people. This compares with U.S. employment of about two million. This is a larger percentage of the total software development labor market, although most of the outsourced work is programming, while work such as systems integration and design continues to be done in the U.S.

The growth rate is still high, and there are concerns about whether this rapid growth will hurt the quality of work done. However, the rate will still likely be in excess of 30% in 2005 and 2006. The reason for this is the massive wage differential.

Clearly there have been massive failures as well as outstanding successes in outsourcing. What are the critical success factors for making outsourcing work?

The infrastructure (telecom, finance, power) has all been standardized, although the solutions might not be the same as in developed countries. The critical success factors are two: the quality of labor and supervision, and managing growth. Unbelievably, there is a growing shortage of labor. The result is that the quality of work is declining. Project supervisory skills are also in short supply. Managing growth, especially keeping attrition under control, training, developing new vertical skills, moving into back-office work, and offering the client turnkey packages are some of the critical managerial factors for success.

Short of being willing to work for $15,000/year, what can Western IT professionals do to provide sufficient value to prevent their functions from being outsourced?

The U.S. educational system still turns out a good product. It is sufficiently ahead of the comparable Indian product that a recent computer science/computer engineering graduate from the average U.S. university can earn a premium of at least 100% over his Indian counterpart from a good university such as the IITs, with substantially higher premiums for graduates from schools such as MIT and Stanford. The problem occurs more with mid-career professionals. Those with older skills are unable to compete with freshly trained graduates from India. Therefore, they need to update those skills regularly and take advantage of opportunities to globalize and convert them into managerial skills. This may have to be mandated at some point, as has happened in the financial sector, where stockbrokers need to regularly sit for exams to renew their licenses.

That said, most of the offshore jobs are relatively low-skilled. For example, the single largest category of offshore services is outbound calling for the financial services industry for selling mortgages or collecting overdue receivables. The work is routine, based on scripts that pop up on the computer screen in response to prompts.

Do these findings suggest that developed countries are likely to be only marginally threatened by the globalization of services?

Even if high-end work stays within developed countries, as has happened in the software industry, the problem is that not everyone in developed countries can readily shift to high-end work. Since the 1960s, the shift in the economies of developed nations towards service-based economies certainly increased the number of highly skilled service workers, but there was an even greater swelling in the number of less-skilled service workers. This is partly a consequence of the nature of many services as linked, inseparable sets of activities with different skill levels, combined with a pyramid of labor requirements, i.e., there is more demand for lower-level work than for high-end work. In manufacturing, the unemployment created by the reduction in demand for blue-collar labor in developed countries was offset by the absorption of much of the surplus labor into service industries, often with minimal training. But the shift of low-end service workers to high-end work will require a longer period of re-education and may have significant interim consequences for unemployment rates.

The threat to developed countries is increased by the fact that, apart from software, the largest growth in off-shoring is happening in business services. These are also the sectors with the largest growth in U.S. employment.

Further, there is evidence that even higher-skilled functions can be moved offshore or might evolve on their own. For example, interviews with people at a firm earlier this year revealed that they had initially been contracted by an American firm to call its clients with overdue credit card payments. The offshore company eventually purchased the receivables from its client and assumed the collection risk itself. Another firm, Wipro Spectramind, managed the radiology services of Massachusetts General Hospital for its second and third shifts. Thus, American radiologists, who earn an average of $315,000 a year, were replaced by Indian radiologists, who earn $20,000 a year on average.

I understand that there is a whole subculture in Pakistan and India of people who go to work in the late afternoon or evening and then work a full day on U.S. time. What effect has outsourcing had on the cultures of the countries that are recipients of much of the outsourced work? Have labor rates dramatically increased? Is it difficult for local companies in India and Pakistan to get quality IT talent?

Indeed such a subculture now exists. It is viewed as very stressful work and not suitable for a long-term career. Companies that do such work try to ameliorate the stress by hiring psychiatric counselors to provide free counseling to stressed-out employees. They also provide free meals and transportation, sports facilities, etc.

However, labor rates have increased only a little. This is more than offset by the rise in the productivity of this labor over time.

Outsourcing is clearly a temporary solution. As labor rates equalize, the benefit of outsourcing decreases. In Pakistan and China, there are still huge differences in labor costs, but in Turkey, rates are closer to what they are in the U.S. and other Western states. Realistically, how long can we gain a significant benefit from outsourcing?

India and China, and to some extent, Pakistan, have large labor pools. That is why, in manufacturing, Chinese wage rates have not changed despite massive employment growth over the past three decades. I think that wage rates in India will actually fall because of increasing supply, which is being drawn into outsourcing. This would mean several more decades of benefit from outsourcing.

One way in which developed countries may retain value is if their firms control the work done, either through providing the risk capital or through subsidiaries. While it is difficult to predict which organizational types will dominate, a number of firm-specific factors that influence the viability of off-shoring and the choice of organizational structure are summarized here:

    The knowledge component of the activity. A higher knowledge component makes the firm more concerned about whether the quality of the service will change due to a location change or the transfer process.

    The interactive components of the process.

    The ability to modularize the process.

    Savings from concentrating an activity in one location, leading to benefits of scale and scope.

    Reengineering as part of the transfer process. To transfer a business process, it is necessary to study it intensively and script the transfer. In the process of study, there will often be aspects of the current methodology for discharging the process that do not add value. Very often these aspects are legacies of earlier methodologies that were not eliminated as the production process evolved. During the act of transfer, these are easier to abandon than at an existing facility, where they have become a "natural" part of the daily routine. Our interviews identified other unexpected benefits that go beyond these efficiency effects. Simply examining the business processes may reveal previously undetected inefficiencies. During the transfer, these inefficiencies can be addressed without disrupting work patterns. Workers in the new location then use the reengineered process, which is usually more efficient.

    The time-sensitive nature of the work.

Rosamond L. Naylor
CESP senior fellows Rosamond L. Naylor, Walter P. Falcon, and Harold A. Mooney released the findings of a new study on the impacts of an increasingly global livestock industry in the Policy Forum of the Dec. 9 issue of Science.

The turkey and ham many are eating this holiday season don't just appear magically on the table. Most are the end product of an increasingly global, industrialized system that is resulting in costly environmental degradation. Better understanding of the true costs of this resource-intensive system will be critical to reducing its negative effects on the environment, says an interdisciplinary team of researchers led by Stanford University's Rosamond Lee Naylor, Walter Falcon, and Harold Mooney.

"Losing the Links Between Livestock and Land" appears in the Policy Forum in the Dec. 9 issue of Science. It represents a synthesis of research by professors at Stanford University, the University of Virginia, the University of California at Davis, the universities of Manitoba and British Columbia in Canada, and the United Nations Livestock, Environment and Development (LEAD) program within the Food and Agriculture Organization of the UN.

"Sixty years ago, the link between livestock production and consumption was much more clear and direct, with most consumers getting their meat and dairy products from small, family-owned farms," says lead author Naylor, an economist. Co-author Falcon agrees. "When I was growing up in Iowa, almost all farmers kept both chickens and pigs."

Today, meat consumption has sky-rocketed, and large-scale intensive livestock operations provide most of those products, both in the U.S. and around the world.

Particularly striking is the growth in demand for meat among developing countries, Naylor notes. "China's meat consumption is increasing rapidly with income growth and urbanization, and it has more than doubled in the past generation," she says. As a result, land once used to provide grains for humans now provides feed for hogs and poultry.

Numerous factors have contributed to the global growth of livestock systems, Naylor notes, including declining feed-grain prices; relatively inexpensive transportation costs; and trade liberalization. "But many of the true costs remain largely unaccounted for," she says. Those costs include destruction of forests and grasslands to provide farmland for corn, soybeans and other feed crops destined not directly for humans but for livestock; use of large quantities of freshwater; and nitrogen losses from croplands and animal manure.

Nitrogen losses are especially problematic, says James Galloway of the University of Virginia. "Once nitrogen is lost to the atmosphere or to water, it can have a large number of sequential environmental effects. For example, ammonia emitted into the atmosphere can in sequence affect atmospheric visibility, forest productivity, lake acidity and eventually impact the nutrient status of coastal waters."

Naylor cited Brazil as a specific example of the large impact on ecosystems and the environment. "Grasslands and rainforests are being destroyed to make room for soybean cultivation," she said. The areas are supplying feed to the growing livestock industry in Brazil, China, India and other parts of the world, leading to "serious consequences on biodiversity, climate, soil and water quality."

Naylor and her research team are seeking better ways to track all costs of livestock production, especially the hidden ones related to ecosystem degradation and destruction. "What is needed is a re-coupling of crop and livestock systems," Naylor said. "If not physically, then through pricing and other policy mechanisms that reflect social costs of resource use and ecological abuse."

Such policies "should not significantly compromise the improving diets of developing countries, nor should they prohibit trade," Naylor added. Instead, they should "focus on regulatory and incentive-based tools to encourage livestock and feed producers to internalize pollution costs, minimize nutrient run-off, and pay the true price of water."

She cited efforts in the Netherlands to track nitrogen inputs and outputs for hog farms as one approach. In the U.S., the 2002 Farm Bill provided funds for livestock producers to redesign manure pits and treat wastes, but she notes that much greater public and private efforts are needed to reduce the direct and indirect pollution caused by livestock.

In the end, though, it may be up to consumers to demand more environmentally sustainable approaches to livestock production. "In a global economy with no global society, it may well be up to consumers to set a sustainable course," she added.

Seed funding for the research was provided by the Woods Institute for the Environment, which supports interdisciplinary approaches to complex environmental issues. Naylor, Falcon and Mooney are affiliated with the institute and with the Center for Environmental Sciences and Policy in Stanford's Freeman Spogli Institute for International Studies.

In addition to Naylor, Mooney and Falcon of Stanford and Galloway of Virginia, co-authors are Henning Steinfeld of the United Nations Food and Agriculture Organization; Vaclav Smil, University of Manitoba; Eric Bradford, University of California at Davis; and Jacqueline Alder, University of British Columbia.

Visiting Fellow in Israel Studies, FSI
W. Glenn Campbell National Fellow, Hoover Institution (2008-2009)
CDDRL Affiliated Scholar, 2008-2009
CDDRL Predoctoral Fellow, 2004-2008

Amichai Magen is the Visiting Fellow in Israel Studies at Stanford University's Freeman Spogli Institute for International Studies. In Israel, he is a Senior Lecturer (US Associate Professor), Head of the MA Program in Diplomacy & Conflict Studies, and Director of the Program on Democratic Resilience and Development (PDRD) at the Lauder School of Government, Diplomacy and Strategy, Reichman University. His research and teaching interests address democracy, the rule of law, liberal orders, risk and political violence.

Magen received the Yitzhak Rabin Fulbright Award (2003), served as a pre-doctoral fellow at the Center on Democracy, Development, and the Rule of Law (CDDRL), and was a National Fellow at the Hoover Institution. In 2016 he was named Richard von Weizsäcker Fellow of the Robert Bosch Academy, an award that recognizes outstanding thought-leaders around the world. Between 2018 and 2022 he was Principal Investigator in two European Union Horizon 2020 research consortia, EU-LISTCO and RECONNECT. Amichai Magen served on the Executive Committee of the World Jewish Congress (WJC) and is a Board Member of the Israel Council on Foreign Relations (ICFR) and the International Coalition for Democratic Renewal (ICDR).


The accession of Cyprus to the European Union (EU) in May of 2004 constitutes the most positive strategic development in the history of the island state since its independence in 1960. In the last two years, the Cypriot people have experienced watershed events, filled with frustrations and challenges, but also opportunities. Cyprus' EU membership has extended the borders of the EU to the strategic corner of the Eastern Mediterranean and has brought the Middle East ever closer to Europe. It is hoped that Cyprus' EU membership can contribute to the expansion of peace, stability, security and prosperity in the area. Cyprus is situated at the crossroads of three continents and civilizations, where global political and economic interests, as well as international security concerns, converge. Together with its American ally and with the help of its European partners, Cyprus aspires to play a positive role and to act as a bridge for mutual understanding and for the promotion of sustained, result-oriented dialogue between its Middle Eastern neighbors and Europe. At the same time, Cyprus strives to achieve a just, permanent, functional and mutually acceptable solution to the Cyprus problem, an end to the Turkish military occupation, reunification and prosperity for all Cypriots within their common European home.

His Excellency Euripides L. Evriviades presented his credentials as the Ambassador of the Republic of Cyprus to the United States to President George W. Bush on 4 December 2003. He is also accredited as High Commissioner to Canada. Ambassador Evriviades served as Ambassador of Cyprus to the Netherlands from August 2000 to October 2003. Prior to his posting in The Hague, he served as the Ambassador to Israel from November 1997 until July 2000. Earlier in his career, Mr. Evriviades held positions at Cypriot embassies in Tripoli, Libya; Moscow, USSR/Russia; and Bonn, Germany.

CISAC Conference Room

H. E. Euripides L. Evriviades Ambassador of the Republic of Cyprus to the United States

This paper presents interim findings of "The Experience with Independent Power Producers in Developing Countries," a research project being conducted by the Program on Energy and Sustainable Development at Stanford University ("PESD").

Publication Type
Working Papers
Journal Publisher
Program on Energy and Sustainable Development Working Paper #39
Authors
Erik Woodhouse

This article uses the case of the Cuban Missile Crisis to illustrate the criteria by which victory and defeat are assessed in international crises. The evidence suggests that few objective criteria are actually used in such evaluations. Indeed, examination of the specific terms of crisis settlements can prove to be less important than a range of factors that do not conform to traditional rational actor assumptions. These include: i) prior biases in perception, ii) the experience of the crisis itself and the subsequent way in which it becomes framed, and iii) public opinion management during and after the crisis. This analysis has significant implications for policymakers who have to deal with the aftermath of a crisis, and also for the wider public and media, if governments are to be held accountable for their foreign policy.

Publication Type
Journal Articles
Journal Publisher
Security Studies
Authors
David Laitin
News Type
Commentary
As the war on terrorism continues, statistics on terrorist attacks are becoming as important as the unemployment rate or the GDP. Yet the terrorism reports produced by the U.S. government do not have nearly as much credibility as its economic statistics, because there are no safeguards to ensure that the data are as accurate as possible and free from political manipulation. Alan B. Krueger and David Laitin outline a solution.

From the September/October 2004 issue of Foreign Affairs.

As the war on terrorism continues, statistics on terrorist attacks are becoming as important as the unemployment rate or the GDP. Yet the terrorism reports produced by the U.S. government do not have nearly as much credibility as its economic statistics, because there are no safeguards to ensure that the data are as accurate as possible and free from political manipulation. The flap over the error-ridden 2003 Patterns of Global Terrorism report, which Secretary of State Colin Powell called "a big mistake" and which had to be corrected and re-released, recently brought these issues to the fore. But they still have not been adequately addressed.

Now-common practices used to collect and disseminate vital economic statistics could offer the State Department valuable guidance. Not long ago, economic statistics were also subject to manipulation. In 1971, President Richard Nixon attempted to spin unemployment data released by the Bureau of Labor Statistics (BLS) and transferred officials who defied him. This meddling prompted the establishment of a series of safeguards for collecting and disseminating economic statistics. Since 1971, the Joint Economic Committee of Congress has held regular hearings at which the commissioner of the BLS discusses the unemployment report. More important, in the 1980s, the Office of Management and Budget issued a directive that permits a statistical agency's staff to "provide technical explanations of the data" in the first hour after principal economic indicators are released and forbids "employees of the Executive Branch" from commenting publicly on the data during that time.

The State Department should adopt similar protections in the preparation and dissemination of its reports. In addition to the global terrorism report, the State Department is required by Congress to report annually on international bribery, human rights practices, narcotics control, and religious freedom. Gathering and reporting data for congressional oversight is presently a low-level function at the State Department. The department rarely relies on high-quality, objective data or on modern diagnostic tests to distinguish meaningful trends from chance associations. Adopting safeguards against bias, both statistical and political, would enable Congress to better perform its constitutional role as the White House's overseer and allow the American public to assess the government's foreign policy achievements.

A PATTERN OF ERRORS

Congress requires that the State Department provide each year "a full and complete report" that includes "detailed assessments with respect to each foreign country ... in which acts of international terrorism occurred which were, in the opinion of the Secretary, of major significance." The global terrorism reports are intended to satisfy this requirement, but, over time, they have become glossy advertisements of Washington's achievements in combating terrorism, aimed as much at the public and the press as at congressional overseers.

The 2003 global terrorism report was launched at a celebratory news conference in April. Deputy Secretary of State Richard Armitage and Ambassador J. Cofer Black, the State Department coordinator for counterterrorism, outlined some remaining challenges, but principally they announced the Bush administration's success in turning the terrorist tide. Black called the report "good news," and Armitage introduced it by saying, "You will find in these pages clear evidence that we are prevailing in the fight." The document's first paragraph claimed that worldwide terrorism dropped by 45 percent between 2001 and 2003 and that the number of acts committed last year "represents the lowest annual total of international terrorist attacks since 1969." The report was transmitted to Congress with a cover letter that interpreted the data as "an indication of the great progress that has been made in fighting terrorism" after the horrific events of September 11.

But we immediately spotted errors in the report and evidence contradicting the administration's claims. For example, the chronology in Appendix A, which lists each significant terrorist incident occurring in the year, stopped on November 11, an unusual end to the calendar year. Clearly, this was a mistake, as four terrorist attacks occurred in Turkey between November 12 and the end of 2003. Yet it was impossible to tell whether the post-November 11 incidents were inadvertently dropped from the chronology but included in the figures in the body of the report, or completely overlooked.

More important, even with the incomplete data, the number of significant incidents listed in the chronology was very high. It tallied a total of 169 significant events for 2003 alone, the highest annual count in 20 years; the annual average over the previous five years was 131. How could the number of significant attacks be at a record high, when the State Department was claiming the lowest total number of attacks since 1969? The answer is that the implied number of "nonsignificant" attacks has declined sharply in recent years. But because nonsignificant events were not listed in the chronology, the drop could not be verified. And if, by definition, they were not significant, it is unclear why their decrease should merit attention.

On June 10, after a critical op-ed we wrote in The Washington Post, a follow-up letter to Powell from Representative Henry Waxman (D-Calif.), and a call for review from the Congressional Research Service, the State Department acknowledged errors in the report. "We did not check and verify the data sufficiently," spokesman Richard Boucher said. "... [T]he figures for the number of attacks and casualties will be up sharply from what was published."

At first, Waxman accused the administration of manipulating the data to "serve the Administration's political interests." Powell denied the allegation, insisting that "there's nothing political about it. It was a data collection and reporting error." Although there is no reason to doubt Powell's explanation, if the errors had gone in the opposite direction, making the rise in terrorism on President George W. Bush's watch look even greater than it has been, it is a safe bet that the administration would have caught them before releasing the report. And such asymmetric vetting is a form of political manipulation.

Critical deficiencies in the way the report was prepared and presented compromised its accuracy and credibility. Chief among these were the opaque procedures used to assemble the report, the inconsistent application of definitions, insufficient review, and the partisan release of the report. These deficiencies resulted in a misleading and unverifiable report that appeared to be tainted by political manipulation.

It is unclear exactly how the report was assembled. The report notes that the U.S. government's Incident Review Panel (IRP) is responsible for determining which terrorist events are significant. It says little, however, about the panel's members: how many there are, whether they are career employees or political appointees, or what affiliations they have. Nor does it describe how they decide whether an event is significant. Do they work by consensus or majority rule? What universe of events do they consider?

The State Department announced a decline in total terrorist attacks, which resulted from a decline in nonsignificant events. But without information about the nonsignificant events, readers were essentially asked to blindly trust the nameless experts who prepared the report.

The report's broad definitions, moreover, are sometimes too blunt to help classification. Terrorism is defined as "premeditated, politically motivated violence perpetrated against noncombatant targets by subnational groups or clandestine agents, usually intended to influence an audience." The report specifies that an international terrorist attack is an act committed by substate actors from one nation against citizens or property of another. An incident "is judged significant if it results in loss of life or serious injury to persons, major property damage, and/or is an act or attempted act that could reasonably be expected to create the conditions noted."

But hardly any explanation was provided about how the IRP distinguishes significant from nonsignificant events. When is property damage too minor for an event to be significant? How are nonsignificant events identified? Is the IRP responsible for making these determinations too? Has the source and scope of their information changed over time? The corrected 2003 report, the first to list individual nonsignificant acts, defines as "major" property damage that exceeds $10,000. It does not indicate, however, whether that criterion applied to previous reports.

Admittedly, measuring international terrorism is no easy task. Even scholarly reckonings are not free from subjective judgment, and there are inevitably close calls to be made. The most one can hope for in many cases is consistent application of ambiguous definitions.

Unfortunately, in the global terrorism reports the rules have been applied inconsistently. Many cross-border attacks on civilians in Africa have not been included in the reports, for example, even though similar attacks in other regions have been. The report for 2002, moreover, counts as significant a suicide attack by Chechen shaheeds (Islamist martyrs) against a government building in Moscow that killed 72 people. Yet none of the numerous suicide attacks by the Chechen "black widows" that terrorized Russia and killed scores in 2003 was tallied as an international terrorist attack in the latest report. After one such attack, Russian President Vladimir Putin said, "Today, after a series of recent terrorist attacks, we can say that the bandits active in Chechnya are not just linked with international terrorism, they are an integral part of it." If the State Department considers such attacks domestic, rather than international, it should do so consistently from one year to the next.

Another problem is that the staff that prepared the 2003 global terrorism report did not participate in releasing it; in fact, they have yet to be identified. High-level Bush administration officials presented the report to the media, using it to support White House policies and take credit for the alleged decline in terrorism. Even after the report's flaws were recognized, they continued to spin the figures. When the corrected version was released, Black repeated that "we have made significant progress," despite being pressed to acknowledge that last year the number of significant attacks reached a 20-year high. Given the war on terrorism's central role in the upcoming presidential election, such presentation gives the appearance that the report is being manipulated for political gain.

The State Department has tried to explain the report's flaws using language eerily reminiscent of the Bush administration's justification of the failure to find weapons of mass destruction in Iraq. Spokesman Boucher told reporters that previous claims that the war on terrorism was succeeding had been based "on the facts as we had them at the time [and] the facts that we had were wrong." Even Powell partook in the spinning. On the one hand, he announced that "the [original] narrative is sound and we're not changing any of the narrative." On the other hand, he acknowledged, "We will change the narrative wherever the narrative relates to the data."

To his credit, Powell instructed those responsible for preparing the report to brief Waxman's staff on the procedures they had used and the origins of their mistakes. Based on a summary of the briefing by Waxman's staff, much has come to light. Authority for compiling the list of attacks was shifted from the CIA to the Terrorist Threat Integration Center (TTIC), an organization created in May 2003 to "merge and analyze all threat information in a single location." The TTIC provided information to the IRP, which, it was disclosed, consists of representatives from the CIA, the Defense Intelligence Agency, the National Security Agency, and the State Department's Bureau of Intelligence and Research. A TTIC representative chaired the meetings and could cast a vote to break ties on the classification of an event as significant or nonsignificant.

This year, at least, chaos prevailed. The IRP's members changed from meeting to meeting--when they attended the meetings at all. The CIA employee responsible for the database left but was never replaced; in mid-process, an outside contractor who entered data was replaced by another. Because of technical incompetence, the report relied on the wrong cutoff date.

Arithmetic errors were rampant. Larry Johnson, a retired CIA and State Department professional, discovered that the total number of fatalities in the chronology exceeded the number listed in the statistical review in Appendix G. According to Black, the errors resulted from "a combination of things: inattention, personnel shortages and database that is awkward and is antiquated and needs to have very proficient input be made in order for to be sure that the numbers will spill then to the different categories that are being captured [sic]." The debacle is more like an episode of the Keystone Kops than a chapter from Machiavelli, but even that analogy is not very comforting.

SETTING THE RECORD STRAIGHT

Despite the data's limitations, the chronology of significant events in the 2003 global terrorism report yields important information about terrorism's trends, its geographical characteristics, and its magnitude.

Time-series analysis, which seeks to discern trends in given phenomena over time, requires a consistent approach to collecting data. The State Department's terrorism report presents time-series analysis, but by focusing on the total number of attacks it misleadingly combines verifiable data on significant events with nonverifiable data on nonsignificant ones. And because, as TTIC director John Brennan admitted, "many nonsignificant events occur throughout the world that are not counted in the report," one must also be concerned about consistency in the measurement of the total number of terrorist events. Even if the nonsignificant events were listed (and thus could be verified), trends in significant events are more relevant because they track events that, by definition, are more important. Accurately measuring these trends is a prerequisite for understanding the factors that underlie them and the policies that shape them. In fact, an analysis of the revised report reveals that the number of significant attacks increased from 124 to 175, or by 41 percent, from 2001 to 2003--a significant fact indeed.
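The percent increase cited above is simple to verify from the report's own figures. A few lines of Python confirm the arithmetic (only the two attack counts come from the revised report; the rest is ordinary percent-change calculation):

```python
# Counts of significant international terrorist attacks,
# per the revised 2003 "Patterns of Global Terrorism" report.
attacks_2001 = 124
attacks_2003 = 175

# Percent change from 2001 to 2003: (new - old) / old * 100.
pct_increase = (attacks_2003 - attacks_2001) / attacks_2001 * 100
print(f"significant attacks rose by {pct_increase:.0f} percent")
# prints: significant attacks rose by 41 percent
```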

The detailed chronology also allows analysts to cumulate terrorist events for each country and cross-classify them according to the country where they occurred and the perpetrators' country of origin. These figures can then be related to the countries' characteristics, yielding information that can help policymakers devise strategies to address terrorism's root causes. Using the global terrorism reports for the years 1997-2002, the authors of this article have previously found that terrorists tend to come from nondemocratic countries, both rich and poor, and generally target nationals from rich, democratic countries.

The State Department has rightly emphasized that the threat of terrorism remains serious, but a close examination of its data helps put the magnitude of the threat in perspective. In 2003, a total of 625 people--including 35 Americans--were killed in international terrorist incidents worldwide. Meanwhile, 43,220 died in automobile accidents in the United States alone, and three million died from AIDS around the world. Comparative figures, particularly when combined with forecasts of future terrorism trends, can help focus debate on the real costs people are willing to bear--in foregone civil liberties and treasure--to reduce the risk posed by terrorism.

CHANGING TRACKS

The State Department currently uses, and Congress accepts, nineteenth-century methods to analyze a twenty-first-century problem. To prevent errors of the type that riddled the 2003 global terrorism report, Congress has two alternatives. It could reassign the State Department's reporting responsibilities to a neutral research agency, such as the GAO (the General Accounting Office, recently renamed the Government Accountability Office), which routinely uses appropriate statistical practices. The problem is that the GAO has little foreign policy expertise and does not necessarily have access to the (sometimes classified) information that goes into the reports. Alternatively, Congress could keep the reports within the State Department's purview but demand that its practices for data collection and analysis be improved and that the reports be insulated from partisan manipulation.

If responsibility remains within the State Department, Congress should establish a statistical bureau in the department to ensure that scientific standards are respected in all reports, thereby elevating the status of data-gathering and statistics there. The bureau would promote consistency, statistical rigor, and transparency. When appropriate, it could seek input from the scientific community. And, while respecting classified sources, it could also insist that sufficient information be released to independent analysts for verification.

To overcome conflicts of interest facing political appointees who issue government reports, the State Department should adopt rules similar to those that govern the production and dissemination of key economic indicators. Career staff who prepare the reports should be given an hour to brief the media on technical aspects of the data, during which time political appointees would be precluded from making public comments. (After the hour elapses, it is expected that political appointees would offer their interpretations.) Career staff should be protected so they can prepare mandated reports without interference from political appointees and then present them for review by the statistics bureau. Once the reports are finalized, but before they are publicly released, they should be circulated to designated political appointees who need to prepare for their release. Disclosure dates should be announced long in advance to prevent opportunistic timing by political appointees.

Last October, in a candid memorandum to top aides that was leaked to the press, Secretary of Defense Donald Rumsfeld admitted, "Today, we lack metrics to know if we are winning or losing the global war on terror. Are we capturing, killing, or deterring and dissuading more terrorists every day than the madrassas [Islamic schools] and the radical clerics are recruiting, training, and deploying against us?" The statement was a stinging acknowledgment that the government lacks both classified and unclassified data to make critical policy decisions. It is also a reminder that only accurate information, presented without political spin, can help the public and decision-makers know where the United States stands in the war on terrorism and how best to fight it.

Alan B. Krueger is Bendheim Professor of Economics and Public Policy at Princeton University. David D. Laitin is Watkins Professor of Political Science at Stanford University.

This paper is part of a larger study on the historical experience of Independent Power Producers (IPPs) in countries undergoing transition in their institutions of governance. The study seeks to explain the patterns of investment in IPPs and project outcomes with the aim of using this information to plot alternative future models for IPP investment. This paper follows the research methods and guidelines laid out in the research protocol, "The Experience with Independent Power Projects in Developing Countries: Introduction and Case Study Methods".

Working Paper: Program on Energy and Sustainable Development Working Paper #32, by Joshua C. House

The enlargement of the EU to the Central and Eastern European Country (CEEC) candidates, the Turkish candidacy, and the Stabilization and Association Process in the Balkans provide researchers with intriguing opportunities for exploring the effects of international actors on democratic and rule-of-law reforms in a diverse group of countries.

Working Paper, by Amichai Magen

The fifth enlargement round (Enlargement) of the European Union (EU), which took place on May 1, 2004, is rightly recognized as a momentous landmark in the history of modern European integration: the culmination of a fifteen-year process that has transformed, and will continue to deeply affect, the regime characteristics of the post-communist New Member States (NMS) and the remaining candidates (Bulgaria, Croatia, Romania, and Turkey), as well as the EU governance system and its perception of itself as an international actor.

Working Paper: CDDRL Working Papers, by Amichai Magen