Comparative effectiveness research

Background: Although recent guidelines call for expanded routine screening for HIV, resources for antiretroviral therapy (ART) are limited, and not all eligible persons are currently receiving treatment.

Objective: To evaluate the effects on the U.S. HIV epidemic of expanded ART, HIV screening, or interventions to reduce risk behavior.

Design: Dynamic mathematical model of HIV transmission and disease progression and cost-effectiveness analysis.

Data Sources: Published literature.

Target Population: High-risk (injection drug users and men who have sex with men) and low-risk persons aged 15 to 64 years in the United States.

Time Horizon: Twenty years and lifetime (costs and quality-adjusted life-years [QALYs]).

Perspective: Societal.

Intervention: Expanded HIV screening and counseling, treatment with ART, or both.

Outcome Measures: New HIV infections, discounted costs and QALYs, and incremental cost-effectiveness ratios.

Results of Base-Case Analysis: One-time HIV screening of low-risk persons coupled with annual screening of high-risk persons could prevent 6.7% of a projected 1.23 million new infections and cost $22 382 per QALY gained, assuming a 20% reduction in sexual activity after screening. Expanding ART utilization to 75% of eligible persons prevents 10.3% of infections and costs $20 300 per QALY gained. A combination strategy prevents 17.3% of infections and costs $21 580 per QALY gained.

Results of Sensitivity Analysis: With no reduction in sexual activity, expanded screening prevents 3.7% of infections. Initiating ART earlier, when the CD4 count is greater than 0.350 × 10⁹ cells/L, prevents 20% to 28% of infections. Additional efforts to halve high-risk behavior could reduce infections by 65%.

Limitation: The model of disease progression and treatment was simplified, and acute HIV screening was excluded.

Conclusion: Expanding HIV screening and treatment simultaneously offers the greatest health benefit and is cost-effective. However, even substantial expansion of HIV screening and treatment programs is not sufficient to markedly reduce the U.S. HIV epidemic without substantial reductions in risk behavior.

Primary Funding Source: National Institute on Drug Abuse, National Institutes of Health, and Department of Veterans Affairs.
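
The dynamic transmission model behind these results tracks how screening and ART coverage feed back into the number of new infections. As a rough illustration of the general technique (not the study's model; every parameter below is a hypothetical placeholder), a compartmental simulation in Python might look like this:

    # Minimal sketch of a compartmental HIV transmission model of the kind
    # the abstract describes. Every parameter here is a hypothetical
    # placeholder, not a value from the study.
    def simulate(screen_rate, art_coverage, years=20, dt=0.1):
        S, I_u, I_t = 0.99, 0.01, 0.0  # susceptible, undiagnosed, on ART
        beta = 0.05                    # transmission coefficient per year
        art_rel_inf = 0.5              # relative infectivity while on ART
        cum_infections = 0.0
        for _ in range(round(years / dt)):
            incidence = beta * S * (I_u + art_rel_inf * I_t) * dt
            start_art = screen_rate * art_coverage * I_u * dt
            S -= incidence
            I_u += incidence - start_art
            I_t += start_art
            cum_infections += incidence
        return cum_infections

    base = simulate(screen_rate=0.10, art_coverage=0.50)
    expanded = simulate(screen_rate=0.30, art_coverage=0.75)
    print(f"infections averted: {100 * (1 - expanded / base):.1f}%")

Raising the screening rate or ART coverage lowers the force of infection, which is why the combination strategy in the abstract averts more infections than either expansion alone.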

Publication Type: Journal Articles
Journal Publisher: Annals of Internal Medicine
Authors: Elisa F. Long, Margaret Brandeau, Douglas K. Owens

Background— Family members of patients with established long-QT syndrome (LQTS) often lack definitive clinical findings, yet may have inherited an LQTS mutation and be at risk of sudden death. Genetic testing can identify mutations in 75% of patients with LQTS, but genetic testing of family members remains controversial.

Methods and Results— We used a Markov model to assess the cost-effectiveness of 3 strategies for treating an asymptomatic 10-year-old, first-degree relative of a patient with clinically evident LQTS. In the genetic testing strategy, relatives undergo genetic testing only for the mutation identified in the index patient, and relatives who test positive for the mutation are treated with β-blockers. This strategy was compared with (1) empirical treatment of relatives with β-blockers and (2) watchful waiting, with treatment only after development of symptoms. The genetic testing strategy resulted in better survival and more quality-adjusted life-years at higher cost, with a cost-effectiveness ratio of $67 400 per quality-adjusted life-year gained compared with watchful waiting. The cost-effectiveness of the genetic testing strategy improved to less than $50 000 per quality-adjusted life-year gained when testing was applied selectively to (1) relatives with a higher clinical suspicion of LQTS (pretest probability, 65% to 81%), (2) families with a higher-than-average risk of sudden death, or (3) larger families (2 or more first-degree relatives tested).

Conclusions— Genetic testing of young first-degree relatives of patients with definite LQTS is moderately expensive, but can reach acceptable thresholds of cost-effectiveness when applied to selected patients.
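
A Markov model of the kind used here steps a cohort through discrete health states in fixed cycles, accumulating discounted costs and QALYs for each strategy. A minimal sketch of the mechanics, with entirely hypothetical states, transition probabilities, utilities, and costs:

    import numpy as np

    # Minimal Markov cohort sketch; all numbers are hypothetical, not the
    # study's inputs. States: asymptomatic, post-cardiac-event, dead.
    P = np.array([              # annual transition probabilities (rows sum to 1)
        [0.990, 0.007, 0.003],
        [0.000, 0.960, 0.040],
        [0.000, 0.000, 1.000],
    ])
    utility = np.array([1.00, 0.80, 0.00])  # quality weight per state-year
    cost = np.array([200.0, 2500.0, 0.0])   # annual cost per state (USD)

    cohort = np.array([1.0, 0.0, 0.0])      # everyone starts asymptomatic
    qalys, costs, discount = 0.0, 0.0, 0.03
    for year in range(70):
        df = 1.0 / (1.0 + discount) ** year
        qalys += df * (cohort @ utility)
        costs += df * (cohort @ cost)
        cohort = cohort @ P                 # advance the cohort one cycle
    print(f"discounted QALYs: {qalys:.2f}, discounted cost: ${costs:,.0f}")

Running such a model once per strategy (genetic testing, empirical β-blockers, watchful waiting) and differencing the discounted totals yields cost-effectiveness ratios like those reported above.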

Publication Type: Journal Articles
Journal Publisher: Circulation: Cardiovascular Quality and Outcomes
Authors: Marco V. Perez, Narmadan A. Kumarasamy, Douglas K. Owens, Paul J. Wang, Mark A. Hlatky

Background: Warfarin reduces the risk for ischemic stroke in patients with atrial fibrillation (AF) but increases the risk for hemorrhage. Dabigatran is a fixed-dose, oral direct thrombin inhibitor that achieves similar or lower rates of ischemic stroke and intracranial hemorrhage in patients with AF compared with warfarin.

Objective: To estimate the quality-adjusted survival, costs, and cost-effectiveness of dabigatran compared with adjusted-dose warfarin for preventing ischemic stroke in patients 65 years or older with nonvalvular AF.

Design: Markov decision model.

Data Sources: The RE-LY (Randomized Evaluation of Long-Term Anticoagulation Therapy) trial and other published studies of anticoagulation. The cost of dabigatran was estimated on the basis of pricing in the United Kingdom.

Target Population: Patients 65 years or older with nonvalvular AF and risk factors for stroke (CHADS₂ score ≥1 or equivalent) and no contraindications to anticoagulation.

Time Horizon: Lifetime.

Perspective: Societal.

Intervention: Warfarin anticoagulation (target international normalized ratio, 2.0 to 3.0); dabigatran, 110 mg twice daily (low dose); and dabigatran, 150 mg twice daily (high dose).

Outcome Measures: Quality-adjusted life-years (QALYs), costs (in 2008 U.S. dollars), and incremental cost-effectiveness ratios.

Results of Base-Case Analysis: The quality-adjusted life expectancy was 10.28 QALYs with warfarin, 10.70 QALYs with low-dose dabigatran, and 10.84 QALYs with high-dose dabigatran. Total costs were $143,193 for warfarin, $164,576 for low-dose dabigatran, and $168,398 for high-dose dabigatran. The incremental cost-effectiveness ratios compared with warfarin were $51,229 per QALY for low-dose dabigatran and $45,372 per QALY for high-dose dabigatran.

Results of Sensitivity Analysis: The model was sensitive to the cost of dabigatran but was relatively insensitive to other model inputs. The incremental cost-effectiveness ratio increased to $50,000 per QALY at a cost of $13.70 per day for high-dose dabigatran but remained less than $85,000 per QALY over the full range of model inputs evaluated. The cost-effectiveness of high-dose dabigatran improved with increasing risk for stroke and intracranial hemorrhage.

Limitation: Event rates were largely derived from a single randomized clinical trial and extrapolated to a 35-year time frame from clinical trials with approximately 2-year follow-up.

Conclusion: In patients 65 years or older with nonvalvular AF at increased risk for stroke (CHADS₂ score ≥1 or equivalent), dabigatran may be a cost-effective alternative to warfarin depending on pricing in the United States.

Primary Funding Source: American Heart Association and Veterans Affairs Health Services Research & Development Service.
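
The incremental cost-effectiveness ratios follow directly from the base-case figures: incremental cost divided by incremental QALYs relative to warfarin. Recomputing them from the rounded numbers in the abstract (the small differences from the published ratios reflect rounding of the reported QALY totals):

    # ICER = (cost_new - cost_ref) / (QALYs_new - QALYs_ref), using the
    # base-case figures reported in the abstract.
    strategies = {
        "low-dose dabigatran": (164_576, 10.70),
        "high-dose dabigatran": (168_398, 10.84),
    }
    warfarin_cost, warfarin_qalys = 143_193, 10.28
    for name, (cost, qalys) in strategies.items():
        icer = (cost - warfarin_cost) / (qalys - warfarin_qalys)
        print(f"{name}: ${icer:,.0f} per QALY gained vs. warfarin")
    # -> roughly $50,900 (low dose) and $45,000 (high dose) per QALY;
    #    the published $51,229 and $45,372 reflect unrounded model outputs.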

Publication Type: Journal Articles
Journal Publisher: Annals of Internal Medicine
Authors: James Freeman, Ruo Zhu, Douglas K. Owens, Alan M. Garber, David Hutton, Alan Go, Paul Wang, Mintu Turakhia

The major expansion of federal comparative effectiveness research launched in 2009 held the potential to supply the information needed to help slow health spending growth while improving the outcomes of care. However, when Congress passed the Patient Protection and Affordable Care Act one year later, it limited the role of cost analysis in the work sponsored by the Patient-Centered Outcomes Research Institute. Despite this restriction, cost-effectiveness analysis meets important needs and is likely to play a larger role in the future. Under the terms of the Affordable Care Act, the institute can avoid commissioning cost-effectiveness analyses and still provide information bearing on the use and costs of health care interventions. This information will enable others to investigate the comparative value of these interventions. We argue that doing so is necessary for decision makers who are attempting to raise the quality of care while reining in health spending.

Publication Type: Journal Articles
Journal Publisher: Health Affairs
Authors: Alan Garber, Harold C. Sox

Background: Many myocardial infarctions and strokes occur in individuals with low-density lipoprotein cholesterol levels below recommended treatment thresholds. High-sensitivity C-reactive protein (hs-CRP) testing has been advocated to identify low- and intermediate-risk individuals who may benefit from statin therapy.

Methods and Results: A decision-analytic Markov model was used to follow hypothetical cohorts of individuals with normal lipid levels but without coronary artery disease, peripheral arterial disease, or diabetes mellitus. The model compared current Adult Treatment Panel III practice guidelines; a strategy of hs-CRP screening in those without an indication for statin treatment by current practice guidelines, followed by treatment only in those with elevated hs-CRP levels; and a strategy of statin therapy at specified predicted risk thresholds without hs-CRP testing. Risk-based treatment without hs-CRP testing was the most cost-effective strategy, assuming that statins were equally effective regardless of hs-CRP status. However, if normal hs-CRP levels identified a subgroup with little or no benefit from statin therapy (<20% relative risk reduction), then hs-CRP screening would be the optimal strategy. If harms from statin use were greater than generally recognized, then use of current clinical guidelines would be the optimal strategy.

Conclusions: Risk-based statin treatment without hs-CRP testing is more cost-effective than hs-CRP screening, assuming that statins have good long-term safety and provide benefits among low-risk people with normal hs-CRP.
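
The pivotal sensitivity result above is a threshold analysis: the preferred strategy flips when the relative risk reduction (RRR) from statins in the normal-hs-CRP subgroup falls low enough. A minimal sketch of that logic, with every input a hypothetical placeholder rather than a value from the study:

    # Threshold sketch: risk-based treatment beats hs-CRP screening only if
    # statins still help people with normal hs-CRP. All inputs hypothetical.
    def qaly_gain(treat_normal_group, rrr_normal, rrr_elevated=0.30,
                  base_risk=0.10, p_elevated=0.30,
                  event_qaly_loss=2.0, statin_qaly_cost=0.05):
        """Expected QALY gain per person versus no statin treatment."""
        gain = p_elevated * (base_risk * rrr_elevated * event_qaly_loss
                             - statin_qaly_cost)
        if treat_normal_group:  # risk-based strategy also treats normal hs-CRP
            gain += (1 - p_elevated) * (base_risk * rrr_normal * event_qaly_loss
                                        - statin_qaly_cost)
        return gain

    for rrr in (0.30, 0.20, 0.10, 0.00):
        risk_based = qaly_gain(True, rrr)
        screening = qaly_gain(False, rrr)
        winner = "risk-based treatment" if risk_based > screening else "hs-CRP screening"
        print(f"RRR {rrr:.0%} in normal-hs-CRP group -> {winner}")

With these placeholder inputs the strategies switch near a 25% RRR; the study's threshold of about 20% reflects its own parameter set.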

Publication Type: Journal Articles
Journal Publisher: Circulation
Authors: Lee KK, Cipriano LE, Douglas K. Owens, Go AS, Mark A. Hlatky

With the awareness of maternal depression as a prevalent public health issue and its important link to child physical and mental health, attention has turned to how healthcare providers can respond effectively. Intimate partner violence (IPV) and the use of alcohol, tobacco, and other drugs are strongly related to depression, particularly for low-income women. The American College of Obstetricians and Gynecologists (ACOG) recommends psychosocial screening of pregnant women at least once per trimester, yet such screening is rarely performed. Research suggests that a collaborative care approach improves identification, outcomes, and the cost-effectiveness of care. This article presents the Perinatal Mental Health Model, a community-based model that developed screening and referral partnerships for use in community obstetric settings to address the psychosocial needs of culturally diverse, low-income mothers.

Publication Type: Journal Articles
Journal Publisher: Journal of Women's Health
Authors: Connelly CD, Baker-Ericzen MJ, Hazen AL, Landsverk J, Sarah (Sally) Horwitz

BACKGROUND: Universal testing and treatment holds promise for reducing the burden of human immunodeficiency virus (HIV) in sub-Saharan Africa, but linkage from testing to treatment sites and retention in care are inadequate.

METHODS: We developed a simulation of the HIV epidemic and HIV disease progression in South Africa to compare the outcomes of the present HIV treatment campaign (status quo) with 4 HIV testing and treating strategies that increase access to antiretroviral therapy: (1) universal testing and treatment without changes in linkage to care and loss to follow-up; (2) universal testing and treatment with improved linkage to care; (3) universal testing and treatment with reduced loss to follow-up; and (4) comprehensive HIV care with universal testing and treatment, improved linkage to care, and reduced loss to follow-up. The main outcome measures were survival benefits, new HIV infections, and HIV prevalence. 

RESULTS: Compared with the status quo strategy, universal testing and treatment (strategy 1) was associated with a mean (95% uncertainty bounds) life expectancy gain of 12.0 months (11.3-12.2 months) and 35.3% (32.7%-37.5%) fewer HIV infections over a 10-year time horizon. Improved linkage to care (strategy 2), prevention of loss to follow-up (strategy 3), and comprehensive HIV care (strategy 4) provided substantial additional benefits: life expectancy gains compared with the status quo strategy were 16.1, 18.6, and 22.2 months, and new infections were 55.5%, 51.4%, and 73.2% lower, respectively. In sensitivity analysis, comprehensive HIV care reduced new infections by 69.7% to 76.7% under a broad set of assumptions.

CONCLUSIONS: Universal testing and treatment with current levels of linkage to care and loss to follow-up could substantially reduce the HIV death toll and the number of new HIV infections. However, increasing linkage to care and preventing loss to follow-up provide nearly twice the benefit of universal testing and treatment alone.
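
The headline comparison (linkage and retention improvements roughly doubling the benefit of testing alone) follows from the multiplicative structure of the care cascade. A back-of-envelope sketch, with all probabilities hypothetical rather than the study's inputs:

    # Effective ART coverage is roughly the product of the cascade steps.
    # All probabilities below are hypothetical illustrations.
    def effective_coverage(tested, linked, retained):
        return tested * linked * retained

    strategies = {
        "status quo":                 (0.40, 0.60, 0.70),
        "universal testing only":     (0.90, 0.60, 0.70),
        "testing + improved linkage": (0.90, 0.85, 0.70),
        "testing + reduced loss":     (0.90, 0.60, 0.90),
        "comprehensive (all three)":  (0.90, 0.85, 0.90),
    }
    for name, steps in strategies.items():
        print(f"{name:30s} {effective_coverage(*steps):5.0%} effectively treated")

Because the steps multiply, a weak link anywhere in the cascade caps the benefit of expanded testing, which is the pattern the simulation results above show.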

Publication Type: Journal Articles
Journal Publisher: Archives of Internal Medicine
Authors: Eran Bendavid, Margaret L. Brandeau, Wood R, Douglas K. Owens

Authors: Jennifer Burney, David Lobell, Louis Bergeron
News Type: News

Advances in high-yield agriculture achieved during the so-called Green Revolution have not only helped feed the planet, but also have helped slow the pace of global warming by cutting the amount of biomass burned - and the resulting greenhouse gas emissions - when forests or grasslands are cleared for farming. Stanford researchers estimate those emissions have been trimmed by over half a trillion tons of carbon dioxide. The paper is being released this week in the Proceedings of the National Academy of Sciences.

Advances in high-yield agriculture over the latter part of the 20th century have prevented massive amounts of greenhouse gases from entering the atmosphere - the equivalent of 590 billion metric tons of carbon dioxide - according to a new study led by two Stanford Earth scientists.

The yield improvements reduced the need to convert forests to farmland, a process that typically involves burning of trees and other plants, which generates carbon dioxide and other greenhouse gases.

The researchers estimate that if not for increased yields, additional greenhouse gas emissions from clearing land for farming would have been equal to as much as a third of the world's total output of greenhouse gases since the dawn of the Industrial Revolution in 1850.

The researchers also calculated that for every dollar spent on agricultural research and development since 1961, emissions of the three principal greenhouse gases - methane, nitrous oxide and carbon dioxide - were reduced by the equivalent of about a quarter of a ton of carbon dioxide - a high rate of financial return compared to other approaches to reducing the gases.

"Our results dispel the notion that modern intensive agriculture is inherently worse for the environment than a more 'old-fashioned' way of doing things," said Jennifer Burney, lead author of a paper describing the study that will be published online by the Proceedings of the National Academy of Sciences.

Adding up the impact

The researchers calculated emissions of carbon dioxide, methane and nitrous oxide, converting the amounts of the latter two gases into the quantities of carbon dioxide that would have an equivalent impact on the atmosphere, to facilitate comparison of total greenhouse gas outputs.
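
The conversion uses global warming potentials (GWPs): each gas's emissions are scaled by its heat-trapping effect relative to carbon dioxide over a chosen horizon. The article does not state which factors the study used, so the 100-year values below (about 25 for methane and 298 for nitrous oxide, per IPCC AR4) are assumptions for illustration:

    # CO2-equivalent conversion. The GWP values are assumed for
    # illustration (IPCC AR4, 100-year horizon); the study's exact
    # factors are not given in this article.
    GWP_100 = {"co2": 1, "ch4": 25, "n2o": 298}

    def co2_equivalent(tons_by_gas):
        return sum(GWP_100[gas] * tons for gas, tons in tons_by_gas.items())

    # Example: 1 Mt CO2, 0.1 Mt CH4, and 0.01 Mt N2O (in metric tons)
    print(f"{co2_equivalent({'co2': 1e6, 'ch4': 1e5, 'n2o': 1e4}):,.0f} t CO2e")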

Burney, a postdoctoral researcher with the Program on Food Security and the Environment at Stanford, said agriculture currently accounts for about 12 percent of human-caused greenhouse gas emissions. Although greenhouse gas emissions from the production and use of fertilizer have increased with agricultural intensification, those emissions are far outstripped by the emissions that would have been generated in converting additional forest and grassland to farmland.

"Every time forest or shrub land is cleared for farming, the carbon that was tied up in the biomass is released and rapidly makes its way into the atmosphere - usually by being burned," she said. "Yield intensification has lessened the pressure to clear land and reduced emissions by up to 13 billion tons of carbon dioxide a year."

"When we look at the costs of the research and development that went into these improvements, we find that funding agricultural research ranks among the cheapest ways to prevent greenhouse gas emissions," said Steven Davis, a co-author of the paper and a postdoctoral researcher at the Carnegie Institution at Stanford.

To evaluate the impact of yield intensification on climate change, the researchers compared actual agricultural production between 1961 and 2005 with hypothetical scenarios in which the world's increasing food needs were met by expanding the amount of farmland rather than by the boost in yields produced by the Green Revolution.

"Even without higher yields, population and food demand would likely have climbed to levels close to what they are today," said David Lobell, also a coauthor and assistant professor of environmental Earth system science at Stanford.

"Lower yields per acre would likely have meant more starvation and death, but the population would still have increased because of much higher birth rates," he said. "People tend to have more children when survival of those children is less certain."

Avoiding the need for more farmland

The researchers found that without the advances in high-yield agriculture, several billion additional acres of cropland would have been needed.

Comparing emissions in the theoretical scenarios with real-world emissions from 1961 to 2005, the researchers estimated that the actual improvements in crop yields probably kept greenhouse gas emissions equivalent to at least 317 billion tons of carbon dioxide out of the atmosphere, and perhaps as much as 590 billion tons.

Without the emission reductions from yield improvements, the total amount of greenhouse gas pumped into the atmosphere over the preceding 155 years would have been between 18 and 34 percent greater than it has been, they said.

To calculate how much money was spent on research for each ton of avoided emissions, the researchers tallied the total agricultural research funding related to yield improvements from 1961 through 2005. That produced a price between approximately $4 and $7.50 for each ton of carbon dioxide that was not emitted.
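
The per-ton price is simply research spending divided by avoided emissions, and the figures quoted in this article are mutually consistent: "about a quarter of a ton" per dollar is the $4-per-ton end of the range.

    # Checking the quoted figures: dollars per avoided ton is the
    # reciprocal of tons avoided per research dollar.
    print(f"${1 / 0.25:.2f} per ton")  # a quarter ton per dollar -> $4.00/ton
    for price in (4.00, 7.50):         # the article's reported range
        print(f"${price:.2f}/ton is {1 / price:.2f} tons avoided per dollar")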

"The size and cost-effectiveness of this carbon reduction is striking when compared with proposed mitigation options in other sectors," said Lobell. "For example, strategies proposed to reduce emissions related to construction would cut emissions by a little less than half the amount that we estimate has been achieved by yield improvements and would cost close to $20 per ton."

The authors also note that raising yields alone won't guarantee lower emissions from land use change.

"It has been shown in several contexts that yield gains alone do not necessarily stop expansion of cropland," Lobell said. "That suggests that intensification must be coupled with conservation and development efforts.

"In certain cases, when yields go up in an area, it increases the profitability of farming there and gives people more incentive to expand their farm. But in general, high yields keep prices low, which reduces the incentive to expand."

The researchers concluded that improvement of crop yields should be prominent among a portfolio of strategies to reduce global greenhouse gas emissions.

"The striking thing is that all of these climate benefits were not the explicit intention of historical investments in agriculture. This was simply a side benefit of efforts to feed the world," Burney noted. "If climate policy intentionally rewarded these kinds of efforts, that could make an even bigger difference. The question going forward is how climate policy might be designed to achieve that."

David Lobell is a Center Fellow at the Freeman Spogli Institute for International Studies and at the Woods Institute for the Environment. The Program on Food Security and the Environment is a joint project of the Woods Institute and the Freeman Spogli Institute. The Precourt Institute for Energy and FSE provided funding for Jennifer Burney's research on agriculture and energy.


News Type: News

Among hundreds of applicants, REAP was one of only 25 groups to secure a competitive grant from the International Initiative for Impact Evaluation, or 3ie, to assess the value of expanding vocational education and training (VET) in China.

Read below for a summary of the proposed study.


Investment in Vocational vs. General Schooling: Evaluating China’s Expansion of Vocational Education and Laying the Foundation for Further Vocational Education Evaluation


A key policy question in developing countries, including China, is how to balance investments between vocational and general education in a way that supports economic growth and reduces social inequality. There is no definitive study in any developing country on the returns to vocational education and training (VET). In the absence of information on how VET might affect the earnings of workers, it is unclear if recent efforts of the Chinese government to expand VET are sound. If the returns are negligible, the government might consider slowing the expansion or improving the quality of VET.

Additionally, it is estimated that only about 40% of the students who graduate from junior high school in poor, rural areas continue with their studies; the rest enter the unskilled labour force. Why are these rates so low? Surprisingly little is known about the factors that keep students out of school, and there is no systematic study of what is working in VET and what is not. Despite the rapid expansion of VET, China has set up few mechanisms to evaluate the quality of VET programs.

The goal of this project is to help the Chinese government evaluate the effectiveness of the expansion of VET. It aims to provide empirical evidence on the returns to VET and on the factors that might keep disadvantaged students from receiving quality schooling, and to measure the quality and cost-effectiveness of VET programs.

This study will estimate the returns to VET versus general schooling using various “quasi-experimental” methods. It will follow a randomized controlled trial design, randomly assigning junior high students to programmes that provide vouchers for VET schooling, vouchers for academic schooling, or academic counselling to help students become better informed about their schooling and employment options. The project will assess whether students work harder, perform better, and matriculate to academic high school and VET at higher rates when they have sufficient financial aid and counselling. Finally, it will also develop an entrance/exit examination that principals of VET institutions and local officials in charge of VET can use to assess the quality of their programs. The findings of these quality studies of VET will be useful in influencing policy on one of China’s most debated education issues.
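
As an illustration of the random-assignment step this design calls for, here is a minimal sketch. The arm names mirror the three programmes described above plus an assumed untreated comparison group (a comparison group is implied but not named in the summary), and the student IDs are purely hypothetical:

    import random

    # Minimal sketch of randomizing students across study arms. The
    # "control" arm is an assumption; all student IDs are hypothetical.
    ARMS = ["vet_voucher", "academic_voucher", "counselling", "control"]

    def assign(students, seed=42):
        rng = random.Random(seed)      # fixed seed makes assignment auditable
        shuffled = list(students)
        rng.shuffle(shuffled)
        # deal students round-robin into (nearly) equal-sized arms
        return {arm: shuffled[i::len(ARMS)] for i, arm in enumerate(ARMS)}

    students = [f"student_{i:04d}" for i in range(2000)]
    groups = assign(students)
    print({arm: len(ids) for arm, ids in groups.items()})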

