Imagine you are part of a philanthropy that funds a diverse portfolio of programs — in education, health, and agriculture, for example — or maybe you are considering programs in those areas but have not yet committed funding. You have limited resources and cannot fund everything, so how do you ensure that your spending does the most good?
In early 2020, IDinsight partnered with the Coppel Family philanthropy to help answer this question. The Coppel Family supports various community-based antipoverty programs in rural Malawi, which collectively benefit over 250,000 people. Comparing the value of those different programs is challenging — for example, how should the Coppel Family decide whether to prioritize a program supporting early childhood nutrition or secondary school education?
This blog describes how we answered this question. Our methodology was informed by GiveWell, one of IDinsight’s long-term partners. GiveWell has popularized the use of cross-intervention comparisons using cost-effectiveness analysis, including applying impact estimates from the literature to programs implemented by charities.1
The Coppel Family wanted to understand how to maximize the impact of its program portfolio as soon as possible. Impact evaluations of each program — in education, health, agriculture, and more — could take years to provide results.2 How could we assess which programs work best, without waiting to see evaluation results?
Our goal was to provide evidence the Coppel Family could use on a short timeline — in a matter of months — to begin thinking about how to reallocate funding to the most cost-effective programs. To do so, we combed through existing research literature to find high-quality evidence on the impact of programs similar to those funded by the Coppel Family, and from contexts similar to rural Malawi.
In some cases, a pre-existing meta-analysis already compared several relevant programs, e.g. lessons for new mothers on the importance of exclusively breastfeeding infants during their first six months. Where meta-analyses did not exist or did not incorporate all the relevant studies we found in the literature, we needed to aggregate impact estimates from different sources ourselves. However, not all pieces of evidence should be given equal weight. The standard meta-analysis approach is to weight estimates by their precision,3 but we also needed to factor in which programs in the literature are most similar to those funded by the Coppel Family.4 To address both considerations, we created a combined weight incorporating precision and similarity, and used it to calculate impact estimates as weighted averages, as illustrated below.
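To make the weighting concrete, here is a minimal sketch in Python of one way such a combined weight could be computed. All numbers are hypothetical placeholders, and the exact form of the similarity adjustment is an illustrative assumption rather than a description of our actual calculations.

```python
# Illustrative sketch of a combined precision-and-similarity weighting scheme.
# The estimates, standard errors, and similarity scores below are hypothetical
# placeholders, not figures from the actual analysis.

estimates = [0.12, 0.08, 0.20]   # impact estimates from three studies
std_errors = [0.03, 0.05, 0.06]  # their standard errors
similarity = [0.9, 0.6, 0.4]     # judged similarity to the funded program (0 to 1)

# Precision weight: inverse variance, the standard meta-analysis weighting.
precision_weights = [1 / se**2 for se in std_errors]

# Combined weight: precision scaled by similarity, then normalized to sum to 1.
combined = [p * s for p, s in zip(precision_weights, similarity)]
total = sum(combined)
weights = [w / total for w in combined]

# Pooled impact estimate as the weighted average of the study estimates.
pooled_estimate = sum(w * e for w, e in zip(weights, estimates))
print(f"Pooled impact estimate: {pooled_estimate:.3f}")
```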
At this point, we had about 10 program impact estimates across several sectors, expressed in different units such as changes in school enrollment, the prevalence of disease, and crop yields. Next, we needed a way to compare across sectors.
Comparing the estimated impact of these 10 programs would be relatively straightforward if every program aimed at the same objective, such as improving school enrollment. But how could we determine which programs have the most impact when they pursue such different goals?
To compare programs across different sectors, we needed to convert outcomes such as school enrollment, disease prevalence, and crop yields into a common unit: dollar values. For programs that reduce the burden of disease or save lives, we calculated the program’s dollar-value benefit to an individual, over his or her lifetime, using disability-adjusted life-year (DALY) values5 and Value of Statistical Life (VSL) data. For more information on VSL and on incorporating people’s preferences into how funding is allocated between programs, please see our Measure People’s Preferences project, in partnership with GiveWell.
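As a hedged illustration of the health-outcome conversion, the sketch below derives a dollar value per healthy life-year from a VSL figure and multiplies it by DALYs averted. The numbers, and the specific way of combining DALY and VSL inputs, are assumptions for illustration only, not the values or formula from our analysis.

```python
# Illustrative conversion of a health outcome into a dollar value.
# All inputs are hypothetical; the actual analysis drew DALY and VSL
# figures from published sources for the relevant context.

value_of_statistical_life = 100_000  # hypothetical VSL (USD)
remaining_life_years = 50            # hypothetical remaining life expectancy
value_per_life_year = value_of_statistical_life / remaining_life_years

dalys_averted_per_person = 0.3       # hypothetical DALYs averted by the program

benefit_per_person = dalys_averted_per_person * value_per_life_year
print(f"Dollar-value benefit per person: ${benefit_per_person:,.0f}")
```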
For all other programs, we drew on existing evidence to determine how they impacted income-earning potential over an individual’s lifetime, and calculated lifetime benefits using the net present value of future gains in income. That required us to use existing evidence to make assumptions about how and when a program begins to impact income, and how long that impact lasts. Some programs (e.g. agriculture) target participants in mid-adulthood and increase income for their remaining income-earning years, whereas others (e.g. education, child health) target infant or child participants and impact their income-earning potential when the children become adults.
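The income-based conversion boils down to a net-present-value calculation. The sketch below discounts a hypothetical stream of annual income gains that begins when a child participant reaches working age; the gain, discount rate, and timing are placeholder assumptions, not figures from the analysis.

```python
# Illustrative net present value of a program's lifetime income gains.
# The income gain, discount rate, and timing are placeholder assumptions;
# in practice they would come from the evidence literature.

annual_income_gain = 50   # hypothetical extra income per year (USD)
discount_rate = 0.05      # hypothetical annual discount rate
start_year = 10           # e.g. a child program whose earnings effect begins in adulthood
end_year = 40             # last year the gain is assumed to persist

npv = sum(
    annual_income_gain / (1 + discount_rate) ** t
    for t in range(start_year, end_year + 1)
)
print(f"Lifetime benefit (NPV): ${npv:,.0f}")
```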
We then used the costs of the Coppel Family programs to calculate estimated cost-effectiveness, as shown in the example figure below. Our cost-effectiveness estimates rely on a set of assumptions about how to apply monetary values to diverse outcomes. While these assumptions are guided by previous studies, the approach has limitations.6
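Putting the pieces together, cost-effectiveness can be summarized as the dollar-value benefit generated per dollar spent. The sketch below uses hypothetical figures to show the shape of that final calculation; it is not the Coppel Family's actual data.

```python
# Illustrative cost-effectiveness calculation: total dollar-value benefit
# divided by program cost. All figures are hypothetical placeholders.

benefit_per_person = 600   # lifetime dollar-value benefit per participant (USD)
people_reached = 5_000     # hypothetical number of participants per year
program_cost = 400_000     # hypothetical annual program cost (USD)

total_benefit = benefit_per_person * people_reached
benefit_per_dollar = total_benefit / program_cost
print(f"Estimated benefit per dollar spent: ${benefit_per_dollar:.2f}")
```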
As a last step, we sought to give our partner a sense of how confident we were in each cost-effectiveness estimate, to further inform any decisions based on our analysis. We therefore graded our confidence in each estimate as high, medium, or low. A grade of high indicates that 1) the evidence literature is high quality; 2) the studied interventions likely overlap heavily with Coppel Family programs; 3) the impact pathway is transferable across contexts; and 4) we have no major doubts about the lifetime benefits calculations.
We fully acknowledge that the method we describe here has its limitations. Nevertheless, cost-effectiveness estimates are important guideposts to determine the probable value of different programs and how they could be prioritized. In this case, our approach enabled us to make the following recommendations to the Coppel Family, to help maximize the impact of their giving:
The Coppel Family is now using these recommendations to explore changes to their funding portfolio — changes that can bring additional benefits to the 250,000 individuals supported by their programs.
Cost-effectiveness analysis can be a useful approach for other philanthropic actors as well, especially when time or resources are limited. It is an essential tool for turning a familiar challenge, comparing apples and oranges, into a powerful opportunity: using the limited resources available to do the most good possible.