This page details our process for identifying, reviewing, and evaluating major disaster relief organizations and their responses to the Haiti earthquake.
Identifying organizations to include
To create a list of these organizations, we referred to:
- The Chronicle of Philanthropy's tally of which charities had raised how much for Haiti relief as of July 2010.1 The charities with the biggest totals on this list are likely the biggest money-raisers overall; charities absent from the list likely either didn't raise enough to be of interest or didn't share their tallies with the Chronicle (and we feel the latter would be a notably bad sign for accountability).
- A list we had made in the weeks immediately following the Haiti earthquake, noting which charities were advertising via Google Adwords for earthquake-related searches. This is a simple heuristic for finding charities that are likely to rely upon, and solicit, donations from the public at large.
Specifically, we rated and ranked any charity that either (a) appeared in the Chronicle tally and advertised via Google Adwords; or (b) was among the 10 biggest money-raisers in the Chronicle tally, regardless of whether we saw its ads on Google Adwords. We also added several more charities at our discretion.
The table below lists all the organizations included in our report and our reason for including each.
Reviewing individual organizations
- Between September 2010 and January 2011, we reviewed the websites of the organizations above, looking for information about each organization's financials and activities (details of the questions we asked and the kind of information we sought are below). In particular, we reviewed "Where we work," "Publications," and Haiti-specific pages for each organization. (We list the parts of the website we reviewed at the top of each review.)
- We also tried relevant Google searches looking for additional information, such as "[Organization name] Haiti."
- We wrote up our answers to key questions, based on these materials, on individual review pages for each charity. These reviews summarize, and cite, the clearest and most detailed information we were able to find.
- We contacted all included organizations on either Friday, January 7, 2011 or Monday, January 10, 2011, inviting them to preview our writeups. We revised the writeups to incorporate any relevant additional information they referred us to.
Questions we focused on
In examining charities' information, we've focused on the following questions:
- Financials. Did organizations clearly and frequently disclose how much they were seeking, what they'd do if they raised more than they sought, how much they'd raised and how much they'd spent?
- Transparency on activities. Did charities provide specific and comprehensive accounts of their activities in the affected areas? Could someone on the ground verify or refute their claims about how the money was spent?
- Results. Did charities publish substantive information on their successes and shortcomings?
- Everyday work. Do charities publish clear and substantive information about their non-disaster-relief work? We've argued before that donations intended for disaster relief may effectively fund charities' everyday operations instead (even if they are formally earmarked for and allocated to disaster relief). Therefore, we think the quality of (and transparency around) a charity's everyday work ought to carry a heavy weight in donors' decisions.
How we evaluated the quality of organizations' answers
Before creating a rubric for assessing organizations, we made some general observations about answers to the questions listed above:
- Financials. Most major organizations have disclosed how much they've raised and (less frequently) how much they've spent. No organizations have been consistent and clear about how much money they were seeking. In the cases where we were able to find posted amounts sought, the charities posting them generally raised more than they were seeking soon afterward; in one case the appeal was repeatedly revised upward. Although organizations have raised much more than they sought (and than they've spent), we've seen only two groups (Doctors Without Borders and Oxfam) that stopped taking donations for the disaster.
- Transparency on activities. Few organizations are clear about how they have spent funds. Most give examples of activities and outputs (such as number of tarps distributed, people served with sanitation services, etc.); many give funding breakdowns at the broadest level (amount of money spent on food, shelter, health, etc.); but few give funding breakdowns with more detail than that, or more detailed and clearly comprehensive accounts of their activities and outputs.
- Results. No evaluations of Haiti relief appear to have been posted yet. Many evaluations have been posted for work on the Asian tsunami; in the future we plan to examine these more closely, but at this point we feel that they rarely provide a strong sense of what has and hasn't been achieved.
- Everyday work. Few of the large organizations we examined are clear about how their overall budget breaks down and what they do around the world. The information provided is generally similar in clarity to information on relief activities (very broad budget breakdowns and examples of activities and outputs).
Based on these observations, we decided to evaluate organizations based largely on their level of clarity/transparency regarding how they spend their money (both for relief activities and everyday activities). Specifically,
- Many organizations provide funding breakdowns only at the highest/broadest level (amount of money spent on food, shelter, health, etc.). Organizations that provide more detail on how funds were spent - dividing their funds into more specific categories or providing more context and explanation - are marked at least "slightly above average" in our summary table.
- We seek a sense not only of what categories money was spent on, but of what came of it - what specific activities were undertaken and what outputs resulted (shelters constructed, patients treated, etc.). Many organizations simply provide a list of outputs/activities without making it clear whether the list is comprehensive or providing any way of associating the outputs/activities with expenses. However, some organizations describe specific, quantified activities and outputs in a way that allows a donor to (at least roughly and informally) link them with a more-detailed-than-average funding breakdown (as discussed immediately above). These organizations are marked "above average" in our summary table.
- We don't consider a charity to have "strong" transparency unless we can connect expenditures and activities/outputs with a high degree of specificity, gaining a sense of per-activity or per-output costs in multiple areas. We believe this is a reasonable standard of transparency.2
- We encountered a few organizations that appear primarily to give grants to other organizations; these grantmakers provided detail on whom they had made grants to, and for how much, but provided no information on the outputs/activities resulting from their grants. Not seeing a good way to compare the transparency of these organizations to that of the others, we simply excluded them from our evaluation. They are listed below our summary table.
Ongoing updates
The above steps were completed between late 2010 and January 2011.
Our next update of this content is planned for late 2011. We will examine organizations' responses to the March 2011 earthquake in Japan and the 2011 famine in Somalia.
1. Chronicle of Philanthropy, "How Charities are Helping," http://philanthropy.com/article/How-Charities-Are-Helping/66243 (accessed January 11, 2011). Archived by WebCite® at http://www.webcitation.org/5vf0gO4ZC.
2. See our 2009 blog post on The Global Fund for an example of a very large, complex organization that easily meets this standard of transparency.