Links to our reviews of our progress to date, including our accomplishments and shortcomings, are available here.
Please contact us with other items that should be listed here.
Last updated: April 2024 (2019 version and 2015 version)
Table of Contents
Major issues
- 2016 to ongoing: Failure to publish all relevant intervention research
- 2020: Privacy Policy-related misstep
- 2017 to 2019: Failure to publish charity reviews
- 2007 to 2014, with ongoing work to improve: Failure to prioritize staff diversity in hiring
- 2014 to 2016: Failure to prioritize hiring an economist
- 2013 to 2016: Failure to address misconceptions charities have about our application process
- 2009 to 2012: Errors in publishing private material
- 2006 to 2011: Tone issues
- July 2009 to November 2010: Quantitative charity ratings that confused rather than clarified our stances
- December 2007: Overaggressive and inappropriate marketing
- June 2007: Poorly constructed "causes" led to suboptimal grant allocation
Smaller issues
- Several years to November 2023: Failure to sense-check all raw data
- Late 2020 to early 2022: Overestimated funds raised
- April 2022: Failures of training and communication left us vulnerable to a crypto scam
- 2021: Miscalculation of and subsequent miscommunication around rollover funds
- November 2018: Spreadsheet errors led to additional funding for one top charity
- 2017: Failure to publish internal metrics report
- November 29, 2016 to December 23, 2016: Poor communication about top charity recommendations restricted to a specific program
- December 2014: Errors in our cost-effectiveness analysis of Development Media International (DMI)
- November to December 2014: Lack of confidence in the cost-effectiveness analyses we relied on for our top charities recommendations
- January to December 2014: Completed fewer intervention reports than projected
- November 2014: Suboptimal grant recommendation to Good Ventures
- November 2014: Not informing candidate charities of our recommendation structure prior to publishing recommendations
- July 2014: Published an update to the intervention report on cash transfers that misstated our view
- February 2014: Incorrect information on homepage
- January to November 2013: Social (non-family, non-financial) relationship between GiveWell staff members and staff of a recommended charity not publicly disclosed
- February to September 2013: Infrequent updates on our top-ranked charity
- May to June 2013: Unpublished website pages intermittently available publicly
- April to December 2012: Taking too much of job applicants' time early in the recruiting process
- March to November 2012: Poor planning led to delayed 2012 charity recommendations release
- June 2012: Failure to discuss sensitive public communication with board member
- July 2007 to March 2012: Phone call issues
- December 2011: Poor communication to donors making larger donations (e.g., greater than $5,000) via the GiveWell website
- December 2011: Problems caused by GiveWell's limited control over the process for donating to our top charities
- December 2011: Miscommunicating to donors about fees and the deductibility of donations to our top charity
- Late 2009: Misinterpreted a key piece of information about a charity to which we gave a $125,000 grant
- August 1, 2009 to December 31, 2009: Grant process insufficiently clear with applicants about our plans to publish materials
- November 25, 2009: Mishandling incentives to share information
- May 2009: Failed to remove two private references from a recording that we published
- January to September 2008: Paying insufficient attention to professional development and support
Major issues
2016 to ongoing: Failure to publish all relevant intervention research
How we fall short: In early 2016, we began to review the evidence base for a large number of programs to determine how we should prioritize charities for further evaluation. Our 2016 research plan discusses this priority (referred to as "intervention prioritization"). Since then, the vast majority of the work that we've done to prioritize interventions remains private, in internal documents that we have not shared because we have not put in the time to ensure the work is of publishable quality. We prioritized spending time to assess additional opportunities more highly than spending time to prepare our work for publication.
While we don't believe that publishing this work is likely to have changed any of the recommendations we make to donors, we see this body of private materials as a substantial failure to be transparent about our work. We believe that transparency is important for explaining our process and allowing others to vet our work. The process of formally writing up our research and seeking internal and external feedback has also on occasion changed our conclusions.
This remains an area for improvement.
Steps we are taking to improve: We plan to make progress on this work in 2020. Our research team has built into its plans for the year more time for publishing research we completed in the past as well as newer investigations.
Update (September 2023): Though we didn't post an update on our progress in 2020, we've taken several steps since 2016 to publish more of our research. As our research team has grown, we've generated a greater volume of research. We've made progress on publishing these findings, but we still have more work to do.
Areas of progress
- Publishing more of our research. We've made substantial progress in publishing more of our research.
- Grant pages. For example, in 2016, we were not yet publishing the full rationale behind each of our funding decisions; we now expect to publish a page on every grant we recommend for funding. A list of all pages we've published on grants since 2014 is available here.
- Deprioritization decisions. We began publishing short notes that explain our decisions to stop or pause investigation on programs that don't appear promising after an initial review (example here). This format allows us to more quickly communicate our views about a deprioritized program so that people can evaluate and respond to our reasoning. You can find all our short deprioritization notes in the program reviews dashboard.
- Publishing more quickly. In 2022, we began setting internal timeline targets for publishing new grant pages. Since setting those initial goals, we have published grant pages for top charities more quickly than before; we now usually publish them less than three months after making a grant. As of 2022, we are tracking timelines for all research publishing, so in the future we will be able to assess whether we met our goals. We sped up our publication process, in part, by eliminating unnecessary review steps and streamlining communication with grantees to make their review and signoff easier.
Areas for improvement
- Publishing more quickly. While we have shortened our timelines for publishing relative to 2016, especially for grants to top charities, we still have more work to do to publish research quickly, particularly research on interventions.
- Increasing the legibility of our research. In 2023, we have been prioritizing the legibility of our research. As part of our value of transparency, we want readers to be able to understand our reasoning, evaluate the ways we might be wrong, and provide feedback that will improve our research. Toward that end, we're adding new summaries to our research and grant pages that describe what the program or grant does, identify our key assumptions, and clearly explain the program or grant's cost-effectiveness and what our largest sources of uncertainty are. You can see examples of these new features on this recently published grant page.
2020: Privacy Policy-related misstep
How we fell short: We have gradually expanded our marketing efforts since 2018. In May 2020, as part of these efforts, we updated our Privacy Policy.
Our updated policy allowed us to share personal information with service providers assisting with our marketing efforts. Our contracts require these providers to keep the information confidential and to use it only in assisting us with their contracted services.
We decided to use Facebook as such a service provider, and on July 12, 2020, we used email addresses of some donors to create a Facebook Custom Audience to help us identify other potential donors. We understand this to be a common tool for social media marketing. The email addresses were hashed, or converted to randomized code, locally before they were uploaded to Facebook for processing to create a Custom Audience. Facebook was required by our contract to delete the email addresses promptly after the Custom Audience was created and was not allowed to use the email addresses for other purposes.
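The local hashing step described above is, in outline, a one-way digest of each normalized address (Custom Audience uploads are built from SHA-256 hashes of lowercased, trimmed emails). A minimal sketch of that local step, where the function name and sample address are illustrative rather than GiveWell's actual tooling:

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalize an email address and hash it locally with SHA-256.

    The upload is built from hashes of lowercased, whitespace-trimmed
    addresses, so the plaintext address never needs to leave the
    local machine.
    """
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Hashing is deterministic: the same normalized address always yields
# the same 64-character hex digest, which is what enables matching.
assert hash_email("Donor@Example.com ") == hash_email("donor@example.com")
```

Because the digest is deterministic, the platform can match uploaded hashes against hashes of its own users' addresses without ever receiving a plaintext email.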
We regret not having offered all donors a chance to opt out before we used their email addresses for this purpose.
Steps we have taken: We deleted our Custom Audience on July 30, 2020, after realizing some of our donors may have wanted the chance to opt out before their email address was used to create a Custom Audience in order to identify potential new donors. This realization was prompted by our CEO asking for an update on our approach to privacy protection.
We notified donors whose email addresses were used about what happened. We emailed others about the update to our Privacy Policy and how to opt into or out of information-sharing in the future. We also added an opt-out form to our Privacy Policy page. We don't plan to proactively contact our audience prior to each future marketing effort, though we may decide to do so on a case-by-case basis.
We completed an internal assessment of what led to this misstep. To avoid similar missteps in the future, we're piloting a formalized process for scoping projects with a goal, among others, of ensuring the right level of review for very new types of work (as social media marketing was in 2020).
2017 to 2019: Failure to publish charity reviews
How we fell short: Since early 2017, we have had a significant number of conversations with charities about applying for a GiveWell recommendation. We also completed a preliminary evaluation of a number of charities' applications. Much of this work remains private. In some cases, this is because we did not get permission to publish information from those we spoke to. In other cases, this is because we did not put in the time to write up what we have learned in a format that we believed the charities would allow us to publish.
We no longer plan to publish these reviews, as they are outdated and likely would not represent the current organizations accurately. We do not think it would be a good use of the organizations' time to review our outdated work, nor would we expect to be successful in asking them to do so.
However, as we say above: "While we don't believe that publishing this work is likely to have changed any of the recommendations we make to donors, we see this body of private materials as a substantial failure to be transparent about our work. We believe that transparency is important for explaining our process and allowing others to vet our work. The process of formally writing up our research and seeking internal and external feedback has also on occasion changed our conclusions."
Steps we've taken to improve: Going forward, our charity review team is building into its process additional time for publishing.
2007 to 2014, with ongoing work to improve: Failure to prioritize staff diversity in hiring
How we fell short: From 2007 to 2014, we did not prioritize diversity in our hiring, and our staff composition reflects the lack of attention we had paid to this issue.
We believe a more diverse staff will make GiveWell better and more effective. We believe broadening our candidate pipeline and reducing any bias that exists in our hiring process will increase our likelihood of hiring the best people to achieve GiveWell's mission. And, we believe that having a diverse staff and an inclusive culture will make GiveWell more attractive to prospective staff and improve retention.
Steps we have taken and are continuing to take to improve:
We have made progress, but we still consider staff diversity an area in which to improve.
Since 2014, we have taken a number of steps to increase diversity in our hiring. Those efforts include advertising open roles with professional groups that focus on underrepresented audiences and working with consultants to recruit candidates from underrepresented backgrounds. We also use a hiring process that aims to limit bias by focusing on work samples that are graded blindly where possible.
As of 2020, our team is significantly more diverse in terms of gender, race, and ethnicity than it was in GiveWell's early years. It is still not as racially or ethnically diverse as we would like it to be. People from low- and middle-income countries, in which our top charities primarily operate, are not well represented on staff. As of mid-2020, we continue to undertake specific projects to increase diversity on our staff. Our next project will explore whether our recruitment processes differ from best practices for recruiting a diverse workforce. We will then work to ensure that we're following those best practices.
If you're interested in working at GiveWell, we encourage you to apply.
2014 to 2016: Failure to prioritize hiring an economist
How we fell short: From 2014 to 2016, we produced relatively few intervention reports, a crucial part of our research process. Our low production may be explained by the fact that we tasked relatively junior, inexperienced staff with these reports. We did not prioritize hiring a specialist, likely someone with a PhD in economics or the equivalent, who would have likely been able to complete many more reports during this time. This delayed our research and has potentially led us to recommend fewer top charities than we otherwise might have.
Steps we've taken to improve: In September 2016, we began recruiting for a Senior Fellow to fill this role. The role was filled in May 2017.
2013 to 2016: Failure to address misconceptions charities have about our application process
How we fell short: We realized in 2016 that some charities had misconceptions about our criteria, value-added, and research process. For example, some charities told us that they thought charities could only be recommended for three years; others weren't aware that we had recommended million-dollar "incentive grants" to top charities.
Steps we've taken to improve: We have assigned a staff member the duties of charity liaison. This person will be responsible for communicating with charities that are considering applying, to help them with our process and correct misconceptions.
2009 to 2012: Errors in publishing private material
How we fell short: There were two issues, one larger and one smaller:
- Since 2009, we've made a practice of publishing notes from conversations with charities and other relevant parties. Our practice is to share the conversation notes we take with the other party so that they can make changes to the text before publication. We only publish a version of the notes that the other party approves, and we will keep the entire conversation confidential if the party asks us to.
In November 2012, a staff member completed an audit of all conversations that we had published. He identified two instances where we had erroneously published the pre-publication (i.e., not-yet-approved) version of the notes. We have emailed both organizations to apologize and inform them of the information that we erroneously shared.
- In October 2012, we published a blog post titled, "Evaluating people." Though the final version of the post did not discuss specific people or organizations, a draft version of the post had done so. We erroneously published the draft version, which discussed individuals. We recognized our error within five minutes of posting and replaced the post with the correct version; the draft post was available in Google's cache for several hours and was likely available to people who received the blog via RSS if they had their RSS reader open before we corrected our error (and did not refresh their reader).
We immediately emailed all of the organizations and people that we had mentioned to apologize and included the section we had written about them. Note that none of the information we published was confidential; we merely did not intend to publish this information and it had not been fully vetted by GiveWell staff and sent to the organizations for pre-publication comment.
Steps we've taken to improve: In November 2012, we instituted a new practice for publishing conversation notes. We now internally store both private and publishable versions of conversation notes in separate folders (we hope that this practice reduces the likelihood that we upload the wrong file) and have assigned a staff member to perform a weekly audit to check whether any confidential materials have been uploaded. As of this writing (December 2012) we have performed 3 audits and found no instances of publishing private material.
We take the issue of publishing private materials very seriously because parties that share private materials with us must have confidence that we will protect their privacy. We have therefore reexamined our procedures for uploading files to our website and are planning to institute a full scale audit of files that are currently public as well as an ongoing procedure to audit our uploads.
October 2016 Update: We now have a publishing process that clearly separates publishable versions of conversation notes from private versions of notes, and we periodically audit published notes to ensure that all interviewees' suggestions have been incorporated. As of this time, our process requires explicit approval to publish for each file we upload, and we periodically audit these uploads to ensure that private information has not been uploaded to our server.
2006 to 2011: Tone issues
How we fell short: We continue to struggle with an appropriate tone on our blog, one that neither understates nor overstates our confidence in our views (particularly when it comes to charities that we do not recommend). An example of a problematic tone is our December 2009 blog post, Celebrated Charities that we Don't Recommend. Although it is literally true that we don't recommend any of the charities listed in that post, and although we stand by the content of each individual blog post linked, the summaries make it sound as though we are confident that these charities are not doing good work; in fact, it would be more accurate to say that the information we would need to be confident isn't available, and we therefore recommend that donors give elsewhere unless they have information we don't.
We wish to be explicit that we are forming best guesses based on limited information, and always open to changing our minds, but readers often misunderstand us and believe we have formed confident (and, in particular, negative) judgments. This leads to unnecessary hostility from, and unnecessary public relations problems for, the groups we discuss.
Steps we have taken to improve: We do feel that our tone has slowly become more cautious and accurate over time. At the time of this writing (July 2010), we are also resolving to run anything that might be perceived as negative past the group it discusses before we publish it, giving them a chance to correct both facts and tone. (We have done this since our inception for charity reviews, but now intend to do it for blog posts and any other public content as well.)
July 2009 to November 2010: Quantitative charity ratings that confused rather than clarified our stances
How we fell short: Between July 2009 and November 2010, we assigned zero- to three-star ratings to all charities we examined. We did so in response to feedback from our fans and followers - in particular, arguments that people want easily digested, unambiguous “bottom line” information that can help them make a decision in a hurry and with a clean conscience. Ultimately, however, we decided that the costs of the ratings - in terms of giving people the wrong impression about where we stood on particular charities - outweighed the benefits.
Steps we have taken to improve: By December 2010 we will replace our quantitative ratings with more complex and ambiguous bottom lines that link to our full reviews.
More information:
- September 2010 blog post on the problems with quantitative charity ratings
- October 2010 blog post on why these ratings don't fit with our mission
December 2007: Overaggressive and inappropriate marketing
How we fell short: As part of an effort to gain publicity, GiveWell's staff (Holden and Elie) posted comments on many blogs that did not give adequate disclosure of our identities (we used our first names, but not our full names, and we didn't note that we were associated with GiveWell); in a smaller number of cases, we posted comments and sent emails that deliberately concealed our identities. Our actions were wrong and rightly damaged GiveWell's reputation. More detail is available via the page for the board meeting that we held in response.
Given the nature of our work, it is essential that we hold ourselves to the highest standards of transparency in everything we do. Our poor judgment caused many people who had not previously encountered GiveWell to become extremely hostile to it.
Steps we have taken to improve: We issued a full public disclosure and apology, and directly notified all existing GiveWell donors of the incident. We held a Board meeting and handed out penalties that were publicly disclosed, along with the audio of the meeting. We increased the Board's degree of oversight over staff, particularly with regard to public communications.
June 2007: Poorly constructed "causes" led to suboptimal grant allocation
How we fell short: For our first year of research, we grouped charities into causes ("Saving lives," "Global poverty," etc.) based on the idea that charities within one cause could be decided on by rough but consistent metrics: for example, we had planned to decide Cause 1 (saving lives in Africa) largely on the basis of estimating the “cost per life saved” for each applicant. The extremely disparate nature of different charities' activities meant that there were major limits to this type of analysis (we had anticipated some limits, but we encountered more).
Because of our commitment to make one grant per cause and our overly rigid and narrow definitions of "causes," we feel that we allocated our grant money suboptimally. For example, all Board members agreed that we had high confidence in two of our Cause 1 (saving lives) applicants, but very low confidence in all of our Cause 2 (global poverty) applicants. Yet we had to give equal size grants to the top applicant in each cause (and give nothing to the 2nd-place applicant in Cause 1).
Steps we have taken to improve: We have shifted our approach to "causes" so that they are defined more broadly. This gives us more flexibility to fund the organizations that appeal to us most. We now explore broad sets of charities that intersect in terms of the people they serve and the research needed to understand them, rather than narrower causes based on the goal of an "apples to apples" comparison using consistent metrics. For example, our recent research report addresses the broad area of international aid (note that this link goes to the research homepage).
October 2016 Update: The process we now use for identifying top charities and priority programs is detailed here.
Smaller issues
Several years to November 2023: Failure to sense-check all raw data
How we fell short: Note that we don't list every small research mistake we make and correct. This page lists mistakes that "affect the impression that people external to the organization have of our work and its reliability." We list these two examples because they're representative of a category of research error we have made.
In brief, we estimated some parameters in our cost-effectiveness models by plugging in raw data at face value without subjecting the numbers to common-sense scrutiny or examining how they could be inaccurate.
This is a quote from our writeup on how we address uncertainty:
To estimate insecticide resistance across countries, we look at bioassay test results on mosquito mortality. These tests essentially expose mosquitoes to insecticide and record the percentage of mosquitoes that die. The tests are very noisy: in many countries, bioassay test results range from 0% to 100% — i.e., the maximum range possible. To come up with country-specific estimates, we take the average of all tests that have been conducted in each country and do not make any further adjustments to bring the results more in line with our common-sense intuition.
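The country-level estimate described in that quote is an unadjusted mean over all bioassay tests in a country. A minimal sketch of that aggregation step, with invented country names and mortality values purely for illustration:

```python
from statistics import mean

# Illustrative bioassay results: % mosquito mortality per test.
# These numbers are invented; real results are similarly noisy,
# sometimes spanning the full 0-100% range within one country.
bioassay_results = {
    "Country A": [0.0, 100.0, 45.0, 80.0],
    "Country B": [20.0, 25.0, 30.0],
}

# Naive country estimate: a plain average of all tests, with no
# further adjustment toward common-sense expectations.
country_estimates = {
    country: mean(results) for country, results in bioassay_results.items()
}
# A 0% result and a 100% result simply average out, even though at
# least one of them is almost certainly wrong -- the kind of raw
# number that warrants a sense check before use.
```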
Another example comes from that same page:
Another major program area we support is childhood immunization… To model the cost-effectiveness of these programs, we need to take a stance on the share of deaths that a vaccine prevents for a given disease. This assumption enters our cost-effectiveness estimates through our etiology adjustments …. To estimate an etiology adjustment for the rotavirus vaccine, which targets diarrhoeal deaths, we do the following:
- Take raw IHME data on the number of deaths from diarrhea among under 5s in the sub-regions where these programs operate
- Take raw IHME data on the number of deaths from rotavirus (a subset of diarrheal deaths)
- Divide the two to get an estimate of the % of diarrhea deaths in each region that could be targeted by the rotavirus vaccine
As Figure 5 shows, this leads to implausibly large differences between countries; we effectively assume that the rotavirus vaccine is almost completely ineffective at preventing diarrhoeal deaths in India. This seems like a bad assumption; the rotavirus vaccine is part of India’s routine immunization schedule, and an RCT in India that administered the rotavirus vaccine to infants showed a 54% reduction in severe gastroenteritis.
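The three steps quoted above amount to a raw per-region division of two death counts. A minimal sketch, with hypothetical region names and counts (not IHME figures):

```python
# Hypothetical under-5 death counts per region (illustrative only).
diarrhea_deaths_u5 = {"Region A": 50_000, "Region B": 12_000}
rotavirus_deaths_u5 = {"Region A": 15_000, "Region B": 600}

# Etiology adjustment: share of diarrheal deaths attributable to
# rotavirus, taken at face value from the raw ratio -- the step
# critiqued above, with no sense check against other evidence
# (e.g., RCT results) before the ratio enters the model.
etiology_adjustment = {
    region: rotavirus_deaths_u5[region] / diarrhea_deaths_u5[region]
    for region in diarrhea_deaths_u5
}
# Region A -> 0.30, Region B -> 0.05: a large cross-region gap that
# should prompt scrutiny of the underlying data before being used.
```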
Steps we've taken to improve: We plan to take the steps described in our writeup on uncertainty to address this issue. We are now notably more attentive to the data we aggregate to arrive at our estimates, to ensure that we don't take (sometimes noisy) raw data at face value without sense-checking the numbers.
Late 2020 to early 2022: Overestimated funds raised
How we fell short: In late 2021, we believed (and we wrote) that we would raise $1 billion annually by 2025. This was a massive overestimate (which we corrected in this mid-2022 post), and this mistake led to the following long-term problems:
- In late 2021, we worried that our research might not be able to keep up with the volume of donations we expected. That is, we thought we'd raise significantly more funding than the cost-effective funding needs we would identify. Because we're committed to being transparent with donors, we wrote that we were holding onto funds we had received (and that we expected to hold funds in the future) because we weren't finding enough grant opportunities to give them to. Unfortunately, the way we communicated about this led to a long-standing, hard-to-correct belief in our audience that we have more funding than we can spend.
- Because we believed that we would raise so much money, we put significantly more attention on building our research team than on building our outreach team, leading to a further imbalance between the volume of highly cost-effective funding opportunities we've identified and our ability to raise sufficient money to fill those funding gaps.
Steps we've taken to improve:
- We have aimed to be very explicit publicly about two facts. First, we do expect to find cost-effective programs to which we can direct all funding we receive. Second, the organizations we recommended are in fact funding-constrained.
- We are actively hiring for senior roles across our outreach team to build outreach capacity so that we can raise more money and fill more of the most cost-effective funding gaps we find.
We previously shared another mistake related to this episode. For more detail, see below in the section titled “2021: Miscalculation of and subsequent miscommunication around rollover funds.”
April 2022: Failures of training and communication left us vulnerable to a crypto scam
How we fell short: In April 2022, we received an email requesting a refund of a cryptocurrency donation, and we decided to grant it despite our no-refunds policy. We later realized that this request hadn't come from the real donor. We credited the real donor with the gift and lost $4,600, which we made up for by drawing on our unrestricted funding.
Cryptocurrency donations are especially fertile ground for scams because information about all crypto transactions is publicly available online, except for the identity of the person initiating the transaction. The email we received in this case closely matched a common scam: a person claims they've accidentally transferred a larger amount than intended, often providing screenshots of the transaction's public details as "proof," and asks for a refund, though they didn't actually make the donation themselves.
GiveWell had safeguards in place against this, including requesting that all crypto donors fill out a donation report form against which to verify such claims and maintaining a no-refunds policy (for all types of donations, but particularly for crypto). However, the donor relations staff handling requests like this were relatively new to their roles at the time and unfamiliar with this type of scenario, and they decided to override the no-refunds policy in light of what they felt was a straightforward request.
We think this mistake was largely caused by a failure of training and knowledge sharing with the new donor relations staff:
- We had made exceptions to the no-refunds policy in the past, but we hadn't adequately documented the specific and limited reason for which exceptions could be made. We should have made these clearer in our internal training materials so new staff would be less reliant on judgment calls. We should also have communicated the no-refunds policy more clearly on our website.
- Former donor relations staff had encountered this type of scam before, but we hadn't included information about it in our training materials.
Steps we took to improve: To avoid this in the future, we've done the following:
- Provided extra training on crypto scams to the donor relations team and incorporated this information into our training materials for new staff.
- Revised the cryptocurrency donation pages on our website to clearly highlight that crypto donations are non-refundable and that donation report forms should be submitted prior to a donation.
- Circulated an internal memo clarifying our no-refunds policy for relevant staff.
- Discussed our cryptocurrency donation practices with experts and implemented best practices for both straightforward and more complicated transactions to reduce the incidence of fraud.
If you are considering making a cryptocurrency donation and want to know more about the steps we take to prevent fraud, please reach out to donations@givewell.org.
2021: Miscalculation of and subsequent miscommunication around rollover funds
Rollover funds are funds that we raise in a certain year but choose not to spend in that year, instead "rolling them over" to the following year because we believe those funds will have a greater impact if spent in the future. For background on rollover funds, see the page we published here.
How we fell short: In November 2021, we announced that we expected to roll over about $110 million in funding to grant to future opportunities. We ultimately rolled over substantially less. We rolled over $18 million that was available for grantmaking as of the end of metrics year 2021 (i.e., January 31, 2022). We also carried over an additional approximately $40 million that was received in metrics year 2021 but was not yet available for granting; this was a combination of:
- unrestricted funds that were designated by the Board for grantmaking in mid-2022, in accordance with our excess assets policy
- donations given to the Top Charities Fund in January 2022, which were allocated alongside donations given to the Top Charities Fund in the rest of Q1 2022
While our forecast was roughly accurate about both funds raised and funds directed, we failed to define the question well enough to predict how much of our available funding we would have left over.
Much of the discrepancy came from:
- Including funds given through GiveWell and designated for specific organizations (e.g., a donation given through our website for the Against Malaria Foundation) on one side of the ledger but not the other. These funds were granted out to the organizations to which they were designated, but we had erroneously been considering them as adding to the total amount of funds that would be available for granting at our discretion. This led to approximately $18 million less in funds available than forecasted.
- Not accounting for the contingency funding that was earmarked for some 2021 grants. These are funds that are currently held by Open Philanthropy but are earmarked for particular programs in the event that the programs require them (e.g., if we stop supporting a program in the future, this funding will be granted out as exit funding for that program). Some of this funding may be returned to our budget in the future, if it goes unspent once the grant period has ended, but for now these funds aren't available to us for granting because they've been earmarked. This led to about $25 million less in funds available than forecasted.
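As a rough tally, the two identified sources of the discrepancy can be sketched in a few lines (a minimal illustration using only the figures stated above; the variable names are ours, and per the text these two items explain much, but not all, of the gap):

```python
# Sketch of the reconciliation described above, in millions of dollars.
# Figures come from the text; this is an illustration, not our actual analysis.

forecast_rollover = 110  # announced in November 2021

# Identified sources of the shortfall in funds available for granting:
designated_funds_double_counted = 18  # donor-designated funds counted on only one side of the ledger
contingency_funding_omitted = 25      # earmarked contingency funds held by Open Philanthropy

explained_discrepancy = designated_funds_double_counted + contingency_funding_omitted
print(explained_discrepancy)  # 43 -- accounts for much, but not all, of the discrepancy
```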
This discrepancy didn't change the bottom line or lead to a suboptimal allocation of funds: we thought we would raise more funding than we could allocate in 2021, and we did. But we think the conceptual mistakes in our analysis, combined with how we publicized the (erroneous) projection of $110 million in rollover funds, led to lower overall funds raised. If we had made a more accurate prediction, we probably would have placed less emphasis on rollover funding in our public communications. We expect this would have led to more funds raised for our recommendations and more lives saved or improved, given that as of June 2022, we believe we've found more highly cost-effective funding opportunities than we'll be able to fund this year.
Steps we've taken to improve: We've learned from the specific mistakes we made in 2021 so that we can now approach this type of analysis with more clarity around the different pots of funding at play and how they interact. In the future, when we take on major pieces of analysis like this, we'll have more awareness of potential errors in our methodology and subject the analysis to more thorough review.
November 2018: Spreadsheet errors led to additional funding for one top charity
How we fell short: In November 2018, we used this spreadsheet to calculate how much funding to recommend that Good Ventures grant to each of our top charities. Since that time, we have become aware of two errors in the spreadsheet:
- In a sheet that consolidated information from other sheets, we mistakenly included one funding gap twice. This led us to calculate the amount to recommend that Good Ventures grant to Evidence Action's Deworm the World Initiative as $100,000 higher than we would have without the error ($10.4 million instead of $10.3 million). We learned of this error because Deworm the World brought it to our attention. We expect to reduce the amount that we recommend that Good Ventures grant to Evidence Action in a future grant by an offsetting amount.
- In the same spreadsheet, we increased our estimate of the total amount of additional funding that Malaria Consortium could absorb by $4.8 million over what we had calculated originally, based on new information we received from Malaria Consortium. We later realized that we had not added in Malaria Consortium's 10% overhead rate, leading us to underestimate the total Malaria Consortium could absorb by $480,000. This did not affect our recommendation to Good Ventures or other donors because the amount we recommended these donors give was limited by the amount of funding available, rather than by the amount that Malaria Consortium could absorb.
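The overhead arithmetic in the second item can be sketched as follows (a minimal illustration using the figures above; the function name is ours, not taken from our spreadsheet):

```python
# Sketch of the overhead-rate arithmetic described above.
# Integer arithmetic keeps the dollar figures exact.

def with_overhead(direct_costs: int, overhead_pct: int = 10) -> int:
    """Gross up direct program costs by a percentage overhead rate."""
    return direct_costs + direct_costs * overhead_pct // 100

additional_room = 4_800_000  # new funding Malaria Consortium reported it could absorb

corrected = with_overhead(additional_room)   # what we should have recorded
underestimate = corrected - additional_room  # the amount the spreadsheet left out

print(corrected, underestimate)  # 5280000 480000
```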
Steps we took to improve: 2018 was the first time that we used this spreadsheet format to calculate our recommendations to Good Ventures. While the errors made in 2018 did not, in the end, result in over- or under-funding any top charities, they are indicative of ways spreadsheet errors could lead to mistakes in funding levels. We expect to make updates to the format in 2019 to reduce the risk of error and to build in checks for discrepancies.
2017: Failure to publish internal metrics report
How we fell short: Each year, GiveWell publishes a metrics report on our money moved and web traffic. These metrics are part of how we evaluate ourselves. We failed to publish a complete metrics report in 2017, only publishing an interim report at the end of September.
Steps we took to improve: We misassessed the difficulty involved in completing the metrics report. We reassigned responsibility to another staff member, who plans to prioritize publishing our 2016 metrics report as soon as possible and our 2017 metrics report as soon as possible in 2018.
November 29, 2016 to December 23, 2016: Poor communication about top charity recommendations restricted to a specific program
How we fell short: On November 29th, we released updated charity recommendations. Three of our seven top charities implemented a variety of programs, and our recommendation for them was restricted to a specific program. We did not clearly communicate this fact on our top charities or donate pages, potentially causing donors who gave directly to these three organizations (as opposed to giving via the GiveWell website) to fail to restrict their donations to the programs we recommend.
Steps we've taken to improve: We have updated the pages to reflect the fact that our recommendation for these charities is program-specific.
December 2014: Errors in our cost-effectiveness analysis of Development Media International (DMI)
How we fell short: In early 2015, we discovered some errors in our cost-effectiveness analysis of DMI. See this blog post for details.
Steps we have taken to improve: Going forward, we plan to improve the general transparency and clarity of our cost-effectiveness models, and explicitly prioritize work on cost-effectiveness throughout our research process. See this section of our 2015 annual review for more.
November to December 2014: Lack of confidence in the cost-effectiveness analyses we relied on for our top charities recommendations
How we fell short: We were not highly confident in our cost-effectiveness estimates when we announced our updated charity recommendations at the end of 2014 (a fact we noted in the post), because we finalized our cost-effectiveness analyses later in the year than would have been ideal. See this part of our 2014 annual review for more detail.
Steps we have taken to improve: We plan to improve these analyses by reworking our cost-effectiveness models to improve the general transparency and clarity of the analyses and explicitly prioritizing work on cost-effectiveness throughout our research process.
We are also experimenting with more formal project management to increase the likelihood that we complete all tasks necessary for our year-end recommendations update at the appropriate time.
January to December 2014: Completed fewer intervention reports than projected
How we fell short: We published fewer intervention reports than we had planned to at the beginning of 2014. We completed two new intervention reports in 2014, though at the beginning of the year we wrote that we hoped to publish 9-14 new reports. On reflection, our goal of publishing 9-14 intervention reports was arbitrary and unrealistic given the amount of time it had typically taken us to complete such reports. See this part of our 2014 annual review for more detail.
Steps we have taken to improve: We have learned more about how much work is involved in completing an intervention report and hope to make more realistic projections about how many we can complete in the future.
November 2014: Suboptimal grant recommendation to Good Ventures
How we fell short: In 2014, we erred in our recommendation to Good Ventures about its giving allocation to our top charities. We made this recommendation two weeks before we announced our recommendations publicly so that we could announce their grants as part of our top charities announcement. If we had fully completed our analysis before making a recommendation to Good Ventures, we likely would have recommended relatively more to AMF and relatively less to GiveDirectly. See this part of our 2014 annual review for more detail.
Steps we have taken to improve: In the end, we adjusted the public targets we announced based on the grants Good Ventures had committed to, so we don’t believe that donors gave suboptimally overall. In the future we expect to make — and announce — our recommendations to Good Ventures and the general public simultaneously.
November 2014: Not informing candidate charities of our recommendation structure prior to publishing recommendations
How we fell short: In our 2014 recommendation cycle, we did not alert our candidate charities to our "Standout Charity" second-tier rating prior to announcing our recommendations publicly. Some of our candidate charities were surprised when they saw their ranking as a "Standout Charity," as they had been assuming that they would either be recommended as a top charity or not recommended at all.
Steps we have taken to improve: We will be more cognizant of how we communicate with charities in the future and will continue to solicit feedback from them so we can identify any other ways in which our communication with them is suboptimal.
July 2014: Published an update to the intervention report on cash transfers that misstated our view
How we fell short: Elie assigned a relatively new Research Analyst to the task of updating the intervention report on cash transfers. The analyst made the specific updates asked for in the task, which led him to change the report’s conclusion on the effect of cash transfers on business expenditures and revenue. A Summer Research Analyst vetted the page, and we published it. After publishing the update, another GiveWell staff member, who had worked on the page previously, noticed that the report’s conclusion on business expenditures and revenue misstated our view.
Steps we have taken to improve: When this mistake was identified, we made two changes. First, when passing off ownership of a page from one staff member to another, we involved all staff members who had previously owned the page via an explicit "hand-off" meeting, and by getting their approval before publishing the page. Second, we became more careful to ensure that all changes made by relatively inexperienced staff are reviewed by more experienced staff before publication.
October 2016 Update: At the time of this update (October 2016), we still aim to hold an explicit "hand-off" meeting that includes staff who previously owned the page, although this meeting does not always include all of them. We do not require the approval of all staff who previously owned the page prior to publication.
February 2014: Incorrect information on homepage
How we fell short: On February 4, 2014, we asked our website developer to make a change to the code that generates our homepage. In the process, he inadvertently copied the homepage content from November 2013. This content differed from the up-to-date content in two main ways. First, it described our top charities as “proven, cost-effective, underfunded and outstanding” rather than “evidence-backed, thoroughly vetted, and underfunded,” wording we changed in late 2013 because we felt the new wording more accurately described our top charities. Second, it listed our top charities as AMF, GiveDirectly, and SCI, rather than GiveDirectly, SCI, and Deworm the World. According to our web analytics, 98 people visited our AMF page directly after visiting the homepage, possibly believing AMF to be one of our top charities. Note that the top of our AMF review correctly described our position on AMF at this time.
Steps we’ve taken to improve: We discovered the problem on February 25 and fixed it immediately. We have added a step to our standard process for checking the website after a developer works on it to look for content that is not up to date.
January to November 2013: Social (non-family, non-financial) relationship between GiveWell staff members and staff of a recommended charity not publicly disclosed
How we fell short: Timothy Telleen-Lawton (GiveWell staff member as of April 2013) has been friends with Paul Niehaus (GiveDirectly President and Director) for many years. When Timothy met Holden Karnofsky (GiveWell's Co-Founder and Co-Executive Director) in April 2011, he suggested that GiveWell look into GiveDirectly and introduced Holden and Paul by email. GiveWell later recommended GiveDirectly as a top charity in November 2012, before Timothy was on GiveWell staff.
Starting in January 2013, Holden started living in a shared house with Timothy, around the same time Timothy started a trial to work at GiveWell. Paul has visited and stayed at the shared house several times. We should have publicly disclosed the social connection between Paul and Holden and Timothy.
Note that this mistake solely relates to information we should have publicly disclosed to avoid any appearance of impropriety. We do not believe that this relationship had any impact on our charity rankings. Timothy was not the staff member responsible for the evaluation of GiveDirectly, and Holden has had relatively little interaction with Paul (and had relatively little interaction with Timothy prior to moving to San Francisco in 2013).
Steps we have taken to improve: We publicly disclosed this fact in December 2013; at that time, we also created a page to disclose conflicts of interest.
February to September 2013: Infrequent updates on our top-ranked charity
How we fell short: We aimed to publish regular updates on the Against Malaria Foundation, but we went most of the year (February to September) without any. This was caused by our desire to publish comprehensive updates: we allowed the expectation that new information would shortly be available to delay the publication of brief updates containing meaningful but limited information.
Steps we have taken to improve: As of July 2013, we changed our process for completing top-charity updates. We began publishing notes from our conversations with these charities (as we do for many of the conversations we have more generally) which should lead to more timely updates on our top charities.
October 2016 Update: We now plan to publish twice-yearly 'refreshes' of all of our top charity recommendations, in addition to publishing conversation notes and relevant updates throughout the year.
May to June 2013: Unpublished website pages intermittently available publicly
How we fell short: From May 20 to June 26, private content was intermittently available to the public on the GiveWell website. A change we made on May 20 caused pages set to be visible only to staff to appear, in some browsers, as a page with a login screen and the unpublished content below it. Unpublished content includes both confidential information and incomplete research. Confidential information on unpublished pages is generally information that we expect to be able to publish but have not yet received approval from an external party to publish. However, there are exceptions to this, and it is possible that sensitive information was revealed; we are not aware of any cases of this happening.
Steps we have taken to improve: We fixed the problem a few hours after discovering it. We have added monitoring of unpublished pages to our list of regular website checks.
April to December 2012: Taking too much of job applicants' time early in the recruiting process
How we fell short: During this period, our jobs page invited applicants to apply for our research analyst role. We responded to every applicant by asking them to work on a "charity comparison assignment" in which each applicant compared three charities and discussed which charity they would support and why. This assignment took applicants between 6 and 10 hours to complete. During this period, approximately 50 applicants submitted the assignment, of which we interviewed approximately 8.
We now feel that asking all applicants to complete this test assignment likely took more of their time than was necessary at an early stage in the recruiting process and may have led some strong applicants to choose not to apply.
Steps we've taken to improve: We no longer ask all applicants to complete this assignment. In December 2012, we changed our jobs page to more clearly communicate about our hiring process.
March to November 2012: Poor planning led to delayed 2012 charity recommendations release
How we fell short: In early GiveWell years, we aimed to release updated recommendations by December 1st in order to post our recommendations before "giving season," the weeks at the end of the year when the vast majority of donations are made. In 2011, we released our recommendations in the last week of November, but then ran into problems related to donation processing. To alleviate those problems in the future, we planned to release our recommendations in 2012 by November 1st to give us sufficient time to deal with problems before the end of the year rush of giving.
In 2012, we did not release our recommendations until the last week of November (significantly missing our goal). We continued to publish research about the cost-effectiveness and evidence of effectiveness of the interventions run by our top charities throughout December, which meant that some donors were making their giving decisions before we had published all the relevant information. The primary cause of the delay was that we did not start work on GiveDirectly, the new 2012 top-rated charity, until mid-September, which did not give us enough time to finish its full review by the November 1st deadline.
Steps we've taken to improve: In 2013, we again aim to release our recommendations by November 1. This year, we plan to explicitly consider possible top-rated charities on July 1st and move forward with any contenders at that point. October 2016 Update: We now aim to publish our top-charity recommendations before U.S. Thanksgiving each year, so as to make them available throughout the year-end giving season.
June 2012: Failure to discuss sensitive public communication with board member
How we fell short: In late June 2012, we published a blog post on the partnership between GiveWell and Good Ventures. We generally discuss sensitive public communication with a board member before we post, but failed to do so in this case. The post was not as clear as it should have been about the nature of GiveWell's relationship with Good Ventures. The post caused confusion among some in our audience; for example, we received questions about whether we had 'merged.'
Steps we've taken to improve: GiveWell staff will be more attentive in the future to sharing sensitive public information with the board member responsible for public communication before posting.
July 2007 to March 2012: Phone call issues
How we fell short: Throughout GiveWell's history, we have relied on Skype and staff's individual cell phones to make phone calls. This led to instances of poor call quality or dropped calls, but given that GiveWell was a startup, those we spoke with generally understood. In addition, we had not always confirmed with call participants the phone number to use for a particular call, or set up and sent agendas in advance. Earlier in GiveWell's history, participants likely understood that we were a very new, small organization just getting started and aiming to control costs. But as we've grown, this is no longer a reasonable justification, and both of the problems listed here may have had implications for the professionalism we've projected to those we've spoken with.
Steps we have taken to improve: We have continued to be more vigilant about confirming that all participants are aware of the number to use for scheduled calls. In March 2012, we set up dedicated lines and handsets for our calls.
December 2011: Poor communication to donors making larger donations (e.g., greater than $5,000) via the GiveWell website
How we fell short: In giving season 2011, there were 3 major issues which we communicated poorly about to donors:
- While Google covers credit card processing fees for charities enrolled in the Google Grants program (which includes GiveWell itself and many of our top charities), many charities are not enrolled, and donors who give to them via our website therefore do pay credit card processing fees on their donations. While these fees are small in absolute terms for smaller donors, a 3% fee on a $10,000 donation is $300. Some donors may realize this and choose to give via credit card regardless. Others, however, may not have realized this and would have preferred to mail a check to save the fee. October 2016 Update: Donors who support GiveWell or our top charities through donations to GiveWell are subject to payment processing fees that vary depending on the platform through which they donate; we no longer receive free processing through Google due to the end of that program. We have published a page detailing the options for donating to GiveWell and associated fees, as well as advice for larger donors interested in minimizing such fees.
- People making large donations very frequently run into problems with their credit card companies (because they are spending much more on a single item than they usually do). In our experience, about half of donations over $5,000 are declined the first time a donor tries to make the gift, and are only cleared after the donor speaks with their card company. This creates confusion and unexpected hassle for donors trying to give to charity.
- Giving via appreciated stock has tax benefits for donors, allowing them to reduce future capital gains taxes and therefore give more to charity (without giving more "real" money). We did not broadcast this message to donors.
Steps we have taken to improve: Though GiveWell's responsibility for communicating about the points above varies, communicating well about all of the above furthers our mission. We plan to communicate better about these points to larger donors in 2012. (More at a 2012 blog post.)
October 2016 Update: In addition to our page listing giving options for donors, we also have a page of advice for larger donors, including donating appreciated securities.
December 2011: Problems caused by GiveWell's limited control over the process for donating to our top charities
How we fell short:
- On December 21, 2011, a representative from Imperial College Foundation (the organization receiving donations for the support of the Schistosomiasis Control Initiative, our #2-rated charity in 2011) emailed us to let us know that its Google Checkout account had been suspended. Donors who wanted to give to SCI via the GiveWell website give via Google Checkout, and though the Google Checkout button is on the GiveWell website, the charity owns the Checkout account and donations go directly to it. GiveWell staff therefore did not know there was a problem until the ICF representative informed us of it. We still do not know how long the problem lasted or whether any donors attempted to make donations during the time the account was suspended. (We do not even know how Google communicated with ICF about the error.) ICF contacted Google but has not determined what led to the account suspension.
Once we learned of the problem, we reconfigured donations to go through GiveWell.
- As noted elsewhere on this page, many larger donations made via credit card are initially declined by credit card companies because many donors give a larger amount to charity than they spend on any single purchase throughout the year. Because donations go directly to our charities, GiveWell at times has to coordinate with charities' representatives to cancel charges so that donors feel safe resubmitting their donation. This creates confusion, wastes time, and doesn't allow donors to complete the transaction as quickly as they would like.
- Setting up trackable donation processing for our top charities requires individual communication with each charity. This means that we must spend time communicating with each charity, and each charity must spend time creating its account. Also, if a charity does not have time to set up the account, or sets it up but the account has a problem, the required tracking may not be in place. For several charities in 2011, tracking was either not set up at the time we released our recommendations or we needed to create a one-off workaround to track donations to them.
Steps we have taken to improve:
- We plan to better advise larger donors of their non-credit-card options for donating and potential hassles of donating via credit card. October 2016 Update: We have a page of advice for larger donors that discusses options for making large donations. We also have an information page discussing different donation options.
- We are now considering switching over donations to all charities to go through GiveWell so that we are immediately aware of any problems. October 2016 Update: We offer donors the option of donating to GiveWell for regranting to our top charities, or donating directly to our top charities and letting us know that they've done so.
- We aim to complete our recommendations earlier in 2012 than 2011 (to give us additional time to address any problems that come up).
December 2011: Miscommunicating to donors about fees and the deductibility of donations to our top charity
How we fell short: Our top-rated charity in 2011 was the Against Malaria Foundation. We made two errors in the way we communicated to donors about the ramifications of donating to AMF.
- Fees: On our donate to AMF page, we told donors that "no fees are charged on donations to AMF." This was incorrect. Donors who give via AMF's website are charged normal credit card processing fees. We now understand that we miscommunicated with AMF on this issue; AMF did not intend to communicate that there are no processing fees and was unaware that we were communicating this on our site.
- Tax deductibility in Australia: On our top charities page that we published on November 29, 2011, we listed Australia as one of the countries for which donors could take tax deductions. We believed this was accurate because AMF listed Australia as one of the countries in which it is a registered charity. In early December, an Australian donor emailed us to let us know that while AMF is a registered charity and corporations can deduct donations to it, it does not have a status that allows individuals to deduct donations to it. (This issue is discussed in a 2012 blog post.) [November 2015 update: Gifts from individuals to AMF are now tax deductible in Australia. See this blog post for details.]
Steps we have taken to improve:
- Fees: We changed the language on our page to clarify that credit card fees are charged on donations via AMF's website. We also provided donors who wished to give for the support of AMF the option to give donations directly to GiveWell. Because GiveWell was enrolled in Google's Grants program, Google paid credit card processing fees for donations. GiveWell then had the ability to regrant these funds to AMF.
October 2016 Update: Credit card processing fees are incurred if a donor supports AMF through GiveWell; they are no longer covered by Google. Additional details are available here.
- Tax deductibility in Australia: We took several actions. (1) We emailed Rob Mather, AMF's CEO. He agreed that the charity status page on AMF's website was misleading. AMF edited the page to clarify its status in Australia, and Rob Mather offered to refund any donations (or parts of donations) made by Australians relying on the fact that they could receive a tax deduction. (2) On our site, we removed Australia from the list of countries in which AMF is registered for individual-donor tax deductibility. (3) We emailed all Australian donors who had given to AMF (and had found AMF via GiveWell) since we had posted that donations to AMF are tax-deductible for Australians, to let them know we had erred, and we communicated Rob Mather's offer to refund donations. AMF is in the process of applying for tax-deductible status for individuals and will inform us if and when that has been granted. AMF has also told us that the two donors who have asked for refunds have both said they will donate the same amount to AMF when the tax-deductible status is in place.
As of September 2015, gifts from individuals to AMF are tax deductible in Australia. See this blog post for details.
Late 2009: Misinterpreted a key piece of information about a charity to which we gave a $125,000 grant
How we fell short: When reviewing Village Enterprise (formerly Village Enterprise Fund) in late 2009, we projected that they would spend 41% of total expenses on grants to business groups, because we misinterpreted a document they sent us which projected spending 41% of total expenses on business grants and mentorship expenses. We do not know what mentorship expenses were expected to be so we do not know the magnitude of our error. Village Enterprise ended up spending 20% of total expenses on business grants in FY 2010. We caught this mistake ourselves when we were updating the review in August 2011. Village Enterprise plans to spend 28% of total expenses on business grants in FY 2012.
Steps we are taking to improve: We have updated our review of Village Enterprise to reflect the correct distribution of expenses. Going forward, before publishing a page, at least one additional GiveWell employee will check the original source of figures that play a key role in our conclusions about a charity or program.
October 2016 Update: We do not always implement this check before publication; sometimes pages are vetted after they are published.
August 1, 2009 to December 31, 2009: Grant process insufficiently clear with applicants about our plans to publish materials
How we fell short: Between August 1 and December 31, 2009, we accepted applications for $250,000 in funding for economic empowerment programs in sub-Saharan Africa. We attempted to be extremely clear with charities that we planned to share the materials they submitted, and that agreeing to disclosure was a condition of applying, but in a minority of cases, we failed to communicate this. We conceded these cases and gave the charities in question the opportunity to have their materials - and even the mention of the fact that they had applied for funding - withheld.
We try to avoid keeping materials confidential unless absolutely necessary, and in this case our unclear communications led to confrontations and to confidentiality situations that could have been avoided.
Details at this blog post.
Steps we have taken to improve:
- We offered the minority of charities with whom we'd been unclear the option not only to have their materials omitted, but to have us not disclose the fact that they applied for funding from us.
- We added language to the top of our charity reviews to clarify what a "0-star rating" means.
- In the future, we may publicly publish pages on charities we consider before we accept materials from them, in order to make our intentions about disclosure and public discussion absolutely clear.
October 2016 Update: Details of our current transparency policy are available here.
November 25, 2009: Mishandling incentives to share information
How we fell short: A blog post discussing the Acumen Fund paraphrased information we'd been given during Acumen's application for funding from us. An Acumen Fund representative told us this had come off as a "bait and switch": using the grant application as a pretense for gathering information that we could use for a negative piece. (This was not the case; we had invited Acumen to apply in the hopes that they would be a strong applicant, and would have written a similar blog post afterward if they had simply declined to speak with us.)
We try to avoid creating incentives for charities to withhold information, given how little is currently available. Therefore, we are generally careful with how we use any substantive information that is disclosed, and we generally check with the charity in question before publishing anything that could be construed as "using it to make a negative point." (An example is our post on microfinance repayment rates, which uses voluntarily disclosed information to raise concerns about the repayment rate while attempting to be clear that the organization in question should not be singled out for this disclosure. We checked with the organization discussed before making this post.)
In this case, we published our post without such a check, reasoning that we were not sharing any substantive materials (only paraphrasing general statements from representatives). Doing so gave the impression that sharing more information can result in more negative coverage.
We continue to struggle with the balance between disclosing as much information as possible and avoiding disincentives to share information. We will not find a solution in every case, but feel that we mishandled this one.
Steps we have taken to improve: We have let Acumen Fund know that we regret this incident and resolved to be more careful about quoting from representatives and grant applications in the future.
May 2009: Failed to remove two private references from a recording that we published
How we fell short: In May 2009, we discussed the Millions Saved project with a staff member of the project, Dr. Jessica Gottlieb, and then published a copy of the recording of the conversation to our website. Dr. Gottlieb approved making the recording public on the condition that we remove personal references that she made during the conversation. We partially removed the references, but we failed to remove one person's email address and Dr. Gottlieb's suggestion that we speak with a particular person. We noticed this error in February 2014 while preparing to use this recording as part of a test assignment for potential employees. According to our logs, no one had downloaded the audio file during the previous year.
Steps we have taken to improve: We notified Dr. Gottlieb about this mistake and apologized to her. Subsequent to (and unrelated to) this error, we had implemented a formal procedure for reviewing uploaded files to confirm that all requested changes to files have been made.
January to September 2008: Paying insufficient attention to professional development and support
How we fell short: At our board meeting in January 2008, we agreed to explore options for professional development and mentoring, in light of the relative youth and inexperience of our staff. GiveWell staff put a lower priority on this than on more time-sensitive goals, and while we explored a few options, we made little progress between January and September. At the September board meeting, the Board criticized this lack of progress and reiterated the need for professional development and mentoring.
Steps we have taken to improve: As of July 2009, we had two highly regular mentoring relationships, and two more in a "trial phase." We also stepped up Board oversight through a monthly conference call (attendance was optional but generally high) and more regular calls with Board Vice-President Lindy Miller. An update on professional development was presented at our July 2009 Board meeting.