Our fund managers work full-time at the Open Philanthropy Project, finding the very best donation opportunities. They look for charities working in their key focus areas that have a good track record and a robust evidence base to support their chosen intervention. If the program you'd like to recommend falls within one of their key focus areas, run some quick calculations to check whether the charity is in the same range of cost-effectiveness as previous grantees. If so, great! We'd love to hear about it, and I'd recommend you get in touch with the program officer in charge of the program area your non-profit targets.
We recommend running this quick test because while many non-profit programs have a plausible story for impact, very few non-profits surpass our stringent bar for effectiveness.
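The "quick calculations" above are a back-of-the-envelope cost-per-outcome comparison. A minimal sketch of what that check might look like, where every figure (the benchmark range, the candidate's budget and outcome count, and the 2x tolerance) is a hypothetical assumption, not real grantee data:

```python
# Back-of-the-envelope cost-effectiveness check.
# All numbers below are illustrative assumptions only.

def cost_per_outcome(total_cost: float, outcomes: float) -> float:
    """Dollars spent per unit of outcome (e.g. per life saved or per QALY)."""
    return total_cost / outcomes

# Hypothetical range observed among previous grantees ($ per outcome).
benchmark_low, benchmark_high = 2_000, 7_000

# Hypothetical candidate charity: $500k program cost, 120 outcomes delivered.
candidate = cost_per_outcome(500_000, 120)

# A loose pass criterion: within ~2x of the least cost-effective grantee.
in_range = candidate <= 2 * benchmark_high
print(f"${candidate:,.0f} per outcome; within range: {in_range}")
```

The loose 2x tolerance reflects that a quick estimate like this is only meant to screen out clear mismatches, not to settle the comparison.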
We don't charge any management fees; you can donate through EA Funds, and we'll pass on 100% of the money we receive to the charities.
We don't consider a charity's overhead when evaluating effectiveness, in the same way that you don't consider how much Tim Cook gets paid when deciding whether to buy an iPhone. Historically, groups like Charity Navigator have looked at overhead ratios for one simple reason: they're much easier to measure. Unfortunately, overhead ratios are simply not a useful measure of charity effectiveness, and they are easily gamed by unscrupulous charities trying to raise funding. Instead, we look at the total costs a charity incurs, and add in any costs that are necessary to deliver their program but aren't included in their budget. Then we look at the outcomes the charity achieves for that funding as a whole. I'd check out Dan Pallotta's TED talk if you want to know more about the overhead myth.
https://www.ted.com/talks/dan_pallotta_the_way_we_think_abou...
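To make the comparison concrete, here is a toy sketch of why overhead ratio and total-cost-per-outcome can point in opposite directions. All charities and figures are hypothetical:

```python
# Overhead ratio vs. total cost per outcome (all numbers hypothetical).

def overhead_ratio(overhead: float, total_spend: float) -> float:
    """The metric overhead-based evaluators report: admin cost / total spend."""
    return overhead / total_spend

def true_cost_per_outcome(budgeted: float, unbudgeted: float, outcomes: float) -> float:
    """Include costs needed to deliver the program even when they don't
    appear in the charity's own budget (e.g. donated goods, partner staff time)."""
    return (budgeted + unbudgeted) / outcomes

# Charity A: a lean-looking 5% overhead, but only 50 outcomes for
# $1m of true costs ($800k budgeted + $200k unbudgeted).
a_overhead = overhead_ratio(40_000, 800_000)
a_cost = true_cost_per_outcome(800_000, 200_000, 50)

# Charity B: a "wasteful-looking" 30% overhead, but 500 outcomes
# for the same $1m of true costs.
b_overhead = overhead_ratio(240_000, 800_000)
b_cost = true_cost_per_outcome(800_000, 200_000, 500)

print(a_overhead < b_overhead)  # A looks better on overhead alone...
print(b_cost < a_cost)          # ...but B delivers outcomes 10x more cheaply.
```

The point of the sketch is just that the overhead metric never references outcomes at all, so it can rank charities in exactly the wrong order.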
Just to add my $0.02, my impression is that while many EAs enjoy discussing thorny philosophical issues like whether we should be concerned about insect suffering or wild-animal suffering, very few would advocate that we actually support habitat destruction or massive interventions in nature. Even groups like FRI, which are heavily focused on suffering, promote the idea of moral uncertainty. They believe that we should avoid drastic actions based on a narrow ethical view, due to uncertainty about which ethical views are more valid.
Like with everything, more controversial issues are more likely to be picked up by the media and blown out of proportion, relative to the actual level of support they receive. I would be extremely surprised if any money from the CEA Animal Welfare Fund went to support habitat destruction to reduce wild-animal suffering. I would be less surprised if money from the fund went to support research into animal consciousness, to help us better compare different types of animal welfare interventions.
I'm not from the media and I am campaigning against "effective altruism" because it promotes eco-terrorism, habitat destruction, call it what you want. I am compelled to do this to protect public safety.
There's no point in denying that a large portion of self-identified EAs are for eco-terrorism; it's all over the internet. There is no gray area here: you are either with the terrorists (strong negative utilitarians) or against them.
The Long-Term Future Fund is designed to be fairly broad, and it may well support some climate change initiatives in the future, as well as addressing potential risks from advanced artificial intelligence and other potential global catastrophic risks. Nick Beckstead has previously expressed interest in funding more research to quantify the tail risks associated with runaway climate change in particular.
Having said that, one of the reasons that EA has not focused as heavily on climate change historically is that climate change is not as neglected as potential risks from AI, or biotechnology. We are happy to see a lot of funding going into climate research and modelling, and a lot of grassroots activism. While climate change is far from solved, and remains one of the most important problems of our time, many EAs think that we should first focus on problems which receive less media attention but are plausibly just as serious.
That doesn't mean there's nothing neglected in climate change (e.g. negative emissions technology) and it would be good to see some more investigation into this.
> many EAs think that we should first focus on problems which receive less media attention but are plausibly just as serious.
How is that effective? How could you even measure the effectiveness of focusing on problems that don't currently exist but that you happen to consider "plausible"? (Global warming, in contrast, is quite measurable.)
I'm looking at the list of recipients of that fund and it looks like self-dealing: people on the futurist side of EA would like to encourage people to donate to people on the futurist side of EA.
That's true. You can donate to GiveWell's top charities directly through GiveWell. For many donors, especially those who only want to donate to proven charities with a good track record, this is a great choice.
With our Global Health and Development Fund, we hope to also make small seed grants to promising new initiatives, to help them build evidence to support their program, or to replicate a promising intervention in a new geographic region to figure out whether the program can scale. Many of these donation opportunities are small, and individual donors won't necessarily hear about them. With Elie managing this fund, we hope to be able to quickly fund promising new projects, so they spend less time on PR and fundraising, and more time doing good work.
It's really great to hear this view. While lots of donors split their donation between our funds, many people choose to allocate 100% of their donation to a single fund. At a later date, we want to allow people to share their fund allocations, so donors can compare allocations and discuss differences in cause prioritization.
Right now, our Global Health and Development Fund has 61% of all donations by value, with the Long-Term Future Fund coming in at 22%. It will be exciting to see how this changes over time, and whether there are differences in fund allocation by geographic area or demographics.
FWIW, about 11% of people at our last conference identified themselves as following an earn-to-give path, and we think that's about the right proportion. We don't think everyone should earn-to-give, but we do think it's a path that some people should consider.
I spent about a year earning-to-give as a pharmacist myself before I realised that I could probably do much more good by joining CEA and doing direct work. I gained lots of valuable, real-world experience working in hospitals and at big organizations like the Red Cross. That experience not only helped me donate to charity in the short term, but also informs my direct work now. I hope that many people will follow a similar path.
Hi, thanks for your post.
I think that many EAs would agree with you here. Effective Altruism is not just about maximizing QALYs. It's about figuring out what doing good even means, figuring out how to improve the world, and then actually doing it. We're not just about educating people on one narrow conception of what doing good means; we know that doing good is hard and complex, so we're trying to build a community of people focused on figuring that out. Our community endlessly debates what it means to do the most good, and there is a lot of nuance. That's what we love about it!
You mentioned some interesting issues in population ethics in your post; it appears you take the person-affecting view. Many EAs who take this view prefer to support global health or animal welfare charities, as they do not think it is beneficial to maximize the number of happy people in the world. Other people in our community think that a world with more people is better, provided that adding more people does not reduce the overall total happiness. In our summary of the Long-Term Future Fund, you can read Nick's take on the person-affecting view[0] and see some more links to discussion of this issue. We love getting into these debates and seeing lots of different perspectives.
[0] https://app.effectivealtruism.org/funds/far-future
We do need many more talented people to found and work for effective non-profits. At CEA, we've found it hard to hire extremely talented people, as many of the people we want to hire want to continue donating instead! Non-profits often find it hard to attract top talent, in part because they tend to pay lower salaries, because working at a non-profit is less prestigious, and because talented people tend to want to work with other talented people. We have been impressed with Y Combinator's approach here: they're trying to incentivize talented people to found really effective non-profits, and then help them scale. We're hoping that if donors are willing to fund these promising new projects, many more people will be drawn to the non-profit world.
On the other hand, just like founding a company is not for everyone, neither is working at a non-profit. We should each consider our comparative advantage. As tempestn mentioned, some people might be extremely good at their day-job and be well compensated for it, but might not necessarily make excellent activists. In that case, that person might be able to help the causes they care about most by donating to effective charities.
We need all kinds of people!
Is there an EA job board somewhere? Maybe people aren't interested due to pay, but I doubt many people are aware the jobs exist in the first place. I was under the impression it was a small pool of applicants but a much smaller pool of job openings.
While there aren't many job openings posted, many effective organizations are open to hiring talented individuals at other times throughout the year. It's definitely worth dropping organizations a quick email to let them know you're interested in future opportunities. If any of you are interested in working at CEA in the future, head to our website and ask to be added to our recruitment email list. We'll be hiring for 4-6 positions later in the year.
There's this facebook group (https://www.facebook.com/groups/1062957250383195/) where EA orgs tend to post their jobs, along with other jobs from orgs that are interesting to EAs even if not explicitly EA.
We agree, effective altruism attempts to focus on problems that are important, tractable and neglected. Animal welfare fits squarely into this bucket for all the reasons you mentioned.
In particular, we often focus on farm animal welfare, as this is an area that is particularly neglected. I love this post from Animal Charity Evaluators[0] which explains that while farmed animals account for 99% of animals killed and used by humans in the US, the vast majority of donations go to animal shelters, with only 1% going to charities which help reduce the suffering of farmed animals.
[0] https://animalcharityevaluators.org/blog/why-farmed-animals/
GiveWell is especially excited to see non-profits working on these intervention areas http://blog.givewell.org/2015/10/15/charities-wed-like-to-se...