Effective Altruism has Five Serious Flaws - Avoid It - Be a DIY Philanthropist Instead
Hank Pellissier
2015-07-13 00:00:00

I happily devoured Singer’s initial book on the topic, The Life You Can Save: How to Do Your Part to End World Poverty. Enthusiastically, I dug into his follow-up, The Most Good You Can Do: How Effective Altruism Is Changing Ideas About Living Ethically. About three chapters in, I had a few quibbles, but I was feeling so ethically enamored that I enrolled in Singer’s free Princeton Effective Altruism online course, plus I joined the Effective Altruism Bay Area Meetup Group - to deepen my EA education and commitment.



Diving into Princeton’s online coursework, I quickly read all the Week One assignments. I was zooming along the straight-and-narrow path to becoming a full-fledged EA disciple, but then… I meandered over to the Effective Altruism Forum introductory page, which displayed 22 short essays; I read them all immediately, with increasing dismay.

The “minor quibbles” I originally had with EA flared into major disappointment. I backtracked; I reversed; I rebelled. Now I am not going to the Meetup, I’m not advancing to Week Two in the online course, I’m tossing The Most Good You Can Do into my Kindle archive. I’m finished with EA.

Here are the five faults I find with Effective Altruism:

1. EA’s Over-Reliance on Charity Navigator, Give Well, and other NPO Evaluators
2. EA’s Stance that “Earning High and Giving Big” is Superior
3. EA’s Too-High Consideration for Animal Rights
4. EA’s Weird, Wrong Alliance with MIRI (Machine Intelligence Research Institute)
5. There’s an Alternative to EA that’s Far Superior: I call it “DIY Philanthropy”


I need to disclose my activities now: I’m founder/director of a small nonprofit that raised $40,000 last year for orphanages, schools, and clinics in Uganda, The Philippines, and the Congo.


Let’s examine EA’s problems, one at a time:

FLAW #1: Over-Reliance on Charity Navigator, Give Well, and other NPO Evaluators

Singer and EA advise giving money only to organizations that spend 75% - 100% of their budget on services. Worthy groups - like the Against Malaria Foundation - are determined by “evaluators” like Charity Navigator and GiveWell, who spend millions conducting their research.

There are two humongous flaws in this simplistic and elitist procedure:

1. The effectiveness of a charity’s service depends on more than the percentage of its budget spent on services. If 60% of one NPO’s funds provides a greater good than 90% of another NPO’s budget that serves the same community - dollar for dollar - it’s more effective and ethical to fund the higher-quality service.

2. GiveWell and Charity Navigator both have a “Bigger is Better” bias that cripples grassroots charity startups. Well-deserving groups like mine that spend 100% of our income on services are off the radar of the evaluators because we’re too new and too small. Charity Navigator only recommends long-lived humanitarian behemoths with total annual revenue above $1,000,000 and at least seven years of existence.

EA’s recommendation that philanthropists fund only the Big Old Charities cripples small newbie groups like mine. My NPO has lost several potential donors who backed out when they couldn’t find our name listed on GiveWell or Charity Navigator. What EA has done is “corporatize” the humanitarian field; it has given large-budget NPOs a massive advantage over startup charities. EA is “centralizing” the giving field, enabling a select few groups to monopolize it.

Do I believe little tiny humanitarian groups provide better services than the massive bureaucracies? YES. For real “effectiveness” in altruism, I suggest seeking out solitary individuals or small contingents who are working directly on your favorite concern.



Hannah Smith, for example, is a blonde UK do-gooder who helps destitute war orphans with PTSD in the Congo via her Congo Orphans Trust. She trucks life-saving supplies into the dangerous region, to mud-and-wattle orphanages packed with hungry kids with haunted eyes.

Thousands of “DIY Humanitarians” like Hannah exist. They will answer your emails, they will thank you profusely, and they will deliver 100% of whatever you give them - because they do it because they care, not for a salary. These people run small Indiegogo and GoFundMe campaigns, or maybe they just solicit their friends, communities, or churches. Peter Singer and the EA movement ignore and injure their potential to contribute by promoting only the Big NPOs.

To learn more about “start-up” humanitarianism, I suggest reading A Path Appears by Nicholas Kristof and Sheryl WuDunn.

FLAW #2: EA’s Stance that “Earning High and Giving Big” is Morally Superior

EA seems to crudely regard CASH DONATED as carrying the highest moral value. The EA-promoted essay “To save the world, don’t get a job at a charity; go work on Wall Street” by William MacAskill says, “while researching ethical career(s)… I concluded that it’s in fact better to earn a lot of money and donate a good chunk of it…you’ll have made a much bigger difference.” Doing good deeds in your vocation, claims MacAskill, is probably inferior: “if you decide to work in the charity sector, you’re rather limited.”

His reasoning - supported by EA - proclaims that it is ethically superior, for example, to take a $200,000-a-year job on Wall Street and donate 50% to charity than it is to teach high school math in the inner city for $50,000 and donate $5,000 to charity. He’s wrong in this inhumane assessment, for two reasons:

1. The happiness of the individual funder is disregarded. Of course it is wonderful that the additional $95,000 gained is perhaps curing malaria, but it’s callous to suggest that everyone in the developed world is ethically required to devote themselves to high-salaried occupations that they might hate. The giver’s life and need for happiness also have value. Mandating that developed-world people labor for others in occupations that might make them miserable is self-righteous and unethical.

2. EA disregards the “human value” of an occupation. The math teacher is unable to donate $95,000 annually to charitable causes, but he is, every school day, conveying information on an important topic and serving as a role model and support for young adults. He is in a position to touch, change, and improve lives. Maybe he will inspire his students to quit drugs, leave gangs, go to college. Wall Street sharks aren’t doing that; they’re usually just helping the 1% maintain their privileged status. Is the math teacher’s contribution to helping humanity less than the Wall Streeter’s, as MacAskill asserts? No. I believe the math teacher’s contribution is greater.

FLAW #3: EA’s Too-High Consideration for Animal Rights

Yes, I’m an omnivore, and yes, I’m a “Speciesist.” I’m a Humanitarian, not an Animalitarian. My priority is helping humans first; I think it’s the ethical and sensible stance. I deplore factory farming, but it is below human slavery and genocide on my list of concerns.

Singer/EA, not surprisingly, puts Animal Causes on the list of most-ethical concerns. Animal Rights NPOs even have their own evaluator. I don’t support this position; I find it misguided. Hundreds of millions of dollars are already donated to beast-centric concerns that pamper orphan tigers, for example, so they can return to the wild and slay ungulates. I feel a wee bit of compassion for these creatures, but the vast majority of my empathy is reserved for homo sapiens who are hungry, diseased, and uneducated.

Truth is, I don’t think it’s ethical to donate thousands of dollars to furry-friendly organizations like Maddie’s Fund in San Francisco, where waiting-for-adoption felines recline on comfy furniture, licking their lips while watching songbird videos on their own television set. Meanwhile, right outside, homeless humans dig through dumpsters looking for slabs of cardboard to use as a mattress for the night.

Helping Humans First is central to my moral code. Singer elevates animals to a level that is unacceptable to me; this promotion detours money away from needy people. I find that crazy and shameful.



FLAW #4: EA’s Weird, Wrong Alliance with MIRI (Machine Intelligence Research Institute)

MIRI is a Berkeley-based research team previously titled SIAI (Singularity Institute for Artificial Intelligence). MIRI has a history of arrogance and aggressiveness, justified in their minds, I suppose, by their opinion that the future of the world depends on their ability to help create Friendly AI. MIRI has the financial support of Peter Thiel, who is worth $2.2 billion according to Forbes’ Midas List. MIRI isn’t curing disease or helping the poor; its budget pays the salaries of its aloof, we’re-more-rational-than-you researchers. I’m dismayed that MIRI has infiltrated EA.

Two of the recommended introductory essays on the Effective Altruism organization site are written by MIRI members. Posted second, right under Singer’s preface article, is a math-wonky article by SIAI/MIRI founder Eliezer Yudkowsky. Luke Muehlhauser, MIRI’s recent Executive Director (who left last month to join GiveWell), wrote a let’s-set-the-agenda article further down the list, titled “Four Focus areas of effective altruism.” He places MIRI in the third focus area.

MIRI/SIAI tried to “take over” the transhumanist group Humanity+ 3.5 years ago, when four SIAI members ran for H+’s Board. SIAI ran a sordid, pushy, insulting campaign: bribing voters, accusing opponents of “racism,” and deriding Board members as “freaky… bat-shit crazy [with] broken reasoning abilities.” MIRI failed in its attempt to colonize H+, but it has successfully wormed its way into the heart of EA.

A colleague of mine (who asked me not to disclose their identity) attended the 2014 EA Summit in San Francisco and afterwards was of the impression that “MIRI and CFAR (Center for Applied Rationality) are essentially the ‘owners’ of EA. EA as a movement has already sold itself in deals to devils.” This is surely an exaggeration regarding international EA, but in the SF Bay Area, MIRI’s presence within EA is uncomfortably strong.



FLAW #5: There’s an Alternative to EA that’s Far Superior: I call it “DIY Philanthropy”

Effective Altruism provides too much advice and too many judgmental opinions on whom, how, and why to fund. This renders us passive, because EA insists that it has already done the research and ethical thinking for us.

Compassionate people don’t need Big Brother informing them what’s right or wrong, or how to help others. EA is just an obstacle in the path of a far better activity: DIY Philanthropy.

I won’t provide you with lengthy instructions detailing how to accomplish this. Being a DIY Human means figuring it out yourself. My only hint is: be a Hannah Smith. She wants to help war orphans in the Congo, so she helps them.

You don’t need Peter Singer and EA telling you how to be charitable.

Let your own brain and heart be your guide.

---

A Vox article that supports my POV can be found HERE


Another IEET essay on Effective Altruism can be found HERE

An essay on DIY Philanthropy can be found HERE