David Piepgrass
May 7, 2022 · 4 min read


Hi, I'm an EA and a sort of lukewarm longtermist. I also have a bee up my ass about misinformation, as you can see in my latest Medium article on the Ukraine war.

In connection with longtermism, you've linked to what appears to be a deliberately misleading editorial.

It starts out in a way that reminds me of the time Penn & Teller: Bullshit! did an episode about climate change in which they pitted a few random local activists against professional contrarians who explained how all this global warming stuff is bullshit. They didn't talk to any mainstream climate scientists, and they certainly didn't mention that there was a consensus among them.

Similarly, Phil's opening act mentions Nick Bostrom, Effective Altruism, FHI (the Future of Humanity Institute) and other key players, but instead of discussing their philosophy and opinions, he leads with guilt-by-association fallacies: pointing out that Elon Musk endorses longtermism (I notice it's trendy to dislike Musk) and tying Peter Thiel (billionaire Trump supporter) to EA in a very direct way. I've been an EA for years and cannot recall any connection between Peter Thiel and EA, so I posted a question on the EA forum to get more information about this (and about this baffling author). When he does get around to talking about longtermism itself, he paints a distorted picture.

Unfortunately, I don't see an EA response to that specific article, but at the following URLs you can see some EA responses to the *kinds* of things that Phil Torres writes:

https://forum.effectivealtruism.org/posts/9YFYuw7qAj3ovh9uK/phil-torres-article-the-dangerous-ideas-of-longtermism-and

https://forum.effectivealtruism.org/posts/xtKRPkoMSLTiPNXhM/response-to-phil-torres-the-case-against-longtermism

https://forum.effectivealtruism.org/posts/tvgvZcmwyrBK9kKbi/the-phil-torres-essay-in-aeon-attacking-longtermism-might-be

I don't want to waste time on a point-by-point takedown, but I do want to comment on his very first direct criticism of longtermism, which is about climate change:

> Why do I think this ideology is so dangerous? […] Consider that, as I noted elsewhere, the longtermist ideology inclines its adherents to take an insouciant attitude towards climate change. Why? Because even if climate change causes island nations to disappear, triggers mass migrations and kills millions of people, it probably isn’t going to compromise our longterm potential over the coming trillions of years.

From my perspective this is a very interesting claim, because I spent a good solid year, unemployed, personally arguing with climate science deniers one-on-one, writing articles like this one, and eventually volunteering as a moderator on Denial101x and as a member of the SkepticalScience team, during which time I wrote articles like "How could global warming accelerate if CO2 is 'logarithmic'?"

I think this makes me a quasi-expert on climate change: a person who knows a lot about the subject and firmly supports the mainstream consensus position that humans are causing climate change and that it is bad.

What I remember about the EA debate on climate change doesn't sound like what Phil is saying. He is correct that climate change is probably not an existential risk, but his claim that longtermists are "insouciant" (indifferent) about the matter is probably false for most longtermists and very clearly false for me. I have personally donated several thousand dollars to this EA campaign in support of clean energy technology.

Also noteworthy is his labeling of longtermism as an “ideology”. As a fan of the Niskanen Center’s invitation to “abandon ideology”, I take exception to this. I think Phil Torres wants people to think that longtermists, who are high-rung people on The Thinking Ladder, are low-rung ideological zealots.

Edit: also noteworthy is that rich people spending money on preventing pandemics, dangerous AGIs, and nuclear war is labeled “so dangerous”, while rich people spending money on yachts, mansions, private islands, and private jets is not.

Granted, I'm only a "lukewarm" longtermist: you might say I believe that longtermism, short-termism, and medium-termism all have important and valuable cause areas. But generally I am lukewarm not because I think longtermism isn't the most important thing, but because I think that

  1. putting humanity on a good trajectory in the short term will increase the chance of humanity being on a good trajectory in the long term.
  2. the far future is extremely uncertain, which limits practical longtermist interventions to things like x-risk and s-risk mitigation, and I don't think those will achieve longtermist goals all by themselves. Thus, short-term and medium-term ideas should also be important to people devoted to longtermism.
  3. humans naturally have a bigger emotional attachment to people who are already alive, and moreover humans have a mental flaw called "scope insensitivity", due to which 10²⁰ lives in the far future only feel slightly more valuable than the ~10¹⁰ human lives that exist today. As a result of these human flaws, longtermism is not an emotionally compelling way to promote EA. Given that increasing the number of EAs will increase the chance of a healthy long-term future, and that longtermism is not emotionally compelling compared to simply preventing global catastrophe, longtermism seems, to my mind, not very useful for growing the EA movement. (Yet this doesn’t mean it isn’t a wonderful thing.)
  4. it is not clear to me that the loss of future lives is as bad as the loss of current lives (and, as someone sympathetic to negative utilitarianism, I feel that s-risks are worse than x-risks).

So far as I’ve seen, experts who are critical of elements of climate science, but acting in good faith, tend to make their criticisms in the form of scientific papers. Similarly, experts who are critical of EA ideas, but acting in good faith, tend to bring their arguments to the EA forum. In contrast, bad-faith actors bring their misleading criticisms to the general public; they count on having an audience that is ignorant of the details of the issue, so that they can manipulate perception effectively.

