
muriel_volestrangler (101,306 posts)
Thu Jul 29, 2021, 01:53 PM

The Dangerous Ideas of "Longtermism" and "Existential Risk"

In a late-2020 interview with CNBC, Skype cofounder Jaan Tallinn made a perplexing statement. “Climate change,” he said, “is not going to be an existential risk unless there’s a runaway scenario.” A “runaway scenario” would occur if crossing one or more critical thresholds in the climate system causes Earth’s temperature to rise uncontrollably. The hotter it has become, the hotter it will become, via self-amplifying processes. This is probably what happened a few billion years ago on our planetary neighbor Venus, a hellish cauldron whose average surface temperature is high enough to melt lead and zinc.
...
That’s one possibility, for sure. But I think there’s a deeper reason for Tallinn’s comments. It concerns an increasingly influential moral worldview called longtermism. This has roots in the work of philosopher Nick Bostrom, who coined the term “existential risk” in 2002 and, three years later, founded the Future of Humanity Institute (FHI) based at the University of Oxford, which has received large sums of money from both Tallinn and Musk. Over the past decade, “longtermism” has become one of the main ideas promoted by the “Effective Altruism” (EA) movement, which generated controversy in the past for encouraging young people to work for Wall Street and petrochemical companies in order to donate part of their income to charity, an idea called “earn to give.” According to the longtermist Benjamin Todd, formerly at Oxford University, “longtermism might well turn out to be one of the most important discoveries of effective altruism so far.”

Longtermism should not be confused with “long-term thinking.” It goes way beyond the observation that our society is dangerously myopic, and that we should care about future generations no less than present ones. At the heart of this worldview, as delineated by Bostrom, is the idea that what matters most is for “Earth-originating intelligent life” to fulfill its potential in the cosmos. What exactly is “our potential”? As I have noted elsewhere, it involves subjugating nature, maximizing economic productivity, replacing humanity with a superior “posthuman” species, colonizing the universe, and ultimately creating an unfathomably huge population of conscious beings living what Bostrom describes as “rich and happy lives” inside high-resolution computer simulations.

This is what “our potential” consists of, and it constitutes the ultimate aim toward which humanity as a whole, and each of us as individuals, are morally obligated to strive. An existential risk, then, is any event that would destroy this “vast and glorious” potential, as Toby Ord, a philosopher at the Future of Humanity Institute, writes in his 2020 book The Precipice, which draws heavily from earlier work in outlining the longtermist paradigm. (Note that Noam Chomsky just published a book also titled The Precipice.)

https://www.currentaffairs.org/2021/07/the-dangerous-ideas-of-longtermism-and-existential-risk

It gets worse the further you read. "Longtermism" justifies almost anything on the grounds that humanity might one day create trillions of trillions of happy computer-simulated people, and that this outweighs any hiccups along the way, like climate disaster (as long as the human race isn't literally wiped out before a benevolent artificial superintelligence arrives that could make real humans unnecessary). The movement shuns people who criticise it, and by a complete coincidence it justifies spending lots of money on things rich white men love, like space travel and nuclear bunkers for them to survive in. Some adherents use it to put a higher value on a life in the developed world than one in the developing world. It has aspects of a religion or cult.
Replies:
#1: For those that don't fit in. — Anon-C, Jul 2021