Talk:Human extinction


Wiki Education Foundation-supported course assignment

This article was the subject of a Wiki Education Foundation-supported course assignment, between 9 September 2020 and 18 December 2020. Further details are available on the course page. Student editor(s): Yasseenhanafy.

Above undated message substituted from Template:Dashboard.wikiedu.org assignment by PrimeBOT (talk) 23:58, 16 January 2022 (UTC)[reply]

Existential risk draft

There is ongoing discussion about whether to move Draft:Existential risk to mainspace. If it is decided that the draft content should be merged into an existing article instead, I propose that the article be this one rather than the often-suggested Global catastrophic risk, for two reasons:

  1. Human extinction and existential risk are conceptually closer in terms of impact on humanity and methodological challenges.
  2. Global catastrophic risk is already long, and the current treatment of existential risk in it is awkward and confusing.

Such a merger could also entail making this article the primary home for existential risk content on Wikipedia, including receiving redirects for "existential risk" and related terms.

WeyerStudentOfAgrippa (talk) 17:41, 19 August 2020 (UTC)[reply]

@Vermeer dawn: What do you think of this option? WeyerStudentOfAgrippa (talk) 15:08, 22 August 2020 (UTC)[reply]

Omnicide

In my opinion, the Omnicide section should be merged with the Morality section. Rich (talk) 22:47, 11 September 2020 (UTC)[reply]

Managed to do it myself, hope you like it. Rich (talk) 23:05, 11 September 2020 (UTC)[reply]

sfn ok?

I'm planning to expand this article; does anyone object if I change it to WP:SFN cites when I do that? Lev!vich 05:34, 10 October 2020 (UTC)[reply]

The Centre for Applied Eschatology

I have added a mention of the CAE in the "research" section, and I believe it could be expanded, but all I have at the moment is the official website, which is WP:PRIMARY. I almost half-think that we are being trolled and they aren't actually serious about this, but stranger things do exist. Elizium23 (talk) 06:02, 19 November 2020 (UTC)[reply]

Broken Reference Link

Hi, the following reference link leads to a 404 server error: [1]

I tried to find the pdf, but didn't have success. Should this be removed? Or can someone else find the intended pdf?

References

Concerning the inevitability of human extinction

This concerns this change.

From: In the long run, human extinction might be inevitable, depending on the large-scale structure of the universe, which is not completely understood. For example, humanity is unlikely to survive the heat death of the universe or the Big Crunch unless new discoveries in physics either rule out these as the ultimate fate of the universe or illuminate some way to avoid them.

To: In the long run, human extinction is inevitable. For example, humanity will not survive the heat death of the universe or the Big Crunch.

See the edit history for arguments for and against, starting with this post. In short, I think the old text should remain, as it does not rely on sources but essentially just claims that we don't know. · · · Omnissiahs hierophant (talk) 21:44, 29 March 2022 (UTC)[reply]

"How do you know technology can not avoid heat death" ... How do you know technology can avoid heat death? You cannot claim something is possible without evidence. And you have no evidence for the claim that technology might be able to avoid heat death. All evidence points to the universe ending in some way. Technology is part of this universe. And what's part of the universe cannot change the rules of this universe. Take a game. The characters inside the game cannot change the rules of the game, no matter how complex the game is. The end of the universe means the end of technology. And there is no reason to believe technology can avoid heat death. It is how it is. And it's not up to the reader to decide. Until you provide evidence for the claim that "heat death might be avoided", along with an explanation of how it might be avoided, keep your fictional scenarios to yourselves 5.117.226.53 (talk) 21:52, 29 March 2022 (UTC)[reply]
Several times in human history we have totally flipped everything upside down with new discoveries: from classical Greek made-up philosophy-physics, to Newtonian physics, to quantum mechanics. The heat death, or the less likely Big Rip, are models made with our currently known laws of physics. It would not be strange, but rather the norm, if this happens a couple more times and we learn radically new science. Going from "we don't know, we might learn new things about the universe" to "we do know (everyone will die)" is the stance that requires sources. (Basically, to take the stance that the end of the universe is absolutely certain, you need to eliminate the possibility that current knowledge of physics will be updated. It is to mistake the model of reality for reality.) However, all this said, we should probably instead find sources (which might be difficult; it's eschatology, after all). · · · Omnissiahs hierophant (talk) 22:06, 29 March 2022 (UTC)[reply]
That's the "science was wrong before" fallacy. To quote Isaac Asimov: "[W]hen people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong. But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together"
Models can be updated and improved upon, sure. That doesn't mean all models are incorrect. What we do know is that the universe does have an end. Should there be new information, we'll update our stance. The claim is either "we know nothing" or "the universe has an end until proven otherwise". Either agnosticism or gnosticism. And the latter is preferred. We make assumptions with what we know. E.g. "Columbus came to America in 1492". Perhaps he set foot in the Americas some other time. Until there is evidence to the contrary, we will hold onto the assumption, as we should with the assumption that the universe will end one day 5.215.234.138 (talk) 18:16, 30 March 2022 (UTC)[reply]
I accept this version. It's not the end of the world (*drum-roll*). Buuuut, as a final note: the original text could perhaps have been written as little as 10 or 15 years ago and not have been terribly wrong then (i.e. large-scale structure). It does say something about how fast these ideas are changing. · · · Omnissiahs hierophant (talk) 21:08, 5 April 2022 (UTC)[reply]
Chiming in as my eyebrow launched. How can the inevitability of human extinction be asserted so absolutely?
This is a particularly grandiose question so I think it deserves great care.
The heat death of the universe and big crunch are hypotheses.
It can of course be argued that they look like the most likely futures based on our understanding of observations thus far. However, observations such as B-Meson decay now seem to conflict with the standard model, we've had string theory, m-theory, and lots of great science continues, but we're nowhere close to a widely accepted theory of everything. In terms of scientific discovery, these are very exciting times.
The original line allowed for new scientific observations and theories, avoided the bug of excessive certainty in probabilities, and inspired readers to keep their minds open to possibilities.
I say revert, but I remain open to persuasion.
I'll check back here in around a month to do so if there's no interim discourse. Tom Cowap (talk) 21:11, 4 May 2022 (UTC)[reply]
haha yes. i just gave up. a better version would maybe say something like "this-and-that theory implies an end of the universe, but no one really knows." Maybe one could also write something about how we know that these theories are wrong, as they do not explain everything, for example dark energy, dark matter, etc. · · · Omnissiahs hierophant (talk) 21:18, 4 May 2022 (UTC)[reply]
Lol, ok well I'll hold off in case anyone else has other angles to think about.
Maybe we discovered a new theory here - certainty decays into the uncertainty and exploration particles :) Tom Cowap (talk) 23:02, 4 May 2022 (UTC)[reply]
Read my response please:
> I have already explained. Hypotheses are testable predictions based on observation and evidence. There *is* evidence for the hypothesis that the universe will end, or else it wouldn't be a hypothesis. Knowledge does not have to be absolute. What knowledge is absolute? Nothing. Not even that you are a "human". Maybe you're a brain in a vat in an alien spaceship connected to something that makes you believe you are a human in this "simulation". Would you claim: "maybe I'm human. We wouldn't know"? Is everything going to be written in a "perhaps", "maybe" and "I don't know" format? Of course not
> Why you insist on this one goes to show you are afraid, very afraid, of the end. Until the day there is a new hypothesis, we know, not with absolute certainty, but reasonably based on given evidence, as we do with everything else, that the universe will end and humanity will go extinct at some point. Humans are not special. You too have an end 5.115.201.11 (talk) 22:33, 5 May 2022 (UTC)[reply]
the evidence that humanity might last forever is that we're considerably more intelligent than extinct species that have been observed. but i digress, as this kind of discussion is not the point of wikipedia 216.164.249.213 (talk) 08:49, 7 March 2023 (UTC)[reply]
That's quite a stretch, and not the evidence you're claiming
Humans are not intelligent, at least not always. They are irrational and willfully ignorant creatures engaging in behaviors that are not in their or anyone else's best interests. They will be their own undoing, I hope
Nothing lasts forever. Humanity won't last forever. That's common sense. And common sense doesn't need consensus 5.115.251.154 (talk) 14:11, 14 December 2023 (UTC)[reply]
Copy-pasting:
> It's arrogant to assume human extinction might not occur at some point. How do you know that it is not or might not be inevitable? 99% of all species have gone extinct, and humans dare assume they are different? How absurd. We will see what will become of humanity with the inevitable collapse of civilization. Then we will talk about "maybe"s and "we don't know"s 5.115.201.11 (talk) 22:42, 5 May 2022 (UTC)[reply]
i'm not sure why you left two replies. yes, humans are different than animals. also, you should read up on the purpose of talk pages on wikipedia. they are not meant for discussion about the subjects of articles. 216.164.249.213 (talk) 08:51, 7 March 2023 (UTC)[reply]
Humans are animals. We share many similarities with other animals/species 5.115.251.154 (talk) 14:15, 14 December 2023 (UTC)[reply]
I have already explained. Hypotheses are testable predictions based on observation and evidence. There *is* evidence for the hypothesis that the universe will end, or else it wouldn't be a hypothesis. Knowledge does not have to be absolute. What knowledge is absolute? Nothing. Not even that you are a "human". Maybe you're a brain in a vat in an alien spaceship connected to something that makes you believe you are a human in this "simulation". Would you claim: "maybe I'm human. We wouldn't know"? Is everything going to be written in a "perhaps", "maybe" and "I don't know" format? Of course not
Why you insist on this one goes to show you are afraid, very afraid, of the end. Until the day there is a new hypothesis, we know, not with absolute certainty, but reasonably based on given evidence, as we do with everything else, that the universe will end and humanity will go extinct at some point. Humans are not special. You too have an end 5.115.201.11 (talk) 22:32, 5 May 2022 (UTC)[reply]
It's arrogant to assume human extinction might not occur at some point. How do you know that it is not or might not be inevitable? 99% of all species have gone extinct, and humans dare assume they are different? How absurd. We will see what will become of humanity with the inevitable collapse of civilization. Then we will talk about "maybe"s and "we don't know"s 5.115.201.11 (talk) 22:38, 5 May 2022 (UTC)[reply]
We could reference the article Ultimate fate of the universe rather than reiterating it all in this article. People have already spent lots of time writing about it there, with sources and elaborate descriptions. If we cannot agree on anything, maybe it is better to simply link to that article.
Please refrain from making assumptions about the nature of other commentators' personalities, and keep strictly to the topic.
(Also, a theory is a hypothesis with evidence supporting it. A hypothesis is something that hopefully leads up to a theory, after a process of examination. A hypothesis without evidence is just an idea, which can be vague or specific depending on context, etc.) · · · Omnissiahs hierophant (talk) 10:14, 6 May 2022 (UTC)[reply]
The hypothesis has evidence to support it. I see no reason to dismiss it because people don't like the sound of it
And sure. You can reference the Wiki page :> 5.115.201.11 (talk) 17:15, 6 May 2022 (UTC)[reply]
yeah, maybe we can meet halfway. you maybe get what you want, as every single theory listed in that article ends with the certain death of humanity lol :) but it also says it's a topic of research · · · Omnissiahs hierophant (talk) 21:30, 6 May 2022 (UTC)[reply]
So a bit of progress, we are agreed:
1) knowledge is not absolute
2) we're going on assumption
I also agree there's no need to preface _everything_ with "perhaps", "maybe" etc.
However, this particular question is so important that it justifies including its philosophical qualification.
It's important because it's a big question with substantial associated risks.
For example, a reader suffering from depression could suffer more if they read such a bleak outlook and believe it to be 100% certain, even though we are agreed it is not.
Even people in good general mood can be made to feel down if they see an assertion like this going unqualified.
Social media campaigns bent on manipulation can link to the absolute assertion, and a lot of people put credence in wikipedia editorial standards.
Now, for those who already believe nothing matters, what's on wikipedia won't matter.
So I put the question - what motives might there be to convince others to erroneously believe that humanity is 100% certainly doomed?
For all we know, a state actor wishing to undermine public morale in English-speaking nations might want to spread doomsday cult absolutism.
Or tax-phobes wishing to rally people to predatory policies by convincing them that caring about anyone else is pointless in the end.
Even those who commit war crimes might try to convince people that human life is of little consequence and should attract little or no sanction.
Though an individual may simply be sharing their belief, we are at a time when publics are being radicalised by organised factions.
The more I think about it, the more I see the scales weighing firmly in favour of reverting to the complete form.
I also like Omnissiahs hierophant's idea to add a link to that page with hypotheses about the end of the universe; it's the educate-rather-than-indoctrinate solution.
Sound reasonable? Tom Cowap (talk) 13:00, 14 May 2022 (UTC)[reply]
I suggest a plain revert. The original text is somewhat close to where I think the discussion has been going anyway. And if a plain revert for some reason cannot be agreed upon, then something hyper-simple, like this single sentence: "The very long-term survival of humanity, and its descendants, depends on the ultimate fate of the universe." Basically, defer the discussion to that article; I guess it would be difficult to find an argument against that sentence. · · · Omnissiahs hierophant (talk) 08:05, 15 May 2022 (UTC)[reply]
Please read my response to Tom Cowap 5.116.191.28 (talk) 01:06, 18 May 2022 (UTC)[reply]
You wish to keep the sentence vague because you care about the species, while I wish to not keep the sentence vague because I don't care about the species. There is a conflict of interest. Keeping it vague for the reasons you listed is no less an indoctrination. You want people to feel a glimmer of "hope". That doesn't sound different from someone who doesn't want people to feel "hope"
I don't believe there is an innate value to humanity. Depression? Every day I wish to sleep and never wake up. "Humanity is 100% doomed" is not bleak or unpleasant to someone who constantly wants to skin themselves
Doubt a wikipedia page can stop a war crime. Humanity has committed war crimes since the dawn of civilization, before wikipedia was a thing. If someone wants to commit a war crime, they will. And no "humanity *might* not be doomed" will change that 5.116.191.28 (talk) 00:56, 18 May 2022 (UTC)[reply]
What I can agree with: "the very long-term survival of humanity, and its descendants, depends on the ultimate fate of the universe" as omnissiah suggested
What I can not agree with: vague language which is not used anywhere else despite "no knowledge being 100% absolute and certain" 5.116.191.28 (talk) 01:04, 18 May 2022 (UTC)[reply]
To clarify what I mean by "humanity has committed war crimes since the dawn of civilization"
Men colonizing and seizing lands, drawing arbitrary borders around specific geographic "territories" to build "nations" and mass-raping girls and women to humiliate each other and spread religious beliefs
That's what humanity has done since the dawn of civilization
And that's what humanity does today. E.g. Russian troops mass-raping Ukrainian girls and women
History is doomed to repeat itself. Wikipedia doesn't have an effect 5.116.191.28 (talk) 01:22, 18 May 2022 (UTC)[reply]
your argumentation is just a list of emotional expressions. i am updating the article with the minimal agreed sentence (that you now agreed with). · · · Omnissiahs hierophant (talk) 22:25, 18 May 2022 (UTC)[reply]
As if Tom Cowap's "arguments" weren't a bunch of emotional expressions?
And that's okay with me. Your edit is what I can agree with :D 5.115.78.218 (talk) 21:40, 19 May 2022 (UTC)[reply]
yes, they were not 216.164.249.213 (talk) 08:52, 7 March 2023 (UTC)[reply]
"All evidence points to the universe ending in some way" - This is not (even nearly) scientific consensus, whether or not it should remain in the article notwithstanding. 216.164.249.213 (talk) 19:59, 28 December 2022 (UTC)[reply]
It should be the scientific consensus 5.115.251.154 (talk) 14:16, 14 December 2023 (UTC)[reply]
"For example, a reader suffering from depression could suffer more if they read such a bleak outlook" I have been suffering from depression and suicidal thoughts for most of my adult life. I receive medical treatment for it. The thought that humanity is going to be extinct barely registers. Thoughts of my own mortality, and a series of deaths of people I loved or deeply cared about have had much more to do with my thoughts. Dimadick (talk) 05:20, 15 May 2022 (UTC)[reply]
Exactly 5.116.191.28 (talk) 01:13, 18 May 2022 (UTC)[reply]

“Hypothetical” end of human race

Should we change this to “inevitable”? The human race’s extinction is guaranteed to occur eventually, so it’s not really hypothetical. FinnSoThin (talk) 23:35, 24 May 2022 (UTC)[reply]

[citation needed] 216.164.249.213 (talk) 19:57, 28 December 2022 (UTC)[reply]

Based on a formalization of this argument, researchers have concluded that we can be confident that natural risk is lower than 1 in 14,000 (and likely "less than one in 87,000") per year.[2]

I have truncated this part of the article to take attention off what I feel are peripheral characteristics of the important claim rather than relevant aspects of the scientific consensus. Specifically, I have removed the "likely 'less than one in 87,000'" (which should have been written with the numeral "1", by the way, but I digress), for a reason I elaborated on in the edit summary. With that being said, this really does not get to the heart of the issue. Forecasts about forecasts actually project very little about the odds of an event: you can have some model, which is likely to be good, which forecasts the likelihood of an event as being very low, but another one that is arbitrarily high, since there is no explicit constraint on it, and still say "the likelihood of this event is likely <<low>>". Really, we should try to procure a more productive risk profile for the case of a natural extinction event, one that actually offers an explicit probabilistic forecast instead of a meta-forecast on forecasts about the event themselves. 216.164.249.213 (talk) 19:57, 28 December 2022 (UTC)[reply]

Risk estimates

(ping 216.164.249.213 (talk · contribs · WHOIS), Blaze Wolf, Thriley)

Regarding the IP editor's removal of the risk estimate table in [1]: bad behavior aside, I'm actually with them on the edit they're trying to make. That's a bunch of people making oddly specific guesses about the future, and duplicating the entire table here seems excessive for the amount of useful information it provides. How do we feel about removing the table and adding the preceding sentence ("In 2008, an informal survey of experts at a conference hosted by the Future of Humanity Institute estimated a 19% risk of human extinction by the year 2100.") to the following bulleted list? mi1yT·C 04:26, 13 March 2023 (UTC)[reply]

"That's a bunch of people making oddly specific guesses about the future" I mean, that's literally what weathermen do. But regardless, I'm fine with removing it. It's just the WP:STATUSQUO to keep it. ― Blaze Wolf (Talk | Blaze Wolf#6545) 14:12, 13 March 2023 (UTC)[reply]
Hey, sorry for the late reply. I trust you guys to make an informed decision. Just didn't like how some people were reverting the table without a second thought, especially given how that version of the article literally claimed said table was unreliable. i think i would be ok with retaining the same info/forecasts, as they seem to have garnered interest, but rearranging the section to reflect the fact that extinction-centric organizations are not considered reliable. 216.164.249.213 (talk) 03:08, 17 March 2023 (UTC)[reply]

Recommend update

The likelihood of human extinction through humankind's own activities, however, is a current area of research and debate.

Yeah, no? I realize the status quo is to keep one’s head in the sand and ignore the destruction of the Earth’s ecosystem, which keeps humanity alive, in favor of a dreamy, optimistic consumerism, but we’ve gone well beyond "research and debate" at this point. This sounds like a statement from 1989, not one informed by what we know in 2023. Things are not peachy and business as usual. Viriditas (talk) 00:40, 27 June 2023 (UTC)[reply]

Category errors & Redirect issues: New 'Omnicide' page?

Posting this on both the 'Human extinction' page and the 'Global catastrophic risk' page to suggest either (1) the creation of an 'Omnicide' page, (2) the creation of a separate 'Existential risk' page, and/or (3) the retitling of the 'Global catastrophic risk' page.

There are some gnarly issues with the terminology in this constellation of pages & page redirects: 'Human extinction' covers...human extinction. Whereas omnicide, properly understood in the context of the literature on this subject, refers to the extinction of all terrestrial life. We can concede that, while these may be potentially related domains, they are meant to describe consequentially and meaningfully distinct outcomes. Human extinction is, so to speak, a sub-domain of what is being referred to in the word 'omnicide'--as a header, it doesn't even remotely cover what is meant to be called up in the word omnicide. Meanwhile, 'mass extinction' no longer has the right ring, because most people entering middle age who have had access to a K-12 education have been aware that we're living through a mass extinction since they were young children. Mass extinction, in other words, begins to sound like a normative element, and is more likely to be associated with ancient, pre-human events than to evoke the threat of a future event without example in terrestrial history.

Meanwhile, "existential risk" redirects to "global catastrophic risk." The distinction here is even more subtle, but it's still a problem. Existential risk refers to all of the following: 1. the risk of omnicide 2. the risk of human extinction 3. the risk of a civilizational collapse so severe that it would evacuate the meaning or desirability of a continuity of human life. (The modifying phrase here is important: civilizational collapse, historically, encompasses plenty of situations that might have been experienced as desirable, or as less severe than the sort of situation that's being gestured towards in the term existential risk.) As a term for this constellation of existential threats, 'global catastrophic risk' appears to discount or downplay or fall short of the extremity of these potentials. "Global catastrophic risk" sounds like it could just as easily be applied to the risk of the bond market collapsing as to, say, the extinction of all terrestrial life. As a description of omnicide, or even of existential risk, the header "global catastrophic risk" shades into classical apocalyptic thinking--the end is conceived as potentially redemptive. Conceiving of or talking about the actual cessation of all terrestrial life, without the implication of cyclic reinvention or hanging onto the possibility of a silver lining, is avoided and repressed by apocalyptic thinking. Clarifying distinctions between these styles of thought about the complex of issues relating to omnicide or to existential risk requires some sort of revision in this space.

Either the Human extinction page or the Global catastrophic risk page could be reworked to cover what is being discounted, downplayed, or missed in this conversation. Really, it seems to me, the most appropriate move would be to make a new, separate Existential Risk page that has a slightly different emphasis and organization than the Global Catastrophic Risk page. But this is likely to be somewhat duplicative of the Global catastrophic risk page. Therefore: maybe a new Omnicide page? But that term is more exotic. Either option seems worth pursuing.

So the question I'm raising is: should we look into re-titling and revising the Global catastrophic risk page, or creating a new Omnicide page? [1][2][3]


  ThomasMikael (talk) 18:10, 23 January 2024 (UTC)[reply]

References

  1. ^ Moynihan, Thomas (2020). X-Risk: How Humanity Discovered Its Own Extinction. Falmouth: Urbanomic. ISBN 978-1-913029-84-5.
  2. ^ Mohaghegh, Jason Bahbak (2023). Omnicide: Mania, Doom, and the Future-in-Deception. Falmouth, UK: Urbanomic. ISBN 978-1-7336281-6-7.
  3. ^ Mohaghegh, Jason Bahbak (2019). Omnicide: Mania, Fatality, and the Future-in-Delirium. Falmouth, UK: Urbanomic Media Ltd. ISBN 978-0-9975674-6-5.