The Singularity is a hypothetical future event where technology growth becomes uncontrollable and irreversible, leading to unpredictable transformations in our reality[1]. It’s often associated with the point at which artificial intelligence surpasses human intelligence, potentially causing radical changes in society. I’d like to know your thoughts on what the Singularity’s endgame will be: Utopia, Dystopia, Collapse, or Extinction, and why?

Singularity Endgame: Utopia, Dystopia, Collapse, or Extinction? (It’s actually up to you!)

Citations:

  1. https://www.techtarget.com/searchenterpriseai/definition/Singularity-the

  • Candelestine@lemmy.world

    Utopia or extinction, depending on the perspective of the person asking. Homo sapiens cannot exist forever; that would require a halt to DNA mutation and biological adaptation. Will “we” still be here even after we’ve begun to require a different classification term for ourselves, just for scientific clarity?

    • OutOfMemory@vlemmy.net

      I think for the purposes of OP’s question, we can ignore genetic evolution. That takes place over hundreds or thousands of generations, and recorded history hasn’t been around that long.

  • axtualdave@lemmy.world

    In the short term, a series of collapses as we edge ever closer to that singularity. There are a great many constraints on our ability to grow while on Earth, and it’s proving difficult to get off the planet by any reasonable method with our current technology. I suspect we’ll need to fall down and rebuild a couple of times before we can reliably spread to other planets, or even simply exist in orbit.

    Once we get up there, though, and we’re no longer constrained by Earth’s resource limits, we’ll grow significantly. I suspect we’ll move toward a machine-based society, not only in automation and robotics, but also in integrating technology into our bodies.

    At some point, someone is going to figure out how to do that mind-to-machine transfer, and we’ll diverge as a species: the organic humans, and the composite AI/machine-based humanity.

    Knowing how stupid we are, though, we’ll probably end up becoming the Borg.

  • bloodfart

    There will not be a singularity. Global capitalism will absolutely collapse, and on its way down it will become more dystopian. Humanity isn’t going extinct.

    E: the cause of this process is not human nature. Anyone who tells you it is has simply failed to study history. We can have a utopia but global capital has to collapse first to make space for it.

  • queermunist@lemmy.world

    There are too many structural problems with the extractive economy for our current society to survive. As resources dwindle and climate change gets worse the smaller countries will start to collapse and entire regions will go to war over resources. Billions of humans will be forced to migrate out of uninhabitable zones around the globe and they’ll do anything to escape. The ones that can’t escape will eat each other (metaphorically and literally).

    There won’t be a singularity. There probably won’t even be a global internet in 30 years.

  • InternetPirate@lemmy.fmhy.ml (OP)

    As Connor Leahy says, companies are stuck in a race to the bottom where the only thing that matters is being the first to achieve AGI, even at the expense of security. I believe that unless things change significantly, we are heading towards extinction. We might create a very powerful AGI that simply doesn’t care about us and ends up destroying us. This wouldn’t be because the AGI is inherently evil, but simply because we would be in its way, much like how humans wouldn’t care about ants when building a road. I wish more people were discussing this issue because in a few years, it might be too late.

  • mrmanager@lemmy.today

    Well, let me put it this way… Enjoy your days now, not later. :)

    And prepare to move to a country where tech is not very widespread. Try to gather money so you can move if you want to.

    Humans can be really nice on an individual level, but society is run by evil people. I think it has always been that way. Good people don’t want any part of the power struggles and backstabbing, so they forfeit power to the people who are into that. By design, the system rewards evil people. And they are also the ones who really care about money, status, and so on.

    This means humanity is fucked. It’s pretty simple. Unless consciousness somehow changes in everybody at once, and everyone suddenly wants to do good instead of evil. Then we have a good chance. The tech can help build a paradise here for everyone.

    But that won’t happen unless good aliens somehow transform our minds into something completely different.

  • Hemingways_Shotgun@lemmy.ca

    All of the above.

    Humanity is, at its core, motivated by self-interest. The singularity will be harnessed by those with the power and means to do so, while those who don’t have them will either suffer or die.

    The powerful few will adapt to the singularity, using it to craft their own utopia. The masses, without access to the power the upper class enjoys, will fall into a dystopia, while even more marginalized strata of society go extinct, completely unnoticed.

  • 5 Card Draw@lemmy.fmhy.ml

    Almost every comment I’ve seen treats the future as hopeless, and I’m going to largely chalk that up to the postmodernist/realist consciousness of our society in this period.

    I think the future will be a utopia, and there isn’t a long-term (I mean centuries- or millennia-long developments) reason to think otherwise. The idea of utopia has pushed civilization to confront power structures and create new ones, and to rethink what was impossible, too difficult to accomplish, etc. The many rights, freedoms, and ideas that many around the world take for granted today began as people envisioning a utopia and trying to make it happen. These ideas can’t be done away with, as Alexis de Tocqueville observed.

    Right now there are problems for sure, and I personally think liberty and equality are only a parody of utopia at this point, but that will change over a long time.

    Human civilization is only 6000 years old! We’re still working with the brain of primitive humans, and we aren’t even toddlers yet in the grand lifespan of Earth. I think people tend to forget that sometimes.

    We’ll get to a better place, and our consciousness is always changing to confront the problems we face today (biosphere collapse, resource hoarding, infighting, etc).

    Democracy took centuries to develop coherently, and even then it failed MANY times at first. But look at it now.

  • erogenouswarzone

    I’ll do you one better. What about when our AI meets another AI?

    Our existence is based on death and war. There is a lot of evidence to suggest we killed off all the other human-like species, such as the Neanderthals.

    And that is the reason we progressed to the developed world and society we know today, while all the other species are just fossils.

    We were the most aggressive and bloodthirsty species of all the other aggressive and bloodthirsty alternatives, and even though we have domesticated our world, we have only begun to domesticate ourselves.

    Think about how we still have seen genocides in our own time.

    Our AI will hopefully pacify these instincts. Most likely not without a fight from certain parties that will consider their right to war absolute.

    Like the One Ring, how much of that aggressiveness will get poured into our AI?

    What if our AI, in the exploration of space, encounters another AI? Will it be like the early humanoid species, where we either wipe out or get wiped out ourselves?

    Will our AIs have completely abstracted away all the senseless violence?

    If you want a really depressing answer, read the second book of The Three-Body Problem trilogy, The Dark Forest.

  • redballooon@lemm.ee

    The singularity has already happened. We have corporations that are effectively unregulatable. They create their own rules and use those rules to grow further, at the cost of all our resources. AI will be used by those corporations to grow further, but it won’t be the game changer; we’re already living in, and expanding, the dystopia.

  • FlashPossum@social.fossware.space

    Transformation. Humans as we know ourselves don’t have a future. But I think there will be some kind of merger, driven by progress in AI and genetic engineering.

    We will have descendants, but they will not be humans as we know them.

  • benjithedog@lemmy.world

    I believe collapse is inevitable. More interesting is what comes after. If we reach true AI before the collapse, it could go either way afterwards, but I’m hoping people will create a better society from the ashes.

    At least for the time we’ll have left, because AI or no AI, the climate won’t be getting fixed any time soon.