Longtermism poses a real threat to humanity

https://www.newstatesman.com/ideas/2023/08/longtermism-threat-humanity

“AI researchers such as Timnit Gebru affirm that longtermism is everywhere in Silicon Valley. The current race to create advanced AI by companies like OpenAI and DeepMind is driven in part by the longtermist ideology. Longtermists believe that if we create a “friendly” AI, it will solve all our problems and usher in a utopia, but if the AI is “misaligned”, it will destroy humanity…”

@technology

  • laylawashere44@lemmy.blahaj.zone · 11 points · edited · 1 year ago

    Because it gives powerful people permission to do whatever they want, everyone else be damned.

    Both of the two major longtermist philosophers casually dismiss climate change in their books, for example (I have Toby Ord’s book, which is supposedly basically the same as William MacAskill’s, but earlier and better). As if it’s something that can just be solved by technology in the near future. But what if it isn’t?

    What if we don’t come up with fusion power or something, and solving climate change requires actual sacrifices that had to be made 50 years before we figured out fusion wasn’t going to work out? What if the biosphere actually collapses and we can’t stop it? That’s a solid threat to humanity.

    • wahming@monyet.cc · 6 points · 1 year ago

      No, it gives them a justification to do so. But is that actually any different from any other belief system? Powerful assholes have always justified their actions using whatever was convenient, be it religion or otherwise. What makes longtermism worse, to the extent it’s a threat to humanity when everything else isn’t?

      • AnonStoleMyPants@sopuli.xyz · 2 points · 1 year ago

        I don’t think so, personally. The only difference might be that tech billionaires probably consider it more “their thing” than religion or whatever. Hence, quite bad.

      • Thrashy@beehaw.org · 4 points · edited · 1 year ago

        Along the lines of @AnonStoleMyPants – the trouble with longtermism, and effective altruism generally, is that, unlike more established religion, it’s become en vogue amongst the billionaire class specifically because it’s essentially just a permission structure for them to hoard stacks of cash and prioritize the hypothetical needs of their preferred utopian vision of the future over the actual needs of the present. Religions tend to have a mechanism (tithing, zakat, mitzvah, dana, etc.) for redistributing wealth from the well-off members of the faith to the needy in an immediate way. Said mechanism may often be suborned by the religious elite or unenforced by some sects, but at least it’s there.

        Unlike those religions, effective altruism specifically encourages wealthy people to keep their wealth to themselves, so that they can use their billionaire galaxy brains to more effectively direct that capital towards long-term good. If, as they see it, Mars colonies tomorrow will help more people than healthcare or UBI or solar farms will today, then they have not just a desire, but a moral obligation to spend their money designing Mars rockets instead of paying more taxes or building green infrastructure. And if having a longtermist in charge of said Mars colony will more effectively safeguard the future of those colonists, then by golly, they have a moral obligation to become the autocratic monarch of Mars! All the dirty poors desperate for help today aren’t worth the resources relative to the net good possible by securing that utopian future they imagine.

        • lloram239@feddit.de · 1 point · 1 year ago

          “effective altruism specifically encourages wealthy people to keep their wealth to themselves, so that they can use their billionaire galaxy brains to more effectively direct that capital towards long-term good.”

          And how is that a bad thing? The alternative is to spend money on stuff that doesn’t work or is even actively harmful. The argument here is literally “use less brain, do more pointless feel-good measures.”

          “If, as they see it, Mars colonies tomorrow will help more people than healthcare or UBI or solar farms will today”

          Are we forgetting that Musk has an electric car company and used to have a solar company (since absorbed into Tesla)? He doesn’t just want to go to Mars; he does a lot of other stuff as well. Also, why should billionaires be responsible for UBI and healthcare? If Musk spent all his money on healthcare, you’d have healthcare for about three months before he went bankrupt. That kind of stuff is the government’s job.