Longtermism poses a real threat to humanity
https://www.newstatesman.com/ideas/2023/08/longtermism-threat-humanity
“AI researchers such as Timnit Gebru affirm that longtermism is everywhere in Silicon Valley. The current race to create advanced AI by companies like OpenAI and DeepMind is driven in part by the longtermist ideology. Longtermists believe that if we create a “friendly” AI, it will solve all our problems and usher in a utopia, but if the AI is “misaligned”, it will destroy humanity…”
This is my first exposure to “longtermism” and it sounds like an interesting debate. But I’m having a hard time taking it seriously, because this is one of the dumbest neologisms I’ve ever heard.
Sorry for the similarly dumb rant, but: not only is it extremely vague given its very specific technical meaning (the author of the article has to clarify in the first paragraph that it doesn’t mean traditional “long-term thinking.” Off to a great start!), but its wordplay is about as artful as painting boobies on a cave wall. It sounds to my ear like a five-year-old describing the art of fencing as “fightystickism.”
It reminds me of casting a golden calf and then worshipping it as a god. There aren’t even the seeds of a solution to any social problem in LLMs. It’s the classic issue of someone being knowledgeable about one thing and assuming they’re knowledgeable about all things.
There are some seeds there. LLMs show that automation will eventually destroy all jobs, and that at any moment even things we thought would be unassailable for decades could suddenly find themselves at risk.
That plants a lot of seeds. Just not the ones the average longtermist wants planted.
I think we agree. LLMs, and automation in general, have massive labor-saving potential, but because of the way our economic systems are structured this could actually lead away from a utopia rather than toward one, as the “longtermists” (weird to type that out) suggest. The seeds, as you put it, are very different from ones that grow toward the end of conflict and want.
Oh, we totally agree; I was just agreeing with you in a slightly tongue-in-cheek manner.
It sounds good on its face, but it really goes off the rails the moment you listen to what any “longtermer” actually believes. Basically, it’s a libertarian asshole ideology that lets its adherents discount any human misery they may cause because they’re thinking about theoretical future people who don’t exist.
It’s an amazing ideology to have, because if you can conjure up a plausible future benefit, you can do any real evil and still feel like the good guy!
Stole billions in wages from your workers? It’s alright, the funds will be used for you to gain influence and guide humanity to a better future!
Released a bioweapon in some Global South country? It’s all right! Overpopulation would have been a danger five or six generations from now!
Destroyed democracy? It doesn’t matter. Fascism today ensures democracy in the year 4000, trust me bro!