DK_en 2x02 - Algojerks towards the thousand-year Reich
Episode first aired on 24 December, 2022. Listen on Spreaker.com
Today, an episode to close the year with a new looming threat, as befits any self-respecting serial.
One could spend all day talking about Elon Musk; the guy has the stage sense of a seasoned standup comedian.
Take, for example, the idiotic poll about his resignation and his comment: 'I'll resign as soon as I find someone crazy enough to take the job, and I'll just keep the direction of the development team and the server team', which is a bit like saying that you hand over the driving to someone else but keep hold of the steering wheel and the engine.
But this is still ordinary manchild material.
On the other hand, one of his latest gimmicks is more interesting. You may have heard it: a journalist asked him, 'Why did you buy Twitter?' And he replied:
My biological neural network concluded that it was important to buy Twitter. If Twitter had not been bought and steered in a good direction, it would have been a danger to the future of civilisation. That's why I bought it.
Ian Miles Cheong @stillgray, 12:04PM Dec. 5, 2022
- Now, it could be argued that someone who refers to himself as his 'biological neural network' knows about as much about neural networks as 1980s cyberpunk novels did. But from the technology manchild we expect nothing less, don't we?
- Or we could note that he talks about himself in the third person, like Maradona. The difference being that Maradona with a ball at his feet was a god, while Musk with a company in his hands is just a con artist.
- Or we could consider that other bullshit, that Twitter could be a danger to civilisation. With a Twitter user base below five hundred million, we can confidently say that about six and a half billion people not only don't lose sleep over Twitter, they say "Twitter who?".
All of this would be funny, but as an Italian, I know that one should never play the game of successful charlatans. Because they are charlatans, but successful ones, which means they have that kind of animal intelligence capable of dragging you down to their level and then beating you with experience.
Instead, one should recognise Musk as just the media-ready court jester of an extremely dangerous ideology that apparently is in great vogue among the uselessly mega-rich. (OK, "uselessly" is a pleonasm, I used it once just to make things very clear).
One can tell this from the one apparently ridiculous thing he says: the threat to civilisation. That statement is interesting and worrying, not for the Twitter reference, but for the claim about civilisation itself.
Let's think about that for a moment. Musk doesn't say 'a danger to free speech', which would at least be one of his favourite soundbites, or 'a danger to democracy', or at least 'a danger to US social media dominance'.
No, Musk specifically speaks of a 'danger to civilisation', no less.
And this, in my opinion, tells us a lot more than he puts into words.
For one, it tells us that Musk believes his actions have, or at least should seek to have, universal effects.
Musk does not justify his actions from a business perspective, or from a market perspective, or from the perspective of one or more countries.
Considering that if we speak of 'civilisation' in this century, we can only mean human civilisation, Musk is claiming to be acting from nothing less than a planetary perspective.
You will admit that even for a megalomaniac narcissist this is not a normal rant; it is something so off the scale that it cannot just be part of the mask. Here we're facing something genuinely strange.
I see two options:
- either Musk has lost all contact with reality, and then we are in a clinical situation, or,
- he really means what he says, and then we have to understand the scope and consequences.
Let's indulge in the only genuinely revolutionary act in these times and think about it.
In my opinion, we are faced with two questions:
- this story about the risk to civilisation: is it a way for a narcissist to sound important or does it really express Musk's motives, or at least his rationalisations?
- If Musk really believes in the bullshit he says, what can it mean? We are not talking about an isolated lunatic here, but the second or third richest man in the world.
Let's take as a working hypothesis that Musk does believe what he says when he claims that Twitter may pose a danger to civilisation.
We should then expect behaviours that escape traditional cost-benefit analysis, simply because the scale on which Musk assesses them is enormously larger.
Are we observing these behaviours? I would say so. If we look at what Musk has done since he bought Twitter, there is nothing that makes normal sense. So, either Musk has gone off the rails, or he is acting on a different plane than the one we are using to understand him.
And mind you, I'm not even remotely taking Musk's motives at face value. I'm not believing that Twitter might pose a 'danger to civilisation', I'm just trying to assess whether Musk's actions might confirm that he actually believes what he says.
This is important because a person who truly reasons in terms of benefits and risks to the whole of civilisation will necessarily have a very different value system from someone who, for example, has a company's quarterly result, their career or even simply the end of the month as their horizon.
If Musk's motives are genuine, then it makes sense to have poured $44 billion into Twitter, and it certainly makes sense to do whatever it takes to keep Twitter in check and prevent it from endangering civilisation. And it certainly also makes sense to do a lot of other things that in a normal perspective would be ridiculous, absurd or forbidden, like turning offices into dormitories and work into indentured servitude.
Seen from this perspective, the redundancies, the collapse of users, the flight of advertisers are just a small price to pay:
- the fire brigade does not care about parking bans or house furnishings when putting out a fire.
- The surgeon does not hesitate to amputate a limb to save a life.
Suddenly things take a slightly disturbing turn. If indeed the second or third richest man in the world acts for the kind of reasons he says, then we can expect many more inexplicable actions that are off the scale of normality.
And staying within normalcy is important when someone can individually command the financial means of a small state.
Time for another question. Is this whole danger to civilisation thing Musk's pipe dream, or is it a thing?
Unfortunately for us, it is a thing. It is actually peddled as a respectable viewpoint, possibly even a philosophy. It is called longtermism, or sometimes also Effective Altruism, and it is time we talked about it.
Long-termism, in a nutshell, says that the only issue worth dealing with is the long-term survival of human civilisation. Long-term does not mean one hundred or two hundred years, but millions or billions of years. Long-termism takes itself literally: the 'long term' is the time remaining until the Universe ends, a period of time on which there is no scientific consensus, but which we can generically consider to be in the order of trillions of years.
A long-termist starts from the simple observation that in the lifetime of the Universe, if human civilisation expands in the Galaxy, at least 10^50 human beings may live.
- A hundred thousand
- billions
- of billions
- of billions
- of billions
- of billions. And this is one of the most conservative estimates.
Such an enormous number makes not only the seven billion people living on the planet today, but also those few other billions who have lived since the first Homo sapiens, completely insignificant. Drop-in-the-ocean-level insignificant.
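The scale gap described here can be sketched with a few lines of arithmetic (the 10^50 figure and the seven billion alive today are from the text; the roughly hundred billion humans ever born is a common demographic estimate, assumed here for illustration):

```python
# Longtermist arithmetic: how today's humanity compares with the
# projected 10^50 future human beings.

future_humans = 10**50     # conservative longtermist projection (from the text)
alive_today = 7 * 10**9    # people alive today (from the text)
ever_lived = 100 * 10**9   # assumption: rough demographic estimate of all humans ever born

share_today = alive_today / future_humans
share_ever = ever_lived / future_humans

print(f"Everyone alive today: {share_today:.1e} of the projected total")
print(f"Everyone who ever lived: {share_ever:.1e} of the projected total")
```

Both ratios come out around 10^-40, which is the whole rhetorical trick: on that denominator, any number of actual living people rounds to zero.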
For a longtermist, the historical task of humanity is precisely to enable and ensure this grandiose cosmic potential.
Some background information.
Over the next hundred million years or so, there is a strong possibility that the Earth will be hit by an asteroid the size of the one that caused the extinction of the dinosaurs, and that any form of civilisation on the planet will be wiped out.
For a longtermist, it is therefore imperative that humanity become a multiplanetary civilisation, colonising Mars, and probably some of the more promising moons of Jupiter and Saturn.
Five billion years from now, the star we know as the Sun will run out of hydrogen and begin to fuse helium. This will turn it into a red giant, whose diameter will grow to swallow Mercury and Venus. The Earth may either be swallowed or burned to a crisp, and Mars would probably be reduced to a cinder.
For a longtermist, this implies that humanity cannot limit itself to the Solar System, but must directly set itself the goal of colonising the Galaxy. As an intermediate goal, humanity must take direct control of its own evolutionary path, overcome the limitations of natural selection, and introduce with its own technology the genetic mutations necessary for projects that must extend over millennia: prolonged longevity, increased physical performance, improved metabolism, probably the ability to hibernate for long periods, and of course, at least for those few who will have to take responsibility for leading humanity towards the realisation of its potential, amplified cognitive faculties.
The numbers involved, both in terms of time and the number of potential future human beings, are such that any event that is not directly an existential threat to the species is a negligible event. Again, drop-in-the-ocean level negligible.
In the longtermist view, not all problems that you and I might call planetary are worthy of attention.
For example, for a long-termist, climate change, or a nuclear conflict, are not really big problems. Even if they were to cause the death of 80 or 90 per cent of the world's population, the existence of the species would not be called into question, we have the technologies and resources to ensure the survival of a reasonable portion of the population even at those junctures.
When talking about population and survival, one has to start making distinctions. In the longtermist view, the inhabitants of the technologically developed north, for example, have better access to resources, and therefore more opportunity and competence, when it comes to using their scientific and technical skills to ensure that humanity does not fall off the path towards its cosmic potential.
When talking about surviving climate change or nuclear conflict, it is therefore important to distinguish who should survive. This is why the longtermists are very keen to purchase huge tracts of land in remote parts of the planet, and equip them to ensure the survival of the longtermists, their families, and their servants, that is the service workers who can cater to the needs of the community and, of course, defend it against other negligible survivors.
Those who do not have the means to create such communities or the luck to work in them will not fare as well. But as we said before, any number of living individuals sacrificed at any point in time is insignificant compared to the 10^50 human beings that humanity will be able to boast by realising its cosmic potential.
For these reasons, a longtermist has very specific interests when it comes to technology:
- robotics, because the creation of anthropomorphic robots makes it possible to reduce the number of actual humans (especially less desirable ones) needed for survival; also, robots are more easily controlled than humans;
- artificial intelligence, because the possibility of the development of an autonomous cybernetic super-intelligence, hostile to humanity, constitutes an existential danger to the species, and therefore every effort must be made to create Artificial Intelligences that are merely servile instead. In essence, the creation of cybernetic slaves.
- the colonisation of Mars and the initiation of interstellar colonisation projects.
More mundane issues, on the other hand, such as peacekeeping, government collaboration, climate change, hunger and poverty, or overcoming the capitalist model, are negligibly minor. Yes, they may be temporarily important, but they are not fundamental to the survival of the species and therefore have too limited a horizon to be truly engaging.
It should be obvious that not everyone can deal with such broad horizons. If you work for a living, say, and are already struggling to pay your mortgage, your child's college tuition, and have no idea how you are going to secure a pension, you certainly don't have the energy, or the resources, or maybe even the interest to worry about how to colonise the planets around Proxima Centauri.
Even if you are a politician, your re-election will depend enormously more on geographically and temporally limited issues than on initiating, or maintaining, projects spanning centuries or millennia.
Who, then, can embrace longtermism?
- Someone who has achieved full economic independence,
- someone with ample resources to invest,
- someone who yearns for cultural and political clout commensurate with their wealth,
- someone who needs a higher cause that can guarantee them permanent control of their wealth against those who blather on about redistribution,
- and someone who perhaps, in deference to the Calvinist view so dear to the rich, believes that their own success is an indication of God's favour, and that this places them above the rest of humanity.
Longtermism not only promotes and justifies the accumulation of obscene levels of wealth, necessary to take on projects spanning centuries or millennia; it also turns the obscenely rich into the chosen ones to guide humanity towards achieving its cosmic potential.
If you are one of the megarich, longtermism offers you a place in the cosmic order. Suddenly you're not just someone with money and nothing to do with it, no: your money makes you one of the chosen few who can discuss the future of all humanity. And you are important and valuable, because people with fewer resources cannot attain the level of cosmic perspective that lets you and your ilk confidently plan across millennia.
Longtermism not only lets you feel necessary and important enough to plan and shape the entire future evolution of civilisation; it also sets you free of any practical, ethical, and legal considerations.
Compared to the fate of 10^50 people, any other considerations are marginal. If you move within this conceptual framework, you can do anything you want. Preferably not with your own money, of course, but by securing public funding for your private fantasies, in the best tradition of the rich of all time.
And so we have Musk creating SpaceX to colonise Mars with NASA money, and buying Twitter on debt to avoid a risk to civilisation.
We have Gates writing books on how to avoid future pandemics and influencing international reaction to CoVid through the Gates Foundation.
We have Peter Thiel (the one from Palantir) and Zuckerberg and Bezos buying up huge estates in New Zealand and Hawaii to build doom-proof residences.
We have foundations such as OpenAI (the one behind ChatGPT) advancing the myth of Artificial Intelligence and dictating what the viable lines of research are.
We have conventions, conferences, research institutes with pompous names like:
- Centre for the Study of Existential Risk
- Future of Humanity Institute
- Future of Life Institute
that are concerned with the problems of the next millennia, draining funds and talent from actual problems to cater to the pet projects of the megarich.
And then, of course, a whole constellation of biotech research institutes that just slightly border on eugenics, typically in the form of assisted fertilisation institutes that not only screen out common genetic defects in embryos, but can also engineer in some desirable traits: eye colour, hair colour, cognitive potential.
After all, the future of mankind cannot be guaranteed without the creation of a superior race capable of taking responsibility for projects spanning millennia.
If you haven't yet caught the drift here, let me break it to you: we are facing a Nazi-like cult for the use and consumption of techno-fetishist megarich algojerks.
Try reading the pronouncements of the megarich from the perspective of the longtermist technocratic Nazi cult. Suddenly, what sounded like the delusions of grandeur of useless, bored rich people takes a much, much more disturbing turn.
However much they may try to cloak it in scientific pretensions and futuristic slogans, we are faced here with the same old story.
Over and over in history, there have been those claiming to act for the greater good as they defined it; humans claiming to be doing God's work; larger-than-life visionaries with goals so holy and grandiose that uncounted numbers of ordinary people may be sacrificed in their pursuit.
We know what interests they are really pursuing, and we know how it ends.