Panopticon

Based on two pieces by Nick Bostrom:

The articles argue that we need a sophisticated global panopticon to avoid complete global human destruction. The basic argument is that as we keep inventing new technologies, eventually we will invent something that has the potential to wipe all humans off the face of the earth. This was the case with nuclear weapons; however, nuclear weapons are non-trivial to create. That difficulty has, so far, basically prevented complete destruction by nuclear weapons.

What Nick Bostrom is talking about is a made-in-the-kitchen technology that could have the same or greater effect as nuclear weapons. If such a technology made it into the mainstream, we could basically say good night.

The potential for disaster is presented by a class of actors that the articles call the apocalyptic residual:

[…] and there are some actors (‘the apocalyptic residual’) who would act in ways that destroy civilisation even at high cost to themselves.

It is this apocalyptic residual from which we need to protect ourselves. Both articles are very well argued and make a solid case for worldwide surveillance and a global government taking charge in order to protect us from apocalyptic residuals.

The articles make superb arguments for a 1984 scenario. But what are the potential dangers of such a system, besides the obvious potential for total global corruption once there is a single global government?

Under a system that makes all humans behave the same, act the same and censor the same, one could argue that the measures advocated would leave humankind less resilient to viruses or other evolutionary crises. Human biological harmonisation.

Living in a constant state of - potentially - being watched, at any time, would certainly have a negative emotional impact on certain parts of society, and that would lead to an increase in depression and suicide. On the other hand, after one or two generations this might well become the new “normal” and no one would know the difference. Move on, nothing to see here.

Human Longevity

Since the main aim of the articles is to ensure the longevity of humans, the question, seen from another perspective, becomes: which will lead to human extinction more speedily, the invention of a table-top technology that can wipe us out, or the measures we take to prevent that technology from being invented?

What the articles potentially also miss is an invention that might wipe out a majority of humans, but not all, leading to a world with less conflict: fewer humans means fewer chances of meeting another human, and wars only happen when humans meet other humans.

Or what would happen if we invented a technology that led to world peace and harmony? That is to say, what if we invented something (perhaps similar to a global universal basic income) that made humans behave better towards one another? One should be fearless enough to pose these questions.

It can also be said that humans are very resilient: over the eons we have come and gone, sometimes fewer and sometimes more, but somehow we have always survived. So we will probably continue to survive. (Although one could also argue that, in comparison to a lot of other lifeforms on this planet, we haven’t even evolved out of our evolutionary nappies.)

Even if this means leaving native tribes in the Amazon and Papua New Guinea alone so that they could “seed” the next round of human evolution. These tribes are the last humans who still know how to live with nature and survive in a world without electricity or creature comforts.

If someone who is used to having their food delivered, their thoughts implanted, and their internet beamed into their home were the only survivor after a “near miss” attempt at wiping humans off the face of the earth, how long until all humans would be gone? Hence the Amazonians.

Although, unfortunately, they also seem to want to annihilate themselves. Which does raise the question of whether modern weapons are, for native tribes, the kind of apocalyptic tool that Nick Bostrom describes.

So is it really worth giving up our creativity, individualism and freedoms for some potential threat that might never materialise? For those who have read 1984, this threat seems similar to the enemies of Oceania, which never quite get defeated but never quite defeat Oceania either. Thus the basis for restrictions and constraints is maintained.

Basically: do not let governments control the future by ruining the present.

Great Walls

The Great Wall of Japan is another example of a government trying to control the future and ruining the present instead. Staring at a concrete wall might save lives sometime in the future, but it definitely ruins the quality of life today. Sadly, this is what predictions are all about.

Reincarnation and Resurrection

And going off on a tangent, perhaps the reintroduction of reincarnation into Christian teachings would make this place a better one. The basic tenor is: believing you will come back here makes you act differently than believing you will leave and go to heaven.

Corona

Corona/Covid/Sars is also, slowly but surely, becoming an Orwellian type of justification for complete state-based control. We can’t quite completely get rid of it, so you’ll have to stay at home.

Numbers go down, some measures are repealed, people spread covid, numbers go up, and lockdown follows. Rinse and repeat.

Only with each repetition, the state’s controls over its people are increased - just that little bit more.

Extreme Views?

These ideas can also be seen as preparation for changes that aren’t quite as extreme but are still unacceptable in the current state of affairs. Extreme points of view can be used to get people to accept things they would never otherwise accept, unless there is an even worse scenario on the table.