Digitalization is the integration of digital technologies into everyday life: the conversion into digital form of the objects and events surrounding individuals and society.
More and more countries are adopting national programs for the development of artificial intelligence. Internet of Things technologies and systems such as the smart home are developing rapidly. The first signs of a total digitalization of the economy appeared with the advent of the first cryptocurrencies; since then, many countries have announced plans to convert their national currencies to digital form.
For example, China is already testing the digital yuan, though so far only on a limited scale.
The trend is clear: within 10-20 years, humanity will enter the information era, digitalization will become an integral part of our civilization, and every person will occupy their own digital place in this system, leaving a trace with every action.
All kinds of horror stories (the emergence of malicious artificial intelligence, an uprising of machines against humans, and so on) are too speculative, even outright fantastical.
There is a threat far more real and sinister than the events depicted in the Terminator films.
Probably every experienced PC user has encountered a situation where a perfectly serviceable computer suddenly freezes, reboots, or shuts down entirely: a "blue screen of death" appears on the monitor, and all sorts of bugs pop up in games and programs.
In 99% of cases this is a software error, a flaw in the code. In the remaining 1%, however, it is an isolated anomaly that never recurs.
Such an anomaly is the failure of a perfectly working device in which the cause of the error cannot be identified at the software level. Consequently, it is impossible to trace any pattern in its manifestation or to prevent it from happening again.
The microelectronics industry began to encounter such anomalies as early as the 1970s, when integrated circuits became critically small.
It became obvious almost immediately that the source of these failures is high-energy particles of cosmic radiation. Striking an integrated circuit, a particle can disable it completely by inducing secondary ionization in individual elements of the circuit (fortunately, in the vast majority of cases particles pass through the circuit without causing physical harm).
Most often, cosmic rays alter the potential of a microcircuit's memory cells, flipping a bit of information from zero to one, for example.
This effect is known as a "single-event upset" (SEU).
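The mechanism is easy to illustrate: flipping a single stored bit silently changes the data, and no exception or error is raised. A minimal sketch (the values are illustrative):

```python
def flip_bit(value: int, bit: int) -> int:
    """Invert a single bit of `value`, as a single-event upset would in a memory cell."""
    return value ^ (1 << bit)

# A flip in a low-order bit changes the value only slightly...
print(flip_bit(8, 0))    # 8 -> 9
# ...while the same event in a high-order bit changes it drastically.
print(flip_bit(8, 30))   # 8 -> 1073741832
```

Note that the corrupted value is still a perfectly valid integer; without redundancy, nothing in the hardware or software can tell it apart from correct data.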
The consequences of such a "single failure" depend on where exactly the particle struck and what effect it had on the microcircuit. The program may simply report a critical data error, or reboot automatically in an attempt to restore operation, or continue running while producing erroneous data, which can trigger a cascading chain of further errors.
It is fortunate if the affected program is isolated (the inverted bit can simply be rewritten to its correct value). But what if it is publicly accessible and serves as a data source for other programs or subroutines, responsible, for example, for the safe execution of processes dangerous to people or society?
The situation is further aggravated by the shrinking of integrated circuits. In circuits manufactured on a 22 nm process, the frequency of single-event effects is an order of magnitude higher than in circuits on a 130 nm process. At 22 nm, a single cosmic particle can strike not one but several logic elements of the circuit at once. The incidence of irreversible damage to the microcircuit is also rising.
This problem is especially acute today in the space industry: the most common cause of satellite failure is the failure of onboard electronics.
Ordinary consumer electronics are generally unsuitable for operation in space: they fail almost 100% of the time within a short period, whereas satellites need to operate for years. All space electronics are therefore manufactured to special standards, with maximum shielding and duplication across several parallel channels. That is why their price is extraordinarily high and they see nothing like mass production and use.
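One common form of the duplication mentioned above is triple modular redundancy (TMR): a computation runs on three independent channels and the outputs are put to a majority vote, so a single upset in any one channel is outvoted. A minimal sketch, assuming a simple value-level vote (the function names are illustrative, not taken from any real flight software):

```python
from collections import Counter

def majority_vote(a, b, c):
    """Return the value that at least two of the three redundant channels agree on."""
    value, votes = Counter([a, b, c]).most_common(1)[0]
    if votes < 2:
        # All three channels disagree: more than one fault, cannot correct.
        raise RuntimeError("uncorrectable fault: no majority among channels")
    return value

# Channel B suffers a bit flip; the vote still yields the correct result.
print(majority_vote(514, 514 ^ (1 << 12), 514))  # -> 514
```

The cost of this design choice is obvious: roughly triple the hardware, power, and mass for each protected computation, which is part of why space-grade electronics are so expensive.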
For example, some satellites use integrated circuits built with silicon-on-sapphire (SOS) technology, which provides radiation hardness under space conditions.
Consider the most socially significant cases of computer failures caused by spontaneous bit flips:
– In 2003, during an election in Belgium, a single-event upset caused by an ionizing particle flipped one bit of information from "0" to "1", assigning a candidate an extra 4096 votes on top of the 514 actually cast for them.
– In 2008, a particle of ionizing radiation flipped a bit in the inertial navigation and flight control unit of an Airbus A330, producing incorrect data on angle of attack, altitude, and airspeed. The aircraft's autopilot responded with a downward pitch so violent that 119 passengers suffered injuries of varying severity. The error also triggered a cascading failure in several onboard systems, making it impossible to continue the flight safely.
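The Belgian incident bears the textbook signature of a single flipped bit: the surplus of 4096 votes is exactly 2^12, so inverting bit 12 of the stored count of 514 yields 4610. A quick check:

```python
actual = 514
reported = actual ^ (1 << 12)    # bit 12 flips from 0 to 1
print(reported)                  # 4610, i.e. 514 + 4096
assert reported - actual == 2 ** 12
```

It was precisely this power-of-two discrepancy that pointed investigators toward a memory fault rather than software or fraud.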
In more than 30 years of active study of this problem, no fully reliable way to protect electronics from single high-energy particles has been found.
This kind of failure can happen anywhere, at any time.
The most dangerous scenario, in my view, would be an error appearing in the code of an artificial intelligence program and cancelling important software locks (remember Isaac Asimov's laws of robotics?). An error in a banking system that zeroes out a personal account would also be a very unpleasant surprise.
And what if an autonomous taxi of the future, due to such a system failure, crashes into a pole with passengers aboard? Will it simply be written off as fate?
Universal digitalization, as a new evolutionary stage of society, may be one of the stages of the "Great Filter".
The Great Filter is a hypothesis put forward in 1996 by Robin D. Hanson to resolve the Fermi paradox: the absence of visible traces of the activity of alien civilizations, which, by all calculations, should have spread throughout the universe over its lifetime.
Every evolutionary leap, whether physiological or technological, is a kind of filter that a civilization either passes through or stops developing. Which barrier constitutes the main "Great Filter" is unknown to science today. Perhaps the digitalization of an intelligent civilization is that very barrier.
Imagine a society that has been living in a digital world, a virtual environment, and an information space for at least 50 years. Then suddenly, somewhere nearby on a cosmic scale, a supernova erupts whose ionizing radiation utterly destroys the world humans have grown used to.
For us, such a supernova could be the star Betelgeuse, which, incidentally, may explode at any moment. For biological life on Earth the explosion would pose no particular threat, which cannot be said of modern electronics.
Despite all this, digitalization is a necessary stage that our civilization must pass through in order to multiply its productivity a thousandfold while keeping its consumption of primary resources at the same level.
All that remains is to harden our systems as much as possible and to distribute data centers in such a way that a total collapse of the entire system can be avoided when uninvited guests arrive in the form of cosmic-ray particles, whose impact even some 50 years ago we would not have noticed at all.