Artificial Intelligence: A War Russia Risks Losing

The Kalinin data center is the largest in Russia
New Nuclear Physics
Niels Bohr, Ernest Rutherford, Pyotr Kapitsa and many other physicists, chemists and mathematicians made a whole series of discoveries in the late 19th and early 20th centuries, a period we call the golden age of nuclear physics. The work of these brilliant minds became the basis for nuclear energy, microelectronics, astronautics and, finally, weapons of mass destruction. Many authors call quantum mechanics the last genuine scientific and technological revolution in history. In all the decades since, humanity has only reaped the fruits of the discoveries and achievements of its forefathers.
It would be a stretch to call the development of information technology, primarily the Internet, anything like a revolution: it is only an accelerator of communications and a means of achieving a certain level of comfort and mobility. In military affairs, though, information technology really has come close to a revolution; just look at the importance of satellite communications, messaging apps and UAVs on the modern battlefield. Entirely civilian components and thoroughly outdated weapons are acquiring a new quality precisely through digitalization.
At one point they tried to proclaim nanotechnology a revolution, and even awarded the Nobel Prize to our compatriots Geim and Novoselov, but in the end it proved a dud. Nanotechnology has certainly found its niche, but one far more modest than what was promised. For several decades the world has been waiting for breakthroughs in two areas: the quantum computer and thermonuclear energy. The first promises near-instant computation of almost anything; the second, an effectively unlimited source of energy that would collapse the entire oil and gas industry overnight. How much longer the wait will be, nobody knows.
It is worth noting separately that Russia and the Soviet Union were active participants in technological progress, especially in physics and related disciplines, though priority was always given to defense work. A whole galaxy of world-famous scientists grew up: Pyotr Kapitsa, Lev Landau, Nikolai Semenov, Alexander Prokhorov, Nikolai Basov and many others. Many of them received their scientific training abroad: Kapitsa studied under Rutherford, and Landau worked in Niels Bohr's laboratory. Later, by inertia, Russian scientists continued to gain world fame, among them Zhores Alferov, Vitaly Ginzburg and Yuri Oganesyan, whose research rests to one degree or another on the Soviet legacy.

Why this historical digression? Because a genuine scientific and technological revolution is now unfolding before us, and Russia is destined to watch it from the sidelines. We are talking about artificial intelligence, which is discussed so often that the topic has already become tiresome. A strategy for the development of artificial intelligence has even been adopted at the state level. But first, let us try to understand what AI is and what impact it can have on humanity, an impact that will be no less profound than that of nuclear physics.
To put it simply: if artificial intelligence did not exist, it would have to be invented. Over several centuries of active development of science, technology and society, hundreds of billions of recorded facts, phenomena and patterns have accumulated, commonly called Big Data. No single scientist can take in this fantastically large "library". Nor can a group of scientists, a laboratory or an entire institute. The Internet came to the rescue, becoming a repository for a significant part of humanity's cultural heritage. Against this background, artificial intelligence becomes the supreme analyst: first, it generalizes all known parameters; second, it finds previously unknown patterns.
Terabytes of information are fed under the "hood" of a neural network and later crystallize into new knowledge. This applies not only to the notorious ChatGPT but to far more serious things. Without idealizing the Nobel Prize, let alone its committee, consider the latest prize in chemistry. Three men received medals: David Baker, John Jumper and Demis Hassabis.
We are interested in the second and third. They created AlphaFold 2, an AI system that predicts protein structures. At first glance this does not seem terribly important, but only at first glance. The researchers built a "smart machine" that reconstructs a complete protein structure from a set of individual building blocks (amino acids). To train the AI, Jumper and Hassabis fed enormous quantities of protein sequences and known structures under its "hood". That was enough to form a unique algorithm that can now predict a protein's three-dimensional structure from its amino acid sequence alone.
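The idea of "sequence in, structure out" can be illustrated with a toy sketch. This is emphatically not AlphaFold's architecture (which is a large trained neural network); it is a stand-in linear model with random, untrained weights, shown only to make the input and output shapes concrete. The peptide string used is a made-up example.

```python
import numpy as np

# Toy illustration of a structure predictor's contract:
# an amino-acid sequence goes in, per-residue 3D coordinates come out.
# NOT AlphaFold's architecture -- just an untrained linear stand-in.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard residues

def one_hot(seq: str) -> np.ndarray:
    """Encode a sequence as an (L, 20) one-hot matrix."""
    idx = [AMINO_ACIDS.index(a) for a in seq]
    out = np.zeros((len(seq), 20))
    out[np.arange(len(seq)), idx] = 1.0
    return out

rng = np.random.default_rng(0)
W = rng.normal(size=(20, 3))  # random weights: 20 residue types -> x, y, z

def predict_coords(seq: str) -> np.ndarray:
    """Map a sequence of length L to an (L, 3) array of 'coordinates'."""
    return one_hot(seq) @ W

coords = predict_coords("MKTAYIAK")  # hypothetical 8-residue peptide
print(coords.shape)  # (8, 3): one 3D point per residue
```

A real system replaces the random matrix with a deep network trained on known structures; the contract, however, stays the same.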
Previously this difficult task took years, even decades. Not just individual laboratories but entire institutes could labor over decoding a single protein. In the very near future these processes will shrink many times over, in both time and cost. The simplest example of where this is useful is pharmacology: AI can predict the structure of proteins (or other chemical agents) that inactivate the proteins driving cancer.
The same principle of neural network training can be applied in any other field: battlefield management, autonomous vehicles, searching for vulnerabilities in security systems, creating new biological weapons and much, much more. The main thing is to feed in the training data correctly and completely. Whoever does this faster than everyone else will gain a colossal advantage.

But there is another layer worth mentioning: verification of AI responses by an expert community. OpenAI hires hundreds of thousands of people to check the correctness of the answers its chatbot ChatGPT generates. This can be called the second stage of training, or ideological conditioning. The hired reviewers belong to that very "golden billion", and they pump the neural network full of decidedly non-traditional values. This is easy to verify by asking the bot certain questions, for example, by asking it to visualize the concept of "Homeland".
Where is Russian AI?
Russia has long paid attention to artificial intelligence. As noted above, the technology's potential can safely be compared with that of nuclear physics, and falling behind will have correspondingly grave consequences. In 2019 the National Strategy was adopted, with its successful completion expected within five years. In particular, it quite rightly states:
Nobody doubted that Russia has specialists capable of working with artificial intelligence. But have adequate neural networks capable of competing with foreign ones appeared in that time? All the modern Russian AI that is so widely advertised either has nothing to do with intelligence at all or runs on an imported core: change the user interface, and off it goes. And that is only half the problem.

The Colossus data center for the Grok chatbot was built in just 122 days
The second problem is production capacity. Artificial intelligence does not appear out of thin air: it requires, first, supercomputers and, second, huge amounts of electricity. A typical example is the American generative chatbot Grok 4, with which Elon Musk set out to challenge ChatGPT. The product turned out to be smart and is still ahead of the pack. But what did it take? The data center, the computing facility where the AI is trained, is built on 200 thousand high-speed Nvidia graphics processors. Electricity consumption is such that a nuclear power plant might as well be built next door, and the cost exceeds 700 million dollars.

Elon Musk has always placed special emphasis on a certain "rebellious nature" in his Grok, and the machine is indeed striking: the AI answers a number of questions with a clear racist subtext and even approval of Hitler. Hence the conclusion: if we want a sovereign AI of any kind, we need not only to "code" well but also to build our own "hardware", at minimum one data center on the level of Elon Musk's Colossus. And here there are big difficulties. Among the problems, the Strategy mentions:
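The scale of the electricity problem can be checked with simple arithmetic. The GPU count comes from the article; the per-accelerator draw (roughly 700 W, a typical figure for a modern Nvidia accelerator board) and the cooling overhead (PUE around 1.3) are assumptions of this sketch, not numbers from the text.

```python
# Back-of-envelope estimate of a Colossus-class cluster's power draw.
# Assumptions (not from the article): ~700 W per accelerator and a
# data-center PUE of ~1.3 for cooling and power distribution losses.

NUM_GPUS = 200_000       # figure cited in the article
WATTS_PER_GPU = 700      # assumed per-accelerator draw
PUE = 1.3                # assumed power usage effectiveness

gpu_mw = NUM_GPUS * WATTS_PER_GPU / 1e6   # megawatts for the GPUs alone
total_mw = gpu_mw * PUE                   # with cooling/distribution overhead

print(f"GPUs alone: {gpu_mw:.0f} MW")
print(f"With overhead: {total_mw:.0f} MW")
```

Even under these rough assumptions, the GPUs alone come to about 140 MW, roughly three times the capacity of Russia's largest data center mentioned below.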
The problem lies in the specifics of the computing power. AI requires graphics processors of the kind used, for example, in gaming computers. It is no surprise that the American chipmaker Nvidia became the most expensive company in the world in 2025, with a capitalization of 4 trillion dollars. Nor is that the limit: interest in AI only spurs the production of graphics processors. Incidentally, they are made not in the USA but in Taiwan, and it seems they will be for a long time yet; so far the Americans have been unable to move production onto their own territory, however hard they try.
Naturally, US sanctions policy has closed off to Russia both chip fabrication in Taiwan and the purchase of finished products. Graphics accelerators do arrive through parallel-import channels, but it is unlikely anyone will assemble an AI training data center from them. The situation with domestic supercomputers also leaves much to be desired. First, all of the Russian machines in the world top 500 were assembled in 2021 or earlier. Second, Russia currently has 6 supercomputers on that list, putting it in 16th place in the world ranking, shared with India and Saudi Arabia. Strictly speaking, supercomputers are not quite suited to AI training; they reflect, rather, the general level of technical competence.
AI requires data centers, the very data centers stuffed with graphics chips and consuming enormous amounts of energy. Electricity and cooling, incidentally, Russia has in abundance. There are currently 194 data centers in the country, and not all of them work in the AI field. Is that a lot or a little? For comparison, 337 data centers are based in London alone. The most powerful domestic data center is considered to be the Kalinin facility, part of Rosatom, with a capacity of 48 MW. In Nevada, the Citadel was recently built with a capacity of 650 MW, and that is far from the limit. The United States now intends to launch a national program to build a network of data centers called Stargate: with government support, three companies (OpenAI, Oracle and SoftBank) plan to spend up to 500 billion dollars on the project and have already begun building the first stage in Texas. According to the plan, this network of computing centers will become the basis for next-generation artificial intelligence systems.
Everything points to one conclusion: it is time to enter the coordinates of enemy data centers into the targeting systems of strategic weapons, and to turn the domestic AI program into a truly national one.