U.S. abandons AI elements in nuclear weapons control

The United States has refused to equip strategic weapons control centers with artificial intelligence systems. Lieutenant General Jack Shanahan, head of the Joint Artificial Intelligence Center, said that humans will always be responsible for ballistic missile launches.

Although the center's leadership advocates tighter integration of artificial intelligence into weapons systems, it is categorically against including artificial intelligence in the control of nuclear weapons. According to Shanahan, the decision to launch a nuclear strike must be made only by a human.

The general explained that five teams of two officers each are currently on duty at the silo-based missile complexes. Each such team is responsible for launching 50 ballistic missiles. If a decision is made to launch a nuclear strike, each team receives a launch order. After receiving the order, all five teams must compare the launch code it contains with the code already stored in the safes at the control centers, and then carry out the prescribed actions that set the missile launch procedure in motion. On submarines, no more than 15 minutes pass from receipt of an order to launch; ground installations must launch their missiles at the designated time.
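As a rough illustration of the redundancy in this procedure, here is a toy sketch (the crew names, code values, and the all-crews-must-agree rule are assumptions for illustration, not the real protocol): the order's code is checked against independently held copies, so no single corrupted message or single operator can start the procedure.

```python
# Toy sketch of the redundancy described above, NOT the real protocol:
# crew names, code values, and the "every crew must agree" rule are
# illustrative assumptions.
from dataclasses import dataclass

@dataclass
class LaunchCrew:
    name: str
    safe_code: str  # the code already stored in the control-center safe

    def validates(self, order_code: str) -> bool:
        # Each crew independently compares the code in the received order
        # with the copy it holds; any mismatch blocks the procedure.
        return order_code == self.safe_code

def order_is_valid(crews: list[LaunchCrew], order_code: str) -> bool:
    # Proceed only if every independent crew's comparison succeeds.
    return all(crew.validates(order_code) for crew in crews)

crews = [LaunchCrew(f"crew-{i}", safe_code="ALPHA-7") for i in range(1, 6)]
print(order_is_valid(crews, "ALPHA-7"))  # True: all five comparisons match
print(order_is_valid(crews, "ALPHA-9"))  # False: a mismatch blocks the launch
```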

Although human actions are slow compared with artificial intelligence systems, in nuclear weapons control systems it is people who provide protection against errors when orders to launch a nuclear strike are received. It is the deliberately complicated missile launch procedure that is meant to rule out such an error; hand this task to artificial intelligence, and a nuclear war could be unleashed by a program error in the "electronic brains".
48 comments
  1. +7
    28 September 2019 13:06
    Well, thank God! They had enough sense not to joke with the Apocalypse.
    1. +1
      28 September 2019 13:10
      Quote: KVU-NSVD
      Well, thank God! They had enough sense not to joke with the Apocalypse.

      I agree, entrusting the fate of the entire planet to a computer is too much. One glitch and it's good night for everyone, an eternal one.
      1. -3
        28 September 2019 13:26
        I agree, entrusting the fate of the entire planet to a computer is too much. One glitch and it's good night for everyone, an eternal one.

        A person can fail too. Say, a girl dumps a soldier because he rarely bought her shawarma, and out of grief he lays the whole world in ruins.
        1. KCA
          +7
          28 September 2019 13:35
          And what can an upset soldier do alone, without access codes? Three people have to enter access codes, and a missile can only be launched jointly, turning the keys simultaneously. I don't think it's somehow simpler in the USA.
          1. +3
            28 September 2019 17:54
            Quote: KVU-NSVD
            Well, thank God! They had enough sense not to joke with the Apocalypse.

            They've refused for now; they'll introduce it later.
            Quote: KCA
            And what can an upset soldier do alone, without access codes? Three people have to enter access codes

            Remember the "war that was averted".
            An American officer received a message ordering "a nuclear strike against the USSR".
            And war did not break out only because the cipher message was marked "necessary, but not urgent". The officer doubted the order, though one of his subordinates, who had started the missile launch procedure, had to be arrested.
            So one person is enough: a cryptographer, or an officer like that who knows the codes and lies about an order.
            And planting three agents in the right places, while difficult, is feasible.
            The Apocalypse is inevitable.
        2. 0
          28 September 2019 13:53
          Quote: Jack O'Neill
          I agree, entrusting the fate of the entire planet to a computer is too much. One glitch and it's good night for everyone, an eternal one.

          A person can fail too. Say, a girl dumps a soldier because he rarely bought her shawarma, and out of grief he lays the whole world in ruins.

          Maybe, but with a person it's usually noticeable that something is wrong with him, and one person cannot set such things in motion alone. With a program, it usually becomes noticeable that something is wrong only after it has happened. A significant difference.
          And there is the well-known case when the Soviet missile attack warning system issued a false warning of American missile launches: it was the officer on duty who did not act rashly and decided first to double-check everything and figure it out. And he was right, because it turned out to be a false alarm. A person, unlike a machine, has one more significant difference: the instinct of self-preservation. And colleagues will stop you. Mass insanity is something from the realm of fantasy.
          1. -2
            28 September 2019 14:10
            Quote: Leshy1975
            with a program, it usually becomes noticeable that something is wrong only after it has happened

            That applies to you too. Or are you a program?

            Quote: Leshy1975
            mass insanity is something from the realm of fantasy

            Oh really... the liberal crowd's demonstrations that you so zealously support: they're all dreaming of it, or what?
            1. -1
              28 September 2019 15:18
              Quote: Cat Man Null
              Quote: Leshy1975
              with a program, it usually becomes noticeable that something is wrong only after it has happened

              That applies to you too. Or are you a program?

              Quote: Leshy1975
              mass insanity is something from the realm of fantasy

              Oh really... the liberal crowd's demonstrations that you so zealously support: they're all dreaming of it, or what?

              You know, Kisa, I'm still going to slap you with a minus; you asked for it. Even here you've dragged in your "liberal crowd". From the heart, here you go, "buddy".
              1. -4
                28 September 2019 15:33
                Quote: Leshy1975
                Even here you've dragged in your "liberal crowd"

                It wasn't me who dragged it in, it was you. And the "liberal crowd" is yours... "owner". Crushed that fly yet?
          2. KCA
            +1
            28 September 2019 18:38
            The officer violated every one of his standing instructions by not reporting the early-warning system's alert. No one would have launched missiles right away; at least 4-5 checks would have followed. I don't even know by what miracle Petrov didn't end up before a tribunal but was simply dismissed from the service.
      2. +1
        28 September 2019 14:47
        Well, yes. And let it gather irrefutable evidence from social networks, far away from the "button"; here you need a person. Besides, with a human delay there is more time to phone your relatives and say goodbye.
    2. +1
      28 September 2019 13:12
      I never tire of repeating that the most important part of any weapon is its owner's head, and where strategic weapons are concerned, that head must be smart and fresh.
      1. +4
        28 September 2019 13:27
        Quote: Chaldon48
        I never tire of repeating that the most important part of any weapon is its owner's head, and where strategic weapons are concerned, that head must be smart and fresh.

        I immediately recall how Mission Control anxiously awaited the landing of the Buran shuttle. But "something went wrong" (in their view): Buran banked into a steep turn and flew off to the northwest. Just as they were about to give the command to destroy the shuttle, Buran made another sharp turn and rushed off in the opposite direction. Only then did Mission Control realize that the shuttle was simply bleeding off speed with those turns in order to land normally. The disappointing conclusion is that man no longer remembers what he put into a vehicle's program, much less knows what his machine is capable of. In this case, the machine was smarter than the people.
        1. KCA
          +3
          28 September 2019 13:37
          From what was written, it seems there was a strong crosswind and Buran judged that it would not fit onto the runway, so it went around for a second approach.
          1. +5
            28 September 2019 14:31
            I read that Buran was bleeding off speed that turned out to be higher than calculated.
            Quote: KCA
            From what was written, it seems there was a strong crosswind

            I can't say anything about a crosswind, I don't remember; I do know that Buran turned long before landing, and there was no hint of a crosswind there either. What happened was this: as the shuttle began to enter the atmosphere, its temperature sensors showed the hull overheating, and the automation immediately commanded a shallower reentry angle. But that came at the cost of speed, which became excessive. The computer then commanded Buran to bleed off that speed with turns so it could land normally. May all our computers work like Buran's automation!
        2. The comment was deleted.
    3. +5
      28 September 2019 13:12
      Quote: KVU-NSVD
      Well, thank God! They had enough sense not to joke with the Apocalypse.

      Yeah... it's not clear which is better, an AI or a stoned US soldier. Five years ago there was a scandal when a US soldier on combat alert was caught stoned.
      1. +4
        28 September 2019 13:20
        Maybe they were afraid that the AI, having weighed all the pros and cons, would order the destruction of the White House and the Pentagon to end all wars in an instant?
      2. -3
        28 September 2019 13:57
        Quote: Svarog
        it's not clear which is better, an AI or a stoned US soldier

        Worst of all is a stoned Svarog.

        My friend, you don't know the first thing about AI, and about the Strategic Missile Forces you likewise don't know the first thing... but the "label" had to be hung anyway, it's a reflex... right, "Svarog"?
    4. +3
      28 September 2019 13:17
      They're not on good terms even with their own intellect; better they leave SkyNET alone.
    5. +2
      28 September 2019 14:18
      They're afraid of SkyNet. What if it really does wise up, sees the light, and with all its proletarian hatred fries Washington and the Pentagon!
    6. 0
      28 September 2019 20:51
      Probably the Americans' only correct decision in recent times.
  2. +2
    28 September 2019 13:08
    The end of the world is being postponed.
  3. The comment was deleted.
  4. -1
    28 September 2019 13:09
    What, did the hostages suddenly want to live?
  5. 0
    28 September 2019 13:09
    They understood that Hollywood is only Hollywood, and that a serious person is needed here.
  6. +1
    28 September 2019 13:10
    AI is above all an analysis technique that lets you see the whole set of algorithmically changing events that can lead to one key flare-up of confrontation or another, when you need to take equivalent measures to ensure your own security. So there is nothing to renounce, because the Americans do not have AI. Everything remains as it was, resting on the subjective and emotional assessments of so-called experts. Meanwhile, methods of analyzing big data using functions of the constant value of a number are ignored. Which means that science as such is not changing the fundamental principles of analyzing what it studies.
    1. +1
      28 September 2019 21:01
      Quote: gridasov
      AI is above all an analysis technique that lets you see the whole set of algorithmically changing events that can lead to one key flare-up of confrontation or another, when you need to take equivalent measures to ensure your own security. So there is nothing to renounce, because the Americans do not have AI. Everything remains as it was, resting on the subjective and emotional assessments of so-called experts. Meanwhile, methods of analyzing big data using functions of the constant value of a number are ignored. Which means that science as such is not changing the fundamental principles of analyzing what it studies.

      I knew you would leave a comment under this article. Tell me, how can I calculate the probability that, time after time, I will know exactly which articles to look for your comments under? (I didn't look at your profile, I swear by my beauty.)
  7. +3
    28 September 2019 13:13
    The biggest scandal of recent years involving gross violation of military discipline broke out in the US Air Force, which is responsible, among other things, for the ground-based Minuteman nuclear intercontinental missiles. As Air Force Secretary Deborah Lee James and Air Force Chief of Staff General Mark Welsh said at a Pentagon press conference on Wednesday, 11 American officers at six air bases were implicated in drug use and distribution.

    .....Or maybe AI would be better..?
    1. 0
      28 September 2019 13:30
      Yes, one should not forget about the human factor.
  8. 0
    28 September 2019 13:15
    Funny, but it's both right and wrong at the same time.
    AI is not human; fear, hatred, envy and the rest are alien to it.
    Both we and the Americans have already had situations where a nuclear war failed to start only because of the "human factor".
    And it's good that it didn't start!
    But look at it from the other side: an order comes in to strike, say, Moscow. An officer will naturally verify the order first, then hesitate (press it or not? or maybe better to go have a beer?..), or he may not carry out the order at all.
    And what if the order came because our missiles were already flying toward Washington?
    An AI will carry out the order, but a person is not as reliable.
    You can swap Moscow and Washington; it was just an example.
    1. -1
      28 September 2019 13:35
      Some kind of zugzwang. It would probably be better to give up nuclear weapons entirely.
    2. 0
      28 September 2019 14:36
      Quote: Jack O'Neill
      Funny, but it's both right and wrong at the same time.
      AI is not human; fear, hatred, envy and the rest are alien to it.
      Both we and the Americans have already had situations where a nuclear war failed to start only because of the "human factor".
      American, are you talking about Stanislav Evgrafovich Petrov?
      https://topwar.ru/127057-stanislav-petrov-chelovek-predotvrativshiy-yadernuyu-voynu.html
  9. 0
    28 September 2019 13:22
    Well, at least one sensible move lately: at least they haven't taken it into their heads to introduce a technology that hasn't really been perfected yet.
  10. +2
    28 September 2019 13:26
    U.S. abandons AI elements in nuclear weapons control

    The correct decision: you never know which way the AI will "fire", at the enemy or at its own territory, like the Saudis' Patriot))
  11. 0
    28 September 2019 13:46
    And how are things going with AI in this area?
    1. +4
      28 September 2019 13:48
      And why the hell is it needed there at all?
  12. +1
    28 September 2019 13:46
    The United States has refused to equip strategic weapons control centers with artificial intelligence systems.

    - America has not given up the computers, or the algorithms for processing information, that are used in and necessary for making the actual DECISION on the USE of nuclear weapons.
    - Whether or not the information at the decision's input is correct, the decision itself rests with the person.
    - That decision is ALWAYS an act of will, not a logical deduction.
    - However, the information from the sensors may be wrong. And despite the many independent methods (sensors) for obtaining each of the decision's required inputs, in theory ALL of them may be wrong (see the sketch after this list).
    - In that case, relying only on the decision's input data, a person can make the wrong decision.
    - Technology can ALWAYS fail.
    - A person, unlike technology, is also endowed with intuition and the instinct of self-preservation. Let us hope these do not fail when the input data is glitchy.
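    A toy sketch of that sensor point, under stated assumptions (the 2-of-3 voting threshold and every name here are invented for illustration, not any real system's logic): redundant channels are combined by voting, but since in theory all of them can be wrong at once, the vote is treated only as a warning and the final gate stays human.

    ```python
    # Illustration only: majority voting over independent sensor channels,
    # with a human confirmation gate. The 2-of-3 threshold and all names
    # are assumptions invented for this sketch.

    def majority_reports_attack(channels: list[bool], threshold: int = 2) -> bool:
        # True if at least `threshold` independent channels report an attack.
        return sum(channels) >= threshold

    def final_decision(channels: list[bool], human_confirms: bool) -> str:
        if not majority_reports_attack(channels):
            return "stand down"
        # Even a unanimous vote can be a correlated error (ALL inputs wrong),
        # so the machine's verdict is treated as advice, never as the decision.
        return "escalate" if human_confirms else "stand down"

    print(final_decision([True, True, True], human_confirms=False))  # stand down
    print(final_decision([True, False, True], human_confirms=True))  # escalate
    ```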
  13. +3
    28 September 2019 13:47
    They still don't want "Skynet".... reasonable.
    1. 0
      29 September 2019 02:29
      They ran the modeling they're so fond of, and as a result Skynet went over to Russia's side.
      1. +1
        29 September 2019 08:26
        Quote: lexus
        They ran the modeling they're so fond of, and as a result Skynet went over to Russia's side.

        Well yes, our hackers are the best hackers in the world! Ha ha ha
  14. +2
    28 September 2019 14:11
    And immediately the legendary voice of the translator Volodarsky from The Terminator rings in my head:
    "And the machines rose from the ashes of the nuclear fire..."
  15. +2
    28 September 2019 14:49
    So the story with the terminators won't become reality?
  16. -2
    28 September 2019 15:08
    The Pentagon will be forced to switch to AI for control of the strategic nuclear forces after Rubezh ("Frontier") systems are deployed in Chukotka and Zircons on attack nuclear submarines within six minutes' reach of US national territory.
  17. 0
    28 September 2019 15:09
    The general thinks soundly. Although the codes are 000000 for everyone.
  18. 0
    28 September 2019 15:37
    This is a risky business.
  19. 0
    28 September 2019 17:09
    ...because AI does not exist in nature, and no one is anywhere near working out how to create it. In general, the "mattresses", who to this day load the flight mission from floppy disks, first need to upgrade their calculators. And not dream about AI.
  20. 0
    28 September 2019 20:54
    They did the right thing.
    Now 02:14 a.m. US Eastern Time on August 29, 1997 cannot happen.
  21. 0
    28 September 2019 21:06
    Although human actions are slow compared with artificial intelligence systems, in nuclear weapons control systems it is people who provide protection against errors when orders to launch a nuclear strike are received. It is the deliberately complicated missile launch procedure that is meant to rule out such an error; hand this task to artificial intelligence, and a nuclear war could be unleashed by a program error in the "electronic brains".

    At least their brains switched on, well done!
  22. 0
    29 September 2019 11:25
    The key criterion of AI's work is not speed but the capacity of the information processed, as well as the algorithmic sequence in which the stages of analyzing these processes, or the consequences of decisions taken, are formed. So the transfer rate of a single unit of information is a small thing compared with the transfer of transformable volumes of capacious local processes.