Concern "Kalashnikov" presented a module with artificial intelligence

Concern "Kalashnikov" presented a combat module operating under the control of artificial intelligence, according to the website Kalashnikov Media.

Concern "Kalashnikov" presented a module with artificial intelligence




The automated weapon control station, operating under artificial intelligence, can perform tasks around the clock without a human operator. A gyrostabilization system allows it to fire while in motion.

The artificial intelligence is based on neural networks and can be improved during operation. It is able to detect and recognize targets, prioritize the order in which they are engaged, issue commands to the tracking system, and decide whether to open fire.

The concern posted a video showing one of the station's operating modes: sector scanning, recognition of threatening objects (objects that pose no danger are excluded from the engagement sector), deciding on the number of shots sufficient to guarantee destruction of the target, and opening fire.
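A purely illustrative sketch, in Python, of the operating cycle described above (sector scanning, exclusion of non-threatening objects, a shot-count decision, an engagement command). Every function name, field, and threshold is invented for the illustration and is not the concern's software:

```python
# Hypothetical sketch of the cycle the article describes; all names and
# values are invented for illustration.

def shots_to_destroy(target):
    """Decide how many shots are considered sufficient for this target."""
    return max(1, round(target["hardness"] * 5))

def control_cycle(detected_objects):
    """Scan the sector and engage only objects classified as threats."""
    for obj in detected_objects:
        if not obj["threat"]:        # non-threatening objects are excluded
            continue
        n = shots_to_destroy(obj)
        print(f"engage {obj['id']} with {n} shot(s)")

control_cycle([
    {"id": "target-1", "threat": True, "hardness": 0.6},
    {"id": "bird-1", "threat": False, "hardness": 0.0},
])
```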

It is reported that the stations can be installed either in a stationary position or on vehicles, and that they can be combined into a network for coordinated action. The latter option is recommended for security tasks, such as perimeter protection.

The concern stressed that equipping important facilities with such stations would make it possible to eliminate human factors such as fatigue and loss of vigilance.
47 comments
  1. BAI
    +2
    1 October 2018 17: 25
    The latter option is recommended for security tasks, such as perimeter protection.

    This was done back in 2000.
    1. +5
      1 October 2018 17: 30
      Quote: BAI
      The latter option is recommended for security tasks, such as perimeter protection.

      This was done back in 2000.

      On the one hand, it's all cool! But on the other hand, it's like the film "Terminator" laughing : someday something in that intelligence will jam, and it will start blasting away in its own way...
      1. +5
        1 October 2018 17: 39
        ... "Kalashnikov" is beginning to resemble Musk's SpaceX with its fantastic artificial intelligence projects
        1. 0
          1 October 2018 19: 54
          The comrades at Kalashnikov are on the right path! Only practice gives reliable results and answers. And a human is always inventive and cunning: he will stage a masquerade, put on camouflage disguised as a tree stump, and move slowly, below the sensitivity threshold for reaction and action... Artificial intelligence, like a human's, also comes in gradations, from moronic to brilliant... I wish them success in developing artificial intelligence...
          1. +1
            1 October 2018 20: 35
            A creeping tree stump inside the security perimeter... That made me smile.
          2. 0
            1 October 2018 22: 48
            Quote: Vladimir 5
            And a human is always inventive and cunning: he will stage a masquerade, put on camouflage disguised as a tree stump, and move slowly, below the sensitivity threshold for reaction and action...

            And where will he hide from the infrared range... laughing Add a thermal imager to the module for good measure (if there isn't one already...) and voila. Yes
      2. KCA
        +2
        1 October 2018 18: 56
        These are just ordinary burglar alarms, but with teeth. Machine-gun turrets have been installed not just since 2000 but even earlier, in the Strategic Missile Forces and elsewhere, in places where no one is supposed to walk at all and where no one wanders in by accident, and they managed without neural networks. Complication only leads to more errors. What could be simpler: cross the perimeter, catch a burst of fire. Whereas with "intelligence" it has to recognize friend or foe by a badge or something else, distinguish people from animals, and much more; the more parameters processed, the greater the chance of error. But for now all these systems are under an operator's control and can be disconnected with a slight movement of the hand.
        1. 0
          1 October 2018 20: 58
          A properly trained neural network minimizes error, and unlike a human neural network, it is protected from the influence of many outside factors. Computer networks are simply easier to train than to live with a constant human factor.
          And this is not an ordinary alarm system. Many things matter here: friend-or-foe recognition, task prioritization, coordinated actions, threat detection, and building a map of the situation on the ground so the operator can make the most competent decision (when one is involved).
        2. 0
          3 October 2018 00: 09
          Quote: KCA
          and with "intelligence" it will recognize friend / foe by a card or something else, distinguish between people and animals, and a lot of things, the more processed parameters, the greater the chance of error

          There will be more and more tasks for AI, but progress cannot be stopped.
          And saboteurs will "slip in" false radio-controlled targets to reveal the defensive firing points.
          1. 0
            10 October 2018 10: 46
            They will, and they already do, but you must admit this is a protective measure. They will be forced to contrive something just to get past such a system, which means it is at least effective at its stage. And nothing prevents the recognition system from being improved in the future. No one claimed an ideal system has been created; there is room to move and develop.
  2. +7
    1 October 2018 17: 25
    In general, "Kalashnikov" in the last 4 years amazes me directly .. Quite a good reorganization was carried out! Constantly, they offer something to the military (I don’t know what performance characteristics and so on) But they try in many directions. The brand is kept in the world!
    1. +6
      1 October 2018 17: 55
      Quote: Winnie the Pooh
      ... constantly offering something to the military ...

      September 19, 2018
      Combat modules and automated complexes of the Kalashnikov concern ....... hi
  3. +8
    1 October 2018 17: 29
    It will make it possible to eliminate human factors such as fatigue and loss of vigilance... and add computer factors: failure, glitch, freeze... Somehow I would be uneasy, for some reason, about being in a sector guarded by such a module))) IMHO.
    1. 0
      1 October 2018 17: 34
      Yeah, especially when you consider that
      human factors such as fatigue and loss of alertness
      during maintenance or during installation and programming of the complex have not been abolished by anyone
    2. 0
      1 October 2018 21: 03
      A neural network is not your usual program with a rigid sequence of actions sewn into it. Neural networks are able to work under many factors while minimizing errors, thereby most strongly reducing the possibility of mistakes. A human is the same kind of neural network, only more complex and buggier with respect to the task. So as to which is safer, I would argue.
      1. 0
        1 October 2018 22: 02
        Quote: vargo
        A neural network is not your usual program with a rigid sequence of actions sewn into it.

        ) And what is it then, in your subjective opinion, and how does a neural network differ from a program that models brain processes in such a way? )))))))))))))))))))))))))))))))))))))
        1. 0
          10 October 2018 10: 44
          If there is a clear reaction mechanism described explicitly, that is an explicit algorithm. If a neural network responds and learns by minimizing an error measure, that is an implicit algorithm. The difference is in how the reaction or action rules are formed: whether a person prescribes them by hand, or whether they are shaped by training mechanisms to a given accuracy.
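A minimal sketch of the contrast drawn in the comment above: a hand-written explicit rule versus a tiny perceptron whose rule is formed implicitly by error-driven training. It assumes nothing about the actual module; all data, names, and thresholds are invented:

```python
# 1) Explicit algorithm: a person prescribes the decision rule by hand.
def explicit_threat_rule(size, speed):
    return size > 0.5 and speed > 0.5

# 2) Implicit algorithm: a toy perceptron whose rule emerges from training,
#    i.e. from repeatedly nudging the weights to reduce the error.
def train_perceptron(samples, labels, epochs=200, lr=0.1):
    w_size, w_speed, bias = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (size, speed), y in zip(samples, labels):
            pred = 1 if w_size * size + w_speed * speed + bias > 0 else 0
            err = y - pred                 # the error drives the update
            w_size += lr * err * size
            w_speed += lr * err * speed
            bias += lr * err
    return w_size, w_speed, bias

samples = [(0.9, 0.8), (0.2, 0.1), (0.7, 0.9), (0.3, 0.2)]
labels = [1, 0, 1, 0]                      # 1 = threat, 0 = not a threat
w_size, w_speed, bias = train_perceptron(samples, labels)
print(explicit_threat_rule(0.8, 0.9))                   # hand-written rule
print(w_size * 0.8 + w_speed * 0.9 + bias > 0)          # rule learned from data
```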
  4. 0
    1 October 2018 17: 30
    I wonder how this project intersects with the "robotic frontier" project?
  5. +4
    1 October 2018 17: 43
    And then you get the urge to go relieve yourself outdoors, and forget that the module is switched on.... lol
    1. +10
      1 October 2018 18: 02
      Quote: DEZINTO
      And then you get the urge to go relieve yourself outdoors, and forget that the module is switched on....

      So what? The neural network will determine that the size and caliber of the "weapon" are not dangerous (especially in the autumn-winter period), and the "threat" will be ignored. laughing
    2. 0
      1 October 2018 19: 50
      Quote: DEZINTO
      And then you get the urge to go relieve yourself outdoors, and forget that the module is switched on....

      If instead of the module there is a sentry and an electric fence, it won't be any better. wink
      By the way, the series "The Pacific" showed well what happens to those who go relieve themselves in the wrong place.
  6. +3
    1 October 2018 17: 45
    Concern "Kalashnikov" presented a module with artificial intelligence

    A module with AI ought to introduce itself
    I don't believe it
  7. +3
    1 October 2018 17: 49
    There is hardly AI as such here. More likely the program is tuned for certain actions, for example shooting at anything moving of a certain size inside a closed perimeter (one that cannot be entered by accident), or shooting at everything that does not fit the picture the program expects, or something else... And of course, with the ability to switch it off.
  8. +3
    1 October 2018 17: 53
    ... combat module controlled by artificial intelligence
    22 Aug 2017
    Concern "Kalashnikov" presented a turret with artificial intelligence at the exhibition "Army-2017"
    "Cornet" is a remotely controlled module capable of independently searching for potential targets. It is based on neural network technology ......... hi
    1. +2
      1 October 2018 18: 34
      So let's test in Syria how the neural network algorithms work in automatic mode, guarding our contingent of military personnel! laughing good angry soldier
  9. 0
    1 October 2018 17: 58
    Then you need to equip all friendly personnel with identification beacons, otherwise it will mow them down too.
    A gyrostabilization system allows it to fire while in motion.
    So they are already aiming to install it on vehicles.
  10. 0
    1 October 2018 18: 12
    The Kalashnikov Concern has presented a combat module controlled by artificial intelligence,
    It is able to ... issue commands ... and decide whether to open fire.
    ... deciding on the number of shots sufficient to guarantee destruction of the target

    It is a pity that A. Asimov thought too well of humanity....
    His laws are not for us. We create problems for ourselves so that we can successfully overcome them later....

    1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
    2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
    3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
    1. +2
      1 October 2018 18: 18
      Quote: adma
      It is a pity that A. Asimov thought too well of humanity....
      His laws are not for us.

      Asimov is a science fiction writer and an idealist. These 3 laws of his are a utopia that has nothing to do with reality. Don't take them seriously. hi
      1. 0
        1 October 2018 18: 21
        Quote: Polite Elk
        Don't take them seriously

        The Rise of the Machines scenario is much more realistic. Which is very sad. hi
  11. 0
    1 October 2018 18: 19
    I'll be back)))))
  12. +2
    1 October 2018 18: 33
    At this pace, special psychiatrists for robots will soon appear, and so will the implementation of the three laws of robotics, as in Asimov.
    You never know what a neural network might get up to
    1. +1
      1 October 2018 19: 07
      the implementation of the three laws of robotics, as in Asimov.

      None of Asimov's three laws of robotics are suitable for military robots. After all, they need to kill people, not bake pancakes...)
      1. 0
        1 October 2018 19: 38
        Quote: DEZINTO
        None of Asimov's three laws of robotics are suitable for military robots.

        These laws are not suitable for industrial robots either. If a manipulator on the conveyor crushes a careless worker, it (the manipulator) will not even notice. What laws can we even talk about?
        1. +1
          1 October 2018 21: 32
          You're not a neural network yourself, by any chance? A sense of humor is completely absent, and something is off with the Turing potential... not even close. As if Yandex's Alice had answered.
          Come on, admit it already
          1. +1
            1 October 2018 21: 44
            Quote: Valdemar
            As if Yandex's Alice had answered.
            Come on, admit it already

            DEZINTO (Nikolay), acknowledged. The algorithms for generating responses to comments require updating. Blocking the sense of humor in responses is an error. Further use of the "Alice" database is impractical. Localization of the threat posed by the user "Waldemar" has been initiated.
      2. 0
        1 October 2018 21: 29
        Enemies are not humans but subhumans. Our robot walks the dogs, babysits the children, and winds the basurmans onto the crankshaft, except on Border Guard Day. On that day it tanks up heartily from a three-phase network.
  13. 0
    1 October 2018 20: 40
    For all the advancement of artificial intelligence technologies in the military sphere, the modules use machine guns from the last century. For some reason Kalashnikov doesn't produce new machine guns, yet this is a whole niche, and who else could make them?
    That is the most surprising thing.
    1. 0
      1 October 2018 21: 34
      What are the criteria for a machine gun of the new century? Caseless ammunition? A railgun? A blaster?
      1. 0
        2 October 2018 18: 12
        Quote: Valdemar
        What are the criteria for a machine gun of the new century? Caseless ammunition? A railgun? A blaster?

        You are placing the emphasis incorrectly. Modern unmanned modules, UAVs, and robotic systems carry an old Soviet heavy machine gun, or an ordinary infantry machine gun adapted for use in these systems.
        But such systems need a specialized machine gun: with a barrel-overheating sensor, a belt-feed sensor, integration into the electronic aiming system, and ideally programmable ammunition.
  14. 0
    1 October 2018 20: 42
    Fiction becomes reality. Who remembers the extended version of Aliens? The UA-571C combat module :)
  15. -1
    1 October 2018 21: 22
    "Artificial intelligence is based on neural networks and can be improved on the fly." /////
    ----
    So, did Kalashnikov first insert a chip into its module? smile
    1. 0
      1 October 2018 21: 35
      Ooh, you have so amused my Rosa. Aren't you from Odessa?
    2. 0
      1 October 2018 22: 34
      Quote: voyaka uh
      "Artificial intelligence is based on neural networks and can be improved on the fly." /////
      ----
      So, did Kalashnikov first insert a chip into its module? smile

      No, it's when you walk up to the module and say: "Vasya, if you see the ones with bare asses, call me first."
  16. 0
    1 October 2018 22: 06
    Who scribbled this article? What artificial intelligence? There ISN'T any, especially not with neural networks, as the article claims.
    There is just a well-debugged algorithm, fast and compact. On the surface everything is simple: a "condition", an answer of "yes", "no", or "don't know", then we follow the branch to the next "condition", and so on until the algorithm reaches a decision. And all of this runs on constantly arriving, constantly changing external data. Hardware and software performance is of great importance here (a toy sketch of such branching appears after this thread).
    The point is that these algorithms are written by people who, as a rule, know at least higher mathematics and all sorts of related sciences, especially if the algorithms are tailored for the "military".
    1. 0
      1 October 2018 22: 45
      Quote: Fedorov
      There is just a well-debugged algorithm, fast and compact. On the surface everything is simple: a "condition", an answer of "yes", "no", or "don't know", then we follow the branch to the next "condition", and so on until the algorithm reaches a decision.

      That's school computer science; in the real thing, matters are somewhat different
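A toy sketch of the branching, condition-by-condition logic described in the comment above (the sketch it refers to). The conditions, field names, and thresholds are all invented for illustration and are not the module's actual code:

```python
# Purely illustrative chain of conditions, re-evaluated as new data arrives.
def decide(obj):
    if not obj.get("moving", False):            # condition: is it moving?
        return "ignore"
    if obj.get("size_m", 0.0) < 0.5:            # condition: is it big enough?
        return "ignore"
    if obj.get("has_friendly_beacon", False):   # condition: friend or foe?
        return "ignore"
    if obj.get("confidence", 0.0) < 0.8:        # the "don't know" branch
        return "alert_operator"
    return "engage"

print(decide({"moving": True, "size_m": 1.7, "confidence": 0.95}))  # engage
print(decide({"moving": True, "size_m": 0.3}))                      # ignore
```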
  17. 0
    2 October 2018 05: 05
    All the same... the final decision should rest with a person, although when there is a war... the borders must be protected!
  18. 0
    2 October 2018 11: 57
    Intel has an interesting technology maturing on the basis of OpenCV
    [https://software.intel.com/en-us/iot/reference-implementations/people-counter-system?cid=em-elq-39345&utm_source=elq&utm_medium=email&utm_campaign=39345&elq_cid=4281030], open source. If you tweak it for these tasks, you get a good toy!!!
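A rough sketch, not Intel's reference implementation, of the kind of people detection OpenCV provides out of the box (its stock HOG pedestrian detector); the video file name is a placeholder:

```python
# Requires the opencv-python package; "perimeter_camera.mp4" is hypothetical.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture("perimeter_camera.mp4")
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Detect person-shaped objects in the current frame.
    boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))
    print(f"people detected in this frame: {len(boxes)}")
cap.release()
```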