Combat robots: tomorrow and the day after


Robotics is currently one of the most promising branches of military technology. Automated devices capable of performing a variety of tasks already exist. It is true that current unmanned aircraft and helicopters, as well as ground tracked vehicles, for all their abilities, still cannot operate completely autonomously. In most cases, autonomy is limited to actions that do not require, as the saying goes, great intelligence: moving to a given point, surveilling an area, searching for objects that stand out against the general background, and so on. Decisions about waypoints or about attacking a detected target are still made by the system operator, that is, by a human. Fully automatic operation of military robots so far remains the province of science fiction, and scientists and engineers are only now taking their first confident steps in this area. The development of robotic technologies may affect not only the capabilities of automated systems, but other aspects of human society as well.

Science fiction often takes seriously the question of interaction between humans and robots possessing artificial intelligence of one level or another. The current state of affairs suggests that this question is gradually moving into real life. For this reason, some people and public organizations are already trying to predict how events will develop and, where possible, take appropriate measures. Not long ago, the human rights organization Human Rights Watch (HRW) released a report on this issue. The report, Losing Humanity: The Case Against Killer Robots, discusses the prospects for using fully autonomous combat robots, as well as the problems that, according to its authors, will inevitably arise when such robots are operated in real conflicts. The report also discusses some legal aspects of such "progress".



First of all, the authors of Losing Humanity note that all existing robots are autonomous to some degree; only the level of that independence differs. Therefore, all robots capable of independent operation, including combat robots, are conventionally divided into three groups: human in the loop (the person inside the control system), human on the loop (the person supervising the system), and human out of the loop (the person excluded from control). Applied to combat robots, this division implies the following operating algorithms and levels of autonomy. If the human operator is "in the loop," the robot independently finds targets, but a person gives the command to destroy them. The other two types of combat robots can make decisions and attack on their own, but the human-on-the-loop concept retains supervisory control and allows a person to correct the robot's actions at any moment. Human-out-of-the-loop robots are completely independent and require no human control at all.
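The difference between the three control concepts comes down to who may authorize an engagement. A minimal sketch, purely illustrative (the `ControlMode` and `may_engage` names are hypothetical, not taken from the report or any real system), could look like this:

```python
from enum import Enum

class ControlMode(Enum):
    HUMAN_IN_THE_LOOP = "in"      # a human must approve every engagement
    HUMAN_ON_THE_LOOP = "on"      # the system acts, a human may veto
    HUMAN_OUT_OF_THE_LOOP = "out" # the system acts with no human check

def may_engage(mode: ControlMode, operator_approved: bool, operator_vetoed: bool) -> bool:
    """Decide whether the system may fire on a target it has found."""
    if mode is ControlMode.HUMAN_IN_THE_LOOP:
        # The robot only designates targets; a human gives the kill order.
        return operator_approved
    if mode is ControlMode.HUMAN_ON_THE_LOOP:
        # The robot decides on its own, but a supervising human may override.
        return not operator_vetoed
    # Human out of the loop: no human control at all.
    return True
```

The sketch makes the report's concern concrete: in the third branch no input from a person is consulted at all, which is exactly the class of systems HRW singles out as the most dangerous.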

According to HRW staff, the third category of robots, completely autonomous and not controlled by humans, will pose the greatest danger in the future. Alongside the technical and moral issues, related legal issues are noted. Among other things, under a certain course of events such combat vehicles could greatly change the entire character of warfare, including by violating basic international agreements. First of all, Human Rights Watch staff appeal to the Geneva Conventions, and specifically to the part that obliges weapons developers to verify their safety for civilians. HRW believes that manufacturers of robotic combat equipment are not interested in this question and conduct no such checks, an omission that would entail losses among the civilian population.

HRW staff consider the insufficient level of development of promising robots to be the main source of risk in the use of robotic combat systems. In their opinion, a combat robot, unlike a human, cannot be guaranteed to distinguish an enemy fighter from a civilian, or an actively resisting enemy from a wounded soldier or a prisoner. The risk is therefore too great that robots will simply take no prisoners and will finish off the wounded. The report's authors evidently do not hold a high opinion of the capabilities of future robots: they believe that promising combat systems will not be able to distinguish an armed, actively resisting adversary from an innocent, or merely aggressively behaving, civilian by appearance and behavior. In addition, the human rights experts deny future robots the ability to predict enemy behavior. In other words, a situation is possible in which an enemy soldier who wants to surrender raises or drops his weapon and walks toward the robot, and the robot misreads this and attacks him.

A direct, and dangerous, consequence of this absence of human traits, Human Rights Watch believes, is the possibility of using robots in operations to suppress people's freedoms and human rights. Human rights activists consider "soulless machines" an ideal tool for suppressing riots, for repression, and so on, because unlike a human being, a robot will not question an order and will carry out whatever it is told.

HRW fears that a characteristic feature of combat robots operating without human control will be the absence of any responsibility for their actions. If the operator of a remotely controlled drone strikes civilians, he will answer for it. If a robot commits such a crime, there will be no one to punish. The robot itself is not a rational being capable of understanding the nature of punishment and correcting itself, and, in the view of HRW staff, it is pointless to apply penalties to the military personnel who sent it on its mission, or to punish the developers of the robot's hardware and software. As a result, robots could become an excellent tool for accomplishing combat missions in the vilest way possible, through war crimes. Any facts that came to light could be blamed on a defective design or a software failure, and proving the guilt of specific people would be almost impossible. Thus, and this is what the human rights activists fear, no one would be punished for the crimes.

Because of the high risks, Human Rights Watch calls on countries to abandon the development of fully autonomous combat robots and to ban such equipment at the legislative level. As for the human-in-the-loop and human-on-the-loop concepts, the development of such systems should be monitored and checked for compliance with international standards. That is, all responsible decisions should always be made by a person with the appropriate knowledge and authorization, never by automation.

Judging by current trends, not all leading countries fully agree with the HRW report. By now, the prerequisites have formed not only for creating but also for actively using highly automated systems. Moreover, in a number of cases their use not only does not contradict international humanitarian law but, in a certain sense, even helps to comply with its norms. An example is Iron Dome, the Israeli missile defense system. Since this complex is designed to intercept unguided short-range rockets, its operating algorithms are built so that most operations are performed automatically. In addition, on the appropriate command from the operators, the entire interception cycle can be carried out automatically, from detecting an enemy rocket to launching interceptor missiles. Thanks to this, enemy Qassams can be destroyed before they reach populated areas. As a result of using this virtually autonomous robot, Israel manages to preserve the lives and health of its citizens and also to save on the restoration of destroyed buildings.
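The engagement cycle described above (detect a rocket, predict its impact point, intercept only if a populated area is threatened) can be sketched in toy form. This is only an illustration of the general idea under simplifying assumptions (flat earth, no air drag); all names and the physics shortcut are the author's invention here, not Iron Dome's actual logic:

```python
def predict_impact(x: float, vx: float, y: float, vy: float, g: float = 9.81) -> float:
    """Predict the ground impact point of an unguided ballistic rocket.

    (x, y) is the detected position in meters, (vx, vy) the velocity in m/s;
    drag is ignored, so the trajectory is a simple parabola.
    """
    # Time until the rocket falls back to y = 0 (positive root of the quadratic).
    t_flight = (vy + (vy**2 + 2 * g * y) ** 0.5) / g
    return x + vx * t_flight

def should_intercept(impact_x: float, protected_zones: list[tuple[float, float]]) -> bool:
    """Launch an interceptor only if the predicted impact threatens a populated zone."""
    return any(lo <= impact_x <= hi for lo, hi in protected_zones)
```

The key humanitarian point of the article survives even in this toy: rockets predicted to fall on empty ground are deliberately ignored, so interceptors are spent only on real threats.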

The second argument in favor of continuing the development of automated "soldiers" also has humanitarian premises. The use of large numbers of ground combat robots would make it possible to dispense with live fighters and save their lives. If a robot is damaged in battle, it can be quickly repaired, or scrapped and replaced with a new one fully identical to the old. And producing such equipment is orders of magnitude simpler and cheaper than raising and training soldiers. Obviously, a robot can go into battle soon after assembly, while a person after birth must grow up, learn basic skills, absorb a great deal of varied information, and only then take up military training. Thus, the widespread use of combat robots would help reduce losses of manpower. In addition, servicing a fairly large fleet of robotic "soldiers" would require a relatively small number of operators, mechanics, and so on. So with respect to replacing living soldiers with machines, the gain is double: lives are saved, and money is saved.

As for human rights defenders' fears about the excessive independence of combat robots, the leading countries have long had an answer ready. For example, a couple of years ago the United States published its strategy for developing military automated systems through 2036. The Americans will first develop so-called controlled autonomous systems, that is, combat vehicles able to operate autonomously but without the right to make serious decisions. Later, it is planned to bring fully autonomous vehicles into service with the armed forces, but the first prototypes of such technology, capable of truly taking on a person's duties, will appear no earlier than 2020. So over the next few years, or even decades, the battlefield will not see large numbers of fully automatic robots that know no pity or mercy and can only carry out orders. All major decisions will remain a person's responsibility.

Regarding giving robots greater autonomy, one rather interesting opinion should be recalled. Its supporters believe that it is the human who should be excluded from combat systems, not the automated hardware. As proof of this thesis, they cite the "design flaws" of living people. The operator controlling a combat robot, even one fully supervising all its actions, may fall ill, make a mistake, or even consciously take some criminal step. From this point of view, the "weak link" of a robotic combat complex is precisely the living human operator, in full accordance with the Latin proverb that to err is human.

At present, for obvious reasons, both points of view have a right to exist: the one proposing not to give robots freedom of action, and the one arguing for removing the human from the system. Each of these opinions has its advantages and disadvantages, and the debate is unlikely to end soon with the identification of the most promising and viable concept for using combat robots. There is only one way to find out who is right: to wait and see how combat robotics develops. The militaries of the world's leading countries are unlikely to choose an unprofitable and difficult path for developing such a promising field. However, it is quite difficult to draw any conclusions now. Most likely, the current trend will continue in the coming years. Remotely controlled and partially autonomous equipment will continue to develop and be actively used in practice, while radically new hardware and software systems capable of acting completely independently are created in the laboratories. The current state of such projects suggests that for years to come humans will continue to take full responsibility for the actions of robots, while the problems described in the Human Rights Watch report will remain the subject of interest of human rights defenders, science fiction writers, and scientists.


Based on materials from the following sites:
http://hrw.org/
http://lenta.ru/
http://mport.bigmir.net/
http://ria.ru/
http://bbc.co.uk/

26 comments
  1. +2
    December 27 2012
    With the exception of aerial drones, ground robots will not take their place on the battlefield any time soon.
    Still, it is inevitable.
    1. ksandr45
      +1
      December 27 2012
      If humanity does not destroy itself first, that is. Naturally, I do not believe that our country would be the initiator.
      1. +1
        December 27 2012
        As for robots, I recommend watching the YouTube channel of Boston Dynamics, a company developing robots specifically for the US Army.

        http://www.youtube.com/user/BostonDynamics?feature=watch
      2. +1
        December 28 2012
        Quote: ksandr45
        If humanity does not destroy itself even earlier.

        It will not have time to. Robots and computers are advancing by leaps and bounds. Very soon, America will begin testing robots in military conflicts (provided, of course, that its entire GKO dollar pyramid does not collapse in the coming 2-3 years).
  2. borisst64
    +1
    December 27 2012
    And yet the main question: how will a robot tell friend from foe?
    1. +1
      December 27 2012
      Quote: borisst64

      And yet the main question: how will a robot tell friend from foe?

      By a microchip implanted in the soldier's body.
      1. Skavron
        0
        December 27 2012
        By a microchip implanted in the soldier's body


        as an option
      2. fern
        0
        December 27 2012
        Unless human soldiers become a thing of the past. Sooner or later, the moment will come when manpower on the battlefield is too weak and no longer fits the concept of war.
    2. +6
      December 27 2012
      It's very simple: your own will be across the ocean, where they cannot be reached, and everything that moves on enemy territory is a stranger...
    3. Ura-1
      0
      December 27 2012
      It will shoot down everything that crawls and flies
      1. +2
        December 27 2012
        Damn mosquitoes in a swamp will be surprised :-)
  3. mamba
    +3
    December 27 2012
    One of Ray Bradbury's stories describes a fully autonomous drone with a self-learning program, designed to perform police functions over a city. The main algorithm of its actions was to protect whatever was in mortal danger. At first it really did destroy killers, and its brethren, exchanging information with one another, filled the city's sky. But in the course of self-improvement, they decided that switching off a car's ignition was an attempt on the car's life, and began to kill drivers who parked. Then it went further: switching off the light in a room was an attempt on the life of a light bulb, and they began to kill people in their houses. Then they began killing hunters, fishermen, and farmers. When people tried to get rid of them, they began hunting those who hunted them.
    1. +2
      December 27 2012
      Quote: mamba
      One of Ray Bradbury’s tales describes a fully autonomous drone

      You probably mean Robert Sheckley's story "Watchbird" ("Guardian Bird") =)
      1. mamba
        +1
        December 27 2012
        You are probably right. I have forgotten over the years. In my school and student days I read a lot of science fiction, including Sheckley.
      2. Kir
        +1
        December 27 2012
        Rumata, in "Guardian Bird" it all ended with people, grown unbearably clever, creating a hunter for the watchbirds, and it in turn began rethinking the situation. It is like treating the effects of one poison with another poison, only this poison also has brains!
  4. +2
    December 27 2012
    At this pace, a movie script come true is not far off. Of course, there are pros and cons. I would not want to fight for my life against a machine, though an EMP could help; then again, I doubt that by the time such an independent robot appears, protection against EMP will not have been invented. In general, progress does not stand still, and we will have to adapt to everything.
  5. WW3
    WW3
    +2
    December 27 2012
    Warrior robots
  6. +2
    December 27 2012
    Human-out-of-the-loop robots already exist: they shoot children in schools. They are created by the modern mass media. God save us from the mindless piece of iron as well!
    1. WW3
      WW3
      +3
      December 27 2012
      Quote: plebs
      God forbid us also from the foolish piece of iron!

      This is inevitable... A robot does not need sleep, it does not tire and does not lose vigilance, it simply shoots, knows no pity and has no fear... It can work where a human cannot: in a vacuum, at great depths underwater; in conditions of heavy radioactive or chemical contamination, it needs no protective equipment...
      The big problem, of course, is artificial intelligence; after all, this is not a game of chess... but scientific progress does not stand still... and the scripts of science fiction films may well become reality within this century... (I don't mean "Star Wars")...
  7. Kir
    0
    December 27 2012
    In fact, no one has repealed the fundamental laws of nature, and they can rightfully be carried over to robotics.
    In nature, the more highly organized an individual's psyche, the less predictable it is.
    Applied to robotics, this can be put roughly as follows: the more complex the technology, including its "brain," the higher the probability of unforeseen consequences; add to this possible unintentional and malicious errors in the "knowledge" base, and...
    Without being a complete pessimist, I still tend to think that either artificial intelligence is a matter for a future in which people can control all sorts of processes more reliably (though swindlers and "adventurers" will hardly cease to exist), or robots will remain merely "long arms."
  8. 0
    December 27 2012
    The references to fiction are touching. Do you really think that a scheme a writer invented over a cup of coffee will work?
    A robot requires fuel (the analogue of food, only a robot is far more voracious), and a robot requires maintenance (the analogue of rest).
    The identification problem is not as acute as it seems: our positions are here, the enemy's positions are there, and there are none of ours in the enemy's positions.
    1. Skavron
      +1
      December 27 2012
      Setrac, not so long ago a mobile phone was something out of the ordinary; now the most sophisticated models do not even remotely resemble those of 10 years ago. Only 10 years!!!
      Progress does not stand still, and in electronics and IT it moves by leaps and bounds. What was fantasy 20 years ago is now everywhere. By 2050 there will already be quite capable "thinking" robots... if, of course, we earthlings are still alive))
      1. 0
        January 1 2013
        Remind me, which science fiction predicted cellular phones? There is much else that science fiction writers failed to predict, so don't rely on them.
  9. WW3
    WW3
    +4
    December 27 2012
    Quote: Setrac
    A robot requires fuel (an analogue of food, only a robot is much more voracious), a robot requires maintenance (an analogue of rest).

    Wow... but a robot is just hardware, expensive though it is... it can be sacrificed, repaired, and so on... a human has no spare lives...
    Besides, we don't know what the robots' power sources will be in the distant future... and the robots themselves can be serviced by repair crews...
    Already today, robots are used effectively in mine clearance, saving the lives of sappers...
    Drones are developing too; a stealth drone is now being tested...
  10. Psychojoker
    +2
    December 27 2012
    The robot will still be controlled by an operator, since an AI able to navigate a situation deftly and determine exactly who is to blame and who is not is too complicated to develop, and it does not protect against errors. A man with a joystick is somehow more reliable.
  11. +1
    December 28 2012
    The end of the article should read:
    "...and the problems described in the Human Rights Watch report will still remain the subject of interest of human rights defenders, science fiction writers, scientists and spies."
    For the main task of this HRW nursery of spies is clear:
    to forbid the whole world to do what the United States can do. So read the report between the lines: autonomous robots are in the Pentagon, and they are being developed.

    And for that matter, isn't an anti-personnel mine already an autonomous robot, only built a little more simply than the Terminator?
  12. without
    0
    December 28 2012
    The hardware will be put together soon enough, but synthetic brains need time; and then people themselves will climb into those brains and say goodbye to the aging body.
  13. Jenya
    0
    January 2 2013
    Robot soldiers are certainly cool, but it is not so much the rise of the machines that worries me as the fact that, having reduced wartime losses to zero, mankind will start waging wars more often, because the only strong barrier to large-scale wars is losses. As someone said (I don't remember the name), it is well that war is so terrible, otherwise we should grow too fond of it. With robots, that will pass.
  14. 0
    January 12 2013
    real terminators
  15. 0
    January 10 2016
    Robotization can no longer be stopped; this is a fact to be reckoned with.