Demographic classification of targets in battle

Just ten years ago, one of the great challenges for AI was telling women's underwear from men's at the laundromat, or an adult's body from a child's on the Internet. Now the stakes of gender-difference analysis have risen: the US military has commissioned a concept for distinguishing women from men and adults from children as part of a target-selection system for Air Force pilots and drones. Such a statement of the problem opens a new stage in the cooperation between man and artificial intelligence.

One of the problems of war is achieving an acceptable balance between the losses of combatants and those of various categories of civilians. Whoever manages this balance gains a significant advantage. Until now, though, the main tool has been "loss tolerance": some hide behind children, women, and prisoners of war as human shields, while others try to solve an unsolvable problem: how many children may be sacrificed in exchange for one combatant. The natural effect of this confrontation is the creation of ever more selective weapons, and the past few decades have been marked by significant progress in this area: conventional nuclear weapons are giving way to neutron weapons, which cause far less damage to the environment for equal losses of manpower; the concept of weapons of mass destruction is being replaced by the concept of precision weapons; and even the classic "evil scientists", in the person of military chemists and biologists, are busy creating selective means of exterminating particular races or demographic groups. Yet despite these impressive successes, no decisive result has been achieved.

Especially high hopes are pinned on the military use of artificial intelligence. Today it is becoming clear that one can no longer throw a grenade into a crowd with the words "God will sort them out": the gods left the planet long ago, and a new, more reliable way is needed to separate our own from strangers and the military from civilians. In modern battle, the role of God is increasingly taken on by artificial intelligence. Until recently its use was limited to fire-control systems working on the "fire and forget" principle, or to drones, automatic killer aircraft that mindlessly shoot at everything alive on the surface at the order of distant commanders; now work on intelligent weapons is increasingly shifting from immediate killing to operations management. An illustrative example is the competition held a couple of years ago by the military department for "an intelligent means of reducing personnel losses during military operations in urban environments." The main requirement was that "this tool is not a weapon," and the recommended solution was to detect potentially dangerous subjects and alert soldiers to the threat.

The scandals of recent years have demonstrated that the accuracy and automation of weapons are pointless if there is no way to choose a target accurately: sometimes pilots fire on civilians by mistake, sometimes drones suddenly destroy everything alive because distant commanders decided to "destroy people who look like soldiers in order to save soldiers who look like people." It is precisely this problem that the concept, for which the US Air Force is promising a modest sum, is intended to solve. If a system can be developed that distinguishes adults from children and women from men, then in the future a pilot will be able to shoot into a crowd with a clear conscience: the AI will sort the targets no worse than the ancient gods did.

Interestingly, the development of such a classifier would remove an obvious obstacle facing cybernetic warfare. Until now, the soldier has implicitly been assigned the role of a kind of conductor of humanity in the inhuman time of war: not the commanding fathers, not the party and the government, and not the people as a whole, but the soldier delivered the final verdict. That is why, in all weapons systems to date, the final decision has always been left to a human. Now the AI becomes the moral equivalent of a person, and a simple optimization task is added to fire-control planning: maximize the losses of enemy personnel while minimizing losses among the civilian population.
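Read literally, the quoted goal is a trade-off between two quantities, and one conventional way to write such a trade-off is as a single weighted objective. The formula below is only an illustrative sketch of that reading: the plan variable a, the loss estimates E(a) and C(a), and the trade-off weight \lambda are notation assumed here for illustration, not anything specified by the Air Force request or by the article.

\[
a^{*} \in \arg\max_{a \in A} \bigl( E(a) - \lambda\, C(a) \bigr), \qquad \lambda > 0,
\]

where E(a) is the expected loss of enemy personnel under a candidate fire plan a, C(a) the expected loss among civilians, and a larger \lambda means a harsher penalty on civilian harm. The article supplies no such weight, and choosing it is exactly where the "ambiguous morality" discussed below would have to enter.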

There is no doubt that the AI will find this optimum better than a soldier equipped with nothing but bare empirics, yet it is not the evolutionary gain in reduced losses that matters in itself; far more important is that the decision is now made by the AI. With that, the last obstacle before the most effective means of warfare is removed, and drones will soon turn from dull telepresence aircraft into self-sufficient machines of selective destruction. Thus begins the war of man and AI, and it looks as though it will be the most humane war of all: exact equations of loss against ambiguous morality will settle the eternal war of the forces of reason with the forces of good.
4 comments
  1. Tyumen
    0
    26 September 2011 21:28
    This is dangerous. That way they could get to classifications like black vs. white, white vs. yellow, and so on.
  2. zczczc
    +2
    26 September 2011 21:32
    Now I understand: T-shirts with the words "I love NY" will be bought up by residents of countries about to be occupied, before the ground forces invade ...

    A madhouse, of course. Humanity must be done with the United States, and as soon as possible. The main thing is that nothing needs to be done in the military sphere for this - just have everyone refuse the dollar at once. Well, or not at once, but refuse.
  3. Fidain
    -2
    1 December 2011 03:10
    What is the point, if they bomb with ten-ton bombs and then say "Oops, sorry"?
  4. 0
    14 February 2013 23:35
    This is already dangerous. I don’t want the robot to make such decisions.