by Joseph P. Farrell, Giza Death Star:

This story may be taken as the “counterpart” to my blog last Monday about Bunny the dog, for it concerns the Pentagram’s desire to deploy artificial intelligence that will be able to “decide” on taking a human life. What is interesting to me is that many of you who shared the story about Bunny the dog also shared this story about the Pentagram:


Observe that the purpose of this “deployment” is perfectly clear: a machine is to be given the power to make life and death decisions over human life:

The United States government is on the verge of deploying new artificial intelligence technology (AI) weapons that can make decisions on whether to kill human targets.

The frightening lethal autonomous weapons, which are being developed in the United States, China, and Israel, will automatically select humans deemed a “threat” to the system and eliminate them.


If you’re thinking, “Wait a minute, isn’t this completely contrary to Isaac Asimov’s three laws of robotics?”, you’d be entirely correct, and indeed, the article does intimate that this step (not only by the Pentagram but by the militaries of other powers) is really an “inflection point” for humanity:

“This is really one of the most significant inflection points for humanity,” Alexander Kmentt, Austria’s chief negotiator on the issue, said in an interview.

“What’s the role of human beings in the use of force — it’s an absolutely fundamental security issue, a legal issue, and an ethical issue.”

But wait, there’s more; according to the article, the deployment in the Pentagram’s case is due to the massive Chinese population and its correspondingly and potentially very large military:

According to a notice published earlier this year, the US government is working on deploying swarms of thousands of AI-enabled drones.


US Deputy Secretary of Defense Kathleen Hicks said technologies such as AI-controlled drone swarms will allow the US to offset the numerical superiority of China’s People’s Liberation Army (PLA).


“We’ll counter the PLA’s mass with mass of our own, but ours will be harder to plan for, harder to hit, harder to beat,” she said, according to Reuters.

Interesting Engineering reported: The Pentagon is reportedly developing a network of hundreds or even thousands of AI-enhanced, autonomous drones that could be rapidly deployed near China in the event of conflict.

These drones would carry surveillance equipment or weapons and would be used to take out or weaken China’s extensive network of anti-ship and anti-aircraft missile systems along its coasts and artificial islands in the South China Sea. This development could potentially be a major shift in military strategy.

Frank Kendall, the US Air Force secretary, said AI drones would need to have the capability to make lethal decisions under human supervision.

I think it should be obvious and self-evident that it is a very short operational step from “the capability to make lethal decisions under supervision” to “one can envision circumstances that might arise where that human supervision is not obtainable or where a delay might jeopardize the operation,” and hence, the need to grant to the machine its own “decision making” capability. And this in a world where, as I blogged just a short while ago, a robot recently killed a worker in South Korea. To be sure, that robot did not “decide” to do so, nor, as far as we know, was it programmed to do so.

But the accident raises an important point about automated drones in warfare: suppose the Pentagram takes the decision to deploy such autonomous robots, and programs them to recognize Chinese military targets, including Chinese soldiers themselves. On what basis will it discriminate between, say, a Chinese soldier and a Korean or Vietnamese or Japanese soldier or sailor? Physiology? Uniforms? Or even – perish the thought – transponder signals? “Your friendly local robocop will not kill you if you have the proper transponder signal: sign up and get your sub-cutaneous transponder chip today, and protect yourself and your family from the consequences of illegality and non-personhood.”

Read More @