Thursday, November 8, 2007

Robot killing machines or psychopathic law enforcement officers?

Do you remember this scene from the 1987 film RoboCop?

You have 20 seconds to comply!

[Mr. Kinney points a pistol at ED-209]
ED-209: [menacingly] Please put down your weapon. You have 20 seconds to comply.
Dick Jones: I think you better do as he says, Mr. Kinney.
[Mr. Kinney drops the pistol on the floor, but ED-209 advances, growling]

ED-209: You now have 15 seconds to comply.
[Mr. Kinney turns to Dick Jones, who looks nervous]
ED-209: You have 10 seconds to comply.
[Entire room of people in full panic trying to stay out of the line of fire, especially Mr. Kinney]
Kinney: Help me!

ED-209: You have 5 seconds to comply... four... three... two... one... I am now authorized to use physical force!
[ED-209 opens fire and shreds Mr. Kinney]

At least Mr. Kinney had 20 seconds' warning. The nine military personnel recently slaughtered by an out-of-control robot had no warning: Robotic Cannon Mysteriously Kills 9 Soldiers.

"It is assumed that there was a mechanical problem, which led to the accident. The gun, which was fully loaded, did not fire as it normally should have," he said. "It appears as though the gun, which is computerised, jammed before there was some sort of explosion, and then it opened fire uncontrollably, killing and injuring the soldiers."

Young says he was also told at the time that the gun's original equipment manufacturer, Oerlikon, had warned that the GDF Mk V twin 35mm cannon system was not designed for fully automatic control. Yet the guns were automated. At the time, SA was still subject to an arms embargo and Oerlikon played no role in the upgrade.

Young says in the 1990s the defence force's acquisitions agency, Armscor, allocated project money on a year-by-year basis, meaning programmes were often rushed. "It would not surprise me if major shortcuts were taken in the qualification of the upgrades."


The questionable reliability and safety of robotic weapons is clearly an important issue, and one that is likely to come up again. According to the Pentagon: Our new robot army will be controlled by malware

"A US defence department advisory board has warned of the danger that American war robots scheduled for delivery within a decade might be riddled with malicious code. The kill machines will use software largely written overseas, and it is feared that sinister forces might meddle with it in production, thus gaining control of the future mechanoid military.

Apparently the FCS programme office has admitted that there is a "low to moderate risk that malicious code could be inserted... and exploited.""

So, just as your computer frequently crashes because of incompatibilities between hardware and software, robotic weapons developed under FCS projects will inevitably make mistakes from time to time.

The US Army's Future Combat Systems (FCS) programme

The FCS project website tells us that it is "the Army's modernization program consisting of a family of manned and unmanned systems, connected by a common network, that enables the modular force, providing our Soldiers and leaders with leading-edge technologies and capabilities allowing them to dominate in complex environments."

The first barrier to a fully functional robot army is technical -- no one has created a reliable, effective way to make robots truly autonomous.[…]

The DoD estimated in 2006 that the total investment in robotic research from 2006 to 2012 would be $1.7 billion [source: Development and Utilization of Robotics and Unmanned Ground Vehicles].[…]

A major goal of the FCS project is to create a universal platform that the Army and other forces can incorporate into military systems from now on. One of the challenges the military has faced over the years is that it relies on a mix of equipment, vehicles and software that aren't integrated with one another, making battle coordination and tactical discussions difficult.
http://electronics.howstuffworks.com/robot-armies2.htm

If you are a front-line soldier whose survival depends on the performance of an unmanned drone ahead of you, would you feel safe knowing there was a moderate risk of the drone malfunctioning? The answer is no. So why is unreliable technology being put into the theatre of war?

The answer must have to do with cutting costs. Suppliers such as iRobot Corp are in the business of making weapons for profit; as with any business, their interest is in increasing sales and profits for shareholders. So if it is cheaper to outsource the programming to a sweatshop in India, with all the communication problems that might arise, then that is what is done.

So what if soldiers die at the hands of misfiring automated weaponry? Or if the elderly, the disabled and children die from the use of a Taser-wielding robot? It won't affect the company's profits if there are a few casualties...

Then there are the ethical issues:

Would a country with an armed robotic force be more likely to invade another country, knowing the invasion would likely result in very few casualties? By removing the human element from war, do we make it even more inhumane? When a robot breaks down during a mission, do we risk sending humans in to retrieve and repair it? Can we be sure that robots will know when to stop attacking when an enemy surrenders?
http://electronics.howstuffworks.com/robot-armies4.htm

Coming to a street corner near you...

The US military has deployed robotic weapons in its current empire-building campaign (Gun Toting Robots See Action in Iraq), but how long will it be before armed robots are in place in civilian areas? It won't be long before we see the Rise of the Machines: Military robots to be armed with Tasers:

RoboCops and robot soldiers got a little closer to reality Thursday as a maker of floor-cleaning automatons teamed up with a stun-gun manufacturer to arm track-wheeled 'bots for police and the Pentagon.

By adding Tasers to robots it already makes for the military, iRobot Corp. says it hopes to give soldiers and law enforcement a defensive, non-lethal tool.

Non-lethal tool? Amnesty International reports 152 Taser-related deaths in the US.

The article continues:

"I could see rent-a-cop companies wanting to buy it, I can see corrections departments wanting to buy it, because it might be seen as a cost-effective alternative to having a human guard patrolling a perimeter," Pike said.

"For now, as soon as you let go of the joystick, the robot just sits there," Pike said. "So questions of moral agency don't arise - that is to say, whose finger is on the trigger. But a little further down the road, when these ground vehicles do achieve greater autonomy, there may be no human finger on the trigger."

As commented on SOTT.net:

"questions of moral agency DO arise.- What is the difference between a robot and a psychopath taking life or death decisions over others if they are both equally heartless?"
You only have to read the news to see how psychopaths in the Police force may just as well be conscienceless robots:
Chicago Cop Being Investigated for Tasering an 82-Year Old Woman

Wheelchair-Bound Woman Dies After Being Shocked With Taser 10 Times

UK police are told they can use Taser guns on children

If our current law enforcement officials have no regard for human life, then will an expensive lump of metal wielding weapons controlled by faulty software make you feel any safer? The answer is no. Will police officers be free from incrimination, with the blame shifted onto the technology manufacturers instead? The manufacturers will no doubt be able to negate any liability under corporate protection laws. Will we see more reports of software malfunctions ending in bloodshed? Will US, NATO or Israeli air strikes that kill civilians be blamed on computer malfunction rather than human error?

It is convenient to put the blame for loss of life onto someone or something else rather than face up to the truth. The fact is that the powers that be are spending billions of dollars on technology that is answerable to no one. Blame-shifting is one of many tactics used by psychopaths, who will do whatever it takes to fulfil their own political agendas and create, as the fruits of their labour, the psychopathic (ponerised) societies in which we are now living.

The reader is invited to read Ponerology of Apathy and War:

"Understanding the science of ponerology presents a real and viable opportunity to create a life-affirming world rather than the current world ruled by the inhuman laws of war and apathy."

You have 20 seconds to comply!
