Stop killer robots, before it’s too late

Perhaps it seems that the gulf between Democrats and Republicans, Catholics and Protestants, Californians and Oregonians, is too vast to overcome.

Everything divides us these days — gun control, free trade, and whether or not a hot dog is a sandwich.

Yet there is no need to get all down in the dumps. We’ve scoured the globe to find an issue that a wide majority of people can get behind — and must get behind — in order for our world to be peaceful and supportive of human life.

It’s the worldwide Campaign to Stop Killer Robots. Yes, you read that right. The idea of killer robots is no longer Hollywood sci-fi: It’s real, it’s really important, and it’s really scary. Dozens of international organizations, most of them devoted to peace, disarmament and human rights, have joined forces to lobby governments, corporations and thinkers the world over to confront this issue before it’s too late.

Stuart Russell, a leading AI scientist at the University of California, Berkeley, told the United Nations last year that “pursuing the development of lethal autonomous weapons would drastically reduce international, national, local, and personal security.”

As the campaign argues, the expanded use of unmanned vehicles over the past decade has already changed warfare. It is now possible to create fully autonomous weapons that can fire and “think” on their own, without any human intervention. That raises tremendous ethical, legal, moral and technical concerns that humanity and governments have yet to grapple with.

The campaign argues that “giving machines the power to decide who lives and dies on the battlefield is an unacceptable application of technology. Human control of any combat robot is essential to ensuring both humanitarian protection and effective legal control. A comprehensive, pre-emptive prohibition on fully autonomous weapons is urgently needed.”

To that end, international treaties calling for a pre-emptive ban are necessary. And there is no time to waste.

Militaries in many nations, including the United States, China, Russia and the United Kingdom, are already moving toward systems that would give greater combat autonomy to machines. Such systems, a step beyond today’s remote-controlled warfare drones, could mark the beginning of a robotic arms race. If we head down that road technologically, it may soon be difficult to change course.

“Allowing life or death decisions to be made by machines crosses a fundamental moral line,” the campaign argues. “Autonomous robots would lack human judgment and the ability to understand context. These qualities are necessary to make complex ethical choices on a dynamic battlefield, to distinguish adequately between soldiers and civilians, and to evaluate the proportionality of an attack.”

The scientific community is largely united in wanting hard-and-fast rules and established international norms, such as those we have set for biological and nuclear weapons.

This may seem futuristic and hypothetical, but it won’t for much longer. Now is the time to give thought to how to limit robotic weaponry, and to the rules and rule-enforcing bodies that will allow us to benefit from these technologies rather than be destroyed by them.
