Monday, November 20, 2017

It May Be Time to Fear Swarms of Autonomous Slaughterbots

Via mysteriousuniverse.org by Paul Seaburn

The good guys call themselves the Future of Life Institute. The bad guys are smart drones called Slaughterbots that can swarm a crowd yet kill precisely, delivering an explosive to the forehead of selected individuals while letting the others run in terror … or cheer the killing. While it sounds like a great plot for a dystopian movie or a sci-fi series, the Future of Life Institute is a real organization. And the Slaughterbots?

The “Slaughterbots” video was released this week by the Future of Life Institute at the United Nations Convention on Certain Conventional Weapons held in Geneva. The purpose of the Convention “is to ban or restrict the use of specific types of weapons that are considered to cause unnecessary or unjustifiable suffering to combatants or to affect civilians indiscriminately.” The mission of the Future of Life Institute is “To catalyze and support research and initiatives for safeguarding life and developing optimistic visions of the future, including positive ways for humanity to steer its own course considering new technologies and challenges.” The purpose of the video is “safeguarding life” from tiny autonomous armed drones that can kill without human initiation based on things like data collected from social media.

Uh-oh.


The Future of Life Institute created the fictional (for now) video in conjunction with the Campaign to Stop Killer Robots, an international coalition working to preemptively ban fully autonomous weapons. It sounds like the Convention on Certain Conventional Weapons is the perfect place to fight for such a ban … if it’s not too late.

Stuart Russell (Professor of Computer Science and Smith-Zadeh Professor in Engineering at the University of California, Berkeley, and a leading AI researcher) closes the video with a warning about Slaughterbots:

“Its potential to benefit humanity is enormous, even in defense. But allowing machines to choose to kill humans will be devastating to our security and freedom. Thousands of my fellow researchers agree. We have an opportunity to prevent the future you just saw, but the window to act is closing fast.”

The goal of the Campaign to Stop Killer Robots is not to restrict innovation in AI and drones but to stop the use of AI to select and kill targets without a human in the decision-making process or pulling the trigger. The AI technology shown in the video is admittedly fictional, but its components are recognizable in current devices, drones, and social media.

Perhaps the most disturbing part of the video is not the killing but the cheering. That part is just as easy to imagine happening. Maybe that’s what the powers-that-be should focus on when deciding whether to ban Slaughterbots.

Thoughts?

