Saturday, February 2, 2019

People Are Violently Assaulting Robots and Scientists Are Worried

Via mysteriousuniverse.org by Sequoyah Kennedy

We just have to accept it. We’ve been dragged into the future, however hard we kicked and screamed. Robots are real, and they’re basically indistinguishable from the robots of our fictions. If you’re worried about these metal abominations coming to take everything you love, and would like nothing more than to go all John Connor and reduce the lot of them to slag, you’re very much not alone, and researchers are quite concerned.

In a January 19th New York Times article titled “Why Do We Hurt Robots?,” Jonah Bromwich cites a number of cases where people “brutally assaulted” robots. Poor, innocent “security robots” battered and beaten. Gangs of filthy meatbags attacking driverless cars. Three teenagers in Japan beating a robot “with all their might.” A Moscow man bludgeoning a “teaching robot” with a baseball bat as it pleaded for help. People, hilariously and ill-advisedly, crashing their own driverless cars on purpose (I respect the commitment to the fight, but those things are expensive). Another security robot was—just fight back the tears; tears can come later—wrapped in a tarp and covered in barbecue sauce. Humans are such horrible creatures that we manage to hurt things that don’t even have consciousness, let alone pain receptors.

And thus, in keeping with the new cultural norm of using words without any consideration for what they mean, the term “robot abuse” was born.


Destroying robots is wrong for the same reason that destroying someone else’s car, toaster, musical instrument, house, or computer is wrong: it’s property destruction. However hilarious covering a security robot in barbecue sauce is, a lot of money and time went into creating that machine. But that’s all it is: a machine. Yet we all anthropomorphize robots, from the people scared of a robot takeover to the people so deeply entrenched in robotics that they see property destruction as abuse.

Cognitive neuroscientist Agnieszka Wykowska, a researcher at the Italian Institute of Technology and the editor in chief of the International Journal of Social Robotics, thinks that our violence towards robots stems from the atavistic demons responsible for our tribalism and ostracization of the other. She says:

“You have an agent, the robot, that is in a different category than humans. So you probably very easily engage in this psychological mechanism of social ostracism because it’s an out-group member. That’s something to discuss: the dehumanization of robots even though they’re not humans.”

There’s a lot wrong with this statement. One: you can’t dehumanize something that isn’t human. Two: robots are not agents. They can only do what they are programmed to do. They literally have no agency.

Asked about potential solutions to this “disturbing” problem, Ms. Wykowska offers a story about the savagery a colleague witnessed a kindergarten class direct at a robot:

“Kids have this tendency of being very brutal to the robot, they would kick the robot, they would be cruel to it, they would be really not nice.

“That went on until the point that the caregiver started giving names to the robots. So the robots suddenly were not just robots but Andy, Joe and Sally. At that moment, the brutal behavior stopped. So, it’s very interesting because again it’s sort of like giving a name to the robot immediately puts it a little closer to the in-group.”

So the solution is to teach kids that machines are people? Fantastic. What if, and I know this might sound simple, ignorant, or possibly even on the wrong side of history, we don’t want machines as part of the in-group? What if assigning humanity to a tool cheapens and debases our own self-awareness and understanding of what it means to be conscious?

Let me just step up on this here soapbox for a minute. The whole problem is this shoving of the future on us whether we want it or not. The Times article cites a Brown University and M.I.T. study showing that adding a robot to a workforce reduced the number of employed humans by six. In the very near future, we will have armed security robots. We know these machines are coming, we know they’re changing the world, and we know it’s unavoidable. We need to treat them like the tools they are, not like people. We need, more than anything, to learn to celebrate humanity and all conscious non-humans (if I ever see a robot messing with an elephant, that’s going to be one broken robot), not to blur the lines between conscious creatures and machines. Forcing people to deny the basic truth that machines aren’t people will only make people angrier in the short term, and in the long term will only further the mechanization and dehumanization of the world at large.

But stop breaking other people’s robots.

