Stopping Robot Abuse Starts with Design
By Mia C. Sorgi, Business Design Lead
I attended TechCrunch Disrupt London last month and what an event it was. There was a lot going on, from a dazzling supermodel presence to game-faced venture capitalists to open source bionic arms, but one segment of the program captured people’s attention more than others: the appearance of Boston Dynamics’ Spot Mini robot on stage.
Seeing the Spot Mini in person created a mix of admiration (look at the cool robot!) and unease (look at the scary robot!). It’s kind of like a big dog at about 28kg, and — as with any big dog — it seems cute until you feel like it might kill you.
A video accompanied the talk, showing more of how they teach a robot to learn by ‘disturbing’ it and making sure it can handle whatever they put it up against.
Let’s back up. Boston Dynamics is an old-school robotics company now owned by Google. They’ve made waves in the past when similar videos of their engineers violently shoving their contraptions hit the internet. The internet was aghast!
Marc Raibert of Boston Dynamics responded to these critiques by saying that people were wrong to think of the engineer as being abusive; the robot is just a machine, after all. Moreover, these robots are like dear children to his team, and they’d never intentionally harm their own children out of cruelty.
There are two sides to this. Perhaps we will make the machines feel bad, or anger them. As a society, we have legitimate sci-fi fears as AI leaps forward; we often envision a world governed by evil robots who move like powerful spider-horses as they retaliate against us. Watch out!
But the other side — the side that got me wondering most — is more about how we treat the robots, not about how they treat us. What do our interactions with them say about our own temperaments and psychologies, and what design choices will impact those experiences?
Case in point: my three-year-old enjoys calling Siri a “dum-dum”. We all laugh, because it’s fine to be rude to a machine with a human voice, right? Siri doesn’t engage, but she does take it.
Or how about Alexa, waiting bravely in the kitchen to receive our next epithet, which she will stubbornly ignore? Screw you, Alexa, I hate that song!! (Admit it: one of the first things anyone does is insult Alexa, just to try it out.)
I’ve also met Pepper, the SoftBank Robotics “feeling” robot at the Accenture Centre for Innovation in Dublin, who is designed to read and experience human emotion. How did our meeting go? I laughed at her and told her to get away as I ran off down the hallway… (I was scared by her giant claw hands.) No need for me to demonstrate empathy — it’s elementary school all over again!
So when does it go from funny to evidence of our own poor behavior, ill temper, or abusive tendencies? Where do right and wrong come in when designing interactions with machines?
Siri may not be programmed to engage with the name-calling, but she’s also not helping teach my daughter the value of kindness and polite behavior. My daughter gets away with rudeness, whereas I’d prefer if Siri insisted she behave nicely. Alexa may withstand my shouting at her, but in the process, I’ve shown my kids it’s okay to shout at things for sport. Pepper? Well, with Pepper I’ve honed my Mean Girl social exclusion skills, and she can now internalize my rejection of her!
(And by the way, it’s worth pointing out that the machines in my examples all have feminized personas — does abusing “female” machines come more easily than abusing “male” ones? That’s a chilling thought right there…)
How do we preserve and promote a sense of decency in these interactions, be they chatbot or robot, and what obligation do we have to ourselves to preserve the elements of civility which we trust to stitch society together?
It seems increasingly urgent that we do. Not because the robots could get smarter and fight back, but because we will lose something fundamental about ourselves in the process.
I’m glad it’s upsetting to see someone kick a robot. Let’s hope it stays that way.