Last year, the Austin City Council approved the use of delivery robots in town. The robot delivery company in question is called Starship Technologies (even though based on their robot design Droid Technologies would probably have been a better... wait, that’s been taken twice, hasn’t it?)
Here’s a photo gallery, in fact, of the little guys visiting Austin:
So it was interesting to read in Business Insider that Starship has admitted people will sometimes, well, abuse the robots.
"Some people pass our robot and kick the robot a little bit," Heinla told Business Insider. "That's not really a problem I think, if people have such anger management techniques that's fine by us, our robot just drives on."
The piece also notes a 2015 study in which folks displayed “anti-social” behavior towards a robot in a Japanese shopping mall.
But it was this sentence that jumped out at me:
“Amid all the controversy about Google weaponising AI and fears over Boston Dynamics' door-opening robot dogs, perhaps we should actually be worried about how humans treat tech, rather than the other way around?”
Now, with all due respect to the writer of this particular piece: ARE YOU KIDDING ME?
There is literally an entire show about this on HBO. It is called “Westworld” and it is about humans abusing the living crap out of robots -- emotionally, sexually, physically. Then the robots start taking their revenge. (The 2003-2009 show “Battlestar Galactica” explored similar themes.)
But robots don’t have souls, you might argue. They are not alive like a human or an animal or even a bug.
And yet, the debate continues: To what extent does an entity have to prove its consciousness to be treated as, well, something with consciousness?
Let’s cut to the bottom line: Humans have absolutely no problem abusing other humans/pets/animals/the environment. Let’s just go ahead and assume that humans are going to treat tech with no more respect than they treat anything else. What that means for us, and for technology designed to imitate people, is ours to determine.
Which is to say: When the robots come to Austin, people are going to kick them.