What Does a Humanoid Robot Do? I Asked a Humanoid Robot

Here's what it answered.

I am at ICRA – the IEEE International Conference on Robotics and Automation, aka 7,000 roboticists from all over the world converging to discuss every robot-related topic on Earth.

More posts are coming, but for now, day-of, here is a quick podcast interview (the first with video) with, and about, one of the robots I saw in action today – Ameca, by Engineered Arts.

After many an online video, this was my first time interacting with an Ameca robot in the (gray rubber) flesh. It was a different experience than watching a video, in much the same way a conversation in-person is different from watching a video of other people talking.

I don’t mean to suggest that the robot is the equivalent of a person or spurs the same thoughts and emotions. But it spurs some emotions and reactions, outside of conscious control, either innate or conditioned (which is not a subject for this post), that usually come up in a human conversation.

Yet I walked away cautiously optimistic about my ability to cope with people-like objects, on screens and in 3D life, as they proliferate.

There is little danger of anyone confusing an Ameca with a real person – not just because of its inhuman skin and eye color, but because (in this incarnation) it did not act like a person, with its eerily calm voice, unnaturally even intonation, and somewhat pat responses. (Others’ experience will vary, by design: Ameca has a number of available personas, and Engineered Arts sees it as a platform for developers to put their own AI into. I interacted today with one persona, at the Engineered Arts booth in the exhibitors’ hall.)

Still, the clarity of the boundary between human and robot (which Ameca brings up) gives me hope that people will not be easily fooled or manipulated into treating robots as people.

In fact, I find myself wondering if human-ish robots may actually be safer on that score than AIs, simply because AIs that “sound human” come to us over the same media as real people do. A text from a sophisticated AI doesn’t look or feel any different from a text from me (though I am still better at the content part of the message).

But a robot body can’t offer that sense of “this is the same as with a person.” The illusion – that machine and human are the same type of being – is hard to maintain.

So my guess is that conventions will soon develop for dealing with robots that act human. These conventions will be based on human-to-human norms, but they won’t be the same. Maybe, as some have suggested, those norms will include built-in distancing gestures by the robot, like Bertolt Brecht’s “distancing effect” in the theater, so the audience isn’t lulled into complacency and daydreams. For example, at Kodaiji temple in Kyoto, the Mindar “talking statue” robot was deliberately designed to show off a mechanical body under its human-like face. It’s a reminder that the robot is not human. Ameca looks rather similar to Mindar, actually.

Engineered Arts plans to give Ameca functional arms and hands (good enough to pick up a chess piece) and, later, legs it can walk on. But the company has built in other distancing effects – including that non-human-looking skin.

I think people will continue to feel sure that robots are not people – that the machines are, instead, representations of people. Like puppets, video game characters and characters in novels. (This is an idea that a number of thinkers have arrived at lately, from different starting points and in different disciplines. I’ll have more to say about this soon.)

So go my day-of thoughts about the experience of meeting Ameca. As with most encounters with a complex robot, it was much less cartoonish, and more complex, than what I had imagined. See what you think.
