Meet Erica and Borka.
Erica – 29 years old – lives not far from Budapest. At the age of 2, a road accident left her confined to a wheelchair.
Borka is a guide dog, trained to help disabled people.
Both of them are participating in a European Union research project, which aims to use their companionship as a model for interaction between humans and the robotic assistants of the future.
“Firstly, the physical help is very important – Borka picks up things that I’ve dropped, or fetches my basket, or opens doors or switches on the lights. But maybe even more importantly, she’s a real companion and a friend, she helps my social integration in many ways throughout the day.”
Thanks to cameras installed in a specially-designed room, researchers study how dogs position themselves, how they adjust the distance between themselves and their owners, how they interact, communicate, show initiative and follow orders.
“What we’re trying to find out are very simple behaviour elements, and these could be interpreted as some kind of algorithms. And I think that even a simple set of these could make the behaviour of the robot more believable, more living being-like, or more companion-like”.
When Erica opens a cabinet, Borka approaches to see if she can help. Quickly understanding Erica’s request, she picks up a glove and carries it to the table.
What if robots could do the same?
In Hatfield – a university town near London – researchers are testing robot prototypes in a real residential home to see how they perform in true-life situations.
The opening of the refrigerator door triggers a sensor that wakes up a robot, named Pioneer.
Navigating via the pattern on the ceiling, it approaches the user, offering help.
When the user returns to his desk, the activation of his computer screen sends a signal to Pioneer, who approaches with a box of juice.
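The trigger-and-respond pattern described above can be sketched as a small event dispatcher. This is purely an illustrative sketch, not the project's actual software; the event names and task strings are invented for the example.

```python
# Illustrative sketch: sensor events wake the robot and map to assistance
# tasks, as in the fridge-door and computer-screen examples above.
# All event and task names here are assumptions for the example.

class AssistantRobot:
    def __init__(self):
        self.awake = False
        self.log = []
        # Hypothetical mapping from sensor events to tasks.
        self.tasks = {
            "fridge_door_opened": "approach user and offer help",
            "screen_activated": "bring juice to the desk",
        }

    def on_event(self, event):
        task = self.tasks.get(event)
        if task is None:
            return None       # ignore unknown events, stay idle
        self.awake = True     # the sensor "wakes up" the robot
        self.log.append(task)
        return task

robot = AssistantRobot()
print(robot.on_event("fridge_door_opened"))  # approach user and offer help
print(robot.on_event("screen_activated"))    # bring juice to the desk
```

The point of the design is that the robot stays dormant until a sensor reports a change in the home, so it intervenes only when the user's activity suggests help might be wanted.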
“Two of the main ideas behind a robot like this are cognitive prosthetics, which is basically helping you remember things, and physical assistance. So, say you’re having a dinner party and you don’t want to carry food around – then you can put it on top of the robot, and then the robot can take it to anyone asking for a cup of coffee or something. This gets a little bit more important if the person is disabled, or has a walking stick.”
This robot tries to keep a comfortable distance from the user, measuring the range with laser, infrared and optical detectors. These are the first steps towards recreating social behaviour in man-made devices.
“Robots are not people, not animals – they are machines. But still they can be used as tools in order to investigate the creation of certain behaviour in these machines.”
This emphasis on interaction explains why many of the project’s leading scientists are not engineers but biologists, interested in recreating cognitive and behavioural processes that exist in the world of nature.
“Understanding, for example, how a dog attracts your attention, how the personality of the dog comes through the way it moves, allows us to build those ideas from studying dog behaviour into the robots. And, hopefully, we will have robots that in future aren’t perfect model dogs – there are perfectly good dogs out there to fill that need – but we’ll have some of the kind of cuteness and interaction that the dogs have with human beings, which will mean that the technologies are just going to be much easier to use.”
Unlike robots, dogs have natural emotions to express – reacting, for example, when someone enters the room. They watch their owners’ behaviour and act accordingly.
Dogs are interested in any sudden changes in their environment – such as new objects needing to be investigated.
And they’re good at sensing trouble.
“She’s going to use a device to play back a human cry… Look at the dog: it’s watching… a kiss… another kiss… it jumps – strong, clearly expressed emotions… vocalisation… That’s an alarm signal… And then – approaching the stranger, because she didn’t do anything – and she’s supposed to help!”
“There will always be a difference between dogs and robots, because they have different functions, so robots are probably more useful for helping people in very special situations, where there is some reliance on verbal communication and information exchange – this is not possible with dogs. And dogs will still have this ability to please humans in a very social way, and they are, of course, living beings that have their own special world which people enjoy, and I really hope that they will enjoy it in the future as well.”
So, robots are not set to replace our pets. But how close can they come to simulating real emotions?
Let’s visit Heriot-Watt University in Edinburgh to meet SARAH – a virtual character with an impressive capacity for seeing, talking and moving around.
“SARAH is basically a Social Agent Robot to Aid Humans, so SARAH is supposed to perform some tasks which are useful to the users in the lab – for example, carry the phone to the user.”
Just like Pioneer, the robot we met earlier, SARAH navigates following ceiling patterns.
But her “mind” can also be transferred to a range of different mechanical incarnations, present in different parts of the building – after all, her personality is just a software program.
“We came up with this concept of migration: it actually migrates the mind of the artificial companion to a different embodiment. So, for example, the robot’s mind can migrate to a handheld device, which the user can carry around everywhere, and then the character can migrate again from the handheld device to a graphical character on the screen”.
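The migration idea can be sketched as detaching the agent's state from one embodiment and re-attaching it to another. This is a toy illustration of the concept only; the class and field names are invented, not drawn from the project's software.

```python
# Illustrative sketch of "migration": the companion's "mind" (here just a
# dict of state) moves between embodiments, and the source goes dormant.
# All names in this example are assumptions.

class Embodiment:
    def __init__(self, kind):
        self.kind = kind      # e.g. "robot", "handheld", "screen"
        self.mind = None      # dormant until a mind migrates in

    def receive(self, mind):
        self.mind = mind

def migrate(source, target):
    """Move the mind to a new embodiment; the old body is left empty."""
    target.receive(source.mind)
    source.mind = None

robot = Embodiment("robot")
robot.mind = {"name": "SARAH", "user_prefs": {"drink": "juice"}}
handheld = Embodiment("handheld")
screen = Embodiment("screen")

migrate(robot, handheld)     # robot -> handheld device the user carries
migrate(handheld, screen)    # handheld -> graphical character on a screen
print(screen.mind["name"])   # SARAH
```

Because the personality is only data, the same companion can follow the user from body to body rather than being tied to one machine – which is the point of the quote above.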
Represented by an image on a flatscreen TV, SARAH can recognize a person in front of her and answer questions sent by SMS.
“We are also looking at creating and modelling a mind that is as close to human behaviour as possible. So we’re looking at different memory mechanisms that are useful in humans – for example, generalisation and retrieval mechanisms that will help SARAH to remember the users and their preferences, and to adapt her interaction to help them better”.
And, what’s more, she says it with a smile.
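The retrieval and generalisation mechanisms mentioned in the final quote could be sketched as a tiny preference memory: a known user's stored preference is retrieved directly, while an unknown user gets a generalisation over what most users prefer. This is an assumption-laden toy, not the project's actual memory model, and every name in it is invented.

```python
# Illustrative sketch: per-user retrieval plus a crude "generalisation"
# fallback (the most common value across all known users).

from collections import Counter

class PreferenceMemory:
    def __init__(self):
        self.prefs = {}   # user -> {preference key: value}

    def remember(self, user, key, value):
        self.prefs.setdefault(user, {})[key] = value

    def retrieve(self, user, key):
        # Retrieval: a known user's stored preference wins.
        if user in self.prefs and key in self.prefs[user]:
            return self.prefs[user][key]
        # Generalisation: fall back to what most users prefer.
        values = [p[key] for p in self.prefs.values() if key in p]
        return Counter(values).most_common(1)[0][0] if values else None

memory = PreferenceMemory()
memory.remember("erica", "drink", "juice")
memory.remember("tom", "drink", "juice")
memory.remember("ann", "drink", "tea")
print(memory.retrieve("erica", "drink"))     # juice (direct retrieval)
print(memory.retrieve("newcomer", "drink"))  # juice (generalised guess)
```

Remembering individuals while generalising to strangers is exactly the adaptation the researcher describes: the companion serves known users from memory and still makes a reasonable offer to someone it has never met.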