I had hoped to pop in to see Cleveland Cutler at Boston University today to discuss the ‘nuclear renaissance’, but despite his agreeing to an interview, and despite numerous chasing phone calls and e-mails (including several chats with his administrator, who assures me Cleveland will ‘call me back’), there has been a deathly and, let’s be honest, outrageously rude silence from the professor – not even a simple “I’m sorry, I need to cancel”. Having extended my stay in Boston (and my imposition on Tracy’s hospitality) to make time for this meeting, I find this, well, just a bit arsey and disappointing. I must think of a suitably cutting joke about baldy geographers.
Instead I head into Cambridge to say hello to the personal robotics lab. Sadly, neither arch-enemy of lazy writers Polly Guggenheim nor uber-robotics pioneer Cynthia Breazeal is there, but the ever-friendly Dan Stiehl, who I met last time, is on hand and I do get to see Nexi, the lab’s latest sociable robot, in action. In tune with the lab’s focus on human–robot relationships, Nexi is interacting with a young boy, no older than eight, who finds Nexi’s human-like tracking of his movements as he dances in front of her enthralling. Despite being made of moulded white plastic, Nexi’s face can express a whole gamut of emotions – her big eyes blinking, her white plastic ‘eyebrows’ moving, her mouth running from slack-jawed boredom (when not much is happening) to tight-lipped interest (if the boy is doing something intriguing) or annoyance (if he gets too close). It’s startling to see how quickly all of us just accept Nexi as somehow sentient.
I give Nexi a personality. In fact, I can’t help myself, because this is a robot that acts in a, well, recognisably human way (and is therefore the exact opposite of Keanu Reeves). This is no accident. This is exactly what the Personal Robotics lab wants you to do.
“We put people at the core of what we’re trying to do,” explained Cynthia the last time I was here. “A lot of work in robotics is still very focused on technology, but in our lab we put these robots in front of real people so we can understand their impact. My group takes the relationship between social robots and people seriously and is trying to design for both sides. It’s not just about having the robot understand people; we’re trying to make people understand the robot, so you’re naturally able to use your own way of thinking about the world to understand what must be going on in the mind, so to speak, of the robot.”
Mind of the robot? Let’s not get into that here – I have two chapters that address the subject of Artificial Intelligence in the book…
In the meantime, check out this animation of Nexi that demonstrates her range of facial expressiveness.