Meet the Robot Baby Whisperer
Inside Children’s Hospital Los Angeles (CHLA), multiple high-definition digital cameras are set up in a fixed rig around a 22-inch humanoid NAO robot and a similarly sized human infant, as part of a study by the Infant Neuromotor Control Lab and the USC Interaction Lab. When the robot makes a specific leg movement, the infant hesitates, then copies it. Then the infant shifts, wriggles and kicks its other leg, which the robot mimics easily, to the infant’s surprise and delight.
The NAO robot, developed by Aldebaran (now part of the United Robotics Group), is a sophisticated machine well suited to healthcare research studies such as this one. The robot has seven touch sensors (on its head, hands and feet), two ultrasonic sonars that let it estimate distances to obstacles, and an inertial unit and joint sensors that let it move, mimic and track its position within an environment. The NAO is a fixture of the global healthcare socially assistive robot market, which is projected to reach $38 billion by 2031.
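For readers curious how researchers talk to the robot, Aldebaran ships a Python SDK (NAOqi) whose proxies expose the sonars and motors directly. What follows is a minimal sketch, not the lab’s actual code: the IP address, subscriber name and joint choice are illustrative assumptions.

```python
# Minimal sketch of reading NAO's sonars and commanding a leg joint via the
# NAOqi Python SDK. The robot's address and the joint chosen are assumptions.
from naoqi import ALProxy

NAO_IP = "192.168.1.10"  # placeholder address for a robot on the local network
PORT = 9559              # NAOqi's default port

motion = ALProxy("ALMotion", NAO_IP, PORT)
memory = ALProxy("ALMemory", NAO_IP, PORT)
sonar = ALProxy("ALSonar", NAO_IP, PORT)

# Start the ultrasonic sonars, then read the left transducer's distance
# estimate (in meters) out of the robot's shared memory.
sonar.subscribe("sonar_demo")
left_distance = memory.getData("Device/SubDeviceList/US/Left/Sensor/Value")
print("Nearest obstacle on the left: %.2f m" % left_distance)

# Stiffen the motors and raise the left leg slightly: the kind of single,
# discrete movement an infant in the study might be shown.
motion.setStiffnesses("Body", 1.0)
motion.setAngles("LHipPitch", -0.4, 0.2)  # joint, target angle (rad), fraction of max speed
```

Because setAngles takes one joint, one target angle and a speed fraction, single discrete leg movements like the ones shown to infants are easy to script and repeat.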
Lauren Klein, a fifth-year PhD student at USC’s Interaction Lab, has been using the NAO in the lab’s ongoing research project, which has been running for over five years in collaboration with Dr. Beth A. Smith’s Infant Neuromotor Control Lab at CHLA. The aim of this research is to “develop a socially assistive, non-contact, infant-robot interaction system to provide contingent positive feedback to increase exploration and expand early movement practice.”
Simply put, it’s often hard to find out if an infant has developmental delays until it’s too late to really help them. Movement is how infants explore the world, and develop healthy brain-body connections. By using a robot to both elicit and mimic movement, researchers can find out much earlier if intervention is required to help the child develop correctly.
By using head-mounted eye tracking on the infants, the research team obtained rich data streams showing that infants demonstrate what is known as “predictive gaze” (looking toward the robot in anticipation of its response) once they learn that the robot’s reward is contingent on their behavior.
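In practice, predictive gaze can be scored by checking whether a look lands on the robot shortly before the robot actually moves. The sketch below is a hypothetical illustration of that scoring, not the team’s actual pipeline; the data format and the one-second anticipation window are assumptions.

```python
# Hedged sketch: flag a fixation as "predictive" if it lands on the robot
# shortly BEFORE a robot movement. Field names and the one-second window
# are illustrative assumptions, not the lab's pipeline.

ANTICIPATION_WINDOW = 1.0  # seconds before a robot movement that count as predictive

def predictive_fixations(fixations, robot_move_times):
    """fixations: list of (timestamp_s, target) pairs, e.g. (4.6, "robot").
    robot_move_times: timestamps (seconds) at which the robot moved."""
    predictive = []
    for t, target in fixations:
        if target != "robot":
            continue
        # Predictive if some robot movement begins within the window after this look.
        if any(0.0 < move_t - t <= ANTICIPATION_WINDOW for move_t in robot_move_times):
            predictive.append(t)
    return predictive

# Toy example: the infant looks at the robot at t=4.6s; the robot kicks at t=5.0s.
print(predictive_fixations([(4.6, "robot"), (9.0, "parent")], [5.0, 20.0]))  # -> [4.6]
```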
To date, twenty-six infants from a range of backgrounds and ethnicities have participated in the research trial.
Robots and Infants
After graduating from Cornell, Klein interned at NASA JPL, working alongside Caltech researchers on a machine learning team supporting the Deep Space Network, and then began looking for funded research that could take her ideas to the next level. An advisor suggested checking out the Interaction Lab at USC, which is how she ended up working on this current project.
“Our team showed that infants were able to learn that their leg movement activated the robot - this is known as a contingent learning task,” said Klein. “Once we had that foundational data in place, we could also prove that infants grasped that when they changed their leg movements, the robot made its own movement in response.”
Essentially, proof that the infant’s neuromotor skills are in good order and that the infant understands what’s going on.
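One common way to operationalize a contingent learning task like this is to compare the infant’s movement rate during a baseline phase against the phase in which the robot rewards movement. The sketch below is illustrative only, and the 1.5x threshold is an assumption rather than the study’s actual criterion.

```python
# Hedged sketch of a contingent-learning check: did the rewarded leg movement
# become more frequent once the robot started responding? The 1.5x ratio is
# an illustrative assumption, not the study's criterion.

def movement_rate(kick_times, duration_s):
    """Kicks per minute, given kick timestamps within a phase of known length."""
    return len(kick_times) / (duration_s / 60.0)

def learned_contingency(baseline_kicks, baseline_s,
                        contingent_kicks, contingent_s, ratio=1.5):
    baseline_rate = movement_rate(baseline_kicks, baseline_s)
    contingent_rate = movement_rate(contingent_kicks, contingent_s)
    return contingent_rate >= ratio * baseline_rate

# Toy example: 6 kicks in a 2-minute baseline vs. 14 kicks in 2 contingent minutes.
print(learned_contingency(list(range(6)), 120, list(range(14)), 120))  # -> True
```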
USC Interaction Lab
The NAO is just one of many robots used by researchers at the USC Interaction Lab. On the day we visited, three iPals (AvatarMind) stood at attention. Each is 3 feet tall, weighs 27 pounds, and contains 25 motors and 19 sensors - and someone had amusingly arranged their hands (a robot’s end effectors) to portray Mizaru, Kikazaru and Iwazaru, the three wise monkeys of “see no evil, hear no evil, speak no evil” fame. Resting on the floor at the base of the iPals’ articulated limbs were a couple of Sony aibo robotic dogs, while a QTrobot (LuxAI), an expressive social robot designed for autism research, looked down from the top of a nearby filing cabinet.
The Interaction Lab is entirely focused on robots deployed in socially assistive research projects, bringing positive effects to disparate communities with specific challenges: mental health in adolescents and university-aged students, stroke rehabilitation in seniors, and human-machine interaction for early detection of dementia.
“Another new project is using machine learning to help individuals with physical disabilities learn to program and become part of the digital economy,” added Maja Matarić, PhD, the lab’s founding director.
Socially Assistive Robot Futures
At a retail price of nearly $18,000 per unit, the NAO isn’t a feasible option for large-scale rollout. That said, once the research is done and the code is in place, the work can be ported to much cheaper robots.
“The ultimate goal of our research is to deploy our code on affordable socially assistive robots,” Klein confirmed, “to allow in-home interventions that use play to improve healthy development in young children.”
This vision, however, is still some way off. Academic research is meticulous, stretching over many years through trials that painstakingly ensure results are reproducible before they can be used in real-world applications. But using the NAO robot today, in trials like these, is just the beginning.