Eye Tracking and Robots: Early Interventions for Children at Risk for Autism

LUBBOCK, TX (NEWS RELEASE) - The following is a news release from Texas Tech University: 

Ask any new parent what they’re most excited about, and they’re likely to list getting to see their child achieve their first big milestones – their first step, their first word.
 
But before those big milestones are some that are even more important in determining if a child is developing normally, said Ann Mastergeorge, chair of the Texas Tech University Department of Human Development and Family Studies (HDFS) within the College of Human Sciences.
 
“When I talk to parents who are concerned, I’ll ask: Does your child point, does your child share things with you, do they reach out to show you something or give you something, and do they respond to their name?” she said. “If the answer is no, no, no to those things, of course it doesn’t mean they have autism, but they do have risk behaviors for autism.”
 
Now, thanks to some new technologies, Mastergeorge and her team are able to start helping children with those risk behaviors at a much younger age – as young as 12 months.
 
Eye tracking
The Research in Early Developmental Studies Laboratory, located inside the Burkhart Center for Autism Education and Research, looks almost like a daycare classroom. It has brightly painted walls, toys in colorful plastic bins on shelves and even a high chair. But facing the high chair is something you wouldn’t find in a daycare – a large black screen with a tiny camera on top. It’s an eye tracker, and it’s the key to Mastergeorge’s new study, which provides earlier intervention for children at risk for autism.
 
When parents bring young children into the lab to be tested, the child is placed in the high chair to watch images on the screen. The child is shown video clips from “Sesame Street” and “Peter Pan,” as well as images of a woman and a robot. As the child watches, the eye tracker calibrates to the child’s pupils and shows the researchers exactly which part of the screen the child is focused on.
 
Most importantly, the eye tracker remembers the calibration for each specific child, so it can show any changes in eye movement across future sessions.
 
“We’re looking at what children are looking at before intervention and after intervention to see how their gaze shifts change,” Mastergeorge explained.
 
Mastergeorge pointed to the screen, showing two characters in a room. She explained that a child with autism would likely focus on the background or unimportant details, while a child developing normally would focus on the characters’ faces.
 
“Children with autism don’t understand that’s where social communication happens,” she said. “What this allows us to do, both pre- and post-intervention, is to see whether or not they’re learning to look where the action is happening. This is just a very sophisticated method for us to be able to really know where kids are looking. So, we are using this technology in a very precise way to study that.”
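The pre- and post-intervention comparison Mastergeorge describes boils down to measuring how much of a child’s gaze falls on socially important regions of the screen, such as faces, before and after the 16-week program. A minimal sketch of that calculation is below; the function name, the sample format, and the toy data are all hypothetical, and real eye trackers export far richer fixation logs than this.

```python
# Sketch of the pre/post gaze comparison described above.
# Data format is hypothetical: each fixation is (x, y, duration_ms),
# and each face area of interest is a rectangle on the screen.

def face_gaze_proportion(fixations, face_regions):
    """Fraction of total fixation time spent inside face areas of interest."""
    total = sum(d for _, _, d in fixations)
    on_face = sum(
        d for x, y, d in fixations
        if any(x0 <= x <= x1 and y0 <= y <= y1
               for x0, y0, x1, y1 in face_regions)
    )
    return on_face / total if total else 0.0

# Toy data for one child: faces occupy two rectangles on screen.
faces = [(100, 50, 200, 150), (400, 50, 500, 150)]
pre = [(320, 300, 400), (150, 100, 200), (450, 400, 300)]   # mostly background
post = [(150, 100, 500), (430, 90, 400), (320, 300, 100)]   # mostly faces

print(round(face_gaze_proportion(pre, faces), 3))   # 0.222
print(round(face_gaze_proportion(post, faces), 3))  # 0.9
```

A rise in this proportion between the two lab sessions would correspond to the child “learning to look where the action is happening.”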
 
After the first eye-tracking session, study participants go through a 16-week, home-based intervention project. Program coach Jessica Blume, an HDFS graduate student, emphasized that the goal is to teach children how to interact with others, not just how to act.
 
“It’s not about teaching behavior correction,” she said. “It’s about shaping communication out of those breakdowns.”
 
Parents are asked to play with the child, using special toys and focusing on turn-taking, for 15 minutes a day and videotape their interactions once a week. The videos are sent to their program coach, who can provide suggestions for the parents to try.
 
“We just hope that over time, the more opportunities they provide, the more interested the child will get in those activities,” Mastergeorge said. “Initially, the child’s just wandering back and forth. We tell the parent just to play with the toys. Eventually, maybe you’ll recruit them back, even if it’s just for one time. We want the parents to feel like it’s OK – to be videotaped while your child’s not performing is anxiety-provoking. We want them to know we expect the child to go to the window, to walk around, to throw the blocks. But we start to see really big shifts by about week 10.”
 
After 16 weeks of home intervention, the child is brought back to the laboratory for another eye-tracking session – and the results of a pilot study have shown improvement in children’s ability to focus on where the action is happening.
 
“It’s a win-win situation, for them and for us,” Mastergeorge said. “We know that if we can get some of these behaviors changed very early in the child’s development with this early intervention, then it’s going to have a very different trajectory than for a child who doesn’t have any intervention until much later.”
 
Robots
For children who are already past the infant and toddler stages, Mastergeorge’s team is trying a groundbreaking method – using robots for the intervention.
 
Using the eye tracker, these children are shown images of a woman and a robot repeating the same activities and giving the same instructions.
 
“Robots are less social, so some researchers have said maybe they’re less complex or less intimidating,” said Rebecca Beights, a graduate student in the College of Education who leads the team’s robot research. “It may be more motivating for the kids to look at objects than people. So we want to see, are they looking at the robot in an attentive way to potentially learn from it? If a kid with autism responds more to the robot, that could be a start to get them to follow simple instructions. And with the eye tracker, we can really determine that.”
 
A pilot study using robots in interventions with these older children also has shown positive results. Vijayanta “V.J.” Jain, a junior computer science major and research assistant on Mastergeorge’s team, programmed the robots to give verbal instructions and perform movements for the children to imitate.
 
“A robot was presented to a kid in their usual therapy time, and I measured their engagement: How engaged are they with the instructions? Are they touching the robot? Are they actively listening to it? Body orientation, stuff like that, and also if they followed the instruction or not,” Jain said. “I did that over several therapy sessions, six with each child.
 
“It was interesting to find that engagement kept increasing with each exposure. It was not a completely linear, steady increase, but overall we found that between the initial and the final exposures there was an increase in engagement. We cannot really generalize, but it suggests that robots are not just a novel stimulus; they can be a salient instructional stimulus. They could be an effective tool in therapy sessions.”
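Jain’s distinction matters methodologically: a novelty effect would fade after the first exposures, whereas his data rose overall even without being strictly monotone. The snippet below illustrates that distinction with made-up scores; the six session values are purely hypothetical, not Jain’s actual measurements.

```python
# Hypothetical engagement scores (0-10) for one child across six
# robot sessions, mirroring the pattern Jain describes: not a
# perfectly steady climb, but higher at the end than at the start.
sessions = [3, 4, 4, 6, 5, 7]

# Was the increase strictly linear and steady? (No.)
strictly_increasing = all(a < b for a, b in zip(sessions, sessions[1:]))

# Did engagement still rise between the initial and final exposures? (Yes.)
overall_gain = sessions[-1] - sessions[0]

print(strictly_increasing)  # False
print(overall_gain)         # 4
```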
 
Mastergeorge said the study is groundbreaking in her field.
 
“This is one of the first studies to look at both robots and eye tracking with children with autism,” she said. “We’re on the cutting edge.”
 
As with the efforts for younger children, the robot study again emphasizes the importance of early intervention.
 
“With early intervention, we can see such dramatic and wonderful change and improvement,” Beights said. “If we’re able to get a child into early intervention and have it matched to what they need, both developmentally and to their attentional styles – that’s the idea of the robot versus human – matching the intervention to the kid and the family, it can be very effective. Both of those projects are looking at how to do that best.”
 
Anyone interested in being involved in Mastergeorge’s study may contact her at (806) 742-3000 or ann.mastergeorge@ttu.edu.
 
CONTACT:
Ann Mastergeorge, Rockwell Endowed Professor and department chair, Department of Human Development and Family Studies, College of Human Sciences, Texas Tech University, (806) 742-3000 or ann.mastergeorge@ttu.edu 

