Children diagnosed with autism spectrum disorder, a developmental condition that affects communication and behavior, often struggle to recognize the emotional states of the people around them. They generally cannot differentiate between happy, sad, angry, and fearful faces, for instance.
Now, researchers believe that robots have great potential to facilitate future therapies for children with autism. Recently, a team of researchers found a way to combine robotics with deep learning to further personalize autism therapy.
Deep Learning in Autism Therapy
Deep learning, also called deep structured learning or hierarchical learning, is a branch of machine learning. It is based on learning representations of data, as opposed to task-specific algorithms.
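As a minimal sketch of what “learning representations” means (a toy illustration with random, untrained weights, not the researchers’ model), each layer of a deep network transforms its input into a new representation rather than applying a hand-written, task-specific rule:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy input: 6 raw features (e.g., normalized sensor readings).
x = rng.standard_normal(6)

# Two stacked layers; in real training, these weights would be learned.
W1, b1 = rng.standard_normal((4, 6)), np.zeros(4)   # layer 1: 6 -> 4
W2, b2 = rng.standard_normal((2, 4)), np.zeros(2)   # layer 2: 4 -> 2

h = np.tanh(W1 @ x + b1)   # intermediate representation of the input
y = np.tanh(W2 @ h + b2)   # higher-level representation built on h

print(h.shape, y.shape)
```

Stacking such layers is what makes the learning “deep”: later layers build on the representations produced by earlier ones.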
Rosalind Picard, a lead researcher at MIT, stated that “Given that each case is different, personalization is becoming particularly important in autism therapy.” She added: “The challenge of producing machine learning and AI that will work in autism is especially vexing, since the usual methods require large amounts of data that are similar for every category that is learned. In autism, where heterogeneity reigns, the usual AI approaches tend to fail.”
Picard and her fellow researchers designed a personalized framework that can learn from the data collected on each child. They captured video of each child’s facial expressions, body movements, overall posture, and gestures, along with audio recordings and data on heart rate, body temperature, and skin sweat response from a monitor on the child’s wrist.
Using all those data, the researchers built the robots’ personalized deep learning networks. The researchers then compared the networks’ assessments of the children’s behavior with assessments from five specialists, who rated the video and audio recordings of the children on a continuous scale to gauge how pleased or upset, how interested, and how engaged each child appeared during the therapy session.
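One common way to combine data streams like these is to fuse per-modality feature vectors into a single input for a network that outputs continuous scores. The sketch below is an illustration of that general idea under assumed feature sizes and a random, untrained layer; the function names and dimensions are hypothetical, not the study’s actual architecture:

```python
import numpy as np

rng = np.random.default_rng(1)

def fuse_features(video, audio, physio):
    """Concatenate per-modality features into one input vector
    (a simple fusion strategy; sizes here are illustrative)."""
    return np.concatenate([video, audio, physio])

# Hypothetical per-child features extracted from the recorded data.
video  = rng.standard_normal(8)   # facial expressions, posture, gestures
audio  = rng.standard_normal(4)   # vocal features
physio = rng.standard_normal(3)   # heart rate, temperature, skin response

x = fuse_features(video, audio, physio)   # one 15-dimensional input

# A tiny output layer standing in for a trained, per-child model:
# three continuous estimates (e.g., valence, arousal, engagement).
W = rng.standard_normal((3, x.size))
scores = np.tanh(W @ x)

print(scores.shape)
```

In the personalized setting the article describes, the weights of such a model would be adapted to each individual child rather than shared across all children.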
A Study to Correlate the Engagement of Children with Autism to Therapy
The researchers conducted a study to gauge the engagement of children with autism during therapy. Thirty-five children with autism, aged 3 to 13, participated. The researchers used SoftBank Robotics’ NAO humanoid robots to help with the study.
The NAO robots are less than 2 feet tall and bear a resemblance to an armored superhero. NAO can express various emotions by altering the color of its eyes, the movement of its limbs, and the tone of its voice. Throughout the 35-minute therapy sessions, the children reacted to the NAO robots in a variety of ways, from looking bored and tired in some instances to laughing, looking excited, or touching the robot in others.
Overall, the deep learning-equipped robots’ assessments of the children’s responses agreed with the human experts’ assessments, with a correlation score of 60%.
Ognjen Rudovic, a postdoctoral researcher at the MIT Media Lab who is leading this project, stated: “Our long-term goal is not to build robots that will replace human therapists, but to improve the therapy sessions by providing the human therapists with critical information that they can use to personalize the sessions and also make more engaging and spontaneous exchanges between the machines and children with autism.”
Meanwhile, the correlation scores among the human observers in the study, who assessed the children’s engagement, were generally between 50% and 55%. Rudovic and his colleagues suggested that trained robots like the one in their study may eventually supply more consistent assessments of these behaviors.
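The agreement figures above are correlation scores. As a minimal illustration of how such a score is computed (with made-up ratings, not the study’s data), a Pearson correlation between an expert’s engagement ratings and a model’s estimates looks like this:

```python
import numpy as np

# Hypothetical engagement ratings for six segments of one session:
# an expert's scores versus a model's estimates (made-up numbers).
expert = np.array([0.2, 0.5, 0.9, 0.4, 0.7, 0.1])
model  = np.array([0.4, 0.3, 0.6, 0.7, 0.5, 0.4])

# Pearson correlation: 1.0 means perfect agreement, 0 means none.
r = np.corrcoef(expert, model)[0, 1]
print(f"correlation score: {r:.0%}")
```

A 60% score for the robots versus 50–55% between human observers means the model tracked the experts’ ratings at least as consistently as the experts tracked each other.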
How Deep Learning-Equipped Robots Aid in Autism Therapy
A personalized approach to autism therapy often works like this: to teach the child to identify expressions of fear, sadness, or joy, a human therapist shows the child pictures or flashcards of different faces that represent different emotions. The therapist then programs the robot to exhibit these same emotions and observes the child as they engage with the robot. The child’s behavior supplies the feedback that the robot and therapist need to move forward with the lesson.
“Most therapists admit that trying to connect with the children, for even a few seconds, can be a big challenge for them. But with robots, the attention of the children is easily captured,” says Rudovic. “Another thing is, humans change their expressions in myriad ways, but robots usually do it in the same way, and this is more suitable for children because they can learn in a very structured way how the expressions will be shown.”
Nevertheless, according to the research, this kind of therapy would only be effective if the robot could also reliably interpret the child’s behavior during the session, such as whether the child is excited or paying attention.
“A majority of the children in the study regarded the robot ‘not just as a toy’ but interacted with NAO respectfully, just as they would with a real person,” said Rudovic, “especially during storytelling, when the therapists would ask how NAO would feel if they treated the robot to ice cream.”