Apr 24, 2017 | Atlanta, GA
When neuroimaging took gigantic leaps forward in the 1970s and '80s with the introduction of computed tomography (CT) and magnetic resonance imaging (MRI), it was a sign of just how closely advances in medicine and diagnostics track technological advances within the field.
Suddenly, researchers could observe and document the brain in living subjects far more safely, opening a world of study that had previously been unattainable. Understanding of medical conditions, and of the effects of alcohol and drugs on the brain, increased dramatically.
That kind of technological advancement, one that drastically moves the needle of study in the field forward, hasn’t been as prominent in the field of behavioral psychology. It’s a challenge that many researchers in the Georgia Institute of Technology’s School of Interactive Computing (IC) are trying to overcome.
“The tools that exist today in neuroimaging as compared to 50 or 60 years ago, there’s just no comparison,” IC Professor Jim Rehg said. “People are looking at resolutions or structures in the brain that weren’t even on the map 50 years ago. We’re just trying to bring the behavioral measurement forward in the same way that is already happening for imaging and genetics.”
Rehg is one of a number of faculty focusing their efforts on developing new computational analysis tools to measure behavior. A key goal of the work is to improve understanding of Autism Spectrum Disorder, a complex group of disorders of brain development characterized by repetitive behaviors and difficulties in social interaction and communication.
He began collaborating with Professor Gregory Abowd and senior research scientist Agata Rozga, among others, during a five-year National Science Foundation Expeditions in Computing grant and has subsequently continued his work with Rozga under a grant from the Simons Foundation. The former grant was instrumental in setting up the Child Study Lab at Georgia Tech, which studies early social, communication, and play behavior in children, including those with autism.
Tracking Problem Behaviors With Technology
More recently, Rozga, the director of the lab, received – along with Associate Professor Thomas Ploetz and Dr. Nathan Call of the Marcus Autism Center – an NIH R21 grant for a project titled Objective Measurement of Challenging Behaviors in Individuals with Autism Spectrum Disorder.
The latter research deals with problem behaviors exhibited by individuals with autism. Call, who is the director of Behavior Treatment Clinics at the Marcus Autism Center, described the challenge the research aims to address.
“Individuals with autism and other developmental disorders are more likely to exhibit problem behaviors like self-injury, pica, or property destruction,” he said. “Behavioral interventions exist, and can be very effective, but there are a few barriers. Data collection on the behavior is a key ingredient, but is most often done by a human observer, which is expensive, has the potential for reactivity, doesn’t work for covert behaviors, cannot always provide a good estimate of severity, and may not always be accurate.”
The project involves the use of accelerometers and machine learning to develop a measurement system that will detect and differentiate between different types of problem behavior in a way that addresses each of those challenges.
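The article does not describe the team's models, but the general idea of differentiating behavior types from wrist- and ankle-worn accelerometer data can be illustrated with a toy sketch: compute a few simple features per window of sensor data, then assign each new window to the nearest class centroid. The feature set, the "calm"/"vigorous" labels, and the nearest-centroid model below are illustrative assumptions, not the project's actual pipeline.

```python
import numpy as np

def window_features(w):
    # Crude per-window descriptors: mean, spread, and peak magnitude
    return np.array([w.mean(), w.std(), np.abs(w).max()])

def nearest_centroid(train_X, train_y, x):
    # Assign x to the class whose mean feature vector is closest
    labels = sorted(set(train_y))
    centroids = {c: train_X[train_y == c].mean(axis=0) for c in labels}
    return min(labels, key=lambda c: np.linalg.norm(x - centroids[c]))

# Synthetic training windows: low-motion vs. high-motion movement
rng = np.random.default_rng(1)
calm = [window_features(rng.normal(0, 0.2, 50)) for _ in range(20)]
vigorous = [window_features(rng.normal(0, 3.0, 50)) for _ in range(20)]
X = np.array(calm + vigorous)
y = np.array(["calm"] * 20 + ["vigorous"] * 20)

test_w = rng.normal(0, 3.0, 50)  # an unseen high-motion window
print(nearest_centroid(X, y, window_features(test_w)))  # → vigorous
```

A deployed system would of course use richer features and supervised classifiers trained on labeled clinical recordings, but the shape of the problem – windows of movement data mapped to behavior categories – is the same.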
In the best of circumstances, such as a research or clinical setting, videos can be recorded and research assistants can review them to find the moments where a child engages in a particular behavior. That is currently the standard. As Rozga noted, though, the approach neither scales to large samples nor allows researchers to study behaviors outside the strictures of a research setting.
The approach, then, is to combine currently available wearable technology with computational analysis to see whether the combination can advance the state of the art.
Using sensors attached to the wrists and ankles, the team records movement data from the individual.
“From a technical point of view, we want to know whether we can see when an activity starts, when it ends, and of what nature that activity actually was,” said Ploetz, who has worked with Rozga in the past and joined the IC faculty in February of this year. “An automated recognition of problem behaviors is a substantial challenge that involves capturing through sensors and analyzing through machine learning-based assessment techniques.”
The hope is that they can build statistical models that analyze data streams and automatically pick out which kinds of activities or problem behaviors an individual engages in at a given time, as well as their frequency and intensity.
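Ploetz's question – when an activity starts and when it ends – can be sketched as a windowed-energy segmenter over an accelerometer stream: flag windows whose motion energy crosses a threshold, then merge consecutive flagged windows into episodes. The sampling rate, window length, threshold, and synthetic signal here are illustrative assumptions, not parameters from the project.

```python
import numpy as np

def segment_episodes(signal, fs=50, win_s=1.0, threshold=1.5):
    """Merge consecutive high-energy windows into (start, end) sample
    ranges; a window counts as movement when its mean squared value
    exceeds `threshold`."""
    win = int(fs * win_s)
    episodes, active, start = [], False, 0
    for i in range(len(signal) // win):
        energy = np.mean(signal[i * win:(i + 1) * win] ** 2)
        if energy > threshold and not active:
            start, active = i * win, True
        elif energy <= threshold and active:
            episodes.append((start, i * win))
            active = False
    if active:
        episodes.append((start, (len(signal) // win) * win))
    return episodes

# Synthetic stream: quiet baseline with one burst of vigorous movement
rng = np.random.default_rng(0)
sig = rng.normal(0, 0.2, 500)
sig[200:300] += rng.normal(0, 3.0, 100)  # simulated episode
print(segment_episodes(sig))  # → [(200, 300)]
```

Even this crude segmenter recovers the start and end of the simulated burst; the research challenge lies in doing the same robustly for real, messy, multi-sensor recordings.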
“One of the things that Dr. Call said was a clinically-relevant measure they have not been able to gather is severity of the problem behavior,” Rozga said. “It’s hard to get two people to agree on any rating scale. We had this moment where we said, ‘You know, that information is already in the signal.’ If you look at the amplitude at the moment of impact, we have potentially a signal there that can speak to the intensity, or severity, of the behavior. What other things can you measure if you had access to this new measure?”
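Rozga's observation that severity "is already in the signal" suggests a simple proxy: the peak acceleration magnitude within a detected episode. The sketch below is hypothetical – the episode boundaries and values are made up for illustration – but it shows how amplitude at the moment of impact could be turned into a severity measure.

```python
import numpy as np

def impact_severity(accel_mag, episodes):
    """Peak acceleration magnitude within each (start, end) episode,
    as a crude stand-in for behavior severity."""
    return [float(np.max(accel_mag[s:e])) for s, e in episodes]

# Hypothetical magnitude stream with one hard impact at sample 3
accel = np.array([0.1, 0.2, 0.5, 9.8, 3.0, 0.4, 0.1])
print(impact_severity(accel, [(2, 6)]))  # → [9.8]
```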
Further, and most importantly in the early stages, can these models measure with accuracy comparable to “ground truth” – labor-intensive, frame-by-frame coding – in the strict clinical setting?
If so, the long-term goal is to then deploy these behavior monitors into the home, a much less structured environment.
“Can we use this for treatment follow-up, or to understand how these behaviors manifest in the home or school?” Rozga said. “Does this work beyond just the clinical setting?”
Measuring Social-Communication Behaviors
Rozga’s work with Rehg is similar in that it attempts to take advantage of the vast availability of sensor technologies to improve measurement of social-communication behaviors in young children, such as eye contact, shifts of attention between objects and faces, and gestures.
“Historically, the problem was that our tools for getting information were very limited,” Rehg said. “What’s really changed is our ability to collect large-scale data. Generally speaking, this is the best moment in time as far as sensor tools go. Cameras, microphones, accelerometers, inertial measurement units – these are the sensors we’re most interested in.”
With them, they can continuously track individuals’ eyes, heads, limbs, posture, and many other movements associated with the production of relevant social behaviors, increasing the overall pool of available data.
“Large amounts of data from kids is what it takes to characterize behavior and how it changes over time,” Rehg said. “This is something that scales. You can replicate it in other settings, other labs. You can demonstrate that this approach works well across different data sets. We want to show that this is something that can be generalized.”
If they can, a more accurate picture of childhood development, as well as the response to treatment in behavioral problems, could emerge.
There are other important contributors to the research, Rozga said. Audrey Southerland is the lab coordinator at the Child Study Lab, where she has helped with research for over six years. She began as an undergraduate research assistant under the Expeditions in Computing grant in 2011 and joined the staff full time as the lab coordinator after graduating with a Bachelor’s degree in psychology in 2012. In this role, she oversees the lab, including data collection and current undergraduate research assistants, on a daily basis.
Dr. Mindy Scheithauer has also been a key collaborator at the Marcus Autism Center, where she works in the Severe Behavior Program.
Learn the Signs, Act Early
Others throughout the College of Computing are pursuing extensive research surrounding autism. Senior research scientist and developmental psychologist Rosa Arriaga is leading a team that has developed ActEarly, a mobile Android app that gives parents and caregivers a comprehensive, convenient way to track developmental milestones for children.
The app is designed to support kids – newborns to age five – by providing information on social, language, cognitive, and physical milestones children should achieve at each age.
“Parents may be unaware that a child is failing to meet important developmental milestones and this might put the child at risk,” Arriaga said.
Arriaga is working with Laurel Warrell, a Master of Science candidate in Human-Computer Interaction, to deploy the app and conduct usability studies with it. The app leverages expertise from the Centers for Disease Control and Prevention (CDC) and is part of the broader “Learn the Signs, Act Early” campaign, an initiative that seeks to identify developmental disabilities in young children and provide families with needed services.
Arriaga and her team are seeking parents to participate in their studies. They are asking parents of children between one month and five years old who have an Android phone to download the ActEarly mobile app and provide feedback. Interested individuals can follow the link for more information.
Additionally, Arriaga’s team is currently developing with the CDC an interactive e-book that will allow parents to track their 3-year-old child’s milestones while they read. She is also working with undergraduates to develop toddler games to help inform parents about what their child can do. A demo of the latter project can be viewed in a video here.