URI Study Explores Audio-Visual Speech Perception in Autism Parents

It’s well established that children with autism often process audio-visual cues differently than their peers, especially during verbal communication. (Representational Image: Unsplash)

It’s well established that children with autism often process audio-visual cues differently than their peers, especially during verbal communication. They may not make direct eye contact or focus intently on a speaker’s mouth. These differences in audio-visual speech perception may help explain why children with autism often follow a different path in language development than their peers.

Direct relatives of people with autism sometimes display similar traits, in a much milder form that may not even be noticed outside a lab. While the Broad Autism Phenotype—mild, sub-clinical autistic characteristics or behaviors in first-degree relatives of people with autism—has been studied extensively in siblings, few studies exist on parents of children with autism. Researchers in the University of Rhode Island Department of Communicative Disorders are looking to fill that gap to gain a better understanding of the traits of autism.

We’re trying to get a broad sense—not just kids with autism, not just parents, but getting a larger bird’s-eye view of autism as a whole.
Assistant Professor Alisa Baron

“We’re not trying to diagnose autism in parents; we’re just trying to get a better understanding of what various characteristics may be associated with autism.”

Harwood and Baron—who have extensively researched audio-visual speech perception—use eye-tracking technology to determine subtle differences in where participants look when observing human and computer-animated faces speaking. That technology is combined with electroencephalogram (EEG) sensors that monitor brain activity to determine the neural response while observing speech—tiny differences in the number of neurons that fire in the brain when looking at a speaking face and how fast that firing happens.

“Speech perception is a pretty complicated phenomenon,” Harwood said. “While we hear information from a speaker, learning happens with face-to-face communication. We learn by looking at a speaker’s mouth and integrating it with what we hear. Persons with autism don’t gaze the same way at the eyes and the mouth; they don’t focus on them as much. This is just one theory why kids with autism might not develop language at the same levels as neuro-typical kids.”

The researchers examined differences in how the brain processes subtle aspects of speech to understand the broad autism phenotype better. Studying direct relatives can help add to the knowledge base around the autism spectrum, giving researchers a wider look at the condition, the characteristics of which may be displayed in more people than researchers and clinicians realize.

“We look at autism broadly. We’re trying to think about putting all the pieces together,” Harwood said. “What are the characteristics of autism? How does it affect families? It’s a very complicated phenomenon. That’s why we’re looking at audio-visual speech perception within the population of autistic kids and a variety of people. We’ve done research on the broad autism phenotype in the population at large because there’s this population of humans that might have some elevated autistic traits.”

Harwood and Baron continue to recruit study participants interested in learning about their audio-visual speech perception and helping contribute to the growing body of knowledge around one of the fastest-growing developmental disorders in the United States. Anyone who has a child with autism and is interested in participating can contact Baron or Harwood at ccnl@etal.uri.edu.

Harwood and Baron continue to recruit study participants interested in learning about their audio-visual speech perception. (Representational Image: Unsplash)

“The parents who have come in, like the autism community at large, are generally deeply invested in finding answers,” Harwood said. “They’re interested in finding better diagnostic processes, having earlier identification, and trying to find good interventions for persons with autism. We still don’t understand the reason behind the wide range of communication abilities in individuals with autism. We don’t understand the mechanism of that to this day. By gaining a better understanding of the traits of autism, we maybe can flesh out that picture a little better.”

(Newswise/TAB)

Medbound
www.medboundtimes.com