Communication disorders are often so complex that solving them requires cross-disciplinary effort. To inspire and nurture such crossover projects, the University launched the Communications Technology Center, a collaborative effort among the School of Behavioral and Brain Sciences (BBS), the Jonsson School and the School of Arts and Humanities.
Housed on the University’s main campus, the center provides space and tools for research and the sharing of ideas among scientists, clinicians and students who use their expertise to invent and assess new technologies and treatments for people facing communication challenges.
“While science and art traditionally are thought of as separate and different, everything is connected today,” said Dr. Dennis Kratz, dean of the School of Arts and Humanities. “UT Dallas is becoming a place where ideas converge to create an entirely new perspective.”
One project under way at the technology center—a tool to treat apraxia of speech—benefits from the efforts of communication scientists, computer scientists and animators.
Apraxia of speech, often the result of brain damage caused by stroke, affects the timing and placement involved in speech movements. Dr. William Katz, BBS professor, employs electromagnetic articulography to allow patients to view 3D images of their own tongue movements on a computer screen while they’re speaking. The interactive device is positioned like a shower head above the patient, and sensors are placed on the person’s tongue.
“We are dealing with patients who know what they want to say, but can’t position their tongues in a way that allows them to get the correct sounds out. It’s extremely frustrating for them,” Katz said. “We’ve found that allowing them to see where the tongue should go speeds up the recovery of some of their speaking abilities.”
Katz’s research was enhanced by the work of a team led by Dr. B. “Prabha” Prabhakaran, a computer science professor in the Jonsson School. The team helped create the computer program that translates tongue movements from human subjects to animated avatars.
The avatars used by Katz were developed by animators in the A&H Arts and Technology (ATEC) program, who used the movement data to produce realistic images that highlight irregular motions.
Eric Farrar, an assistant professor in A&H who previously worked on Hollywood blockbusters, applied cinematic animation techniques to this clinical project, enabling the patient to pinpoint the placement of the tongue and lips needed to produce the correct sounds during therapy.
“By crossing boundaries between disciplines on projects like this we open ourselves up to completely different ways of looking at problems,” Farrar said. “From the student’s perspective, interdisciplinary projects can open doors to entirely new lines of study and research. So many ATEC students are focused on careers in the film or gaming industries, but this type of application can show other possibilities for use of the technology.”