Researchers Create AI Tool To Help Sight-Impaired Programmers

By: Kim Horner | Dec. 3, 2025

Computer science doctoral student Yili (Angel) Wen demonstrates an AI-assisted tool that makes it possible for visually impaired computer programmers to create, edit and verify 3D models independently.

A University of Texas at Dallas researcher and his collaborators have developed an artificial intelligence (AI)-assisted tool that makes it possible for visually impaired computer programmers to create, edit and verify 3D models independently.

The tool, A11yShape, addresses a challenge for blind and low-vision programmers by providing a method for editing and verifying complex models without assistance from sighted individuals. The first part of the tool’s name is a numeronym, a number-based contracted word that stands for “accessibility” and is pronounced “al-ee.”

A11yShape renders images of 3D models that developers create in OpenSCAD, an open-source code-based modeling editor, capturing each object’s shape from several angles. The system sends the code and the multiangle views to GPT-4o, which generates detailed descriptions of the model for blind and low-vision programmers. A11yShape tracks changes and keeps the code, the descriptions and the 3D rendering synchronized. The tool also includes a chatbot-style AI assistant that can answer questions about the model and about edits.
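The article does not include implementation details, but the multiangle-capture step it describes can be sketched using OpenSCAD's command-line renderer, which accepts a `--camera` flag. The view names, camera angles, file names and request format below are illustrative assumptions, not A11yShape's actual code.

```python
# Illustrative sketch only: capturing multiangle views of an OpenSCAD model
# and bundling them with the source code for a vision-language model to
# describe. All names and values here are assumptions, not A11yShape's code.

# Canonical camera rotations (rx, ry, rz in degrees) for a few viewpoints.
VIEW_ANGLES = {
    "front": (90, 0, 0),
    "top": (0, 0, 0),
    "side": (90, 0, 90),
    "isometric": (55, 0, 25),
}

def render_command(scad_file: str, view: str, out_png: str,
                   dist: float = 140.0) -> list[str]:
    """Build the OpenSCAD CLI invocation that renders one view to a PNG.

    OpenSCAD's --camera flag takes translate_x,y,z,rot_x,y,z,distance.
    """
    rx, ry, rz = VIEW_ANGLES[view]
    camera = f"0,0,0,{rx},{ry},{rz},{dist}"
    return ["openscad", "-o", out_png, f"--camera={camera}", scad_file]

def build_description_request(scad_code: str, image_paths: list[str]) -> dict:
    """Pair the model's source code with its rendered views, so a
    vision-language model (e.g. GPT-4o) can describe the model."""
    return {
        "instruction": "Describe this 3D model for a blind programmer.",
        "code": scad_code,
        "views": image_paths,
    }

# Example: commands that would render all four views of a hypothetical model.
commands = [render_command("rocket.scad", v, f"rocket_{v}.png")
            for v in VIEW_ANGLES]
```

Running each command (with OpenSCAD installed) produces one PNG per viewpoint; the request dict is then what a backend would forward to the language model along with the images.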

“This is a first step toward a goal of providing people with visual impairments with equal access to creative tools, including 3D modeling,” said Dr. Liang He, assistant professor of computer science in the Erik Jonsson School of Engineering and Computer Science.

Researchers presented a paper about the technology in October at ASSETS 2025, the international conference of the Association for Computing Machinery’s Special Interest Group on Accessible Computing, in Denver. Dr. He collaborated with researchers from the University of Washington, Purdue University, Stanford University, the University of Michigan, the Massachusetts Institute of Technology (MIT), The Hong Kong University of Science and Technology, and Nvidia Corp.

Researchers in Dr. He’s Design & Engineering for Making Lab created a video to demonstrate the technology.

An estimated 1.7% of computer programmers have visual impairments, according to a survey by Stack Overflow, a public platform where users can ask questions about programming issues.

Dr. Liang He (right), assistant professor of computer science, is shown with, from left, computer science doctoral students Yili (Angel) Wen and Difan (Bobby) Jia, and Junke Zhao, a researcher in Dr. He’s lab.

Visually impaired programmers use tools including screen readers that read code aloud and Braille displays. Dr. He became interested in developing tools for blind and low-vision programmers when he saw the challenges that a graduate school classmate who was blind faced with 3D-modeling tasks.

“Every single time when he was working on his assignment, he had to ask someone to help him and verify the results,” Dr. He said.

Dr. He said his research group will continue to develop tools that help vision-impaired programmers with creative tasks, such as 3D printing and circuit prototyping.

“All of these things are very challenging for blind users, especially when they are doing it alone,” said Dr. He, who joined UT Dallas in August. “The next step is to try to support this process — this pipeline from 3D modeling to fabrication.”

Researchers tested A11yShape with four programmers with impaired vision who independently created and modified 3D models of robots, a rocket and a helicopter using the tool. One of the study’s co-authors, Gene S-H Kim, who is blind and a PhD student at MIT, provided insight from a user perspective.

“He used the first version of the system and gave us a lot of really good feedback, which helped us improve the system,” Dr. He said.