Mar 19 2025
A groundbreaking AI-powered ring is transforming ASL communication by enabling seamless, real-time fingerspelling tracking—paving the way for more accessible and practical sign language translation solutions.
Research: SpellRing: Recognizing Continuous Fingerspelling in American Sign Language using a Ring. Image Credit: Rebecca Bowyer
*Important notice: arXiv publishes preliminary scientific reports that are not peer-reviewed and, therefore, should not be regarded as definitive, used to guide development decisions, or treated as established information in the field of artificial intelligence research.
A Cornell University-led research team has developed an artificial intelligence-powered ring equipped with micro-sonar technology that can track fingerspelling in American Sign Language (ASL) continuously and in real time.
In its current form, SpellRing could be used to enter text into computers or smartphones via fingerspelling. Fingerspelling is used in ASL to spell out words without corresponding signs, such as proper nouns, names, and technical terms. With further development, the device—believed to be the first of its kind—could revolutionize ASL translation by continuously tracking entire signed words and sentences.
"Many other technologies that recognize fingerspelling in ASL have not been adopted by the deaf and hard-of-hearing community because the hardware is bulky and impractical," said Hyunchul Lim, a doctoral student in the field of information science. "We sought to develop a single ring to capture all of the subtle and complex finger movement in ASL."
Lim is the lead author of "SpellRing: Recognizing Continuous Fingerspelling in American Sign Language using a Ring," which will be presented at the Association for Computing Machinery's conference on Human Factors in Computing Systems (CHI), April 26-May 1 in Yokohama, Japan.
SpellRing is worn on the thumb and equipped with a microphone and speaker. Together, they send and receive inaudible sound waves that track the wearer's hand and finger movements, while a mini gyroscope tracks the hand's motion.
A proprietary deep-learning algorithm then processes the sonar images, predicting the ASL fingerspelled letters in real time with accuracy comparable to many existing systems that require more hardware.
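To illustrate the idea of continuous recognition from a sensor stream, here is a minimal, hypothetical sketch: sonar echo profiles and gyroscope readings are sliced into overlapping windows, and each window is passed to a classifier. The `Frame`, `window`, and `classify` names, the data shapes, and the trivial energy-threshold "classifier" are all illustrative stand-ins, not the authors' actual deep-learning model.

```python
# Hypothetical sketch of a sonar-plus-gyroscope fingerspelling pipeline.
# All names, shapes, and thresholds are illustrative; the paper's model differs.

from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    echo_profile: List[float]  # one column of the sonar "image" (acoustic echoes)
    gyro: List[float]          # angular velocity (x, y, z) from the mini gyroscope

def window(frames: List[Frame], size: int, stride: int) -> List[List[Frame]]:
    """Slice the continuous stream into overlapping windows, one per prediction."""
    return [frames[i:i + size] for i in range(0, len(frames) - size + 1, stride)]

def classify(win: List[Frame]) -> str:
    """Placeholder for the deep model: a simple echo-energy threshold standing in
    for a real network that would map each window to a fingerspelled letter."""
    energy = sum(sum(abs(v) for v in f.echo_profile) for f in win)
    return "a" if energy > 1.0 else "-"  # '-' means "no letter detected"

# Continuous recognition: decode a synthetic stream window by window.
frames = [Frame([0.5, 0.7], [0.0, 0.1, 0.0])] * 4 + [Frame([0.0, 0.0], [0.0] * 3)] * 4
letters = [classify(w) for w in window(frames, size=2, stride=2)]
# letters is ["a", "a", "-", "-"]: letter activity at the start, none at the end
```

The windowed, stream-based structure is the relevant point: the system predicts letters continuously as new sensor frames arrive, rather than segmenting words after the fact.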
Developers evaluated SpellRing with 20 experienced and novice ASL signers, having them naturally and continuously fingerspell more than 20,000 words of varying lengths. SpellRing's accuracy rate was between 82% and 92%, depending on the difficulty of the words.
"There's always a gap between the technical community who develop tools and the target community who use them," said Cheng Zhang, assistant professor of information science and a paper co-author. "We've bridged some of that gap. We designed SpellRing for target users who evaluated it."
Lim's future work will include integrating the micro-sonar system into eyeglasses to capture upper body movements and facial expressions, for a more comprehensive ASL translation system.
"Deaf and hard-of-hearing people use more than their hands for ASL. They use facial expressions, upper body movements and head gestures," said Lim, who completed basic and intermediate ASL courses at Cornell as part of his SpellRing research. "ASL is a very complicated, complex visual language."
This research was funded by the National Science Foundation.
Cornell University
Journal reference:
Preliminary scientific report. Lim, H., Dang, N. A., Lee, D., Yu, T. C., Lu, J., Li, F. M., Jin, Y., Ma, Y., Bi, X., Guimbretière, F., & Zhang, C. (2025). SpellRing: Recognizing Continuous Fingerspelling in American Sign Language using a Ring. arXiv. DOI: 10.1145/3706598.3713721, https://arxiv.org/abs/2502.10830