
Electronic rings translate sign language into text with AI


Source: IEEE Spectrum AI. Collage: Hamidun News.

Electronic rings equipped with motion sensors and artificial intelligence can translate sign languages into text — scientists from Yonsei University demonstrated this for the first time.

Why Previous Attempts Failed

For years, researchers have tried to build a practical sign language translation system. Some used cameras and computer vision, but these worked only under controlled conditions and often failed when the lighting changed. Others developed sensor-laden smart gloves, but the gloves trapped heat and moisture, causing discomfort, and did not account for individual differences in hand size and finger length. Most sensors also required a wired connection to a computer, restricting hand movement; even wireless versions typically connected by wire to a single central transmitter.

Seven Rings Instead of Gloves

Scientists led by Ki Jun Yu took a new approach: a set of electronic rings, each transmitting motion data via Bluetooth Low Energy. Using rings instead of gloves allows flexible sensor placement and accommodates natural anatomical differences between people, while wireless communication gives the hands complete freedom. The team's analysis showed that seven fingers play the primary role in sign language, so they used exactly seven rings. Each ring carries an accelerometer that tracks both static poses and movements, which is critical because sign languages combine static hand positions with dynamic transitions between them. During development, the system faced a reliability problem: straight copper contacts broke under frequent bending. The solution was to switch to contacts laid out in a winding pattern, which withstand thousands of bending cycles.
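To make the architecture concrete, here is a minimal sketch of how seven per-finger accelerometer streams could be assembled into fixed-length windows for gesture recognition. The names and numbers (Ring, N_RINGS, WINDOW, the 50 Hz rate) are illustrative assumptions, not details from the paper; the real rings stream over Bluetooth Low Energy rather than being polled locally.

```python
import numpy as np

N_RINGS = 7   # one ring per finger identified as primary in sign language
WINDOW = 50   # samples per window (hypothetical, e.g. 1 s at 50 Hz)

class Ring:
    """Simulated ring emitting 3-axis accelerometer samples."""
    def __init__(self, seed):
        self.rng = np.random.default_rng(seed)

    def read(self):
        # One (ax, ay, az) reading; a real device would deliver
        # these via BLE notifications instead.
        return self.rng.normal(size=3)

def capture_window(rings, window=WINDOW):
    """Collect a (window, n_rings * 3) matrix: one row per time step,
    concatenating the 3-axis reading from every ring."""
    frames = [np.concatenate([r.read() for r in rings]) for _ in range(window)]
    return np.stack(frames)

rings = [Ring(seed=i) for i in range(N_RINGS)]
X = capture_window(rings)
print(X.shape)  # → (50, 21): 50 time steps x 7 rings x 3 axes
```

A window like this captures both a static pose (near-constant rows) and a dynamic transition (rows that change over time), which is why a single accelerometer per ring can cover both kinds of sign components.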

AI Recognizes Gestures

The researchers developed a deep learning system that recognizes gestures with striking generality. It learned to work not only with the two people in the training sample but also with five entirely new users, without per-person calibration. In experiments, the system recognized 100 words of American Sign Language and 100 words of International Sign Language with accuracies of 88.3% and 88.5%, respectively. That is a large leap forward: previous attempts were limited to vocabularies of fewer than 50 words. Just as impressive, the system translated not only individual words but entire sentences during continuous signing, exactly what natural real-time conversation requires.
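The claim that the model works for people it never saw implies a leave-subjects-out evaluation: train on some signers, test only on others. Here is a hedged sketch of that split, mirroring the article's two training users and five unseen users. The dataset shape, subject IDs, and helper name are illustrative assumptions.

```python
def leave_subjects_out(samples, test_subjects):
    """Split (subject_id, window, label) samples so that test subjects
    never appear in training, forcing user-independent generalization."""
    train = [s for s in samples if s[0] not in test_subjects]
    test = [s for s in samples if s[0] in test_subjects]
    return train, test

# Toy dataset: 7 subjects x 3 recordings each (purely illustrative).
samples = [(subj, f"window_{subj}_{i}", f"word_{i}")
           for subj in range(7) for i in range(3)]

# Train on subjects 0-1, test on the five subjects the model never saw.
train, test = leave_subjects_out(samples, test_subjects={2, 3, 4, 5, 6})
print(len(train), len(test))  # → 6 15
```

Reporting accuracy on the held-out subjects, rather than on held-out recordings from the same people, is what distinguishes user-independent recognition from per-user calibration.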

What This Means

The long-term goal is to integrate the system into smartphones without specialized equipment: the rings would transmit signals to a mobile device, where they would be translated automatically in real time, making the technology accessible for everyday communication. Next steps include extending the system to more users, enlarging the vocabulary, and supporting Korean Sign Language and other regional variants. The scientists also plan to increase battery life from the current 12 hours to a full day. One important caveat: the current system translates only hand movements. The next major challenge is to add facial expressions, lip movements, and the spatial constructions that carry grammatical meaning in sign languages.

Hamidun News
AI news without noise. Daily editorial selection from 400+ sources. A product by Zhemal Khamidun, Head of AI at Alpina Digital.