Georgia Tech Engineers Develop Finger-Based Gesture Technology for Phones and VR
Researchers at Georgia Tech have unveiled a technology that lets users trace letters and numbers on their fingers and see them appear instantly on a nearby computer screen. The system, called FingerSound, relies on a thumb ring equipped with a gyroscope and a tiny contact microphone: the microphone picks up the faint sound of the thumb brushing across the fingers, while the gyroscope tracks its motion as users "write" on their fingers.
In demonstrations, traced figures appear in real time on an adjacent screen, opening up new possibilities for hands-free interaction. Future applications include sending incoming calls to voicemail or responding to text messages without touching, or even looking at, a phone.
"When someone reaches for their phone during a meeting, it can disrupt the conversation," said Thad Starner, a Georgia Tech professor leading the project. "With this technology, users can perform simple gestures like writing an 'x' on their hand to manage calls discreetly."
FingerSound also has potential in virtual reality, enabling users to input commands without removing their headsets or reaching for traditional controllers. Unlike other gesture-based systems that require in-air movements, FingerSound uses the fingers themselves as a canvas. This allows the technology to detect the start and end of a gesture accurately while giving the user tactile feedback, improving overall usability.
Graduate student Cheng Zhang, who helped develop the system, explained, "Our approach combines sound and motion to distinguish intentional gestures from everyday finger movements, significantly enhancing accuracy compared to systems relying on motion alone."
The technology filters the combined data from the contact microphone and gyroscope to separate genuine gestures from incidental finger activity. FingerSound is part of a suite of ring-based gesture systems from Georgia Tech that also includes FingOrbits, which lets users control smartwatches or head-mounted displays, and SoundTrak, which allows 3D doodling in the air with real-time visualization.
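To give a rough sense of how that kind of sensor fusion can work, here is a minimal, illustrative sketch in Python. It is not the team's published algorithm: the sampling rates, window size, and thresholds below are assumptions, and the rule simply flags time windows in which both the contact-microphone energy and the gyroscope's angular rate are high, which is the basic intuition behind rejecting incidental finger movement.

    import numpy as np

    # Assumed sampling rates for illustration only (not from the paper).
    AUDIO_RATE_HZ = 8000   # contact microphone
    GYRO_RATE_HZ = 200     # thumb-ring gyroscope

    def detect_gesture_frames(audio, gyro, frame_ms=50,
                              audio_thresh=0.02, gyro_thresh=0.5):
        """Flag frames where both audio energy and angular rate are high.

        A frame counts as part of an intentional gesture only when the
        microphone energy (friction of the thumb rubbing the finger) and
        the gyroscope magnitude (thumb motion) both exceed simple
        thresholds. The thresholds here are made up for the example.
        """
        audio_frame = int(AUDIO_RATE_HZ * frame_ms / 1000)
        gyro_frame = int(GYRO_RATE_HZ * frame_ms / 1000)
        n_frames = min(len(audio) // audio_frame, len(gyro) // gyro_frame)

        flags = []
        for i in range(n_frames):
            a = audio[i * audio_frame:(i + 1) * audio_frame]
            g = gyro[i * gyro_frame:(i + 1) * gyro_frame]
            audio_energy = float(np.sqrt(np.mean(a ** 2)))        # RMS of mic signal
            gyro_mag = float(np.mean(np.linalg.norm(g, axis=1)))  # mean |angular rate|
            flags.append(audio_energy > audio_thresh and gyro_mag > gyro_thresh)
        return flags

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # One second of synthetic data: quiet first half, "gesture" in the second half.
        audio = np.concatenate([0.005 * rng.standard_normal(4000),
                                0.05 * rng.standard_normal(4000)])
        gyro = np.concatenate([0.1 * rng.standard_normal((100, 3)),
                               2.0 * rng.standard_normal((100, 3))])
        print(detect_gesture_frames(audio, gyro))

Run on the synthetic data above, the quiet windows are rejected and only the high-energy, high-motion windows are flagged as gesture activity; a real system would feed those segments to a gesture classifier rather than a fixed threshold.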
The research was showcased earlier this year at UbiComp and the co-located ACM International Symposium on Wearable Computers (ISWC), highlighting the growing potential of wearable gesture-based controls in everyday life.
