Using funding from the National Science Foundation and Google, a group of UW engineers and scientists has created FingerIO, an innovative gesture-based program that allows users to interact with their smart devices without physically touching them.
Rajalakshmi Nandakumar, a graduate student in computer science and engineering, and her colleague Vikram Iyer, a graduate student in electrical engineering, created FingerIO to address one of today’s main challenges with technological devices: they are becoming ever smaller.
“There are lots of wearable devices today,” Nandakumar said. “But the issue is that they are all becoming smaller and smaller, which means that the display of these devices, which is the major mode for touch-based interaction, is also becoming smaller, and so it’s difficult to type on a smartwatch or smartphone.”
Nandakumar and Iyer will demonstrate their program at the Association for Computing Machinery’s CHI 2016 conference, which will be held in San Jose, Calif., in May. They will show how the FingerIO prototype app, which they have installed on an Android device, can track two-dimensional finger motions without the user touching the smartphone at all.
“How it all started was that we were detecting minute breathing motions using off-the-shelf devices, so once we saw that we could detect these minute motions, we thought of using this for various applications and different domains,” Nandakumar said. “The first thing we thought of was gesture-based interaction, and that’s how we came up with FingerIO.”
As of right now, the researchers have only installed FingerIO on Android devices, but they are in the process of developing an iOS app.
“In principle there is no reason why you couldn’t use it on other phones, because all we really need physically is the two microphones,” Iyer said, “which is pretty standard, as microphones are built into phones.”
The two microphones are needed because FingerIO uses sonar to track finger gestures and replicate them on the screen.
“We are sending inaudible sound signals and these signals are getting reflected off our fingers,” Nandakumar said. “If we measure the time it takes for the sound to go from the speaker, hit our finger, and come back to the microphone, you know at what distance your finger is. Then you move your finger, and you know the signal is going to arrive at a different time, which means if you move farther away, it’s going to arrive later, and the closer you are, it’s going to arrive earlier.”
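The time-of-flight idea Nandakumar describes can be sketched in a few lines of Python. This is only an illustration, not FingerIO’s actual code: it assumes each microphone is roughly colocated with the speaker (so a round-trip echo delay maps to a circle of possible finger positions around that microphone) and recovers a 2D position by intersecting the two circles from the two microphones. The function names, microphone positions, and distances below are made up for the example.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, approximate speed of sound in air


def echo_distance(delay_s: float) -> float:
    """Round-trip echo delay -> one-way distance to the finger
    (assuming speaker and microphone are colocated)."""
    return SPEED_OF_SOUND * delay_s / 2.0


def locate_finger(mic1, mic2, d1, d2):
    """Intersect the two distance circles around the microphones.

    mic1/mic2: (x, y) microphone positions in meters.
    d1/d2: finger distances measured at each microphone.
    Returns the candidate intersection in front of the device (+y side).
    """
    (x1, y1), (x2, y2) = mic1, mic2
    dx, dy = x2 - x1, y2 - y1
    d = math.hypot(dx, dy)                      # microphone spacing
    a = (d * d + d1 * d1 - d2 * d2) / (2 * d)   # along-baseline offset
    h = math.sqrt(max(d1 * d1 - a * a, 0.0))    # perpendicular offset
    px, py = x1 + a * dx / d, y1 + a * dy / d   # foot of perpendicular
    # Two mirror-image candidates; keep the one in front of the screen.
    candidates = [(px + h * dy / d, py - h * dx / d),
                  (px - h * dy / d, py + h * dx / d)]
    return max(candidates, key=lambda p: p[1])


# Example: mics 10 cm apart, finger actually at (0.03, 0.05) meters.
d1 = math.hypot(0.03, 0.05)           # true distance to mic at (0, 0)
d2 = math.hypot(0.03 - 0.10, 0.05)    # true distance to mic at (0.10, 0)
x, y = locate_finger((0.0, 0.0), (0.10, 0.0), d1, d2)
# (x, y) recovers the finger position (0.03, 0.05)
```

Moving the finger changes both echo delays, so each new pair of distances yields an updated 2D position, which is how a motion like drawing a shape can be traced over time.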
To test the accuracy of the sonar system, participants drew basic shapes on a separate touchpad while the FingerIO program tracked the same motions. The FingerIO smartphone drawings deviated from the originals by less than a centimeter.
Nandakumar and Iyer are currently working to improve their program by adding three-dimensional localization. They also want to enable multiple fingers to interact with the smartphone simultaneously. As of right now, FingerIO can only detect two-dimensional motion from a single finger at a time.
“We’re taking FingerIO to the next level by attempting to have a three-dimensional localization, using less power, and enabling the ability for multiple finger interaction,” Nandakumar said. “These are the things we are looking at because we want to expand the applications in which we can use FingerIO.”
Reach reporter Praphanit Doowa at firstname.lastname@example.org. Twitter: @prabdoooowa