"With Skinput, we can use our own skin – the body's largest organ – as an input device," Harrison says. "It's kind of crazy to think we could summon interfaces onto our bodies, but it turns out to make a lot of sense. Our skin is always with us, and makes the ultimate interactive touch surface."
Harrison will present his research on April 12 at CHI 2010, the Association for Computing Machinery's annual Conference on Human Factors in Computing Systems, in Atlanta. His co-authors, Desney Tan and Dan Morris of Microsoft Research, will provide a live demonstration of the device onstage. The trio worked on the device while Harrison was an intern at Microsoft Research last summer.
Skinput's acoustic sensors, attached to the upper arm, capture the sound generated by actions such as flicking or tapping the fingers together, or tapping the forearm. The sound is transmitted directly through the skin and, as longitudinal (compressive) waves, through the bones.
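The basic idea – that taps at different body locations produce acoustically distinguishable vibrations – can be illustrated with a toy sketch. Everything below is hypothetical: the location names, the (frequency, decay) "resonance profiles," and the two crude features are invented for illustration, not taken from the Skinput system, which uses a richer feature set and a trained classifier.

```python
import math

def tap_signal(freq, decay, n=256, rate=8000.0):
    """Synthetic damped oscillation standing in for the vibration a tap
    sends through skin and bone (freq in Hz, decay per second)."""
    return [math.exp(-decay * t / rate) * math.sin(2 * math.pi * freq * t / rate)
            for t in range(n)]

def features(sig):
    """Two crude features: mean absolute amplitude and zero-crossing rate."""
    energy = sum(abs(x) for x in sig) / len(sig)
    zc = sum(1 for a, b in zip(sig, sig[1:]) if a * b < 0) / len(sig)
    return (energy, zc)

def nearest(feat, centroids):
    """Label of the closest training centroid (squared Euclidean distance)."""
    return min(centroids,
               key=lambda k: sum((f - c) ** 2 for f, c in zip(feat, centroids[k])))

# Hypothetical tap locations, each given a distinct resonance (freq, decay).
profiles = {"wrist": (400, 30.0), "forearm": (700, 60.0), "elbow": (1100, 90.0)}
centroids = {loc: features(tap_signal(f, d)) for loc, (f, d) in profiles.items()}

# A new, slightly perturbed tap near the forearm profile classifies as "forearm".
print(nearest(features(tap_signal(720, 55.0)), centroids))
```

A real system would replace the synthetic signals with sensor data and the nearest-centroid rule with a classifier trained per user, but the pipeline shape – capture, feature extraction, classification – is the same.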
“In the prototype we built an armband that could be the entire computing device, all self-contained,” says Harrison, age 25. Test results from 20 subjects showed that the device was less accurate when the keypad showed more than ten locations – the same number as a standard telephone number pad.
Harrison says that Microsoft Research, with which he shares a pending patent on Skinput, is building a second generation of the device. Meanwhile, Harrison, a third-year Ph.D. student in Carnegie Mellon University's Human-Computer Interaction Institute (HCII), will move on to other projects.
Source: Chris Harrison, CMU
Writer: Chris O'Toole