TapSense touchscreen technology distinguishes taps by parts of finger (w/ video)

Smartphone and tablet computer owners have become adept at using finger taps, flicks and drags to control their touchscreens. But Carnegie Mellon University researchers have found that this interaction can be enhanced by taking greater advantage of the finger's anatomy and dexterity.

By attaching a microphone to a touchscreen, the CMU scientists showed they can tell the difference between the tap of a fingertip, the pad of the finger, a fingernail and a knuckle. This technology, called TapSense, enables richer touchscreen interactions. While typing on a virtual keyboard, for instance, users might capitalize letters simply by tapping with a fingernail instead of a fingertip, or might switch to numerals by using the pad of a finger, rather than toggling to a different set of keys.
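As a rough illustration of the kind of interaction described above, an app could branch on the tap type reported by the classifier when a key is struck. This is only a sketch: the tap-type names, the numeral layout, and the `character_for_tap` function below are hypothetical, not part of TapSense itself.

```python
# Hypothetical mapping from a classified tap type to the character entered.
# The tap-type labels and the numeral layout are illustrative assumptions.
NUMERAL_LAYER = {"q": "1", "w": "2", "e": "3", "r": "4", "t": "5",
                 "y": "6", "u": "7", "i": "8", "o": "9", "p": "0"}

def character_for_tap(key: str, tap_type: str) -> str:
    """Pick what to insert based on which part of the finger tapped the key."""
    if tap_type == "nail":        # fingernail tap -> capital letter
        return key.upper()
    if tap_type == "pad":         # finger pad -> numeral layer
        return NUMERAL_LAYER.get(key, key)
    return key                    # ordinary fingertip tap -> lowercase letter

# Example: character_for_tap("q", "nail") -> "Q"; character_for_tap("q", "pad") -> "1"
```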

Another possible use would be a painting app that uses a variety of tapping modes and finger motions to control a palette of colors, or to switch between drawing and erasing without having to press buttons.

"TapSense basically doubles the input bandwidth for a touchscreen," said Chris Harrison, a Ph.D. student in Carnegie Mellon's Institute (HCII). "This is particularly important for smaller touchscreens, where screen real estate is limited. If we can remove mode buttons from the screen, we can make room for more content or can make the remaining buttons larger."

TapSense was developed by Harrison, fellow Ph.D. student Julia Schwarz, and Scott Hudson, a professor in the HCII. Harrison will discuss the technology today (Oct. 19) at the Association for Computing Machinery's Symposium on User Interface Software and Technology in Santa Barbara, Calif.

"TapSense can tell the difference between different parts of the finger by classifying the sounds they make when they strike the touchscreen," Schwarz said. An inexpensive microphone could be readily attached to a touchscreen for this purpose. The microphones already in devices for phone conversations would not work well for the application, however, because they are designed to capture voices, not the sort of noise that TapSense needs to operate.

The technology also can use sound to discriminate between passive tools (i.e., no batteries) made from such materials as wood, acrylic and polystyrene foam. This would enable people using styluses made from different materials to collaboratively sketch or take notes on the same surface, with each person's contributions appearing in a different color or otherwise noted.

The researchers found that their proof-of-concept system was able to distinguish between the four types of finger inputs with 95 percent accuracy, and could distinguish between a pen and a finger with 99 percent accuracy.

More information: chrisharrison.net/index.php/Research/TapSense
