Friday, April 26, 2013

A Simple Way to Turn Any LCD into a Touch Screen

Read at: http://www.technologyreview.com/news/514061/a-simple-way-to-turn-any-lcd-into-a-touch-screen/

Electromagnetic interference can turn a plain LCD into a touch screen on the cheap.
Electromagnetic interference can screw up cell phone and radio reception. But it may also be the key to cheaply transforming regular LCD screens into touch- and gesture-sensing displays, according to recent research.
A group of researchers from the University of Washington’s Ubiquitous Computing Lab developed a method called uTouch that uses a simple sensor and software to turn an ordinary LCD into a touch screen display. The system takes advantage of the low levels of electromagnetic interference produced by many consumer electronics, harnessing it to do things like control video playback with pokes and motions on an otherwise noninteractive screen.
“All these devices around you have all these signals coming out of them, and we ignore them because we think they’re noise,” says Sidhant Gupta, a PhD candidate in the lab and a co-author of the paper.
While touch screens are the norm on smartphones and tablets, they’re still not common on TVs, computer monitors, and other big displays. Existing methods that turn passive LCDs into touch screens typically use cameras or other sensors, but they’re not always practical. The group’s findings, explained in a paper to be presented in May at the Computer-Human Interaction (CHI) conference in Paris, could eventually be used to cheaply add touch and gesture interactions to TVs, computers, and much larger displays, too.
Gupta says his group’s method works by measuring signals that are normally given off by an LCD display and how they change when a user brings a hand near the screen. These signals show up as electromagnetic interference, and can be measured with a $5 sensor that plugs into a wall outlet.
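To make that concrete, here is a minimal sketch of what the sensing step could look like, assuming the plug-in sensor delivers raw power-line voltage samples and that the display’s power supply emits in a known, narrow frequency band. The sample rate and band limits below are invented for illustration and are not figures from the paper.

```python
# Hypothetical sketch: isolate an LCD's electromagnetic-interference band
# from raw power-line samples captured by a plug-in sensor.
import numpy as np

SAMPLE_RATE_HZ = 1_000_000          # assumed sampling rate of the sensor
DISPLAY_BAND_HZ = (54_000, 58_000)  # assumed EMI band of the LCD's power supply

def band_intensity(samples: np.ndarray) -> float:
    """Return the summed spectral magnitude inside the display's assumed EMI band."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE_HZ)
    mask = (freqs >= DISPLAY_BAND_HZ[0]) & (freqs <= DISPLAY_BAND_HZ[1])
    return float(spectrum[mask].sum())
```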
In the study, users’ gestures and touches controlled an on-screen video player. Information about how the user’s actions changed the LCD’s electromagnetic interference was gathered by the sensor, and then sent to a connected PC, where software isolated the display’s signal and tracked how it changed over time. The software used machine learning to predict if changes were simply “noise” or one of five gestures and touches that it had been set to respond to. Once the touch or gesture was determined, it would elicit an appropriate on-screen response—like pausing or resizing a video.
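The paper describes its own machine-learning pipeline; as a rough, hypothetical stand-in, a template-matching classifier over short windows of band intensity could look like the sketch below. The gesture names, window length, and noise threshold are assumptions made purely for illustration.

```python
# Hypothetical sketch: map a window of EMI intensity readings to "noise"
# or one of five gestures by matching against learned templates.
import numpy as np

WINDOW = 64  # intensity readings per classification window (assumed)
GESTURES = ("touch", "swipe_left", "swipe_right", "hand_hover", "hand_pull")  # illustrative labels

def classify_window(window: np.ndarray,
                    templates: dict[str, np.ndarray],
                    noise_threshold: float = 1.0) -> str:
    """Return 'noise' or the name of the best-matching gesture template."""
    # Normalize so only the shape of the intensity change matters, not its scale.
    window = (window - window.mean()) / (window.std() + 1e-9)
    best_label, best_dist = "noise", float("inf")
    for label, template in templates.items():
        dist = np.linalg.norm(window - template) / np.sqrt(len(window))
        if dist < best_dist:
            best_label, best_dist = label, dist
    # Anything that does not match a learned template closely enough is noise.
    return best_label if best_dist < noise_threshold else "noise"

# Templates would come from labeled training windows, e.g.:
# templates = {g: np.mean(normalized_examples[g], axis=0) for g in GESTURES}
```

In a real system the classifier would be trained on recorded examples of each gesture, and the predicted label would then trigger the corresponding player command, such as pause or resize.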
“What we’re trying to find out is how that signal changes, and in particular we’re looking for changes in the intensity of that signal,” Gupta says.
The system can tell the difference between different displays, since each has its own electromagnetic interference “fingerprint,” and a single sensor can be used to track interactions on numerous displays. Eventually, Gupta says, the sensing and processing could be done in a single unit that’s plugged into a wall socket.
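One way such a fingerprint could be matched, offered only as an illustrative assumption rather than the researchers’ method, is to compare a measured spectrum against stored per-display profiles:

```python
# Hypothetical sketch: identify which known display produced a measured EMI
# spectrum by cosine similarity against stored per-display fingerprints.
import numpy as np

def identify_display(spectrum: np.ndarray,
                     fingerprints: dict[str, np.ndarray]) -> str:
    """Return the name of the known display whose fingerprint best matches."""
    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)
    return max(fingerprints, key=lambda name: cosine(spectrum, fingerprints[name]))
```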
The technology won’t make a noninteractive display as touch-sensitive as an iPhone or Android smartphone. The gestures are much simpler than the complex swipes and pinches you can make on those gadgets.
Still, Gupta can imagine it being used to do things like make large screens at museums interactive. It could also be used to add interactivity to other devices that emit electromagnetic interference—something Gupta and some of his uTouch colleagues explored in an earlier project called LightWave that uses a plug-in sensor to enable compact fluorescent lightbulbs to sense human proximity.
“The more things we can make interactive that already exist, the better,” says Chris Harrison, a PhD candidate at Carnegie Mellon University’s Human-Computer Interaction Institute and cofounder of a startup whose touch-screen technology can tell the difference between fingernail and knuckle taps. “It’s very expensive to just put touch screens everywhere.”
The researchers aren’t planning to commercialize the technology, but Gupta says the sensor uses off-the-shelf parts, and the algorithms are included in the paper, so any motivated person could put together the same system.
The challenge in building interest, Harrison thinks, will be refining the gestures that uTouch can understand, which are currently quite coarse, and finding the right applications. “You could never write an e-mail with this system, but you could do some cool gestural interactions,” he says.
