eyeSight lets you control your mobile device with simple hand gestures; it uses the front-facing camera on devices like the Sprint EVO 4G and the iPhone 4 to see your gestures and then converts them into commands.
Two distinct products are being offered: eyeCan for phone control and eyePlay for gaming. Take a look at this explanatory video.
(eyeSight Gesture Control of a cell phone video)
The company, eyeSight Mobile Technologies, claims these benefits for its gesture-control technology:
Software-based: does not require any hardware changes.
Increased revenue opportunity as a result of easier operation of existing services.
Enables greater differentiation through the introduction of new and intuitive ways to operate applications and services.
Simple API, facilitating easy integration with operating platforms and applications (see the sketch after this list).
Small footprint, optimized for CPU and power consumption.
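eyeSight's SDK isn't public, so there's no way to know what its "simple API" actually looks like. Purely as an illustration of the callback-style integration such a claim suggests, here is a minimal Python sketch; every name in it (GestureEngine, Gesture, on, emit) is invented for this example and should not be taken as eyeSight's interface.

```python
# Hypothetical sketch of a callback-style gesture API, in the spirit of
# eyeSight's "simple API" claim. All names are invented for illustration;
# the real eyeSight SDK is not public.

from enum import Enum, auto
from typing import Callable, Dict, List

class Gesture(Enum):
    LEFT = auto()
    RIGHT = auto()
    UP = auto()
    DOWN = auto()
    BLOCK = auto()   # open palm held toward the camera
    WAVE = auto()    # repeated side-to-side motion

class GestureEngine:
    """Dispatches recognized gestures to registered handlers."""

    def __init__(self) -> None:
        self._handlers: Dict[Gesture, List[Callable[[], None]]] = {}

    def on(self, gesture: Gesture, handler: Callable[[], None]) -> None:
        """Register a handler to run whenever the gesture is recognized."""
        self._handlers.setdefault(gesture, []).append(handler)

    def emit(self, gesture: Gesture) -> None:
        # In a real engine this would be driven by the camera pipeline.
        for handler in self._handlers.get(gesture, []):
            handler()

engine = GestureEngine()
engine.on(Gesture.LEFT, lambda: print("previous track"))
engine.on(Gesture.BLOCK, lambda: print("silence ringer"))
engine.emit(Gesture.LEFT)   # prints "previous track"
```

The appeal of a registration scheme like this is that an application never touches the camera or image processing directly; it only declares which gestures it cares about, which is presumably how a small-footprint engine would stay easy to integrate.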
As noted above, eyeSight takes two different approaches: eyeCan, which provides a touch-free interface for everyday phone use, and eyePlay, which aims to provide a gaming interface.
Utilizing the mobile phone's built-in camera, along with advanced real-time image-processing algorithms, eyePlay allows gamers to play beyond the boundaries of their screens and keypads.
eyePlay recognizes four directions of hand motion (left, right, up, and down); in addition, it recognizes a 'blocking' hand gesture as well as a repetitive waving motion.
Enhance the user experience in your games by allowing gamers to use the actual hand motions they would use in a real-life scenario, instead of the less intuitive controls available today.
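eyeSight hasn't published how its recognition works, but one minimal way to classify those four directions is to track the hand's position across camera frames and compare its displacement along each axis. The sketch below assumes a hand-centroid position is already available from some upstream detector; the function name, threshold, and coordinates are illustrative assumptions, not eyeSight's actual algorithm.

```python
# A minimal sketch of direction classification from hand-centroid
# displacement between two frames. Assumed inputs: pixel coordinates
# with the origin at top-left, so y increases downward. This is an
# illustration, not eyeSight's actual method.

from __future__ import annotations

def classify_motion(start: tuple[float, float],
                    end: tuple[float, float],
                    min_distance: float = 20.0) -> str | None:
    """Classify a hand motion as left/right/up/down, or None if the
    displacement is too small to count as a deliberate gesture."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    if max(abs(dx), abs(dy)) < min_distance:
        return None  # jitter, not a gesture
    # Pick the dominant axis; note that front-facing cameras are often
    # mirrored, so "left" on screen may need flipping to match the user.
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

print(classify_motion((100, 120), (260, 130)))  # -> "right"
print(classify_motion((100, 120), (105, 40)))   # -> "up"
```

The blocking and waving gestures described above would need more than two frames: blocking could be detected as a large, stationary hand region, and waving as repeated sign changes in dx over a short window.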
Douglas Adams wrote about the idea of gesture-controlled systems more than thirty years ago in his 1979 blockbuster The Hitchhiker's Guide to the Galaxy. In typical form, he also illustrated some potential problems with such a system (any comments, eyeSight?):
The machine was rather difficult to operate. For years radios had been operated by means of pressing buttons and turning dials; then as the technology became more sophisticated the controls were made touch-sensitive--you merely had to brush the panels with your fingers; now all you had to do was wave your hand in the general direction of the components and hope. It saved a lot of muscular expenditure, of course, but meant that you had to sit infuriatingly still if you wanted to keep listening to the same program.
Update 15-Jul-2016: Here's an earlier reference to the idea of a gesture interface from Samuel R. Delany's Babel-17 (1966):
Rydra shook her head. She passed her hand before the filing crystal. In the concaved screen at the base, words flashed. She stilled her fingers. "Navigator-Two. . . ." She turned her hand. "Navigator-One. . . ." She paused and ran her hand in a different direction. ". . . male, male, male, female. . . ."
Rydra watched, her hand drifting through centimeters over the crystal's face. The names on the screen flashed back and forth.
Rydra's hand came down on the crystal face, and the name glowed on the screen.
(Read more about the filing crystal)