1. Eye Control
It may seem the stuff of sci-fi fantasy, but controlling your computer with your eyes is a practical reality in 2012. One system, Tobii's "Gaze Interaction," lets users navigate, scroll, zoom and select using just their eyes.
Tobii's OEM components could bring built-in eye tracking to laptops, peripherals and game consoles as part of the user interface. You could gaze directly at an icon to open an app, browse files with your eyes, and stare at an item to zoom in on it.
This tech also has the potential to make interfaces more adaptive. By just looking at a widget or icon, the item could become responsive and change or update the information displayed.
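The common pattern behind this kind of gaze-driven interface is "dwell" selection: an item activates only after your eyes have rested on it for a moment, so ordinary glancing doesn't trigger anything. Here's a minimal sketch of that idea in Python; the class name, threshold value and frame-by-frame update model are our own assumptions, not Tobii's API.

```python
import time

DWELL_THRESHOLD = 0.8  # seconds of sustained gaze before an item activates (assumed value)

class GazeDwellSelector:
    """Track which on-screen item the user is looking at and fire it
    once their gaze has rested on it for long enough."""

    def __init__(self, threshold=DWELL_THRESHOLD):
        self.threshold = threshold
        self.current_item = None
        self.gaze_started = None

    def update(self, item, now=None):
        """Feed in the item under the user's gaze for this frame.
        Returns the item when a dwell completes, else None."""
        now = time.monotonic() if now is None else now
        if item != self.current_item:
            # Gaze moved to a new target: restart the dwell timer.
            self.current_item = item
            self.gaze_started = now
            return None
        if item is not None and now - self.gaze_started >= self.threshold:
            self.gaze_started = now  # reset so the item doesn't re-fire every frame
            return item
        return None
```

An eye tracker would call `update()` every frame with whatever widget sits under the gaze point; a zoom or menu refresh would hang off the returned item.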
2. Gesture Control
Game consoles have educated consumers to the entertainment potential that gesture control offers, but such tech is also heading to your traditional PC.
While Microsoft has launched Kinect for Windows and other computer manufacturers are experimenting with gesture recognition, startup XTR3D now offers an exciting cross-platform gesture control system.
What's particularly interesting about XTR3D's solution is that it can work with any ordinary 2D camera (such as a webcam or front-facing camera), so it could easily be deployed on existing laptops and tablets.
The solution can read hand gestures up to 17 feet away, so you could open a file on your laptop by unclenching your fist, or swipe through your music library with a flick of your wrist from across the room.
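However the camera-side recognition works, the application end of a system like this usually boils down to mapping named gestures onto actions. A toy sketch of that glue layer, in Python; the gesture names and actions are invented for the example, not XTR3D's real event stream.

```python
def open_file():
    return "file opened"

def next_track():
    return "next track"

# Map recognized gestures onto application actions. The names are
# illustrative; a real recognizer would define its own vocabulary.
GESTURE_ACTIONS = {
    "open_palm": open_file,    # unclench your fist to open a file
    "swipe_left": next_track,  # flick your wrist to move through your library
}

def handle_gesture(name):
    """Dispatch a recognized gesture; unrecognized gestures are ignored."""
    action = GESTURE_ACTIONS.get(name)
    return action() if action else None
```

Because the table is just a dictionary, gestures could be rebound per-application without touching the recognizer itself.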
3. Advanced Touchscreen Control
While touch is undoubtedly an intuitive input method, touchscreens currently don't offer the same kind of easily accessed advanced controls and menu options that more traditional computers do.
As mainstream computing moves over to touch, one device that could help the transition with the multiple inputs that we're used to (such as the right mouse button or shortcut keys) is the "Ringbow."
This finger-worn tool adds extra layers of functionality to touchscreen computing. It can be used to wirelessly click or right-click, open non-visible elements, such as menus, or be programmed to replicate the actions of any traditional input keys.
4. Wearable Input Device
A wearable mouse has long been the dream for anyone looking to really immerse themselves in the computing experience. A successful Kickstarter project, the Keyglove, could see that becoming a reality in 2012.
The Keyglove is a wearable, wireless, open source input device that boasts unprecedented flexibility and convenience for all kinds of computer applications.
With exciting potential for gaming, design, art, music, device control and even data entry, the glove-based system's multi-sensor combinations mean it could be programmed to offer one-handed operation of many systems and software.
In addition to the benefits a wearable mouse could offer a traditional computer user, the Keyglove could also be of interest to users of small screens, RSI sufferers or those with physical impairment.
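The "multi-sensor combinations" idea is essentially chording: a particular set of touch sensors pressed together produces one keystroke, which is how a single glove can cover a whole keyboard. A minimal sketch of a chord lookup; the sensor names and chord table here are invented for illustration, as the open-source Keyglove defines its own touch sets.

```python
# Each chord is an unordered combination of touch sensors, so a
# frozenset makes a natural dictionary key.
CHORDS = {
    frozenset({"thumb", "index"}): "a",
    frozenset({"thumb", "middle"}): "b",
    frozenset({"index", "middle", "ring"}): "space",
}

def decode_chord(active_sensors):
    """Return the keystroke for the currently touched sensor
    combination, or None if the combination isn't bound."""
    return CHORDS.get(frozenset(active_sensors))
```

Because lookup ignores the order sensors were touched in, the wearer only has to land the right combination, not a precise sequence.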
5. Speech Control
Finally, we're taking a look at the most natural way that humans interface -- with our voices.
Speech recognition is rapidly becoming mainstream. The iPhone's "Siri" assistant, Microsoft's Kinect, Google Search and even Windows 8 will all help to make talking to your computer or gadget as commonplace as clicking a mouse.
As voice recognition, artificial intelligence, semantics and natural language technologies continue to improve, we're interested to see how speech will be incorporated into interfaces.
We predict that the press-a-button-and-speak method will become outdated as smart virtual assistants -- which offer an AI-powered, conversational-style experience -- emerge.
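At its simplest, the command side of a voice interface is a matter of matching phrases in the transcribed speech and responding, with the recognition and natural-language layers sitting underneath. A toy dispatcher over an already-transcribed utterance; the phrase table and responses are assumptions for the example, nothing like a production assistant.

```python
# Map spoken phrases onto canned responses. A real assistant would use
# natural-language understanding rather than substring matching.
COMMANDS = {
    "open my music": "launching music player",
    "what time is it": "reading the clock",
}

def handle_utterance(transcript):
    """Respond to a transcribed utterance by simple phrase matching."""
    text = transcript.lower().strip()
    for phrase, response in COMMANDS.items():
        if phrase in text:
            return response
    return "sorry, I didn't catch that"
```

The gap between this and Siri-style assistants -- handling paraphrases, context and follow-up questions -- is exactly where the AI and semantics improvements mentioned above come in.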