Friday, April 12, 2013

TECH SPECIAL...First we typed, then we clicked, now we touch. Next?



BLINK. TALK. WAVE. 

The way we interact with computers is always evolving, from the humble keyboard to high-tech touchscreens. Soon, machines will understand what you say, track your eyes for input, and obey the gestures you make with your hands.

WRITING IN AIR
    Imagine you’re on a phone call, and you need to jot down an address, but there’s no pen at hand. What do you do? Well, in the near future, you could simply scribble in the air with your finger — and when you check your handset, you will find that your ‘air’ doodle has been converted into text on your smartphone.
    Airwriting — developed at Germany’s Karlsruhe Institute of Technology (KIT) — is an input technology that uses acceleration sensors and gyroscopes attached to a thin glove to record hand movements and transmit them to a computer via a wireless connection. Of course, it first checks whether you are indeed writing. And “all movements that are not similar to writing, such as cooking or waving to someone, are ignored,” says doctoral student Christoph Amma who developed the system along with Professor Tanja Schultz at KIT’s Cognitive Systems Lab.
    The technology, presently in early stages of development, can recognise complete sentences written in capital letters and has a vocabulary of 8,000 words.
    “Airwriting currently has an error rate of 11 per cent,” Amma discloses. But “when adapted to the individual writing style of the user, the error rate drops to 3 per cent,” he says.
    The inventors are now working to make the system smaller — like an “unobtrusive wrist band” — in order to increase “wearing comfort and user acceptance”. And future plans include integrating the system into smartphones. “In which case, neither the wrist band nor a tiny keyboard would be required to write a text message,” Amma says.
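    KIT has not released its code, but the first stage of such a system (deciding which stretches of sensor data look like writing at all) can be sketched in a few lines of Python. Everything below, from the energy threshold to the sensor format, is illustrative rather than Amma's actual implementation:

    import math

    # Hypothetical threshold separating brisk writing strokes from
    # slower everyday movements; a real system would tune or learn this.
    WRITING_ENERGY_THRESHOLD = 1.5

    def motion_energy(sample):
        # Magnitude of one 3-axis accelerometer reading (ax, ay, az).
        ax, ay, az = sample
        return math.sqrt(ax * ax + ay * ay + az * az)

    def writing_segments(samples, window=20):
        # Group the stream into windows and keep only those whose
        # average energy suggests writing; cooking or waving stays
        # below the bar and is ignored, as Amma describes.
        buffer = []
        for sample in samples:
            buffer.append(sample)
            if len(buffer) == window:
                avg = sum(motion_energy(s) for s in buffer) / window
                if avg > WRITING_ENERGY_THRESHOLD:
                    yield list(buffer)  # hand this to the recogniser
                buffer.clear()

    The hard work, of course, happens in the next stage, where candidate segments are decoded into letters and words against the system's 8,000-word vocabulary.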

GAZE TRACKING
    Technology that tracks a user’s gaze is already available on mobile handsets. Samsung’s Galaxy S III, for instance, is equipped with a feature called Smart Stay that uses its front camera to detect when a person is looking at its screen — otherwise, it dims the display to save battery. And now, the South Korean company has equipped its Galaxy S IV smartphone with what it calls Eye Scroll: If the camera detects the user’s eyes moving down a web page or a document on the device, it automatically begins scrolling. And it even ‘pauses’ video if it detects that no one’s watching!
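    Samsung has not published how Eye Scroll decides when to turn the page, but the control loop is easy to imagine: poll the gaze point from the front camera and, when it lingers near the bottom of the screen, scroll. In this Python sketch, get_gaze_point and the screen constants are invented stand-ins for the camera driver:

    import time

    SCREEN_HEIGHT = 1080   # pixels; illustrative
    SCROLL_ZONE = 0.85     # gaze below 85% of screen height triggers a scroll

    def eye_scroll_loop(get_gaze_point, scroll, pause_video):
        # get_gaze_point: stand-in driver call returning (x, y) in
        # pixels, or None when no face is visible.
        while True:
            gaze = get_gaze_point()
            if gaze is None:
                pause_video()    # nobody watching: the 'pause' behaviour
            elif gaze[1] > SCREEN_HEIGHT * SCROLL_ZONE:
                scroll(lines=3)  # the reader has reached the bottom
            time.sleep(0.1)      # poll roughly ten times a second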
    But Samsung’s devices will not be the only ones that follow your eyes. Israel-based Umoove has also been working on software that will use the front-facing cameras in mobile devices to track eye and head movements — and it plans to offer its development kit to other phone manufacturers, including Apple.
    “Our product was initially developed to assist disabled persons,” Umoove co-founder and CEO Moti Krispil said in an interview. “(But) we made a decision that the technology is so diverse that we cannot just allow it to be confined.”
    Gaze-tracking, he says, can transform the way people use their devices to do things like browse the web, play games and read books.
    Still, this technology is not wholly new, and neither is it limited to mobile handsets alone. Sweden’s Tobii has been a frontrunner in this area since 2001. And already, the company’s products are widely used by the research community — in usability testing of websites and software; for pre-testing and analysing marketing campaigns; in sports research to detect concentration and flaws in hand-eye coordination — and to help the severely disabled complete everyday tasks such as browsing the internet by using the movements of their eyes.
    This year, Tobii unveiled the REX — a device that connects to a computer via USB. Once you attach it to the bottom of your PC monitor, all you have to do is calibrate it once, after which it gives your machine the ability to track and interpret your gaze. It works surprisingly well with the tile-based interface of Windows 8: if you want to access your mail app, for instance, you simply stare at it, and REX follows your eyes and moves the onscreen cursor to the icon. Then, to open the app, you press a predefined button on your keyboard or tap a touchpad.
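    Tobii's SDK is proprietary, but the division of labour the REX sets up (gaze to point, a key to select) reduces to a small event loop. The tracker, cursor and keyboard objects below are hypothetical stand-ins, not Tobii's API:

    def gaze_pointer_loop(tracker, cursor, keyboard, select_key="F12"):
        # tracker.latest_gaze(), cursor.move_to(), cursor.click() and
        # keyboard.was_pressed() are invented stand-ins for a real
        # eye-tracker SDK and the operating system's input layer.
        while True:
            point = tracker.latest_gaze()   # (x, y) on screen, or None
            if point is not None:
                cursor.move_to(*point)      # the eyes do the pointing
            if keyboard.was_pressed(select_key):
                cursor.click()              # the key does the selecting

    Splitting pointing (gaze) from selection (a deliberate key press) is what keeps such an interface from firing on every stray glance.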
    “Gaze is as intuitive as touch, as precise as the mouse and more ergonomic and effortless than both,” says Henrik Eskilsson, Tobii’s CEO and co-founder. “Laptops and tablets today are designed to feel natural either for touch or for the mouse pointer, never both. Gaze technology enables you to point in a consistent way that feels natural on all devices.”
    The company believes that the technology has immense potential in fields such as computer-aided design, military applications and computer gaming. Eventually, it hopes to integrate eye-tracking straight into devices such as laptops and tablets.

VOICE CONTROL
    In Iron Man, Tony Stark is assisted by JARVIS, a computer program that helps him build his suit. But what makes ‘Just A Rather Very Intelligent System’ really smart is its ability to converse with Stark in plain spoken English.
    In the real world, technology like JARVIS — one that can communicate in conversational language — is still a few years away. But scientists are making rapid progress in this field. The iPhone, for example, has Siri — a digital assistant that can take voice commands to complete tasks like searching the web, initiating voice calls, and returning weather updates. Similarly, Android has Voice Search, and BlackBerry 10 has its own Voice Control.
    Then there’s TomTom’s Via series of satellite navigation systems for cars, which features voice control and recognises over 1,000 commands, so you can simply ask it “where am I?” if you’re lost, or say something like “go to the nearest petrol station” for the device to chalk out a route.
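    A device like this doesn't parse free-form speech; the recognised phrase is simply matched against a fixed command table. Here is a toy version of that lookup in Python, with invented handler names:

    def show_position():
        print("Showing your current position on the map")

    def route_to_nearest(poi):
        print("Calculating a route to the nearest " + poi)

    # Illustrative phrase-to-action table; the real Via recognises
    # over 1,000 such commands.
    COMMANDS = {
        "where am i": show_position,
        "go to the nearest petrol station":
            lambda: route_to_nearest("petrol station"),
        "go to the nearest car park":
            lambda: route_to_nearest("car park"),
    }

    def handle_utterance(text):
        action = COMMANDS.get(text.strip().lower().rstrip("?"))
        if action:
            action()
        else:
            print("Sorry, I didn't catch that")

    handle_utterance("Where am I?")  # -> Showing your current position...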
    In your living room, this technology is already making inroads with electronics like Samsung’s ES8000 — a TV that allows users to increase or decrease the volume, and even switch it on with the voice command, “Hi TV”.
    Indeed, the idea is to create an ecosystem of devices that can understand voice, and even reply, if required. At the chip level, companies like Qualcomm are engineering mobile processors that can be voice activated. Earlier this year, CEO Paul Jacobs unveiled the company’s Snapdragon 800 chip, which can be “woken up” by a custom voice command. So by the end of this year, you could activate your handset by saying something like “Hey Snapdragon”, and the device will recognise the phrase even if it is in standby or airplane mode.
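    Qualcomm hasn't detailed the Snapdragon 800's implementation, but wake-word systems generally keep a tiny, low-power detector running over a rolling audio buffer and only power up the main processor on a match. A schematic Python version, with the detector left as a stand-in:

    from collections import deque

    BUFFER_FRAMES = 50   # about one second of 20ms frames; illustrative

    def sounds_like_wake_phrase(frames):
        # Stand-in for the low-power keyword spotter that would run on
        # a DSP, scoring the audio against a model of "Hey Snapdragon".
        raise NotImplementedError

    def listen_loop(read_audio_frame, wake_device):
        buffer = deque(maxlen=BUFFER_FRAMES)   # rolling window of audio
        while True:
            buffer.append(read_audio_frame())  # cheap, always-on capture
            if (len(buffer) == BUFFER_FRAMES
                    and sounds_like_wake_phrase(list(buffer))):
                wake_device()   # only now does the main CPU power up
                buffer.clear()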

USING GESTURES
    Gesture tech works on a basic premise: You are the controller. No keyboards or mouse, no gamepads or joysticks. Just you. And right now, some of the best work in this area revolves around Microsoft’s Kinect accessory.
    The device, which pairs a camera with an infrared depth sensor to detect actions in 3D space, started out as an add-on to the Xbox 360 game console, allowing users to play by using gestures alone. But in February last year, Microsoft launched development kits that allow programmers to build gesture-based software for Windows PCs. “Just as Kinect revolutionised gaming, we’ll see it revolutionize other industries, like entertainment, healthcare and more,” Microsoft CEO Steve Ballmer said when announcing the launch.
    And already, companies have developed software that actually works. Canada-based GestSure, for instance, has created a program that lets surgeons use gestures to control a computer showing medical images, so they don’t have to physically operate ‘contaminated’ hardware after sterilisation. The tool recognises hand gestures to let the physician navigate through reports and pictures during surgery.
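    The Kinect SDK hands applications a stream of tracked skeleton joints, so recognising a 'next image' swipe comes down to watching the hand's horizontal velocity. The Python sketch below is a generic illustration of that idea, not GestSure's code; the positions, threshold and viewer object are all invented:

    SWIPE_SPEED = 0.8   # metres per second; illustrative threshold

    def detect_swipe(prev_hand, hand, dt):
        # Classify two successive right-hand positions (x, y, z in
        # metres, as the Kinect reports them) as a swipe, or nothing.
        vx = (hand[0] - prev_hand[0]) / dt
        if vx > SWIPE_SPEED:
            return "next"
        if vx < -SWIPE_SPEED:
            return "previous"
        return None

    def review_images(hand_positions, dt, viewer):
        # Drive an image viewer from a stream of hand positions, so
        # the surgeon never has to touch the hardware itself.
        stream = iter(hand_positions)
        prev = next(stream)
        for hand in stream:
            gesture = detect_swipe(prev, hand, dt)
            if gesture == "next":
                viewer.next_image()
            elif gesture == "previous":
                viewer.previous_image()
            prev = hand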
    In the home space, LG and Samsung have already demonstrated smart TVs that recognise gestures. LG’s technology includes a Kinect-like accessory that lets users navigate through the TV’s menu system and apps: they point at what they want by waving their hands and select by making a closed fist. Samsung’s TVs, on the other hand, use an in-built camera; users gesture with open palms to navigate the menu system and close a hand into a fist when they want to select an option.
    And then there are chip manufacturers such as Nvidia and Qualcomm, as well as companies like Fujitsu and Toshiba, which are working to bring image-tracking and motion-detection technology to mobile phones and tablets. This will enable users to play games, browse the web, or use a ‘near swipe’ gesture to turn the page of a recipe ebook when their hands are messy from cooking.
TEXT: SAVIO D’SOUZA



