Virtual Touch Screen Floats in Mid-Air

A Taiwanese technology company has developed the ultimate in hands-free technology: a virtual, see-through touch screen or keyboard display that, when “touched” in mid-air by the user’s finger, transmits that signal directly to the computer or mobile device. Physical contact with a touchpad or keyboard is not required for typing or transmitting commands. The user’s hands are totally free of any computing or transmitting device.

The Industrial Technology Research Institute (ITRI) calls its new product “iAT,” short for “i-Air Touch” technology. Wearing a pair of special glasses, users can type on a virtual “floating” keypad, keyboard, mouse, or touch panel just as they would on physical keys, maximizing flexibility, comfort, and convenience. Because only the wearer can see the input device in virtual space, what is typed stays private.

How It Works

The key elements for this technology are special eyeglasses, an internal DDDR (defined distance with defined range) camera, and an air-touch interface. When linked together, these components allow users to interact with a virtual input device that can be changed at any time from a simple keypad to a keyboard or a touch screen.
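ITRI has not published a programming interface for i-Air Touch, but a rough sketch can make that linkage concrete. The Python below models the three components as a single system and the fact that the virtual input device can be swapped at any time; every class, method, and device name here is hypothetical, used only for illustration.

```python
# Hypothetical sketch of the i-Air Touch component linkage described in
# the article; ITRI has not published an API, so all names are made up.

from enum import Enum, auto


class VirtualDevice(Enum):
    """Input devices the glasses can render; switchable at any time."""
    KEYPAD = auto()
    KEYBOARD = auto()
    MOUSE = auto()
    TOUCH_PANEL = auto()


class IAirTouchSystem:
    """Links the see-through glasses, DDDR camera, and air-touch interface."""

    def __init__(self) -> None:
        self.device = VirtualDevice.KEYPAD

    def switch_device(self, device: VirtualDevice) -> None:
        # Swapping the virtual input device needs no new hardware,
        # only a different overlay on the eyeglass display.
        self.device = device
        print(f"Glasses now display a virtual {device.name.lower()}")


if __name__ == "__main__":
    system = IAirTouchSystem()
    system.switch_device(VirtualDevice.KEYBOARD)
    system.switch_device(VirtualDevice.TOUCH_PANEL)
```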

The eyeglasses are required to actuate the keys on the virtual input device. They are equipped with miniature displays, in either binocular or monocular form, that let the user see the surrounding environment while also viewing data, images, and the virtual input devices.

The DDDR camera is the most important part of the air-touch technology and is designed to consume as little power as possible, a major challenge for wearable computing devices. Its special phase-coded and color-coded lens discerns an object only at a predetermined distance of 28 cm to 32 cm, so the camera transmits a signal only when it detects a fingertip within that input range. If no fingertip is present, the signal shuts off, which further conserves power.
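As a rough illustration of that range gating, the sketch below passes a detection on to image processing only when a fingertip sits inside the 28 cm to 32 cm band quoted above, and otherwise leaves the camera’s output signal off. The function name and the way distance is reported are assumptions made for the example, not part of ITRI’s design.

```python
# Hypothetical sketch of the DDDR camera's range gating: the lens only
# resolves a fingertip between 28 cm and 32 cm, so anything outside that
# band is never handed to image processing, saving power.

from typing import Optional

NEAR_LIMIT_CM = 28.0
FAR_LIMIT_CM = 32.0


def gate_fingertip(distance_cm: Optional[float]) -> bool:
    """Return True only if a fingertip sits inside the defined range.

    A distance of None means no fingertip was detected at all, in which
    case the camera keeps its output signal shut off.
    """
    if distance_cm is None:
        return False
    return NEAR_LIMIT_CM <= distance_cm <= FAR_LIMIT_CM


if __name__ == "__main__":
    for d in (None, 20.0, 29.5, 30.0, 40.0):
        state = "transmit" if gate_fingertip(d) else "off"
        print(f"fingertip at {d} cm -> signal {state}")
```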

The viewer wearing the glasses can see a 10-inch virtual screen in front of the eyes. Image: ITRI

When the DDDR camera captures an image of the user’s fingertip, it splits the image into green and red color codes to provide segmentation in image processing, while phase coding provides distance and depth perception of the fingertip. The camera lens focuses the green light component at 28 cm and the red at 32 cm. The combined green and red components resolve to the strongest image signal at 30 cm, the distance of the virtual input device from the user, and the camera captures the image signal at that distance as input. Because the camera cannot “see” image signals outside the 28-to-32-cm window around the virtual target plane, anything that falls short of or beyond that window is never processed, which conserves image-processing power.
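One way to picture how the two color channels single out the 30 cm plane is to model each channel’s sharpness as peaking at its own focus distance. The sketch below assumes a Gaussian focus response (ITRI has not published the actual optics), multiplies the green and red responses, and shows that the combined signal is strongest at about 30 cm; the depth-of-field value is an illustrative assumption.

```python
import math

# Hypothetical model of the DDDR camera's dual-focus behavior: the lens
# focuses the green channel at 28 cm and the red channel at 32 cm.
# Each channel's sharpness is modeled as a Gaussian around its focus
# distance (an assumption); the combined green*red signal peaks at ~30 cm.

GREEN_FOCUS_CM = 28.0
RED_FOCUS_CM = 32.0
DEPTH_OF_FIELD_CM = 2.0  # assumed spread of each channel's focus


def channel_sharpness(distance_cm: float, focus_cm: float) -> float:
    """Relative sharpness of one color channel at a given distance."""
    return math.exp(-((distance_cm - focus_cm) ** 2) / (2 * DEPTH_OF_FIELD_CM ** 2))


def combined_signal(distance_cm: float) -> float:
    """Green and red sharpness multiplied: strongest where both are in focus."""
    return (channel_sharpness(distance_cm, GREEN_FOCUS_CM)
            * channel_sharpness(distance_cm, RED_FOCUS_CM))


if __name__ == "__main__":
    # Sweep distances from 26 cm to 34 cm and find the strongest response.
    distances = [26.0 + 0.5 * i for i in range(17)]
    for d in distances:
        print(f"{d:4.1f} cm  signal {combined_signal(d):.3f}")
    best = max(distances, key=combined_signal)
    print(f"strongest combined signal at ~{best:.1f} cm")
```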

Finally, the air-touch interface functionally links the DDDR camera and the special eyeglasses together. The interface interprets what the DDDR camera detects as user input and relays it to the wearable display, which then displays a corresponding graphical response to the user. This linkage gives the user a sense of action-and-result, as normally experienced with physical input devices.
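A sketch of what that interpret-and-respond step might look like: the interface maps the fingertip position the camera reports on the virtual plane to a key, then hands the eyeglass display a response to render. The keypad layout, key size, and function names are illustrative assumptions, not ITRI’s design.

```python
# Hypothetical sketch of the air-touch interface: take the fingertip
# position the DDDR camera reports on the 30 cm virtual plane, look up
# which virtual key lies under it, and tell the eyeglass display to
# highlight that key so the user gets an action-and-result response.

from typing import Optional, Tuple

# A toy 3x3 virtual keypad laid out on the virtual plane (row, column -> label).
VIRTUAL_KEYPAD = [
    ["1", "2", "3"],
    ["4", "5", "6"],
    ["7", "8", "9"],
]
KEY_SIZE_CM = 2.0  # assumed size of each virtual key


def key_at(fingertip_xy_cm: Tuple[float, float]) -> Optional[str]:
    """Map a fingertip position on the virtual plane to a key label."""
    x, y = fingertip_xy_cm
    col = int(x // KEY_SIZE_CM)
    row = int(y // KEY_SIZE_CM)
    if 0 <= row < len(VIRTUAL_KEYPAD) and 0 <= col < len(VIRTUAL_KEYPAD[0]):
        return VIRTUAL_KEYPAD[row][col]
    return None


def render_feedback(key: Optional[str]) -> str:
    """Graphical response the glasses would show back to the user."""
    return f"highlight key '{key}'" if key else "no key under fingertip"


if __name__ == "__main__":
    for touch in [(1.0, 1.0), (4.5, 3.0), (9.0, 9.0)]:
        print(touch, "->", render_feedback(key_at(touch)))
```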

Future Possibilities

There are, of course, many possible applications for this type of technology. It is ideal for wearable computers because of its accuracy, versatility, privacy, and overall convenience.

“i-Air Touch creates new possibilities for wearable and mobile computing by freeing users from the distraction of locating and touching keys on a physical input device, resulting in hands-free computing and improving security over voice commands,” says Golden Tiao, deputy general director for the Industrial Technology Research Institute’s Electronics and Optoelectronics Research Laboratories. “In addition to consumer applications, this new technology is suitable for medical applications such as endoscopic surgery, augmented reality technologies for gaming, and any industrial applications that benefit from hands-free input.”

Mark Crawford is an independent writer.
