IBM and Emotiv show advances in virtual reality worlds
- 10 January, 2008 16:20
Emotiv Systems, partnering with IBM, has demonstrated an alpha version of a neural input device that it plans to unveil as a consumer product at the Game Developers Conference in San Francisco next month.
The device is designed to wirelessly transmit the brain's electrical signals, including those reflecting emotions and conscious thoughts, from sensors on a person's head to a PC.
IBM believes that such neural input can be an important part of a broad range of virtual reality uses for industry, not just for games, said Dave Kamalsky, project manager of virtual world research.
Alongside the Emotiv demonstration, IBM was showing a variety of virtual reality systems, including Second Life and Activeworlds, that businesses can use for training employees, holding meetings and demonstrating products to consumers.
Emotiv's working product name is the Emotiv Headset, which would be priced similarly to a high-end handheld game controller, said Patrick McGill, a spokesman at the San Francisco-based startup.
The alpha version includes about a dozen sensors that pick up the brain's signals, which are transmitted to the PC via a 2.4 GHz wireless link, said Emotiv product engineer Marco Della Torre. He demonstrated the alpha version while wearing the sensors, which picked up his eye movements, eye blinks, smiles and frowns and displayed them on the PC and a large screen at the Emotiv booth. Each facial gesture was quickly and accurately rendered on a large graphical representation of a face on the display.
In addition to the simpler facial expressions, Della Torre was able to transmit the brain's affective states, such as calm or excitement, and even cognitions. The cognitions (conscious control) that Della Torre demonstrated were the ability to make an animated cube on the display move up or down or spin in space. He was able to train the software to interpret each cognition in less than 20 seconds.
While such capabilities might seem rudimentary, the control of the animated cube could eventually be extended so that a user could "think" whether an avatar in a virtual world should gesture with face or hands, shake someone's hand, or even throw a ball, Della Torre said. By comparison, Second Life avatars already support many controls, including facial expressions, walking and even flying, but all must be input through a keyboard.
Della Torre said many small companies have been working with neural input technologies for years, including devices to help the disabled, but Emotiv hopes to be one of the first to market with a device that has a broad range of capabilities.