Microsoft Research showed off a small slice of its futuristic work Tuesday, including user-input sensing technology and graphics technology that morphs still pictures, video and audio into a single display.
The demonstrations were part of the 15th anniversary celebration of the opening of Microsoft Research (MSR), which now has 700 researchers working out of five labs around the globe.
MSR is staffed by computer scientists, psychologists, sociologists, anthropologists and medical doctors who are tasked with pushing the envelope on state-of-the-art technology as much as, or more than, transferring their technology into new and existing Microsoft products.
"Technology transfer is a full contact sport," said Rick Rashid, senior vice president in charge of Microsoft Research. "It can happen by accident, but mostly it is hard work."
MSR's technology is in products ranging from Windows Server to Xbox 360.
MSR focuses its work across 55 disciplines in 20 areas, including databases, security, mobility/wireless, social interaction/collaboration, graphics, adaptive technologies and next-generation user interfaces.
During the anniversary event for press and analysts, Dan Ling, director of Microsoft's lab in Redmond, led a series of demonstrations showing off a slice of the lab's work, including one of those new interfaces, called TouchLight.
The technology, developed by researcher Andy Wilson, uses computer vision and sensing to enable new applications, including gesture-based inputs that replace the mouse and keyboard. Wilson used a projector and a camera to project a rectangular white box onto a tabletop. Using his hands, he interacted with objects projected onto this "desktop," such as a bouncing ball. He brought up a map and zoomed in and out and rotated it by moving his hands on the surface. He also laid a piece of paper on the surface, which then became a screen showing a video. An integrated Bluetooth-like technology called Blue Rendezvous let Wilson lay a camera phone on the surface and have its pictures automatically downloaded to the computer; disconnecting the phone from the system was simply a matter of picking it up. Wilson said the TouchLight technology could have applications in areas such as video conferencing and augmented reality.
"This uses a lot of computer vision technology which we began developing 10 years ago, but as we move into the digital era we have more and more ideas how to apply computer vision," said Ling.
Another demonstration showed the Life Browser, a search-based technology that interprets activity and context to augment or refresh a user's memory. The Life Browser constructs a timeline using data on a user's machine, including audio, video, pictures and text. A slider bar allows users to increase or decrease the level of detail. Users can see file activity, which documents have recently been edited and the order in which things have been accessed.
"We are looking at this technology for things like autobiographical tools," said Eric Horvitz, principal researcher and research area manager at Microsoft. "It started out as a search browser and now it has expanded."
Microsoft also showed machine learning projects that collect and analyze data about a user and his behavior, then make inferences based on that data, such as whether a user should be bothered with an incoming call.
Another demonstration was of a technology called Code Thumbnails, a developer tool that replaces the scroll bar in Visual Studio to provide a more "human visual" representation of the developer's work. Users can click on parts of the code thumbnails to move the editing tools to that part of the file, or search for certain methods and see where they appear in the entire file. Other visualization technologies included FastDash, a dashboard that shows all the pieces of a project, including checked-out or open files, and FaThumb, which uses hierarchies of metadata to narrow search results and reduce typing on devices such as mobile phones.
Microsoft also demonstrated its work with computer graphics, computer vision and image-based modeling and rendering, including technology that stitches together photos, video and sound into a single display.
Researcher Rick Szeliski showed a set of pictures taken on a hike in the mountains that were stitched into a single image, which also incorporated video of a waterfall shot on the same trip along with audio recorded there.
Also shown was a panoramic shot of downtown Seattle comprising 800 images at various levels of detail. From the panorama, the user was able to zoom in on a pair of gloves lying at a construction site and a fake owl perched on a rooftop.
Most of the technology demonstrated has not found its way into the Microsoft product set, but the company is planning to offer an early version of the photo stitching software online in the coming months under the name Photosynth, which includes technology acquired in February when Microsoft bought Seadragon Software.
"Research is about getting the opportunity to see the future," concluded Kevin Schofield, general manager for strategy and communications at MSR.