At this year’s Consumer Electronics Show, Microsoft Research unveiled one of its latest projects, called IllumiRoom. Like most of the company’s ongoing technological marvels of the future, this one is quite fascinating, so perk up. Imagine you’re playing a video game, let’s say Halo 4, and all of a sudden the on-screen action extends beyond the confines of your TV set. In an instant the mysterious jungles of planet Requiem surround you, and you feel as if you’re truly immersed inside the captivating game developed with precision by 343 Industries. Microsoft’s IllumiRoom attempts “to blur the lines between on-screen content and the environment we live in, allowing us to combine our virtual and physical worlds.”
So how does it all work? A peek behind the curtain reveals two devices: a Kinect for Windows camera paired with a projector. “Our system uses the appearance and the geometry of the room (captured by Kinect’s sensors) to adapt projected visuals in real-time without any need to custom pre-process the graphics.” Sounds simple now, doesn’t it? Unfortunately, like most Microsoft Research projects, IllumiRoom is only a proof of concept, but with engineers working hard to make Kinect even more powerful and projectors keeping pace with high-definition resolutions, the technology is there for this prototype to enter the marketplace. Let’s place IllumiRoom in the pile labeled “not if, but when.”
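Microsoft hasn’t published IllumiRoom’s code, but the quoted description hints at the core trick: use the Kinect depth map to adapt what the projector emits, per pixel, to the room’s geometry. Here is a deliberately tiny sketch of that idea, in which projected brightness is compensated for surface distance so content looks consistent on near and far walls. The function name, the falloff model, and all the numbers are my own illustration, not IllumiRoom’s actual pipeline.

```python
import numpy as np

def adapt_projection(frame, depth, base_depth=2000.0):
    """Toy geometry-aware projection: boost pixels that land on
    farther surfaces (inverse-square falloff compensation) so the
    projected image keeps roughly even brightness across the room.
    Illustrative only; not the real IllumiRoom algorithm."""
    # gain = 1 at the calibration distance, larger for farther surfaces
    gain = np.clip((depth / base_depth) ** 2, 0.2, 5.0)
    out = np.clip(frame * gain[..., None], 0, 255)
    return out.astype(np.uint8)

# A flat wall at the 2 m calibration distance needs no compensation
frame = np.full((4, 4, 3), 100, dtype=np.float64)   # gray test image
depth = np.full((4, 4), 2000.0)                     # depth in mm
print(adapt_projection(frame, depth)[0, 0])  # -> [100 100 100]
```

The real system additionally warps the image so it lines up with furniture edges; this sketch only shows the “use the depth map to modify the output” half of the story.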
Watch IllumiRoom perform its magic in the video embedded above; Microsoft assures us the action was “captured live and is not the result of any special effects added in post production.”
The inventive minds at Microsoft Research and the company’s Applied Sciences Group are experimenting with advanced technologies to come up with new ways of computing and communicating for the future.
First up is “IllumiShare”, a camera-projector pair that enables remote people to share any physical or digital object on any surface. As you can see in the demonstration embedded above, with IllumiShare a simple Skype conversation can be transformed into an interactive workspace that can be manipulated by one or more people. The applications for this are endless; this tech can enable remote gameplay, as well as introduce new methods of remote teaching.
Microsoft Research is back, and this time they’re bringing a new technology to the table (hehe) that’s going to eliminate any desire you might have had to purchase an exorbitantly priced Microsoft Surface.
LightSpace combines elements of surface computing and augmented reality research to create a highly interactive space where any surface, and even the space between surfaces, is fully interactive. Our concept transforms the ideas of surface computing into the new realm of spatial computing.
In essence, LightSpace rips the multiple 3D depth cameras and projectors out of their secret hiding place beneath a table and mounts them on the ceiling. In effect, this means that all Surface user interfaces and features can be displayed on virtually any flat surface; the actual Surface table is no longer required. You’re going to want to watch the video demonstration above; some of the LightSpace applications are quite extraordinary. In one example a Microsoft researcher “picks up” an object located on a table projection and carries it in his hand to a second wall display. It’s drag-and-drop IRL. Now remember, this is a Microsoft Research project, so there’s no telling how long it’s going to cook in the labs before it makes its way to the general public (if ever).
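The “drag-and-drop IRL” demo boils down to a simple idea: the overhead depth cameras track the user’s hand in room coordinates, and the system knows the footprint of every registered surface, so an object “held” in the hand can be dropped onto whichever surface the hand ends up over. This toy sketch invents the room calibration, the coordinates, and every name; it only illustrates the logic, not LightSpace’s implementation.

```python
# Invented room calibration: each surface's footprint in room
# coordinates (meters), as the ceiling cameras would see it.
SURFACES = {
    "table": {"x": (0.0, 1.0), "y": (0.0, 1.0)},
    "wall":  {"x": (2.0, 3.0), "y": (0.0, 2.0)},
}

def surface_under(hand_xy):
    """Return the surface whose footprint contains the hand, if any."""
    x, y = hand_xy
    for name, r in SURFACES.items():
        if r["x"][0] <= x <= r["x"][1] and r["y"][0] <= y <= r["y"][1]:
            return name
    return None

def carry_object(hand_path):
    """Follow a tracked hand path; the carried object lands on the
    last surface the hand passed over."""
    location = None
    for hand_xy in hand_path:
        location = surface_under(hand_xy) or location
    return location

# Pick up on the table, walk through open space, drop on the wall
print(carry_object([(0.5, 0.5), (1.5, 0.7), (2.4, 1.0)]))  # -> wall
```

The space between surfaces is what makes this interesting: while the hand is over no surface, the object simply travels with it.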
With Google Maps and Bing Maps Streetside you can navigate between immersive 360-degree panoramas to visualize your route. Although this street-view integration is very helpful for previewing a route before you drive it, it has problems, and the brainiacs at Microsoft Research think they have the solution. They describe the problem like this: “The discrete moves from bubble [360-degree panorama] to bubble enabled in these systems do not provide a good visual sense of a larger aggregate such as a whole city block. Multi-perspective ‘strip’ panoramas can provide a visual summary of a city street but lack the full realism of immersive panoramas.” In other words, they can be quite disorienting. Their solution, called Street Slide, allows you to seamlessly zoom out of the bubble to view a multi-perspective panorama of a street. In this zoomed-out view you can pan across an entire street to find exactly what you’re looking for, or to plan your route more effectively. Once you find a particular destination or a location you’d like to investigate further, simply zoom in to view that part of the street in more detail. The mapping tech is extremely impressive; check it out for yourself in the demonstration above. The developers are currently building an iPhone (and presumably a Windows Phone 7) version of the maps to bring to mobile devices. Don’t get too excited, though; only about 2,400 panoramas covering 4 kilometers of streets have been captured thus far.
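To make the bubble-to-strip transition concrete: as you zoom out of one 360-degree bubble, slices from neighboring bubbles slide in alongside it until the whole block is visible as a multi-perspective strip. This sketch fakes that with a list of named bubbles and a zoom level; the data, function, and slice math are invented for illustration and are not Microsoft’s actual rendering code.

```python
def strip_slices(bubbles, center_idx, zoom):
    """zoom=0.0 -> only the current bubble (immersive view);
    zoom=1.0 -> slices from every bubble on the block, forming
    the multi-perspective strip panorama. Illustrative only."""
    radius = int(round(zoom * (len(bubbles) - 1)))
    lo = max(0, center_idx - radius)
    hi = min(len(bubbles) - 1, center_idx + radius)
    return bubbles[lo:hi + 1]

block = ["bubble%d" % i for i in range(5)]
print(strip_slices(block, 2, 0.0))  # -> ['bubble2']
print(strip_slices(block, 2, 1.0))  # all five bubbles in the strip
```

Because the zoom parameter is continuous, the transition never "jumps" from bubble to bubble, which is exactly the disorientation the researchers set out to remove.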
Microsoft Research is back with a new way to interact with their Surface multitouch table.
Manual Deskterity is a prototype digital drafting table that supports both pen and touch input. We explore a division of labor between pen and touch that flows from natural human skill and differentiation of roles of the hands. We also explore the simultaneous use of pen and touch to support novel compound gestures.
The combination of pen and touch input makes for a wide range of gestures, like holding, tapping, dragging, and crossing, that can be used in ways you’ve likely never seen before. Check it out in the video demonstration above. I smell a hint of Courier here.
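The quoted “division of labor” is easy to picture as an input dispatcher: roughly, the pen writes, touch manipulates, and pen plus touch together forms a compound gesture (e.g. holding an object with one hand while the pen cuts it). This toy router is my own paraphrase of that split; the event shapes and action names are invented, not Manual Deskterity’s API.

```python
def route(event):
    """Dispatch a (device, touching_fingers) event per the
    pen-writes / touch-manipulates division of labor."""
    kind, fingers = event
    if kind == "pen" and fingers == 0:
        return "ink stroke"           # pen alone writes
    if kind == "touch":
        return "move/zoom object"     # touch alone manipulates
    if kind == "pen" and fingers > 0:
        return "compound gesture"     # pen while holding with touch
    return "ignored"

print(route(("pen", 0)))    # -> ink stroke
print(route(("touch", 2)))  # -> move/zoom object
print(route(("pen", 1)))    # -> compound gesture
```

The interesting design point is the third branch: instead of modes or toolbars, the simultaneous presence of both inputs is itself the mode switch.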
Microsoft’s Surface table is fairly large and very expensive, two factors that don’t mesh well with the general consuming public. Microsoft gets that, so it has gone ahead and created a prototype version of its multitouch table called Mobile Surface. Like its older sibling, Mobile Surface uses a projector/camera combo that allows you to interact with on-screen images. The difference here is that the image can be projected onto any surface (making it portable), and it allows for in-air manipulation. For example, as seen in the video above, you can play the drums without physically touching the tabletop. Mobile Surface links up to a secondary device, like a cell phone or laptop, to indicate what you’re interacting with. Pretty neat, if you ask me. Currently Mobile Surface is a Microsoft Research project, and Microsoft did not comment on a potential mainstream release.
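One plausible way in-air manipulation like the drum demo could work is by classifying the hand’s height above the projected surface: at zero height it’s a touch, within a mid-air band it’s an aerial hit, and above that it’s ignored. The threshold values and function below are entirely hypothetical, just to illustrate the distinction between touch and in-air input.

```python
# Hypothetical height band (mm above the surface) for mid-air hits;
# not values from Microsoft's actual Mobile Surface prototype.
AIR_BAND = (20.0, 120.0)

def classify(height_mm):
    """Classify a tracked hand by height above the projection."""
    if height_mm <= 0.0:
        return "touch"
    if AIR_BAND[0] <= height_mm <= AIR_BAND[1]:
        return "in-air hit"      # e.g. striking a virtual drum
    return "idle"

for h in (0.0, 60.0, 300.0):
    print(h, classify(h))  # touch, in-air hit, idle
```

Whatever the real sensing pipeline is, some threshold of this sort is needed so that waving a hand across the table doesn’t trigger every virtual control beneath it.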
Microsoft Research believes it can. The video above goes behind the scenes at MS Research, revealing five prototype mice that are still in the works: “cap mouse,” “FTIR (Frustrated Total Internal Reflection) mouse,” “orb mouse,” “arty (articulated) mouse,” and “side mouse.” Each of them uses a different method of multitouch to perform on-screen actions. They’re like the multitouch trackpads featured in laptops, converted and developed into mouse form. Very interesting stuff, to say the least. I’m glad to see Microsoft taking a look at unique and intuitive input methods for the future. With word of a new multitouch-capable Apple Mighty Mouse in the works, this was a timely move for Microsoft.