At this year’s Consumer Electronics Show, I walked many miles and saw countless demos. Several of these demos were memorable, but one in particular really got my mental gears turning: Microsoft’s HoloLens.
HoloLens will spur many “aha” moments, leading to accelerated innovation in wearable computer vision devices, low-power 3D computer vision, and mixed reality.
HoloLens, of course, is Microsoft’s “mixed reality” glasses product, which has been shipping in pre-production form for about a year. I would previously have described HoloLens as “augmented reality,” since it overlays computer-generated graphics on the user’s view of the physical world. But here I’m adopting Microsoft’s preferred term, “mixed reality,” which many people now use to describe systems in which “people, places, and objects from your physical and virtual worlds merge together.”
Over the past five years, I’ve seen many demos of virtual reality, augmented reality and mixed reality. Most of these showed promise—but the promise usually felt distant, because the demos weren’t sufficiently polished to feel “real,” and weren’t easy to use.
That was then, this is now: HoloLens has nailed both the “feels real” and ease-of-use aspects. Wearing HoloLens, I played a shoot-’em-up video game against an army of robots, illustrated in this video. The experience was stunning, thanks to three key capabilities. First, HoloLens is a wearable, battery-powered device, so I was able to move about the room to dodge hostile robots. Second, HoloLens accurately mapped the room I was in, enabling the robotic invaders to create what looked like real cracks in the actual walls of the room. And third, as I turned my head and shifted my position within the room, HoloLens adapted to these movements seamlessly, so that the illusion of merged physical and virtual worlds was maintained.
Now that I’ve experienced robust mixed reality, I foresee many compelling applications for this technology beyond gaming: enabling physicians to see inside the body for safer, more accurate treatment; giving utility workers a clear view of underground pipes and cables; providing consumers with a realistic preview of how a room will look after redecorating; and allowing museum visitors to watch a skeleton transform into a fully formed, animated dinosaur. (The fact that HoloLens sells for $3,000 suggests that, for a while at least, this technology is more likely to be adopted by hospitals, utility companies, and museums than by individual consumers.)
Of course, a convincing mixed reality (“MR”) experience, one in which the virtual and physical worlds interact in a realistic way, requires the MR device to maintain an accurate three-dimensional understanding of the surrounding physical world, and of the user’s position within it, with very low latency. That is, it requires fast, highly accurate 3D computer vision.
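To make the latency point concrete, here is a minimal sketch, in C++, of one common ingredient of such systems: predicting the head pose a few milliseconds into the future so that virtual content is rendered where the head will be when the image actually reaches the eye. This is not HoloLens’s actual algorithm; the 16 ms prediction interval and the simple gyroscope-only quaternion integration are illustrative assumptions.

```cpp
#include <cmath>
#include <cstdio>

// Simple quaternion type representing head orientation.
struct Quat { float w, x, y, z; };

// Integrate orientation q forward by angular velocity (wx, wy, wz) in rad/s
// over dt seconds, using the first-order relation dq/dt = 0.5 * q * (0, w).
Quat predictOrientation(const Quat& q, float wx, float wy, float wz, float dt) {
    // Quaternion derivative: 0.5 * q * omega
    Quat dq = {
        0.5f * (-q.x * wx - q.y * wy - q.z * wz),
        0.5f * ( q.w * wx + q.y * wz - q.z * wy),
        0.5f * ( q.w * wy - q.x * wz + q.z * wx),
        0.5f * ( q.w * wz + q.x * wy - q.y * wx)
    };
    Quat p = { q.w + dq.w * dt, q.x + dq.x * dt,
               q.y + dq.y * dt, q.z + dq.z * dt };
    // Renormalize to keep p a unit quaternion.
    float n = std::sqrt(p.w * p.w + p.x * p.x + p.y * p.y + p.z * p.z);
    p.w /= n; p.x /= n; p.y /= n; p.z /= n;
    return p;
}

int main() {
    Quat head = {1.0f, 0.0f, 0.0f, 0.0f};  // current head orientation
    float gyro[3] = {0.0f, 1.5f, 0.0f};    // rotating about the y axis at 1.5 rad/s (~86 deg/s)
    float latency = 0.016f;                // assumed 16 ms motion-to-photon budget

    // Render virtual content using the *predicted* pose, so that it stays
    // locked to the physical world despite the pipeline delay.
    Quat predicted = predictOrientation(head, gyro[0], gyro[1], gyro[2], latency);
    std::printf("predicted orientation: %.4f %.4f %.4f %.4f\n",
                predicted.w, predicted.x, predicted.y, predicted.z);
    return 0;
}
```

In a real MR pipeline this kind of prediction would be fused with camera-based tracking, and the rendered image would typically be warped again just before display; the sketch simply shows why low-latency sensing and rendering are inseparable from the “feels real” quality described above.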
Mixed reality doesn’t necessarily require a wearable device. Vehicle applications, for example, can use the windshield as a projection screen, and 8tree’s clever handheld device for quantifying surface damage projects information directly onto the surface being inspected. But in many cases, glasses are the most compelling way to deliver mixed reality: they leave your hands free, they know where you are looking, and they can project information into your field of view wherever you look. Packing all of the technology required for a convincing MR experience into a wearable device is a daunting challenge, however. With HoloLens, Microsoft has given us a hint of what’s possible. The HoloLens team has clearly put enormous effort into everything from custom chips to industrial design to create a device that’s reasonably comfortable to wear (though still bulky).
One of the key challenges for developers of products like HoloLens is harnessing heterogeneous compute resources, such as CPUs, GPUs, DSPs, FPGAs, and fixed-function accelerators, to deliver high performance at low cost and low energy consumption. HSA (Heterogeneous System Architecture) provides an approach that enables developers to apply these diverse compute resources easily and efficiently to demanding applications on today’s complex SoCs, in part by letting the different compute units share the same memory.
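To illustrate the kind of pattern such a framework enables, here is a minimal, self-contained C++ sketch; the `Device` abstraction and the tiny blur kernel are hypothetical stand-ins, not the HSA runtime API. The idea is that a vision kernel is expressed once and dispatched to whichever compute unit currently offers the best efficiency, and, with shared memory of the kind HSA specifies, no buffer copies are needed when the work moves between units.

```cpp
#include <cstdint>
#include <functional>
#include <iostream>
#include <string>
#include <vector>

// Hypothetical abstraction for illustration only; this is NOT the HSA runtime
// API. It shows the dispatch pattern that shared-memory heterogeneous
// frameworks make practical: one kernel description, many possible executors.
struct Device {
    std::string name;
    double perfPerWatt;  // figure of merit used to pick an executor
    std::function<void(const std::uint8_t*, std::uint8_t*, int)> run;
};

// 3-tap box blur as a stand-in for a vision kernel (CPU reference version).
static void blurCpu(const std::uint8_t* src, std::uint8_t* dst, int n) {
    for (int i = 1; i + 1 < n; ++i)
        dst[i] = static_cast<std::uint8_t>((src[i - 1] + src[i] + src[i + 1]) / 3);
}

int main() {
    // In a real system these entries would wrap GPU or DSP dispatches; here
    // both fall back to the CPU kernel so the sketch stays self-contained.
    std::vector<Device> devices = {
        {"cpu", 1.0, blurCpu},
        {"dsp (simulated)", 4.0, blurCpu},
    };

    // Pick the most efficient available device for this workload.
    const Device* best = &devices[0];
    for (const auto& d : devices)
        if (d.perfPerWatt > best->perfPerWatt) best = &d;

    std::vector<std::uint8_t> frame(640 * 480, 128), out(frame.size(), 0);

    // With shared memory visible to every compute unit, the same pointers can
    // be handed to whichever unit runs the kernel, with no buffer copies.
    best->run(frame.data(), out.data(), static_cast<int>(frame.size()));
    std::cout << "ran blur on: " << best->name << "\n";
    return 0;
}
```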
Learn more about heterogeneous computing for efficient computer vision at the upcoming Embedded Vision Summit. Marc Pollefeys, Director of Science for HoloLens and a pioneer in 3D computer vision, will be one of the keynote speakers. If you work in the computer vision field and are interested in implementing visual intelligence in real-world devices, this event should be on your calendar! Register by March 15th using promo code bphsaf0308 to save 15%.