The experience of actually using the HoloLens 2 can be difficult to describe to someone who hasn't had the chance to interact with the device in person and been overwhelmed by its immersive capabilities.
That's why any new look at the augmented reality magic enabled by the HoloLens 2 deserves attention as we chart how computing is about to change for the entire planet. This latest demo is no different.
This demonstration focuses on the hand tracking systems and object interactions that allow the HoloLens 2 to simulate the experience of touching and holding objects. In this case, the object is a virtual cube handled by a pair of virtual hands overlapping the user's real hands.
Beyond the realism provided by the occlusion mesh that renders the interaction, the demo also introduces the virtual physics that make interacting with a virtual object feel real.
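The demo doesn't expose the underlying code, but the core of a grab interaction like the one shown can be sketched as a pinch test (thumb and index fingertips close together) combined with a bounds check against the object. Everything below, including the 3 cm pinch threshold and the function names, is a hypothetical illustration, not Microsoft's implementation:

```python
import math

PINCH_THRESHOLD_M = 0.03  # hypothetical: fingertips within 3 cm count as a pinch

def distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def is_grabbing(thumb_tip, index_tip, cube_center, cube_half_extent):
    """A grab registers when the pinch midpoint falls inside the cube's bounds."""
    pinch = distance(thumb_tip, index_tip) < PINCH_THRESHOLD_M
    midpoint = tuple((t + i) / 2 for t, i in zip(thumb_tip, index_tip))
    inside = all(abs(m - c) <= cube_half_extent for m, c in zip(midpoint, cube_center))
    return pinch and inside

# Fingertips pinched together at the cube's center registers as a grab
print(is_grabbing((0.0, 0.0, 0.5), (0.01, 0.0, 0.5), (0.0, 0.0, 0.5), 0.05))  # True
```

Once a grab is registered, a physics engine would typically parent the object to the hand until the pinch releases, then hand it back to gravity and collision simulation.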
“HoloLens 2 does not simulate the sense of touch, an important part of hand interactions with objects,” said Oscar Salandin, a designer on Microsoft's HoloLens team in London, in a blog post describing his virtual design methodology. “We cannot make virtual objects physically affect your hand through touch, but we can use light to show the relationship between the object and your hand.”
The absence of any haptic feedback in augmented reality, as well as in virtual reality, can often be a stumbling block to conveying a sense of realism (assuming you're not wearing a haptic vest or bodysuit). In VR, this haptic gap is often closed by providing haptics through the game controllers. But in gesture-based AR, the challenge is greater.
“When you hold a bright object, light shines on and through your hand, giving you more feedback on how your hand and the object interact,” says Salandin. “Adding this subtle effect to the virtual hand has a surprisingly strong effect on the realism of the interaction and provides information about depth, proximity and direction.”
Indeed, this effect of virtual realism is most often seen in VR, where the full immersion of the experience (for example, being close to a virtual flame or the avatar of another person invading your personal space) can help bridge the gap between the real and the virtual. But in AR, where the virtual is integrated with the real, it's a bit harder to trick the brain.
“This light effect blurs the line between digital and physical, as the hand you are looking at now is a composite of the illumination of both the user’s real environment and virtual objects,” says Salandin. “Some users reported that holding a particularly bright red glowing hologram and seeing the effect on their skin made their hand feel warm, even though they knew it couldn’t really heat up.”
Another clever concept introduced in the demo video is the idea of a “telekinesis gesture,” which essentially gives your virtual hands the power to realistically move and control virtual objects at a distance. “While telekinesis is not part of real physical interactions, it is a gesture many people have seen or imitated through popular media such as Star Wars,” said Salandin. “Here we use a combination of eye gaze and hand tracking to let the user confidently move an object without touching it.”
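A “gaze to select, hand to move” pipeline like the one Salandin describes can be sketched as a ray-sphere test for the eye gaze plus an amplified hand-motion offset applied to the selected object. The gain factor and all function names here are illustrative assumptions, not the actual HoloLens 2 API:

```python
import math

def gaze_selects(gaze_origin, gaze_dir, target_center, target_radius):
    """Ray-sphere test: does the gaze ray pass within target_radius of the target?"""
    to_target = [t - o for t, o in zip(target_center, gaze_origin)]
    norm = math.sqrt(sum(d * d for d in gaze_dir))
    unit = [d / norm for d in gaze_dir]
    # distance along the ray to the point nearest the target
    t = sum(a * b for a, b in zip(to_target, unit))
    if t < 0:
        return False  # target is behind the user
    closest = [o + t * u for o, u in zip(gaze_origin, unit)]
    dist = math.sqrt(sum((c - p) ** 2 for c, p in zip(closest, target_center)))
    return dist <= target_radius

def telekinesis_move(object_pos, hand_delta, gain=2.0):
    """Once selected, the object follows the hand's motion, amplified by gain
    so small hand movements can translate distant objects convincingly."""
    return tuple(p + gain * d for p, d in zip(object_pos, hand_delta))
```

Combining the two modalities is what makes the gesture feel confident: the gaze ray disambiguates which object the user means, so the hand motion can be interpreted purely as movement rather than as selection.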
These references to telekinesis and Star Wars to explain full AR immersion illustrate that while many AR hardware and software vendors are focused on the enterprise, science fiction and gaming remain the best vectors for translating these advanced interactions to the mainstream, even as the technology remains limited to high-end devices such as the HoloLens 2 and the Magic Leap 1 rather than optimized for wide mainstream use.