Leap Motion: When Hand Tracking Was Revolutionary
In 2016, hand tracking in VR wasn't a built-in feature—it required external hardware. Leap Motion was the solution: a small sensor mounted to the front of a VR headset that could recognize hand positions, finger movements, and gestures. For the time, it was groundbreaking technology. You could see your virtual hands, close your fist, point, and have the system recognize and respond to those actions.
Today, modern Oculus headsets detect hands natively. But back then, achieving this level of interaction required specialized hardware and careful integration. Leap Motion opened up possibilities that controllers alone couldn't match—natural, intuitive hand interactions that felt closer to how we interact with the physical world.
Gesture-Based Interactions in Unreal Engine
The project for ViaTechnik involved creating practical interactions using shape recognition and gesture binding in Unreal Engine. Users could perform actions simply by using their hands:
- Turn on ceiling lights by pointing upward with a hand gesture
- Display item information by pointing at objects in the virtual space
- Move and manipulate items using hand movements and grips
- Control devices like turning on a TV through gesture commands
Each gesture was bound to specific commands using Leap Motion's shape recognition system. The technology could distinguish between an open hand, a closed fist, a pointing finger, and more complex hand shapes. This allowed for a vocabulary of interactions that went beyond simple button presses.
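To make the pattern concrete, here is a minimal, hypothetical sketch of that classify-then-dispatch loop. None of these types come from the Leap Motion SDK or the Unreal plugin; `HandSnapshot`, `Gesture`, and `GestureBindings` are stand-ins for the per-frame hand data the tracking layer actually provided, boiled down to the logic of turning a hand pose into a bound command.

```cpp
// Hypothetical sketch of gesture classification and binding.
// These types are illustrative stand-ins, not Leap Motion or Unreal Engine classes.
#include <array>
#include <functional>
#include <iostream>
#include <map>

// Simplified per-frame hand data: which fingers are extended and
// which way the index finger points (unit vector, world space, Y up).
struct HandSnapshot {
    std::array<bool, 5> fingerExtended; // thumb, index, middle, ring, pinky
    float pointDirX = 0.f, pointDirY = 0.f, pointDirZ = 0.f;
};

enum class Gesture { None, OpenHand, Fist, PointUp, PointForward };

// Classify the snapshot into a small gesture vocabulary.
Gesture ClassifyGesture(const HandSnapshot& hand) {
    int extended = 0;
    for (bool f : hand.fingerExtended) extended += f ? 1 : 0;

    if (extended == 5) return Gesture::OpenHand;
    if (extended == 0) return Gesture::Fist;

    // Only the index finger extended: a pointing pose. Split on direction.
    if (extended == 1 && hand.fingerExtended[1]) {
        return (hand.pointDirY > 0.7f) ? Gesture::PointUp : Gesture::PointForward;
    }
    return Gesture::None;
}

// Bind gestures to arbitrary actions, e.g. "point up" -> "turn on ceiling lights".
class GestureBindings {
public:
    void Bind(Gesture g, std::function<void()> action) { bindings[g] = std::move(action); }

    void OnFrame(const HandSnapshot& hand) {
        Gesture g = ClassifyGesture(hand);
        if (g == lastGesture) return;        // only fire when the pose changes
        lastGesture = g;
        auto it = bindings.find(g);
        if (it != bindings.end()) it->second();
    }

private:
    std::map<Gesture, std::function<void()>> bindings;
    Gesture lastGesture = Gesture::None;
};

int main() {
    GestureBindings bindings;
    bindings.Bind(Gesture::PointUp,  [] { std::cout << "Ceiling lights on\n"; });
    bindings.Bind(Gesture::Fist,     [] { std::cout << "Grab item\n"; });
    bindings.Bind(Gesture::OpenHand, [] { std::cout << "Release item\n"; });

    // Simulate a frame where only the index finger is extended, pointing up.
    HandSnapshot pointUp{{false, true, false, false, false}, 0.f, 1.f, 0.f};
    bindings.OnFrame(pointUp); // prints "Ceiling lights on"
}
```

Firing the bound action only when the classified gesture changes keeps a held pose from retriggering its command on every frame. The real Leap Motion plugin exposed far richer per-finger and palm data than this, but the bind-and-dispatch idea was essentially the same.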
ViaTechnik: VR for Construction Visualization
ViaTechnik was a fantastic client to work with. They're a US-based company specializing in VR and advanced visualization for construction projects. Before construction even began, stakeholders could walk through buildings, examine spaces, identify potential issues, and make informed decisions.
At the time, ViaTechnik was heavily invested in Unreal Engine for these visualizations. The combination of high-fidelity graphics and natural hand interactions made the experience incredibly immersive. Being able to point at a wall, inspect materials, or rearrange furniture with your hands made the virtual space feel tangible.
They still exist today, though they've shifted away from Unreal Engine projects in recent years. But during that era, they were pushing the boundaries of what VR could do for the construction industry, and it was a privilege to be part of that work. If they ever decide to return to Unreal Engine, I'd jump at the opportunity to collaborate with them again.
From External Hardware to Native Hand Tracking
Looking back, it's remarkable how far hand tracking has come. What required a dedicated piece of hardware in 2016 is now seamlessly integrated into consumer VR headsets. The Quest 2, Quest 3, and other modern headsets track hands without any additional sensors or mounting.
But at the time, Leap Motion represented the cutting edge. It proved that hand tracking was viable, that natural interactions could enhance VR experiences, and that controllers weren't the only way to interact with virtual worlds. The work done with Leap Motion helped pave the way for the native hand tracking we take for granted today.
The Leap Motion project with ViaTechnik was about more than just hand tracking—it was about proving that VR could offer intuitive, natural interactions that felt closer to reality. It was a glimpse into the future of VR, and working on those construction visualization projects remains a highlight of my career.