Third-Party Tutorials: Consuming OpenXR Hand-Tracking Data
Introduction to Virtual Reality, OpenXR Hand-Tracking, and Gesture Detection in Unreal Engine
In this tutorial, you'll learn how to get OpenXR hand-tracking data from the Unreal Engine API and how to consume it to draw virtual hand models using cubes as hand joints. It will also dive into gesture recognition by implementing a simple pinch gesture recognizer.
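As a rough orientation, here is a minimal C++ sketch of that idea, assuming a hypothetical `AHandTrackingActor`: it polls the right hand's joints each tick through `UHeadMountedDisplayFunctionLibrary::GetMotionControllerData` (the UE 4.26 to 5.4 API the tutorial targets), draws a debug cube per joint, and flags a pinch when the thumb and index tips come close together. The 2 cm threshold is illustrative only; the tutorial's own thresholds and setup may differ.

```cpp
#include "HeadMountedDisplayFunctionLibrary.h"
#include "HeadMountedDisplayTypes.h"
#include "DrawDebugHelpers.h"

void AHandTrackingActor::Tick(float DeltaSeconds)
{
	Super::Tick(DeltaSeconds);

	FXRMotionControllerData Data;
	UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(
		GetWorld(), EControllerHand::Right, Data);

	// Joint arrays are only filled while the runtime is actually
	// tracking a hand rather than a held controller.
	if (!Data.bValid || Data.DeviceVisualType != EXRVisualType::Hand)
	{
		return;
	}

	// One entry per EHandKeypoint; the OpenXR hand model has 26 joints.
	for (int32 Joint = 0; Joint < Data.HandKeyPositions.Num(); ++Joint)
	{
		DrawDebugBox(GetWorld(), Data.HandKeyPositions[Joint],
		             FVector(Data.HandKeyRadii[Joint]),
		             Data.HandKeyRotations[Joint], FColor::Cyan);
	}

	// Guard the direct tip lookups below against short arrays.
	if (Data.HandKeyPositions.Num() <= static_cast<int32>(EHandKeypoint::IndexTip))
	{
		return;
	}

	// Simple pinch: thumb tip close to index tip (world units are cm).
	const FVector ThumbTip =
		Data.HandKeyPositions[static_cast<int32>(EHandKeypoint::ThumbTip)];
	const FVector IndexTip =
		Data.HandKeyPositions[static_cast<int32>(EHandKeypoint::IndexTip)];

	if (FVector::Dist(ThumbTip, IndexTip) < 2.0f) // illustrative threshold
	{
		UE_LOG(LogTemp, Log, TEXT("Pinch detected on right hand"));
	}
}
```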
The first part focuses on the UE 4.26 to 5.4 API, and in the second part, you'll learn how to update the project to work with 5.5.
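For context on why an update is needed: in UE 5.5, `FXRMotionControllerData` is deprecated, and hand joints move to `UHeadMountedDisplayFunctionLibrary::GetHandTrackingState` and `FXRHandTrackingState`. The sketch below is an assumption-laden outline of that change, not the tutorial's code; verify the exact signature and field names against the 5.5 `HeadMountedDisplayFunctionLibrary.h` before relying on them.

```cpp
// Assumed 5.5 replacement API: out-parameter style and field names
// mirror the deprecated FXRMotionControllerData, but positions are
// now called HandKeyLocations. Check the 5.5 headers to confirm.
FXRHandTrackingState HandState;
UHeadMountedDisplayFunctionLibrary::GetHandTrackingState(
	GetWorld(), EControllerHand::Right, HandState);

if (HandState.bValid)
{
	// Same 26-joint OpenXR hand layout as in 4.26-5.4.
	for (int32 Joint = 0; Joint < HandState.HandKeyLocations.Num(); ++Joint)
	{
		DrawDebugBox(GetWorld(), HandState.HandKeyLocations[Joint],
		             FVector(HandState.HandKeyRadii[Joint]),
		             HandState.HandKeyRotations[Joint], FColor::Cyan);
	}
}
```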
Procedural Virtual Hand Mesh Animation Using OpenXR Hand-Tracking Data
Building on the Introduction to Virtual Reality, OpenXR Hand-Tracking, and Gesture Detection in Unreal Engine tutorial, this slightly more advanced tutorial dives deeper into the following topics:
- Transitioning seamlessly between motion controller and hand-tracking modes in Unreal Engine (sketched in code after this list).
- Adding custom debugging gizmos to improve development and testing workflows.
- Visualizing debug virtual hands by incorporating the custom gizmos.
- Animating virtual hand meshes with OpenXR hand-tracking data, moving beyond basic joint representation with cubes.
- Reusing and adapting the gesture recognition code from the introductory tutorial to integrate with the new animated virtual hand meshes.

This guide will help you take your VR projects to the next level with polished and practical implementations.
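To give a feel for the first and fourth topics, here is an illustrative sketch, not the tutorial's implementation: it switches visuals when the runtime flips between controller and hand tracking, then drives a hand skeleton from the tracked joint rotations. `HandMesh` (a `UPoseableMeshComponent`), `ControllerMesh`, and the `BoneNameForKeypoint` helper that maps `EHandKeypoint` values to your skeleton's bone names are all hypothetical members of the same hypothetical actor used above.

```cpp
#include "HeadMountedDisplayFunctionLibrary.h"
#include "HeadMountedDisplayTypes.h"
#include "Components/PoseableMeshComponent.h"

void AHandTrackingActor::UpdateRightHand()
{
	FXRMotionControllerData Data;
	UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(
		GetWorld(), EControllerHand::Right, Data);

	const bool bHandTracked =
		Data.bValid && Data.DeviceVisualType == EXRVisualType::Hand;

	// Show the animated hand mesh in hand-tracking mode and the
	// controller model otherwise.
	HandMesh->SetVisibility(bHandTracked);
	ControllerMesh->SetVisibility(!bHandTracked);

	if (!bHandTracked)
	{
		return;
	}

	// Apply each tracked joint's world-space rotation to the matching
	// bone of the poseable hand mesh.
	for (int32 Joint = 0; Joint < Data.HandKeyRotations.Num(); ++Joint)
	{
		const FName Bone =
			BoneNameForKeypoint(static_cast<EHandKeypoint>(Joint));
		HandMesh->SetBoneRotationByName(
			Bone, Data.HandKeyRotations[Joint].Rotator(),
			EBoneSpaces::WorldSpace);
	}
}
```

A `UPoseableMeshComponent` is the simplest way to set bone transforms directly from game code; the tutorial may instead drive an Animation Blueprint, which composes better with the rest of Unreal's animation system.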
Part 1:
Part 2: