Third-Party OpenXR Integrations
The SenseGlove Unreal Engine Plugin registers itself as an OpenXRHandTracking
provider, making it a fully compatible, drop-in replacement for Epic’s own OpenXRHandTracking plugin in Unreal Engine. This allows it to integrate seamlessly with any third-party system or plugin that can consume OpenXR hand-tracking data.
One notable example is the open-source, MIT-licensed VR Expansion Plugin (VRE).
important
As explained in the Third-Party Tutorials: Consuming OpenXR Hand-Tracking Data section, it’s entirely possible to build your own custom hand interaction system without relying on SGPawn or any third-party OpenXR-compatible interaction plugin at all.
If your project requires finer-grained control than what these solutions offer, the tutorials in that section will guide you through understanding the OpenXR hand-tracking data format in Unreal Engine and help you implement a fully tailored interaction system from the ground up in a few hours.
As the SenseGlove Unreal Engine Plugin is fully OpenXR-compliant, it provides OpenXR hand-tracking data in the expected format and takes over as the active hand-tracking provider within Unreal. If your existing interaction system (e.g. the VRE plugin) already consumes OpenXR hand tracking, SenseGlove simply acts as the tracking source in place of the headset's built-in hand tracking.
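As a rough illustration, any OpenXR-aware system can read this data through Epic's standard XR function library. The helper below is only a sketch (the `LogLeftHandKeypoints` function is invented for this example and is not part of the SenseGlove plugin or the tutorials), showing what consuming the hand keypoints looks like in UE 5.x C++:

```cpp
// Minimal sketch: reading OpenXR hand-tracking data through Epic's standard
// XR API. With SenseGlove active as the hand-tracking provider, these
// keypoints are driven by the glove instead of the headset's built-in
// camera-based hand tracking.
#include "CoreMinimal.h"
#include "HeadMountedDisplayFunctionLibrary.h"
#include "HeadMountedDisplayTypes.h"

void LogLeftHandKeypoints(UObject* WorldContextObject)
{
    FXRMotionControllerData MotionControllerData;
    UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(
        WorldContextObject, EControllerHand::Left, MotionControllerData);

    if (MotionControllerData.bValid && MotionControllerData.DeviceVisualType == EXRVisualType::Hand)
    {
        // One entry per EHandKeypoint (palm, wrist, and finger joints).
        for (const FVector& KeypointPosition : MotionControllerData.HandKeyPositions)
        {
            UE_LOG(LogTemp, Log, TEXT("Hand keypoint: %s"), *KeypointPosition.ToString());
        }
    }
}
```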
note
Since most hand-tracking systems do not provide haptic feedback, integrating SenseGlove's haptic feedback requires a small amount of additional effort.
The SenseGlove API is fully exposed to Unreal Engine via C++ and Blueprint, so triggering haptic feedback is as simple as calling a function. For more information, refer to the Blueprint Changes section below.
important
If you're using a third-party OpenXR hand interaction system, configuring the Wrist-Tracker Hardware Settings will likely have no effect, and your hand offsets may appear at an incorrect location in the scene.
This is because those settings are only recognized by SenseGlove’s native actors and components, such as SGPawn, SGWristTrackerComponent, etc.
Most third-party plugins are unaware of these settings. As a result, you'll need to figure out how to manually apply the appropriate offsets within your chosen OpenXR hand interaction system.
For example, the VRE plugin provides similar configurations in its settings section. For more information, refer to the Changing Wrist-Tracker Offsets section below.
Comparison of Supported OpenXR Hand-Interaction Systems
| | Built-in? | Works out of the box? | Beginner-friendly? | Learning Curve | Featureful | Customizable? | Supports Custom Gestures? |
|---|---|---|---|---|---|---|---|
| SGPawn | ✅ Yes | ✅ Yes | ✅ Most beginner-friendly | ✅ Very easy | ⚠️ Very basic | ⚠️ Very limited | ❌ Not yet; may be added in the future |
| SenseGlove OpenXR | ✅ Yes | ❌ Requires Blueprint or C++ coding | ✅ Requires a few hours of watching tutorials | ✅ Moderate | ❌ You need to develop your own features and functionality from scratch | ✅ Your imagination, creativity, and skill level are your limits | ✅ You have to implement your own recognition logic; an example pinch-detection algorithm is taught in the tutorial series |
| VR Expansion Plugin | ❌ No | ⚠️ Partially – requires setup | ❌ Best suited for intermediate or advanced users | ⚠️ Steep | ✅ Diverse features and functionality | ✅ Highly customizable | ✅ Via custom logic |
| Other OpenXR-compatible Plugins | ❌ No | ❓ Check their documentation | ❓ Check their documentation | ❓ Check their documentation | ❓ Check their documentation | ❓ Check their documentation | ❓ Check their documentation |
VR Expansion Plugin
The VR Expansion Plugin (VRE) is a robust, community-driven plugin for Unreal Engine that focuses on advanced VR interaction and gameplay mechanics. It is open-source (MIT licensed), actively maintained, and has received support from Epic via the MegaGrants program.
Designed to extend Unreal’s capabilities for virtual reality, VRE offers a modular set of tools covering:
- Multiplayer and networking
- Locomotion systems
- Object gripping and interaction
- Custom movement and physics handling
The plugin is particularly useful for teams building sophisticated VR experiences. While it's beginner-friendly to an extent, its depth and flexibility are best suited for intermediate to advanced Unreal Engine developers. Whether you're prototyping with built-in features or extracting specific systems for your own framework, VRE offers a rich foundation for VR development.
note
For support and assistance with the VRE plugin, you can join its active and welcoming Discord community, known for being responsive and supportive.
SGVRETemplate Demo Scene
To showcase how SenseGlove can be integrated with OpenXR-compatible third-party interaction systems, SenseGlove provides a ready-to-use VR Expansion Plugin Integration Demo for Unreal Engine 5.4.
This repository includes UE 5.4-compatible versions of both the SenseGlove and VR Expansion plugins, with all necessary setup and configuration already in place. Simply download the project and it should run out of the box, allowing you to explore the integration without additional setup.
note
SenseGlove provides this demo to demonstrate the potential for integrating with third-party OpenXR-based hand interaction systems. Please note that the VR Expansion Plugin is a third-party solution, and as such, we do not offer official support for it.
For help with the VRE plugin, refer to its documentation at vreue4.com and consider joining the official VRE Discord community, which is active, supportive, and very responsive.
SGVRETemplate Modifications
The SGVRETemplate is built on top of the VR Expansion Plugin Example Template. However, since the original template is not directly compatible with SenseGlove, several adjustments were necessary.
In addition, a few known issues with OpenXR support in the VR Expansion Plugin for Unreal Engine 5.4 required us to modify the plugin itself to ensure smooth integration.
Below is an overview of the key modifications made to both the project template and this version of the VRE plugin.
Blueprint Changes
- `Content/VRE/Core/Character/BP_VRCharacter`: Four functions were added: `SendVibration`, `SendFFB`, `SendSqueeze`, and `ResetHaptics`. These functions retrieve the glove instance and send the appropriate haptic command to it. In the `OnPossessed` event, `Load Controller by Name` was added along with a string uproperty `Tracking Offset`, which is used to load the correct tracking offsets based on the selected profile.
note
If you'd like to implement your own haptic functions, the most convenient approach is to safely acquire a glove instance. Once you have the glove instance, applying haptic feedback is as simple as calling the appropriate function.
SenseGlove supports three types of haptics: Vibrations, Force-feedback, and Wrist-squeeze.
- Using `Send Custom Waveform`, you can send vibrations to the glove instance.
- Using `Queue Command Force Feedback Levels`, you can send force-feedback.
- Using `Queue Command Wrist Squeeze`, you can send a wrist-squeeze command to the glove.
Each of these functions can be called directly on the glove instance to trigger the desired haptic feedback.
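As a rough illustration of that pattern, here is a hypothetical C++ sketch. `IExampleGlove` and its methods are invented for this example only and are not the SenseGlove plugin's real C++ API; the plugin exposes the equivalent functionality through the Blueprint nodes listed above and their C++ counterparts, so consult the plugin headers for the actual types and signatures.

```cpp
// Hypothetical sketch only: IExampleGlove and its methods are placeholders
// mirroring the Blueprint nodes above, NOT the SenseGlove plugin's real C++ API.
#include "CoreMinimal.h"

class IExampleGlove
{
public:
    virtual ~IExampleGlove() = default;
    virtual void SendCustomWaveform(float Amplitude, float DurationSeconds) = 0;            // vibration
    virtual void QueueCommandForceFeedbackLevels(const TArray<float>& PerFingerLevels) = 0; // force-feedback
    virtual void QueueCommandWristSqueeze(float Level) = 0;                                 // wrist squeeze
};

// Each BP_VRCharacter helper follows the same pattern: safely acquire the glove
// instance for the hand, then forward exactly one haptic command to it.
void SendVibrationExample(IExampleGlove* GloveInstance)
{
    if (GloveInstance == nullptr)
    {
        return; // No glove available for this hand; do nothing.
    }

    GloveInstance->SendCustomWaveform(/*Amplitude=*/1.0f, /*DurationSeconds=*/0.2f);
}
```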
- `Content/VRE/Core/GraspingHands/GraspingHandManny`: In the `SetupFingerAnimations` function, the hardcoded check for `HandType == Left` was replaced with a string comparison: the enum is converted to a string and checked for whether it contains `"Left"`. This allows compatibility with alternative tracking sources such as `"Left Foot"`. A C++ sketch of this check is shown after this list.
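For reference, the string comparison looks like this in C++. The `IsLeftHandedSource` helper below is purely illustrative (in the template this logic is implemented as Blueprint nodes inside `SetupFingerAnimations`), and it assumes the hand type is an `EControllerHand`-style enum; it mirrors the check used later in `GetCurrentProfileTransform`.

```cpp
// Illustrative helper only; in the template this logic lives in the
// GraspingHandManny Blueprint, not in C++.
#include "InputCoreTypes.h" // EControllerHand
#include "UObject/Class.h"  // UEnum::GetDisplayValueAsText

bool IsLeftHandedSource(EControllerHand HandType)
{
    // Matches "Left", but also alternative tracking sources such as "Left Foot".
    return UEnum::GetDisplayValueAsText(HandType).ToString().Contains(TEXT("Left"));
}
```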
C++ Changes
- `Plugins/VRExpansionPlugin/Source/VRExpansionPlugin/Public/Grippables/HandSocketComponent.h`: The following line was added as a public `UPROPERTY` in the header file:
```cpp
UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Hand Animation")
float HandAnimationProgress = 0.0f;
```
- `Plugins/VRExpansionPlugin/Source/VRExpansionPlugin/Private/Grippables/HandSocketComponent.cpp`: In the function `bool UHandSocketComponent::GetBlendedPoseSnapShot(FPoseSnapshot& PoseSnapShot, USkeletalMeshComponent* TargetMesh, bool bSkipRootBone, bool bFlipHand)`, the `TrackLocation` calculation was modified from:
```cpp
if (TrackIndex != INDEX_NONE && (!bSkipRootBone || TrackIndex != 0))
{
    double TrackLocation = 0.0f;
    HandTargetAnimation->GetBoneTransform(LocalTransform, FSkeletonPoseBoneIndex(TrackMap[TrackIndex].BoneTreeIndex), TrackLocation, false);
}
else
{
```
To:
```cpp
if (TrackIndex != INDEX_NONE && (!bSkipRootBone || TrackIndex != 0))
{
    double TrackLocation = HandTargetAnimation->GetPlayLength() * HandAnimationProgress;
    HandTargetAnimation->GetBoneTransform(LocalTransform, FSkeletonPoseBoneIndex(TrackMap[TrackIndex].BoneTreeIndex), TrackLocation, false);
}
else
{
```
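With this change, the pose snapshot samples the hand animation at `GetPlayLength() * HandAnimationProgress` instead of always at time zero, so driving the new property from 0.0 to 1.0 scrubs the blended pose through the animation. A minimal sketch of driving it follows; the `ScrubHandPose` helper is illustrative only, assuming the public property is set from C++ (it can equally be set from Blueprint since it is `BlueprintReadWrite`):

```cpp
// Illustrative sketch: HandAnimationProgress is the public UPROPERTY added above.
#include "Grippables/HandSocketComponent.h"

void ScrubHandPose(UHandSocketComponent* HandSocket, float NormalizedProgress)
{
    if (HandSocket)
    {
        // 0.0 samples the first frame of the target hand animation, 1.0 the last.
        HandSocket->HandAnimationProgress = FMath::Clamp(NormalizedProgress, 0.0f, 1.0f);
    }
}
```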
- `Plugins/VRExpansionPlugin/Source/VRExpansionPlugin/Private/GripMotionControllerComponent.cpp`: In the function `void UGripMotionControllerComponent::GetCurrentProfileTransform(bool bBindToNoticationDelegate)`, the following logic was updated from:
```cpp
if (HandType == EControllerHand::Left || HandType == EControllerHand::AnyHand || !VRSettings->bUseSeperateHandTransforms)
{
    NewControllerProfileTransform = VRSettings->CurrentControllerProfileTransform;
}
else if (HandType == EControllerHand::Right)
{
    NewControllerProfileTransform = VRSettings->CurrentControllerProfileTransformRight;
}
```
To:
```cpp
if (UEnum::GetDisplayValueAsText(HandType).ToString().Contains("Left") || HandType == EControllerHand::AnyHand || !VRSettings->bUseSeperateHandTransforms)
{
    NewControllerProfileTransform = VRSettings->CurrentControllerProfileTransform;
}
else if (UEnum::GetDisplayValueAsText(HandType).ToString().Contains("Right"))
{
    NewControllerProfileTransform = VRSettings->CurrentControllerProfileTransformRight;
}
```
The `GetHandType` function in the same file was also updated, from:
```cpp
void UGripMotionControllerComponent::GetHandType(EControllerHand& Hand)
{
    if (!IMotionController::GetHandEnumForSourceName(MotionSource, Hand))
    {
        // Check if the palm motion source extension is being used
        // I assume eventually epic will handle this case
        if (MotionSource.Compare(FName(TEXT("RightPalm"))) == 0 || MotionSource.Compare(FName(TEXT("RightWrist"))) == 0)
        {
            Hand = EControllerHand::Right;
        }
        // Could skip this and default to left now but would rather check
        else if (MotionSource.Compare(FName(TEXT("LeftPalm"))) == 0 || MotionSource.Compare(FName(TEXT("LeftWrist"))) == 0)
        {
            Hand = EControllerHand::Left;
        }
        else
        {
            Hand = EControllerHand::Left;
        }
    }
}
```
To:
```cpp
void UGripMotionControllerComponent::GetHandType(EControllerHand& Hand)
{
    if (!IMotionController::GetHandEnumForSourceName(MotionSource, Hand))
    {
        // Check if the palm motion source extension is being used
        // I assume eventually epic will handle this case
        if (MotionSource.Compare(FName(TEXT("RightPalm"))) == 0 || MotionSource.Compare(FName(TEXT("RightWrist"))) == 0 || MotionSource.ToString().Contains("Right"))
        {
            Hand = EControllerHand::Right;
        }
        // Could skip this and default to left now but would rather check
        else if (MotionSource.Compare(FName(TEXT("LeftPalm"))) == 0 || MotionSource.Compare(FName(TEXT("LeftWrist"))) == 0 || MotionSource.ToString().Contains("Left"))
        {
            Hand = EControllerHand::Left;
        }
        else
        {
            Hand = EControllerHand::Left;
        }
    }
}
```
Changing Wrist-Tracker Offsets
If you are using wrist-tracking hardware supported by the SenseGlove plugin, you can change the offsets inside `BP_VRCharacter` by typing or copying any of the following values into the `Tracking Offset` uproperty, depending on your hardware (a hypothetical C++ sketch of the underlying profile loading follows this list):

- `SenseGlove_Quest3`: The wrist-tracking controller profile for the Meta Quest 3.
- `SenseGlove_ViveWristTrackers`: The wrist-tracking controller profile for HTC VIVE wrist trackers.
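Under the hood, the `Tracking Offset` value is passed to VRE's `Load Controller by Name` node in the `OnPossessed` event (see Blueprint Changes above). The sketch below is a hypothetical C++ equivalent: it assumes the node maps to a static function on `UVRGlobalSettings`, so verify the exact name and signature in the VRE source before using it.

```cpp
// Hypothetical sketch: assumes VRE exposes "Load Controller by Name" as a static
// function on UVRGlobalSettings; check VRGlobalSettings.h for the real signature.
#include "VRGlobalSettings.h"

void ApplyTrackingOffsetProfile(const FName TrackingOffset) // e.g. "SenseGlove_Quest3"
{
    // Loads the named controller profile so its wrist-tracker offsets are applied.
    UVRGlobalSettings::LoadControllerProfileByName(TrackingOffset, /*bSetAsCurrentProfile=*/true);
}
```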
Changing Motion Source
In `BP_VRCharacter`, you can change the wrist-tracking motion source for each hand. Depending on which tracker you are using, this change may be required.
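For example, each hand's `GripMotionControllerComponent` can be pointed at a wrist motion source. The `ConfigureWristMotionSources` helper below is only illustrative; the `LeftWrist`/`RightWrist` names match the sources handled in the modified `GetHandType()` shown earlier, so adjust them to whatever your tracker exposes.

```cpp
// Illustrative helper: SetTrackingMotionSource is inherited from Epic's
// UMotionControllerComponent; adjust the source names to your tracker.
#include "GripMotionControllerComponent.h"

void ConfigureWristMotionSources(UGripMotionControllerComponent* LeftController,
                                 UGripMotionControllerComponent* RightController)
{
    if (LeftController)
    {
        LeftController->SetTrackingMotionSource(FName(TEXT("LeftWrist")));
    }
    if (RightController)
    {
        RightController->SetTrackingMotionSource(FName(TEXT("RightWrist")));
    }
}
```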
Adding More Gestures
In the `GraspingHandManny` Blueprint, we’ve created a simple function called `SaveHandPose`. If you press the `Space Bar` while the game is running, it will save the current pose of the corresponding hand. The pose is stored in a gestures database located under `Content/SenseGlove` with the default name `NewHandPose`. You should rename the pose to something meaningful when you intend to use it.
It’s helpful to add an Event Dispatcher to the `GraspingHandManny` Blueprint, which is triggered in the Event Graph by the `On New Gesture Detected` event from the `OpenXRHandPose` component. This system is index-based rather than name-based, so keep that in mind when adding more dispatchers. By default, we’ve included examples for `Teleport`, `Grab`, `Release`, and `Use`.
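Because the dispatchers are index-based, a handler ultimately maps the index of the detected pose in the gestures database to an action. The sketch below is purely illustrative (the `HandleGestureDetected` function and the index-to-action mapping are placeholders); the indices depend on the order of poses in your own database, and the actual wiring in the demo is done with Event Dispatchers in the Blueprint.

```cpp
// Illustrative only: gesture indices follow the order of poses in your
// gestures database, so remap these cases to match your own setup.
#include "CoreMinimal.h"

void HandleGestureDetected(int32 GestureIndex)
{
    switch (GestureIndex)
    {
    case 0: /* e.g. trigger Teleport */ break;
    case 1: /* e.g. trigger Grab */     break;
    case 2: /* e.g. trigger Release */  break;
    case 3: /* e.g. trigger Use */      break;
    default: /* unrecognized gesture */ break;
    }
}
```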
Video Summary
This short video provides an overview of some of the key changes and modifications behind the SGVRETemplate demo scene, mentioned above.
SGVRETemplate Demo Calibration Scene
The SGVRETemplate includes a basic Calibration Scene located at `Content/SenseGlove/Maps/Calibration`. Inside this level, you’ll find a copy of the `Content/SenseGlove/Blueprints/Calibration/BP_Calibrator` Blueprint, which is responsible for transitioning to your desired target map after the calibration process is complete. You can configure the target map directly within this Blueprint by adjusting the `Level to Load` uproperty.