Planned Features Completion Status

Implemented as of v2.1.x

  • Full SenseGlove low-level core API access through Unreal C++.
  • Full SenseGlove low-level core API access through Blueprint.
  • DK 1 Support.
  • Nova 1 Support.
  • Nova 2 Support.
  • Support for Microsoft Windows as a development platform.
  • Support for GNU/Linux as a development platform.
  • Support for Microsoft Windows as a deployment platform.
  • Support for GNU/Linux x64 as a deployment platform.
  • Support for GNU/Linux AArch64 as a deployment platform.
  • Support for Android as a deployment platform.
  • Support for Oculus Quest 2 and Oculus Quest Pro.
  • Support for HTC VIVE Pro and HTC VIVE Focus 3.
  • Support for HTC VIVE Trackers and HTC VIVE Wrist Trackers.
  • On-device calibration for Android without the need for SenseCom.
  • Haptic feedback including force feedback, buzz, and thumper commands.
  • A customizable Grab component that can be added to any actor.
  • A customizable Touch component that can be added to any actor.
  • Ability to grab, release, and throw objects around.
  • Separation of the real and virtual hand rendering.
  • An out-of-the-box, customizable SGPawn that can be extended in C++ and Blueprint.
  • Easy wrist/hand tracking debugging using the SenseGlove Debug module.
  • A generic Settings module with the ability to override settings.
  • C++/Blueprint interaction events such as OnGrabStateUpdated, OnTouchStateUpdated, OnActorGrabbed, OnActorReleased, OnActorBeginTouch, and OnActorEndTouch (see the sketch after this list).
  • A fallback HMD and wrist tracker hardware auto-detection mechanism, used when automatic detection of the wrist tracker hardware is desired.
  • OpenXR-compatible hand tracking (XR_EXT_hand_tracking) support.
  • FXRMotionControllerData compatible hand animation system.
  • FXRMotionControllerData compatible wrist tracking system.
  • FXRMotionControllerData compatible hand interaction manipulation system.
  • Ability to fall back to hand tracking when a glove is not present and use the bare hands for interactions, or to a combination of glove and hand tracking if no motion controller input is detected.
  • One-click setup of the SenseGlove grab/touch sockets on any Epic-compliant virtual hand mesh from within the Unreal Editor's Content Browser, Skeleton Editor, or Skeletal Mesh Editor.
  • A flexible virtual hand animation system that can take the mesh bones' transforms into account for more reliable hand animation.
  • Ability to manage the Engine Scalability Settings through the SenseGlove plugin in order to change the graphics settings on the fly.
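
As referenced above, the snippet below is a minimal sketch of how a project might attach the Grab component to an actor in C++ and bind to the interaction events. Only the event names (OnActorGrabbed, OnActorReleased, and so on) come from the feature list; the component class name USenseGloveGrabComponent, the include path, and the delegate signatures are assumptions made purely for illustration, so consult the plugin headers for the actual API.

    // GrabbableCrate.h -- illustrative sketch only; USenseGloveGrabComponent and
    // the delegate signatures below are assumptions, not the plugin's actual API.
    #pragma once

    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "GrabbableCrate.generated.h"

    class USenseGloveGrabComponent; // hypothetical class name

    UCLASS()
    class AGrabbableCrate : public AActor
    {
        GENERATED_BODY()

    public:
        AGrabbableCrate();

    protected:
        virtual void BeginPlay() override;

        // Handlers bound to the grab events; the parameter list is an assumed
        // delegate signature and may differ from the plugin's real one.
        UFUNCTION()
        void HandleActorGrabbed(AActor* GrabbedActor);

        UFUNCTION()
        void HandleActorReleased(AActor* ReleasedActor);

        UPROPERTY(VisibleAnywhere, Category = "SenseGlove")
        TObjectPtr<USenseGloveGrabComponent> GrabComponent;
    };

    // GrabbableCrate.cpp
    #include "GrabbableCrate.h"
    #include "SenseGloveGrabComponent.h" // hypothetical include path

    AGrabbableCrate::AGrabbableCrate()
    {
        // The Grab component can be added to any actor; here it is created as a
        // default subobject so it also shows up on Blueprint subclasses.
        GrabComponent = CreateDefaultSubobject<USenseGloveGrabComponent>(TEXT("GrabComponent"));
    }

    void AGrabbableCrate::BeginPlay()
    {
        Super::BeginPlay();

        // Bind to the interaction events; AddDynamic requires UFUNCTION handlers
        // whose signature matches the delegate.
        GrabComponent->OnActorGrabbed.AddDynamic(this, &AGrabbableCrate::HandleActorGrabbed);
        GrabComponent->OnActorReleased.AddDynamic(this, &AGrabbableCrate::HandleActorReleased);
    }

    void AGrabbableCrate::HandleActorGrabbed(AActor* GrabbedActor)
    {
        UE_LOG(LogTemp, Log, TEXT("%s was grabbed"), *GetNameSafe(GrabbedActor));
    }

    void AGrabbableCrate::HandleActorReleased(AActor* ReleasedActor)
    {
        UE_LOG(LogTemp, Log, TEXT("%s was released"), *GetNameSafe(ReleasedActor));
    }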

Upcoming features planned for the v2.2.x release

  • Migrating away from the deprecated FXRMotionControllerData in favor of FXRMotionControllerState and FXRHandTrackingState (see the sketch below).
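
For context, the sketch below shows the kind of call site this migration touches: current code paths read per-hand pose data through UHeadMountedDisplayFunctionLibrary::GetMotionControllerData and the FXRMotionControllerData struct, which newer engine releases deprecate in favor of the split FXRMotionControllerState and FXRHandTrackingState types. This is a hedged illustration of the old pattern only; the exact field names may vary between engine versions, and it is not a description of how the plugin itself consumes the data.

    // Sketch of the deprecated FXRMotionControllerData usage that the v2.2.x
    // work would move away from. Requires the HeadMountedDisplay module; field
    // availability can differ between engine versions.
    #include "CoreMinimal.h"
    #include "HeadMountedDisplayFunctionLibrary.h"

    static void LogRightHandGripPose(UObject* WorldContextObject)
    {
        FXRMotionControllerData MotionControllerData;
        UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(
            WorldContextObject, EControllerHand::Right, MotionControllerData);

        if (MotionControllerData.bValid)
        {
            UE_LOG(LogTemp, Log, TEXT("Right-hand grip position: %s"),
                   *MotionControllerData.GripPosition.ToString());
        }
    }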

Long-term planned features

  • Get tracking input from sources other than a SenseGlove device.
  • Be able to assign behaviors to different objects (meshes) in the scene (e.g. Slider, Hinge, basic Grabbables, etc.).
  • Let developers define their own behavior(s) for an object, or extend existing ones, through C++/Blueprint (e.g. a car door that behaves like a slider but follows a path rather than a straight line).
  • Make the hand(s) able to push physics-driven objects around (as much as their behaviors allow) (in backlog).
  • Be able to grab objects with up to 2 hands (and move them around with both hands at the same time in a way that seems realistic).
  • Ensure that our virtual hands (and the objects they hold) do not phase through other physics objects (e.g. walls and tables).
  • Allow other scripts to force a grab and/or release to occur (for example, when an object is placed at its designated location, it is removed from your hand and snaps into place).
  • Have some form of weight simulation by making certain objects harder to push, lowering manipulation speed, or making objects only moveable with two hands.
  • (Optional) Make it so the fingers of your virtual hands do not clip inside the meshes you are holding (some people see this as an indicator of how quickly the force feedback activates, but it is purely a rendering concern).