Unity Quest Hand Tracking Not Working on Meta Quest 3 - XR Input Fix
Problem: Hand tracking does not work on Meta Quest 3 in Unity. Hands may never render, pinch/select does nothing, or input actions are stuck in "not performed."
Quick Solution: Most failures come from one of four causes: missing OpenXR hand-tracking feature group, Quest runtime permissions not granted, wrong interaction profiles, or input actions not bound to hand controllers.
This guide gives you a reliable setup and troubleshooting flow so hand tracking works in Editor testing and device builds.
Why Quest 3 Hand Tracking Fails in Unity
Quest hand tracking depends on several layers:
- XR provider and feature setup (OpenXR or Meta XR)
- Android/Quest build target and permissions
- Interaction profiles and input action bindings
- Runtime mode on headset (controller-only vs hands and controllers)
If one layer is wrong, Unity usually falls back to controllers or returns no input.
Solution 1: Verify XR Plugin and Feature Setup
- Open Edit > Project Settings > XR Plug-in Management.
- Ensure OpenXR is enabled for Android.
- Open the OpenXR section and enable the hand-tracking features for Android (e.g. the Meta Quest Support feature group and the Hand Tracking Subsystem feature; exact names vary by package version).
- Disable conflicting providers for Android so only your intended XR path is active.
Verification: In Project Settings, OpenXR is active for Android and hand-related features are enabled.
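To confirm the same thing at runtime, you can query XR Plug-in Management for the active loader. A minimal diagnostic sketch (the class name is illustrative; the API comes from the com.unity.xr.management package):

```csharp
using UnityEngine;
using UnityEngine.XR.Management;

// Logs which XR loader (if any) initialized at startup.
// Attach to any object in your first scene.
public class XrLoaderCheck : MonoBehaviour
{
    void Start()
    {
        var manager = XRGeneralSettings.Instance != null
            ? XRGeneralSettings.Instance.Manager
            : null;

        if (manager == null || manager.activeLoader == null)
        {
            Debug.LogWarning("No active XR loader - check XR Plug-in Management for Android.");
            return;
        }

        Debug.Log($"Active XR loader: {manager.activeLoader.name}");
    }
}
```

If this logs a warning in a device build, the problem is in provider setup, not in your scene or bindings.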
Solution 2: Confirm Meta Quest Runtime Hand Tracking Mode
On the headset:
- Open device settings for hand tracking.
- Enable Hands and Controllers (or equivalent hybrid mode).
- Confirm apps are allowed to use hand tracking.
- Reboot headset after changing mode to clear stale runtime state.
Verification: System-level UI can detect your hands outside Unity apps.
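You can also log whether the app itself was granted the hand-tracking permission. A sketch assuming the permission string Meta documents for Quest (verify it against the Meta docs for your SDK version):

```csharp
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

// Logs whether the Meta hand-tracking permission is granted on device.
// "com.oculus.permission.HAND_TRACKING" is the Meta-documented string;
// confirm it for your SDK version.
public class HandPermissionCheck : MonoBehaviour
{
    const string HandTrackingPermission = "com.oculus.permission.HAND_TRACKING";

    void Start()
    {
#if UNITY_ANDROID
        bool granted = Permission.HasUserAuthorizedPermission(HandTrackingPermission);
        Debug.Log($"Hand tracking permission granted: {granted}");
#else
        Debug.Log("Permission check only applies to Android/Quest builds.");
#endif
    }
}
```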
Solution 3: Fix Input Action Bindings for Hands
If hands are visible but gestures do not trigger:
- Open your Input Actions asset.
- Add or verify bindings for hand interaction paths used by your XR toolkit.
- Ensure action maps are enabled at runtime.
- Avoid duplicate maps that consume the same actions first.
Common issue: Only controller bindings exist, so hand gestures never hit gameplay actions.
Verification: Debug logs show performed/canceled events when pinching or grabbing.
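A small component like the following makes those debug logs easy to produce. It is a sketch, not a drop-in: the `pinchAction` reference must point at an action in your own Input Actions asset (e.g. a "Select" action bound to a hand interaction path):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Subscribes to a pinch/select action and logs its lifecycle so you can
// confirm hand gestures actually reach the Input System.
public class PinchDebug : MonoBehaviour
{
    [SerializeField] InputActionReference pinchAction; // assign in the Inspector

    void OnEnable()
    {
        if (pinchAction == null) return;
        pinchAction.action.performed += OnPerformed;
        pinchAction.action.canceled += OnCanceled;
        pinchAction.action.Enable(); // if the map is never enabled, callbacks never fire
    }

    void OnDisable()
    {
        if (pinchAction == null) return;
        pinchAction.action.performed -= OnPerformed;
        pinchAction.action.canceled -= OnCanceled;
    }

    void OnPerformed(InputAction.CallbackContext ctx) =>
        Debug.Log($"Pinch performed via: {ctx.control.path}");

    void OnCanceled(InputAction.CallbackContext ctx) =>
        Debug.Log("Pinch canceled");
}
```

Logging `ctx.control.path` is the quickest way to see whether a gesture resolved to a hand binding or was consumed by a controller binding instead.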
Solution 4: Check XR Origin and Hand Prefab Wiring
With XR Interaction Toolkit style setups:
- Ensure your XR Origin has left/right hand objects assigned.
- Confirm hand visual prefabs are referenced and active.
- Check tracking components are enabled on both hands.
- Remove duplicate rigs in the scene that may fight for tracking ownership.
Verification: Hands render and move with your real hand motion in Play/build.
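If hands do not render at all, it helps to separate "subsystem not running" from "scene wiring broken." A diagnostic sketch assuming the com.unity.xr.hands package is installed:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Reports whether an XRHandSubsystem exists and whether each hand is
// currently tracked; logs only on state changes to avoid spam.
public class HandSubsystemCheck : MonoBehaviour
{
    XRHandSubsystem hands;
    bool lastLeft, lastRight;

    void Start()
    {
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        hands = subsystems.Count > 0 ? subsystems[0] : null;

        if (hands == null)
            Debug.LogWarning("No XRHandSubsystem found - hand-tracking feature likely not enabled.");
    }

    void Update()
    {
        if (hands == null) return;
        bool left = hands.leftHand.isTracked;
        bool right = hands.rightHand.isTracked;
        if (left != lastLeft || right != lastRight)
            Debug.Log($"Hands tracked - left: {left}, right: {right}");
        lastLeft = left;
        lastRight = right;
    }
}
```

If the subsystem reports tracked hands but nothing renders, the fault is in the prefab/rig wiring above rather than in tracking itself.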
Solution 5: Build Settings and Permissions for Quest 3
- Set Build Target to Android.
- Confirm the Minimum API Level and graphics API (e.g. Vulkan or OpenGL ES 3) match current Quest requirements.
- In Player settings, ensure XR and required permissions are included.
- Rebuild cleanly after major XR setting changes.
Verification: Fresh APK runs on Quest 3 and hand input initializes during startup.
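If you ship a custom AndroidManifest.xml, the hand-tracking entries must be present or the runtime will silently deny hand input. These are the entries Meta documents for Quest hand tracking; verify them against the current Meta docs for your SDK version before relying on them:

```xml
<!-- Merge into your custom AndroidManifest.xml. required="false" lets the
     app also run with controllers only. -->
<uses-permission android:name="com.oculus.permission.HAND_TRACKING" />
<uses-feature android:name="oculus.software.handtracking" android:required="false" />
```

If you do not use a custom manifest, the enabled XR features normally inject these entries at build time; inspect the merged manifest in the build output when in doubt.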
Alternative Fixes for Stubborn Cases
- Delete the Library folder and let Unity reimport if XR packages have desynced after upgrades.
- Update Meta XR/OpenXR packages to compatible versions.
- Test in a minimal blank XR scene to isolate project-level conflicts.
- Disable custom input wrappers temporarily and test raw action callbacks.
Prevention Tips
- Keep one canonical XR setup prefab and reuse it across scenes.
- Lock package versions per project to avoid silent XR regressions.
- Add a startup diagnostic panel that reports active XR runtime and interaction mode.
- Test both controllers and hands before each release build.
Related Problems and Links
- For broader XR initialization problems, see Unity XR/VR Setup Not Working - Virtual Reality Development Fix.
- For multiplayer XR sync issues, see Unity Multiplayer Netcode Errors - Networking System Fix.
- Official docs: Unity OpenXR, Meta Quest Unity docs.
Bookmark this fix for your XR deployment checklist. Share it with your team if it saved a debugging session.
FAQ
Do I need controllers paired for hand tracking to work?
Not always, but hybrid mode and runtime configuration can affect fallback behavior. Test both controller-present and controller-absent scenarios.
Why does hand tracking work in one scene but not another?
Usually scene wiring differences: missing action map enable calls, wrong XR Origin prefab, or duplicate rigs.
Should I use OpenXR or a Meta-specific path?
Use one consistent path per project and keep package versions aligned. Mixed setups are a frequent cause of intermittent failures.