
WebXR Hand Tracking: Building Touchless Interfaces


Hand tracking in WebXR enables natural, touchless interaction. The WebXR Hand Input API exposes a pose for every joint of each hand, frame by frame, which is all you need to build gesture-based interfaces without controllers.

Browser Support

  • Chrome 94+: full support
  • Edge 94+: full support
  • Firefox: in development
  • Safari: limited

Basic Setup

Session Configuration

const session = await navigator.xr.requestSession('immersive-vr', {
  requiredFeatures: ['local', 'hand-tracking']
});
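If hand tracking should not be a hard requirement, request it as an optional feature instead, so the session still starts on devices without it. A minimal sketch (the helper name handSessionInit is invented here; enabledFeatures is how a granted session reports what it actually got):

```javascript
// Build session options that degrade gracefully: 'local' is required,
// 'hand-tracking' is merely requested.
function handSessionInit() {
  return {
    requiredFeatures: ['local'],
    optionalFeatures: ['hand-tracking'],
  };
}

// In the browser:
// const session = await navigator.xr.requestSession('immersive-vr', handSessionInit());
// const hasHands = session.enabledFeatures?.includes('hand-tracking');
```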

Accessing Hand Data

function onXRFrame(time, frame) {
  const session = frame.session;
  // Re-register for the next frame, otherwise the loop runs only once.
  session.requestAnimationFrame(onXRFrame);

  for (const source of session.inputSources) {
    // Input sources with a non-null .hand are tracked hands, not controllers.
    if (source.hand) {
      processHand(source.hand, source.handedness);
    }
  }
}

Joint Positions

The API provides 25 joints per hand. Note that XRHand is map-like and keyed by joint name strings ('wrist', 'thumb-tip', 'index-finger-tip', and so on), not numeric indices, and a joint's position comes from frame.getJointPose() rather than from a property on the joint itself:

const WRIST = 'wrist';
const THUMB_TIP = 'thumb-tip';
const INDEX_TIP = 'index-finger-tip';
const MIDDLE_TIP = 'middle-finger-tip';
const RING_TIP = 'ring-finger-tip';
const PINKY_TIP = 'pinky-finger-tip';

// Set currentFrame = frame at the top of onXRFrame each frame, and
// referenceSpace from session.requestReferenceSpace('local') at setup.
let currentFrame = null;
let referenceSpace = null;

function getJointPosition(hand, jointName) {
  const jointSpace = hand.get(jointName);
  const pose = jointSpace && currentFrame.getJointPose(jointSpace, referenceSpace);
  return pose ? pose.transform.position : null;
}

Gesture Recognition

Pinch Detection

function detectPinch(hand) {
  const thumbTip = getJointPosition(hand, THUMB_TIP);
  const indexTip = getJointPosition(hand, INDEX_TIP);
  if (!thumbTip || !indexTip) return false; // joints may be untracked this frame

  // Joint positions are DOMPoints, not THREE.Vector3s, so compute the
  // distance directly instead of calling distanceTo().
  const distance = Math.hypot(
    thumbTip.x - indexTip.x,
    thumbTip.y - indexTip.y,
    thumbTip.z - indexTip.z
  );
  return distance < 0.02; // 2 cm threshold
}

Pointing Gesture

function detectPoint(hand) {
  const indexExtended = isFingerExtended(hand, 'index');
  const othersCurled = ['middle', 'ring', 'pinky'].every(
    finger => !isFingerExtended(hand, finger)
  );

  return indexExtended && othersCurled;
}
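detectPoint leans on an isFingerExtended helper the snippet above doesn't define. One simple heuristic, sketched below: a finger counts as extended when its tip sits clearly farther from the wrist than the finger's proximal joint (the 1.4 ratio is a tuning starting point invented here, not a spec value; the joint names come from the WebXR hand joint list):

```javascript
// Joint names per finger, from the WebXR hand joint list.
const FINGER_JOINTS = {
  index:  { tip: 'index-finger-tip',  proximal: 'index-finger-phalanx-proximal' },
  middle: { tip: 'middle-finger-tip', proximal: 'middle-finger-phalanx-proximal' },
  ring:   { tip: 'ring-finger-tip',   proximal: 'ring-finger-phalanx-proximal' },
  pinky:  { tip: 'pinky-finger-tip',  proximal: 'pinky-finger-phalanx-proximal' },
};

// Euclidean distance between two {x, y, z} points.
function dist(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

// Heuristic core: an extended finger's tip is noticeably farther from the
// wrist than its proximal joint; a curled finger's tip folds back toward it.
function isExtendedFrom(wrist, proximal, tip, ratio = 1.4) {
  return dist(wrist, tip) > dist(wrist, proximal) * ratio;
}

function isFingerExtended(hand, finger) {
  const names = FINGER_JOINTS[finger];
  const wrist = getJointPosition(hand, 'wrist');
  const proximal = getJointPosition(hand, names.proximal);
  const tip = getJointPosition(hand, names.tip);
  if (!wrist || !proximal || !tip) return false;
  return isExtendedFrom(wrist, proximal, tip);
}
```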

Building Interactive Elements

Raycasting from Hand

function handRaycast(hand, scene) {
  const wristPos = getJointPosition(hand, WRIST);
  const indexPos = getJointPosition(hand, INDEX_TIP);
  if (!wristPos || !indexPos) return [];

  // Convert DOMPoints to THREE.Vector3 before doing vector math.
  const origin = new THREE.Vector3(wristPos.x, wristPos.y, wristPos.z);
  const tip = new THREE.Vector3(indexPos.x, indexPos.y, indexPos.z);
  const direction = tip.sub(origin).normalize();

  const raycaster = new THREE.Raycaster(origin, direction);
  return raycaster.intersectObjects(scene.children);
}

Accessibility Considerations

  1. Provide alternatives - Not all users can use hand tracking
  2. Visual feedback - Show hand position and gesture state
  3. Haptic feedback - Use vibration for confirmation
  4. Rest positions - Avoid fatigue with neutral poses
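On the haptic point: hand-tracked input sources generally expose no actuators, but when the user is holding a controller, a confirmation pulse can go through the Gamepad extension. A small guard helper, sketched here (pulseIfAvailable is a name invented for this post):

```javascript
// Fire a short confirmation pulse if the input source has a haptic actuator.
// Returns whether one was found, so callers can fall back to visual feedback.
function pulseIfAvailable(inputSource, intensity = 0.6, durationMs = 50) {
  const actuator = inputSource.gamepad?.hapticActuators?.[0];
  if (actuator) actuator.pulse(intensity, durationMs);
  return Boolean(actuator);
}
```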

Performance Tips

  • Update hand meshes at 30fps max
  • Use simple collision shapes
  • Batch joint updates
  • Debounce gesture detection
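The last tip can be as simple as requiring a gesture to hold for a few consecutive frames before acting on it. A minimal frame-count debouncer (a sketch; the names are invented here):

```javascript
// Returns a function that is fed the raw per-frame detection result and
// only reports true after `holdFrames` consecutive positive frames,
// suppressing single-frame tracking flicker.
function createGestureDebouncer(holdFrames = 5) {
  let consecutive = 0;
  return function update(detectedThisFrame) {
    consecutive = detectedThisFrame ? consecutive + 1 : 0;
    return consecutive >= holdFrames;
  };
}

// Usage inside the frame loop:
// const pinchStable = createGestureDebouncer(5);
// const pinching = pinchStable(detectPinch(hand));
```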

Conclusion

Hand tracking opens new interaction paradigms. Start with simple gestures, add complexity gradually, and always provide fallbacks.