Live Camera Tracking

Real-time hand tracking with gesture recognition using your webcam

Click "Start Camera" to begin live hand tracking

Interactive Demo

A live p5.js simulation of hand tracking inference

Move your mouse to control the hand; click to toggle gestures

Open Palm: all fingers extended
Pointing: index finger up
Peace Sign: index and middle up
Fist: all fingers curled
Thumbs Up: thumb extended, others curled
Rock Sign: index and pinky extended
OK Sign: thumb and index forming a circle
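Most of the gestures above can be distinguished purely by which fingers are extended, so one simple representation is a lookup table over per-finger extension states. The sketch below assumes such a table; it is illustrative, not the project's actual classifier. The OK sign is omitted because it also requires a thumb-to-index distance check, which extension flags alone cannot capture.

```python
# Hypothetical mapping from per-finger extension states to gesture names.
# Finger order: (thumb, index, middle, ring, pinky); True = extended.
GESTURES = {
    (True,  True,  True,  True,  True):  "Open Palm",
    (False, True,  False, False, False): "Pointing",
    (False, True,  True,  False, False): "Peace Sign",
    (False, False, False, False, False): "Fist",
    (True,  False, False, False, False): "Thumbs Up",
    (False, True,  False, False, True):  "Rock Sign",
}

def classify(fingers):
    """Map a 5-tuple of extension flags to a gesture name."""
    return GESTURES.get(tuple(fingers), "Unknown")
```

A lookup table keeps the gesture set easy to extend, but gestures defined by contact or distance (like the OK sign) need an extra geometric test on the landmark coordinates.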

Project Recording

Actual hand tracking model running on webcam input


Live capture of the hand tracking model detecting joints and predicting gestures


About This Project

This project implements real-time hand tracking using machine learning models to detect and predict hand positions and gestures from webcam input. The system identifies 20 finger joint landmarks, four per finger across all five fingers, plus the palm center, tracking each fingertip and every intermediate knuckle with sub-pixel precision.

Building on state-of-the-art hand pose estimation research, the model processes video frames to produce a full 2D hand skeleton in real time. Detected landmarks are connected to form a hand graph, enabling downstream gesture classification: the system recognizes common gestures such as open palm, pointing, peace sign, fist, thumbs up, rock sign, and OK sign by analyzing the spatial relationships between finger joints.
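One simple way to turn joint positions into the spatial relationships used for classification is a per-finger extension heuristic: a finger counts as extended when its tip lies well beyond its base knuckle relative to the palm. The function and the 1.3 margin below are assumptions for illustration, not the project's actual logic.

```python
import math

def finger_extended(joints, palm, margin=1.3):
    """Heuristic: a finger is 'extended' when its tip is noticeably
    farther from the palm center than its base knuckle.

    joints: list of (x, y) from base knuckle to fingertip
    palm:   (x, y) palm center
    margin: how much farther the tip must be (assumed value)
    """
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    base, tip = joints[0], joints[-1]
    return dist(tip, palm) > dist(base, palm) * margin
```

Running this check on all five fingers yields the extension pattern that a gesture classifier can match against known poses.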

The pipeline incorporates a bounding box detector to first localize the hand region, followed by a landmark regression model that predicts precise joint coordinates. Confidence scores are computed per-finger to indicate tracking reliability, and the full system runs at interactive frame rates suitable for gesture-based user interfaces.
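The two-stage pipeline described above can be sketched as a small driver function. `detect_hand` and `regress_landmarks` below are placeholder callables standing in for the project's actual detector and regression models; their interfaces are assumptions.

```python
def track_frame(frame, detect_hand, regress_landmarks):
    """Two-stage sketch: localize the hand, then regress joint coordinates.

    detect_hand(frame)       -> (x, y, w, h) in pixels, or None (assumed API)
    regress_landmarks(crop)  -> list of normalized (u, v) in [0, 1] (assumed API)
    """
    box = detect_hand(frame)
    if box is None:
        return None                      # no hand in this frame
    x, y, w, h = box
    # Crop the detected hand region from the full frame.
    crop = [row[x:x + w] for row in frame[y:y + h]]
    landmarks = regress_landmarks(crop)
    # Map crop-relative coordinates back into full-frame pixel space.
    return [(x + u * w, y + v * h) for (u, v) in landmarks]
```

Running the regressor only on the detected crop keeps the landmark model small and is what makes interactive frame rates feasible on webcam input.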

20 Landmarks

Tracks 4 joints per finger across all 5 fingers, plus the palm center, in real time

Gesture Recognition

Classifies hand poses into gestures using spatial joint relationships

Real-Time Inference

Runs at interactive frame rates for live webcam input processing

Confidence Scoring

Per-finger confidence values indicate tracking accuracy and reliability
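A minimal sketch of how per-joint scores might be collapsed into per-finger values. The grouping of four joints per finger follows the landmark layout described above, but the min aggregation is an assumed, conservative choice, not necessarily the project's method.

```python
FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

def per_finger_confidence(joint_scores, joints_per_finger=4):
    """Collapse 20 per-joint scores into 5 per-finger values.

    Taking the minimum within each finger is conservative: a finger is
    only as reliable as its worst-tracked joint (assumed aggregation).
    """
    out = {}
    for i, name in enumerate(FINGERS):
        group = joint_scores[i * joints_per_finger:(i + 1) * joints_per_finger]
        out[name] = min(group)
    return out
```

Downstream code can then ignore gestures whose deciding fingers fall below a confidence threshold.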


Technologies & Topics

Core areas explored in this project

hand-tracking, machine learning, gesture recognition, computer vision, hand detection, pose estimation, landmark regression, real-time inference, Python, TensorFlow, OpenCV, deep learning