Camera-driven Particle/Hand-tracking Effects in the Browser
Camera-driven “sand/particles follow your hand” — Web Examples
A short, linkable list of browser-based projects similar to the TouchDesigner demo (webcam → motion/hand tracking → particle effects).
Live demos / code you can open now
- threejs-webcam-particle-visualizer — Three.js particles driven by webcam pixels.
  GitHub: FollowTheDarkside/threejs-webcam-particle-visualizer
- Sand ghost (Chrome Experiment) — Uses a displacement map from your webcam to affect particle velocity (WebGL).
  Experiment page: experiments.withgoogle.com/sand-ghost
- ASCII portal + hand tracking — Real-time browser effect using Three.js, MediaPipe, and WebGL shaders.
  Post: Reddit /r/webdev · Related thread: /r/creativecoding
- Three.js webcam material example — Minimal example of using the webcam as a texture; useful as a starting point (see the sketch after this list).
  Examples: threejs.org/examples
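For reference, here is a minimal sketch (not taken from any of the repos above) of that webcam-as-texture starting point. It assumes a module script with three available via a bundler or import map; the plane size and camera settings are arbitrary.

```js
// Minimal sketch: webcam feed as a Three.js texture.
import * as THREE from 'three';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, innerWidth / innerHeight, 0.1, 100);
camera.position.z = 2;

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

// Grab the webcam with getUserMedia() and feed it into a <video> element.
const video = document.createElement('video');
video.muted = true;
video.playsInline = true;
video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
await video.play();

// VideoTexture re-uploads the current video frame whenever it is sampled.
const webcamTexture = new THREE.VideoTexture(video);
const plane = new THREE.Mesh(
  new THREE.PlaneGeometry(1.6, 0.9),
  new THREE.MeshBasicMaterial({ map: webcamTexture })
);
scene.add(plane);

renderer.setAnimationLoop(() => renderer.render(scene, camera));
```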
Building blocks (libraries & docs)
- MediaPipe Hand Landmarker (Web/JS) — Detects 21 hand landmarks in real time, great for driving particle emitters/forces (see the sketch after this list).
  Guide: ai.google.dev … /hand_landmarker/web_js · Overview: Hand Landmarker (Overview)
- Three.js — JavaScript/WebGL/WebGPU 3D library used in many of the demos above.
  Repo: github.com/mrdoob/three.js
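The browser setup for the Hand Landmarker looks roughly like the sketch below, using the @mediapipe/tasks-vision package. The wasm CDN path and model URL follow the pattern in the linked guide, so treat them as placeholders and check the docs for current versions; `video` is assumed to be a playing <video> element like the one created above.

```js
import { FilesetResolver, HandLandmarker } from '@mediapipe/tasks-vision';

// Load the WASM runtime and the hand landmark model
// (URLs follow the docs; verify against the guide linked above).
const vision = await FilesetResolver.forVisionTasks(
  'https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@latest/wasm'
);
const handLandmarker = await HandLandmarker.createFromOptions(vision, {
  baseOptions: {
    modelAssetPath:
      'https://storage.googleapis.com/mediapipe-models/hand_landmarker/hand_landmarker/float16/1/hand_landmarker.task',
  },
  runningMode: 'VIDEO',
  numHands: 2,
});

// Call once per video frame (e.g. inside your render loop).
function detectHands(video) {
  const results = handLandmarker.detectForVideo(video, performance.now());
  // results.landmarks: one array per detected hand, each with 21 {x, y, z}
  // points normalized to the video frame ([0, 1] in x and y).
  return results.landmarks;
}
```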
More inspiration (hand/pose + Three.js)
- Fruit Ninja with MediaPipe + Three.js — Gesture-controlled browser game example.
  Thread: Reddit post
- Three.js scene using MediaPipe hand tracking — Discussion and pointers for integrating both.
  Thread: Reddit /r/threejs
Typical browser pipeline (quick reference)
- Get a webcam stream with getUserMedia().
- Run motion or hand tracking (e.g., MediaPipe Hand Landmarker).
- Pass the mask/landmarks/flow vectors to a particle system (Three.js + shaders).
- Map gesture/motion → emitter position, velocity, or force fields (see the sketches after this list).
- Add color/gradient/noise post effects; render to screen.
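To make steps 2–4 concrete, here is a hedged sketch that treats the index fingertip (MediaPipe landmark 8) as an attractor for a Three.js point cloud. It assumes the `renderer`, `scene`, `camera`, `video`, and `handLandmarker` objects from the earlier sketches; the particle count, force strength, and damping are arbitrary illustration values, not taken from any of the linked projects.

```js
import * as THREE from 'three';

// A simple CPU-side particle system; real projects often move this into a shader.
const COUNT = 5000;
const positions = new Float32Array(COUNT * 3);
const velocities = new Float32Array(COUNT * 3);
for (let i = 0; i < positions.length; i++) positions[i] = (Math.random() - 0.5) * 2;

const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
const points = new THREE.Points(
  geometry,
  new THREE.PointsMaterial({ size: 0.01, color: 0xffcf8a })
);
scene.add(points);

const attractor = new THREE.Vector3();

// Replaces the render loop from the webcam-texture sketch.
renderer.setAnimationLoop(() => {
  // Landmark 8 is the index fingertip; coordinates are normalized to [0, 1].
  const result = handLandmarker.detectForVideo(video, performance.now());
  if (result.landmarks.length > 0) {
    const tip = result.landmarks[0][8];
    // Map to roughly [-1, 1] view space; mirror x so movement feels natural.
    attractor.set((0.5 - tip.x) * 2, (0.5 - tip.y) * 2, 0);
  }

  // Force field: accelerate every particle toward the attractor, with damping.
  for (let i = 0; i < COUNT; i++) {
    const ix = i * 3;
    for (let a = 0; a < 3; a++) {
      const d = attractor.getComponent(a) - positions[ix + a];
      velocities[ix + a] = (velocities[ix + a] + d * 0.002) * 0.96;
      positions[ix + a] += velocities[ix + a];
    }
  }
  geometry.attributes.position.needsUpdate = true;

  renderer.render(scene, camera);
});
```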
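For step 5, one common choice in the browser is a bloom pass from Three.js's post-processing addons. A minimal sketch, assuming the `renderer`, `scene`, and `camera` from above and the `three/addons` import-map convention (older setups import from `three/examples/jsm/...`):

```js
import * as THREE from 'three';
import { EffectComposer } from 'three/addons/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/addons/postprocessing/RenderPass.js';
import { UnrealBloomPass } from 'three/addons/postprocessing/UnrealBloomPass.js';

// Wrap the normal render in a composer and add bloom on top.
const composer = new EffectComposer(renderer);
composer.addPass(new RenderPass(scene, camera));
composer.addPass(
  // arguments: resolution, strength, radius, threshold (tune to taste)
  new UnrealBloomPass(new THREE.Vector2(innerWidth, innerHeight), 1.2, 0.4, 0.1)
);

// Then, in the existing animation loop, swap:
//   renderer.render(scene, camera);
// for:
//   composer.render();
```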