Camera-driven Particle/Hand-tracking Effects in the Browser

Camera-driven “sand/particles follow your hand” — Web Examples

A short, linkable list of browser-based projects similar to the TouchDesigner demo (webcam → motion/hand tracking → particle effects).

Live demos / code you can open now

  1. threejs-webcam-particle-visualizer — Three.js particles driven by webcam pixels.
    GitHub: FollowTheDarkside/threejs-webcam-particle-visualizer
  2. Sand ghost (Chrome Experiment) — Uses a displacement map from your webcam to affect particle velocity (WebGL).
    Experiment page: experiments.withgoogle.com/sand-ghost
  3. ASCII portal + hand tracking — Real-time browser effect using Three.js, MediaPipe, and WebGL shaders.
    Post: Reddit /r/webdev · Related thread: Reddit /r/creativecoding
  4. Three.js webcam material example — Minimal example of using the webcam as a texture; useful as a starting point (a small sketch of this idea follows the list).
    Examples: threejs.org/examples
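
Item 4 is probably the simplest place to start. Below is a minimal sketch of that idea in TypeScript: open the webcam with getUserMedia() and show it as a Three.js VideoTexture on a plane. The `<video id="webcam">` element and the plane dimensions are assumptions for illustration, not details taken from the linked example.

```ts
import * as THREE from "three";

// Minimal sketch: webcam stream -> Three.js VideoTexture on a plane.
// Assumes a <video id="webcam" muted playsinline> element exists in the page.
const video = document.getElementById("webcam") as HTMLVideoElement;

async function start(): Promise<void> {
  // 1. Ask for the camera and pipe the stream into the video element.
  video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
  await video.play();

  // 2. Basic scene, camera, renderer.
  const scene = new THREE.Scene();
  const camera = new THREE.PerspectiveCamera(50, innerWidth / innerHeight, 0.1, 10);
  camera.position.z = 2;
  const renderer = new THREE.WebGLRenderer();
  renderer.setSize(innerWidth, innerHeight);
  document.body.appendChild(renderer.domElement);

  // 3. The playing video becomes an ordinary texture that refreshes each frame.
  const texture = new THREE.VideoTexture(video);
  const plane = new THREE.Mesh(
    new THREE.PlaneGeometry(1.6, 0.9),
    new THREE.MeshBasicMaterial({ map: texture })
  );
  scene.add(plane);

  renderer.setAnimationLoop(() => renderer.render(scene, camera));
}

start();
```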

Building blocks (libraries & docs)

  • getUserMedia() for camera access (MDN Web Docs)
  • MediaPipe Hand Landmarker for real-time hand tracking in the browser (Google AI Edge)
  • Three.js for WebGL rendering, particles, and shaders (threejs.org)

More inspiration (hand/pose + Three.js)

  • Fruit Ninja with MediaPipe + Three.js — Gesture-controlled browser game example.
    Thread: Reddit post
  • Three.js scene using MediaPipe hand tracking — Discussion and pointers for integrating both (a tracking sketch follows this list).
    Thread: Reddit /r/threejs
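
If you want hand landmarks rather than raw webcam pixels, the MediaPipe Tasks API is the usual route in the browser. A rough sketch, assuming the @mediapipe/tasks-vision package and a hand_landmarker.task model file served next to the page (the CDN path and model location are assumptions):

```ts
import { FilesetResolver, HandLandmarker } from "@mediapipe/tasks-vision";

// Load the WASM runtime and the hand-landmark model (paths are assumptions).
async function createTracker(): Promise<HandLandmarker> {
  const vision = await FilesetResolver.forVisionTasks(
    "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision/wasm"
  );
  return HandLandmarker.createFromOptions(vision, {
    baseOptions: { modelAssetPath: "hand_landmarker.task" },
    runningMode: "VIDEO",
    numHands: 1,
  });
}

// Per video frame: return the 21 normalized landmarks (x, y in [0, 1]) for the
// first detected hand, or null when no hand is visible.
function trackFrame(
  tracker: HandLandmarker,
  video: HTMLVideoElement
): { x: number; y: number; z: number }[] | null {
  const result = tracker.detectForVideo(video, performance.now());
  return result.landmarks.length > 0 ? result.landmarks[0] : null;
}
```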

Typical browser pipeline (quick reference)

  1. Get webcam stream with getUserMedia().
  2. Run motion or hand tracking (e.g., MediaPipe Hand Landmarker).
  3. Pass mask/landmarks/flow vectors to a particle system (Three.js + shaders).
  4. Map gesture/motion → emitter position, velocity, or force fields.
  5. Add color/gradient/noise post effects; render to screen (a sketch of steps 3 and 4 follows this list).
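
Putting steps 3 and 4 together, here is a minimal CPU-side sketch (TypeScript + Three.js) in which every particle is pulled toward the tracked index fingertip. The particle count, spring and damping constants, and the normalized-to-scene coordinate mapping are arbitrary choices for illustration; a real "sand" effect would normally move this update into a shader.

```ts
import * as THREE from "three";

// Particle field attracted toward a tracked fingertip ("sand follows the hand").
const COUNT = 5000;
const positions = new Float32Array(COUNT * 3);
const velocities = new Float32Array(COUNT * 3);
for (let i = 0; i < COUNT * 3; i++) positions[i] = (Math.random() - 0.5) * 2;

const geometry = new THREE.BufferGeometry();
geometry.setAttribute("position", new THREE.BufferAttribute(positions, 3));
const points = new THREE.Points(
  geometry,
  new THREE.PointsMaterial({ size: 0.01, color: 0xffcc88 })
);

function updateParticles(tip: { x: number; y: number } | null, dt: number): void {
  if (!tip) return;
  // Landmarks are normalized [0, 1] with y pointing down; map into roughly
  // [-1, 1] scene space and mirror x so motion matches what the user sees.
  const target = new THREE.Vector3((0.5 - tip.x) * 2, (0.5 - tip.y) * 2, 0);
  for (let i = 0; i < COUNT; i++) {
    const ix = i * 3;
    for (let a = 0; a < 3; a++) {
      // Simple spring-like attraction plus damping.
      const d = target.getComponent(a) - positions[ix + a];
      velocities[ix + a] = velocities[ix + a] * 0.92 + d * 2.0 * dt;
      positions[ix + a] += velocities[ix + a];
    }
  }
  geometry.attributes.position.needsUpdate = true;
}

// Add `points` to the scene from the first sketch, then inside the render loop
// (landmark 8 is the index fingertip in MediaPipe's hand model):
//   const hand = trackFrame(tracker, video);
//   updateParticles(hand ? hand[8] : null, dt);
//   renderer.render(scene, camera);
```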
