A gesture-based reactive viewport instrument that converts movement into melody. MediaPipe passes live camera data into Blender's physics engine, and a set of glistening chimes responds to hand gestures with a delicate generative soundscape.
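A minimal sketch of the camera-to-viewport bridge, not the production script: it assumes MediaPipe's Hands solution and a Blender empty named "HandCollider" (a hypothetical name) that the physics engine treats as the hand. Both cv2 and mediapipe have to be installed into Blender's bundled Python for this to run.

```python
import bpy
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)

def track_hand():
    ok, frame = cap.read()
    if not ok:
        return 0.033  # no frame yet, try again next tick
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        wrist = result.multi_hand_landmarks[0].landmark[0]  # landmark 0 = wrist
        collider = bpy.data.objects["HandCollider"]  # hypothetical collider empty
        # Remap normalized image coords into viewport space;
        # the scale factors here are assumptions, tuned by eye.
        collider.location.x = (wrist.x - 0.5) * 4.0
        collider.location.z = (0.5 - wrist.y) * 4.0
    return 0.033  # ~30 Hz polling via Blender's timer system

bpy.app.timers.register(track_hand)
```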


Blender isn't natively built for live interaction, so I got creative with how the viewport responds to real-time input. A custom Python bridge maps collision data directly to audio samples: it monitors for velocity spikes and fires a sample the moment a gesture produces an impact. Each clink carries subtle pitch and volume variations to keep the acoustic soundscape organic and artful.
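A sketch of that impact-to-audio mapping, under some assumptions: speed is estimated by differencing each chime's world position between timer ticks, playback goes through Blender's bundled aud module, and "chime.wav", the "Chimes" collection, the spike threshold, and the variation ranges are all placeholders tuned by ear.

```python
import aud
import random
import bpy

device = aud.Device()
chime = aud.Sound("chime.wav")  # placeholder sample path
SPIKE_THRESHOLD = 0.8           # units per second (assumption)
TICK = 1 / 30
_prev = {}                      # object name -> last world position

def watch_chimes():
    for obj in bpy.data.collections["Chimes"].objects:  # hypothetical collection
        pos = obj.matrix_world.translation.copy()  # reflects the simulated transform
        last = _prev.get(obj.name, pos)
        _prev[obj.name] = pos
        speed = (pos - last).length / TICK
        if speed > SPIKE_THRESHOLD:
            # Subtle random pitch/volume variation keeps repeated clinks organic.
            varied = chime.pitch(random.uniform(0.95, 1.05)) \
                          .volume(random.uniform(0.7, 1.0))
            device.play(varied)
    return TICK

bpy.app.timers.register(watch_chimes)
```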
Impacts also trigger an object-level heat signal, which the shader reads to create snappy, beautiful pulses of light. The result is an immersive simulation where hand movement controls the wind.
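One plausible way to wire that up, assuming the heat signal lives in a custom property named "heat" on each chime, which the material reads through an Attribute node set to the Object type: the property spikes on impact and decays every tick, producing the snappy pulse. The decay rate and collection name are assumptions.

```python
import bpy

DECAY = 0.85  # per-tick falloff (assumption, tuned by eye)

def register_impact(obj, strength):
    """Spike the heat signal on impact; the shader picks it up via an Attribute node."""
    obj["heat"] = min(obj.get("heat", 0.0) + strength, 1.0)

def decay_heat():
    for obj in bpy.data.collections["Chimes"].objects:  # hypothetical collection
        obj["heat"] = obj.get("heat", 0.0) * DECAY
    return 0.033

bpy.app.timers.register(decay_heat)
```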
All recorded in real time in EEVEE, straight through the viewport.

