Jason Levine's profile

Kinect Dance Experiments

Up until 2013, I used the Kinect to projection-map generative graphics onto musicians, dancers, circus performers, and myself in real time. But one day I wondered about the possibilities of using the Kinect to put a performer into a virtual world. I made a quick-and-dirty OpenNI recorder from an openFrameworks example and invited some awesome dancers to my studio. After recording the dancers, I wrote an app that let me play back the footage, and I instantly saw the potential. For my first experiment, I created transparent ribbons from the positions of the dancer's hands and elbows, and controlled the brightness of the points making up the dancer's body with the different frequencies of the music. I played with rotating the dancer and tried out different render modes and blend modes.
 
The dancer is Claudia Chan Tak, and the music is Science Friction by Bird of Prey and Random.
For my second experiment, I designed a new look from scratch. I began by running bezier curves along the dancer's joints and made each curve light up to a different frequency in the music. Then I created a particle system and made each particle attracted to whichever of the dancer's joints was closest to it. I then had the size of each particle vary based on the frequencies in the music. Finally, rather than rotating the points of the dancer in 3D space, I rotated the camera around the dancer. Threw on some video feedback for good measure, and then went through my blend modes.
 
The dancer is Milan Panet-Gignon, and the music is Inna Citadel by Jeremy's Aura.