Tangible Dynamics (2019)

People with orb in hand.

This project was the final project for my Introduction to Physical Computing class at NYU and was co-created with Nicole Ginelli. The idea grew out of our shared interests in interactive 3D animations and real-time user control, and a desire to put into practice several approaches we learned during the semester.

We used a Kinect 2 for depth, luminance, and color tracking. Combining these three signals let us constrain the tracking so that only the spheres were followed, while people's extremities were ignored.
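In TouchDesigner this filtering was done with the Kinect operators directly, but the underlying idea is easy to sketch. Below is a minimal, hypothetical Python example (using NumPy and OpenCV, neither of which was part of the actual patch; the depth band, hue band, and area threshold are assumptions) showing how a depth mask and a color mask can be combined so that only the spheres survive:

```python
import numpy as np
import cv2

def find_spheres(depth_mm, hsv_frame,
                 depth_range=(900, 1200),   # assumed table distance band, in mm
                 hue_range=(90, 130),       # assumed sphere hue band
                 min_area=200):             # assumed minimum blob size, in pixels
    """Return (x, y) centroids of blobs that match BOTH the depth and color bands."""
    # Keep only pixels whose depth falls inside the table band.
    depth_mask = cv2.inRange(depth_mm, depth_range[0], depth_range[1])

    # Keep only pixels whose hue matches the spheres; the value threshold rejects dim regions.
    color_mask = cv2.inRange(hsv_frame,
                             (hue_range[0], 80, 120),
                             (hue_range[1], 255, 255))

    # A pixel must pass both tests, so hands hovering above the table are rejected.
    mask = cv2.bitwise_and(depth_mask, color_mask)

    # Blob centroids become the tracked coordinates.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue
        m = cv2.moments(c)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids
```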

This video feed was processed in TouchDesigner, which we used to extract X and Y coordinates for each tracked object. We then sent these values to Unity3D over the OSC communication protocol and assigned each coordinate pair to a control object inside the particle systems we built with the game engine.
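In practice the messages went out through TouchDesigner's OSC operators, but the message shape is easy to illustrate. Here is a minimal sketch using the python-osc package; the address pattern, port, and sphere IDs are assumptions for illustration, not our actual values:

```python
from pythonosc.udp_client import SimpleUDPClient

# Unity listens on this host/port through an OSC receiver plugin (values assumed).
client = SimpleUDPClient("127.0.0.1", 9000)

def send_sphere_position(sphere_id, x, y):
    """Send one tracked sphere's normalized coordinates to Unity."""
    # e.g. /sphere/2/xy 0.41 0.73 -> Unity maps this onto a particle-system control object.
    client.send_message(f"/sphere/{sphere_id}/xy", [float(x), float(y)])

# Example: push the centroids found by the tracker every frame.
for i, (x, y) in enumerate([(0.41, 0.73), (0.62, 0.28)]):
    send_sphere_position(i, x, y)
```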
 
Simultaneously, we used Spout to send the game engine's camera view as a texture back to TouchDesigner, where we adjusted color, contrast, and brightness before projection-mapping the image onto the table with a short-throw projector.
 
Finally, each of the spheres you can see on the table contained an Arduino Nano 33 IoT driving its own NeoPixel LED strip. We used UDP communication to set the color and brightness of each strip according to the particle simulation being shown.
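A hedged sketch of the sending side in Python follows; the IP addresses, port, and four-byte payload layout are assumptions chosen for illustration, and the Arduino firmware would unpack the same bytes on its end:

```python
import socket

# One Arduino Nano 33 IoT per sphere, each with its own address on the local
# network (addresses and port are assumed for this sketch).
SPHERE_ADDRESSES = {
    "sphere_1": ("192.168.1.41", 8888),
    "sphere_2": ("192.168.1.42", 8888),
}

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def set_sphere_color(sphere, r, g, b, brightness=255):
    """Send one UDP packet telling a sphere's NeoPixel strip what to display."""
    payload = bytes([r, g, b, brightness])   # 4-byte packet the Arduino unpacks
    sock.sendto(payload, SPHERE_ADDRESSES[sphere])

# Example: tint both spheres to match the active particle simulation.
set_sphere_color("sphere_1", 0, 120, 255)
set_sphere_color("sphere_2", 255, 40, 0, brightness=180)
```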
 
In total, we had four different particle simulations, each with its own properties, LED color combination, and audio-reactive composition.
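As a rough sketch of how such modes can be organized, a small mapping from each simulation to its LED color and driving audio band might look like this (the names, colors, and bands below are illustrative, not the actual settings):

```python
# Illustrative mapping of each simulation mode to its LED color and the audio
# band that drives it (all values hypothetical).
SIMULATIONS = {
    "swarm":   {"led_rgb": (0, 120, 255), "audio_band": "low"},
    "ribbons": {"led_rgb": (255, 40, 0),  "audio_band": "mid"},
    "sparks":  {"led_rgb": (255, 200, 0), "audio_band": "high"},
    "fog":     {"led_rgb": (180, 0, 255), "audio_band": "full"},
}
```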
Child in play.
Objects in play.
People with orb in hand.
Several people playing.

Developer View

Game Engine View, Tracking View, and Top View.