First, I constructed a 122×122 cm flat and painted a white circle onto it (to give the projection a surface to land on), with an inner diameter of 61 cm and an outer diameter of 122 cm. I did this at the workshop in the NYUAD Arts Center.
After that I set up a Kinect and a projector from the Grid, both facing downward toward the playing space. The playing space is limited to the flat, hereafter referred to as the stage. The stage was spiked on the floor so that it could be returned to position if anyone moved or bumped it.
In Processing (the code can be found below) I used a Kinect tracker with OpenCV to track objects in the playing space with a bounding box. The tracked point of each object was taken as the center of its bounding box. I adjusted the depth threshold to ensure it only tracked objects (or people’s arms, as the case may be) above the mid-torso level of a human.
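The depth-threshold idea can be sketched in plain Java (the `rawDepth` array and the millimeter values are hypothetical stand-ins for the Kinect's depth image, not the actual sketch's code):

```java
public class DepthThreshold {
    // Keep only pixels closer to the Kinect than the threshold, i.e.
    // above mid-torso height when the camera points straight down.
    static boolean[] mask(int[] rawDepth, int threshold) {
        boolean[] tracked = new boolean[rawDepth.length];
        for (int i = 0; i < rawDepth.length; i++) {
            // Smaller depth values mean closer to the camera.
            tracked[i] = rawDepth[i] < threshold;
        }
        return tracked;
    }

    public static void main(String[] args) {
        int[] depth = {500, 900, 1200}; // hypothetical depths in mm
        boolean[] m = mask(depth, 1000);
        System.out.println(m[0] + " " + m[1] + " " + m[2]); // prints "true true false"
    }
}
```

Only pixels that survive this mask are handed to the OpenCV blob/bounding-box step.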
I created 8 sections in an array. The sections were then used, also in an array, to establish the audio and visual triggers and their boundaries.
Each section was bounded by two of eight angles (in radians) and by an outer and an inner circle. The radii of these circles could be adjusted with key presses, allowing for easier setup against the physical playing space. The center of the circle, as well as the aforementioned Kinect depth threshold, could also be adjusted with key presses.
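The geometry above, mapping a tracked point to one of the eight sections between the inner and outer circles, can be sketched like this (a minimal assumption of how the lookup works; parameter names are illustrative, not the sketch's actual variables):

```java
public class Sections {
    // Map a tracked point (x, y) to one of 8 pie sections, or -1 if it
    // falls outside the annulus between the inner and outer circles.
    static int section(float x, float y, float cx, float cy,
                       float innerR, float outerR) {
        float dx = x - cx, dy = y - cy;
        float d = (float) Math.sqrt(dx * dx + dy * dy);
        if (d < innerR || d > outerR) return -1;  // center mask or outside noise
        double angle = Math.atan2(dy, dx);        // -PI .. PI
        if (angle < 0) angle += 2 * Math.PI;      // normalize to 0 .. 2*PI
        return (int) (angle / (Math.PI / 4));     // 8 sections of PI/4 each
    }
}
```

Because the radii and center are plain parameters, the key-press adjustments described above amount to nudging `cx`, `cy`, `innerR`, and `outerR` at runtime.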
The large circle was made just small enough to avoid noise from passers-by, and the small circle just large enough to cover only the individual in the middle of the stage. I used createGraphics in the KinectTracker to overlay the smaller circle so that the Kinect would not track the individual standing there.
The MidiBus was used to send notes on a C major scale to sforzando. During testing I discovered that a C major scale is not the best set of notes to use because, in an installation setting, the notes often clash due to visitors’ inexperience with the system. I therefore changed it, first to a diminished 7th, then to C, Eb, G, Ab, Bb, C, D, and F (60, 63, 67, 68, 70, 72, 74, 75).
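The section-to-pitch mapping reduces to a simple lookup over those eight MIDI numbers (the actual note-on/note-off calls go through The MidiBus and are omitted here; `noteFor` is an illustrative helper, not the sketch's real function):

```java
public class SectionNotes {
    // The eight MIDI pitches from the final scale, one per section.
    static final int[] NOTES = {60, 63, 67, 68, 70, 72, 74, 75};

    // Return the pitch assigned to a given section index (0..7).
    static int noteFor(int section) {
        return NOTES[section % NOTES.length];
    }
}
```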
During testing we also discovered that the notes needed to be faded in and out rather than simply triggered on and off. We therefore sent the sforzando output through ____, by way of Soundflower, and used the reverb setting to set the decay time to 60 seconds.
In Isadora (the patch can be found in the link below) I coded a colored circle with 8 sections using the GLSL Shader actor, corresponding to the sections coded in Processing. Each of the 8 sections could be individually triggered to appear through 8 OSC receivers listening on different channels from Processing. The sections had to be matched between Processing, Isadora, and the projector so that they all aligned.
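The per-section triggering amounts to each section index owning its own OSC channel. A trivial sketch of such an addressing scheme (the address pattern here is hypothetical; the actual channel names used between Processing and Isadora are not recorded in this write-up):

```java
public class OscAddresses {
    // Hypothetical per-section OSC address scheme: one address per
    // section, so Isadora can listen with 8 separate OSC receivers.
    static String addressFor(int section) {
        return "/section/" + section;
    }
}
```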
The final step in Isadora was projection mapping the shader onto the stage. Because the stage is based on a 122×122 cm flat, I created a white square overlaying the circle to allow for quicker and more accurate mapping.
Performance and Installation:
Code: December 13th
Code: December 15th
Disclaimer: ‘I’ refers to the combined efforts of Ethan David and Prof. Aaron Sherwood