Nahil and I first started by having a conversation about our own ideas and sharing videos that inspired us. I was keen on doing a live video performance, but also trying out something I had seen in a video. Coincidentally, the two videos we shared with each other were similar in that they had to do with lines that were made or affected by human movement.
https://www.youtube.com/watch?v=JA6kl4RlA0I&app=desktop (reference points at 0:20 and 0:40)
https://www.youtube.com/watch?v=g-a9WJA1aJY (reference point at 0:30)
We then talked about our options for detecting human form and movement. The Kinect was one; colour detection was the other. Before getting ahead of ourselves, we decided to settle on a theme. Our initial, loosely defined themes were:
- human–robot relationship
- gravity and anti-gravity
We landed on ‘connectedness’ as a theme. I still wanted to involve live video performance in some way, but given the turn our discussion took, that didn’t look likely to happen. I let go of the idea, especially since we were beginning to form a solid sense of what we wanted in our performance. Settling on connectedness as a theme also meant that we would have two people/objects in our performance. Inspired by the videos we had shared, lines would be drawn between the two figures to show a relationship between them, rather than presenting them as isolated figures.
We both knew that we didn’t want to be the performers in the piece. Since I was working on Rita Akroush’s Capstone, puppets were on my mind, so I suggested we have one human performer and one puppet performer. Nahil suggested that blob detection and chromakeying would be the way to differentiate between the figures, including the figure operating the puppet. I tried working with the Kinect, but I really wasn’t happy with its lag in detecting human form. It also wasn’t detecting the form of the puppet very well. It sometimes did (hooray), but most often wouldn’t. I reluctantly abandoned the Kinect and decided that Nahil was right – that we had to use a PS3 IR camera.
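The colour-detection route we discussed boils down to thresholding each frame for a key colour, grouping the matching pixels into a blob per figure, and then drawing a line between the blob centres. The snippet below is only a rough sketch of that idea, not our actual build: the tiny 8×8 "frame", the colour ranges, and the function names are all made up for illustration, and a real version would read camera frames and render the line on screen instead of printing it.

```python
# Hypothetical sketch of colour-keyed blob tracking:
# threshold a frame for each performer's key colour, average the
# matching pixel positions into a centroid, and treat the segment
# between the two centroids as the "connecting line".

def in_range(pixel, lo, hi):
    """True if each RGB channel of `pixel` lies within [lo, hi]."""
    return all(lo[c] <= pixel[c] <= hi[c] for c in range(3))

def blob_centroid(frame, lo, hi):
    """Average (x, y) of all pixels matching the colour range, or None."""
    hits = [(x, y)
            for y, row in enumerate(frame)
            for x, px in enumerate(row)
            if in_range(px, lo, hi)]
    if not hits:
        return None
    n = len(hits)
    return (sum(x for x, _ in hits) / n, sum(y for _, y in hits) / n)

# Toy 8x8 frame: mostly black, with a red patch standing in for the
# human performer and a green patch for the puppet.
BLACK, RED, GREEN = (0, 0, 0), (255, 0, 0), (0, 255, 0)
frame = [[BLACK] * 8 for _ in range(8)]
frame[1][1] = frame[1][2] = frame[2][1] = frame[2][2] = RED
frame[5][5] = frame[5][6] = frame[6][5] = frame[6][6] = GREEN

a = blob_centroid(frame, (200, 0, 0), (255, 60, 60))   # red blob
b = blob_centroid(frame, (0, 200, 0), (60, 255, 60))   # green blob
print("line from", a, "to", b)
# → line from (1.5, 1.5) to (5.5, 5.5)
```

In a real sketch the centroids would be recomputed every frame, so the line would follow the performers as they moved – which is what makes the lag of a slow tracker (like we saw with the Kinect) so noticeable.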