Bringing the sky from above to eye-level

I’ve been working from my initial instinct of drawing inspiration from UVA’s Triptych piece in Paris. Even though I was leaning towards creating a piece with a sound component as well, I’ve decided to create a project with the following impulses:

  • Colour as a real presence, and colour as being soothing/meditative
  • No sound
  • An installation encountered while passing by
  • To be placed in an outdoor walkway or mall

I’m still pursuing the idea of creating an installation that pushes against a blasé mindset. At the same time, I want to create an experience that allows people to enter a meditative mindset.

Because I was following my impulse to create an installation without sound, I initially thought to portray sound visually, and decided to create a formation that resembled a volume control.

Represented above is a series of light panels. A passerby would affect the luminance of each panel: as they travel from left to right, the luminance of each panel gradually increases; traveling from right to left, it decreases. To make it more complicated, I thought that each panel would have a distance sensor of its own, so a passerby could stop and watch the colours of a panel change depending on how close or far they are from it. After a long duration of someone standing in one place, or of no person being in front of them, the panels would return to their initial colour (the strawberry pink in the image). I also thought about the light being not just light but a video of clouds in the sky, so that changing the colours would be like changing the time of day: clouds during the day (white/light blue), at sunrise (orange/yellow), at sunset (red) or at night (dark blue tinge).
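To give a sense of how a single panel might behave, here is a rough Arduino sketch: one distance sensor (I’m assuming an HC-SR04 here) shifts the panel’s colour with proximity, and after a stretch with nobody nearby it falls back to the neutral pink. The pins, the two-metre range, the timeout and the RGB values are all placeholders to tune, not a finished implementation.

```cpp
// One panel: distance from an HC-SR04 shifts the colour, and after a period
// with no one nearby it returns to the neutral strawberry pink.
// Pins, range, timeout and colour values are placeholders.

const int TRIG_PIN = 9;
const int ECHO_PIN = 10;
const int RED_PIN = 3;     // PWM
const int GREEN_PIN = 5;   // PWM
const int BLUE_PIN = 6;    // PWM

const unsigned long IDLE_TIMEOUT = 10000;  // ms with no passerby before reset
const int NEUTRAL[3] = {255, 70, 120};     // approximate strawberry pink

unsigned long lastSeen = 0;

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(RED_PIN, OUTPUT);
  pinMode(GREEN_PIN, OUTPUT);
  pinMode(BLUE_PIN, OUTPUT);
}

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // 0 if nobody in range
  return duration / 58;                            // microseconds -> cm
}

void setColour(int r, int g, int b) {
  analogWrite(RED_PIN, r);
  analogWrite(GREEN_PIN, g);
  analogWrite(BLUE_PIN, b);
}

void loop() {
  long cm = readDistanceCm();

  if (cm > 0 && cm < 200) {              // someone within ~2 m
    lastSeen = millis();
    // Closer = warmer (sunset reds), further = cooler (daytime blues).
    int warmth = map(cm, 0, 200, 255, 0);
    setColour(warmth, 80, 255 - warmth);
  } else if (millis() - lastSeen > IDLE_TIMEOUT) {
    setColour(NEUTRAL[0], NEUTRAL[1], NEUTRAL[2]);  // return to neutral pink
  }

  delay(50);
}
```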

I was very intent on using at least seven ultrasonic range finders, but eventually I realized I was limited by the number of sensors I could source, and so I changed my idea slightly. Instead, I decided to use one screen that would show clouds passing by. This (ideally circular) screen would be built into a wall so that it’s part of the building, or of a store in a mall.

For now, I’m using a video of clouds in the sky that I found online. Ideally I would film my own footage, but what I really want to do is show a live stream of the sky, either of Abu Dhabi or from somewhere else. I think it would be really nice to see the sky of Peru, for instance, while walking in Abu Dhabi. It would also convey a sense of the time difference.

The only thing I have happening now is that the colour changes according to the passerby’s distance. But this isn’t enough, so I’m trying to find ways to complicate it. At the moment, I’m looking for ways to create the illusion of the clouds dispersing if a passerby were to blow at them.
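One approach I’m considering (untested, just a sketch) is to put a small electret microphone module near the screen and treat a sustained spike in its analog level as someone blowing, sending a cue over serial that the patch could use to trigger the dispersal effect. The pin, the threshold and the hold time below are guesses that would need tuning against room noise.

```cpp
// Rough sketch: treat a sustained spike on an electret mic module (A0) as a
// "blow" and send a cue over serial for whatever draws the clouds.
// Threshold and timing values are guesses.

const int MIC_PIN = A0;
const int BLOW_THRESHOLD = 600;      // analogRead level that counts as a blow
const unsigned long HOLD_MS = 300;   // level must stay high this long

unsigned long blowStart = 0;
bool blowing = false;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int level = analogRead(MIC_PIN);

  if (level > BLOW_THRESHOLD) {
    if (!blowing) {
      blowing = true;
      blowStart = millis();
    } else if (millis() - blowStart > HOLD_MS) {
      Serial.println("disperse");    // cue the cloud-dispersal effect
      blowing = false;
      delay(1000);                   // simple debounce between blows
    }
  } else {
    blowing = false;
  }

  delay(10);
}
```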

The violet/strawberry colour I chose as my neutral colour is an homage to La Monte Young and Marian Zazeela’s Dream House.

Meditative Spaces

I’m interested in doing an installation about meditative spaces in the city – creating a space, a moment of tranquility and calm, within the everyday craziness of the cityscape. Cities can be loud, chaotic and overwhelming. In my sound art class, we did a sound walk around the city and I was really surprised by how many sounds I could hear: a car backing up, children playing in a school down the block, the humming of generators, the chinking of a flag’s chain against the flagpole. I kept thinking about the field recordings I did of piazzas in Florence, and how the soundscape of a place changes over time. Walking around Florence and Abu Dhabi just listening made me realize how important it is to move through the city receptive to the soundscape around me rather than adopting a blasé mindset.

I was inspired by United Visual Artists’ light installation Triptych, which debuted in Paris in 2007. I love seeing those colours in such a grey, stoic place. From my understanding, the colours of the screens change according to the proximity of the spectator; when you are right up against a screen, it turns white. I assume they are using proximity sensors with a narrow field of view, so that the piece isn’t overwhelmed when there are many spectators in the space; it likely responds to the person closest to the panels.

Live visuals w/ heart pulse (or, sensors & senses sensing sensational sensations)

I loved the workshop with Kiori. I was surprised by how simple exercises could take over the whole body. Certain images stayed with me: eyes sinking into the sockets of your skull, being a palm tree swaying in the breeze but never falling, focussing on the balance of your body, the head being separate from the body. I thought it was really wonderful that an interactive media class took a moment to think about bodies and sensation. I found Andrija’s comments particularly valuable in thinking about the body – and specific parts of it, like the eyes – as an interactive system with inputs and outputs. Recently, a discussion about my capstone led to the realization that the framework within which I’m approaching my research has to do with this concept of inputs and outputs: taking music as something I don’t understand, using video to try to decipher it, and seeing what results from that mix.

The week’s article seemed technical at first, but for my own benefit I just want to break the definitions down into one-liners:

Absolute threshold: the point where something becomes noticeable to our senses

Difference threshold: the smallest change in a stimulus (in sensation) that we can notice

Signal detection theory: picking a meaningful signal out of background noise (focussing on something, ignoring/minimizing everything else)

Sensory adaptation: becoming less sensitive to a stimulus that stays constant (and ergo, noticing changes more than steady states)

Bearing these words in mind, I set out to create this week’s assignment. At first I was really unsure of what to do, since the possibilities were endless. I did know that I wanted to try working with a sensor I hadn’t used before. Thinking of our time with Kiori (thinking about breathing, having a sense of calm), I decided to use a heart pulse sensor. My subtle reaction would be an LED blinking to the pulse of a participant. I thought about ‘noise’, and since I was unsure of how to play with frequencies in Isadora, I thought of noise in terms of visuals. I considered using Processing so that values from the pulse sensor would be the input and the output would be a crazy array of shapes appearing and changing colours. While this sounds doable, I don’t have the skills for it yet, so I very reluctantly abandoned the idea of using Processing. I should stick to what I know.

Which is how I began to use Isadora. I explored the possibilities of its video effects and realized I could create shapes directly in Isadora and then play with effects on top of them. While the VJ in me did experiment with syphoning to Modul8, where I could endlessly play with effects, I wanted to try doing everything in Isadora. And this was the result:

First success: creating a circuit with the pulse sensor. Yay!

Second success: figuring out how to create visuals in Isadora. Double yay!

Where it falls short, however, is that the pulse sensor sometimes spits out values and makes the LED blink even when I am not in contact with it. I’m still not sure why it does that. Huzzah for the fact that it works with my pulse, though.
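For reference, the kind of sketch I have in mind for the pulse sensor looks roughly like this. The pins and the threshold are placeholders, and the interval check is only a guess at how the phantom readings might be filtered out (the idea being that noise rarely crosses the threshold at plausible heartbeat intervals); it isn’t something I’ve verified.

```cpp
// Pulse sensor on A0, LED on pin 13. The raw value streams to Isadora over
// serial; the LED only blinks when a threshold crossing arrives at a
// plausible heartbeat interval (roughly 40-200 bpm), which should ignore
// most crossings caused by random noise. Values are placeholders to tune.

const int PULSE_PIN = A0;
const int LED_PIN = 13;
const int BEAT_THRESHOLD = 550;

const unsigned long MIN_BEAT_GAP = 300;   // ms (~200 bpm upper bound)
const unsigned long MAX_BEAT_GAP = 1500;  // ms (~40 bpm lower bound)

unsigned long lastCrossing = 0;
bool aboveThreshold = false;

void setup() {
  pinMode(LED_PIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int signal = analogRead(PULSE_PIN);
  Serial.println(signal);              // raw stream for Isadora's Serial In Watcher

  if (signal > BEAT_THRESHOLD && !aboveThreshold) {
    aboveThreshold = true;
    unsigned long gap = millis() - lastCrossing;
    lastCrossing = millis();

    // Only blink if this crossing comes at a plausible heartbeat interval.
    if (gap > MIN_BEAT_GAP && gap < MAX_BEAT_GAP) {
      digitalWrite(LED_PIN, HIGH);
    }
  } else if (signal < BEAT_THRESHOLD) {
    aboveThreshold = false;
    digitalWrite(LED_PIN, LOW);
  }

  delay(20);
}
```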

All the visuals are affected by the user’s pulse. The values drive the scale of the shape and the number of facets it has. I added a motion blur so that there’s a trace of the shape every time it changes. The background is also affected by the pulse, though this might not be so obvious: I added a slit scan that rips the video/shape across the screen, and its height is affected by the pulse values. I used a wave generator set to random so that the background shifts, but having it fixed in one place also looks great.

For fun, this is me playing with it in Modul8:

It’d be interesting to extrapolate this and explore how changes in the heart rates of DJs and VJs could affect their music or visuals.

Childhood/real world

Since I’ve been deconstructing The Teletubbies in a different class, I thought I’d play with the idea of childhood utopia versus the fear + tragedy of the real world. I contrast an episode of The Teletubbies with breaking news of North Korea threatening to bomb the United States and the defensive measures the U.S. is taking.

I want to play with motion. The faster you move – the more active you are, like a child – the more the video transitions to The Teletubbies. If you stop and stay still, the real world hits you. The idea is to keep moving so that you stay in your childhood utopia with no grown-ups, in a land of green where rabbits sound like birds.

I tried this out with a PIR sensor, and though I was able to hook it up and make it work with Isadora, it was finicky and wasn’t outputting a wide range of values. So instead I tried the same idea using the computer’s webcam as an input. I had never tried this before, and found that the best way of detecting motion was to track changes in the brightness the camera picks up. It’s a little faulty, but it’s responding pretty well. To meet the assignment’s requirement of using a sensor, I use an ultrasonic sensor to mix between two versions of The Teletubbies: one in colour, and one in black and white that looks like a kind of dystopia. The point is to convey that a childhood of no worry and complete happiness is just an illusion, and to acknowledge that the ‘darkness’ in the world is ever present. By moving closer to or further from the sensor, you choose whether you view The Teletubbies in colour or in black and white.
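The ultrasonic side of this is fairly simple; a sketch of the kind of thing I mean is below (pins and the two-metre range are assumptions, not my exact setup). It reads the distance, smooths it with a running average so the crossfade doesn’t jitter, and sends a 0 to 100 value over serial for the Isadora patch to use as the mix amount between the two versions of the episode.

```cpp
// Ultrasonic sensor -> mix value for the colour / black-and-white crossfade.
// The distance is smoothed with an exponential moving average so the mix
// doesn't jump around. Pins and the 2 m range are assumptions.

const int TRIG_PIN = 9;
const int ECHO_PIN = 10;

float smoothedCm = 100.0;        // start somewhere in the middle of the range
const float SMOOTHING = 0.2;     // 0 = frozen, 1 = no smoothing

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  // Trigger a ping and time the echo.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long cm = pulseIn(ECHO_PIN, HIGH, 30000) / 58;   // 0 if nothing in range

  if (cm > 0 && cm <= 200) {
    // Smooth the reading, then map: close = 100 (full colour), far = 0 (B&W).
    smoothedCm += SMOOTHING * (cm - smoothedCm);
    int mix = map((long)smoothedCm, 0, 200, 100, 0);
    Serial.println(mix);
  }

  delay(100);
}
```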

Prototyping Orbit (Arduino + Isadora)

So this took longer than expected. I spent a while trying to figure out what was wrong with my code – and then began a series of experiments sourcing and mixing code. Vicky was very kind to help me out with it, but it still didn’t work, and it ended up being a circuit problem (Attilio helped me out). On the plus side, I figured out what was wrong and was able to rebuild my circuit. The light now changes brightness according to proximity – yay! Thinking this wasn’t enough, I wanted to see if I could incorporate sound in some way, so I turned to Isadora, where a Serial In Watcher actor detects the Arduino. I mapped the values to a change in playback speed because I wanted a drastic, audible change in the sound – ideally, in the final version of Orbit, it would be the pitch/frequency that changes, not the speed. It’s finicky, but it does work. Ish.
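The core of what’s working now amounts to something like the sketch below, not my exact code: the ultrasonic reading sets the LED’s PWM brightness, and the same value goes out over serial so the Serial In Watcher in Isadora can map it onto the playback speed. Pin numbers and the range are assumptions.

```cpp
// Proximity -> LED brightness, with the same value sent over serial so
// Isadora's Serial In Watcher can map it to playback speed. Pin numbers
// and the 150 cm range are assumptions.

const int TRIG_PIN = 9;
const int ECHO_PIN = 10;
const int LED_PIN = 3;           // PWM-capable pin

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(LED_PIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  // Ping the sensor and convert the echo time to centimetres.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long cm = pulseIn(ECHO_PIN, HIGH, 30000) / 58;

  if (cm > 0 && cm <= 150) {
    int brightness = map(cm, 0, 150, 255, 0);  // closer = brighter
    analogWrite(LED_PIN, brightness);
    Serial.println(brightness);                // Isadora maps this to speed
  }

  delay(60);
}
```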

Orbit

There is a pitch-black room. It’s a small room, but it’s so dark that you cannot see its corners, and so you have a feeling of expanse, of endlessness. Unseen in the middle of the room is a floating sphere.

A participant enters the pitch-black room. As soon as he/she takes a step forward, the sphere illuminates. Dimly, just enough so that the participant is aware that there is a source of light in the room.

The closer the participant walks towards the sphere, the brighter it becomes. If the participant circles around the sphere, it flickers (gently dimming and returning to its full luminosity). The sphere moves away if the participant attempts to touch it: a participant never makes physical contact with the sphere. Instead, at a certain proximity (e.g. three inches between the participant’s hand and the sphere), the sphere moves, so the participant is able to guide the sphere as it gently floats, without ever touching it.

The idea behind this immersive installation is to create a meditative space. It also mimics a small-scale solar system, with the participant as a planet travelling around the source of light. I want participants to trust themselves when placed in a room that is completely dark, without knowing how big or small the space is. Taking a step forward ‘activates’ the sphere, conveying that their assurance and trust in themselves will guide them.

For sound, there is a general sound to the ‘air’ of the room. The sound changes the closer the person is to the sphere, perhaps increasing gently in frequency/pitch, but never so much that it is deafening (emphasis on meditative space).

One problem, though, is that the installation is conceived for one person in the space. I thought about expanding the idea to multiple participants, and having each participant who enters embody a new musical note, so that each step a participant takes is equivalent to playing that note. However, I am uncertain of how the source of light should change with the number of participants, and whether in this case the sphere should be kept still rather than floating freely. The room would then be bigger and have multiple entry points, so that participants can enter from wherever they please and are aware of each other’s presence through the addition of new musical notes and how bright the sphere becomes.

Roy and Chris and Interactivity

What I love about the way both Roy Ascott and Chris Crawford write is that their work is easy to digest because of its conversational tone.

Crawford’s main argument is that there is a difference between interaction and reaction. I am reminded of my brother when he uses the word ‘interactive’ in relation to blogs and websites; I often have to ask him what he means by that. Now I can bring up Crawford’s point about how he anthropomorphizes interaction using the terms ‘listening’, ‘thinking’, and ‘speaking’.

There were two main arguments I valued in Ascott’s text. The first was how he articulated interaction as a feedback loop. The second was the importance he placed on uncertainty and participation. He talks about how “an important characteristic in modern art…. is that it offers a high degree of uncertainty and permits a great intensity of participation” (Ascott 112). This re-emphasizes Crawford’s argument that the installation/object should receive an action from a participant, decipher it and choose a response, and then respond. Interaction should be seen as a conversation between the audience and the product.