Shaders

Mouse controls the X axis with a Stage Mouse Watcher.

The Y axis remains fixed. I did this because I discovered that putting the Stage Mouse Watcher through a Calculator stops the Shader from receiving the input.

The shader is put through a Difference actor, which is activated when the mouse moves. I then colorised the shader with four different colors, activated at will by different Keyboard Watchers (with toggle switches attached to the bypass) to create different variations of colors.

The intention was to give an interactive neon lighting effect, inspired a little by Tron.

I changed the speed of the lines in the Shader's code, as well as the size of the lines and the distance between them. I also had to add the mouse to the code. The issue that came up was that (0,0) is defined as the middle of the Stage in the code. I counteracted this a little by changing the line to vec2 uv = fragCoord.xy / iResolution.xy - .225, but it meant that the mouse only lines up in the middle and gets slightly off at the edges of the screen. I was unable to find a solution to this because, as mentioned earlier, putting the mouse through a Calculator negated its input.
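For reference, the off-center mouse problem comes down to the two coordinate spaces not sharing an origin: the mouse watcher reports a normalized 0-to-1 position, while the shader's uv is only shifted by a constant. Here is a sketch of a mapping that would line up everywhere, not just in the middle (written in Python for clarity rather than GLSL; the function name, the 0-to-1 mouse range, and the aspect ratio are assumptions, not values from the patch):

```python
def stage_to_shader(mx, my, aspect=16 / 9):
    """Map a normalized stage mouse position (0..1 on each axis) into a
    shader space centered on (0, 0), i.e. the equivalent of
    uv = (fragCoord - 0.5 * iResolution) / iResolution.y in GLSL."""
    x = (mx - 0.5) * aspect  # center horizontally, correct for aspect ratio
    y = my - 0.5             # center vertically
    return x, y

# The stage center lands exactly on the shader origin, and the corners
# map symmetrically, so no magic constant like -.225 is needed.
print(stage_to_shader(0.5, 0.5))  # (0.0, 0.0)
```

Because both spaces are centered the same way, the mouse stays aligned at the edges of the screen as well as the middle.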

 

Shader Experiment – Sept 25th

I added mouse and speed controls to the camera movement of a shader that travels down a long, groovy, cube-y hallway. I added some new variables that I could change with the mouse and Keyboard Watchers. Having made a kind of control scheme, I would next want to see what ways the visuals (such as the color and vertices) can be modified. I also want to figure out how to make the acceleration smoother, or add a more "bouncy" feel to the controls.
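For the smoother or bouncier acceleration, one common technique is damped-spring smoothing: each frame, the control value chases its target instead of jumping straight to it. A minimal per-frame sketch (in Python; the stiffness, damping, and frame-rate numbers are illustrative, not values from the patch):

```python
def spring_step(pos, vel, target, dt, stiffness=40.0, damping=8.0):
    """One frame of damped-spring smoothing: pos chases target.
    Lowering damping relative to stiffness makes the motion bouncier
    (more overshoot); raising it makes the motion smoother."""
    accel = stiffness * (target - pos) - damping * vel
    vel += accel * dt
    pos += vel * dt
    return pos, vel

# Chase a target of 1.0 from rest at 60 fps; the value overshoots
# slightly, oscillates, then settles on the target.
pos, vel = 0.0, 0.0
for _ in range(200):
    pos, vel = spring_step(pos, vel, 1.0, dt=1 / 60)
```

The same update could live in a per-frame buffer pass on ShaderToy, or be driven by generators feeding the shader's camera input.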

The shader used is "Merger Tunnel": https://www.shadertoy.com/view/XslGzl

Particle Patches – September 18

Musical Particles

My first patch, Musical Particles, uses a sound watcher to dictate the number of particles being generated, as well as their size and color. This was done by limiting the range of three Sound Level Watchers and connecting the outputs to a Color Maker HSBA, six Comparators, and the Mid Size input of 3D Particles. I also used a mouse watcher to control where the particles come from, which would allow a person to trace a dancer across a screen. The Stage Mouse Watcher was connected to a series of Calculators, Maps, and Limit-Scale Values. Three Random number generators also decide the respective x, y, and z velocities. These Random generators are also connected to a Comparator and a Sound Level Watcher to ensure no particles are produced under a certain level of sound.
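The gating logic described above (no particles below a certain sound level, with the level also driving size and color) can be sketched roughly like this outside Isadora (in Python; the threshold, the scaling ranges, and the particle-count formula are placeholders, not the patch's actual values):

```python
import random

def emit_particles(sound_level, threshold=0.2):
    """Mimic the Comparator + Sound Level Watcher wiring: below the
    threshold nothing is emitted; above it, the level scales particle
    size and hue, and each particle gets a random x/y/z velocity."""
    if sound_level < threshold:     # the Comparator gate
        return []
    size = 0.1 + sound_level * 0.9  # louder -> bigger (Limit-Scale Value)
    hue = sound_level % 1.0         # level mapped into the HSBA hue input
    count = int(sound_level * 10)   # louder -> more particles
    return [{"size": size, "hue": hue,
             "vel": tuple(random.uniform(-1, 1) for _ in range(3))}
            for _ in range(count)]

print(len(emit_particles(0.05)))  # 0: below the threshold, nothing emitted
```

In the patch itself these mappings are done by the Comparators and Limit-Scale Values rather than arithmetic, but the flow of values is the same.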

Rainbow Snake

My second patch, Rainbow Snake, uses the Stage Mouse Watcher connected to two Maps and, in the case of the vertical position, a Calculator (to counter the vertical mirroring you get when using the Map). A Pulse Generator is used to generate particles at its highest frequency, giving the illusion of a continuous line. The number of particles allowed on screen is also increased to its highest setting. A Wave Generator is then connected to a Color Maker HSBA in order to cycle through the colors of the rainbow as the line is traced out. In honor of the game Snake, a Picture Player with the game's logo is used as a Texture Map.
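The Wave Generator feeding the Color Maker HSBA amounts to sweeping the hue input with a periodic signal while saturation and brightness stay fixed. A small sketch of that cycle (in Python, using the standard colorsys module; the two-second period is an assumed value, not the patch's):

```python
import colorsys

def rainbow_color(t, period=2.0):
    """Cycle the hue once per `period` seconds, like a Wave Generator
    driving the hue input of a Color Maker HSBA at full saturation
    and brightness; returns an (r, g, b) tuple in 0..1."""
    hue = (t / period) % 1.0  # sawtooth wave around the hue circle
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)

print(rainbow_color(0.0))  # (1.0, 0.0, 0.0): the cycle starts on pure red
```

Halfway through the period the hue lands on cyan, and after a full period the color wraps back to red, which is what produces the continuous rainbow along the traced line.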

September 18th – Particles

 

The first patch uses a Counter and Keyboard Watchers (WASD + QE controls) to translate one particle along the three axes. It also switches between two walking sprite animations depending on which direction the particle is moving, and plays the video only when a button is pressed. Pressing "v" also causes a particle to be "thrown". I wanted to see if the particle system could be used to make a game, or at least simulate a playable game (since I haven't figured out whether collisions are possible with particles).

The second patch is a modified version of Aaron’s computer vision patch, using “Eyes” to highlight an area where “tears” fall. This was to see how particles could combine with computer vision.

The third patch also uses "Eyes" and computer vision, but with only 3 columns. These change the direction of the horizontal gravity on the particles (-10 or 10), which are constantly being generated. If the level of movement (brightness) falls below a certain number, or "Eyes" tracks something in column 2, the gravity is set to 0. This makes an interesting effect, like having to brush away dust or leaves. I also want to add a changing vertical gravity based on the rows.
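The column-to-gravity rule reads like a small lookup: one outer column pulls one way, the other pulls the opposite way, and the middle column or too little motion zeroes the pull. A sketch of that decision (in Python; which outer column maps to which sign, and the brightness threshold, are my illustrative guesses, not values from the patch):

```python
def horizontal_gravity(column, brightness, threshold=0.1):
    """Map the 3 'Eyes' columns to horizontal particle gravity:
    column 1 pulls left (-10), column 3 pulls right (10), and
    column 2, or motion below the threshold, stills the particles."""
    if brightness < threshold or column == 2:
        return 0
    return -10 if column == 1 else 10

print(horizontal_gravity(1, 0.5))  # -10: strong motion in the left column
```

A changing vertical gravity based on rows would just add a second lookup of the same shape for the y axis.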

These patches can be downloaded here: https://github.com/thatgracehuang/Sensors-Body-Motion/blob/master/particle%20exp.izz

September 11 – 2 Isadora Patches

Patch 1

The first patch uses three different groups of generators (a pulse generator attached to a random number generator, a pulse generator attached to an envelope generator, and a wave generator) to adjust the color of the dancer continuously throughout the patch. I went through many renditions of this patch and noticed a growing interest in watchers. In the end, I therefore placed a sound watcher in this patch that increases the size of the dancer's dots when the room gets louder. I tried to emulate a disco feel with the patch, so I used fish eye on the bubbles video to create a rounded effect. I also used difference and motion blur to distort the video slightly. I wanted to crop the video into a circle and have a mouse watcher adjust the location and size of the crop (imagine a spinning disco ball), but I could not find the appropriate actors.

Patch 2

In the second patch I used two reflectors, one horizontal and one vertical, to create a four-way reflection of the dancer. I then introduced difference and shimmer actors. I had difficulty using reflector actors. I wanted to use a four-way reflector (or possibly larger), but every time I introduced it, it would cut the output, and I could not make it reappear no matter what I adjusted. My workaround, therefore, was to use multiple reflector actors. I used a circular screen actor with two pulse/envelope generators to adjust the position and a wave generator to adjust the line spacing. I spent a long time attempting to introduce rendered objects, such as lines. I wanted them to reach across the screen in random patterns, and although I finally managed to get them to render in the correct form, I could not work out how to do this randomly. Do I need to set parameters on the random number generator that decides their beginning and ending x/y positions?
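On the closing question: in principle, yes, constraining the random number generator's range to the stage bounds should be enough, with one random value per coordinate drawn fresh for each line. A sketch of the idea (in Python; the 0-to-100 range is a stand-in for whatever units the lines actor actually expects):

```python
import random

def random_line(width=100, height=100):
    """Pick independent random start and end points within the stage
    bounds; each call yields a new line reaching across the screen."""
    start = (random.uniform(0, width), random.uniform(0, height))
    end = (random.uniform(0, width), random.uniform(0, height))
    return start, end

start, end = random_line()  # e.g. retrigger this on each pulse
```

In patch terms, that would be four random number generators (start x/y and end x/y), each limited to the stage range and retriggered by the same pulse that draws the line.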

Isadora Patch Documentation

Here’s a video of the two patches I worked on this week:

I had a strange issue with the Isadora key tonight: the program was considered unregistered even though the key was plugged in. It wouldn't let me record the stage for more than 5 seconds, so I've recorded it like this instead.

The first patch involves 4 separate sections, controlled using Trigger Delays and Trigger Values. The first is a moving landscape rotated using Spinner and then put through the Warp effect; Envelope Generators change how much the video is warped. The second is a video of what appears to be Las Vegas that fades in as an Envelope Generator decreases the intensity of the landscape. This video is also put through the Negative effect, attached to a Pulse Generator and a Random actor that cause the video to switch randomly between negative and positive images. I used Shapes, with another Random and Pulse Generator, to create randomly sized squares in random locations, and an Envelope Generator increases the frequency of this over time. A Random and Pulse Generator connected to a Colorizer and Shapes make the flashing square that ends this patch. Trigger Values of 0, 1, and 100 are used to initialize this and the second scene and start it playing.

The second scene uses a Slitscanner, with a square Wave Generator connected to the horiz/vert setting to create the cross effect, and several Trigger Delays to trigger additional changes to the video. A Displace combines the slitscanned video with another video. Three live-capture projectors with Difference and different horizontal displacements are faded in to show off some awkward dancing from whoever is in front of the camera. Eventually another video fades in, and Envelope Generators decrease the vertical displacements of every projector except the latest video, sliding them out of frame. Explode combined with a sine Wave Generator creates the final effect.

Below are photos of each patch. Neither fit completely on the screen so apologies if these photos are hard to read/barely coherent.

The first patch:

 

The second patch:

Footage and music were all obtained from YouTube.

Another 3 Examples of Video Art

Exodus by Jon Kessler is really interesting to me: it takes the relatively simple idea of creating a feedback loop with an iPhone and creates the effect of an endless tunnel of travellers spiralling down into who-knows-where, while also just going in circles. I like how sculptural this piece is, as well as how it openly shows viewers how it works, rather than hiding the process and technology that create the visual effects.

 

Girl X, a theater piece by Sugaru Yamamoto that I watched in February, has the projector embody and perform all of the characters that the two main characters interact with. The way the projector casts light on the performers, and how their shadows interact, is integral to the narrative of this play, showing how technology mediates a majority of the characters' lives. The projections are entirely scripted, composed mostly of text and blocks of color, but they still feel alive as characters send text messages through them to the protagonists, blocks of color box in the protagonists, and large text establishes locations. This is woven into a tale contemplating birth and youth. I thought this piece was an effective and inventive experiment at the intersection of projection art, performance, and storytelling.

Les Leveque is a video artist who uses Max/MSP to algorithmically re-edit films and TV footage to articulate arguments about those particular works. The resulting edits can vary from hypnotising to painful, and Backwards Birth of a Nation is probably one of his most physically painful to watch. He takes D. W. Griffith's Birth of a Nation and plays it backward, compressed and periodically inverting black and white, to impart an unsettling feeling to seemingly patriotic images and highlight the film's racist undertones. I find it interesting how Leveque's work creates meaning through patterns, and that it is able to speak by creating a range of experiences, including discomfort, though I never want to watch this video in full ever again.

 

Aaaaand a bonus example because I stumbled across it and it seemed worth sharing:

html_butoh has participants enact the “Global Top 500” websites by translating the functionality of each HTML tag into choreography. It was an open-for-participation video clip database that cycled through the 500 websites over 24 hours.

“In Butoh—a Japanese dance technique—the dancer “becomes” an image through her movements, which parallels to how a web browser scans through HTML and displays its content.” – Turbulence.org

3 Examples of Video Art

I selected “Do You Like Me Now?” because it is in line with my interest in combining art forms. I like the collaboration between video and dance. In film we are often told to “show, not tell”, and this video is a prime example of expressing emotion and telling a story without having to use dialogue. The video also uses the human body to excess, which I find very compelling to watch. It is treated like a dream sequence through the neon lights, ecstatic music, and quick editing cuts. I like it because the movement is satisfying to watch, the design is pretty, and I can translate the story from movement to speech (I know the secret).

I selected Puppet Parade – Interactive Kinect Puppets because it re-imagines the childhood game of shadow puppets. I like that it brings my childhood nostalgia and current interest in interactive technology together, while exhibiting its vast potential to be used in performance. The exhibit makes me ask what else could be done with this technology and this idea. It also demonstrates a far-reaching audience range. I think this video is one of the best examples of how interactive technology has the potential to bring in, and entertain, a more diverse guest list, whether 6 or 106.

I selected Forms Installation at the National Media Museum because I found the visuals and their accompanying sound ‘tasty’. The very nature of this video is tactile to me, whether it’s the flow and folding of the images or the clicking and clacking of the sound. I like how you can see the foundation of human movement in the video, but that it’s been taken a step further and, while complementing the human body, demonstrates the beauty of interactive design. It seems to be based on pre-recorded video, however; could this potentially be used in a live installation or performance?

Examples from Class 1