Tree of Life

For this assignment, I decided to project onto my clay tree model, since it looked like a decent surface to project onto and has an interesting shape. I found a shader on GLSL Sandbox that reminded me of outer space. After playing around with it a bit, I realized that I could shift the overall color scheme toward red, blue, or black. I also happened to find a glowing blue ball on GLSL Sandbox; it immediately occurred to me that I could use this blue orb as an abstract-looking alien. From there, I decided to project the outer-space background onto the tree and add a second, red alien. The position of each alien would change the color of the tree. So if there are no aliens in the scene, then the tree would be black:

If only a red alien comes into the scene, then the tree would turn redder and redder as the alien got closer to the tree:

If only a blue alien comes into the scene, then the tree would turn bluer and bluer as the alien got closer to the tree:

Finally, if both aliens come into the scene, then the tree would turn purple overall:
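The color rule above can be sketched as a small function. This is a minimal illustration, not the actual shader code: the struct and function names are my assumptions, and the distances are assumed to be normalized to [0, 1], with 0 meaning the alien is touching the tree.

```cpp
#include <algorithm>

// Hypothetical sketch of the tint rule: each alien contributes its color
// channel weighted by how close it is to the tree.
struct Tint { float r, g, b; };

Tint treeTint(float distRed, float distBlue) {
    // Closer alien -> stronger contribution of its color.
    float red  = std::clamp(1.0f - distRed,  0.0f, 1.0f);
    float blue = std::clamp(1.0f - distBlue, 0.0f, 1.0f);
    // No aliens near: black. Both near: red + blue reads as purple.
    return Tint{red, 0.0f, blue};
}
```

With both aliens far away the tint is black; with both close, the red and blue channels add up to purple, matching the four cases above.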

I struggled a lot when it came to implementing this project. First, projection mapping onto the tree was much more difficult than I expected. I initially used Aaron’s method of taking a photograph first and cutting my input according to the shape of the tree in the photograph. I was very careful to take the photograph of the tree from the same angle that the projector would project from. Before taking the photograph, I also projected a green rectangle onto the tree to see which parts of it were illuminated easily. However, after I cut the input and tried projecting this shape onto the tree, it ended up looking very different from the actual tree, so I repeated the process a few times. Eventually I just cut the shape directly according to what I physically saw on the tree instead of using the photograph. This made it much easier to get the mapping right, and it worked well for the trunk and the thicker bottom branches near it. It got very confusing very quickly when I reached the finer branches, which is also where I kept tripping up earlier with the photograph technique. Perhaps this was because the tree is very round, so the light landed in unexpected ways and the branches in my cut shape looked peculiar and bizarre compared to reality. At some point I decided to stop trying and made a mental note to choose a fatter object next time.

After projecting the shader onto the tree, I realized that you couldn’t see it very well on the tree itself. I then tried projecting a huge rectangle instead, and this looked nice, so the tree must simply have been too thin an object. My tree was standing on a box, and seeing the shader underneath the tree gave a very nice effect, especially if you think of it metaphorically as the tree’s foundation. So I decided to project some roots onto the tree as well.

For the inputs, I used the mouse position to control how red the tree is, as well as the location of the red alien. To add or remove the blue tint, I toggled between the “a” and “s” keys; the mouse then simultaneously controls how blue or purple the tree is. I also used keys 1–5 to change the x-position of the blue alien and keys 6–9 to change the y-position of the red alien. Ideally, with more time, I would use serial communication with Arduino to build more intuitive input controls.
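The key handling can be sketched roughly as below. All names here are illustrative assumptions, not the original code, and the positions are assumed to be normalized to [0, 1]:

```cpp
// Hypothetical sketch of the key controls: "a"/"s" toggle the blue tint,
// keys 1-5 set the blue alien's x-position, keys 6-9 the red alien's
// y-position.
struct SceneState {
    bool blueTint = false;
    float blueX = 0.0f, redY = 0.0f;
};

void handleKey(SceneState &s, char key) {
    if (key == 'a') s.blueTint = true;          // add blue tint
    else if (key == 's') s.blueTint = false;    // remove blue tint
    else if (key >= '1' && key <= '5')
        s.blueX = (key - '1') / 4.0f;           // spread 1-5 across [0, 1]
    else if (key >= '6' && key <= '9')
        s.redY = (key - '6') / 3.0f;            // spread 6-9 across [0, 1]
}
```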

Steering the Sea

By Nahil and Erika.

Idea

Our idea was inspired by two different touch installations. For the input of our project, we were inspired by Touch Me, created by Emily Biondo.

In her installation, two people have to wear gloves and touch each other’s gloves in order to affect the lighting in the surrounding area. We wanted to use a similar kind of interaction between users, but instead of using gloves, we wanted to create a common object that people could hold together in order to cause some sort of effect.

For the output, we were inspired by Robobble, a sphere whose shape can be manipulated from a smartphone by means of an inner, extendable and contractible skeleton with wide heads on it.

This project inspired us to also manipulate the shape or form of an object. We decided to take a piece of fabric lying flat on a raised surface and, using servos underneath, move parts of the fabric up and down in an attempt to simulate the motion of water. This reminded us of sailboats, so we decided to make the input object a ship’s steering wheel. Two people have to hold the steering wheel together in order for the water to move and make the sailboats on top of it look like they are actually sailing, showing the necessity of teamwork in setting sail. If only one person is using the steering wheel, the circuit does not complete, so the sea is motionless and the boats stop sailing. However, when two people each hold two differently colored handles of the wheel, the water and boats begin to move.

Our prototype is small, so right now it is possible for one person to cover the entire wheel with their palm and close the circuit, as you can see Erika doing in the video below. If the project were developed to its full extent, however, it would feature an enormous steering wheel, which would compel people to collaborate with each other to experience the voyage.

Implementation

The circuit consists of two human switches split across the eight handles of the steering wheel, so each open wire connects to two handles. We color-coded the steering wheel so that it’s easy for the user to understand which handles to hold. Initially, we thought of using aluminium foil as the conductor on our steering wheel, but we found the signals unreliable and switched to copper tape, which is much more conductive.

When both human switches are closed and current flows through both circuits, two small servos with popsicle-stick extensions are activated under the “sea” fabric, creating a wave-like effect. The waves continue only as long as the users hold on to the steering wheel. We hot-glued a few tiny paper sailboats to the fabric.
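The core logic can be sketched as follows. This is a simplified illustration, not our actual Arduino sketch: the function names and the sine-based wave motion are assumptions.

```cpp
#include <cmath>

// The servos should run only when BOTH human switches read closed.
bool seaShouldMove(bool circuitOne, bool circuitTwo) {
    return circuitOne && circuitTwo;
}

// Hypothetical servo angle for a gentle wave: oscillate around 90 degrees
// as time (in milliseconds) advances.
int waveAngle(unsigned long millisNow) {
    return 90 + static_cast<int>(30 * std::sin(millisNow / 200.0));
}
```

In the real sketch, each loop iteration would read the two switch pins and, only when both are closed, write the wave angle to the servos.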

The Arduino code can be found here.

Challenges

The servos are very fragile and difficult to work with, especially since in our case the servos have extensions that are supposed to move both fabric and paper boats. That part of the structure had to be adjusted and handled very carefully.

Another challenge was incorporating two human switches. Initially we tried making two gaps in one circuit, but this didn’t give stable enough readings. Then we tried to make two separate circuits and check that both are complete before activating the servos. Although we succeeded in wiring up a working prototype, there are still a couple of inconsistencies with how it is supposed to work. Firstly, the circuit is closed when one person holds the red handle and the other person holds the blue or yellow handles, because the red handle is directly connected to power. Secondly, the circuit works only if one person holds the blue and yellow handles and the other person holds the green and red, while it does not work if one person chooses the green and yellow handles and the other chooses the red and blue. This arises from the fact that each separate switch has to be closed for the circuit to be complete.

Squeeze Out Some Stress

Kiori’s workshop was a new but interesting experience for me. Coming from a “no-touch culture,” as Kiori termed it, the sensations arising from close proximity to others and their touches during some of the exercises were not something I had encountered much before. However, she led the workshop in such a way that it wasn’t nerve-wracking at all, and I learned to really focus on the smallest of sensations and the effects they had on me. As a result, I decided to use touch as my main input sense for this week’s projects. Considering that I am a very visual rather than auditory person, I decided to focus purely on audio as the output.

For the first part of the assignment, I used an FSR to detect changes in pressure, attached to a stress ball. I also used a piezo buzzer to produce sound and decided to play with frequency, since the average human can only hear between 20 Hz and 20,000 Hz. I mapped the sensor so that the harder you press, the higher the frequency, with the mapping running from 31 Hz (the lowest frequency an Arduino Uno can produce) to 20,000 Hz (the highest the average human can hear). The lightest touch produces an audible sound, but as you squeeze the ball harder, the frequency gets higher and more annoying, until you press so hard that you most likely cannot hear it buzzing anymore.
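The pressure-to-frequency mapping follows the same integer arithmetic as Arduino's map() function, reproduced here as a standalone sketch (the function names are mine, not the original code; FSR readings are assumed to span 0–1023):

```cpp
// Same formula as Arduino's built-in map(): linear rescale with
// integer (truncating) division.
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// FSR reading (0-1023) -> buzzer frequency (31-20,000 Hz).
long pressureToFrequency(long fsrReading) {
    return mapRange(fsrReading, 0, 1023, 31, 20000);
}
```

In the Arduino sketch, the result would be passed to tone() each loop iteration.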

For the second part of the assignment, I made a total of six pressure-detecting stress balls. Initially, I wanted to play six different audio files of annoying sounds in Isadora and change their frequency according to the detected pressure. However, after a lot of research, I realized that you can’t change the frequency of an audio file, so instead I changed the volume levels and speeds of the audio files in the sound players. One ball’s pressure affects the volume of one audio clip but the speed of another. The result is a stressful cacophony of sounds, which a few people together have to try to minimize. By default, while the balls are not being touched, the volumes are at their maximum level and the speeds are normal; increasing the pressure decreases the volume but increases the speed. The sounds I chose were a person clicking a pen continuously, a kid screaming, blinds rattling, a clock ticking, fingernails screeching on a chalkboard, and crickets chirping.
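The inverted volume mapping and the direct speed mapping can be sketched like this. The ranges here (volume 1 down to 0, speed 1 up to 3) are illustrative assumptions, not the exact values used for every clip:

```cpp
// Linear rescale in floating point.
float lerpRange(float x, float inMin, float inMax, float outMin, float outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// More pressure -> quieter (inverted range: output runs 1 down to 0).
float volumeFromPressure(float fsr) {        // fsr in 0..1023
    return lerpRange(fsr, 0.0f, 1023.0f, 1.0f, 0.0f);
}

// More pressure -> faster playback (1x up to 3x).
float speedFromPressure(float fsr) {
    return lerpRange(fsr, 0.0f, 1023.0f, 1.0f, 3.0f);
}
```

So an untouched ball plays its clip at full volume and normal speed, and squeezing pushes the two parameters in opposite directions.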

I did encounter a few other problems while working on this project. I couldn’t find plain stress balls, but I did find princess balls, which actually worked better, since they added to the irony behind my project.

Also, I was initially using a very small volume range of 0.53 to 2 for the kid screaming and a small speed range of 1 to 3 for each audio file. I was doing all my mapping in Arduino so that there would be less burden on Isadora, in the hope of keeping it from crashing so often. However, I found out after a while that you cannot get floats as a result from mapping. So I multiplied my range boundaries by 100, mapped the value, then divided the result by 100 in order to get the desired resolution for the mapping result. While doing the port setup in Isadora, I wrote value = float 4.2#, although I’m still unsure whether Isadora interprets the incoming information as floats, based on my observations of the numbers changing in the router outputs versus the Arduino serial monitor.

Finally, sticking the FSRs to the balls proved more challenging than I thought it would be. At first, I avoided soldering wires onto the FSRs, since I had read online that the heat can damage the sensor. So I tried female-headered wires, but they kept slipping off the FSR pins. Then I tried alligator clips, but ran into the same problem when attaching the FSRs to the balls. Eventually, I soldered wires onto the FSRs’ pins despite Adafruit’s warnings. After figuring that out, I tried to attach the FSRs to the balls by wrapping tape around the FSR’s stem and the ball, but Arduino then read maximum pressure constantly, so I had to resort to just putting tape behind the sensor and sticking it to the ball, which meant it wasn’t a very sturdy connection. I could have hot-glued the sensor to the ball, but I wanted the sensors to be easily removable so that other people could use them in the future.
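The scaled-integer workaround for Arduino's integer-only map() can be shown concretely. Using the 0.53–2 volume range from above (the helper names are mine), the bounds become 53 and 200, and dividing the mapped result by 100 recovers two decimal places:

```cpp
// Arduino-style integer map(): truncating division, whole numbers only.
long mapL(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// Map a 0-1023 FSR reading onto the volume range 0.53-2.00:
// multiply the bounds by 100 (53 and 200), map, then scale back down.
float volumeFromReading(long reading) {
    return mapL(reading, 0, 1023, 53, 200) / 100.0f;
}
```

This keeps all the arithmetic in integers on the Arduino side while still producing values with 0.01 resolution for Isadora.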

The code for this project is available here.

To Be A Sensor

What does it mean to be a sensor? In the world of microcontrollers and electronics, as well as the world of measuring tools like spring balances, it is simply the ability to detect a change in something in the environment. A basic sensor usually detects changes in only one aspect, or at most a few. It’s easy to use these devices to determine absolute and difference thresholds. The moment you see a number on a screen or display, it somehow feels precise, especially compared to holding or feeling an object or environment yourself and trying to describe its properties in your own words.

But in the world of humans and animals, it gets so much more complicated than that. We are able to sense so many different things at once without it necessarily becoming too confusing. We take the ability to sense things for granted so easily, until we lose one or more senses or try to focus all our attention on a specific one. Yet we are able to sense so much without realizing it. In Kiori’s workshop, we learned to do this and, as a result, became more aware of our own absolute and difference thresholds for touch and movement within our bodies, especially during the first exercise, as well as for the heartbeat and nervous or playful energies of our partners in the remaining exercises. Learning about the more human side of sensing during Kiori’s workshop inspires me to try to capture this the next time I want to use a sensor in a project.

Spirit Animal

Inspired by a graphic novel about tricksters that I found recently, I decided to focus on the fable of the tortoise and the hare for this assignment. I wanted to build a project that could show individuals which end of the spectrum they seem to be on. On the one hand, the hare tends to live a fast-paced life, while the tortoise takes its sweet time to accomplish its tasks. Thinking about pace, and to some extent stress levels, I decided to use a pulse sensor to detect a person’s heart rate. If a person has a slow heart rate, then they are currently in a state similar to that of the tortoise; if a person has a fast heart rate, then they are more like the hare at that point in time.

To build the project, I first focused on the Isadora side. To avoid clichés and make things more interesting, instead of videos of a hare and a tortoise I used a video of a hamster running quickly on a hamster wheel and a video of sea turtles drifting in the sea and going with the flow. I wanted the heart rate readings not only to show which animal you are more like, but also to what extent, and thus show a whole spectrum. To achieve this, I first checked whether the heart rate was above or below a certain threshold, 85 beats per minute. This determined which video would be displayed using the video mixer tool. I then added a Gaussian blur effect to the video. The heart rate reading was mapped onto a scale of 0 to 20 that controlled the size of the blur: if the hamster video was playing, the faster the heart rate, the more clearly you could see your animal; if the sea turtles video was playing, the mapping was reversed, so the slower your heart rate, the more clearly you could see the sea turtles and thus which animal spirit you were embodying.
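The threshold-plus-blur logic can be sketched as below. This is a simplified stand-in for the Isadora patch, not its actual actors: the function names and the linear mapping over an assumed 40–180 bpm range are my own.

```cpp
const int THRESHOLD_BPM = 85;   // above: hamster video; below: sea turtles

bool showHamster(int bpm) { return bpm >= THRESHOLD_BPM; }

// Gaussian blur size in 0..20. The hamster video gets clearer (less blur)
// as bpm rises above the threshold; the turtle video gets clearer as bpm
// falls below it, so each end of the spectrum sharpens "your" animal.
int blurAmount(int bpm, int minBpm = 40, int maxBpm = 180) {
    if (showHamster(bpm))
        return 20 - 20 * (bpm - THRESHOLD_BPM) / (maxBpm - THRESHOLD_BPM);
    return 20 - 20 * (THRESHOLD_BPM - bpm) / (THRESHOLD_BPM - minBpm);
}
```

At 85 bpm either video is at maximum blur, and the image sharpens as the heart rate moves toward either extreme.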

Here are some diagrams of how I set up the Isadora side of the project:

Here’s a photo of the circuit:

To see the code, go here.

Here’s a screen-capture video of the effect of the heart rate on the depiction of the animal videos. In this case, the person had a high heart rate initially, and the video shows it slowly coming to rest.