My room, my haven

We have all developed personal relationships with our rooms during the COVID-19 pandemic. During quarantine periods, we spent most, if not all, of our time in our rooms, thinking, moving and feeling different things in our personal space. My room became a safe haven for me, because I knew that no matter what happened, whether I was fearful of the pandemic or experiencing social anxiety, I could always go back to my room and keep myself company. As I spent more time in my room, my mind started to divide it up into parts – holons. Each section of my room evoked different emotions within me. For my final project, I wanted to capture this feeling, as I believe we can all relate to it, especially because of the pandemic.

What: ‘My room, my haven’ is a project that highlights how each part of my room makes me feel. It combines visuals and sounds into one interactive piece. I divided my room into four main sections: the bedroom section, the kitchen, the cosmetics section and the workspace. Each section has an accompanying visual and sound that is triggered once I move within it, and the sounds grow louder the more I move in a section.

How: I used frame differencing, shaders and TidalCycles to implement this project. I browsed Shadertoy and chose four shaders that I thought represented each section of my room nicely, then made the appropriate adjustments to them in Atom. After this, I combined the four shaders into one shader file and used if/else statements to place each shader on the appropriate part of the screen, with a final else branch so that the unused parts of the screen simply show the webcam. Next, I used frame differencing in the sketch.js file to detect pixel movement in the specific regions where the shaders were placed. If the movement in a section exceeds a threshold chosen by the user, a boolean is set to true and sent as a uniform value to the fragment shader, which turns that section’s shader on. I also used setTimeout() to start a one-second timer, so a shader stays on for only one second unless there is more movement in its section. Finally, I used OSC to send values from sketch.js to the TidalCycles file containing the sounds, so that I could control the sound based on my movement.
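To make the movement-detection step concrete, here is a minimal sketch of the frame-differencing logic. This is illustrative code only – the names, threshold and section layout are made up for the example, not copied from my actual sketch.js:

```javascript
// Average absolute pixel difference inside one rectangular section
// of a grayscale frame stored as a flat row-major array of width w.
function sectionMotion(prev, curr, w, x0, y0, x1, y1) {
  let total = 0;
  for (let y = y0; y < y1; y++) {
    for (let x = x0; x < x1; x++) {
      const i = y * w + x;
      total += Math.abs(curr[i] - prev[i]);
    }
  }
  return total / ((x1 - x0) * (y1 - y0)); // average change per pixel
}

// Turn a section's shader on when motion exceeds a threshold, and
// schedule it to turn off again one second later (like my setTimeout).
const THRESHOLD = 10; // illustrative value; mine was chosen by hand
const sectionActive = [false, false, false, false];
const timers = [null, null, null, null];

function updateSection(i, motion) {
  if (motion > THRESHOLD) {
    sectionActive[i] = true; // this boolean is what gets sent as a uniform
    if (timers[i]) clearTimeout(timers[i]);
    timers[i] = setTimeout(() => { sectionActive[i] = false; }, 1000);
  }
  return sectionActive[i];
}
```

Each section’s boolean would then be passed to the fragment shader as a uniform every frame, and a similar value could be sent over OSC to drive the sound.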

Evaluation: I really enjoyed working on my final project! I learnt a lot by coding and experimenting with different shaders and sounds in TidalCycles. I am very satisfied with the result because it fulfils my original vision of what I wanted it to be. It was very difficult for me to port the first shader from Shadertoy, because it required version 300 of GLSL, but after fixing the resulting errors, I got the hang of it. I also found it difficult to use frame differencing to trigger each section of the screen, but after receiving help from my Professor, I was able to complete this. Overall, I believe my final project was successful, and I look forward to sharing my work with the rest of my class!

Here’s a recording of me ‘performing’ my final project in my room:

Week 11: Final Progress

This week, I focused on how I would implement my final project. I initially planned to use the GridEye to detect my location in my room. Then Professor suggested that I use BodyPix and blob detection; however, this made my computer run very slowly. After more brainstorming, I decided to use frame differencing instead. I would split my room into four sections, and each section would have its own shader and corresponding sound, triggered when I move within it. This week, I focused on choosing the appropriate shaders and making sure they work in Atom. I got several errors, but after spending hours on Google and asking Professor for help, I was able to fix them and successfully run the shaders in Atom. Moving forward, I will focus on creating the sounds using TidalCycles, then using frame differencing to tie everything together. I look forward to seeing how everything turns out in the end!

Week 10: Progress

I started preparing for my final project by planning which parts of my room I want to include. My room doesn’t have much in it: it consists of my bed area, my ‘kitchen’, my work area and my cosmetics section. Since these are the four main areas in my room, I have decided to use four different shaders and four different sounds that will be triggered depending on which part of the room I am in. I want the shaders to be the mask kind, where the shader appears inside my body: the shader will be on my body while the rest of the room remains visible. Each shader will represent the way I feel standing in each part of my room. For my bed area, I want a very relaxing shader and very calming sounds to show how relaxed I feel there. For my kitchen, I want the shader to be more vibrant and the sounds energetic. For my work area, I want the shader to be duller and the music tense, to demonstrate how I often feel exhausted in this area, especially during the pandemic. For my cosmetics section, I want the shader to be more creative. I am still unsure what method I will use to achieve my idea, but I am leaning towards using the GridEye to detect where I am in the room. I look forward to working more on my final project and implementing it in my room.

Week 9: Shaders

Starting this week’s assignment was hectic. I was experiencing so many issues with MIDI on my Windows laptop, which meant I couldn’t even start the assignment for a long time. Luckily, the MacBook I ordered arrived, meaning I could finally begin. But it wasn’t such a smooth ride after this, as I had several errors when I tried to import the shader from Shadertoy into Atom. Eventually, after Aaron helped me out, I was able to do so successfully and start coding. Anyway, here’s my list of holons:

Child, childhood, toys, legos, games, music, Ariana Grande, pop, dance, exercise, pain etc. The list could go on and on.

I noticed that a lot of the holons I came up with had something to do with childhood. Hence, with this assignment, I wanted to capture this idea through sounds and visuals. I used the ‘Terrible Console Webcam’ shader from shadertoy, because I really like how it looks. Here is a picture of how the shader looked initially:

To make the shader represent childhood more, I wanted to make myself look like a Lego figure, since Lego is a toy that we all played with as children. So, I proceeded to change the colours and make the pixel size larger. I also made some other changes to the overall look so it suits my idea better. Unfortunately, I wasn’t able to figure out how to successfully use MIDI to trigger sounds, so my assignment only consists of visuals. Ideally, I would have loved to use sounds from one of my previous Tidal assignments, where I combined different sounds of toys and the alphabet, to accompany this assignment. But hopefully I can find my mistake soon. Here’s the final video:
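For anyone curious, the ‘bigger pixels’ effect essentially snaps every coordinate down to the corner of its block, so every pixel inside a block samples the same source pixel. Here is a tiny illustrative JavaScript sketch of that idea (not the actual shader code, which does this per-fragment in GLSL):

```javascript
// Pixelate a flat row-major grayscale image of size w x h by sampling
// only the top-left pixel of each blockSize x blockSize block.
function pixelate(src, w, h, blockSize) {
  const out = new Array(w * h);
  for (let y = 0; y < h; y++) {
    for (let x = 0; x < w; x++) {
      const bx = Math.floor(x / blockSize) * blockSize; // block corner x
      const by = Math.floor(y / blockSize) * blockSize; // block corner y
      out[y * w + x] = src[by * w + bx];
    }
  }
  return out;
}
```

Increasing `blockSize` is what makes the ‘pixels’ look larger, which is the adjustment I made to get the Lego look.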

Hopefully you can spot me in the video lol! I’m the red thing that’s moving side to side.

Week 9: Final Project Idea

For my final project, I want to focus on how one’s body interacts with the environment it is in. I want to focus specifically on my room. The reason is that during the pandemic, my room has become a ‘safe space’ for me, and I have almost developed a personal connection with it, because I have had no other choice but to be there most of the time due to social distancing regulations. I feel as though a lot of other people can relate to this as well. My room as a whole (holon xd) makes me feel comfortable in a way no other space does. However, each place in my room makes me feel a different way. For example, my bed makes me feel relaxed. I feel more tense when I am near my desk because it reminds me of how much homework I have to do. My kitchen makes me feel excited, because it reminds me of food. I want to capture all these feelings through an interactive performance that will tell the audience a story of my experience in my room, specifically during the pandemic.

What: During my performance, I want to move around my room and trigger different sounds and visuals, which will all vary based on my location in the room. Depending on where I am, specific sounds + my voice narrating a story about where I am in the room will be triggered. I also want to use shaders to amplify how I feel in each part of the room. Additionally, I will be using my body to depict my emotions during the performance.

How: Since I need to be able to detect my position in the room, I think a GridEye will be the appropriate equipment to achieve this effect. I will also use Atom, Shadertoy and TidalCycles for the sound and visual elements of the interactive performance.
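As a rough sketch of how the GridEye idea could work (purely hypothetical at this stage – the function name and quadrant layout are my own): the sensor returns an 8×8 grid of temperature readings, so averaging each quadrant and picking the warmest one would give a crude guess of which part of the room I am in.

```javascript
// Guess which quadrant of the room is occupied from an 8x8 GridEye frame
// (64 temperatures, row-major): average each quadrant, return the warmest.
// Quadrant indices: 0 = top-left, 1 = top-right, 2 = bottom-left, 3 = bottom-right.
function warmestQuadrant(frame) {
  const avgs = [0, 0, 0, 0];
  for (let y = 0; y < 8; y++) {
    for (let x = 0; x < 8; x++) {
      const q = (y < 4 ? 0 : 2) + (x < 4 ? 0 : 1);
      avgs[q] += frame[y * 8 + x] / 16; // 16 cells per quadrant
    }
  }
  return avgs.indexOf(Math.max(...avgs));
}
```

The returned quadrant index could then select which shader and which TidalCycles sound to trigger.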

Week 8: Upside Down

This week, my emotions ranged from sadness to fatigue to stress to confusion. Ultimately, I just felt overwhelmed, to the point where I felt like my brain was upside down. I wanted to reflect this in my work, so I used the delay shader. I really liked how every movement was delayed on the screen, each in a different colour; it almost looked as if everything was in slow motion. This reflected how I was feeling throughout the week – like everything was slowing down. I also used my body to embody my emotions.

I struggled a lot with this assignment, because every attempt to change the code and make it my own resulted in errors. Also, most of the time, changes I made to the code were not shown on the server. After much trial and error, I was finally able to get it to work. I really enjoyed this task despite all the obstacles I faced, and I look forward to practising shaders more in future assignments!

Week 8: A Brief History of Everything

Ken Wilber introduces a very interesting idea that made me reframe how I view life and its components. In this reading (listening), Wilber first defines holons. Holons are whole/parts: they are neither wholes nor parts, but wholes that make up part of a larger whole. This sounds confusing at first, but with a deeper understanding of the idea, it starts to make much more sense. When you think about it, everything is a holon. Take your room, for example. My room comprises a bed, which is a part of my room but also a whole made up of smaller parts, such as my blanket, bed sheet, pillow and so on. My room itself, though a whole, is also part of a larger whole – the building. This applies to anything and everything. Often, when we think about what makes up life, the universe, the ‘cosmos’ as Wilber calls it in this reading, we are taught to think of everything as made up of atoms. But this reading (listening) made me think of life in a different way, where everything makes up a part of a larger whole despite being a whole itself. Wilber mentions how thinking of the cosmos as simply being made up of atoms is reductionist, as it does not take into account the mind/spirit. Wilber also talks about how a holon is defined by its ability to maintain its autonomy while simultaneously remaining a part of a larger whole.

Within the field of interactive art, this idea that everything is a holon could be applied when creating artwork. The art installation, the participant and the artist are holons that are made up of smaller parts (for example, technology, move-think-feel and creativity respectively), but also contribute towards the larger whole that gives the artwork its meaning. Now, when completing the weekly projects, I will keep in mind the idea that everything is a holon, so that my work becomes more meaningful and each part of the artwork contributes to a larger whole.

Week 7: Shaders

This week’s assignment was hectic, but fun! I started off by watching the videos, but encountered several errors along the way. After spending so much time trying to fix the errors (which I was convinced were due to my faulty laptop), I decided to go to the mall to purchase a brand new MacBook. Unfortunately, there were none in stock, and I decided that the universe just hates me. After staring at my faulty laptop for a long time, I was finally able to solve the errors (with trial and error) and proceed with the assignment.

I decided to choose the “dilated shapes” example, as I really liked how colourful it was, and I loved the multiple outlines created around me once I moved. While playing with this example, I also noticed that clicking on the screen caused the outlines to move more quickly. I decided to dance in front of the screen as if I was at a disco party, because that is the overall vibe I got from the visual. I also danced with my friend in the video to show how two people interact with the interactive art.

To make the example more my own, I changed some of the colours, and changed “2D” to “3D” everywhere in the code to see how the end result would look different. I hope you enjoy the dancing from my friend and me!

Week 6: What is Somatics?

Thomas Hanna defines Somatics as the study of the ‘soma’, which is the human body perceived from a first-person perspective. The soma is different from the body not because the subject viewing it is different, but because the lens through which it is viewed varies. Hanna mentions how the first-person viewpoint, as opposed to the third-person viewpoint, is immediately factual, and does not need to be proven correct through ‘universal laws’. Hanna uses the study of substances such as rocks or minerals as an example: when scientists study these, the science is completely backed up by the results they achieve and the calculations they do. However, when studying human beings, there is both the third-person view that the scientists see and the first-person view that the person being studied has.

Understanding Somatics

Hanna discusses how, in order to fully understand somatics, we need to first recognise that somas are not bodies. Furthermore, somas are not only self-aware, but are actively engaged in the process of self-regulating and acting upon themselves. Human beings, for example, are never passive: we are always sensing and moving, even when we are being observed. This links to the idea of thinking-moving-feeling that we have discussed previously, where viewers of interactive art become participants because they begin to act upon what they see; even the people who choose not to participate in the art are still acting, and are not passive.

Consciousness and Awareness

One interesting topic that Hanna brings to light is the idea that consciousness is not a fixed lens, but rather a learned skill that springs into action once we encounter external stimuli. I was very intrigued by this idea, as I have always thought of consciousness as something that all human beings have in similar amounts (though before reading this, I never thought of consciousness as something that could be counted).

Week 6: GridEye

This week’s assignment was very enjoyable for me, especially because I was able to fix the errors I encountered by myself (I think I might be getting the hang of everything!). This task was very similar to frame differencing, except this time there were 63 boxes that I could trigger. In Atom, however, I was only able to trigger 7 sounds, because adding more created a glitching sound effect that wasn’t very pleasing to the ear.

The most difficult part of this assignment was trying to trigger individual sounds, because moving just slightly caused several boxes to be triggered at the same time. To tackle this, I used mostly my arms to trigger the sounds. Unfortunately, I was not able to screen-record my movements because my laptop did not capture the video. The boxes I tried to trigger were: 0, 5, 15, 21, 32, 45 and 52. Hopefully, it is evident in the video which sounds are played when the specific boxes are triggered.
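One idea for taming the multiple-trigger problem in code, rather than with careful arm movements (this is a sketch of an alternative approach, not what I actually used): when several boxes cross the threshold at once, fire only the single strongest one.

```javascript
// Given one reading per box, return the index of the strongest box above
// the threshold, or -1 if no box crosses it. Firing only this box keeps a
// slight movement from triggering several sounds simultaneously.
function strongestBox(readings, threshold) {
  let best = -1;
  let bestVal = threshold;
  for (let i = 0; i < readings.length; i++) {
    if (readings[i] > bestVal) {
      best = i;
      bestVal = readings[i];
    }
  }
  return best;
}
```

The winning index could then be looked up against the chosen boxes (0, 5, 15, 21, 32, 45, 52) to decide which of the 7 sounds to play.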

For this week, I wanted to use TidalCycles to create ‘music’ or sounds that reminded people of childhood. So, I searched for sounds that could be associated with children, such as the alphabet, numbers and toy sounds. However, when I played the sounds together, it was actually more haunting than I intended. Due to this, I decided that I could experiment with the sounds in a different way. I adjusted each of the sounds so that when played together, it sounded like a bunch of malfunctioning toys.

I can imagine this being used in a children’s toy shop. Perhaps as a child moves through the room and approaches a specific toy, then specific sounds are triggered. I guess this could be useful during the pandemic as it would eliminate the need to touch a toy in order to trial it.