Week 10: Final Progress

This week I focused on content and interviewed my friends about the ways in which they connect (or feel like they don't connect) virtually, and which parts of their body they use to express themselves. Some of them said they leave their camera on just so that their professors "do not feel alone" or feel acknowledged, but the majority also felt very tired of Zoom and of constantly "being watched". This made me think that filters could actually be a good mask or 'comfort layer'. We would still see each other and our movements – in essence, our presence – but maybe we don't need to see each other's faces in full resolution, with every yawn and nose scratch, unless we speak…

I was also looking at this week's new shaders and particularly liked the masked mosaic delay (Example 22). I initially wanted to prerecord everything and insert my own video or animation as the texture to be masked, but with this example I liked how the face/body was made up of close-ups of its own parts, i.e. the area of the eyes was tiled with close-ups of the eyes, and the same goes for the mouth (do you know what I mean?). This made me think that it would actually be nice to keep that real-time experimentation, so that my final is not just a 'finished product' going out into the world but also an experimental experience for those interacting with the shader.
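To get the idea down concretely, here's a minimal sketch of how I picture that close-up mosaic working, in ShaderToy-style GLSL (iChannel0 as the webcam; the cell count and zoom factor are my own guesses, not the values from Example 22):

```glsl
// Hypothetical sketch of a close-up mosaic: each cell shows a zoomed-in
// crop of the video centered on that cell, so the eye region tiles into
// eye close-ups and the mouth region into mouth close-ups.
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = fragCoord / iResolution.xy;
    float cells = 12.0;                    // mosaic resolution (assumed)
    vec2 cellCenter = (floor(uv * cells) + 0.5) / cells;
    vec2 local = fract(uv * cells) - 0.5;  // position inside the cell
    float zoom = 0.15;                     // how tight each close-up is (assumed)
    fragColor = texture(iChannel0, cellCenter + local * zoom);
}
```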

I am looking forward to learning about today's class content, as it might allow me to take input from an actual Zoom meeting and pull it into my shader, which could refine my idea for the final presentation or performance.

Week 10: Final Project Proposal w/ Alia

After giving our final project more thought, we decided to change our idea. We kept the core of our previous concept (the restriction of movement) but incorporated it into a piece that strikes a balance between individuality and universality.

One way the pandemic has affected our connections is by restricting free movement and eliminating safe physical contact. For our project, a camera would detect the number of viewers on a screen and draw a thin green outline around each of them. When a viewer moves, the outline starts turning red, and distortions are added the more, and the faster, they move. Movement would also be met with loud beeps and alarm-like sounds – all things that communicate that movement is undesirable, making the viewer feel most comfortable when still. The camera would also detect when two people are too close together: the computer would produce an even louder and more disruptive sound, and both people's outlines would be filled in red.
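To make the movement-to-color mapping concrete, here is a rough GLSL sketch, assuming the previous frame is available in a second channel (the person detection, outline drawing, and sounds are not shown):

```glsl
// Hedged sketch: estimate per-pixel motion by frame differencing
// (iChannel0 = current frame, iChannel1 = delayed copy) and fade
// from green (still) to red (moving).
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = fragCoord / iResolution.xy;
    vec3 now  = texture(iChannel0, uv).rgb;
    vec3 prev = texture(iChannel1, uv).rgb;
    float motion = clamp(length(now - prev) * 5.0, 0.0, 1.0);  // scale assumed
    vec3 tint = mix(vec3(0.0, 1.0, 0.0), vec3(1.0, 0.0, 0.0), motion);
    fragColor = vec4(tint, 1.0);
}
```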

Week 10: Final Progress

The first step was to create the physical 'touch me' object and connect it to a Teensy board.

I used our old archives of information to set up the Teensy with a tilt switch. After I got it working, I began creating the physical object for the project. I decided on a pyramid, since it seemed like the most reasonable and intriguing shape. I initially wanted to build it out of something a bit more translucent, because I wanted an LED to light up to signify that the object had in fact been moved, but I only had opaque paper.

The board fits perfectly inside the pyramid, with only the wire sticking out, and I left room to secure and tape it once it is finalized. I tested it a couple of times just to make sure it was triggered when moved, and thankfully it was.

I also browsed Shadertoy some more to decide which shader I wanted to recreate for the final. I looked at old shaders I did, as well as ones Robert did.

Sorry for the messy table

Week 9: Shaders

Starting this week's assignment was hectic. I was experiencing so many MIDI-related issues on my Windows laptop that I couldn't even start the assignment for a long time. Luckily, the MacBook I ordered arrived, which meant I could finally begin. But it wasn't such a smooth ride after this either, as I had several errors when I tried to port the shader from Shadertoy into Atom. Eventually, after Aaron helped me out, I was able to do so successfully and start coding. Anyway, here's my list of holons:

Child, childhood, toys, Legos, games, music, Ariana Grande, pop, dance, exercise, pain, etc. The list could go on and on.

I noticed that a lot of the holons I came up with had something to do with childhood. Hence, with this assignment, I wanted to capture this idea through sounds and visuals. I used the 'Terrible Console Webcam' shader from Shadertoy, because I really liked how it looks. Here is a picture of how the shader looked initially:

To make the shader represent childhood more, I wanted to make myself look like a Lego figure, since this is a toy we all used to play with as children. So I proceeded to change the colors and make the pixel size larger, along with some other tweaks so the overall look suits my idea more. Unfortunately, I wasn't able to figure out how to successfully use MIDI to trigger sounds, so my assignment only consists of visuals. Ideally, I would have loved to accompany it with sounds from one of my previous Tidal assignments, where I combined different sounds of toys and the alphabet. But hopefully I can find my mistake soon. Here's the final video:
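For anyone curious, the gist of those two changes (bigger pixels, fewer colors) can be sketched like this; this is a minimal version of my own, not the actual 'Terrible Console Webcam' code:

```glsl
// Assumed sketch of the Lego look: sample the webcam (iChannel0) at the
// center of large blocks, then quantize the colors down to a few levels.
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    float blockSize = 24.0;                          // bigger = chunkier pixels
    vec2 center = (floor(fragCoord / blockSize) + 0.5) * blockSize;
    vec3 col = texture(iChannel0, center / iResolution.xy).rgb;
    col = floor(col * 4.0) / 4.0;                    // crush the palette
    fragColor = vec4(col, 1.0);
}
```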

Hopefully you can spot me in the video lol! I’m the red thing that’s moving side to side.

Week 9: Shaders and Tidal

The number of holons in my life is probably uncountable, but here's a list of some off the top of my head:

  • Daughter/sister
  • Major/minor
  • Room
  • Meals
  • Workouts
  • Heart
  • Brain
  • Body
  • Ideas/thoughts
  • Running
  • Clothes
  • Phone
  • Steps/Walking
  • Cake ingredients
  • Classes/zoom
  • Peers
  • Covid-19
  • This assignment
  • …. and the list goes on

As for the practical assignment, I played around with the different shaders and followed along with Aaron, duplicating the rings visual separately, using a sine wave to change how often the rings appear, and changing the speed. I also looked at some of the shaders on ShaderToy, but most of them were too complex and my browser would even crash.
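Roughly, the modification looked like this (a sketch from memory with assumed constants, not the exact class example):

```glsl
// Concentric rings whose frequency wobbles on a sine wave over time,
// with an adjustable speed.
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = (fragCoord - 0.5 * iResolution.xy) / iResolution.y;
    float speed = 2.0;                          // assumed
    float freq  = 20.0 + 10.0 * sin(iTime);     // how often rings appear
    float rings = 0.5 + 0.5 * sin(length(uv) * freq - iTime * speed);
    fragColor = vec4(vec3(rings), 1.0);
}
```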

I decided to work with visual 22, as I found it very interesting when I was watching the lecture. It reminded me a lot of how a JBL speaker bounces to a beat and splashes water placed on top of it, as can be seen here: https://www.youtube.com/watch?v=Oz2o6mfe2Fk&ab_channel=TechHive.

This is my final product:

I didn't really like the red color, so I decided to change it to one I like that would also contrast nicely with both the black and the white; I went with a blue-purple-pink transition. I also changed the circle size and switched around the mouseX and mouseY values. To continue with the JBL theme, I chose the techno sound and played around with its speed and gain. It also took me some time to synchronize the sounds and the beat as much as I could, and I believe the result was not bad.
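The color change itself boiled down to something like this (a reconstructed sketch; the exact color values are assumptions):

```glsl
// Blue → purple → pink transition driven by time, replacing the original
// red; in the real shader this color multiplies the circle pattern.
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec3 blue   = vec3(0.2, 0.3, 1.0);
    vec3 purple = vec3(0.6, 0.2, 0.9);
    vec3 pink   = vec3(1.0, 0.5, 0.8);
    float t = 0.5 + 0.5 * sin(iTime);           // sweeps 0 → 1 → 0
    fragColor = vec4(mix(mix(blue, purple, t), pink, t * t), 1.0);
}
```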

For some reason, I kept getting random errors throughout: some in the browser, where the terminal just disconnected, and some while setting up SuperCollider.

Week 9: Final Project Proposal w/ Omar

For the final project, Omar and I will be working together, so we came up with an idea that is personal to both of us, relating to Egypt, where we were both born and brought up.

Concept: An interactive map of Egypt that would work in one of two ways (we still haven't decided): the first uses the mouse location on the screen, and the second uses the camera and your hand. It starts off by displaying a map of Egypt with three major locations that we've selected (Cairo, Alexandria/North Coast, and Luxor & Aswan). The viewer chooses an area, and the map starts playing sounds and images of that place. Within the chosen area, the viewer's movement would also be tracked to change the images and sounds displayed; the four corners are different, whereas the center would display all four simultaneously. We will try to incorporate sounds and images that we have recorded ourselves.

Execution: After the viewer has picked their destination (one of the major locations), the computer's camera would use color detection to track the hand position of the viewer, who would be instructed to wear a glove in a bright, distinct color. The x and y coordinates of the detected hand would be used to control the gain of the sounds and the opacity of the images. For example, the x and y values for the middle of the screen would set the opacity of everything at 50% and play all the sounds at half gain, whereas the x and y values at the top-right corner would set the gain and opacity to a set maximum (e.g. 1.5 gain and 100% opacity) for one set of image and sound and to 0 for the other three sets. We're still not set on exactly how the viewer would pick a destination; we were thinking of having the map as a constant backdrop, with the overlaid images and sounds differing based on which location the mouse is placed on, but we are drawn to doing something more demanding in terms of body motion.
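As a sanity check of the corner logic, here is a hedged GLSL sketch of the four-corner blend, with the mouse standing in for the tracked hand (the channel assignments are assumptions). Note that a plain bilinear blend gives each layer a quarter weight at the center, so the 50%-everything behaviour we described would need its own scaling:

```glsl
// Bilinear four-corner mix: each corner holds one image (iChannel0..3)
// and the hand/mouse position weights them; the sound gains would
// follow the same four weights.
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = fragCoord / iResolution.xy;
    vec2 p  = iMouse.xy / iResolution.xy;   // stand-in for the detected hand
    fragColor = texture(iChannel0, uv) * (1.0 - p.x) * (1.0 - p.y)
              + texture(iChannel1, uv) * p.x * (1.0 - p.y)
              + texture(iChannel2, uv) * (1.0 - p.x) * p.y
              + texture(iChannel3, uv) * p.x * p.y;
}
```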

Week 9: Final Project Idea

For my final project, I want to focus on how one's body interacts with the environment it is in, specifically my room. During the pandemic, my room has become a 'safe space' for me, and I have almost developed a personal connection with it, because I have had no other choice but to be there most of the time due to social distancing regulations. I feel as though a lot of other people can relate to this as well. My room as a whole (holon xd) makes me feel comfortable in a way no other space does. However, each place in my room makes me feel a different way. For example, my bed makes me feel relaxed. I feel more tense near my desk because it reminds me of how much homework I have to do. My kitchen makes me feel excited, because it reminds me of food. I want to capture all these feelings through an interactive performance that tells the audience a story of my experience in my room, specifically during the pandemic.

What: During my performance, I want to move around my room and trigger different sounds and visuals that vary based on my location. Depending on where I am, specific sounds will be triggered, along with my voice narrating a story about that part of the room. I also want to use shaders to amplify how I feel in each part of the room, and I will be using my body to depict my emotions during the performance.

How: Since I need to be able to detect my position in the room, I think a GridEye will be the appropriate equipment to achieve this effect. I will also use Atom, Shadertoy, and TidalCycles for the sound and visual elements of the interactive performance.
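Until the GridEye pipeline exists, the mapping from room position to visuals could be prototyped in a shader with the mouse standing in for the sensor reading (the zones and colors below are placeholders I made up):

```glsl
// Position-dependent visuals: iMouse stands in for the future GridEye
// position; each quadrant of the room gets its own tint, which would
// later be paired with its own sounds and narration.
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv  = fragCoord / iResolution.xy;
    vec2 pos = iMouse.xy / iResolution.xy;        // normalized room position
    vec3 tint = (pos.x < 0.5)
        ? ((pos.y < 0.5) ? vec3(0.3, 0.5, 1.0)    // bed: calm blue
                         : vec3(1.0, 0.4, 0.3))   // desk: tense red
        : ((pos.y < 0.5) ? vec3(1.0, 0.8, 0.3)    // kitchen: excited yellow
                         : vec3(0.6, 1.0, 0.6));  // elsewhere: neutral green
    fragColor = vec4(texture(iChannel0, uv).rgb * tint, 1.0);
}
```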

Week 9: Heat Death and Holons

Disclaimer: this will sound crazy. It is not to be taken as any form of truth; I don't know entirely what I believe, but I think what is said below is a nice way for me to view the origin and destruction of life.

Holons: Atoms, Bacteria, Cells, Hydrogen, Water, Humans, Light, The Universe.

One of the things I truly enjoy about this class, especially as a CS major, is the lack of a rigid structure. Most of my CS coding assignments follow some form of rubric and are dictated heavily by constraints on performance, efficiency, and algorithmic design, but, most frustratingly, by the necessity to exclude self-expression (I know that is not entirely true and there are ways you can express yourself, but let's ignore the nuance in this scenario).

I really felt like I was able to express myself in this assignment, although I must admit the cliché in the way I expressed it (and even more so in this entire post). I have been a lot more spiritual over the past 4-5 months, ruminating on the entire idea of death, life, the universe, and what comes after death. Last week's reading (listening) about the concept of Holons really stayed with me because it aligned with many of these topics of inner dialogue and thought.

The idea that everything, including us, is not simply a whole but, most importantly, a part of a larger whole coincided perfectly with my newfound belief about what life and death are. I have been thinking that life, and everything within it and within the universe, is possibly just some other form of energy. One of the most basic, and possibly most important, concepts in physics that we are taught from a young age is the conservation of energy: "Energy can't be destroyed or created, only transferred." We are all different forms of energy that, once we die, are transferred into other forms. We break down in the same way Holons are described to break down: they decompose into smaller Holons. We become what we have always been – a collection of millions or billions of Holons and different forms of energy – and we are cast back into the universe to become something else. We might not necessarily become "living" again, but an inanimate thing. And once that starts to decompose again, our elements are converted into another form of energy, continuing the never-ending process of rebirth.

My shader assignment for this week was an attempt to convey that: the universe is born, everything else is created, then us, then we die, the universe rebirths us in a different form, and the process continues for eternity.

P.S. Sorry for the terribly long rambling post. Hope this makes a little bit of sense to anyone else (if not, well, celebrate the fact that at least you guys don't need mental/medical help lmao)

Week 9: Final + Holons

First of all, forgive me for my post last week where I kept spelling it ‘wholeons.’

Here is my list of holons, some big, some small, some random.

student – daughter – sister – friend – mentor – role model – music – guitar – art – mental health – community – identity – laughter – TikTok – dancing – joking – driving – roads – food – coffee – malls – dogs – family – friends – student – university – immigrant – curly hair – height – travel – drawing – arts and crafts – scrapbooks – voice notes – FaceTime – Zoom – Twitter – activism – Muslims – LGBTQ – BIPOC – environment – counsellors – rainbows – colours – sneakers – UAE – hair care – skin care – parks – walks – Africa – clothes – lyrics – harmonies – pandemic

For the final project, Robert and I decided to work together. We have two ideas of what we could possibly do, though we are both leaning towards the second.

IDEA 1:

SETUP: A screen and a camera are set up, and as people appear on the screen, their bodies display different visuals. Their outlines make it apparent that it is them, but their bodies show a random visual: some could be glowing, some could be scribbles, some could be blank, some could be a heart beating, some could be hollow.

MESSAGE: That you can never know what is going on inside someone's head. The visuals could be completely random, in a way where the audience wonders what they may get. There is also a reference to people with invisible disabilities such as autoimmune disorders (e.g. the heart beating), to show that you cannot know what may be going on just by looking at someone from the outside.

IMPLEMENTATION: This could use PoseNet, as well as shaders, a human presence sensor, and music that reflects whichever visual is triggered, to really set the tone and make clear what emotion we are conveying.
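A hedged sketch of the compositing step for Idea 1, assuming a body mask arrives in a second channel (PoseNet gives keypoints rather than masks, so something like BodyPix would likely be needed for this part):

```glsl
// Show the camera normally outside the person and replace the body's
// interior with a generated visual; iChannel0 = camera, iChannel1 =
// body mask (white inside the person). All names here are assumptions.
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = fragCoord / iResolution.xy;
    float mask = texture(iChannel1, uv).r;
    // a simple animated "scribble" standing in for the random visuals
    float scribble = 0.5 + 0.5 * sin(uv.x * 60.0 + sin(uv.y * 40.0) * 3.0 + iTime * 4.0);
    vec3 camera = texture(iChannel0, uv).rgb;
    fragColor = vec4(mix(camera, vec3(scribble), mask), 1.0);
}
```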

IDEA 2: 

SETUP: An 'artwork' is set up that says 'touch me to watch.' Once someone triggers it, a video or visual starts playing, and we expect the audience member to watch it in its entirety. While they watch, we will attempt to track whether they touch their face; if they do, we will take a picture of them in action. At the end of the fake video, the photos will be displayed, along with the number of people who touched the object before them.

MESSAGE: To make people aware of how many times they may unconsciously touch their face, and how many germs they could be picking up by touching this one thing.

IMPLEMENTATION: This could use a Teensy capacitive touch sensor for the trigger, as well as PoseNet to track them touching their face. Sounds and visuals could be used as the distraction while we test whether they touch their face.

Concerns:

How can we create this in a way that it can be presented in a virtual setting too?

Week 9: Shadertoy

Scrolling through the shader examples on ShaderToy was a lot of fun, and I stumbled upon one that I immediately knew I wanted to do. It was split into four screens, where each screen had a different filter or effect.

Was trying to show all the effects, excuse my pose

I wanted to match a sound to each filter, triggered either with my body (difficult) or with my mouse (more plausible). Unfortunately, I faced many obstacles along this journey.

My first issue was that my code refused to show on the localhost. I changed it, refreshed, and restarted the terminal multiple times, and nothing changed from the initial code. I tried creating multiple files, and that didn't work either. Finally, thanks to Robert, who told me to try Cmd+Shift+R to force-refresh the page, it loaded. Unfortunately, I then kept getting a white screen, and no errors were shown on the console. I kept going over my code again and again, trying to figure out what went wrong. I tried multiple times, restarted Atom and my laptop, and tried the iMac, only for them all to fail. I tried changing, editing, and trying out new code to see what the bug was, but nothing worked. Here was my code; see if you can spot my error.

Code that didn't work

Finally, I gave up and went browsing the website again. I picked another example to try, only to fail again. I must have ported at least three different shader examples and managed to fail every single time. I was beyond frustrated. Here's another example of code that didn't work (the same error).

In case you didn't catch it, my error was that I accidentally typed gl_FragColor instead of gl_FragCoord in one of the lines, which affected the whole code! A moment of silence for the (literal) four and a half hours I wasted because of this…
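For context, the wrapper used when porting from ShaderToy usually looks something like this, which is exactly where the two similarly named built-ins are easy to mix up (a generic sketch, not my actual file):

```glsl
// ShaderToy's mainImage gets wrapped in a plain main(): the output
// pixel is gl_FragColor and the input pixel position is gl_FragCoord.
// Swapping one for the other breaks the whole shader.
void main() {
    mainImage(gl_FragColor, gl_FragCoord.xy);
}
```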

Once I fixed that, I still needed a bit of debugging, because the images looked different: (1) they were all upside down, (2) the third one wasn't working, and (3) the black and white colours were inverted (the usual fixes for 1 and 3 are sketched a little further down).

bugs in the code

For some reason, I couldn't find a way to fix this, so I decided to create a version of it that I liked.

Final outcome of this shadertoy
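Looking back, the usual one-line fixes for the upside-down images and the inverted colours would have been something like this generic sketch (assumed; I never got them working in my particular port):

```glsl
// Flip the y coordinate to un-flip the image, and subtract the color
// from white to undo a black/white inversion.
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = fragCoord / iResolution.xy;
    uv.y = 1.0 - uv.y;                           // fix upside-down image
    vec3 col = texture(iChannel0, uv).rgb;
    fragColor = vec4(vec3(1.0) - col, 1.0);      // fix inverted black/white
}
```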

Here's another one of the many I did:

another example

After that, I tried to add music controlled by the mouse, as I initially wanted, but I really struggled with how to do this. I tried duplicating the mosaic example and replacing its effects with mine, but that didn't work. I tried incorporating new code into one of the ready-made examples, but they seemed too different. I just wasn't understanding how to switch from this format to one where it is controlled by MIDI.

Overall, I am disappointed that I didn't achieve what I initially wanted, but happy to say that I really practiced porting Shadertoy shaders!