Asclepias – Final Project – Grace

Below is documentation of my final project:

In terms of setup since last time, I managed to get my two CAP1188 capacitive touch sensors connected via I2C, changing the address of one of them by connecting its AD pin to the 3Vo pin.

I ran wires from each of the CAP1188 sensing pins out to the triangles on my structure. I had issues getting the sensors to register touches when these wires were soldered to a protoboard, so I connected them directly to the pins on the CAP1188s with a solderless breadboard, organising the wires using a piece of cardboard with holes punched into it. The CAP1188s were wired as per the Arduino website's instructions. The Arduino had a serial connection to an Isadora patch, sending it numbers between 1 and 14 corresponding to the numbering of the triangles. My patch would generate a square particle of a particular size and location corresponding to the number received from the Arduino. Seen all together, the particles effectively make a 3×5 grid; the repetition of certain x and y values made it tidier to assign a location to each sensor number. I sent this screen via Syphon to MadMapper, and mapped each particle location to the corresponding triangle on the structure.
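
For reference, here is a minimal sketch of that Arduino side, assuming the Adafruit_CAP1188 library; the 8/6 split of sensing pins across the two sensors and the triangle numbering are illustrative (the full version is linked below):

#include <Wire.h>
#include <Adafruit_CAP1188.h>

// Two CAP1188 breakouts share the I2C bus. The second has its AD pin
// tied to 3Vo, which moves its address from the default 0x29 to 0x28.
Adafruit_CAP1188 capA = Adafruit_CAP1188();
Adafruit_CAP1188 capB = Adafruit_CAP1188();

void setup() {
  Serial.begin(9600);
  if (!capA.begin(0x29)) while (1); // first sensor, default address
  if (!capB.begin(0x28)) while (1); // second sensor, AD tied to 3Vo
}

void loop() {
  // touched() returns an 8-bit mask, one bit per sensing pin
  uint8_t a = capA.touched();
  uint8_t b = capB.touched();
  for (uint8_t i = 0; i < 8; i++) {
    if (a & (1 << i)) Serial.println(i + 1); // triangles 1-8
  }
  for (uint8_t i = 0; i < 6; i++) {
    if (b & (1 << i)) Serial.println(i + 9); // triangles 9-14
  }
  delay(50);
}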

You can find my code HERE
My Isadora patch

You can see the MadMapper setup here, with the square particles and the corresponding structure locations:

 

Showing it at the IM Showcase had interesting results. Overall, people seemed to really enjoy the installation and thought the effect was quite magical. One problem was that a lot of people would not enter or touch the walls unless I told them to, even though I had put up signs inviting them to do so. The only exception was children, who saw it and played with it right away. Putting a blanket inside was also useful for prompting people to come in and sit. A lot of people missed interacting with the triangles opposite the entrance, as they tended to stay facing the opening, possibly because I did a lot of explaining there. Everyone was also very curious about how it worked. If I were to do this again, I think I would keep the triangular cells but build a wall instead, with the projector placed on the opposite side from the audience, as the cave-like structure, while cosy, was not easily accessible for everyone and therefore less inviting of interaction. Another option I would be interested in is a taller version, a full geodesic dome someone could stand in, with a two-projector setup.

It was a very good learning experience to work with both the interactions and the aesthetic design of the structure: it worked best with long, flowing motions, and with low-to-the-ground movements when inside, because of the height. Footage of my performance with Dina coming soon, hopefully!

Final Project Progress

Code is below:

https://github.com/thatgracehuang/Sensors-Body-Motion/tree/master/Final%20project

I have an Arduino file sending numbers to my Isadora patch according to which wire is touched on my structure.

What the setup looks like from inside and outside room 006.

 

To do:

  • Projection Map
  • Figure out how to use two CAP1188 sensors together
  • Match up all sensors with an image on Isadora

Will add more description later today, but class is starting now!

Some time “later” when I finally found time(!!!):

UPDATE 16th Dec 2017

At that time, I had run into trouble connecting two CAP1188 sensors through I2C. This was eventually resolved with the discovery that the wires coming out of the sensing pins interfered with the sensor badly enough to require a reset. Two sensing wires running through the structure had also broken at some point; once replaced with new wire, they worked fine. There was an attempt to use two Arduinos with a sensor each, which was somehow harder and confused both the Arduinos' serial monitors and Isadora even more.
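
Since the fix came down to resetting the sensor, it's worth noting that the Adafruit_CAP1188 library can pulse the breakout's RST pin for you during begin() if you hand it a reset pin. A small sketch of that; pin 9 is an arbitrary choice here:

#include <Adafruit_CAP1188.h>

// Passing a reset pin makes begin() pulse the CAP1188's RST line,
// so the sensor starts from a clean state. Pin 9 is arbitrary.
#define CAP_RESET 9
Adafruit_CAP1188 cap = Adafruit_CAP1188(CAP_RESET);

void setup() {
  Serial.begin(9600);
  if (!cap.begin()) {  // default I2C address 0x29
    Serial.println("CAP1188 not found");
    while (1);
  }
}

void loop() {
  Serial.println(cap.touched(), BIN); // print the 8-pin touch mask
  delay(100);
}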

At the time I had been experimenting with the Shapes actor in Isadora to create the visuals. I eventually opted for Particles instead, as one actor could generate all the visuals, with built-in fade-ins and fade-outs, rather than needing 14 separate shapes. It also meant I could put images into the particles if I wanted to in the end (I didn't).

I also had trouble getting my image to show up in the projection-mapping portion of Isadora. I switched to MadMapper and it worked fine, so I stuck with that, though I found out afterwards that the problem was the Particles actor sending its output to the stage instead of to a renderer.

 

Final Project Update – Sensors

I plan to use two Adafruit CAP1188 capacitive touch sensors for my project. After some experimentation with different types and configurations of wire, conductive ribbon and conductive thread, I found the most reliable approach was to solder an elongated Y shape out of stranded wire, embedding the two-ended side into the fabric and running the other end along the frame of the structure to the Arduino. On testing these, I found the interaction fairly reliable.

The Arduino code I used for this is here: https://github.com/thatgracehuang/Sensors-Body-Motion/blob/master/Cap1188_test.ino

I used masking tape to secure the wires to the frame. I intend to replace these with hot glue eventually.

The two-ended side going into the tulle. I also numbered both ends of each wire with masking tape to keep track of each section. These will be removed once the structure is projection-mapped.

All the ends go here to be connected to an Arduino. To test it, I used a solderless breadboard to connect everything; for the final presentation, I plan to solder these wires to a PCB or protoboard so that everything is secure.

 

The wire tip, from the inside and outside. Since the whole length of the wire can be a sensor, I am running the wires on the outside of the structure to avoid them being touched. The tulle fluff prevents a lot of this from happening. 

The Arduino and CAP1188s. This is not the complete wiring, but it served to test that my arrangement of wires was working well. I am able to connect both sensors to the Arduino via I2C, changing one of their addresses by attaching a wire from the 3Vo pin to the AD pin. I am still figuring out the code for using two of these sensors, and plan to have the code and wiring locked in by the weekend.

Lastly, because it was taking up space meant for other classes, I have had to move my structure to Ume, James and Pierre's office down the hall. During the move, I found that the structure is not that fragile, and is light and flexible enough to bend through doorways, which is good to know for moving it to its final destination.

Final Project Progress – Grace

Below is the structure I built for my project.

The frame is made from rolled paper, staples, yarn, duct tape, gaffer tape and hot glue. I based the structure on instructions for making a newspaper geodesic dome, though with the intention of making it a half dome with a less uniform shape, so that my structure would look more organic; these choices have made it somewhat less stable. It has been a learning process to figure out how to construct this: finding the best way to roll paper poles, how to stack them for length, how to create a stable shape, and how to attach the poles securely to one another.

I figured a lot of these things out along the way, so the structure is really only half holding together in the best way, with some structural weaknesses because of that. It also needs to be propped up and taped to the wall and floor to remain stable, though it can stand for some time without that. Now that I have a better idea of how to construct geodesic dome-like structures, I would have liked to redesign it so it doesn't need a wall. In general, this is a situation where a version 2 built with what I have learned would be much better, but I won't have time for that within this project.

The paper structure:

 

 

After finishing the paper structure, I wrapped yarn around the poles to fill in the panels. I then started tying strips of tulle onto the yarn so that the panels seemed to be filled with fluff (see images below). I bought 3 different colours of tulle to give some variation, though all are quite pale and give an overall whitish look. As of writing I am still filling in the panels with tulle strips.

 

From inside:

Fabric used:

As for naming the project, I am thinking of calling it Asclepias, or Milkweed, after a plant whose pods turn into puffs, as seen below. I drew inspiration from these pods in creating this structure. The plant was named after the Greek god of healing by Carl Linnaeus and is the only food source of the monarch butterfly. I'm not sure what this means for a story or performance, but it will be the starting point for one.

Research into Final project – Interactive Geodesic Dome

 

I worked on figuring out a prototype of the cells I’d use to make up my final project. I made two triangles out of rolled up paper, stapled two layers of fabric onto them, then attached tilt switches to the center of each. I then added a neopixel strip that would be activated by one of the tilt switches. You can see the results below:

From the back:

I was able to run the wires through the tubes to keep things neat.

Generally this setup is able to distinguish between the two cells, unless the frame is bumped. Giving a bit of slack to the fabric of each cell seems to help prevent cross-activating switches across cells. I had the tilt switches pointed up, and had the code set to activate the lights if the switch sent a 0.

The code is below:

#include <Adafruit_NeoPixel.h>
#ifdef __AVR__
#include <avr/power.h>
#endif

// Which pin on the Arduino is connected to the NeoPixels?
// On a Trinket or Gemma we suggest changing this to 1
#define PIN 6

// How many NeoPixels are attached to the Arduino?
#define NUMPIXELS 40

// When we set up the NeoPixel library, we tell it how many pixels, and which pin to use to send signals.
// Note that for older NeoPixel strips you might need to change the third parameter -- see the strandtest
// example for more information on possible values.
Adafruit_NeoPixel pixels = Adafruit_NeoPixel(NUMPIXELS, PIN, NEO_GRB + NEO_KHZ800);

int delayval = 100; // doubles as the brightness value and the fade countdown

bool on = 0;

void setup() {
  Serial.begin(9600);
  pinMode(5, INPUT); // tilt switch for cell one
  pinMode(4, INPUT); // tilt switch for cell two

  // This is for Trinket 5V 16MHz, you can remove these three lines if you are not using a Trinket
#if defined (__AVR_ATtiny85__)
  if (F_CPU == 16000000) clock_prescale_set(clock_div_1);
#endif
  // End of Trinket special code

  pixels.begin(); // This initializes the NeoPixel library.
}

void loop() {
  int One = digitalRead(5); // the tilt switch reads 0 when tipped
  if (One == 0 && on == 0) {
    on = 1;
  }

  if (on == 1) {
    // light the second half of the strip in a grey that fades as delayval counts down
    for (int i = 0; i < 14; i++) {
      // pixels.Color takes RGB values, from 0,0,0 up to 255,255,255
      pixels.setPixelColor(i + 14, pixels.Color(delayval, delayval, delayval));
    }
    Serial.println(delayval);
    pixels.show(); // send the updated pixel colors to the hardware
    delayval--;
    if (delayval == -1) { // fade finished: reset for the next trigger
      on = 0;
      delayval = 100;
    }
  }
}

I have also been researching the feasibility of building a geodesic dome. Given the time constraints and the amount of material required, I am opting instead to make a smaller frame that can hug a wall or corner of a room to form a makeshift tent. This would be made with a large wooden frame filled in by smaller triangles of rolled-up paper to form the individual cells. Instead of NeoPixels, I plan to install a projector above the structure, with white triangles or squares projection-mapped onto the triangles of the structure, as I envision wiring up that many NeoPixels and getting the light to diffuse properly would be very difficult.

Aesthetically, I want to mimic the look of a butterfly or moth's nest, wispy and light, or cocoon-like, so that it could almost seem like a fantasy fairy dwelling with a sense of magic to it.

Computer Vision assignment – Grace

The goal of this assignment was to take the patches developed for the Farfalle workshop and adapt them to an IR camera and projector setup in the lab using computer vision.

The camera feed was collected using Processing and sent to Isadora through Syphon.

In Isadora, I made a patch to work out computer vision with background subtraction. The first step was to use a Zoomer actor so that the projection space, plus a little extra, was captured. This went to a Freeze actor to grab a frame of the image while the stage was empty. The frozen frame went into an Effect Mixer, set to difference, together with the unfrozen (but zoomed) feed. The result went into Eyes, which could track the location of one person (or blob) moving around the stage. I copied this into a couple of my Farfalle patches and mapped the location values so the projection corresponded to where someone was walking inside it. I did not adapt all my patches, as there were too many. One problem was that standing under the projection cast a shadow, which made the effect of the patches harder to see; this was especially noticeable since the projected surface was smaller than in the Farfalle workshop. Some patches felt better to move around under, especially ones with some constant movement that involved the whole stage rather than focusing on the location of the person onstage.
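
The Freeze → Effect Mixer (difference) → Eyes chain is the classic background-subtraction pipeline. As a rough illustration of what those actors compute (not the patch itself), here is the same idea in plain C++, treating frames as grayscale byte arrays; the frame size and threshold are invented values:

#include <cstdint>
#include <cstdlib>

// One pass of background subtraction: difference the live frame
// against a frozen "empty stage" frame, threshold, and take the
// centroid of the changed pixels as the blob location (roughly
// what the Eyes actor reports). THRESHOLD is an invented value.
const int W = 320, H = 240, THRESHOLD = 40;

bool findBlob(const uint8_t* live, const uint8_t* frozen,
              float& x, float& y) {
  long sumX = 0, sumY = 0, count = 0;
  for (int j = 0; j < H; j++) {
    for (int i = 0; i < W; i++) {
      if (std::abs(live[j * W + i] - frozen[j * W + i]) > THRESHOLD) {
        sumX += i;
        sumY += j;
        count++;
      }
    }
  }
  if (count == 0) return false; // stage is empty
  x = (float)sumX / count;      // blob center x
  y = (float)sumY / count;      // blob center y
  return true;
}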

I used the Processing code provided in class unchanged for sending the IR camera feed via Syphon to Isadora.

The Isadora patch I made can be found here:

https://github.com/thatgracehuang/Sensors-Body-Motion/blob/master/Farfalle%20modified%20.izz

 

A video:

I was able to detect two objects using Eyes++ as well, though I had the issue that the number identifying each blob kept changing and jumping between blobs, making it hard to keep tracking one blob to grab a location from. I fixed this by decreasing the actor's maximum number of objects to 2. I do wonder if there's a way to have this actor default to blob 1 if no other blobs are detected.

Here are the two chairs on the stage:

Here they are on Isadora as two separate blobs:

I didn’t do anything with this however, as I didn’t have any patches capable of using two objects (without freaking out).

Touch Sensor Game – Grace

The game I’ve made is a mix of a matching game and Snap.

The setup:

Five pieces of cutlery are arranged on each side of a box. Each piece is connected to a piece on the opposite side with a length of wire under the box. A weight-triggered switch (made with wire and a folded piece of paper) is placed in the center of the box, with a teacup (or other object) on top. Two pieces of wire are attached to metal rings at one end; at the other end, one is attached to the 5V pin on an Arduino and the other to the A0 pin (with a pulldown resistor). Two LEDs are also powered through the Arduino and are visible when sitting at the box: one a dim red, the other a bright blue. The red lights up when the switch at A0 is activated; the blue lights up when the central weight-triggered switch is de-activated. OPTIONAL: place a plate in the center of each row of cutlery for prizes or sweets. Lacy tablecloths are also optional.

To play:

Requires 2 players. Each player puts on a ring and sits on one side of the box, facing the other. Together, they touch the cutlery on their own sides, trying to find two pieces that are connected by a wire. They must watch for the red LED at the Arduino to tell if it is a match. Once a match is found, it is a race to be the first to grab the teacup from the center. Once the teacup is removed, the bright blue LED will light up, signaling the end of the round; the player holding the teacup when the blue light comes on wins a point and can optionally eat a sweet from their plate. To begin a new round, place the teacup back on the paper switch.

Players should not repeat a previous match. The Arduino cannot enforce this, so it is up to the players themselves to remember, or to mark the matched cutlery somehow.
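
A minimal sketch of the circuit logic described above, assuming pulldowns on both inputs; the LED pins and the analog threshold are my own choices for illustration (the real code is linked below):

// A0 reads the ring-to-cutlery circuit: it only sees voltage when
// both players touch a matched pair, closing the loop from 5V
// through the cutlery wiring and both rings. Pin 5 reads the paper
// weight-switch under the teacup. LED pins are assumptions.
const int RED_LED = 2;   // on while a matching pair is being touched
const int BLUE_LED = 3;  // on once the teacup is lifted

void setup() {
  pinMode(5, INPUT);
  pinMode(RED_LED, OUTPUT);
  pinMode(BLUE_LED, OUTPUT);
}

void loop() {
  // contact through skin and the rings is imperfect, so read the
  // analog level and use a generous (invented) threshold
  bool match = analogRead(A0) > 200;
  bool cupLifted = (digitalRead(5) == LOW); // weight switch released
  digitalWrite(RED_LED, match ? HIGH : LOW);
  digitalWrite(BLUE_LED, cupLifted ? HIGH : LOW);
}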

Here is a picture of the setup in progress:

From inside the box:

 

Finished setup:

   

 

Pictures of the paper switch. One of the wires goes to pin 5, the other to 5V:

 

The Arduino and breadboard:

The arduino code:  https://github.com/thatgracehuang/Sensors-Body-Motion/blob/master/EtiquetteGame.ino

Video of playing the game:

Process:

I had originally wanted each piece of cutlery to be its own switch, so that the matches between pieces were handled in the Arduino and the combinations could be changed to give the game more longevity, but this proved too complicated to achieve in time. I also generally struggled with getting enough conductivity across all the cutlery pieces, and had to be very careful when wrapping everything with wire and building a paper switch activated by weight (I originally tried using copper tape on the bottom of the teacup to bridge the gap between two wires, but this made for an inconsistent switch). I still have issues with conductivity and getting stable readings from the cutlery switches, especially with establishing contact through the rings.

 

Open Studios Installation (Ethan + Grace)

Welcome to the I.M. Lab Cat

Two images designed in Adobe Illustrator, plus a sound effect. Uses Isadora with an IR range finder and a serial XBee link. A signal from the receiver triggers the meow sound effect and the 'Welcome to the I.M. Lab' image. A gate and a trigger delay keep the image and sound effect from toggling continuously when multiple people walk through.
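
The gate plus trigger delay is essentially a cooldown. As an illustration of the pattern (not the Isadora patch itself), here it is in Arduino terms; the pin, threshold and cooldown length are invented:

// Cooldown gate: fire once on a trigger, then ignore retriggers
// until the delay expires -- the role the gate and trigger delay
// play in the Isadora patch. Pin and constants are invented.
const int IR_PIN = A0;
const int NEAR = 300;                   // "someone is close" threshold
const unsigned long COOLDOWN_MS = 4000;
unsigned long lastFired = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  bool personDetected = analogRead(IR_PIN) > NEAR;
  if (personDetected && millis() - lastFired > COOLDOWN_MS) {
    lastFired = millis();
    Serial.println("meow"); // stand-in for the sound + image cue
  }
}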

Neopixels

A strip of NeoPixels lines the bottom of the TVs the cats are on. These are controlled by a RedBoard. When a signal is received from the XBee connected to the RedBoard, two pulses of bright light travel from the center of the strip outwards, leaving behind a dimmer trail of color. Multiple pulses can fire at a time. These pulses are controlled by a matrix that keeps track of each pulse's location, whether it is active or not, and what color it is. There is also a pulsing center section of light which quickly randomises its color.

User Testing

Individuals noticed the cat first but failed to notice the NeoPixels. After discussion, we concluded that this was because the lights were not at eye level, so the cat drew all the attention, helped further by its sound effect. We moved the lights up so they are visible alongside the cat.

We also found the original cat sound effect annoying; it sounded like a sickly cat. It was changed to a kitten's meow to be more pleasant for the audience.

User Testing Documentation

 

 

Neopixels and arrays

I worked a bit on figuring out how to use a matrix so that a pulse travels down the NeoPixel strip every time a button is pressed. These pulses work independently of each other. Only 3 pulses of light can travel down the strip at a time, as that's how many arrays I have set up to keep track of pulses. Each array keeps track of whether its pulse has been activated, its location, and its color. I used a counter in the code for timing, but set each pulse to update at a different point in the counter's cycle, so the pulses move a little out of sync and inch forward a bit like a worm, which I found interesting.
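
A sketch of that structure, assuming the Adafruit_NeoPixel library and organising the state as three parallel arrays; it is simplified (the trail is left at full brightness and the button wiring is omitted; the actual code is linked below):

#include <Adafruit_NeoPixel.h>

#define PIN 6
#define NUMPIXELS 60 // illustrative strip length
Adafruit_NeoPixel strip = Adafruit_NeoPixel(NUMPIXELS, PIN, NEO_GRB + NEO_KHZ800);

// three pulse slots, tracked as parallel arrays
const int MAXPULSES = 3;
bool active[MAXPULSES];    // is this slot carrying a pulse?
int pos[MAXPULSES];        // the pulse's current pixel
uint32_t color[MAXPULSES]; // the pulse's color
int counter = 0;           // shared timer

// launch a pulse in the first free slot (call this on a button press)
void launchPulse(uint32_t c) {
  for (int p = 0; p < MAXPULSES; p++) {
    if (!active[p]) {
      active[p] = true;
      pos[p] = 0;
      color[p] = c;
      return;
    }
  }
}

void setup() {
  strip.begin();
}

void loop() {
  // each slot only advances on its own phase of the counter, so the
  // pulses creep forward slightly out of sync, like a worm
  for (int p = 0; p < MAXPULSES; p++) {
    if (active[p] && counter % MAXPULSES == p) {
      pos[p]++;
      if (pos[p] >= NUMPIXELS) {
        active[p] = false; // ran off the end of the strip
      } else {
        strip.setPixelColor(pos[p], color[p]);
      }
    }
  }
  strip.show();
  counter++;
  delay(10);
}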

For open studios, I would like to work on the code to be able to handle more pulses, randomize the color and speed, and possibly have pulses interact with each other if they overlap. I would also like to have some kind of background that the pulses travel through, perhaps a faint rainbow cycle as well that could also stay as a non-triggered mode. I would also like to have each pulse bounce around the strip 2 or 3 times before terminating, which could be a kind of intermediate between triggered and non-triggered modes.

Lastly I want to clean up my code a bit, putting repetitions into functions, to help me sort through things as I try to add more moving parts to it.

I also tried putting the neopixel strip into a spherical glass lampshade thing. I like the effect but a longer strip would be needed to cover the whole sphere.

 

The Arduino code can be found here:  https://github.com/thatgracehuang/Sensors-Body-Motion/blob/master/neopixel_array_test.ino

Particle sensor experiment

I attached the SparkFun Particle Sensor to an animatronic tentacle robot, using it as a temperature sensor. When the sensor reads below 26°C, the robot goes into an idle "shiver" mode. As the temperature rises above 26°C, the arc of the tentacle's movement increases exponentially. Ideally, the sensor would be placed somewhere that better simulates warming up the creature. The temperature sensor is quite slow, which means it takes a while for the tentacle to respond; interestingly, that lag makes the reaction seem more organic.
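
A rough sketch of that mapping, assuming the SparkFun MAX3010x library for the sensor and a single hobby servo standing in for the tentacle rig; the servo pin, curve constants and shiver range are invented (the real code is linked below):

#include <Wire.h>
#include <Servo.h>
#include "MAX30105.h" // SparkFun MAX3010x Particle Sensor library

MAX30105 particleSensor;
Servo tentacle; // single-servo stand-in for the tentacle

void setup() {
  tentacle.attach(9);          // servo pin is an assumption
  particleSensor.begin(Wire);
  particleSensor.setup();      // default sensor configuration
}

void loop() {
  float tempC = particleSensor.readTemperature();
  if (tempC < 26.0) {
    // idle "shiver": small random jitter around center
    tentacle.write(90 + random(-3, 4));
  } else {
    // arc grows exponentially with warmth; constants are invented
    float arc = min(80.0, 5.0 * pow(1.5, tempC - 26.0));
    tentacle.write(90 + (int)arc);
  }
  delay(200);
}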

The Arduino code is here:

https://github.com/thatgracehuang/Sensors-Body-Motion/blob/master/Temperature_tentacle2.ino

The video:

The Particle Sensor is a very sensitive photon sensor that also comes with a red, a green and an infra-red LED. It is able to measure heartbeat when pressed against the skin (like a pulse oximeter), via the change in the red and green light it detects. It can also measure the change from a baseline distance (i.e. presence sensing) at a range of approximately 1 m. The SparkFun page claims it can measure eye blinks, but no documentation describes how.