M5 Musings

“If you do not ring the bell by pressing on the yellow strip, you will have the pleasure of continuing to ride with me.” – M5 bus driver, November 8, 2009

On a perfectly brisk and sunny Sunday morning, the kind of day on which I can’t help but feel happy as a clam despite feeling the effects of a long night of festivities, I boarded the M5 bus at Houston and LaGuardia with Lucas Zavala, Sarah Dahnke, Mike Cohen and Sebastian Buys. Armed with coffee, cameras and sketch pads, we sat in the elevated area towards the rear of the bus for the optimum vantage point on the world beyond the windows.

The bus driver smiled and chatted as we boarded along with two other passengers, both in seemingly good spirits. The bus felt warm and cozy.

Our first conversation of the trip focused on frozen desserts. The sight of a Tasti D-Lite store set me off on a deprecation of the low-fat, low-sugar dessert, which isn’t altogether the healthful alternative to ice cream that it pretends to be. Lucas recommended Yogurtland, a frozen yogurt establishment in the West Village, as the best frozen dessert in town (I keep meaning to check out Yogurtland sometime soon – maybe now that I’ve written it down, I’ll finally make the trip!). I countered that the Nocciola gelato at L’Arte Del Gelato, with its impeccably clean taste and smooth consistency, is the best frozen dessert the city has to offer, particularly when paired with a scoop of a tangy, fruity sorbet such as Frutti di Bosco (mixed berry), Pera (pear) or Limone Arancia (lemon-orange).

We touched upon the merits of some of New York’s other greatest frozen desserts including Sedutto frozen yogurt, Grom gelato and Haagen Dazs ice cream – and then the discussion steered back to frozen dessert chains. We unanimously agreed upon the need for a Dairy Queen somewhere around here, but took comfort in the fact that there are at least a few Carvels in the city. Can one ever get enough of Carvel’s amazing chocolate crunchies?

I’ve lived in New York City for five years, perhaps not long enough to consider myself a “true” New Yorker, but I think I may get there someday. I never expected to stick around for more than two or three years. Here I am, five years, two jobs, four apartments and countless ups and downs later, and this city continues to intrigue, amaze, surprise and excite me every day. There’s always something around the corner that I just can’t bear to miss, and I’ve begun to wonder how on earth I could ever make the decision to leave this place.

I took a few notes while on the trip – mostly of signs and phrases I saw along the way. The sign on a florist shop in Chelsea boasted a Bouquet-a-Week Club. I love the sight of fresh flowers, but I always like to go to the bodega and pick out my own (or have those who know me select them on my behalf). The sign made me wonder: who subscribes to a service that provides them with a bouquet each week? An absent or lazy significant other? Or maybe the curation is better than what one would choose oneself. I’m skeptical.

As the bus passed by my favorite gelato place, a former boyfriend’s apartment, the office of my first job, and the salon I once frequented for eyebrow threading, I began to comprehend the relationship I’ve developed with New York through the personal meanings and experiences I’ve attached to places and things throughout the city.

Since moving to the East Village and starting at ITP, I rarely make the trip uptown, save for the occasional trip to the Upper West Side. My friend Luke lives on 95th and West End. Luke is good people. A month or so ago, he kindly agreed to take the fifth role in a movie project I worked on with Michael Edgcumbe, Patricia Adler and Poram Lee for Marianne’s Comm Lab class. Luke displayed the patience of a saint on the day of our shoot, and was pretty much the only member of the cast with the slightest hint of acting skills.

Upon arriving at the 179th Street bus terminal, we were eager to stretch our legs. I suggested that we stroll west towards the Hudson to see if we could find the Little Red Lighthouse beneath the George Washington Bridge.

The sidewalks between the 179th Street bus terminal and the nearby park are scattered profusely with dog droppings, to the point where we practically had to adopt hopscotch moves to avoid stepping on the stuff. The varying sizes of the droppings suggested several sources. I continue to wonder why the dog owners of Washington Heights don’t clean up after their canines. Is it something in the air? Is there some common personal trait among residents of the area that makes them disinclined to dispose of their dogs’ poop? Is there a single irresponsible dog walker who fails to clean up after his or her charges bomb the curb?

At Riverside and 181st Street, we stopped for some time at a viewing point. It was quite the Kodak moment. We leaned against the wall overlooking the Hudson River to take pictures, watch boats meander by, marvel at the scale of the George Washington Bridge and imagine what it would be like to own numerous sets of dentures like America’s first president, which led to the sharing of personal dental experiences.

A few years ago, I learned that I had seven wisdom teeth. I have no idea how this happened – none of my family members have experienced abnormal teeth counts. Six of the teeth were removed in two operations that took place three weeks apart. To lessen the possibility of permanent nerve damage, the oral surgeon elected to give me numerous novocaine injections instead of general anesthetic, so I was awake and alert for the whole process. I felt no pain, but experienced pressure and discomfort that have made me reluctant to have the seventh wisdom tooth removed. For this reason, I get a slap on the wrist every time I see my dentist.

We observed a convoy of FDNY fire engines with lights and sirens blaring, attempting to navigate their way around a corner onto a quiet one-way street. I’ve always been impressed by fire engine drivers’ skillfulness when it comes to reversing their vehicles, but these guys displayed particular dexterity as they backed the trucks around the tight corner, which was made even tighter than usual due to road construction.

When the fire engines disappeared out of view, our hunger pangs kicked in. We decided to rethink searching for the lighthouse, and headed back to the 179th Street bus terminal. With stomachs growling, we re-boarded the M5 downtown.

In contrast to our journey uptown hours before, the M5 filled up immediately for the downtown trip. The driver was gruff and abrupt with a female passenger seeking directions, and snapped at a young gentleman who inserted his Metrocard into the reader the wrong way. After what seemed like an eternity in heavy traffic, we hopped off around 131st Street and walked west to Dinosaur BBQ on 12th Avenue.

In spite of the hour-long wait at Dinosaur, Mike managed to snag us a table outside almost immediately. The sun had set, but we welcomed the breeze on our faces. Over barbecue ribs, pulled pork sandwiches, macaroni and cheese and tasty microbrews, we recounted our trip through the pictures, notes and random things observed over the course of the day. There was unanimous agreement that we had no idea where to begin writing for the M5 bus assignment.

ICM Final: Poetic Waxing

Somewhere amidst the amazing projects, flashing lights, sweet sound and occasional free food, there is a void at ITP. That void is a refrigerator. Some consider a fridge a cooling appliance, while others such as yours truly consider a fridge a canvas for artistic creations, unpaid bills, and other bits and pieces.

With this in mind, I created Poetic Waxing (please click for link to Processing sketch and source code!) in Processing as my final project for ICM. Rather than pouring energy into creating bills in Processing, I decided to focus on artistic creation using words, similar to Magnetic Poetry.

The first step was figuring out how to drag and drop a rectangle, hereafter referred to as a magnet, in Processing (sad, I know – but now I know how to do it!). Thanks to Digital Noah for helping me figure out how to use the mouseDragged() function!

I put a word, “KAT,” on the magnet, and played around with my Processing sketch to make the width of the magnet correspond to the length of each word. Then, I created an array of rectangles, which we’ll call magnets, and amended the mouseDragged() function to work with the array so that I could move the magnets independently.
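
For anyone curious about the mechanics, here’s a stripped-down sketch of the dragging. The Magnet class, words and positions are placeholders rather than the actual Poetic Waxing code, but the mousePressed()/mouseDragged()/mouseReleased() flow is the same idea:

Magnet[] magnets = new Magnet[3];
int dragged = -1;   // index of the magnet currently being dragged, -1 for none

void setup() {
  size(600, 400);
  textSize(18);
  String[] words = { "KAT", "waxes", "poetic" };
  for (int i = 0; i < magnets.length; i++) {
    magnets[i] = new Magnet(words[i], 50 + i*140, 100);
  }
}

void draw() {
  background(180);
  for (int i = 0; i < magnets.length; i++) {
    magnets[i].display();
  }
}

void mousePressed() {
  // grab the topmost magnet under the cursor
  for (int i = magnets.length-1; i >= 0; i--) {
    if (magnets[i].contains(mouseX, mouseY)) {
      dragged = i;
      break;
    }
  }
}

void mouseDragged() {
  if (dragged != -1) {
    // move the grabbed magnet by however far the mouse moved this frame
    magnets[dragged].x += mouseX - pmouseX;
    magnets[dragged].y += mouseY - pmouseY;
  }
}

void mouseReleased() {
  dragged = -1;
}

class Magnet {
  String word;
  float x, y, w, h = 30;
  Magnet(String word_, float x_, float y_) {
    word = word_;
    x = x_;
    y = y_;
    w = textWidth(word) + 20;   // magnet width follows the word length
  }
  boolean contains(float px, float py) {
    return px > x && px < x + w && py > y && py < y + h;
  }
  void display() {
    fill(255);
    rect(x, y, w, h);
    fill(0);
    text(word, x + 10, y + h - 10);
  }
}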

My next challenge was to figure out what the words of Poetic Waxing should be. Realistically, I’d be playing with Poetic Waxing more than anyone else, and I figured that to maximize the creativity and uniqueness of the compositions I created, it would be best to solicit words from sources beyond my noggin. I toyed with the idea of parsing words from XML sources and text files such as James Joyce’s Ulysses – but figured that the most interesting language was much nearer than the info stored on remote servers. A quick email to the ITP email list yielded hundreds of awesome words from my fellow students, which I entered into a text file and put in the data folder for the Processing sketch.
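
Pulling the words in takes only a couple of lines – roughly the following, assuming a words.txt file (one word per line) sits in the sketch’s data folder:

String[] allWords;

void setup() {
  allWords = loadStrings("words.txt");   // one contributed word per line
  // deal out a random handful of words for this round of magnets
  for (int i = 0; i < 20; i++) {
    String word = allWords[int(random(allWords.length))];
    println(word);   // in the real sketch, each word becomes a draggable magnet
  }
}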

One wee issue I encountered: when one magnet overlapped another, they stuck together unless the user pulled them apart from the non-overlapping sections. After a couple of minutes trying to figure out how to prevent this from happening, I realized that this was one of those fortunate mistakes. Magnets stick to magnets in the real world, so the need to manually pull the magnets apart using the mouse adds to the whole user experience. A valuable lesson learned – just because a computer may be able to do all of the work for us doesn’t necessarily mean that it should always do so.

Lastly, I made Poetic Waxing pretty! I positioned a semi-opaque rectangle beneath each magnet with a slight offset to create drop shadows. I used this image of a brushed steel texture from Dim Sum! on Flickr for the kind of background to which I knew my magnets would enjoy affixing themselves.

The user interface is pretty straightforward: drag and drop words using the mouse. When you become bored or dislike the random selection of words on the screen, press the space bar to reset.
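
In miniature, the drop shadow and the space-bar reset look something like this (the offsets, opacity and word list are stand-ins rather than the exact values I used):

String[] allWords = { "moon", "gelato", "wax", "poetic" };   // stand-in word list
String word;

void setup() {
  size(400, 200);
  textSize(18);
  pickWord();
}

void draw() {
  background(150);
  float w = textWidth(word) + 20;
  noStroke();
  fill(0, 60);                  // semi-opaque black for the drop shadow
  rect(104, 84, w, 30);         // shadow, offset a few pixels down and to the right
  fill(255);
  rect(100, 80, w, 30);         // the magnet itself
  fill(0);
  text(word, 110, 100);
}

void keyPressed() {
  if (key == ' ') {
    pickWord();                 // the space bar deals a fresh random word
  }
}

void pickWord() {
  word = allWords[int(random(allWords.length))];
}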

The Processing sketch and source code can be found here. Enjoy!

Media Controller

I worked with Nikolas Psaroudakis and Matt Swenson for our Physical Computing media controller assignment. From the outset, we knew we wanted to create a physical device that controlled sound, allowed the user to change his or her actions, and showed in real time how those changes affected the sound.

We all had different ideas about what we wanted to build, some easier to realize than others. After toying around with numerous concepts, we whittled our ideas down to a MIDI controller that enabled the user to create fairly complex sounds by controlling simultaneous musical loops.

Some rambling sketches from our first brainstorm

Our device would enable users to compose and perform unique pieces by moving their hands at varying distances above the sensors to control and modify prerecorded sounds. Each sensor on the device would correspond to a different prerecorded sound sample, and the user could control the pitch, tone or tempo of the music.

We used Ableton Live for the music part of our project. We selected a set of sample loops that sounded good when played together, and a set of audio filters that would be applied and controlled through MIDI.
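
Conceptually, each smoothed sonar reading becomes a MIDI control change that Ableton maps onto a filter or volume parameter. The toy sketch below shows that mapping with the promidi library; the controller number, the value range and the mouse standing in for a sensor are placeholders rather than our actual setup:

import promidi.*;

MidiIO midiIO;
MidiOut midiOut;

void setup() {
  size(200, 400);
  midiIO = MidiIO.getInstance(this);
  midiIO.printDevices();               // list the available MIDI devices
  midiOut = midiIO.getMidiOut(0, 0);   // channel 0 of the first output device
}

void draw() {
  // mouseY stands in for a sonar reading; MIDI control values run 0-127
  int value = int(map(mouseY, 0, height, 127, 0));
  midiOut.sendController(new Controller(1, value));   // CC #1 is a placeholder
}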

Programming the sensors
We chose to use sonar sensors due to their ease of operation. The first step was to connect one sensor for testing. The MaxSonar EZ-1 and the Parallax PING were almost equally priced, so we decided to test both. The Parallax sensor gave more accurate results – but because it was less compact than the MaxSonar sensor, we decided to go with the MaxSonar.

Upon testing the three MaxSonar sensors simultaneously, we discovered some odd errors in the readings that hadn’t occurred when the sensors were operated individually. After a bit of research, we realized the problem: the sensors were cross-talking, i.e. the pulse of one sensor was being detected by another, making the results very unstable (see below). To eliminate this interference, we could either position them farther apart, or point them in different directions, e.g. on a curved surface.

When we tested all three sonar sensors together, we found that the readings interfered with each other.

The first time the circuit was soldered together, only the sensor on pin 2 was working. Upon double-checking the code for accuracy, we determined that the error must be in the connections – so we unsoldered the circuit board and rewired the components to a breadboard for troubleshooting.

While positioning the sensors at different angles helped reduce interference between them, we found that it was not enough on its own. Readings continued to be unstable, and jumps were experienced at random moments. We decided to employ averaging on the readings and see if that helped, but unfortunately, it didn’t make much of a difference.

Our biggest problems were cross-talk between the sensors and random jumps in the readings, which triggered spurious MIDI send events.

Physical Design & Interface
We mapped the prerecorded musical loops to the sensors and experimented to determine which gestures made sense for controlling the levels. We decided that the vertical (up-down) motion of the user’s hand above the sensors would provide the most intuitive mapping between high and low levels, enable the user to remain in place while operating the device, and lessen the chance of accidental interference from external sources.

The curved top surface allowed for fluid, natural hand movements, and the positioning of the three sensors let the user easily switch between the right and left hand to control the center sensor.

At this point, the device was almost ready to go - we just had to attach the Plexi on top.

We cut a rectangle of Plexiglas for the curved top of the device, and used the drill press in the shop to make holes for the screws to affix the Plexi to the base. The drill bits in the shop were too small to create holes for the sensors – so John Duane introduced me to the hole saw, a tiny saw in the shape of a circle that fits onto a drill (like a drill bit) and enables the user to cut large holes in fairly thin materials. Fun!

We used a hole saw fitted onto the drill press to make holes in the Plexi large enough to accommodate the sensors.

Matt created a Processing sketch to provide the user with a visual indication of the levels of the sensors.

The Presentation
We tested our MIDI controller in the student lounge on the ground floor of Tisch and were satisfied with the outcome. When the time came for us to present in class, however, the system proved unstable. After troubleshooting for some time, we realized that the student lounge had been ideal for our experiment because of its high ceiling – and because, down there, we were the only people close to our device. In contrast, the classroom’s low ceiling, and perhaps also the presence of others in the vicinity, caused the sensor readings to become unstable.

If we were doing this project again, we’d probably try using IR sensors instead of sonar. We’d also use thinner Plexi, as the Plexi we used developed small cracks when we curved it.

The code we used is below.

int analogPin1 = 0;
int analogPin2 = 1;
int analogPin3 = 2;
int digPin12=12; //Set BW pin to HIGH
int digPin13=13; //Set pin to HIGH after 250ms
int switchPin=2;
int sensorValue = 0;
int switchState=0;
int prevState=0;
int state=0;
int n=1;/// average over n times
int maxLevel=3000;
int cnt=0;
boolean started=true;
void setup()
{
  // start serial port at 9600 bps:
  pinMode(switchPin, INPUT);
  //pinMode(digPin12,OUTPUT);
  //pinMode(digPin13,OUTPUT);
  Serial.begin(9600);
  establishContact();
}
void loop() {
  //digitalWrite(digPin12,HIGH);
  initSensors();
}
void initSensors(){
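  // wait until 250 ms after startup before taking any sonar readings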
  if (millis()>=250){
    if (started==false){
      //digitalWrite(digPin13,HIGH);
      delay(30);

      started=true;
    }
    else {
    //  digitalWrite(digPin13,LOW);
      readSensors();
    }
  }
}
void readSensors(){
  switchState = digitalRead(switchPin);
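  // the pushbutton cycles 'state' through 1, 2 and 3 each time it is pressed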
  if (switchState==1 && prevState==0){
    state=state%3;
    state++;
    prevState=1;
  }
  else if (switchState==0 && prevState==1){
    prevState=0;
  }
  if (Serial.available() > 0) {
    // read the incoming byte:
    int inByte = Serial.read();
    // read the sensor:
    sensorValue = avgReadings(analogPin1,n);//analogRead(analogPin1);
    // print the results:
    Serial.print(maxSense(sensorValue,maxLevel), DEC);
    Serial.print(",");

    // read the sensor:
    sensorValue = avgReadings(analogPin2,n);//analogRead(analogPin2);
    // print the results:
    Serial.print(maxSense(sensorValue,maxLevel), DEC);
    Serial.print(",");
    // read the sensor:
    sensorValue = avgReadings(analogPin3,n);//analogRead(analogPin3);
    // print the last sensor value with a println() so that
    // each set of four readings prints on a line by itself:
    Serial.print(maxSense(sensorValue,maxLevel), DEC);
    Serial.print(",");
    Serial.print(state,DEC);
    Serial.println("");
  }
}
void establishContact() {
  while (Serial.available() <= 0) {
    Serial.println("hello");   // send a starting message
    delay(300);
  }
}
int avgReadings(int pin,int n){
  int avg=0;
  for (int i=0; i<n; i++){
    avg += analogRead(pin);
  }
  return avg/n;   // average of n consecutive readings
}
int maxSense(int sensorValue, int level){
  // cap the reading at the maximum level used for scaling
  if (sensorValue > level){
    return level;
  }
  else {
    return sensorValue;
  }
}

The Processing code follows:
(File midiController.pde)

import promidi.*;
import processing.serial.*;
MidiIO midiIO;
MidiOut midiOut;
Controller controller,controller2;
//int inByte=20;
Serial myPort;
sonarSwitch switches;
int nSonars=3;
void setup () {
  size (1024,768);
  switches = new sonarSwitch(nSonars,0);
  background (0);
  smooth();
  initMidi();
  initSerial();
}
void draw(){
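  // intentionally empty: drawing happens in myDraw(), which serialEvent() calls
  // each time a new set of readings arrives from the Arduino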

}
void myDraw() {
  //println("NO height is:"+height);
  switches.draw();
  fill(0,10);
  rect (0,0,width,height);
}
void serialEvent(Serial myPort) {
  switches.serialStuff(myPort);
  myDraw();
}
void initMidi(){
  //switches=new sonarSwitch(1,0);
  midiIO=MidiIO.getInstance(this);
  midiIO.printDevices();
  midiOut=midiIO.getMidiOut(0,0);
}
void initSerial(){
  println(Serial.list());
  myPort = new Serial(this, Serial.list()[0], 9600);
}

(File dotColumn.pde)

class dotColumn {
  dotTrack[] dots;// = new dotTrack [1000];
  int r,g,b,x;
  int nLines=50;
  int nLinesCurrent=0;
  dotColumn(int _x, int _r,int _g, int _b){
    r=_r;
    g=_g;
    b=_b;
    x=_x;
    dots =new dotTrack[nLines];
    for (int i = 0; i < nLines; i++) {
      dots[i] = new dotTrack(x, height-i * 10, 6, 0, r,g,b, 23);
      //println("Height is:"+height);
    }
  }
  void  draw(){
    for (int i = 0; i < nLinesCurrent; i++){
      dots[i].dotDraw();
    }
  }
}

(File dotTrack.pde)

class dotTrack {
  float x;
  float y;
  int w;
  float ySpeed;
  float xSpeed;
  int r,g,b;
  int inByte;
  int h;
  dotTrack (int x_, int y_, int ySpeed_, int xSpeed_, int r_, int g_, int b_, int inByte_) {
    xSpeed = xSpeed_;
    ySpeed = ySpeed_;
    x = x_;
    y = y_;
    r=r_;
    g=g_;
    b=b_;
    inByte = inByte_;
  }
  void dotDraw() {
    noStroke();
    fill (r,g,b);
    float x2 = random (x,x+40);
    ellipse (x2,y,5,5);
  }
}

(File sonarSwitch.pde)

class sonarSwitch {
  int nSonars;
  boolean firstContact = false;
  dotColumn[] columns;
  int[] sonarON_OFF;//=new int[0];
  int [] sonarON_OFFprev;  //previous state
  int[] sonarLevel;//=0;
  int button=1;
  int thres;
  boolean state=false;
  String mySt;
  int check;
  int upperLevel=100;
  int lowerLevel=60;
  int sonarLevelMax=50;
  public int[] sensors;
  boolean[] states;
  sonarSwitch(int _nSonars, int _thres){
    nSonars=_nSonars;
    // sensors
    columns=new dotColumn[nSonars];
    //println("The number is: "+columns.length);
    //println("Switches are: "+nSonars);
    thres =_thres;
    states=new boolean[nSonars];
    sonarON_OFF= new int[nSonars];
    sonarON_OFFprev= new int[nSonars];
    sonarLevel=new int[nSonars];
    for (int i=0; i<nSonars; i++){
      // x positions and colors for the dot columns are placeholders here
      columns[i]=new dotColumn(100+i*300, 255, 255, 255);
    }
  }
  void draw(){
    for (int i=0; i<nSonars; i++){
      columns[i].draw();
    }
  }
  void serialStuff(Serial myPort){
    String myString = myPort.readStringUntil('\n');
    if (myString == null) return;
    myString = trim(myString);
    if (firstContact == false){
      // handshake: wait for the Arduino's "hello" before asking for readings
      if (myString.equals("hello")){
        myPort.clear();
        firstContact = true;
        myPort.write("A");   // request the first set of readings
      }
    }
    else {
      sensors = int(split(myString, ','));
      for (int sensorNum = 0; sensorNum < sensors.length-1; sensorNum++) {
        if (sensors[sensorNum]>upperLevel){
          sensors[sensorNum]=upperLevel;
        }
        //print("Sensor " + sensorNum + ": " + sensors[sensorNum] + "\t");
      }
      // add a linefeed after all the sensor values are printed:
      println();
      if (sensors.length==nSonars+1){
        // the last value in the message is the track-select switch state
        switches.checkSwitches(sensors[nSonars]);
        switches.checkSliders(sensors[nSonars]);
        // println(sensors[nSonars]);
      }
      myPort.write("A");   // request the next set of readings
    }
  }
  void checkSwitches(int button){
    // println("Switch "+button);
    // grow the dot column for a track when its sensor reading crosses the threshold
    for (int i=0; i<nSonars; i++){
      if (sensors[i]>thres && columns[i].nLinesCurrent < columns[i].nLines-1){
        columns[i].nLinesCurrent++;
      }
    }
  }
  void checkSliders(int button){
    // keep the most recent level for each track, capped at sonarLevelMax
    for (int i=0; i<nSonars; i++){
      sonarLevel[i] = min(sensors[i], sonarLevelMax);
    }
  }
}

Clif Bar Locker Robbery: The Remix

Our final Comm Lab class takes place this afternoon. The purpose of this class was to experiment with a wide range of communications techniques including audio, video, animation and digital imaging, through technologies including HTML, Final Cut Pro, Audacity, Photoshop, Illustrator, After Effects, iStopMotion, Soundtrack Pro and others. I’ve really enjoyed working on many of these projects, and the techniques we’ve covered have given us the confidence to use these programs in other projects at ITP going forward.

Earlier this semester, Patricia Adler and I wrote and recorded a song entitled Clif Bar Locker Robbery for our Comm Lab sound assignment. As sound editing newbies, we found Audacity to be pretty challenging from a first-time user perspective – and on the whole, our initial cut lacked cohesion in terms of volume, tempo and, well, pretty much everything.

So, for our final Comm Lab class, Patricia and I decided to give our song another shot. We’ve added an intro, reverb, an extra beat and more… and we’ve changed Patricia’s voice to that of a child.

A big THANK YOU to Matt Ganucheau, who provided us with some much-needed guidance along the way!

Clif Bar Robbery, new and improved! (parental guidance, or possibly earplugs, may be necessary: profanities aplenty)

Beverage cabinet update!

Our interactive beverage cabinet is coming along… slowly! Alex and I have been working on the Arduino code for the past ten days or so, but we’ve hit a number of obstacles along the way and are still working through the kinks.

The cabinet has five switches, one in each compartment. When a bottle is removed, a switch is flipped in that compartment, triggering a music track and light sequence corresponding to that bottle.

Initially, we figured that the code would be relatively straightforward – it’s just five switches, a few lights and some tunes, right? Well, yes, but using multiple switches to control multiple light sequences through the MIDI dimmer was a little trickier than anticipated.

Our first step was getting the MIDI dimmer to communicate with the Arduino. We tested this using Rory Nugent’s test code.

Next, we programmed the dimmer to begin the light sequence when a switch is turned on, and stop the sequence when the switch is turned off, without using delays (which cause problems because the Arduino stops everything else while it waits).

When we tested the code using the lightGo() and lightHalt() functions, the light turned on as hoped when the switch was pressed, but didn’t begin the incremental dimming up and down.

We sought advice and learned that the parts of the code involving (switchState != lastSwitchState) and if (millis() - previousMillis > interval) were the source of our issues. So, we added arrays for the brightness values, as well as switchState, lastSwitchState, previousMillis and interval variables for each switch, ending up with the following code:

MIDI Dimmer code

int midiPin = 1;   //  digital output pin for the midi dimmer
int lastSwitchState[5]; // previous state of the switch
int lightToggle[5]; //DOES THIS NEED TO BE AN ARRAY? Variable for toggling the light at max and min values
int switchState[5];         // current state of each switch
int lastswitchState[5];     // previous state of each switch
int previousMillis[5];      //DOES THIS NEED TO BE LONG?
int brightness[5];          // array of brightnesses for each light
int switchPins[] = {
2,3,4,5,6};  // digital input array to hold the switch pin numbers

// brightness goes from 1-127…not sure what goes in the curly brackets below
//int brightness[] = { ? }

void setup() {
for (int thisChannel = 0; thisChannel < 5; thisChannel++) {
//initialize switch pins as inputs
pinMode(switchPins[thisChannel], INPUT);
pinMode(midiPin, OUTPUT); //initialize midi dimmer as output
Serial.begin(31250); // set MIDI baud rate
}
}

void loop() {
for (int thisChannel = 0; thisChannel < 5; thisChannel++){
// read the switch:
switchState[thisChannel] = digitalRead(switchPins[thisChannel]);
//if the switch has changed,
// compare the switchState to its previous state
if (switchState[thisChannel] != lastswitchState[thisChannel]) {
// if the switch has changed from off to on:
if (switchState[thisChannel] == HIGH) {
// if the current state is HIGH then the switch
// went from off to on:
lightGo(thisChannel, 10); //REPLACE 10 with “interval”
}
else {  // else the switch changed, going from on to off:
lightGo(thisChannel, 3); //REPLACE 10 with “interval”
}
// save the current state as the last state,
//for next time through the loop
lastswitchState[thisChannel] = switchState[thisChannel];
}
else {  // else the switch didn’t change.
// dim all the light channels:
brightness[thisChannel]--;
lightGo(brightness[thisChannel], 30);

}
}

/*
//if the switch is off:
if (switchState == LOW) {
//call the lightHalt function
lightHalt (0);
//lightHalt (1);
//lightHalt (2);
}
*/
}

// data1 should be from 0-5, and tells the dimmer which light group
// data2 should be from 0-127 and represents the brightness
void noteOn(char cmd, char data1, char data2) {
Serial.write(cmd);
Serial.write(data1);
Serial.write(data2);
}

void lightGo (int thisChannel, long interval) { //
if ((millis() - previousMillis[thisChannel]) > interval) {
//if the light is on at all
if (brightness[thisChannel] == 127) {
lightToggle[thisChannel] = 0;
// Serial.println(“here”);
}
//if light is off
else if (brightness[thisChannel] == 0) {
lightToggle[thisChannel] = 1;
}

if (lightToggle[thisChannel]==1) {
brightness[thisChannel]++;
}
else {
brightness[thisChannel]--;
}
//turn on the light
noteOn(0x90, thisChannel, brightness[thisChannel]);
// save the last time you blinked the light
previousMillis[thisChannel] = millis();
}
}

void lightHalt(int thisChannel) {
//if the switch is off, turn off the light:
noteOn(0x90, thisChannel, 0);
//save the last time the light blinked:
previousMillis[thisChannel] = millis();
}

We’re still working on finishing this code and ironing out the remaining problems. In the meantime, we’ve hard-coded everything in the hope of getting a working demo.

MP3 Trigger
The MP3 Trigger gave us two ways to play MP3 files from the Arduino: we could either use the 7 trigger pins on the board to directly trigger 7 pre-selected tracks, or use serial communication from the Arduino to trigger any track on the board remotely, with the added advantage of volume control. We opted for the latter – however, the MIDI dimmer was already using the hardware serial port, so we got the MP3 Trigger working over software serial, as sketched below.
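
In rough outline, the software serial hookup looks like the sketch below – the pin numbers are placeholders, and the baud rate and the 't' + track-number command come from my reading of the MP3 Trigger manual, so double-check them against your board revision:

#include <SoftwareSerial.h>

const int mp3RxPin = 7;            // Arduino pin wired to the Trigger's TX
const int mp3TxPin = 8;            // Arduino pin wired to the Trigger's RX
SoftwareSerial mp3(mp3RxPin, mp3TxPin);

void setup() {
  mp3.begin(38400);                // the MP3 Trigger's default baud rate
}

void playTrack(byte trackNumber) {
  mp3.write('t');                  // "trigger track" command byte
  mp3.write(trackNumber);          // plays TRACK001.mp3, TRACK002.mp3, ...
}

void loop() {
  playTrack(1);                    // e.g. the track paired with the first bottle
  delay(10000);                    // crude spacing, just for this demo
}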

We loaded our tracks onto the MP3 Trigger, and used the XX library to link the songs to specific switches.

Switches = Wicked Witches!
We tried out one switch to see if the music and lights would work together and after a bit of troubleshooting, it worked! However, when we tried the cabinet with more than one switch, everything went a bit crazy.

And that’s where we are right now. We’ve worked our posteriors off, we’ve learned a TON, and we’re pretty certain that we’re almost there.

Images/video of our progress shall be forthcoming. But first, sleep. And breakfast.

Pcomp final: Interactive Liquor Cabinet

Over the past few weeks, Alex Vessels and I have brainstormed on the floor at ITP, in the East Village, Chinatown and everywhere in between. Somewhere along the way, we settled upon the perfect form factor for our party device: an interactive liquor cabinet that will create a different type of party mood, depending on which liquor is selected.

We’re using an existing antique cabinet with five compartments, which will light up when the cabinet is opened. There will be one object placed in each compartment. When an item is removed from the cabinet, a song corresponding to that object will play, and a light sequence will kick in. For example, take out the bottle of Jameson, and you’ll hear the Pogues and be dazzled by a green light display. Or select the Jim Beam, and you’ll hear Guns ‘n’ Roses and be dazzled by red and blue flashing lights.

We’re experimenting with building various types of switches to place on the shelves of the cabinet, and we’re currently learning to use the MIDI dimmer from the ER to control AC lights. For now, we’ll use the Processing Minim library and a laptop to control the music – eventually, we’ll probably switch to a Wave Shield, which can be hidden inside the cabinet (eliminating the laptop from the equation).
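
For the interim Minim setup, the playback side is only a few lines – something like the sketch below, where the file names and the bottleRemoved() hook are placeholders for our actual switch-handling code:

import ddf.minim.*;

Minim minim;
AudioPlayer[] songs = new AudioPlayer[5];
String[] files = { "jameson.mp3", "jimbeam.mp3", "tequila.mp3", "vodka.mp3", "gin.mp3" };

void setup() {
  minim = new Minim(this);
  for (int i = 0; i < songs.length; i++) {
    songs[i] = minim.loadFile(files[i]);   // MP3s live in the sketch's data folder
  }
}

void draw() {
  // nothing to draw; playback is driven by the switches
}

// hypothetical hook: called when compartment i reports that its bottle was removed
void bottleRemoved(int i) {
  songs[i].rewind();
  songs[i].play();
}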

Below is a photo of the cabinet as is. We’re hoping to keep the exterior appearance intact and preserve the old-time aesthetic.

Cabinet with the door open