
A Message from the Future: final project documentation

Today I am grateful that I've been documenting the process for the last three weeks (see progress reports part 1 and part 2 for full details), because a lot happened to finalize the project between the last update on 12/5 and my final ICM presentation on 12/6, and again between then and my final PComp presentation on 12/12.


Final video here and after the jump!


Refining code for narrative experience

(process update 2 has a description of the actual narrative experience)

My biggest challenge was making the project "invisible," so to speak. That is, since the interface is a phone, I wanted people to experience a phone as they would expect to, not to see an ITP project. "Uncanny, not unfamiliar" has been the mantra.

a picture of a corded desk phone, with a surprising experience coded inside
Award for "Most Unassuming First Year Project, ITP Winter 2018"

I was able to make the IVR (interactive voice response) system seamless by creating a boolean variable that logs whether a sound file has been launched. While one sound file is playing, all others are disabled except for the DTMF keypad sounds, which prevents the messages and menu options from being interrupted or launched on top of each other. I kept the keypad sounds to keep the phone "alive," so to speak; one expects sonic feedback from the buttons even while listening to the menu or a message. Choosing an option from the menu cuts off the menu sound file; pressing zero relaunches the original menu but skips the introduction and jumps straight to the itemized options, so people can re-listen to their options if they missed something, or listen to another message once they've finished the experience. I created a second boolean that, when the phone is hung up, kills all of the sound files. Invisibility accomplished, in time for the ICM presentation.

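For anyone curious, the gist of that gatekeeping logic in p5.js looks something like the simplified sketch below. The file names, the key mapping, and the hangUp() hook are placeholders standing in for what's actually in my sketch (linked at the bottom of this post), and here the keypad is simulated with keyPressed() instead of the phone's real buttons.

let menuSound, message1, keySound;
let soundPlaying = false; // true while a message is underway

function preload() {
  menuSound = loadSound('assets/menu.mp3');
  message1 = loadSound('assets/message1.mp3');
  keySound = loadSound('assets/dtmf.mp3');
}

function keyPressed() {
  keySound.play(); // keypad feedback is always allowed, even over other audio

  if (key === '1' && !soundPlaying) {
    menuSound.stop();                   // choosing an option cuts the menu...
    message1.play();                    // ...and launches that message exactly once
    soundPlaying = true;
    message1.onended(() => { soundPlaying = false; });
  } else if (key === '0' && !soundPlaying) {
    menuSound.stop();
    menuSound.play(0, 1, 1, 8);         // cueStart of 8 s skips the intro, straight to the options
  }
}

function hangUp() {
  // called when the handset goes back on the hook: kill everything
  menuSound.stop();
  message1.stop();
  soundPlaying = false;
}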

Put a Ring on It, Getting Physical

Pop reference humor aside, this last 10% of the project was days of roadblocks, frustration, and tears. Here were the two issues:

1. I was originally using a sound file for the ring, an option that became untenable once I started using the handset earpiece as the speaker for the narrative experience: a ring played through an earpiece sitting on the hook would not be heard.

2. Because the ringing is triggered by proximity, a lot of conditions needed to be addressed: keep the phone from ringing during the experience, stop the ringing if someone triggers it but doesn't pick up the phone, and keep the phone from ringing again the moment a person hangs up but is still standing in front of it (as one tends to be when attached to the phone/interaction by a cord).


The solution for number one was going physical. With Tom's guidance, I implemented a vibration motor to ring the phone's original bell, and I moved from an ultrasonic proximity sensor to an infrared proximity sensor, which simplified the code and was much less erratic. I'll probably never use an ultrasonic proximity sensor again. Mounting the bell was a headache, but I figured it out. I'd still like to move to the original ringer driver if possible, though, since the current ring is unstable and, to be honest, a bit unsatisfying.

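For a sense of how much simpler the IR sensor made things: instead of timing ultrasonic pulses, the read is basically one analogRead() against a threshold. A minimal version (the pin and threshold here are placeholders, and the right threshold depends on the particular sensor) looks something like this:

const int irPin = A0;                  // analog IR proximity sensor
const int proximityThreshold = 400;    // tune by watching the Serial Monitor

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(irPin);     // for many analog IR rangers, closer = higher reading
  bool someoneNear = (reading > proximityThreshold);
  Serial.print(reading);
  Serial.print('\t');
  Serial.println(someoneNear ? "near" : "far");
  delay(50);                           // slow the readings down enough to watch
}
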
Vibration motor and metal piece for bell, new IR sensor

The solution for number two was creating a switch-case state machine. This is where the lost days went. Tom tried to explain it to me. Seho tried to explain it to me. I read through about thirty other examples, and it just would not stick. Finally Allison walked me through the logic (where does the original state get defined? How do we move from one case to another? Why is everything nuts?) and it clicked! Mapping out the states was extraordinarily helpful, and it gave me a place to consider all the interactive possibilities (what should happen if the ring is triggered and the phone is answered vs. not answered? What should happen while someone is on the phone? What should happen when the phone is hung up?). I'm definitely missing some possibilities in the state machine (e.g., what happens if one person picks up the phone, hands the receiver to someone else, and then walks away?), but I've covered the basics for invisibility.

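To make that concrete, here's a stripped-down version of the kind of switch-case state machine I landed on. The state names, pins, and thresholds below are illustrative rather than my exact code (the real Arduino sketch is linked below), but the shape is the same: the original state is defined once when the variable is declared, and each case decides when to move to another one.

const int irPin = A0;                  // infrared proximity sensor
const int hookPin = 2;                 // hook switch: LOW when the handset is lifted (INPUT_PULLUP)
const int proximityThreshold = 400;

enum PhoneState { IDLE, RINGING, IN_CALL, COOLDOWN };
PhoneState state = IDLE;               // this is where the original state gets defined
unsigned long stateStart = 0;          // when we entered the current state

void setup() {
  pinMode(hookPin, INPUT_PULLUP);
}

void loop() {
  bool someoneNear = (analogRead(irPin) > proximityThreshold);
  bool handsetUp = (digitalRead(hookPin) == LOW);

  switch (state) {
    case IDLE:
      if (someoneNear) {               // a visitor walks up: start ringing
        state = RINGING;
        stateStart = millis();
      }
      break;

    case RINGING:
      // pulse the vibration motor here (see the bell function further down)
      if (handsetUp) {                 // answered: the p5 experience takes over
        state = IN_CALL;
      } else if (millis() - stateStart > 20000) {
        state = COOLDOWN;              // nobody picked up: give up after 20 seconds
        stateStart = millis();
      }
      break;

    case IN_CALL:
      if (!handsetUp) {                // hung up: don't ring again right away
        state = COOLDOWN;
        stateStart = millis();
      }
      break;

    case COOLDOWN:
      if (millis() - stateStart > 10000) {
        state = IDLE;                  // after 10 seconds the phone is allowed to ring again
      }
      break;
  }
}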

Coding for the physical bell (vibration motor) wasn't too bad. I created a function for the vibration bell that riffs off of Tom's BlinkWithoutDelay tutorial, and I added edge detection for the hook switch so that a boolean tracks whether the phone has been picked up or hung up, which helps define the cases for the phone ring. I don't think I'm explaining that well, but I'm still learning the language; hopefully the notes in the code will help convey it. I also created countdowns for the ringer so it stops ringing after 20 seconds if someone doesn't pick up, as well as a countdown that completely stops the Arduino code for 10 seconds after the phone is hung up, to keep the ringer from being triggered again immediately.

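Since the prose above is muddy, here's roughly what I mean by the bell function and the edge detection, again with placeholder pin numbers and timings rather than my exact values. The motor toggles on a millis() timer instead of delay() (the BlinkWithoutDelay idea), so the rest of the loop keeps running, and the hook switch only updates the handsetUp boolean when its reading actually changes.

const int motorPin = 9;                // drives the vibration motor (through a transistor)
const int hookPin = 2;                 // hook switch: LOW when the handset is lifted (INPUT_PULLUP)

bool handsetUp = false;                // the boolean the ring cases rely on
int lastHookReading = HIGH;            // previous raw reading, for edge detection

unsigned long lastToggle = 0;          // last time the motor was switched
const unsigned long ringInterval = 50; // ms between motor on/off toggles
bool motorOn = false;

void setup() {
  pinMode(motorPin, OUTPUT);
  pinMode(hookPin, INPUT_PULLUP);
}

void loop() {
  readHook();
  // Demo behavior: ring for the first 20 seconds unless the handset is lifted.
  // In the real sketch this is gated by the RINGING state instead.
  bool ringTimeUp = (millis() > 20000);
  if (!handsetUp && !ringTimeUp) {
    ringBell();
  } else {
    digitalWrite(motorPin, LOW);       // make sure the motor is off otherwise
  }
}

// Pulse the motor without blocking, BlinkWithoutDelay-style.
void ringBell() {
  if (millis() - lastToggle >= ringInterval) {
    lastToggle = millis();
    motorOn = !motorOn;
    digitalWrite(motorPin, motorOn ? HIGH : LOW);
  }
}

// Edge detection: only act when the hook switch reading changes.
void readHook() {
  int reading = digitalRead(hookPin);
  if (reading != lastHookReading) {
    handsetUp = (reading == LOW);      // LOW = handset lifted with INPUT_PULLUP
    lastHookReading = reading;
  }
}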

Video Documentation & Code

Without further ado:


Arduino code

p5 code (https://editor.p5js.org/medusamachina/sketches/r19_7GL1N)

Circuit Diagrams

tktktk


Credit and Inspiration

Thank you Allison Parrish (ICM) and Tom Igoe (PComp). Additional help from Seho Bek, Hayley Hwang, Yeseul Song, Dominic Barrett, Danny Rozin. Inspiration (and loads of help due to amazing documentation) from Angela Perrone's The Museum of Funny Ladies. General emotional inspiration from Chino Kim's Mouth-to-Mouth Dating. Thank you to my classmates who occasionally peeked over my shoulder and pointed out errors/helped fill missing pieces.


Helpful links to be posted asap.


