Self Portrait

While listening to Ross’s thunderous snore on our seven-hour flight to Amsterdam, I decided that I might as well start working on my IM project, since sleeping was already out of the question.

My portrait begins with a black background that sort of resembles a dark room. Once you click the switch, the “lights” turn on and the actual portrait is revealed. I added some interactions to my portrait to make things a bit more interesting.

Here’s a video demonstrating how it works:

My code:

int clickedX;
int clickedY;
PImage img;

void setup() {
  size(800, 800);
  background(0, 0, 0);
  img = loadImage("switch.jpg");
  image(img, 300, 300, 200, 200);
}

void draw() {
  // only draw the portrait once the light switch has been clicked
  if (clickedX > 300 && clickedX < 500 && clickedY > 300 && clickedY < 500) {
    background(255, 150, 150);
    noStroke();

    //face
    fill(224, 172, 105);
    rect(200, 150, 400, 500, 70);

    //eyebrows
    strokeWeight(20);
    stroke(0, 0, 0);
    line(250, 250, 290, 230);
    line(290, 230, 380, 250);
    line(420, 250, 510, 230);
    line(510, 230, 550, 250);

    //eyes
    noStroke();
    fill(255);
    arc(320, 300, 100, 70, 0, 2*PI);
    arc(480, 300, 100, 70, 0, 2*PI);
    fill(25, 51, 0);
    arc(322, 300, 60, 60, 0, 2*PI);
    arc(482, 300, 60, 60, 0, 2*PI);
    fill(0, 0, 0);
    arc(322.5, 300, 40, 40, 0, 2*PI);
    arc(482.5, 300, 40, 40, 0, 2*PI);

    //sunglasses
    if (mouseX > 250 && mouseX < 550 && mouseY > 250 && mouseY < 340) {
      noStroke();
      fill(0, 0, 0);
      rect(250, 255, 140, 100, 30);
      rect(410, 255, 140, 100, 30);
      rect(300, 280, 150, 20, 30);
      rect(200, 280, 150, 20, 30);
      rect(540, 280, 60, 20, 30);
    }

    //nose
    strokeWeight(8);
    stroke(25, 51, 0);
    line(400, 350, 430, 450);
    line(400, 450, 430, 450);

    //mouth
    strokeWeight(12);
    stroke(139, 0, 0);
    bezier(375, 550, 390, 555, 400, 560, 425, 560);

    //ears
    noStroke();
    fill(224, 172, 105);
    arc(200, 350, 80, 100, PI/2, 3*PI/2);
    arc(600, 350, 80, 100, -PI/2, PI/2);
    fill(25, 51, 0);
    arc(200, 350, 40, 60, PI/2, 3*PI/2);
    arc(600, 350, 40, 60, -PI/2, PI/2);

    //hair
    strokeWeight(8);
    stroke(0, 0, 0);
    bezier(210, 170, 213, 150, 217, 130, 240, 80);
    bezier(240, 170, 243, 150, 247, 130, 270, 80);
    bezier(270, 170, 273, 150, 277, 130, 300, 80);
    bezier(300, 170, 303, 150, 307, 130, 330, 80);
    bezier(330, 170, 333, 150, 337, 130, 360, 80);
    bezier(360, 170, 363, 150, 367, 130, 390, 80);
    bezier(390, 170, 393, 150, 397, 130, 420, 80);
    bezier(420, 170, 423, 150, 427, 130, 450, 80);
    bezier(450, 170, 453, 150, 457, 130, 480, 80);
    bezier(480, 170, 483, 150, 487, 130, 510, 80);
    bezier(510, 170, 513, 150, 517, 130, 540, 80);
    bezier(540, 170, 543, 150, 547, 130, 570, 80);
    bezier(570, 170, 573, 150, 577, 130, 600, 80);

    //chin
    strokeWeight(4);
    stroke(25, 51, 0);
    noFill();
    bezier(360, 635, 375, 660, 385, 660, 400, 635);
    bezier(400, 635, 415, 660, 425, 660, 440, 635);
  }

  //mustache
  if (mouseX > 370 && mouseX < 430 && mouseY > 545 && mouseY < 565) {
    //noStroke();
    //fill(0, 0, 0);
    //rect(350, 500, 100, 20);
    strokeWeight(10);
    stroke(0, 0, 0);
    noFill();
    bezier(300, 480, 310, 505, 370, 508, 380, 510);
    bezier(420, 510, 430, 508, 490, 505, 500, 480);
  }

  //ears
  //noStroke();
  //fill(224, 172, 105);
  //arc(200, 350, 80, 100, PI/2, 3*PI/2);
  //arc(600, 350, 80, 100, -PI/2, PI/2);
  //fill(25, 51, 0);
  //arc(200, 350, 40, 60, PI/2, 3*PI/2);
  //arc(600, 350, 40, 60, -PI/2, PI/2);

  //left airpod
  if (mouseX > 120 && mouseX < 200 && mouseY > 350 && mouseY < 430) {
    noStroke();
    fill(255, 255, 255);
    rect(180, 340, 15, 70, 6);
    rect(185, 340, 15, 20, 6);
    rect(605, 340, 15, 70, 6);
    rect(600, 340, 15, 20, 6);
  }

  //right airpod
  if (mouseX > 600 && mouseX < 680 && mouseY > 350 && mouseY < 430) {
    noStroke();
    fill(255, 255, 255);
    rect(180, 340, 15, 70, 6);
    rect(185, 340, 15, 20, 6);
    rect(605, 340, 15, 70, 6);
    rect(600, 340, 15, 20, 6);
  }

  //moving eyebrows left
  if (mouseX > 250 && mouseX < 550 && mouseY > 230 && mouseY < 250) {
    strokeWeight(22);
    stroke(224, 172, 105);
    line(250, 250, 290, 230);
    line(290, 230, 380, 250);
    line(420, 250, 510, 230);
    line(510, 230, 550, 250);
    strokeWeight(20);
    stroke(0, 0, 0);
    line(250, 220, 290, 200);
    line(290, 200, 380, 220);
    line(420, 220, 510, 200);
    line(510, 200, 550, 220);
  }
}

void mouseClicked() {
  // remember where the last click happened so draw() can check the switch
  clickedX = mouseX;
  clickedY = mouseY;
}

Physical Computing’s Greatest Hits and Misses Response

Taking an already existing idea and building onto it is great. Just look at the world’s most valuable tech company, Apple. Apple has been lagging behind in terms of groundbreaking innovation in recent years (despite the wild claims repeated a hundred times over during their keynotes). Yet their devices still sell well enough to keep them atop that list. Why buy Apple’s products when you can just as easily spend your money on something that’s actually fresh and new? For many of us, the answer is simple: Apple’s products are designed and manufactured to such a high level of reliability that we would rather be a few months or years late to a technology than have it now and have it break in a couple of days. Apple knows this, and they are not afraid to keep taking existing ideas, perfecting them, and spicing them up with a cool, minimalistic Apple design.

Making Interactive Art Response

The author/artist can try to incorporate his/her intention when creating the piece. Whether or not that intention translates into reality is completely up to the audience/user. Without the aid of a forceful “user manual”, it is almost impossible for the author to accurately convey his/her intentions to all of his/her audience, especially since artists are notoriously bad at portraying their message clearly. What happens then is that users make wild guesses and end up interpreting the piece in a way that was not aligned with the author’s intentions… and that is how its true utility and/or significance is born. As authors/artists, we should strive to be open to letting our work go once we’ve completed it. Let it stretch out its wings and explore the cruel realms of humanity. Let it discover its true purpose and be whatever it is destined to be. Perhaps its destiny is far more remarkable than the one you envisioned for it.

Human Motion Simulation

Initially, I wanted to create a slapping machine that would whack us across the face with a palm-shaped cardboard cutout. But that seemed a bit too dark, so I opted to make a leg that would simulate a kicking motion.

I used a few pieces of acrylic cut into rectangles to form the actual leg, and a standard servo motor to generate the motion. The motion is triggered by a button. To finish things off, I added some lovely decorations to bring the device to life.

My code:

#include <Servo.h>

Servo myServo;

const int servoPin = 5;
const int buttonPin = 3;

bool buttonState = LOW;
bool prevButtonState;

void setup() {
  pinMode(buttonPin, INPUT);
  myServo.attach(servoPin);
  Serial.begin(9600);
}

void loop() {
  buttonState = digitalRead(buttonPin);
  Serial.print(buttonState);

  // kick only on the rising edge, i.e. the moment the button is pressed
  if (buttonState == HIGH && prevButtonState == LOW) {
    myServo.write(30);    // move the leg to 30 degrees
    delay(500);
    myServo.write(160);   // then swing it to 160 degrees
    delay(100);
  }

  prevButtonState = buttonState;
}
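The rising-edge check above keeps the leg from kicking continuously while the button is held, but a bare pushbutton can still bounce and occasionally double-trigger. A minimal way I could guard against that is to ignore edges that arrive too soon after the last kick. Below is a sketch of that idea, reusing the globals and setup() from above; the 50 ms window is an assumed value that would need tuning:

// Hypothetical debounce guard, not part of the sketch above.
unsigned long lastKick = 0;
const unsigned long debounceMs = 50;  // assumed bounce window

void loop() {
  buttonState = digitalRead(buttonPin);
  // accept a rising edge only if enough time has passed since the last kick
  if (buttonState == HIGH && prevButtonState == LOW && millis() - lastKick > debounceMs) {
    lastKick = millis();
    myServo.write(30);
    delay(500);
    myServo.write(160);
    delay(100);
  }
  prevButtonState = buttonState;
}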

Device Demo:

Stupid Pet Trick

Without a decent idea, and having wasted a lot of time messing around with a handful of different motors, I decided to go on a little getaway to Yas Mall with Ross. As we stopped by Daiso to pick up some bubble solution, I spotted this little solar-panel-powered plastic plant (SPPPP) with leaves that would bob up and down if given a strong enough light source. I figured then and there that I could create some sort of mechanical plant that would simulate the behavior of the actual plant. The sunflower, with its tendency to follow the sun throughout the day, seemed like as good a choice as any.

I began by mapping out the algorithm and writing it into code. The logic behind my device is simple: calculate the difference between the readings from two LDRs placed on opposite ends of the panel, then tilt the panel toward one end or the other depending on the sign and size of that difference.

My code:

#include <Servo.h>

Servo myServo;

const int servoPin = 5;
const int photoResistor1 = A0;
const int photoResistor2 = A1;

int prReading1;
int prReading2;
int readingDiff;
int servoPos = 90;

void setup() {
  myServo.attach(servoPin);
  Serial.begin(9600);
}

void loop() {
  // the +79 is a calibration offset so the two LDRs read roughly the same under equal light
  prReading1 = (analogRead(photoResistor1) + 79);
  Serial.print(prReading1);
  Serial.print("    ");
  prReading2 = analogRead(photoResistor2);
  Serial.println(prReading2);
  readingDiff = prReading1 - prReading2;

  // if light is very dim, return to starting position.
  if (prReading1 < 30 && prReading2 < 30) {
    servoPos = 180;
    myServo.write(servoPos);
  }

  // when reader1 > reader2, tilt towards sensor 1
  if (readingDiff > 10) {
    if (servoPos < 180) {
      servoPos++;
      myServo.write(servoPos);
    }
  }

  // when reader1 < reader2, tilt towards sensor 2
  if (readingDiff < -10) {
    if (servoPos > 0) {
      servoPos--;
      myServo.write(servoPos);
    }
  }
}
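One quirk with raw analogRead() values is that they jitter, which can make the panel twitch back and forth around the ±10 dead band. If that became a problem, I could average a few samples per sensor before taking the difference. A rough sketch, reusing the pin constants above; the sample count of 8 is an arbitrary choice:

// Hypothetical smoothing helper, not part of the code above.
int smoothRead(int pin) {
  long sum = 0;
  for (int i = 0; i < 8; i++) {  // average 8 quick samples to reduce jitter
    sum += analogRead(pin);
  }
  return sum / 8;
}

// In loop(), the readings would then become:
// prReading1 = smoothRead(photoResistor1) + 79;
// prReading2 = smoothRead(photoResistor2);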

With the code completed, I built a prototype out of cardboard (stand, panel, servo arm) and wire (pivot, axle). The design was sound until I had to figure out how to replicate it in acrylic/wood. The wire pivot no longer worked the way it did with cardboard, since acrylic is a much harder, more solid material that simply does not allow the kind of penetration cardboard does. I ended up redesigning the pivot/axle by laser-cutting holes in the panel and stand and installing a bolt/screw as the axle.

After putting the device together, I realized that the white acrylic panels coupled with the metallic bolt/screw were actually very aesthetically pleasing. Instead of covering it with a sunflower decal, I decided to just leave it as it was. A practical application for this device would be letting solar panels turn to face the sun throughout the day, maximizing the amount of energy they can harvest.

The following video is a demonstration of the device in action:

Responses: A Brief Rant on the Future of Interaction Design Response

Touch screens are actually great — for now. They allow for immediate, responsive, and efficient interactions with our smartphones. In recent years, companies have been trying to move towards voice control as an alternative or supplemental mode of input. Though the technology has been available for years, voice control still does not feel as convenient, efficient, or useful as touch screens. Sure, it’s lovely to be able to holler at Siri and ask her to read you your notifications while you’re in the shower. However, I still do not see voice control becoming something that users could fully integrate and replace touch gestures with. For one thing, it’s definitely not as discreet or private as touch, and that is a problem in public. Until the next great advance in interactive technology takes place, the touch screen is probably the best option we have.

A Brief Rant on the Future of Interaction Design Response

The future depicted in the video is cool, but not because of the technology exhibited in it. What attracted me most was the bezel-less, minimalistic, ultra-modern design, which I found very aesthetically pleasing. I strongly agree with the author’s observation that the featured technology really is a rather small increment in the functionality department. The design makes everything look fresh and much more interesting, but if you look closely, we’re essentially doing the exact same things we’ve been doing all this time. Victor’s point that we should pay more attention to our human capabilities (beyond just our hands) and go on from there is definitely a good start. Perhaps one day in the near future, our “smart phones” will cease to be handheld devices. Perhaps the display and controls could be seamlessly integrated into our natural vision via high-tech contact lenses — just like those featured in season 1, episode 3 of Black Mirror.

Emotion & Design Response

I’m going to start off by saying that I really quite liked this piece. Since middle school, I’ve always been an adamant believer in the concept: Attractive Things Work Better. In fact, I’d like to take it one step further by saying: Using beautiful things may even make the user more productive.

Most people believe that within a certain price range, the item with the best technical specifications and the most functions is the one with the best value. The aesthetic design, compatibility (with software and other accessories or products), and other less “tangible” qualities are often overlooked. In my opinion, these less concrete elements play just as important a role as the tech specs in creating the best possible user experience, contributing significantly to an item’s “value”, though in a less tangible way. As Norman states in the second-to-last paragraph, “good design means that beauty and usability are in balance”.

Take Apple as an example. Many Apple products, such as the MacBook Pro or the iPhone, do not have technical components that come close to rivaling those of similarly priced products from other brands. Yet many people (although still a minority) choose Apple’s products because of their soft qualities (sleek design, interconnectivity within the Apple ecosystem, and so on). Throughout the years, Apple has stuck by its core values, putting as much emphasis on design and software as on hardware. Yes, the company has been vehemently criticized for subpar specs, but it has also managed to garner a loyal customer base that appreciates the soft qualities of its products. Seeing how Apple is currently still the world’s most profitable company, I’d say it has worked out pretty well for them.

The Design of Everyday Things Response

An interesting and informative piece. The section discussing the concept of human-centered design resonated strongly with an important bit of advice Aaron gave us after the first assignment: always consider the audience/customer/user and all the possible ways they might use your product when designing it, because chances are many of them will not read the manual or use the item the way we imagined when we created it. I agree that the best way to design something is to put the audience’s needs, capabilities, and behavior first, and then design to accommodate all of them.

The TherMOMeter 2000

People never really know if their fever is bad enough to get them out of bed to go see a doctor. This often causes them to miss the optimal diagnosis and recovery window. But don’t fear just yet! Introducing the all-new, revolutionary TherMOMeter 2000! This intelligent device takes your temperature in a matter of seconds and reminds you that you need to go see a doctor right away with an alarming red light — just like a mom would!

The hue alternates between blue (you’re cold), green (you’re fine), and red (you have a fever).
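To give a sense of the logic (the actual code is in the screenshot at the end of this post), here is a rough sketch of how the color switching might look. It assumes a TMP36-style analog temperature sensor on A0, a common-cathode RGB LED on pins 9, 10, and 11, and made-up temperature cut-offs; none of these values are taken from my real code.

// Hypothetical sketch, not the code shown in the screenshot below.
const int tempPin = A0;   // assumed TMP36-style analog temperature sensor
const int redPin = 9;     // assumed RGB LED pins
const int greenPin = 10;
const int bluePin = 11;

void setup() {
  pinMode(redPin, OUTPUT);
  pinMode(greenPin, OUTPUT);
  pinMode(bluePin, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  // TMP36: 10 mV per degree C, with a 500 mV offset at 0 degrees
  float voltage = analogRead(tempPin) * (5.0 / 1023.0);
  float tempC = (voltage - 0.5) * 100.0;
  Serial.println(tempC);

  if (tempC < 35.0) {             // assumed cut-off: you're cold
    digitalWrite(redPin, LOW);
    digitalWrite(greenPin, LOW);
    digitalWrite(bluePin, HIGH);  // blue
  } else if (tempC < 37.5) {      // assumed cut-off: you're fine
    digitalWrite(redPin, LOW);
    digitalWrite(greenPin, HIGH); // green
    digitalWrite(bluePin, LOW);
  } else {                        // you have a fever
    digitalWrite(redPin, HIGH);   // red
    digitalWrite(greenPin, LOW);
    digitalWrite(bluePin, LOW);
  }
  delay(200);
}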

Here is a simple demonstration of the TherMOMeter in action:

Before settling on the TherMOMeter, I experimented with many different sensors, including a pressure sensor, a photoresistor, and a sound sensor. I ultimately opted for the temperature sensor because it was very responsive and adorably compact. The SparkFun website offered a lot of insight into how to properly install the sensor on the circuit board. Installing the sensor backwards (incorrectly) actually caused it to heat up!

A screenshot of the code I used is attached below: