Final Project – Sigh Machine

For our final project, we will create a sigh machine driven by personal emotions or social issues. The project will be a somewhat artistic mechanical machine moved by several servos. We expect the piece to become a spatial experience with projections and sound. We were inspired by Theo Jansen's projects and plan to invent mechanical ribs.

As for input data, we are considering Twitter text data, the human mouth, or the movement of the chest. The choice will depend on the final goal.

07. Midterm – Robot Clock


Robot Clock


Conceptual Thought

I wanted to make a robot clock operated by a mechanical system. To begin with, I sketched a rough system with three servos. The 1st and 2nd servos are the robot's arms that draw the numbers, and the 3rd servo provides a lifting function, because I wanted to separate drawing from waiting, much like lifting a pen in human handwriting.

To engineer the physical clock, I chose 3D printing for the parts that hold the three servos, because the pieces needed to fit together. So I designed two wings with holes for the servos.


3D Printing

I measured the physical parts from my Arduino kit, including the three servos, and then modeled the pieces in CAD using Rhino3D.


Putting Together

I assembled the parts and the three servos with screws, and then adjusted the servo angles to balance the mechanism.



I connected the three servos to digital pins 4, 3, and 2, respectively (Right = 4, Left = 3, Lifting = 2).
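Hardware aside, the servo-to-pin assignment above can be captured as a small lookup in plain C++ (on the Arduino, each pin would then be passed to `Servo::attach()`):

```cpp
// Servo roles and their Arduino digital pins (Right = 4, Left = 3, Lifting = 2).
enum ServoRole { RIGHT = 0, LEFT = 1, LIFT = 2 };

int pinFor(ServoRole role) {
    static const int pins[] = {4, 3, 2};  // indexed by ServoRole
    return pins[role];
}
```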


Time Library

I applied the Time library to draw the actual time, even though I failed to build a true real-time clock system. I adapted the example code to draw the time using the wings I made.
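A minimal sketch of the drawing logic, in plain C++: on the Arduino, the Time library's `hour()` and `minute()` would supply the values, which are plain parameters here.

```cpp
#include <vector>

// Split the current time HH:MM into the four digits the arms must draw,
// in drawing order (tens of hours, hours, tens of minutes, minutes).
std::vector<int> digitsToDraw(int hour, int minute) {
    return { hour / 10, hour % 10, minute / 10, minute % 10 };
}
```

Each returned digit would then be traced by the two arm servos while the third servo lifts between digits.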

05. Iteration & Testing / Wearable Physical Computing


Iteration & Testing

It was not easy to create a glove piano by adapting the digital piano I made last week. For the glove piano, I used the ADCTouch library to read analog pins as touch sensors, one per finger, so each of the five fingers can be played individually. I also stored a reference value for each pin and assigned the notes C through G.
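Hardware aside, the per-finger logic can be sketched in plain C++: an ADCTouch-style reading is compared against the stored reference for its pin, and each finger maps to one note from C4 to G4. The touch threshold here is a placeholder assumption, not a value from the original sketch.

```cpp
#include <cstddef>

// Note frequencies (Hz, rounded) for C4..G4, one per finger,
// matching the constants in Arduino's pitches.h.
const int kFingerNotes[5] = {262, 294, 330, 349, 392};  // C4 D4 E4 F4 G4

// A finger counts as "touched" when its reading rises far enough above
// its stored reference value. The threshold of 40 is a placeholder.
bool isTouched(int reading, int reference, int threshold = 40) {
    return reading - reference > threshold;
}

// The note to play for a touched finger (0..4), or 0 when it is not touched.
int noteForFinger(std::size_t finger, int reading, int reference) {
    return isTouched(reading, reference) ? kFingerNotes[finger] : 0;
}
```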



Wearable Physical Computing

During the last week, I researched people's needs, reading the books 'Design Thinking' and 'Field Guide to Human-Centered Design'. From these sources, I identified some social and individual issues that frequently affect blind people, missing children, and pets.

I would like to focus on creating a wearable device that helps visually impaired people, children, and pets interact with others, because they are at a disadvantage in interacting smoothly with people.



04. Digital Piano


Digital Piano

I created a digital piano that can be played with three buttons. Each button has its own note and is connected to digital pins 2, 3, and 4, respectively. Just as a piano sounds when you strike a key, my piano sounds when you press a button.

To begin with, I built a small digital piano with one note per button on the Arduino. The buttons are connected to digital pins 2, 3, and 4, mapped to NOTE_E4, NOTE_D4, and NOTE_C4, respectively.

In setup(), I used a "for" loop to configure the button pins, because I wanted the sound to play while a button is pressed. Then, in loop(), if a signal reads HIGH (digitalRead(pin) == HIGH), I play that button's note on pin 8 with a duration of 20 [syntax: tone(pin, frequency, duration)].
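The button-to-note mapping can be sketched in plain C++ (on the Arduino, `digitalRead()` supplies the HIGH/LOW states and `tone(8, freq, 20)` plays the resulting frequency on the speaker pin):

```cpp
// Frequencies (Hz, rounded) matching Arduino's pitches.h constants.
const int NOTE_C4 = 262;
const int NOTE_D4 = 294;
const int NOTE_E4 = 330;

// The frequency to pass to tone() for a pressed button, or 0 for no note.
// Buttons on pins 2, 3, 4 play NOTE_E4, NOTE_D4, NOTE_C4, respectively.
int noteForPin(int pin) {
    switch (pin) {
        case 2: return NOTE_E4;
        case 3: return NOTE_D4;
        case 4: return NOTE_C4;
        default: return 0;  // not a button pin
    }
}
```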

Actually, I wanted to make a digital piano that could be played with a glove; if that were possible, we could play piano on a table without a physical instrument. Although I failed to make a playable glove piano, I was able to build a working digital piano with the Arduino.

03. Observation & Labs

Connected Worlds by design-io

Exhibits at the Hall of Science. Photo by David Handschuh

I visited Maker Faire and the Hall of Science last weekend. At the Hall of Science, I watched with interest as people enjoyed the Connected Worlds attraction, which is used by multiple visitors at once. The attraction is a large-scale immersive installation composed of six interactive ecosystems spread out across the walls and connected by a 3,000 sq ft interactive floor.

The attraction surrounds visitors with an interactive forest, and people can interact with the animals and objects using their hands. At first I assumed the interaction was driven simply by people's movements, such as jumping, running, and walking. In fact, the interactive experience is driven by gesture sensing with Microsoft Kinect. The devices hang from the ceiling, tracing people's behavior and linking it to the digital animals on the walls.

Exhibits at the Hall of Science. Photo by David Handschuh

However, it was difficult to support multiple interactions with several people at once, because each Kinect could only trace one person at a time. The system also could not offer people more specific interactions because of technical problems.

Despite these difficulties, the mode of interaction and the intuitive conversations it creates work really well in terms of usability. In particular, the simple interactions give children enjoyable experiences by responding to their behavior.

Exhibits at the Hall of Science. Photo by David Handschuh

The whole cycle takes almost ten minutes, offering various creatures and plants depending on the health of the environment.



I set up a basic digital circuit connected to the Arduino IDE. The five LEDs I added are controlled sequentially by the logic I wrote. Following the rules of digital output, I used pinMode() and digitalWrite(pin, HIGH/LOW) to drive the digital I/O circuit, and I added delays to turn the LEDs into an integrated performance.
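The sequencing logic can be sketched in plain C++ as a chase pattern: at each step exactly one of the five LEDs is on, cycling in order. On the Arduino, `digitalWrite()` and `delay()` would apply each step's states to the pins.

```cpp
#include <array>

const int kNumLeds = 5;

// LED on/off states at a given step: exactly one LED lit, cycling 0..4.
std::array<bool, kNumLeds> ledStates(int step) {
    std::array<bool, kNumLeds> states{};  // all off by default
    states[step % kNumLeds] = true;
    return states;
}
```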


01. What is interaction?

We now live in a connected world, built on advanced computer science and specialized languages. These languages enable us to interact with people, creatures, and artificial objects, so we can converse with them regardless of any language barrier. Although I agree with the author that interaction is a conversation, I would emphasize that interaction is "communication" between people and the objects or entities that provide the functionality of these actions in context.

I communicate with them constantly in the real world. I wake up at 8 o'clock by tapping the alarm button in an app, which automatically tells me today's weather. I commute to ITP after checking the PATH schedule on my iPhone, and Google Maps shows me a variety of ways to get there. At lunchtime, my friends and I order meals on Uber Eats while the plants I grow get enough water, light, and nutrients from a smart garden device. After class, I frequently work out at 404 Fitness and check my health condition with a wearable device. Before bedtime, Awair, a smart air-quality device, reports the air condition of my room, and I adjust it through the app to get a good night's sleep.

Those actions and behaviors are interactions that communicate not only with devices and services but also with artificial objects and creatures, connecting an unobservable health condition, imperceptible air quality, and inexpressible plants. In this process, communication runs on a "body language" called physical computing. Some creatures and objects express their conditions or opinions through movement; designed sensors and technologies immediately detect that movement and convert it into a specialized language that connects their communication.