hyperSense: Augmenting human experience in environments

Prof Dina El Zanfaly | Carnegie Mellon University, School of Design | Fall ‘20

9.3 // Intro

I’m a huge fan of popular culture, interactive media, design, and learning. In my free time, you’ll likely catch me watching a quirky comedy show (latest show rec: What We Do in the Shadows), running outside, or spending time with family.

path to design: bay area, fitbit, cmu

Prior to CMU, I was a project manager by day and a dance teacher by night. I’ve always been drawn to making, and decided to transition into design by coming back to school. I’ve been at CMU for two years now, and it’s been an awesome experience!

9.8 // Engaging the Body

Design an interaction that engages the body physically.

I was thinking about connection and/or play in remote contexts. Specifically, what are different ways to feel connected to the context of someone else? I thought about two options, which could be combined or stand alone: the first involves light + movement, the second sound + movement.

light + movement

9.15 // Arduino + Iterations

Arduino

SOS LED + Piezo Buzzer Activities

Interactions

3 ideas that materialize a certain interaction

I was thinking about two questions: (1) How might we feel connected to other people in remote spaces? (2) How might we feel connected to other remote environments?

My first two ideas build on and combine ideas I presented last week, when I was urged to think about the sense of touch and to incorporate different materials and textures. I was also encouraged to fold elements of my second idea, specifically sound, into my first.

Will be adding more.

Case Study Inspiration

9.22 // Refining Interaction + Arduino + Reading Tweets

Refining Interactions

Connected aquariums through proximity + sound wave detection // visualizing proximity + sound through water, plant movements

Two small aquarium objects that detect proximity and sound waves. When someone approaches the aquarium in space 1, the water/algae in space 2 starts to sway (similar to a seafloor), and vice versa. A person can come close and speak to the aquarium, which causes the aquarium in the second space to vibrate faster and its seafloor to move faster.

Arduino

button + ultrasonic sensor

Reading Tweets

Exploring the Reflective Potentialities of Personal Data builds upon the concept of slow technology to explore how making data more materially present and interactive can open up possibilities for reflective, memory-oriented experiences.

The Perception of the Environment explores the complex nature of temporality in which past, present, and future are intertwined amongst people and the environment.

9.29 // Refining Interactions + Phenomenology Reading Tweet

HMW brainstorm // concept ideation

Reading Tweet

Concept Progress

form + interaction

10.14 // Progress + Reading Tweet

storyboard credit: rachel!

Body to LED Mapping

Kinect — Max8 — Arduino — Adafruit Servos

Arduino Uno — Arduino EyeShield — LED Matrix

Webcam — Touchdesigner — Arduino LED Matrix

Arduino — Phototransmitter — LED Matrix

Kinect — Raspberry Pi — LED Matrix (what we’ll use)

Other Technology to Look Into

touchdesigner-arduino connection // ultrasonic sensor controlled LEDs based on distance

Reading Tweet: CYBORGS! I really enjoyed this reading. I was immediately reminded of Donna Haraway’s A Cyborg Manifesto and of Kara Platoni’s talk Transforming Perception One Sense at a Time. I appreciate that interdisciplinary groups of people are coming together to discuss long-term symbiotic man-machine futures. Also, I could not help but think about “trust” between human and machine: both trusting a machine that works in close coupling with the body and trusting it in matters of data.

November Updates // Deep diving into the technical

Because I’m remote, Dina sent me all of the hardware needed to prototype various interactions for our project (!!).

the hardware setup: pi, rgb bonnet, matrix, kinect, peripherals

Prototyping stages

Kinect to MacBook / Raspberry Pi via OpenKinect + OpenCV

  • Installations took some time but eventually worked with these two tutorials and some refinements: Installing OpenCV + Libfreenect Installation
kinect depth data to pi + snacks

REFINEMENTS for OpenCV to work with comp/matrix

Section 1: Installing Packages for OpenCV

do everything until step 6, then do:

sudo apt-get install libgtk2.0-dev

*rpi already has python 2 + 3 installed

Section 3: Compiling OpenCV on your Raspberry Pi

use this for step 2 instead (*adding GTK for UI elements):

cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D OPENCV_EXTRA_MODULES_PATH=~/opencv_contrib/modules \
-D ENABLE_NEON=ON \
-D WITH_GTK=ON \
-D ENABLE_VFPV3=ON \
-D BUILD_TESTS=OFF \
-D INSTALL_PYTHON_EXAMPLES=OFF \
-D OPENCV_ENABLE_NONFREE=ON \
-D CMAKE_SHARED_LINKER_FLAGS=-latomic \
-D BUILD_EXAMPLES=OFF ..

Section 5: Testing OpenCV on your Raspberry Pi

for step one, instead do (python 2, not 3):

python

import cv2
cv2.__version__
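
With both builds in place, a quick end-to-end check confirms the Kinect is streaming depth frames to the Pi. This is a minimal sketch assuming the libfreenect Python wrapper was built alongside the C library:

# grab one depth frame from the Kinect via the libfreenect wrapper
import freenect
import cv2
import numpy as np

depth, timestamp = freenect.sync_get_depth()  # 640x480 array of 11-bit depth values
frame = (depth / 8).astype(np.uint8)          # scale 0-2047 into a displayable 8-bit range
cv2.imshow('kinect depth', frame)
cv2.waitKey(0)                                # press any key to close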

Kinect to Pi to Matrix: 4 prototypes

(1) live video: Kinect depth data to display on a 32x32 LED matrix.

  • rpi-rgb-led-matrix github
  • Getting the matrix up and running required splicing together/refining code from here (demo threshold file) and here (rgb matrix as a display); a sketch of the resulting loop follows below.
im waving at the kinect here, can see handprint on matrix
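
A minimal sketch of the spliced-together loop, assuming the rgbmatrix Python bindings from the rpi-rgb-led-matrix repo plus the libfreenect wrapper (the threshold value here is illustrative, not the exact one used):

# live Kinect depth -> 32x32 LED matrix
import freenect, cv2, numpy as np
from PIL import Image
from rgbmatrix import RGBMatrix, RGBMatrixOptions

options = RGBMatrixOptions()
options.rows = 32
options.cols = 32
matrix = RGBMatrix(options=options)

while True:
    depth, _ = freenect.sync_get_depth()      # 640x480 frame, 11-bit values, nearer = smaller
    frame = (depth / 8).astype(np.uint8)      # scale 0-2047 down to 8-bit
    frame = cv2.threshold(frame, 100, 255, cv2.THRESH_BINARY_INV)[1]  # light up anything nearer than the threshold
    small = cv2.resize(frame, (32, 32))       # downsample to matrix resolution
    matrix.SetImage(Image.fromarray(small).convert('RGB'))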

(2) past video: Kinect depth data recorded, stored, and replayed on the matrix.

  • Added trackbar to rewind video. Needs a monitor to work. The idea in the future would be to replace the monitor trackbar with a capacitive sensor/copper wire on the installation itself.
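
A sketch of the record-and-rewind logic, assuming frames are buffered in memory and an OpenCV trackbar on the monitor scrubs through them (buffer length and names are mine):

# record depth frames, then scrub back through them with a trackbar
import freenect, cv2, numpy as np

frames = []
for _ in range(300):                          # record ~10 seconds at ~30fps
    depth, _ = freenect.sync_get_depth()
    frames.append((depth / 8).astype(np.uint8))

cv2.namedWindow('replay')
def on_seek(i):                               # trackbar callback: jump to frame i
    cv2.imshow('replay', frames[i])
cv2.createTrackbar('rewind', 'replay', 0, len(frames) - 1, on_seek)
on_seek(0)
cv2.waitKey(0)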

(3) superimposed past on live video: stored Kinect depth data replays on the matrix with live video superimposed on top.

matrix: shows superimposition of past + present — can see two sets of my hand // monitor: shows present video + trackbar which controls video rewind on matrix
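
The superimposition itself can be a weighted blend of a stored frame and the live frame. A sketch, with the 50/50 mix as an assumption; on the installation the blended frame would go to the matrix via SetImage as in prototype 1, rather than to imshow:

# blend stored past frames with the live feed
import freenect, cv2, numpy as np

def grab():                                   # one 8-bit depth frame
    depth, _ = freenect.sync_get_depth()
    return (depth / 8).astype(np.uint8)

past = [grab() for _ in range(300)]           # pre-recorded buffer, as in prototype 2
i = 0
while True:
    live = grab()
    blended = cv2.addWeighted(past[i], 0.5, live, 0.5, 0)  # 50/50 past/present
    i = (i + 1) % len(past)                   # loop the recording
    cv2.imshow('past + present', blended)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break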

(4) kinect depth data to trigger sound + volume: based on distance from the kinect, sound volume increases or decreases. The sound is ambient noise from the environment.
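
A sketch of the distance-to-volume mapping, assuming pygame plays a looping ambient recording (the file name is a placeholder) and the mean depth of the frame stands in for how close someone is:

# map proximity to the Kinect onto the volume of an ambient loop
import freenect, numpy as np, pygame

pygame.mixer.init()
sound = pygame.mixer.Sound('ambient.wav')     # placeholder: a recording of the other space
sound.play(loops=-1)                          # loop indefinitely

while True:
    depth, _ = freenect.sync_get_depth()
    nearness = 1.0 - np.mean(depth) / 2047.0  # closer body -> larger value
    sound.set_volume(min(max(nearness, 0.0), 1.0))  # clamp to pygame's 0..1 range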

Next steps include stringing together 6 matrices // getting the installation up and running.

Form

Putting it together

Interaction Designer | Carnegie Mellon University, School of Design | MDes ’21