Computational Media


For my final P-Comp / ICM project, I want to create a physical, interactive 360 video experience.

I love this weird transitional moment we're in – the trend is building toward immersive virtual experiences, meanwhile the hardware is clunky, awkward, not easy to operate, and generally not sexy.  There is so much we don't know yet about our new storytelling tools, and I want to explore what might be out there.

The idea is to use 2, possibly 3 projectors to map 360 video onto a giant round hanging lantern, where the viewer sticks her head inside while seated on a chair. There is a tiny camera mounted within, which is used to detect when the viewer blinks – a “long blink” (eyes closed for a second or more) triggers an edit, by either cutting to a different scene or flipping the projected 360 video upside down.  There is a physical controller as well, with a “play”, “pause”, and “rewind” button, reminiscent of VCRs.

There are a lot of components involved, but I’ll be using OpenCV for blink detection, Arduino for the “VCR” controls, and MadMapper for projecting:
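Even before the camera piece works, the timing logic for the “long blink” can be sketched on its own. This is just my own rough sketch in plain JavaScript (not final code) – it assumes something upstream, like the OpenCV side, reports once per frame whether the viewer’s eyes are open:

```javascript
// Rough sketch of the "long blink" trigger (my own, not final code).
// Assumes an upstream detector reports once per frame whether the
// viewer's eyes are open; after a full second of closed eyes, fire
// the edit exactly once.
class LongBlinkDetector {
  constructor(thresholdMs = 1000) {
    this.thresholdMs = thresholdMs;
    this.closedSince = null; // timestamp when the eyes first closed
    this.fired = false;      // one long blink should trigger only one edit
  }

  // Call once per video frame; returns true on the frame the edit should fire.
  update(eyesOpen, nowMs) {
    if (eyesOpen) {
      this.closedSince = null;
      this.fired = false;
      return false;
    }
    if (this.closedSince === null) this.closedSince = nowMs;
    if (!this.fired && nowMs - this.closedSince >= this.thresholdMs) {
      this.fired = true;
      return true; // cut to a different scene, or flip the video upside down
    }
    return false;
  }
}
```

When `update()` returns true, the sketch would pick between the cut and the flip.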

I tested out the projection with a smaller lantern and the effect is exactly what I hoped for: the video looks grainy and nostalgic in some way. This is the view from inside the sphere:


ICM // Peachy Hospital

Here’s me restructuring code by creating functions for each aspect of the visuals (also, the editor randomly named this “Peachy Hospital”, which seems kind of perfect):

I was trying to figure out how to scale up the pawprints so the sketch could randomly generate bigger or smaller ones, but this seemed almost impossible without grouping all of its components into one object (which I feel like I haven’t quite learned how to do yet?? maybe?). Code below!

function setup() {
  createCanvas(400, 400);
  background(255, 230, 230);
}

function draw() {
}

function mousePressed() {
  //background(255, 230, 230);
  // each click draws one set of squares plus one pawprint
  squares();
  pawprint();
}

function pawprint() {
  fill(random(100, 255), 0, random(100, 150), 100);
  let size = 8;
  ellipse(mouseX, mouseY, size * 3, size * 3);
  ellipse(mouseX + 18, mouseY - 5, size, size);
  ellipse(mouseX + 12, mouseY - 15, size, size);
  ellipse(mouseX, mouseY - 19, size, size);
  ellipse(mouseX - 12, mouseY - 15, size, size);
}

function squares() {
  let color1 = map(mouseY, height, 0, 0, 255);
  let color2 = map(mouseX, 0, width, 0, 255);
  stroke(color1, color2, color1);

  for (let y = height + 8; y >= mouseY; y -= 25) {
    for (let x = 7; x <= mouseX; x += 25) {
      rect(x, y, 10, 10);
    }
  }
}
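For the scaling problem, here's one possible approach (my own sketch, not something I've actually wired into the piece yet): keep the pawprint geometry in a single function that takes a scale factor, so all five circles grow or shrink together.

```javascript
// Hypothetical rework: the pawprint as data, parameterized by a scale factor s.
// The offsets match the hard-coded ones in my pawprint() function, each
// multiplied by s so the whole print scales as one unit.
function pawprintCircles(x, y, s) {
  const size = 8 * s;
  return [
    { x: x,          y: y,          d: size * 3 }, // main pad
    { x: x + 18 * s, y: y - 5 * s,  d: size },     // toes
    { x: x + 12 * s, y: y - 15 * s, d: size },
    { x: x,          y: y - 19 * s, d: size },
    { x: x - 12 * s, y: y - 15 * s, d: size },
  ];
}
```

In mousePressed() I could then do `let s = random(0.5, 2);` and loop over `pawprintCircles(mouseX, mouseY, s)`, calling `ellipse(c.x, c.y, c.d, c.d)` for each circle.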

P-Comp // Violin Tuner

So this week I attempted to make a tuner for string instruments, pitched from A440. This tuner was specifically meant for violin beginners, so the four buttons correspond to the E string, A string, D string, and G string. HOWEVER, as the video demonstrates below, I had some issues getting the A, D, and G buttons to play a clean sound.  I know this isn’t an issue with the speaker, as I tried playing a note through it without using the buttons, but for some reason, the E button is the only one working (and even that one doesn’t produce the best sound). ¯\_(ツ)_/¯

I’d like to figure out why this is happening and how to fix it. Updates to follow. Here’s my code and a better photo, because I think it has something to do with my wiring:

#include "pitches.h"

int input_g = 7, input_d = 8, input_a = 9, input_e = 10;
int output_g = 2, output_d = 3, output_a = 4, output_e = 5;
int tonePin = 6;
int e, a, d, g;

void setup() {
  pinMode(output_g, OUTPUT);
  pinMode(output_d, OUTPUT);
  pinMode(output_a, OUTPUT);
  pinMode(output_e, OUTPUT);

  pinMode(input_g, INPUT);
  pinMode(input_d, INPUT);
  pinMode(input_a, INPUT);
  pinMode(input_e, INPUT);

  pinMode(tonePin, OUTPUT);

  Serial.begin(9600); // needed for the Serial.println() in loop()
}

void loop() {
  g = digitalRead(input_g);
  d = digitalRead(input_d);
  a = digitalRead(input_a);
  e = digitalRead(input_e);

  // for each button: light its LED and play that string's pitch --
  // the tone() call has to live in the HIGH branch, and each string
  // gets its own note (G3, D4, A4, E5)
  if (g == HIGH) {
    digitalWrite(output_g, HIGH);
    tone(tonePin, NOTE_G3);
  } else {
    digitalWrite(output_g, LOW);
  }

  if (d == HIGH) {
    digitalWrite(output_d, HIGH);
    tone(tonePin, NOTE_D4);
  } else {
    digitalWrite(output_d, LOW);
  }

  if (a == HIGH) {
    digitalWrite(output_a, HIGH);
    tone(tonePin, NOTE_A4);
  } else {
    digitalWrite(output_a, LOW);
  }

  if (e == HIGH) {
    digitalWrite(output_e, HIGH);
    tone(tonePin, NOTE_E5);
  } else {
    digitalWrite(output_e, LOW);
  }

  // silence the speaker when no button is pressed
  if (g == LOW && d == LOW && a == LOW && e == LOW) {
    noTone(tonePin);
  }

  Serial.println(String(g) + " - " + String(d) + " - " + String(a) + " - " + String(e));
}
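For reference, the four target pitches are easy to sanity-check with equal temperament, where each semitone multiplies the frequency by 2^(1/12) starting from A4 = 440 Hz. This is just my own arithmetic check (JavaScript rather than Arduino):

```javascript
// Equal-temperament frequency for a note, given its distance in
// semitones from A4 = 440 Hz.
function noteFreq(semitonesFromA4) {
  return 440 * Math.pow(2, semitonesFromA4 / 12);
}

// The four violin strings, as semitone offsets from A4:
const strings = {
  G3: noteFreq(-14), // ≈ 196.00 Hz
  D4: noteFreq(-7),  // ≈ 293.66 Hz
  A4: noteFreq(0),   //   440.00 Hz
  E5: noteFreq(7),   // ≈ 659.26 Hz
};
```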



week 1 (ICM): omg im coding!

WHAT an exciting weekend this was!

being that I’m 100% new to coding, this was truly a BIG moment for me.

I went ahead and rounded out the vibe by messing with the background values til I achieved the “look” I was going for:

Looking at my perfect circle in hot pink/orange, I realized it would be best to make a cat. so I shrunk the circle into a teeny tiny head and made a cat!

so I got a little carried away with the creation of this and forgot to fully document the process (lesson learned)! but anyhow, I initially regretted making her head so tiny because I had to squint to keep making her, but I was too far in the process to start re-scaling and re-positioning all her elements. but I managed to turn this into an opportunity to create an environment for her, so that she now has a moon (with craters that are all shaped the same because those arc()s are real tricky) and a sun to boot!

Generally, I found the web editor really easy to use, and it made everything feel cohesive. I felt safe to venture out and break the example codes and experiment. The draw function in particular reminded me a lot of my time using Premiere to affect scale/position/rotation etc. by manually entering values. that being said, I definitely didn’t even try to get into rotate() because apparently it rotates everything around the canvas origin rather than around the shape itself as an object. yikes. and I couldn’t quite figure out how to get the arc() I wanted for the tail and the craters, but I made do. it also took a little getting used to hitting the “play” button to see the outcome.
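On the rotate() point: p5’s rotate() spins the whole coordinate system around the origin, which is why shapes seem to orbit the canvas corner; the usual trick is to translate() to the shape’s center first, then rotate. The math underneath is small enough to sketch as a plain helper (hypothetical, not in my actual sketch):

```javascript
// Rotate a point (px, py) around a center (cx, cy) by `angle` radians.
// This is what translate(cx, cy); rotate(angle); effectively does to
// the coordinate system in p5.js, applied here to a single point.
function rotateAround(px, py, cx, cy, angle) {
  const dx = px - cx;
  const dy = py - cy;
  return {
    x: cx + dx * Math.cos(angle) - dy * Math.sin(angle),
    y: cy + dx * Math.sin(angle) + dy * Math.cos(angle),
  };
}
```

In p5 itself the equivalent idiom is `push(); translate(cx, cy); rotate(angle);` then draw the shape at (0, 0), then `pop();`.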


I definitely see how computation applies to my background and interest in video editing, sound design / music editing, immersive (virtual/augmented/mixed/extended) realities and platforms, and cats.

As a video editor, I think that the magical human element and decision making process is essential to putting a meaningful cut together, but “happy accidents” have also been a big part of my creative process. I could see how computation could help generate those in a creative but controlled way, especially in a less traditionally narrative form, like with fashion commercials or video art. And then so many possibilities with generating graphics, animating them, and making them interactive.

But by moving beyond video into the immersive sphere, I’m increasingly interested in the relationship between physical and virtual spaces, and augmenting/enhancing virtual spaces with physical elements to heighten the experience. I can envision mapping some kind of defunct, tangible keyboard to one that reveals itself in a virtual space and creates beautiful notes that come alive and animate (perhaps even multi-user duets or symphonies can be created as well).

Moving further beyond, when it comes to thinking about computation for an immersive platform, my brain implodes, so I’ll have to think about that some other time.

This project, “City Symphonies – Westminster” by Mark McKeague, is really cool!! I feel like these kinds of sound-based projects are not always pleasant to listen to, but this is nice. and I like the execution of the animated notes, mimicking traffic patterns.

And as for cats, I hope my computational drawing above speaks for itself.