For my final, I deconstructed an old project from exactly 10 years ago to create a video album of a documentary that never was. I wanted to capture the feeling of being in college at that time – the music we listened to with the aux cord in the car, our flip phones, the quality of the video – all capturing the nostalgic atmosphere of a bunch of 19-year-olds having fun, pre-social media.
Over the break I plan on getting these photos printed and putting them in a real photo album. Here it is so far:
For my final project, I would like to expand on one of my projects from the past few weeks.
One of the ideas I had was to continue experimenting with playing videos from still photos, as an exploration of stories, content, and memories living in physical spaces rather than in digital folders on a desktop. I'd like to continue de-archiving the hard drive I found with footage from 10 years ago to create a physical photo album that houses all these memories in a non-linear way (or perhaps a somewhat linear one, since the pages of a photo album are usually turned from left to right). I would also like to finesse the way the video starts playing on actual developed photos (fading up, mapping accurately, etc.), and perhaps use virtual buttons to begin playback or sound.
I’d ultimately like this to be something like an interactive documentary: the adventure of a few wide-eyed 18-year-old college kids traveling down to this forgotten town of Centralia in Pennsylvania in 2007, shooting social-media-style “stories” on a camcorder with our Magnetic Fields and Beirut songs playing throughout our car rides. I believe there are only 7 inhabitants left in the town today, but there was also a time capsule the town buried in 1966 that was uncovered after we shot all this footage in 2014, so in a way I think the format of this video content could mimic the idea of preserving this town’s lost memories in physical artifacts.
Another, separate idea I wanted to expand on is the 360 video orb. I’d ideally like to have the orb floating in space so that when the user touches it, it expands into a larger semi-translucent 360 video sphere that encompasses the user’s immediate area. For this next week, though, this might be a bit too tricky: in my previous experiments trying to do this with Vuforia, the tracking kept breaking, and between my personal device and what’s available at the ER, I don’t have access to a device that can run ARKit. I would also ideally prefer to build this for the HoloLens, since I think the effect of looking around and feeling immersed in this sphere would be much more effective without having to hold up a screen to view it.
Keeping in line with my video theme, I wanted to start exploring 360 video in AR to try to evoke a sensation of being elsewhere. I had some 360 footage I shot in the desert of Israel, and I used it as the texture on a sphere with inverted normals in Unity, with some opacity so the real-world environment shows through it.
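The inverted-normals trick can be sketched roughly like the script below – a minimal version, assuming a standard Unity sphere mesh (the script and variable names are my own, not anything official):

```csharp
using UnityEngine;

// Flips a sphere's normals and triangle winding so its video texture
// renders on the *inside* surface, for viewing the 360 footage from within.
public class InsideOutSphere : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;

        // Point every normal inward instead of outward.
        Vector3[] normals = mesh.normals;
        for (int i = 0; i < normals.Length; i++)
            normals[i] = -normals[i];
        mesh.normals = normals;

        // Reverse each triangle's winding order so the inward-facing
        // surface isn't back-face culled by the renderer.
        int[] triangles = mesh.triangles;
        for (int i = 0; i < triangles.Length; i += 3)
        {
            int temp = triangles[i];
            triangles[i] = triangles[i + 2];
            triangles[i + 2] = temp;
        }
        mesh.triangles = triangles;
    }
}
```

Attached to the sphere carrying the 360 video material, this flips the surface inside out; a semi-transparent shader on that material is what gives the see-through effect.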
I initially wanted to make it a giant 360 video that surrounds the viewer by using the ground plane, but apparently my iPhone 6 doesn’t support “Smart Terrain”. So I kept my image target, and I’m glad I was pushed to explore virtual buttons: adding a rotating animation to the sphere made the experience better, because the viewer can “spin” the video a full 360 degrees around the Y axis and see the whole environment without having to move all around it. I tried to apply extended tracking to a giant sphere that surrounds the viewer, but it kept breaking, so the end result is a scaled-down version, which has a different effect – kind of like a snow globe or crystal ball.
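The virtual-button spin can be wired up along these lines – a sketch, assuming Vuforia’s `IVirtualButtonEventHandler` interface from the Unity SDK of that era (everything beyond the Vuforia callbacks is my own naming):

```csharp
using UnityEngine;
using Vuforia;

// Rotates the 360 video sphere around the Y axis while a Vuforia
// virtual button on the image target is being pressed.
public class SpinSphereButton : MonoBehaviour, IVirtualButtonEventHandler
{
    public Transform videoSphere;        // the inside-out 360 video sphere
    public float degreesPerSecond = 30f; // spin speed while pressed

    private bool pressed;

    void Start()
    {
        // Subscribe to press/release events from the button under the target.
        GetComponentInChildren<VirtualButtonBehaviour>()
            .RegisterEventHandler(this);
    }

    public void OnButtonPressed(VirtualButtonBehaviour vb)  { pressed = true; }
    public void OnButtonReleased(VirtualButtonBehaviour vb) { pressed = false; }

    void Update()
    {
        if (pressed)
            videoSphere.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}
```

Driving the rotation from `Update` rather than a baked animation also makes the spin speed easy to tweak from the Inspector.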
Ultimately I would love to try to have a live 360 video feed shown in this way, so the viewer can feel more connected in realtime to another location.
Augmentable Object Fantasy
The biggest obstacle I encounter every day when I’m trying to get out the door is getting dressed. This has been a problem in my life ever since I can remember, and so if I could augment any object, it would be an entire closet system that has all my clothing items and accessories scanned and integrated with an AR mirror. I realize this isn’t the most original idea; there are probably many companies developing this already, but I first came across the idea in “Clueless”, and have been dreaming about it ever since.
To expand on this idea, I think it would be amazing to have an augmentable wardrobe – clothing items that fully allow you to express yourself in AR. People could follow or subscribe to your “AR channel” and see your crazy outfit, hairstyle, or anything, really.
I’ve been really interested in integrating video content into this spectrum of extended reality, so I decided to use still images that come to life in AR to play video recordings I took almost exactly 10 years ago – to imbue memories into physical photographs as a sort of time capsule. At the time, I was a sophomore undergraduate at NYU, and I went with a group of my friends on a road trip around Pennsylvania for spring break. I had a video art assignment to fulfill, and while I didn’t end up finishing the project, I was left with a hard drive I uncovered recently, which held a wealth of these recordings that feel like a mix between home videos and Snapchat stories.
There are two image targets. One is a still of a title sequence, which explains the narrative a bit:
“Centralia: An ancient municipality located in the outskirts of society in rural Pennsylvania. Population: 9 due to coal mining fire circa 1962. The remaining 991 residents (approx.) are expected to return in 2016 to open a buried time capsule.”
The other image target is a rough trailer I made at the time. The audio didn’t come through on this screen recording (I’ll have to re-upload it), but it’s meant to be a dream-like glimpse into this time and space.
I linked a video player within Unity to each of the image targets using simple planes. Initially, the videos started playing automatically as soon as I hit Play, so it was a bit tricky to figure out how to make each video start only when its image target was detected, in order to create the illusion that the photo becomes animated. I still need to work on making it stop playing when the image target is no longer detected.
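One way to handle both the start and the stop is to hook Unity’s `VideoPlayer` into Vuforia’s tracking callbacks. Here’s a rough sketch, assuming the sample `DefaultTrackableEventHandler` script that ships with the Vuforia Unity samples (in some SDK versions its `OnTrackingFound`/`OnTrackingLost` methods aren’t virtual, in which case the same two lines can be added to the sample script directly):

```csharp
using UnityEngine;
using UnityEngine.Video;
using Vuforia;

// Plays a photo's video only while its image target is being tracked,
// and pauses it again when tracking is lost.
public class PlayVideoOnTarget : DefaultTrackableEventHandler
{
    public VideoPlayer videoPlayer;   // the VideoPlayer on the child plane

    protected override void OnTrackingFound()
    {
        base.OnTrackingFound();       // re-enable renderers as usual
        videoPlayer.Play();
    }

    protected override void OnTrackingLost()
    {
        base.OnTrackingLost();        // hide renderers as usual
        videoPlayer.Pause();          // or Stop() to rewind each time
    }
}
```

Unchecking “Play On Awake” on the `VideoPlayer` component is what keeps the videos from auto-playing the moment the scene starts.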
I definitely want to keep going with this and incorporate all this footage into a larger mixed reality project, perhaps to explore relationships with time, memories, and archival content within extended reality.
For this first assignment, I didn’t deviate too far from this course’s title: MAGIC WINDOWS
I thought it would be interesting to try to alter the perception of space to teleport from inside a room. I tried changing the scenery of the outside world by projecting a live video feed from Tokyo onto the inside of a window at my friend’s place in the East Village. After spending all of January hanging around Shibuya, this is the place I wanted to be teleported to the most, so I found this live webcam feed of Shibuya Crossing:
and taped white paper onto the glass to project onto:
After tweaking around in MadMapper for a bit:
I also thought it would be cool to project the actual East Village street view back onto the window, so you could still get the real view of the street while keeping the window fully covered up and private. I hooked up a webcam looking onto the street corner, and since the night scene matched the lighting inside, it looked kind of uncanny. There was some pixelation because I had to blow up the video feed in MadMapper to fit the perspective, but I rather like the effect:
In thinking about integrating these two ideas, and about possible futures of mixed reality, I started mocking up what it might look like to have someone call from overseas and have the room transform to that place’s time and atmosphere. Though the clip below doesn’t fully illustrate it, the idea is that when my mum in Tokyo calls, the window changes to the Shibuya Crossing scene to heighten the sense of connection – a magic window 🙂
Finally, for a GIF that best fits me, here’s a representation of my beloved cats Uni & Schnitzel in my bed: