
Body Video Final

20 Nov 2013

Video Distortion

One of my goals at ITP is to explore how current technologies can be used to design and tell interactive narratives. I am also intrigued by the idea of intimacy between the person, the body, and technology. A discussion in our Applications class prompted me to seek a definition of interactive narrative. But I suppose I would also need to define interactive and narrative.

Light research reveals that the appropriate terms are "interactive narrative design" and "interactive storytelling". According to the dominant theory, interactive storytelling is essentially a role-playing game: the Viewer-User-Player makes decisions that alter plot and characters. I think this definition is highly inaccurate, inappropriate, and tiresome. A game has a story; a game is not a story. As much as I adore playing Uno, I doubt it would be an entertaining novella. Imagine if readers had the choice to alter the ending of A Wrinkle in Time? It would become void of all meaning. Or White Teeth? How can one describe Zadie Smith's latest work NW, a novel that unfolds from the perspective of four characters, as NOT interactive? I am bothered by this notion that providing choice to a user is necessary to build an interactive narrative. Choice is a privilege, a privilege I am not extending to my readers and viewers. You made the choice to read my story, to watch my video. That's it. If the story requires something more from you to unfold, you will know; you will be asked.

From this stance, I wanted to create an interactive viewing experience. Unfolding in front of the viewer is a short film that becomes more or less distorted based on user input from the mouse.

What was HARD

The code. Understanding how to map the user input to the video pixels was the first step. The second step was to configure the distortion so that the video could play at 30fps. I was advised to look at two examples from the openFrameworks workshop I attended in September and October. I needed to use OpenGL shaders.

Programming directly onto the GPU renders extremely powerful results.
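To make the approach concrete, here is a minimal sketch, not the project's actual code: a fragment shader displaces the video's texture coordinates by an amount mapped from the mouse's x position. The movie filename body.mov, the horizontal sine-wave displacement, and the 0 to 60 pixel range are my own illustrative assumptions.

```cpp
#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    ofVideoPlayer video;
    ofShader distort;

    void setup() {
        video.load("body.mov");   // hypothetical movie file in bin/data
        video.play();

        // Fragment shader: displace texture coordinates by a mouse-driven
        // amount; the vertex stage stays fixed-function (GLSL 1.20).
        std::string frag = R"GLSL(
            #version 120
            #extension GL_ARB_texture_rectangle : enable
            uniform sampler2DRect tex0;   // current video frame
            uniform float amount;         // distortion strength in pixels
            void main() {
                vec2 st = gl_TexCoord[0].st;
                st.x += sin(st.y * 0.05) * amount;   // horizontal wave
                gl_FragColor = texture2DRect(tex0, st);
            }
        )GLSL";
        distort.setupShaderFromSource(GL_FRAGMENT_SHADER, frag);
        distort.linkProgram();
    }

    void update() {
        video.update();   // advance to the next frame
    }

    void draw() {
        // Map mouse x (0..window width) to a 0..60 pixel displacement.
        float amount = ofMap(ofGetMouseX(), 0, ofGetWidth(), 0.0, 60.0, true);

        distort.begin();
        distort.setUniformTexture("tex0", video.getTexture(), 0);
        distort.setUniform1f("amount", amount);
        video.draw(0, 0, ofGetWidth(), ofGetHeight());
        distort.end();
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_WINDOW);
    ofRunApp(new ofApp());
}
```

Because the displacement runs per pixel on the GPU, the video keeps playing at full frame rate even as the distortion amount changes.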

What was EASY

Nothing.

User Testing: Results and Feedback

In the first round of user testing, though the system was still underdeveloped, users were intrigued by the pixel manipulation.

In the second round of user testing, I received helpful feedback that the sound should play continuously.

Next steps

I will use this code as part of my Physical Computing final, a wearable video control. I will replace the mouse with the wearable device: the movement of the viewer's hand will determine the level of distortion of the video. It is not a simple swap, because I am quickly realizing that the accelerometer and the mouse report values in very different coordinate ranges.
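As a minimal sketch of that difference (assuming the accelerometer reports roughly -1g to +1g of tilt, which is my assumption rather than a measured range): the mouse already lives in screen pixels, while the accelerometer needs its own scaling plus smoothing before it can drive the same distortion parameter.

```cpp
#include "ofMain.h"

// Mouse: screen pixels map linearly onto a 0..60 px displacement.
float distortionFromMouse(float mouseX) {
    return ofMap(mouseX, 0, ofGetWidth(), 0.0, 60.0, true);
}

// Accelerometer (hypothetical -1g..+1g tilt): map to the same range,
// then low-pass filter, since hand motion is far noisier than a mouse.
float distortionFromAccel(float tiltG, float &smoothed) {
    float target = ofMap(tiltG, -1.0, 1.0, 0.0, 60.0, true);
    smoothed = ofLerp(smoothed, target, 0.1);
    return smoothed;
}
```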

I plan to continue with video distortion using openFrameworks, OpenGL, and WebGL. I enjoy making and editing videos, especially 'simple' setups shot by myself, and manipulating the footage through editing and now through computer algorithms. Though I have had only a surface introduction to OpenGL shaders, I am no longer intimidated and plan to experiment with WebGL over the holiday break.

As I was working on the project, I became intrigued by the idea of intimacy, the intimacy we have with the inanimate objects that program and run our lives, that communicate to others for us. The Body Video project definitely scratches at this idea, and I am excited to explore this concept along with interactive storytelling.

Direct links to posts related to the project are below:

icm
