Body Video

06 Nov 2013

Back, Back; Forth and Forth

*name to be changed*

My ICM final project will be the code for my PhysComp Final: a wearable accessory that controls video playback.

Worn on the wrist, this video controller distorts the image, sound, and playback position depending on the position of the user's hand and wrist. My goal is to map the image pixels and sound data to the incoming accelerometer data well enough that the video is always obscured, abstracted. The user must always be moving (ABM) to understand the video. I imagine the video projected onto a screen or wall with the user facing the projection.
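A minimal Processing sketch of the mapping I have in mind, assuming the accelerometer arrives over serial as comma-separated x,y,z values (raw 0–1023 readings from an Arduino), the first serial port at 9600 baud, and a placeholder clip named `body.mov`; the mapping ranges are hypothetical, not tuned:

```java
import processing.serial.*;
import processing.video.*;

Serial port;
Movie movie;
float ax, ay, az;  // latest accelerometer reading, normalized to -1..1

void setup() {
  size(640, 480);  // sized to match the placeholder clip
  movie = new Movie(this, "body.mov");  // placeholder filename
  movie.loop();
  port = new Serial(this, Serial.list()[0], 9600);  // assumes first port
  port.bufferUntil('\n');
}

void movieEvent(Movie m) {
  m.read();
}

void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line == null) return;
  String[] vals = split(trim(line), ',');
  if (vals.length == 3) {
    // assumes raw 0..1023 analog readings from the Arduino
    ax = map(float(vals[0]), 0, 1023, -1, 1);
    ay = map(float(vals[1]), 0, 1023, -1, 1);
    az = map(float(vals[2]), 0, 1023, -1, 1);
  }
}

void draw() {
  background(0);
  if (movie.width == 0) return;  // wait for the first frame

  // tilt on one axis drives playback direction and speed;
  // negative speed plays backward where the video library supports it
  movie.speed(map(ax, -1, 1, -2, 2));

  // tilt on another axis coarsens the image into a mosaic,
  // so the picture is only legible at certain wrist angles
  int step = max(1, int(map(abs(ay), 0, 1, 2, 40)));
  movie.loadPixels();
  noStroke();
  for (int y = 0; y < movie.height; y += step) {
    for (int x = 0; x < movie.width; x += step) {
      fill(movie.pixels[y * movie.width + x]);
      rect(x, y, step, step);
    }
  }
}
```

Tying playback speed to tilt (rather than jumping to an absolute position every frame) keeps the motion continuous, which also fits the back-and-forth idea in the title.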

[Images: two examples of the distorted video and a rough sketch of the whole experience]

Why?

I made a (defunct) wearable object for my PhysComp midterm and am not satisfied with where I ended up. Now that I have a better idea of how to prototype wearables, I want to build on the idea by adding more functionality and creating a solid system.

Materials

...Questions...

- How do I map the values from the accelerometer to Processing/openFrameworks? (A first pass at this is sketched above.)
- Will connecting a projector to the computer change the mapping algorithm?
- What are the sensor/serial port limitations? E.g., if the user moves his/her/their hand too fast, will the serial buffer/port become overwhelmed and freeze the program? (See the sketch after this list.)

Not truly pertinent to the project but still **...QUESTIONS...**

- Would it be feasible to throw a Kinect into the mix to get depth values and interpret those as distance? (But that's not about the device, is it?)
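On the serial question, one common pattern is to drain everything that has arrived each frame and keep only the newest complete line, so a fast-moving hand never backs up the buffer. A minimal sketch of that idea, assuming the same comma-separated x,y,z format and port setup as above:

```java
import processing.serial.*;

Serial port;
float ax, ay, az;  // most recent reading only

void setup() {
  port = new Serial(this, Serial.list()[0], 9600);  // assumes first port
}

void draw() {
  // drain everything that arrived since the last frame,
  // keeping only the newest complete line
  String latest = null;
  while (port.available() > 0) {
    String line = port.readStringUntil('\n');
    if (line != null) latest = line;
  }
  if (latest != null) {
    String[] vals = split(trim(latest), ',');
    if (vals.length == 3) {
      ax = float(vals[0]);
      ay = float(vals[1]);
      az = float(vals[2]);
    }
  }
  // ...use ax/ay/az for the video mapping...
}
```

Draining per frame means the sketch reacts to the latest hand position instead of replaying a backlog of stale readings, which is probably the behavior the piece wants anyway.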

Anyways

Direct links to posts related to the project are below.

icm
