Back, Back; Forth and Forth
*name to be changed*
My ICM final project will be the code for my PhysComp Final: a wearable accessory that controls video playback.
Worn on the wrist, this video controller distorts the image, sound, and playback position depending on the position of the user's hand and wrist. My goal is to map the image pixels and sound data to the incoming accelerometer data well enough that the video is always obscured and abstracted: the user must always be moving (ABM) to understand the video. I imagine the video projected onto a screen or wall with the user facing the projection.
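The "always be moving" idea could be sketched roughly like this. This is not my final Processing/openFrameworks code, just a toy illustration with made-up names (`distortion_amount`, `rest_g`, `sensitivity`): motion is measured as deviation from the ~1 g resting magnitude of the accelerometer, and the stiller the wrist, the more the image is distorted.

```python
def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def distortion_amount(ax, ay, az, rest_g=1.0, sensitivity=2.0):
    """Map raw accelerometer axes (in g) to a 0..1 distortion level.
    At rest the vector magnitude is ~1 g (gravity alone), so movement
    shows up as deviation from that baseline."""
    magnitude = (ax * ax + ay * ay + az * az) ** 0.5
    motion = abs(magnitude - rest_g)          # 0 at rest, grows with movement
    clarity = clamp(motion * sensitivity, 0.0, 1.0)
    return 1.0 - clarity                      # full distortion when still

# Still wrist (gravity only) -> maximal distortion
print(distortion_amount(0.0, 0.0, 1.0))  # → 1.0
# Vigorous movement -> distortion drops toward 0
print(distortion_amount(0.8, 0.3, 1.2))
```

The returned level would then drive whatever pixel/sound manipulation ends up in the sketch.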
rough sketch of whole experience
I made a (defunct) wearable object for my PhysComp midterm and am not satisfied with where I ended up. Now that I have a better idea of how to prototype wearables, I want to build on the idea by adding more functionality and creating a solid system.
- I had really wanted to use openFrameworks for this project, but I am no longer as sure, since I can build on the Pixels assignment from two weeks ago.
- Accelerometer/compass: I am using Adafruit's FLORA LSM303DLHC sensor (a combined accelerometer and magnetometer; it has no gyroscope). I am becoming more familiar with it.
- XBee Radio: I will use this device to bridge communication from the computer to the board.
- Wearable circuit board: Adafruit's FLORA. It was probably a mistake to purchase a new board for an assignment with tight deadlines. The FLORA is a bit smaller than the Arduino LilyPad, is black, and uses the I2C protocol. Communication with my Mac's serial port is not that great, but I was advised not to purchase an alternative until my code is solid.
- Fabric: I would like the device to have adjustable straps so that users with different hand sizes can test it. Since I prefer that the board, sensor, and XBee not be exposed, I will need to design an enclosure: perhaps a flap whose snap doubles as the on/off switch of the circuit. I am not sure if I will be able to achieve both these design choices in the time remaining.
- Thread: The 3-ply conductive thread I used for my midterm project was extremely difficult to manipulate, especially for a novice sewer. I need to research alternatives to conductive thread.
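Whatever the FLORA ends up sending over the XBee link, the computer side will need to parse it defensively. As a sketch, suppose each reading arrives as a comma-separated line `"ax,ay,az\n"` (that frame format is an assumption, not something I have built yet); dropping malformed lines keeps a flaky radio link from crashing the sketch.

```python
def parse_frame(line):
    """Parse one 'ax,ay,az' line into a tuple of three floats,
    or return None if the line is garbled or incomplete."""
    parts = line.strip().split(",")
    if len(parts) != 3:
        return None
    try:
        return tuple(float(p) for p in parts)
    except ValueError:
        return None

print(parse_frame("0.02,-0.98,0.11\n"))  # → (0.02, -0.98, 0.11)
print(parse_frame("garbage"))            # → None
```

The same logic translates directly to Processing's `Serial.readStringUntil()` loop or an openFrameworks serial callback.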
- How do I map the values from the accelerometer to Processing/openFrameworks?
- Will connecting a projector to the computer change the mapping algorithm?
- What are the sensor/serial port limitations? E.g., if the user moves his/her/their hand too fast, will the serial buffer/port become overwhelmed and freeze the program?
- Not truly pertinent to the project but still **...QUESTIONS...**
- Would it be feasible to throw a Kinect in the mix to get depth values and interpret those as distance? (But that's not about the device is it?)
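Two small sketches toward the first and third questions above, assuming readings are already parsed floats. Neither is tied to Processing or openFrameworks, and both environments have equivalents (Processing's `map()`, oF's `ofMap()`); the `Smoother` class and its `alpha` parameter are my own placeholder names.

```python
def remap(value, in_min, in_max, out_min, out_max):
    """Linearly remap a reading into a useful output range,
    e.g. -2..2 g onto 0..640 pixels (like Processing's map())."""
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)

class Smoother:
    """Exponential moving average: one cheap way to keep fast hand
    movement from producing jumpy output. It also suggests a pattern
    for the buffer question: the draw loop reads the latest smoothed
    value at its own frame rate instead of trying to consume every
    serial byte as it arrives."""
    def __init__(self, alpha=0.2):
        self.alpha = alpha   # 0..1; higher = reacts faster, smooths less
        self.value = None
    def update(self, reading):
        if self.value is None:
            self.value = reading
        else:
            self.value += self.alpha * (reading - self.value)
        return self.value

print(remap(0.0, -2.0, 2.0, 0, 640))  # → 320.0
s = Smoother(alpha=0.5)
print(s.update(10.0))                 # → 10.0
print(s.update(0.0))                  # → 5.0
```

Whether this is enough to keep the serial buffer happy at high movement speeds is exactly what I need to test.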