All posts by Tom

TypeSafe final post

[Image: showcase poster]

[Image: second prototype]


TypeSafe is a hand-worn device designed to improve the wearer’s computer usage habits by discouraging uncomfortable wrist postures and encouraging regular periods of rest during prolonged keyboard typing. The idea is to help the wearer avoid long-term health problems that can result from excessive and unhealthy computer usage, such as carpal tunnel syndrome.

The device consists of a finger-worn inertial measurement unit (IMU) and a wristband containing an Arduino microcontroller, a vibrating component, and a battery. The IMU tracks the orientation of the wearer’s hand. The Arduino program analyzes the orientation data and infers the wearer’s hand posture and typing activity. When the hand posture is inferred to be “bad” (uncomfortable), the device alerts the wearer with vibration. Moreover, the device keeps track of how long the wearer has been typing and vibrates to let them know when they should take a break.

Both posture and activity inference are accomplished by analyzing the orientation signal. The current posture is classified as bad when the orientation departs significantly from a neutral (healthy) hand posture, while typing activity is inferred from the high-frequency content of the orientation signal.
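To make this concrete, here is a minimal sketch of what the inference loop looks like. It is illustrative rather than the actual prototype code: it assumes the FreeSixIMU library discussed in the posts below (which, depending on the version, may also require including its sensor-driver headers), a vibe motor on an arbitrarily chosen pin, and untuned placeholder thresholds.

```cpp
#include <Wire.h>
#include <FreeSixIMU.h>

const int VIBE_PIN = 9;                // assumed vibe motor pin
const float BAD_POSTURE_DEG = 25.0;    // deviation from neutral considered "bad"
const float TYPING_JITTER_DEG = 1.0;   // smoothed jitter level that suggests typing
const unsigned long BREAK_AFTER_MS = 20UL * 60UL * 1000UL;  // nag after 20 minutes

FreeSixIMU imu;
float neutral[3];   // neutral posture (yaw, pitch, roll), captured at startup
float prev[3];
float jitter = 0;   // smoothed high-frequency content of the orientation signal
unsigned long typingSince = 0;

void setup() {
  Wire.begin();
  imu.init();
  pinMode(VIBE_PIN, OUTPUT);
  delay(500);
  imu.getEuler(neutral);   // treat the pose at startup as the neutral posture
  memcpy(prev, neutral, sizeof(neutral));
}

void loop() {
  float a[3];
  imu.getEuler(a);   // yaw, pitch, roll

  // Posture: how far pitch and roll have drifted from the neutral pose.
  float deviation = max(fabs(a[1] - neutral[1]), fabs(a[2] - neutral[2]));

  // Activity: an exponential moving average of frame-to-frame change
  // approximates the high-frequency content of the signal.
  float delta = fabs(a[1] - prev[1]) + fabs(a[2] - prev[2]);
  jitter = 0.95 * jitter + 0.05 * delta;
  memcpy(prev, a, sizeof(a));

  bool typing = jitter > TYPING_JITTER_DEG;
  if (!typing) typingSince = millis();   // being idle resets the break timer

  bool badPosture = typing && deviation > BAD_POSTURE_DEG;
  bool needBreak = typing && (millis() - typingSince > BREAK_AFTER_MS);
  digitalWrite(VIBE_PIN, (badPosture || needBreak) ? HIGH : LOW);

  delay(20);   // sample at roughly 50 Hz
}
```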

The device currently exists as a fairly crude prototype. Although most of the planned features are supported, their implementation is still very basic and lacking in robustness. The original project plan foresaw several iterations on the prototype, during which I would have collected and analyzed IMU data from everyday computer usage in a more principled way and developed more sophisticated inference machinery for postures and activities. However, these steps never happened due to lack of time.

A technical hurdle that proved quite costly in terms of time was getting the IMU to work with the chosen microcontroller (Arduino LilyPad) and obtaining reliable orientation data. The fixes turned out to be relatively simple: changing the configuration of the microcontroller’s SDA and SCL pins, and making sure the connections between the microcontroller and the IMU were properly soldered. However, diagnosing these issues took long enough that less than two weeks remained for the other steps of the project: data analysis, development of inference models, and user testing.

Given more time, I would make several improvements to the device:

– Use better sensors for posture tracking. The IMU seems quite prone to drift, which makes it difficult to track absolute posture reliably. Flex sensors (https://www.sparkfun.com/products/10264) may be a better, and cheaper, alternative for this purpose.

– Introduce additional sensors for tracking postures of the elbows, shoulders, and the back, as these are also prone to repetitive strain during computer usage.

– Collect and analyze sensor data during real computer usage and develop trained models for posture and activity classification. This poses an interesting challenge, because these models must be fast and compact enough to run on a microcontroller with limited memory and computing power; a rough sketch of what such a model might look like on-device follows below.
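Even a simple linear classifier would fit comfortably: the weights could be trained offline on the logged data and hard-coded into the sketch as constants. The features and weights below are placeholders I made up, not trained values.

```cpp
// Hypothetical on-device classifier: a linear model over a few
// orientation features. Real weights would come from offline training
// on logged IMU data and be pasted in as constants.
const float W[3] = {0.08, 0.06, -1.2};  // pitch deviation, roll deviation, jitter
const float BIAS = -1.5;

bool isBadPosture(float pitchDev, float rollDev, float jitter) {
  float score = W[0] * pitchDev + W[1] * rollDev + W[2] * jitter + BIAS;
  return score > 0;  // the sign of the score is the predicted class
}
```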

12/7 TypeSafe

This week’s activities:

– Got a nice sweatband and used it to build the first, rough prototype of my device. This involved quite a bit of stitching, soldering, connecting things with wires, and taping them to other things. Here is what the prototype looks like:

[Image: first prototype]

– I’ve also written a good chunk of the Arduino code, though the meat – activity and posture inference – is still missing.

Problems encountered:

– My first attempt at a prototype didn’t work at all. The IMU was unreliable due to wiring issues, and I also had a few shorts due to stitching issues (stitching on this wristband is not as easy as on a rectangular piece of fabric).

Successes:

– All of the hardware problems that plagued me before seem to have been resolved.

I’m behind my original schedule, but reasonably confident that I’ll have a working prototype in time for the showcase. It just won’t be as full-featured or as pretty as what I originally envisioned.

Plan for next week:

– Implement the activity and posture inference in software.

– Demonstrate a working prototype of the device.

– Make a poster.

 

11/30 TypeSafe

What I have been up to this week:

1) Got the IMU to work with my LilyPad and was able to get good Euler angle readings. Now we’re cookin’ with gas!

2) I started putting together a physical prototype of the wristband and IMU ring. You can see a mockup in the picture. Instead of using a flimsy piece of fabric for the wristband, tomorrow I plan to buy a sweatband and stitch my board and other components into it.

3) I’ve been experimenting with a neat little program called Gobetwino, which offers several useful features for communicating between an Arduino and a PC. I plan to use it for writing sensor data to a spreadsheet and analyzing the output, in order to derive activity and posture classification heuristics.
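As a sketch of how that could look: Gobetwino watches the serial port for lines of the form #S|&lt;command&gt;|[&lt;arguments&gt;]# and runs the matching command defined on the PC side; a log-to-file command appends the arguments to a text file that a spreadsheet can import. The command name LOGIMU below is my own placeholder, as is the sampling rate.

```cpp
#include <Wire.h>
#include <FreeSixIMU.h>

FreeSixIMU imu;

void setup() {
  Serial.begin(9600);
  Wire.begin();
  imu.init();
}

void loop() {
  float a[3];
  imu.getEuler(a);  // yaw, pitch, roll

  // Emit one Gobetwino log command per sample; the PC side appends
  // the semicolon-separated values to a file.
  Serial.print("#S|LOGIMU|[");
  Serial.print(a[0]); Serial.print(";");
  Serial.print(a[1]); Serial.print(";");
  Serial.print(a[2]);
  Serial.println("]#");

  delay(100);  // ~10 Hz is plenty for offline analysis
}
```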

[Image: wristband mockup]

Current concerns:

1) I’m worried about the wiring on the IMU. Specifically, the connections might be too flimsy, which could cause the IMU to freeze periodically. I can’t think of a reliable solution short of soldering the wires to the IMU, which I don’t want to do this early in development.

2) I’m also worried about the wires being too thick and rigid, which could constrain hand movement enough to badly skew the sensor data for typing motion and posture. Anyone got any nicer, more flexible wires?

11/23 TypeSafe

Finally got to do a bit of work this week.

1) I managed to get the Flora board working on my lab computer, though it is still quite unreliable and needs to be reset all the damn time.

2) To establish I2C communication with my IMU, I first had to make a small modification to the FreeSixIMU library. Apparently, to get sensor readings from the accelerometer, it is necessary to disable the internal pull-up resistors on the SDA and SCL pins. The FreeSixIMU library did not do this properly for the ATmega32u4, which is the chip the Flora uses.
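For anyone hitting the same wall, the gist of the fix looks roughly like this (a sketch in terms of AVR registers, not the library’s exact code):

```cpp
#include <avr/io.h>

// On the ATmega32u4, SDA is PD1 and SCL is PD0. The stock Wire library
// enables the internal pull-ups on these pins when it initializes, so
// clearing the PORTD bits disables them again, leaving only the external
// pull-ups on the IMU breakout driving the bus.
void disableI2CPullups() {
  PORTD &= ~(1 << 1);  // PD1 = SDA: internal pull-up off
  PORTD &= ~(1 << 0);  // PD0 = SCL: internal pull-up off
}
// Call this after Wire.begin(), which is what turns the pull-ups on.
```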

3) The good news ends here: the data I’m getting from the IMU seems to be junk. While the initial readings from the IMU are not always the same, they don’t change at all as I move the IMU around. Also, the device seems to freeze frequently. Assuming the IMU isn’t defective, I suspect wiring issues, some problem with my modifications to FreeSixIMU, or maybe even an issue with serial communication (possibly caused by Flora driver issues).

My next steps:

1) I plan to switch back to using the LilyPad for my device. Upon further inspection, I have determined there is no reason whatsoever why either the LilyPad or the LilyPad Simple should lack support for I2C. Moreover, the LilyPad doesn’t use the 32u4 chip and therefore doesn’t suffer from the Flora’s annoying driver issues. I suspect I couldn’t get the IMU working on the LilyPad previously because of the internal pull-up resistor issue, and possibly wiring problems (the IMU is apparently very sensitive to those).

2) Once I get the IMU working properly with the LilyPad, I will be able to start getting sensor data and implementing my prototype device.

11/9 TypeSafe

This week involved a lot of futile effort to get some data out of my IMU. I’m using a library called FreeSixIMU to get 6DOF tracking data. The library in turn uses the Wire library to communicate with the hardware over I2C. However, when I try to do this on my LilyPad Simple board, the Wire library fails to communicate with the IMU, possibly because the board does not expose the SDA and SCL pins needed for I2C. I then tried to achieve the same on a second microcontroller, the Adafruit Flora (which exposes the required pins). I barely managed to get the Flora working on my machine (barely – Arduino likes to freeze a *lot* when using it), but still had no luck getting any kind of meaningful response from the IMU.

I plan to take the following steps to resolve or work around these issues:

1) Do a bit more testing with the Flora – I need to check whether I can get a simple example working on it, just to ensure that basic features like serial communication work properly.

2) Try the IMU on somebody else’s board/machine to make sure the thing isn’t defective.

3) I’m planning to reformat/reinstall the OS on my laptop, for unrelated reasons. If the problem is on the software side (quite possible, given all the trouble I had installing the Flora), maybe it’ll go away.

I don’t have any tangible results to show for this first week, so I’ll instead post a picture of a cat that looks like Hitler:

[Image: a cat that resembles Hitler]

He’s actually a very sweet cat and not racist at all. I met him on the grounds of Dolmabahce palace in Besiktas, Istanbul. There are many stray cats like him around here and they are all very docile. However, he’s the first one I found that resembles a genocidal dictator. I call him “Fluffy the Führer.”

11/2 TypeSafe

TypeSafe: Hand-worn Device for Promoting Healthy Typing Habits

Tomislav Pejsa

[Image: wristband mockup]

The goal of my project is to build a hand-worn device that promotes healthier postures and habits during keyboard typing. The device will track the wearer’s typing activity and hand posture; it will provide auditory and haptic warnings about stressful postures; and it will alert them to take periodic breaks. The device will also record data about daily keyboard usage for later download and analysis.

The device will include the following parts: a wristband housing the microcontroller, a buzzer and/or vibe, and a battery; and a ring with an IMU. It may also include a MicroSD card reader and an electromagnetic sensor for better typing detection. The total price of all parts will be $80+.

Project plan:

1. Build a prototype that can capture hand tracking data

2. Perform analysis of the captured data [milestone: mid-Nov.]

3. Develop heuristic activity and posture classifiers

4. Iteratively build prototypes and improve/fix sensing and classification [milestone: end of Nov. for first iteration]

5. Implement recording and download of usage statistics

6. Build final prototype

Fallback: Simplify and shed features. Usage recording gets the axe first.

Wristband Monitoring Hand Activities

My goal is to design a wristband that tracks the wearer’s hand activities and alerts them to activities that are potentially harmful to their health. Specifically, I am considering two types of activities. First, I want to detect unconscious self-manipulator gestures such as scratching and alert the user when they occur; this could be used to prevent people from scratching at injuries, rashes, etc. (Basically, I want to build a human version of the flea collar.) As a secondary (or alternative) feature, I want to track repetitive movements such as typing and mouse usage and notify the wearer to take periodic breaks.

I intend to work alone on the project. I feel confident that I can quickly prototype a wristband with a microcontroller and IMU that can track 6DOF hand movements and classify them using simple heuristics. However, I expect I will need to utilize a trained model to perform robust, reliable classification, which may prove challenging due to hardware limitations and the need to collect and annotate high-quality training data. Moreover, reliable detection of self-manipulator gestures is likely to be impossible using just 6DOF tracking data and may require integration of additional sensors into the design.

[Image: wristband mockup]

Game with Blinking Lights and Cheesy, Badly Transcribed Uplifting Trance Melodies

My design implements a simple game with two sets of blinking LEDs. The LEDs on the left-hand side blink at a constant rate, while the ones on the right-hand side blink at a rate controlled by the illumination of the light sensor. Shining more light on the sensor makes the LEDs on the right blink faster.

The objective of the game is to get both sets of LEDs to blink in sync by shining the right amount of light on the light sensor. Both sets must also blink in phase; this can be achieved by pressing and holding the button, which delays the blinking phase of the right-hand LEDs. Once the player has won the game, they are rewarded with a melody played through the buzzer, accompanied by vibration and by the LEDs blinking in sync with the melody.

Inputs: button, light sensor

Outputs: 4 LEDs, buzzer, vibe
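A rough sketch of the game logic (the pin assignments and win threshold are arbitrary placeholders, and the reward melody is left as a stub):

```cpp
const int LEFT_LEDS[2]  = {2, 3};   // placeholder pin assignments
const int RIGHT_LEDS[2] = {4, 5};
const int BUTTON_PIN = 7;
const int LIGHT_PIN = A0;

const unsigned long LEFT_PERIOD = 500;  // fixed blink period in ms
unsigned long rightPhase = 0;           // grows while the button is held

void setup() {
  for (int i = 0; i < 2; i++) {
    pinMode(LEFT_LEDS[i], OUTPUT);
    pinMode(RIGHT_LEDS[i], OUTPUT);
  }
  pinMode(BUTTON_PIN, INPUT_PULLUP);
}

void loop() {
  unsigned long now = millis();

  // More light on the sensor -> shorter period -> faster blinking.
  int light = analogRead(LIGHT_PIN);                      // 0..1023
  unsigned long rightPeriod = map(light, 0, 1023, 1000, 100);

  // Holding the button delays the right-hand blink phase.
  if (digitalRead(BUTTON_PIN) == LOW) rightPhase++;

  bool leftOn  = (now % LEFT_PERIOD) < LEFT_PERIOD / 2;
  bool rightOn = ((now + rightPhase) % rightPeriod) < rightPeriod / 2;

  for (int i = 0; i < 2; i++) {
    digitalWrite(LEFT_LEDS[i], leftOn);
    digitalWrite(RIGHT_LEDS[i], rightOn);
  }

  // Win when the rates match closely and the LEDs are in phase; a real
  // check would require this to hold over a longer window.
  if (abs((long)rightPeriod - (long)LEFT_PERIOD) < 20 && leftOn == rightOn) {
    // Reward stub: play the melody on the buzzer, pulse the vibe,
    // and blink all LEDs in sync with the notes.
  }
}
```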

Animating Wearable Visuals

A popular and attractive application of wearable computing is garments with programmable visuals, realized using display technology such as LEDs. Programming such visuals is currently done at a low level, by writing microcontroller programs that control the activation of individual display elements. This approach lacks scalability: programming becomes progressively more unwieldy as the number of display elements increases. It is also ill-suited for implementing complex visual patterns that activate synchronously across multiple garments; when each garment has its own separately programmed microcontroller, synchronizing visuals across garments is a challenge. Finally, the approach is not particularly intuitive, as there is a disconnect between how display elements are accessed during programming (using pin numbers) and their logical groupings and physical locations on the garment.

I propose adapting some of the tools and workflows traditionally employed in computer animation to the problem of programming wearable visuals. I envision a system that allows authoring of animated visuals on multiple garments in a single place: a graphical animation tool on your PC. The lighting of many display elements across one or more garments could be programmed visually, using a much smaller set of logical parameters. For instance, imagine a shirt with an array of LEDs on the front, which the designer wants to light up in a pulsating heart shape. The designer could define a single animation parameter called “ShirtHeartShape” that maps to the pins of all those LEDs, keyframe its value at different times, and use interpolating curves to achieve a nice, pulsating effect. The designer could also author arbitrarily complex visuals across multiple garments by synchronously animating multiple such parameters. Most of the complexity, both in terms of algorithms and creative effort, would live in the animation tool. There would be no need to program the microcontrollers on individual garments: their only task would be to receive time-indexed animation frames (pin values) from the animation tool and realize the visuals by applying the frame values to the appropriate pins.

Controlling display elements on a garment using hand-authored animation curves.
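On the garment side, the receiver described above could be as simple as the following sketch. The serial transport, the one-brightness-byte-per-element frame format, and the pin assignments are all my assumptions, not a settled design.

```cpp
// Hypothetical garment-side frame receiver. The animation tool streams
// frames over serial; each frame carries one brightness byte (0-255)
// per display element, in a fixed order.
const int NUM_ELEMENTS = 4;
const int ELEMENT_PINS[NUM_ELEMENTS] = {3, 5, 6, 9};  // PWM-capable pins (assumed)

void setup() {
  Serial.begin(115200);
  for (int i = 0; i < NUM_ELEMENTS; i++) pinMode(ELEMENT_PINS[i], OUTPUT);
}

void loop() {
  // Apply a frame as soon as all of its bytes have arrived; the
  // animation tool controls timing by when it sends each frame.
  if (Serial.available() >= NUM_ELEMENTS) {
    for (int i = 0; i < NUM_ELEMENTS; i++) {
      analogWrite(ELEMENT_PINS[i], Serial.read());
    }
  }
}
```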

As a further extension, I also propose an intuitive method for selecting which display elements should be controlled by a particular animation parameter. The idea is to scan the garment with a camera to obtain its image, and perform automatic registration to determine the pin numbers and physical locations of all display elements. The designer could then define the parameter mapping by using a sketch-based method (an equivalent of the Photoshop lasso tool) to select a subset of LEDs directly on the image of the garment, and the system would automatically extract their pin numbers and compute the mapping.

Selecting display elements to be controlled by a single parameter using an intuitive lasso-like tool.

High5

High5: Promoting Interpersonal Hand-to-Hand Touch for Vibrant Workplace with Electrodermal Sensor Watches

http://nclab.kaist.ac.kr/wp-content/uploads/2014/08/High5_Camera_Ready_final.pdf

This work was presented at UbiComp 2014. The authors present a system designed for detecting and rewarding interpersonal, hand-to-hand interactions such as “high-fives,” in order to promote a more cheerful and vibrant workplace atmosphere. Users wear smartwatch-like devices, which use acceleration and skin potential to detect when users high-five each other. Each user is then awarded “high-five points,” which encourages them to high-five each other even more. This presumably continues until everybody’s hands have open wounds on them from all that high-fiving.

What makes this work interesting is that it focuses on characterizing and supporting social behaviors, rather than purely physical ones, as is the case with most wearables. Furthermore, hand movements are particularly challenging from a technical standpoint, since they have many degrees of freedom and lots of subtlety. We humans are naturally very adept at classifying that subtlety into concrete communicative intents, but teaching a wearable computer how to do it is a whole new ball game. Any researcher who makes tangible progress in that direction deserves a major high-five.