Wii Want to Write

 DRS Sponsors Best Class Project Award

March 5, 2009

DRS Technologies, Inc. has once again sponsored a Best Project Award competition for Digital Communication and Signal Processing Systems Design, a Fall 2008 capstone design course taught by David Casasent, George Westinghouse Professor of ECE. The awards were presented at a joint HKN/IEEE meeting this semester.

DRS employee Michael Kessler from the company's Washington operations traveled to campus to help judge the oral and laboratory presentations. The winners were Jeffrey Lai, David Leong, and Jeffrey Panza, who won the prize with their project "Wii Want to Write."

"I was impressed by the difficulty of the projects selected and the various problems the students faced as they worked toward completion of the lab," said Michael Kessler, a software engineer at DRS. "Each group seemed to complete their project in a methodical process utilizing programming techniques taught by Dr. Casasent. The course helped prepare the students for real-life problems encountered in industry."

Each group in the course selects a project of its own choice and implements it on real-time digital signal processing (DSP) hardware.

The emergence of entertainment and mobile communication devices that include accelerometers has opened a new area of research in gesture recognition. The winning group chose to investigate these new technologies, using a 3-axis accelerometer as the interface device for symbol and alphanumeric gesture input to any computer device.

The project was, in short, Wiimote gesture recognition. A Wiimote is the controller for the Nintendo Wii, a current-generation video game system that prides itself on its imaginative new interface. The controller itself contains a speaker, an infrared camera, and, most importantly for the project's purposes, a 3-axis accelerometer. The project used only the accelerometer for gesture recognition. The user forms symbols, uppercase letters, or numbers (all considered "gestures" for this project) by drawing in the air with the Wiimote in hand. The accelerometer data is then captured by the DSP system hardware in the lab and the input gesture is automatically determined.

The students' system uses a custom classifier created from scratch and requires only five training examples for each gesture. The average length of the accelerometer output for a character is around 50 to 100 samples per axis; this length depends on the speed with which a gesture is drawn. The processing algorithm requires only one set of medium-speed gestures for training, yet the system handles slow and fast gestures as well.

To make calculations simpler on the DSP chip, the group resampled all input signals to 50 samples. The five sets of accelerometer output training curves were clustered, reducing the five curves to two (a maximum and a minimum curve). To classify a test input gesture, the test input accelerometer data was compared to each of the 26 uppercase alphabetic characters, 10 numeric digits, and 16 general symbols. Using max/min bounding curves rather than all training samples greatly simplified storage and computations (by 60%) and allowed real-time performance.
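The two steps above (resampling every trace to a fixed length, then collapsing the five training curves into a max/min envelope) can be sketched as follows. This is a minimal illustration in Python with NumPy, not the students' DSP implementation; the function names and the synthetic training traces are hypothetical.

```python
import numpy as np

def resample_to(signal, n=50):
    """Linearly resample a 1-D accelerometer trace to n samples,
    as the group did to normalize gesture length (and speed)."""
    old = np.linspace(0.0, 1.0, num=len(signal))
    new = np.linspace(0.0, 1.0, num=n)
    return np.interp(new, old, signal)

def bounding_curves(training_traces, n=50):
    """Collapse several training traces for one gesture into a pair of
    max/min envelope curves, replacing all training samples with two."""
    resampled = np.array([resample_to(t, n) for t in training_traces])
    return resampled.max(axis=0), resampled.min(axis=0)

# Five hypothetical single-axis training traces of varying length
rng = np.random.default_rng(0)
traces = [np.sin(np.linspace(0, np.pi, rng.integers(50, 101)))
          for _ in range(5)]
upper, lower = bounding_curves(traces)
```

Storing only the two envelope curves per gesture and axis, instead of all five training traces, is what yields the storage and computation savings the article describes.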

To classify an input gesture, the test input accelerometer curves are matched against the bounding area of the curves for each reference gesture. When a portion of the test input exceeds a reference bounding area, that portion of both the test input and the reference is aligned by dynamic time warping (DTW). This allows the matching to handle gestures made at slow, medium, or fast speeds. Combining the results for all three accelerometer axes gives a measure of how close the test input is to each reference gesture.
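A rough sketch of this matching step, assuming NumPy: samples inside the envelope contribute no cost, and only the out-of-bounds excursion is warped against the nearer bound. This is one plausible reading of the per-portion warping described above, not the students' actual algorithm, and the function names are hypothetical.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def score_against_reference(test, upper, lower):
    """Score one axis of a test gesture against a reference envelope:
    zero cost while the curve stays inside the max/min bounds; DTW the
    out-of-bounds portion against the nearer bound (a simplification)."""
    test = np.asarray(test, dtype=float)
    outside = (test > upper) | (test < lower)
    if not outside.any():
        return 0.0
    nearer = np.where(test > upper, upper, lower)
    return dtw_distance(test[outside], nearer[outside])
```

The test gesture would be scored this way against every reference envelope on all three axes, and the reference with the lowest combined score declared the recognized gesture.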

Casasent noted that "The algorithm and the use of min/max bounds was new to this area of research." The implementation on the DSP was in real time. Very high recognition rates of 94% were possible in some cases. Success was also shown in using one person's test set against another person's training set. This suggests that this system is fairly robust, requires minimal training (only 5 examples of each gesture), and allows use by multiple users and users with differing gesture speeds. This project is a good first step in investigating new ways to implement gesture recognition using simple accelerometers, which are quickly becoming a common component of many entertainment and mobile communications devices.

"This is quite significant since all prior gesture recognition using accelerometer data involved only several (five or six) simple gestures and achieved poorer performance," said Casasent.

DRS Technologies, Inc. is a leading supplier of integrated products, services, and support to military forces, intelligence agencies, and prime contractors worldwide. Focused on defense technology, the company develops, manufactures, and supports a broad range of systems for mission critical and military sustainment requirements, as well as homeland security.

Student project winners from left to right: Jeffrey Panza, Jeffrey Lai, and David Leong


Related People:

David Casasent

Related Links:

Winning student poster