A real-time hand gesture interface for a medical image guided system (Conference Paper)


  • Abstract: In this paper, we consider a vision-based system that can interpret a user's gestures in real time to manipulate objects within a medical data visualization environment. Dynamic navigation gestures are translated to commands based on their relative positions on the screen. Static gesture poses are identified to execute non-directional commands. This is accomplished by using Haar-like features to represent the shape of the hand. These features are then input to a Fuzzy C-Means Clustering algorithm for pose classification. A probabilistic neighborhood search algorithm is employed to automatically select a small number of Haar features, and to tune the fuzzy c-means classification algorithm. The gesture recognition system was implemented in a sterile medical data-browser environment. Test results on four interface tasks showed that the use of a few Haar features …
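The abstract describes classifying hand-pose feature vectors with fuzzy c-means clustering. As a rough illustration of that step, the following is a minimal sketch of the standard fuzzy c-means (Bezdek) update in Python with NumPy; it is not the paper's tuned implementation, and the function name, parameters, and defaults here are assumptions for demonstration only.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=50, seed=0):
    """Standard fuzzy c-means clustering sketch.

    X      : (n, d) array of feature vectors (e.g. Haar feature responses)
    c      : number of clusters (gesture poses)
    m      : fuzziness exponent, m > 1
    Returns (centers, U) where U is an (n, c) membership matrix
    whose rows sum to 1.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Random initial memberships, normalized so each row sums to 1.
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        Um = U ** m
        # Cluster centers: membership-weighted means of the data.
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Distances from every point to every center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)  # guard against divide-by-zero
        # Membership update: u_ij proportional to d_ij^(-2/(m-1)).
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U
```

At classification time, a new pose's feature vector would be assigned to the cluster with the highest membership value; the paper additionally tunes this classifier and selects the Haar features automatically via a probabilistic neighborhood search.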

publication date

  • January 1, 2006