A holistic framework for hand gestures design Academic Article

abstract

  • Hand-gesture-based interfaces are a proliferating area for immersive and augmented reality systems because of the rich interaction this modality provides. Although the proper design of such interfaces requires accurate recognition together with usability, ergonomic design, and comfort, most interfaces under development focus primarily on accurate gesture recognition. Formally, an optimal hand gesture vocabulary (GV) can be defined as a set of gesture-command associations such that the time τ to perform a task is minimized over all possible hand gestures in our ontology. In this work, we consider three cost functions as proxies for task completion time: intuitiveness Z1(GV), comfort Z2(GV), and recognition accuracy Z3(GV). Maximizing Zi(GV) for i = 1, 2, 3 over all GVs is therefore our multiobjective optimization problem (MOP). Because finding solutions to the MOP requires a large amount of computation time, an analytical methodology is proposed in which the MOP is converted to a dual-priority objective problem: recognition accuracy is of prime importance, and the human performance objectives are secondary. In contrast to the authors' previous research, this work focuses on two aspects. First, a modified cost function for an enhanced simulated annealing approach is explained and implementation issues are discussed (an illustrative sketch of this dual-priority search is given after the abstract). Second, a comparative study is performed between hand gesture vocabularies obtained with the proposed methodology and vocabularies hand-picked by individuals. The superiority of our method is demonstrated in the context of a robotic vehicle control task using hand gestures.
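
implementation sketch

  • A minimal Python sketch of the dual-priority simulated annealing described in the abstract. The gesture and command sets, the per-gesture accuracy and comfort scores, the intuitiveness table, and the accuracy floor are all illustrative assumptions introduced here, not the authors' data or actual cost function; recognition accuracy is treated as the primary objective through a heavy penalty term, while intuitiveness (Z1) and comfort (Z2) are optimized as secondary terms.

    # Hypothetical sketch of the dual-priority search: recognition accuracy acts
    # as the primary objective (penalty term), intuitiveness and comfort are
    # secondary. All names, scores, and thresholds are illustrative assumptions.
    import math
    import random

    GESTURES = ["fist", "open_palm", "point", "thumb_up", "wave", "pinch"]
    COMMANDS = ["forward", "back", "left", "right", "stop", "faster"]

    # Illustrative per-gesture scores in [0, 1] (assumed, not from the paper).
    ACCURACY = {g: random.uniform(0.7, 1.0) for g in GESTURES}
    COMFORT = {g: random.uniform(0.0, 1.0) for g in GESTURES}
    # Illustrative intuitiveness of each gesture-command pairing.
    INTUITIVE = {(g, c): random.uniform(0.0, 1.0) for g in GESTURES for c in COMMANDS}

    ACCURACY_FLOOR = 0.8  # assumed threshold for the primary (accuracy) objective

    def cost(vocab):
        """Lower is better. A vocabulary maps each command to one gesture."""
        gestures = vocab.values()
        # Primary term: heavy penalty when any gesture falls below the accuracy floor.
        penalty = 100.0 * sum(max(0.0, ACCURACY_FLOOR - ACCURACY[g]) for g in gestures)
        # Secondary terms: reward intuitiveness (Z1) and comfort (Z2).
        z1 = sum(INTUITIVE[(g, c)] for c, g in vocab.items())
        z2 = sum(COMFORT[g] for g in gestures)
        return penalty - (z1 + z2)

    def neighbor(vocab):
        """Swap the gestures assigned to two randomly chosen commands."""
        new = dict(vocab)
        a, b = random.sample(COMMANDS, 2)
        new[a], new[b] = new[b], new[a]
        return new

    def anneal(steps=20000, t0=1.0, cooling=0.9995):
        """Simulated annealing over one-to-one gesture-command assignments."""
        vocab = dict(zip(COMMANDS, random.sample(GESTURES, len(COMMANDS))))
        best, best_cost, t = vocab, cost(vocab), t0
        for _ in range(steps):
            cand = neighbor(vocab)
            delta = cost(cand) - cost(vocab)
            if delta < 0 or random.random() < math.exp(-delta / t):
                vocab = cand
                if cost(vocab) < best_cost:
                    best, best_cost = vocab, cost(vocab)
            t *= cooling
        return best, best_cost

    if __name__ == "__main__":
        vocab, c = anneal()
        print("selected vocabulary:", vocab, "cost:", round(c, 3))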

publication date

  • January 1, 2008