Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review - Huang 1997

Hand gestures offer an alternative to cumbersome interface devices for human-computer interaction (HCI). Visually interpreting hand movements can help make HCI easy and natural, which has fueled research on computer vision-based analysis and interpretation of hand gestures. We review the visual interpretation of hand gestures in the context of HCI. The discussion is organized around the modeling, analysis, and recognition of gestures. Whether a 3D model or an image-appearance model of the human hand is employed shapes how gestures are interpreted. 3D hand models allow more elaborate modeling of hand gestures but pose computational difficulties given HCI's real-time requirements. Appearance-based models lead to computationally efficient "purposive" techniques that work well in constrained situations but lack the generality HCI demands. We describe implemented gestural systems as well as other vision-based applications of gesture recognition. Although progress is encouraging, further theoretical and computational advances are needed before gestures can be widely used for HCI. We discuss future directions in gesture recognition, especially its integration with other natural modes of human-computer interaction.

Pavlovic, V. I., Sharma, R., & Huang, T. S. (1997). Visual interpretation of hand gestures for human-computer interaction: A review. IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(7), 677-695.
