Hand Gesture Control Apps Using Hand Movements
Keywords:
hand gesture recognition, human-computer interaction, computer vision, MediaPipe, cursor control, touchless interface, real-time tracking, gesture-based navigation, assistive technology, Python automation

Abstract
This project presents a touch-free method for controlling a computer with simple hand gestures. Using only a webcam, users can perform everyday actions, such as moving the mouse cursor, clicking, scrolling, or adjusting system settings like volume and brightness, just by positioning their hands in specific ways. At the heart of the system is MediaPipe, a computer vision framework developed by Google, which detects hand landmarks and tracks finger movements in real time. By analyzing how the fingers are arranged, the program interprets different gestures and converts them into commands for the computer. This gesture-controlled interface makes computer interaction more intuitive and also enhances accessibility, especially for individuals with physical limitations or those seeking hygienic, contactless interaction. It is a meaningful step toward making human-computer interaction more seamless and inclusive.
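The gesture-interpretation step described above can be sketched as follows. The landmark indexing follows MediaPipe's 21-point hand model (normalized coordinates, y increasing downward), but the extension heuristic, gesture names, and finger-state-to-command mapping below are illustrative assumptions, not the paper's exact implementation:

```python
# Illustrative sketch: classify a gesture from MediaPipe-style hand landmarks.
# Assumes `landmarks` is a sequence of 21 (x, y) tuples in MediaPipe order.
# The specific gestures and thresholds are hypothetical examples.

FINGER_TIPS = [4, 8, 12, 16, 20]   # thumb, index, middle, ring, pinky tips
FINGER_PIPS = [3, 6, 10, 14, 18]   # the joint below each tip

def fingers_up(landmarks):
    """Return five booleans indicating which fingers appear extended."""
    states = []
    # Thumb extends sideways: compare x of tip vs. the joint below it
    # (orientation assumes a right hand facing the camera).
    states.append(landmarks[4][0] < landmarks[3][0])
    # Other fingers extend upward: tip above the PIP joint (smaller y).
    for tip, pip in zip(FINGER_TIPS[1:], FINGER_PIPS[1:]):
        states.append(landmarks[tip][1] < landmarks[pip][1])
    return states

def classify_gesture(landmarks):
    """Map a finger-state pattern to an example command name."""
    up = fingers_up(landmarks)
    if up == [False, True, False, False, False]:
        return "move_cursor"   # index finger only
    if up == [False, True, True, False, False]:
        return "left_click"    # index + middle fingers
    if all(up):
        return "scroll"        # open palm
    return "none"
```

In a full system, the returned command name would drive an automation library (for example, moving the cursor or sending a click), with the webcam frames fed through MediaPipe's hand detector to produce the landmark list each frame.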
License

This work is licensed under a Creative Commons Attribution 4.0 International License.
You are free to:
- Share — copy and redistribute the material in any medium or format
- Adapt — remix, transform, and build upon the material for any purpose, even commercially.
Terms:
- Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
- No additional restrictions — You may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.