This paper is published in Volume-6, Issue-3, 2020
Area
Computer Science And Engineering
Author
Omansh Srivastava, Shivang Singh, Ritesh Prasad, Leander Leo Lagardo, Thaseen Taj
Org/Univ
Don Bosco Institute of Technology, Bangalore, Karnataka, India
Pub. Date
12 June, 2020
Paper ID
V6I3-1511
Publisher
Keywords
Gesture Recognition, Human-Computer Interface (HCI), Convolutional Neural Network (CNN), Contour Detection

Citations

IEEE
Omansh Srivastava, Shivang Singh, Ritesh Prasad, Leander Leo Lagardo, and Thaseen Taj, "Hand gesture control used for automating human activities," International Journal of Advance Research, Ideas and Innovations in Technology, vol. 6, no. 3, 2020, www.IJARIIT.com.

APA
Omansh Srivastava, Shivang Singh, Ritesh Prasad, Leander Leo Lagardo, Thaseen Taj (2020). Hand gesture control used for automating human activities. International Journal of Advance Research, Ideas and Innovations in Technology, 6(3). www.IJARIIT.com.

MLA
Omansh Srivastava, Shivang Singh, Ritesh Prasad, Leander Leo Lagardo, Thaseen Taj. "Hand gesture control used for automating human activities." International Journal of Advance Research, Ideas and Innovations in Technology 6.3 (2020). www.IJARIIT.com.

Abstract

We designed a real-time human-computer interaction system based on hand gestures. The system consists of three components: hand detection, gesture recognition, and human-computer interaction (HCI) driven by the recognition results, and it realizes robust control of mouse and keyboard events with high gesture-recognition accuracy. Specifically, we use a convolutional neural network (CNN) to recognize gestures, which makes it possible to identify relatively complex gestures using only one inexpensive monocular camera. We introduce a Kalman filter to estimate the hand position, based on which the mouse cursor is controlled in a stable and smooth manner. During the HCI stage, we develop a simple strategy to avoid false recognition caused by noise, mostly transient false gestures, and thus improve the reliability of interaction. The developed system is highly extensible and can be used in human-robot or other human-machine interaction scenarios with command formats more complex than just mouse and keyboard events.
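To illustrate the cursor-smoothing idea mentioned in the abstract, the sketch below applies a constant-velocity Kalman filter to noisy 2-D hand positions. This is a minimal, hypothetical example, not the authors' implementation; the state layout, time step, and noise covariances are all assumed values chosen for demonstration.

```python
import numpy as np

class CursorKalman:
    """Constant-velocity Kalman filter for smoothing 2-D hand positions.

    State vector: [x, y, vx, vy]. All parameters are illustrative
    assumptions, not values taken from the paper.
    """

    def __init__(self, dt=1 / 30, process_var=1e-2, meas_var=4.0):
        # State transition: position advances by velocity * dt each frame.
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        # We only measure position (from the hand detector), not velocity.
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = process_var * np.eye(4)   # process noise covariance
        self.R = meas_var * np.eye(2)      # measurement noise covariance
        self.x = np.zeros(4)               # initial state estimate
        self.P = 500.0 * np.eye(4)         # large initial uncertainty

    def update(self, z):
        """Predict one step, then correct with measurement z = (x, y)."""
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct
        y = np.asarray(z, dtype=float) - self.H @ self.x  # innovation
        S = self.H @ self.P @ self.H.T + self.R           # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)          # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]  # smoothed cursor position
```

In use, each noisy detection from the hand detector is passed to `update`, and the returned estimate drives the cursor; because the filter also tracks velocity, the cursor keeps moving smoothly through brief detection jitter.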