
A Python desktop application (packaged as an exe with tkinter) that fully controls mouse and keyboard through human hand and eye gestures. It works with the help of libraries and packages such as OpenCV, Mediapipe, cvzone, dlib, and Haar cascades.


Gesture-Controller

Team Member:-

Problems:-

  • Many disabled people find it difficult to type on a physical keyboard. Also, in a situation like COVID-19, many people don't want to touch shared devices.
  • When the built-in mouse or keyboard of a laptop stops working properly, using the machine becomes difficult.

Solution (Overview of project):-

  • We built a gesture-based keyboard and mouse interface that uses a webcam, so one can type and point in a new, interactive way. Hand gestures and eye tracking drive the on-screen GUI keyboard and mouse.
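As a concrete sketch of the hand-driven mouse idea (a hypothetical helper, not code from this repo): the fingertip position inside the webcam frame has to be mapped to screen coordinates, usually with a margin so the cursor can reach the screen edges without the hand leaving the camera's view:

```python
def frame_to_screen(x, y, frame_size=(640, 480),
                    screen_size=(1920, 1080), margin=100):
    """Map a fingertip position in the webcam frame to screen coordinates.

    The margin shrinks the active region of the frame so the cursor can
    hit the screen corners comfortably. Frame and screen sizes here are
    illustrative defaults, not values taken from the repo.
    """
    fw, fh = frame_size
    sw, sh = screen_size
    # Clamp into the active region, then linearly rescale to the screen.
    x = min(max(x, margin), fw - margin)
    y = min(max(y, margin), fh - margin)
    sx = (x - margin) * sw / (fw - 2 * margin)
    sy = (y - margin) * sh / (fh - 2 * margin)
    return sx, sy
```

The resulting (sx, sy) would then be handed to a mouse-control library such as pyautogui.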

TKINTER GUI

Key Features:-

  • Developed python exe file setup using tkinter to fully control mouse & keyboard movements by human hand & eye gestures.
  • It works with the help of different libraries and packages like OpenCV, Mediapipe, Cvzone, Dlib, Haar Cascade, etc.
  • Also included Voice & basic storage features. Helpful in such a circumstances like COVID-19.

Working:-

The model detects face and hand landmarks using the following modules:

dlib's 68-point facial landmark detector (ranges below are start-inclusive, end-exclusive):

  • The mouth can be accessed through points [48, 68).
  • The right eyebrow through points [17, 22).
  • The left eyebrow through points [22, 27).
  • The right eye using [36, 42).
  • The left eye with [42, 48).
  • The nose using [27, 35).
  • And the jaw via [0, 17).
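The index ranges above can be captured in a small lookup table (a minimal sketch mirroring the list; `region_points` is a hypothetical helper, not a function from this repo):

```python
# Index ranges (start inclusive, end exclusive) into dlib's
# 68-point facial landmark array, matching the list above.
FACIAL_LANDMARKS = {
    "mouth": (48, 68),
    "right_eyebrow": (17, 22),
    "left_eyebrow": (22, 27),
    "right_eye": (36, 42),
    "left_eye": (42, 48),
    "nose": (27, 35),
    "jaw": (0, 17),
}

def region_points(landmarks, region):
    """Return the landmark points belonging to a named facial region.

    landmarks: sequence of 68 (x, y) points from dlib's shape predictor.
    """
    start, end = FACIAL_LANDMARKS[region]
    return landmarks[start:end]
```

For example, `region_points(landmarks, "right_eye")` yields the six points that the blink detector needs.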

FACE DETECTION

HandtrackingModule's 21 points in hand:

  • Points 4, 8, 12, 16, 20 are the tips of the five fingers, from the thumb tip to the pinky finger tip respectively.
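The fingertip indices above are what gesture logic is usually built on. A minimal sketch of an open-finger check over the 21 landmarks (a hypothetical helper under the usual image convention that y grows downward; the repo's actual gesture logic may differ):

```python
# Fingertip indices in the 21-point hand model, thumb to pinky.
TIP_IDS = [4, 8, 12, 16, 20]

def fingers_up(lm_list):
    """Naive open-finger check for the four non-thumb fingers.

    A finger counts as 'up' when its tip (e.g. index 8) is above the
    joint two landmarks below it (index 6). Image y grows downward,
    so 'above' means a smaller y value. The thumb is skipped because
    it bends sideways rather than up and down.

    lm_list: list of 21 (x, y) tuples.
    Returns a list of four booleans, index finger to pinky.
    """
    return [lm_list[tip][1] < lm_list[tip - 2][1] for tip in TIP_IDS[1:]]
```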

HAND TRACKING

Different packages:

  • cvzone's HandTrackingModule to detect hands.
  • dlib's shape predictor (shape_predictor_68_face_landmarks.dat) and a Haar cascade classifier (frontal_face_detector.xml) to detect faces.
  • imutils to fetch the eye landmarks from dlib's 68-point facial detection.
  • Eye-aspect-ratio (EAR) to detect blinking.
  • pynput to control keyboard events.
  • pyautogui to control mouse events.
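The eye-aspect-ratio mentioned above can be computed from the six eye landmarks that dlib returns per eye. A minimal sketch (the blink threshold is a common choice, not a value confirmed from this repo):

```python
from math import dist

def eye_aspect_ratio(eye):
    """Compute the eye-aspect-ratio (EAR) over six eye landmarks p1..p6
    in dlib's order. EAR stays roughly constant while the eye is open
    and drops sharply toward 0 when it closes, so a blink shows up as
    a dip below a threshold (~0.2 is a common choice)."""
    a = dist(eye[1], eye[5])  # vertical distance p2-p6
    b = dist(eye[2], eye[4])  # vertical distance p3-p5
    c = dist(eye[0], eye[3])  # horizontal distance p1-p4
    return (a + b) / (2.0 * c)
```

In practice a blink is only registered when the EAR stays below the threshold for a few consecutive frames, which filters out noise.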

Dependencies:-

  • CvZone
  • Pynput
  • HaarCascade
  • PyAutoGui
  • Tkinter
  • Mediapipe
  • auto-py-to-exe
  • Dlib
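The dependencies above can typically be installed with pip (a setup sketch; exact package names are assumptions, tkinter ships with most Python installers, the Haar cascade XML files ship with OpenCV, and dlib may need CMake and a C++ toolchain to build):

```shell
pip install opencv-python mediapipe cvzone dlib imutils pynput pyautogui auto-py-to-exe
```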

MAIN IMAGE

References:-

Gesture.Keyboard.mp4
