Gesture-based-Volume-Controller

The project uses OpenCV to detect the distance between the tip of the thumb and the tip of the index finger (referred to below as the "fingertips"). pycaw then maps that distance proportionally to the system volume.
The Handtrackmodule in src returns the live location of any of the 21 landmark points on the hand.
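
A minimal sketch of that pipeline, assuming MediaPipe for the 21 hand landmarks and pycaw for the Windows volume endpoint (the repo's own Handtrackmodule in src is not reproduced here, and the 30–250 px pinch range is an assumed calibration). For simplicity this sketch maps the raw on-screen distance; the scale-invariant variant is described in the next section.

```python
# Sketch only: maps the on-screen thumb-index distance to system volume.
# Assumes Windows + pycaw + mediapipe; the repo's Handtrackmodule is not used here.
import math
from ctypes import cast, POINTER

import cv2
import mediapipe as mp
import numpy as np
from comtypes import CLSCTX_ALL
from pycaw.pycaw import AudioUtilities, IAudioEndpointVolume

# Set up the pycaw endpoint for the default speakers.
devices = AudioUtilities.GetSpeakers()
interface = devices.Activate(IAudioEndpointVolume._iid_, CLSCTX_ALL, None)
volume = cast(interface, POINTER(IAudioEndpointVolume))
min_db, max_db, _ = volume.GetVolumeRange()

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    h, w, _ = frame.shape
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        lm = results.multi_hand_landmarks[0].landmark
        # Landmark 4 = thumb tip, landmark 8 = index fingertip.
        x1, y1 = lm[4].x * w, lm[4].y * h
        x2, y2 = lm[8].x * w, lm[8].y * h
        pinch = math.hypot(x2 - x1, y2 - y1)
        # Map an assumed 30-250 px pinch range onto the dB volume range.
        level = np.interp(pinch, [30, 250], [min_db, max_db])
        volume.SetMasterVolumeLevel(level, None)
    cv2.imshow("volume", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```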

How is this project unique?

In this project the volume depends on the real-world distance between the fingertips, not their on-screen distance.
Example: if the real-world fingertip distance is "x", the volume stays almost the same for any on-screen distance ("y1", "y2", "y3", etc.), i.e. regardless of how far the hand is from the camera. A sketch of one way to achieve this follows.
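
The README does not spell out how the real-world distance is estimated, so the sketch below shows one common approach as an assumption, not necessarily the repo's exact method: normalize the thumb-index distance by a reference length on the same hand (here wrist, landmark 0, to index MCP, landmark 5), so the scale factor introduced by the hand's distance from the camera cancels out.

```python
# Sketch of one way to make the pinch measure scale-invariant (an assumption,
# not necessarily the repo's exact method): divide the thumb-index distance by
# a reference length on the same hand, so the camera distance cancels out.
import math

# MediaPipe 21-point hand model indices.
THUMB_TIP, INDEX_TIP = 4, 8
WRIST, INDEX_MCP = 0, 5


def normalized_pinch(landmarks) -> float:
    """Return the pinch distance divided by a hand-size reference length.

    Both distances shrink or grow together as the hand moves toward or away
    from the camera, so their ratio tracks the real-world finger gap.
    """
    def dist(a, b):
        return math.hypot(landmarks[a].x - landmarks[b].x,
                          landmarks[a].y - landmarks[b].y)

    reference = dist(WRIST, INDEX_MCP)  # roughly constant for a given hand
    return dist(THUMB_TIP, INDEX_TIP) / max(reference, 1e-6)
```

With this ratio, np.interp can map an assumed calibrated range such as [0.2, 2.0] to the pycaw dB range instead of a pixel range, so the same finger gap gives roughly the same volume at any distance from the camera.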

About

Now controlling the volume is just a matter of fingertips!
