Hand Tracking & Gesture Control With Raspberry Pi + OpenCV + Python
- Uploaded 26 Jul 2024
- Full Article - core-electronics.com.au/tutor...
Identify and track every joint in the fingers of your hands, live. Then use your hands to send commands that control media software and GPIO-attached hardware, all via a Raspberry Pi single-board computer.
Make sure to use the previous Raspberry Pi 'Buster' OS with this guide.
Related Information
Flashing 'Buster' OS onto a Raspberry Pi - core-electronics.com.au/tutor...
Setting Up a Raspberry Pi as a Desktop - core-electronics.com.au/tutor...
GlowBit Matrix 4x4 Guide - core-electronics.com.au/tutor...
Face Tracking with Pan-Tilt Hat - core-electronics.com.au/tutor...
Facial Recognition Raspberry Pi - core-electronics.com.au/tutor...
Speed Camera with Raspberry Pi - core-electronics.com.au/tutor...
Object and Animal Recognition With Raspberry Pi - core-electronics.com.au/tutor...
How To Use Your Phone to Control Your Raspberry Pi - core-electronics.com.au/tutor...
Python Workshop for Beginners - core-electronics.com.au/tutor...
BuzzBox (What that VLC Video was all about) - core-electronics.com.au/proje...
Machine and deep learning have never been more accessible, as this video will demonstrate. Cameras in combination with machine learning create the most powerful sensor you can ever put on a Raspberry Pi single-board computer. Today is all about real-time hand recognition and finger identification via computer vision, with our Raspberry Pi single-board computer doing all the hard work.

The system built here uses OpenCV, particularly CVZone. This is a huge package that helps solve real-time computer vision and image processing problems. The system also uses MediaPipe for real-time hand identification, which runs a TensorFlow Lite delegate during script operation for hardware acceleration (this guide has it all!). Check the full guide on how to install these correctly and download the scripts. There are other types of gesture recognition technology that will work with a Raspberry Pi 4 Model B. For instance, you can also do hand identification or gesture identification with PyTorch, Haar cascades, or YOLO/YOLOv2 packages, but the MediaPipe dataset and system used in this guide is far superior.

The first script, when run, will identify any hands seen in front of the camera through computer vision and then use machine learning to draw a hand framework over the top of any hands identified. The second script will output to the shell a statement on the total finger count (both up and down) and specific details on whether each finger is up or down. The third and fourth scripts are all about controlling hardware and software with your hands. The third uses a GlowBit Matrix 4x4: the number of fingers you show produces different colours on the matrix. The final script lets you control a VLC media player (play, pause, volume control) all through your fingertips. Gesture volume control success! All the scripts are fully open source and can readily be expanded, taking your projects to amazing places.
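As a rough illustration of the kind of logic the "fingers up or down" script relies on, here is a minimal sketch (not the actual code from the guide), assuming MediaPipe's standard 21-landmark hand model with (x, y) pixel coordinates and a right hand facing the camera:

```python
# Sketch: decide which fingers are raised from MediaPipe-style hand
# landmarks. Landmarks are (x, y) pixel coordinates indexed 0-20 per
# MediaPipe's hand model (assumption: right hand, palm facing camera).

TIP_IDS = [4, 8, 12, 16, 20]   # thumb, index, middle, ring, pinky tips

def fingers_up(landmarks):
    """Return a list of five 0/1 flags, one per finger."""
    fingers = []
    # Thumb: compare the tip's x to the joint below it
    # (image x grows to the right).
    fingers.append(1 if landmarks[4][0] > landmarks[3][0] else 0)
    # Other fingers: a finger is "up" if its tip sits above (smaller y
    # than) the PIP joint two indices below the tip.
    for tip in TIP_IDS[1:]:
        fingers.append(1 if landmarks[tip][1] < landmarks[tip - 2][1] else 0)
    return fingers

# Synthetic example: an open hand where every tip sits above its PIP joint
open_hand = [(i * 10, 200) for i in range(21)]
for tip in [8, 12, 16, 20]:
    open_hand[tip] = (tip * 10, 100)   # fingertips higher in the image
open_hand[3] = (40, 190)
open_hand[4] = (50, 180)               # thumb tip to the right of joint 3

print(fingers_up(open_hand))  # [1, 1, 1, 1, 1]
```

In the real scripts the landmarks come from CVZone's hand detector rather than synthetic tuples, and the thumb's simple x-comparison would need adjusting for a left or mirrored hand.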
If you have any questions about this content or want to share a project you're working on, head over to our maker forum; we are full-time makers and here to help - coreelec.io/forum
Core Electronics is located in the heart of Newcastle, Australia. We're powered by makers, for makers. Drop by if you are looking for:
Raspberry Pi 4 Model B 4GB (Used Here): core-electronics.com.au/catal...
Raspberry Pi High Quality Camera (Used Here): core-electronics.com.au/catal...
Raspberry Pi 6mm Wide Angle Camera Lens: core-electronics.com.au/catal...
Raspberry Pi Official Camera Module V2: core-electronics.com.au/catal...
Raspberry Pi 4 Power Supply: core-electronics.com.au/catal...
0:00 Intro
0:13 Video Overview
0:36 What You Need
1:40 Download the Scripts
2:03 Simple Hand Tracking Script
2:25 First Pay Off
2:40 Tracking More Hands
3:18 X-Y Data of a Single Point on Hand
3:48 Fingers Up or Down Script
4:29 Second Pay Off
5:16 Text to Speech Feature
5:43 GlowBit Matrix GPIO Control Script
6:10 Third Pay Off
6:20 GlowBit Script Explanation
8:53 Accessibility/Media Control Script
9:15 Final Pay Off
9:42 Macro and Script Explanation
12:15 Outro
The following trademarks are owned by Core Electronics Pty Ltd:
"Core Electronics" and the Core Electronics logo
"Makerverse" and the Makerverse logo
"PiicoDev" and the PiicoDev logo
"GlowBit" and the GlowBit logo
Finally, the tutorial I've been awaiting the most. Just love it
So glad!
Awesome dude, I just got my Pi 4 from you guys today.
Keep it up man, you are awesome and simple
This is absolutely bonkers!
The project is incredible, and I'm blown away by the quality of the tutorial. Keep up the awesome work man!
I think so too 🙂 and cheers mate, very kind words! I will keep my head down and knock out some more.
I am amazed at all the effort that you must have put into these projects. Thank you so much.
My pleasure mate 🙂 hopefully you try some out for yourself!
So nice. Thank you so much. I love it very much, great tutorial!!! Much appreciated!
I'm creating an invention and I really need the hand gestures, so thanks 🙏 😊
Faaaaantastic!!
Thank you for the wonderful project and tutorial. I just want to ask how I can access those specific joints in the hand so that I can compare my custom-made specific hand gesture?
Keep it up, nice video clip, thank you for sharing it :)
Will do my best cheers mate 😊
Damn, feels just like Tony Stark, awesome thx!
very nice.
You should give this a try with a Coral TPU ?
Which I'm pretty sure works nicely with a Pi and OpenCV.
Using a Coral TPU definitely provides a significant boost to performance for all computer vision tasks when utilising a Raspberry Pi. Definitely something worth doing 😊
👌👌excellent project, thanks for sharing
Thank you! Cheers!
lol it doesn't work
Thanks for the video, but can you make a pre-installed OpenCV and TensorFlow .img format or OS for Raspberry Pi?
this in VR would be epic
It’s already there
Is this technology affected by high volume of UV light (ie light generated when arc welding) or would the camera require a lens to reduce light? Also could it track the light vs a reference point to determine how fast the light is moving linearly (2D and 3D)?
Would this model be a plug and play with a Coral USB Accelerator or is there other tasks when adding the Coral USB Accelerator?
That's clever.
I didn't realize a Pi had enough grunt to do live image recognition like that.
Yeah, it's pretty impressive. Obviously a much beefier computer will get faster frame rates and be able to track more objects simultaneously, but for getting started in computer vision systems it's a great starting point.
@@Core-Electronics doesn't work.
Brilliant ❤️
Thanks 😊
is it possible to make this on Raspberry pi Zero 2 W?
Hi, thank you for wonderful project, i just want to ask what method did you use for this hand gesture recognition?
Cheers mate, MediaPipe and OpenCV working together are the packages that make the backbone of this Machine Learned system.
interesting, I wonder how much better this can get with dual cameras on the pi5 now
Hi , can this method be used in dark room where u will be on ur couch ?
hi Tim, how can I add thumb tracking for the Are Fingers up or Down.py code? I've added the thumb ID (4) to the list, but not sure what I need to adjust afterwards. Can you please assist? Thanks in advance!
Hey mate, you definitely can add location tracking of certain joints to | Are Fingers Up or Down.py |. If you have downloaded the code from the article, open up the | Simple-Hand-Tracker.py | script. In it you will see a section that has been commented out, which is the code I used to find the location of the index finger in the video.
Copy that section across to your desired script and replace the index finger ID number 8 with the thumb ID number 4. Hope that helps!
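In case it helps, here's a rough sketch of what that landmark lookup usually looks like. This is an illustrative assumption, not the exact code from the downloaded script; it assumes a CVZone/MediaPipe-style landmark list where each entry is [id, x, y]:

```python
# Hypothetical sketch: pull one joint's pixel position out of a
# MediaPipe-style landmark list, where each entry is [id, x, y].

THUMB_TIP_ID = 4   # swap in 8 for the index fingertip

def joint_position(lm_list, joint_id):
    """Return the (x, y) pixel position of a landmark id, or None."""
    for lm_id, x, y in lm_list:
        if lm_id == joint_id:
            return (x, y)
    return None

# Tiny synthetic landmark list, just for illustration
lm_list = [[3, 40, 190], [4, 55, 160], [8, 120, 80]]
print(joint_position(lm_list, THUMB_TIP_ID))  # (55, 160)
```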
So I have a plan to create a software that records the motion of your hand and essentially moves a prosthetic arm the same way. Do you think that software would possibly be able to accomplish that goal? And, I'm just wondering if you think that this concept would be able to match a hand's movement well enough. It's fine if you don't know :), I'm just curious. It's for a school project far off in the future but I'm planning in advance because engineering is my passion so I want to succeed while having fun
This setup is definitely pushing the Raspberry Pi very hard. However, if you spare the Raspberry Pi from printing each frame to the screen and overlaying the wireframe over the hand (by customising the main scripts), there will be enough computing power inside an RPi 4 to do your task without lag 😊.
Mind you I even did that! Just remembered - Come check the Where to Now Section in the main guide and you can see me controlling a hand - core-electronics.com.au/guides/raspberry-pi/hand-identification-raspberry-pi/
And a good mate Garry edited the provided code here to control a Lego hand he created - czcams.com/video/cadqkqh0zAY/video.html
@@Core-Electronics Thank you very much! So essentially it could very well be possible, just would take a bit of tweaking. Also, I'll definitely check that out! Seems very neat. Thank you for your time in replying!
If someone runs into the problem that some files in mediapipe can not be found: try downgrading your mediapipe version. Uninstall the version you tried before. Install an older version as root(can be found on pypi). That fixed the problem for me.
Hi! And what Python version did you use? 3.9? Thonny now only supports 3.8 and up.
Heya! I've been trying to get this setup working with my Raspberry Pi 3B, running Buster (as mentioned in your article), along with an Arducam IMX519 (It's a 3rd party alternative to native Pi Cameras). The only issue is that my Arducam requires a Pi running Bullseye to work. Since it's been over a year since you put this video out, I was wondering if I could replicate your setup but switch buster for bullseye?
Definitely would be sweet to update all the AI guides for the newer Bullseye OS. It has a very different software architecture to Buster, so I'm hesitant to take the plunge. I'm very sure you can make that camera work with Buster OS - here is a forum post explaining how - forum.arducam.com/t/16mp-autofocus-raspbian-buster-no-camera-available/2464/6
@@Core-Electronics appreciate the response. Will give this a spin 👍🏾
Hi bro can u help us with our project? I have a few questions
How do you get it so smooth? I'm doing my own project, and when I try to track a body, I'm getting maybe one frame every few seconds. I don't mind some frame rate drop but this is unusable. I increased swap size but this didn't seem to help.
Definitely best to use a Raspberry Pi 4 Model B running absolutely nothing else except for the hand tracking system. You can lower the preview window size, which will increase your frame rate if you're using an earlier Raspberry Pi.
Will a Raspberry Pi 3 work? It seems that the Raspberry Pi 4 is in very short stock, so I was wondering if a Raspberry Pi 3 has enough power. I won't be using the High Quality Camera but the V2 camera module. I also want to connect this to an Arduino Uno; will this be possible?
I reckon hand and finger recognition may be asking too much of a Raspberry Pi 3, but I'd love to be proven wrong. And there are lots of ways to hook up a Raspberry Pi to send information/instructions to an Arduino, so that won't be an issue; they also both conveniently run at 5 volts.
Hello, does this work without connecting to a pc?
Is it possible to remove the connection from the pc and hand detect it after coding?
Hey mate, this system is completely running on the Raspberry Pi 4 Model B. You can definitely run this system Headless (without a Monitor/Display) and still control hardware through hand signals.
Will this work on the 64bit versions of raspbian?
No doubt the teams at Open-CV and Raspberry Pi are working furiously to reach full compatibility with the new OS versions. That hasn't happened yet, so until then I would recommend using the previous Raspberry Pi 'Buster' OS with this guide.
Hey, I have been trying to use hand gesture recognition as the base for my medical application, which is for patients to communicate with the nurses or doctors, and I am trying to use this setup, but I am stuck. Can you please help?
I am using raspberry pi 3 model B
Love your project idea! And absolutely I can help. The best place to ask questions is at our forum here - forum.core-electronics.com.au/ - Pop through some images of the hardware and screen grabs of any software issues and we'll sort it !
Hi, is there any updated info since the Raspberry Pi has been upgraded? Maybe an intro on the Raspberry Pi 5?
This guide doesn't yet work with the new Bookworm OS, and unfortunately the Pi 5 right now only works on Bookworm OS. We have some updates for these vision videos in the works though!
I have paralysis in my fingers, but I can move my wrist, elbows and shoulders. My fingers and thumbs just kind of flop around as they are always in a relaxed state. Can I use this technology to recognize custom arm gestures while sitting in a wheelchair in front of the camera? I want to be able to use both arms to navigate and control things in the Metaverse As well as mechanical devices. I would just be interested in rotational and positioning information from the wrist, elbow and shoulder joints to simulate a virtual joystick, for example
These models are pre-baked to identify certain landmarks (finger poses) and return their status. It seems like you might be able to achieve the effect you're after by combining them into the gestures you require. It sounds like an awesome project; perhaps it's better to take it to the forum where we can share screenshots, code snippets and other helpful resources: forum.core-electronics.com.au/
@@Core-Electronics so you’re saying that it doesn’t really matter what position my hands are in? It will record that as a custom gesture? Because there is no way I could make a ✌🏼 or the 🖕 unfortunately LOL or even a 🤛
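For anyone wondering how custom gestures could be compared, one simple approach (not from the guide, and the function names and threshold here are illustrative assumptions) is to normalise a recorded landmark set for position and scale, then measure its distance to the live one:

```python
import math

def normalise(landmarks):
    """Shift/scale (x, y) landmarks so pose comparison ignores where
    the hand sits in the frame and how big it appears."""
    xs = [p[0] for p in landmarks]
    ys = [p[1] for p in landmarks]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    scale = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - cx) / scale, (y - cy) / scale) for x, y in landmarks]

def matches_gesture(live, template, threshold=0.1):
    """True if the live landmarks are close to a stored template pose."""
    a, b = normalise(live), normalise(template)
    error = sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)
    return error < threshold

# A recorded template matches itself, even when shifted and scaled
template = [(10, 10), (20, 40), (30, 20), (40, 50)]
shifted = [(x * 2 + 100, y * 2 + 50) for x, y in template]
print(matches_gesture(shifted, template))  # True
```

Because the comparison works on whatever landmarks you feed it, the same idea should carry over to arm or wrist joints from a pose model, not just fingers.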
Don't get your hopes up, this doesn't work and is more than painful to install
Congratulations for your amazing work.
I have followed the article's instructions, but unfortunately when I try to import mediapipe in the Python script an error appears:
Traceback (most recent call last):
File "", line 1, in
File "/usr/local/lib/python3.7/dist-packages/mediapipe/__init__.py", line 16, in
from mediapipe.python import *
File "/usr/local/lib/python3.7/dist-packages/mediapipe/python/__init__.py", line 17, in
from mediapipe.python._framework_bindings import resource_util
ImportError: libImath-2_2.so.23: cannot open shared object file: No such file or directory
Any suggestions??
Heyya mate, thank you very kindly. My first two thoughts are whether you used the older 'Buster' Raspberry Pi OS, or whether you skipped one of the MediaPipe installation terminal commands. Type the following lines one by one into a new terminal window to try to get MediaPipe working.
sudo pip3 install mediapipe-rpi3
sudo pip3 install mediapipe-rpi4
sudo pip3 install gtts
sudo apt install mpg321
We have a heap of successful troubleshooting that you can find at the comment section of the full written up article - core-electronics.com.au/guides/hand-identification-raspberry-pi/
Have you guys had any problems installing mediapipe? I get an error when importing mediapipe in Python
Come check the comments at the bottom of the full write up. There's a solution for you down there in the forum section😊
Sir how to take audio from raspberry pi 4... Please help me
Come create a forum post if you need some expert help 😊 - forum.core-electronics.com.au/
Identify the joint in my hand?.... Lemon Kush
Our technology isn't quite there yet 😅
Can you give us the source code of a virtual mouse using the Raspberry Pi 4 and the Raspberry Pi Camera V2? Thank you.
All the scripts used here can be downloaded from the bottom of the article page.
I have yet to create a virtual mouse via computer vision using the Raspberry Pi (if I do I will tell you), but a video to check out in regards to building this kind of system from first principles is this - czcams.com/video/iBwMi9iDZmQ/video.html
@@Core-Electronics thank you ☺
Hi Tim, how can I fix this error:
from mediapipe.python._framework_bindings import resource_util
ModuleNotFoundError: No module named 'mediapipe.python._framework_bindings'
Seems to me one of the hand tracking packages hasn't installed correctly. Are you using the older 'Buster' Raspberry Pi OS version? As doing so is important to do until all the packages are updated for the newer 'Bullseye' Raspberry Pi OS version.
Come check the bottom of the full article, lots of troubleshooting to be found there. Also if you need more troubleshooting help pop a message over there as I can help you better over there 😊 we'll get your system working.
@@Core-Electronics MediaPipe does not work on Raspbian Bullseye; it works with Raspbian Buster
For some reason, when I run any of the scripts they don't seem to work properly; they just have a purple filter over the camera and nothing else. I also get this error code: [ WARN:0@4.204] global /home/pi/opencv/modules/videoio/src/cap_gstreamer.cpp (1405) open OpenCV | GStreamer warning: Cannot query video position: status=0, value=-1, duration=-1
INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
Hey mate, just to start with the simple stuff first: double check you have the camera enabled in the Raspberry Pi Configuration and that the camera is connected correctly to the CSI port. If you keep running into issues, pop me a message in the comment section of the article, as I will be able to help you more easily there 🙂
The GStreamer warning will go away if you use this line:
cap = cv2.VideoCapture(0, cv2.CAP_V4L2)
Can we use a Raspberry Pi 3?
For this task I would recommend using a Raspberry Pi 3B+ or better. Otherwise the FPS is too slow for snappy responses.
Hey mate, I ran the script yesterday and it worked well. Today nothing has changed; the window pops up but won't track my hand... it's also not showing any errors. I already reconnected the camera, updated my system, downloaded the script again and opened it again, but it won't work. Anybody have some ideas?
Double check for me that you are running the older 'Buster' Raspberry Pi OS. There is also a great resource of successful troubleshooting on our Core Electronics Forum in relation to OpenCV/machine-learned systems. Come write a post there if you keep running into issues and we'll work it out there 😊
Same with me. The code runs without any error, but it does not track my hand. Running on Buster, Python 3.7.3, OpenCV 4.5.5.
How hard would this be to implement into Gnome to control the desktop? I have lots of ideas, but unfortunately I'm a visionary, not a programmer.
I'm doing it in Windows; the biggest problem is that I want the app to let people change gestures.
@@maanuu1687 maybe have an options GUI, make it simple, add a number of different gestures, and maybe for each hand as well.
If you could do this in Gnome, then you could get that "minority report" interface everyone's always referencing.
@@roracle Yeah, that's what I'm doing, using Flet for the GUI, and 16 gestures (all except for the mouse and sound gestures are interchangeable, using specific parameters such as hand orientation). In the future, maybe I'll add more. My main focus right now is Windows, but maybe when it's finished I can try it on GNOME. It should work on Linux because the os library is compatible, but I might have to make some changes. If you'd like, I can let you know when it's finished
@@maanuu1687 do you have a GitHub project page?
Can we write this data to a .bvh file and import it into Autodesk Maya?
For sure, all the python scripts are open source and you really can do anything software wise with programming 🙂
@@Core-Electronics if possible, please make a video about it.
The download scripts are not available, can you send the link please?
Code should be available at the bottom of the article or in the comment section. If you can't see it, pop me a reply and we'll figure out what's happening.
How would I stop the camera from lagging?
If you want an instant speed boost using this hardware consider checking out the Coral USB Accelerator.
This video shows a nice comparison between using it and not using it for different machine-learned computer vision systems on a Raspberry Pi 4 Model B. Around 6:30 is the part you'd want to check out. czcams.com/video/7gWCekMy1mw/video.html
@@Core-Electronics Thanks, mate
@@Core-Electronics it seems there's still a chip shortage; the Coral USB Accelerator is out of stock and some people are selling them at a high premium ☹️
@@Core-Electronics I have looked at the Intel® Neural Compute Stick 2, would this be a good option?
Hey mate, don't know how good the Raspberry Pi support is for this but it is definitely in the right ballpark.
Hmm, maybe sign language to text or speech might be a good application?
Absolutely, it would be an awesome project.
How can I learn Python like you do?
W3Schools is a good beginner option
When I set up OpenCV, there is an error. Should I continue the process with the error?
Hey, if you are still having issues we have a forum topic specifically on this video that might have some helpful information. If you are having a unique issue, feel free to post in there, we have lots of maker eyes over there that can help!
forum.core-electronics.com.au/t/hand-recognition-and-finger-identification-with-raspberry-pi-and-opencv/12705/58
@@Core-Electronics Thank you so much! Your video is very helpful to raspberry pi user!
I need help with this step on the website: "If it fails at any point and you receive a message like | make: *** [Makefile:163: all] Error 2 | just re-type and enter the above line | make -j $(nproc) |". The same thing happens for me; I repeated that command but the same error comes again and again 😢
Please reply ASAP
Sorry to hear you are having issues, we have a dedicated community forums post that might have some helpful information, if not feel free to chuck a post there with your setup and problem, we have lots of maker eyes over there that can help!
forum.core-electronics.com.au/t/hand-recognition-and-finger-identification-with-raspberry-pi-and-opencv/12705
@@Core-Electronics I have already posted there as well (58th post)
A Bit of trivia for you from an American Sign Language Interpreter… The gesture You and many others use as ‘Rock & Roll’ Really means ‘Bullshit!’ In American Sign Language. I laugh every time I see it being used.
Haha, thanks for sharing
Is anyone able to implement it in 2024?
+1 also I want to control with voice and an assistant like chatgpt
I'm doing an app that will allow people to control things like sound, mouse, drag, stopping video, etc., with gestures that you can change to others that suit you better. It's my final degree project.
How can I contact you? I need your help for my college project
@@maanuu1687
Didn't work even though I did everything correctly
Is it possible to run this on a RPi zero 2W??
Made the mistake of updating my WIP robot with Bullseye, so now I can move it around but not using its camera in python, doh
I made a quick guide on how to deal with the new camera terminal commands for 'Bullseye' OS. It might come in handy for you - core-electronics.com.au/tutorials/raspberry-pi-bullseye-camera-commands.html
@@Core-Electronics yeah saw that, had a play, still waiting on python libraries, might format a new card and go back to buster need to play with open CV too as I have the pi cam on the robot frame pointing forward and a USB cam on a servo pan tilt for looking around. Still such a coding n00b.
I love the sound of that project! Excited to see what you come up with 😊 come make a post on our forum to show it off/if you need any help forum.core-electronics.com.au/
How much RAM and CPU does the RPi use for this project? Does it overheat?
Computer vision is always very intensive for the Raspberry Pi 4 Model B. Memory usage on my 4GB Pi sits at around ~400MB and the CPU is at around 111% while the script is running. These are the values I get from | htop | as the code is running.