Hand Tracking & Gesture Control With Raspberry Pi + OpenCV + Python

  • Published 26 Jul 2024
  • Full Article - core-electronics.com.au/tutor...
    Identify and track every joint in the fingers of your human hands, live. Then use your human hands to send commands to control media software and GPIO attached hardware. All via a Raspberry Pi Single Board Computer.
    Make sure to use the Previous Raspberry Pi 'Buster' OS with this Guide.
    Related Information
    Flashing 'Buster' OS onto a Raspberry Pi - core-electronics.com.au/tutor...
    Setting Up a Raspberry Pi as a Desktop - core-electronics.com.au/tutor...
    GlowBit Matrix 4x4 Guide - core-electronics.com.au/tutor...
    Face Tracking with Pan-Tilt Hat - core-electronics.com.au/tutor...
    Facial Recognition Raspberry Pi - core-electronics.com.au/tutor...
    Speed Camera with Raspberry Pi - core-electronics.com.au/tutor...
    Object and Animal Recognition With Raspberry Pi - core-electronics.com.au/tutor...
    How To Use Your Phone to Control Your Raspberry Pi - core-electronics.com.au/tutor...
    Python Workshop for Beginners - core-electronics.com.au/tutor...
    BuzzBox (What that VLC Video was all about) - core-electronics.com.au/proje...
    Machine and deep learning have never been more accessible, as this video will demonstrate. Cameras in combination with machine learning create the most powerful sensor you can ever put on a Raspberry Pi Single Board Computer. Today is all about real-time hand recognition and finger identification via computer vision, with our Raspberry Pi single board computer doing all the hard work. The system built here uses OpenCV, particularly CVZone. This is a huge package that helps solve real-time computer vision and image processing problems. The system also uses MediaPipe for real-time hand identification, which runs a TensorFlow Lite delegate during script operation for hardware acceleration (this guide has it all!). Check the full guide for how to install these correctly and to download the scripts.
    There are other types of gesture recognition technology that will work with a Raspberry Pi 4 Model B. For instance, you can also do hand identification or gesture identification with PyTorch, Haar cascades, or YOLO/YOLOv2 packages, but the MediaPipe dataset and system used in this guide is far superior.
    The first script, when run, will identify any hands seen in front of it through computer vision and then use machine learning to draw a hand framework over the top of any hands identified. The second script will output to the shell a statement on total finger count (both up and down) and specific details of whether each finger is up or down. The third and fourth scripts are all about controlling hardware and software with your hands. The third uses a GlowBit Matrix 4x4: the number of fingers you show produces different colours on the matrix. The final script lets you control a VLC media player (play, pause, volume control) all through your fingertips. Gesture volume control success! All the scripts are fully open source and can readily be expanded, taking your projects to amazing places.
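The "fingers up or down" check that the second script performs can be sketched with plain landmark geometry. This is an illustrative reimplementation rather than the downloadable script itself: it assumes MediaPipe's 21-point hand landmark numbering (fingertips at indices 8, 12, 16 and 20, with a joint two indices below each tip) and image coordinates where y grows downward.

```python
# Hedged sketch: classify four fingers (index..pinky) as up or down
# from MediaPipe-style hand landmarks. Assumes `landmarks` is a list
# of (x, y) pairs in image coordinates (y increases downward),
# indexed 0-20 as in MediaPipe Hands; not the guide's exact script.

TIP_IDS = [8, 12, 16, 20]  # index, middle, ring, pinky fingertips

def fingers_up(landmarks):
    """Return four booleans: True where that finger is raised.

    A finger counts as 'up' when its tip sits above (smaller y than)
    the joint two landmarks below it -- the common MediaPipe heuristic.
    """
    states = []
    for tip in TIP_IDS:
        tip_y = landmarks[tip][1]
        joint_y = landmarks[tip - 2][1]
        states.append(tip_y < joint_y)
    return states

# Synthetic example: index finger raised, the rest curled.
demo = [(0, 0)] * 21
demo[8], demo[6] = (50, 10), (50, 40)    # index tip above its joint
demo[12], demo[10] = (60, 50), (60, 30)  # middle tip below its joint
demo[16], demo[14] = (70, 50), (70, 30)
demo[20], demo[18] = (80, 50), (80, 30)
print(fingers_up(demo))       # [True, False, False, False]
print(sum(fingers_up(demo)))  # 1 finger up
```

The thumb is usually handled separately (an x comparison rather than y), which is why the guide's scripts treat thumb ID 4 as a special case.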
    If you have any questions about this content or want to share a project you're working on, head over to our maker forum; we are full-time makers and here to help - coreelec.io/forum
    Core Electronics is located in the heart of Newcastle, Australia. We're powered by makers, for makers. Drop by if you are looking for:
    Raspberry Pi 4 Model B 4GB (Used Here): core-electronics.com.au/catal...
    Raspberry Pi High Quality Camera (Used Here): core-electronics.com.au/catal...
    Raspberry Pi 6mm Wide Angle Camera Lens: core-electronics.com.au/catal...
    Raspberry Pi Official Camera Module V2: core-electronics.com.au/catal...
    Raspberry Pi 4 Power Supply: core-electronics.com.au/catal...
    0:00 Intro
    0:13 Video Overview
    0:36 What You Need
    1:40 Download the Scripts
    2:03 Simple Hand Tracking Script
    2:25 First Pay Off
    2:40 Tracking More Hands
    3:18 X-Y Data of a Single Point on Hand
    3:48 Fingers Up or Down Script
    4:29 Second Pay Off
    5:16 Text to Speech Feature
    5:43 GlowBit Matrix GPIO Control Script
    6:10 Third Pay Off
    6:20 GlowBit Script Explanation
    8:53 Accessibility/Media Control Script
    9:15 Final Pay Off
    9:42 Macro and Script Explanation
    12:15 Outro
    The following trademarks are owned by Core Electronics Pty Ltd:
    "Core Electronics" and the Core Electronics logo
    "Makerverse" and the Makerverse logo
    "PiicoDev" and the PiicoDev logo
    "GlowBit" and the GlowBit logo

Comments • 129

  • @sumedh1586 • 2 years ago +4

    Finally, the tutorial I was most waiting for. Just love it

  • @TheLiquidMix • 2 years ago

    Awesome dude, I just got my Pi 4 from you guys today.

  • @asirisudarshana536 • 2 years ago +1

    Keep it up man, you are awesome and simple

  • @adamboden766 • 1 year ago +3

    This is absolutely bonkers!
    The project is incredible, and I'm blown away by the quality of the tutorial. Keep up the awesome work man!

    • @Core-Electronics • 1 year ago

      I think so too 🙂 and cheers mate, very kind words! I will keep my head down and knock out some more.

  • @jeffschroeder4805 • 2 years ago +10

    I am amazed at all the effort that you must have put into these projects. Thank you so much.

    • @Core-Electronics • 2 years ago +1

      My pleasure mate 🙂 hopefully you try some out for yourself!

  • @yingwaisia2707 • 1 year ago

    So nice. Thank you so much. I love it very much, great tutorial!!! Much appreciated!

  • @AnthonielPinnock-ci3rd • 4 months ago

    I’m creating an invention and I really need the hand gestures, so thanks 🙏 😊

  • @rho35100 • 2 years ago +1

    Faaaaantastic!!

  • @zgryx8428 • 2 years ago +1

    Thank you for the wonderful project and tutorial. I just want to ask how I can access those specific joints in the hand, so that I can compare against my custom-made hand gesture?

  • @Bianchi77 • 2 years ago +1

    Keep it up, nice video clip, thank you for sharing it :)

  • @adrienguidat6805 • 2 years ago +3

    Damn, feels just like Tony Stark, awesome thx!

  • @nerdy_dav • 2 years ago +2

    Very nice.
    You should give this a try with a Coral TPU?
    I'm pretty sure it works nicely with a Pi and OpenCV.

    • @Core-Electronics • 2 years ago

      Using a Coral TPU definitely provides a significant boost to performance for all computer vision tasks when utilising a Raspberry Pi. Definitely something worth doing 😊

  • @ajithsb1853 • 2 years ago +2

    👌👌excellent project, thanks for sharing

  • @adityajadhav4768 • 1 year ago

    Thanks for the video, but can you make a pre-installed OpenCV and TensorFlow .img or OS image for Raspberry Pi?

  • @LizzyTheLizard • 2 years ago +1

    this in VR would be epic

  • @chapincougars • 11 months ago

    Is this technology affected by a high volume of UV light (i.e. light generated when arc welding), or would the camera require a lens to reduce light? Also, could it track the light vs a reference point to determine how fast the light is moving linearly (2D and 3D)?

  • @ryandowney1391 • 1 year ago

    Would this model be plug and play with a Coral USB Accelerator, or are there other tasks when adding the Coral USB Accelerator?

  • @pileofstuff • 2 years ago +4

    That's clever.
    I didn't realize a Pi had enough grunt to do live image recognition like that.

    • @Core-Electronics • 2 years ago +1

      Yeah, it's pretty impressive. Obviously a much beefier computer will get faster frame rates and be able to track more objects simultaneously, but for getting started in computer vision systems it's a great starting point.

    • @stinger220 • 19 days ago

      @@Core-Electronics doesn't work.

  • @carlinelectronic7692 • 2 years ago +1

    Brilliant ❤️

  • @elvinmirzezade7997 • 7 months ago +2

    Is it possible to make this on a Raspberry Pi Zero 2 W?

  • @EronWahyu • 2 years ago +1

    Hi, thank you for the wonderful project. I just want to ask what method you used for this hand gesture recognition?

    • @Core-Electronics • 2 years ago

      Cheers mate, MediaPipe and OpenCV working together are the packages that make the backbone of this Machine Learned system.

  • @leonoliveira8652 • 8 months ago

    interesting, I wonder how much better this can get with dual cameras on the pi5 now

  • @alenninan5524 • 4 months ago

    Hi, can this method be used in a dark room where you will be on your couch?

  • @Redbeef • 2 years ago +1

    hi Tim, how can I add thumb tracking for the Are Fingers up or Down.py code? I've added the thumb ID (4) to the list, but not sure what I need to adjust afterwards. Can you please assist? Thanks in advance!

    • @Core-Electronics • 2 years ago +1

      Hey mate, you definitely can add Location tracking of certain joints to | Are Fingers Up or Down.py |. If you have downloaded the code from the article open up the | Simple-Hand-Tracker.py | script. In it you will see a section that has been commented out which is the code I used to find the location of the index finger in the video.
      Copy that section across to your desired script and replace Index Finger ID number 8 with the Thumb ID number 4. Hope that helps!
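The joint-lookup step in the reply above can be sketched as a small helper. It assumes the CVZone-style landmark list format, where each entry is [id, x, y] (index fingertip ID 8, thumb tip ID 4); the function and variable names here are illustrative, not the guide's exact code.

```python
# Hedged sketch: pull one joint's x-y out of a CVZone-style landmark
# list, where each entry is [id, x, y]. Swapping INDEX_TIP (8) for
# THUMB_TIP (4) is the change described above.

INDEX_TIP = 8
THUMB_TIP = 4

def joint_position(lm_list, joint_id):
    """Return (x, y) for the requested landmark id, or None if absent."""
    for lid, x, y in lm_list:
        if lid == joint_id:
            return (x, y)
    return None

# Tiny synthetic landmark list: thumb tip at (120, 80), index tip at (200, 60).
lm_list = [[4, 120, 80], [8, 200, 60]]
print(joint_position(lm_list, THUMB_TIP))  # (120, 80)
print(joint_position(lm_list, INDEX_TIP))  # (200, 60)
```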

  • @JC-le4dy • 1 year ago +1

    So I have a plan to create a software that records the motion of your hand and essentially moves a prosthetic arm the same way. Do you think that software would possibly be able to accomplish that goal? And, I'm just wondering if you think that this concept would be able to match a hand's movement well enough. It's fine if you don't know :), I'm just curious. It's for a school project far off in the future but I'm planning in advance because engineering is my passion so I want to succeed while having fun

    • @Core-Electronics • 1 year ago

      This setup is definitely pushing the Raspberry Pi very hard. However, if you customise the main scripts to save the Raspberry Pi from printing each frame to the screen and overlaying the wireframe on the hand, there will be enough computing power inside a Pi 4 to do your task without lag 😊.
      Mind you I even did that! Just remembered - Come check the Where to Now Section in the main guide and you can see me controlling a hand - core-electronics.com.au/guides/raspberry-pi/hand-identification-raspberry-pi/
      And a good mate Garry edited the provided code here to control a Lego hand he created - czcams.com/video/cadqkqh0zAY/video.html

    • @JC-le4dy • 1 year ago +1

      @@Core-Electronics Thank you very much! So essentially it could very well be possible, just would take a bit of tweaking. Also, I'll definitely check that out! Seems very neat. Thank you for your time in replying!

  • @absolutllost • 1 year ago

    If someone runs into the problem that some files in mediapipe cannot be found: try downgrading your mediapipe version. Uninstall the version you tried before, then install an older version as root (can be found on PyPI). That fixed the problem for me.

    • @rodrigoavilaengland • 8 months ago

      Hi! And what Python did you use? 3.9? Thonny now only supports 3.8 and up...

  • @Manaick007 • 1 year ago +1

    Heya! I've been trying to get this setup working with my Raspberry Pi 3B, running Buster (as mentioned in your article), along with an Arducam IMX519 (It's a 3rd party alternative to native Pi Cameras). The only issue is that my Arducam requires a Pi running Bullseye to work. Since it's been over a year since you put this video out, I was wondering if I could replicate your setup but switch buster for bullseye?

    • @Core-Electronics • 1 year ago +1

      Definitely would be sweet to update all the AI guides for the newer Bullseye. It has a very different software architecture to Buster, so I'm hesitant to take the plunge. I'm very sure you can make that camera work with Buster OS - here is a forum post explaining how - forum.arducam.com/t/16mp-autofocus-raspbian-buster-no-camera-available/2464/6

    • @Manaick007 • 1 year ago +1

      @@Core-Electronics appreciate the response. Will give this a spin 👍🏾

  • @reyroelortiz7448 • 1 year ago

    Hi bro can u help us with our project? I have a few questions

  • @RealRaven6229 • 1 year ago +1

    How do you get it so smooth? I'm doing my own project, and when I try to track a body, I'm getting maybe one frame every few seconds. I don't mind some frame rate drop but this is unusable. I increased swap size but this didn't seem to help.

    • @Core-Electronics • 1 year ago

      Definitely best to use a Raspberry Pi 4 Model B running absolutely nothing else except for the hand tracking system. You can lower the preview window size, which will increase your frame rate if you're using an earlier Raspberry Pi.

  • @mathkidofmemes6129 • 1 year ago +2

    Will a Raspberry Pi 3 work? It seems that the Raspberry Pi 4 is in very short stock, so I was wondering if a Raspberry Pi 3 has enough power. I won't be using the High Quality Camera but the V2 camera module. I also want to connect this to an Arduino Uno; will this be possible?

    • @Core-Electronics • 1 year ago

      I reckon hand and finger recognition may be asking too much of a Raspberry Pi 3, but I'd love to be proven wrong. And there are lots of ways to hook up a Raspberry Pi to send information/instructions to an Arduino, so that won't be an issue; they also both conveniently run at 5 volts.

  • @ranidusoysa8789 • 2 years ago +1

    Hello, does this work without connecting to a pc?
    Is it possible to remove the connection from the pc and hand detect it after coding?

    • @Core-Electronics • 2 years ago

      Hey mate, this system is completely running on the Raspberry Pi 4 Model B. You can definitely run this system Headless (without a Monitor/Display) and still control hardware through hand signals.

  • @RamesTheGeneric • 2 years ago +1

    Will this work on the 64bit versions of raspbian?

    • @Core-Electronics • 2 years ago

      No doubt the teams at Open-CV and Raspberry Pi are working furiously to reach full compatibility with the new OS versions. That hasn't happened yet, so until then I would recommend using the previous Raspberry Pi 'Buster' OS with this guide.

  • @aakashkoneru2001 • 1 year ago +1

    Hey, I have been trying to use hand gesture recognition as the base for my medical application, which is for patients to communicate with the nurses or doctors, and I am trying to use this setup, but I am stuck. Can you please help accordingly?
    I am using a Raspberry Pi 3 Model B.

    • @Core-Electronics • 1 year ago

      Love your project idea! And absolutely I can help. The best place to ask questions is at our forum here - forum.core-electronics.com.au/ - Pop through some images of the hardware and screen grabs of any software issues and we'll sort it !

  • @stevewang1061 • 1 month ago

    Hi, is there any updated info since Raspberry Pi has upgraded? Maybe an intro on the Raspberry Pi 5?

    • @Core-Electronics • 1 month ago

      This guide doesn't yet work with the new Bookworm OS, and unfortunately the Pi 5 right now only works on Bookworm OS. We have some updates for these vision videos in the works though!

  • @spaceminers • 1 year ago

    I have paralysis in my fingers, but I can move my wrist, elbows and shoulders. My fingers and thumbs just kind of flop around as they are always in a relaxed state. Can I use this technology to recognize custom arm gestures while sitting in a wheelchair in front of the camera? I want to be able to use both arms to navigate and control things in the Metaverse As well as mechanical devices. I would just be interested in rotational and positioning information from the wrist, elbow and shoulder joints to simulate a virtual joystick, for example

    • @Core-Electronics • 1 year ago

      These models are pre-baked to identify certain landmarks (finger poses) and return their status. It seems like you might be able to achieve the effect you're after by mapping them onto the gestures you require. It sounds like an awesome project; perhaps it's better to take it to the forum where we can share screenshots, code snippets and other helpful resources: forum.core-electronics.com.au/
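The "map landmark states onto the gestures you require" idea in the reply above can be sketched as a simple lookup table over per-finger states. The gesture names and state patterns below are invented for illustration, not part of the guide's scripts.

```python
# Hedged sketch: translate per-finger up/down states (as produced by
# a fingers-up detector) into user-defined commands via a lookup
# table. Gesture names and patterns are invented examples.

GESTURES = {
    (True, False, False, False): "cursor_up",    # index only
    (True, True, False, False):  "cursor_down",  # index + middle
    (True, True, True, True):    "select",       # open hand
    (False, False, False, False): "idle",        # fist / relaxed
}

def gesture_for(states):
    """Map a finger-state sequence to a command, defaulting to 'idle'."""
    return GESTURES.get(tuple(states), "idle")

print(gesture_for([True, False, False, False]))  # cursor_up
print(gesture_for([False, True, False, True]))   # idle (unmapped pattern)
```

Because the table is just data, remapping gestures for a user who cannot form particular hand shapes is a matter of editing the dictionary, not the recognition code.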

    • @spaceminers • 11 months ago

      @@Core-Electronics so you’re saying that it doesn’t really matter what position my hands are in? It will record that as a custom gesture? Because there is no way I could make a ✌🏼 or the 🖕 unfortunately LOL or even a 🤛

    • @stinger220 • 19 days ago

      Don't get your hopes up, this doesn't work and is more than painful to install.

  • @liampourliampouras3274 • 2 years ago +1

    Congratulations on your amazing work.
    I have followed the article's instructions, but unfortunately when I try to import mediapipe in the Python script an error appears:
    Traceback (most recent call last):
    File "<stdin>", line 1, in <module>
    File "/usr/local/lib/python3.7/dist-packages/mediapipe/__init__.py", line 16, in <module>
    from mediapipe.python import *
    File "/usr/local/lib/python3.7/dist-packages/mediapipe/python/__init__.py", line 17, in <module>
    from mediapipe.python._framework_bindings import resource_util
    ImportError: libImath-2_2.so.23: cannot open shared object file: No such file or directory
    Any suggestions??

    • @Core-Electronics • 1 year ago +1

      Heyya mate, thank you very kindly. My first two thoughts are whether you used the older 'Buster' Raspberry Pi OS, or whether you skipped one of the MediaPipe installation terminal commands. Type the following lines one by one into a new terminal window to try to get MediaPipe to work.
      sudo pip3 install mediapipe-rpi3
      sudo pip3 install mediapipe-rpi4
      sudo pip3 install gtts
      sudo apt install mpg321
      We have a heap of successful troubleshooting that you can find at the comment section of the full written up article - core-electronics.com.au/guides/hand-identification-raspberry-pi/

  • @rasmusbryld1832 • 1 year ago +1

    Have you guys had any problems installing mediapipe? I get an error when importing mediapipe in Python.

    • @Core-Electronics • 1 year ago

      Come check the comments at the bottom of the full write up. There's a solution for you down there in the forum section😊

  • @indianaiscience3670 • 2 years ago +1

    Sir how to take audio from raspberry pi 4... Please help me

    • @Core-Electronics • 2 years ago

      Come create a forum post if you need some expert help 😊 - forum.core-electronics.com.au/

  • @MattyEngland • 2 years ago +1

    Identity the joint in my hand?.... Lemon Kush

  • @hibach9140 • 2 years ago +1

    Can you give us the source code of a virtual mouse using the Raspberry Pi 4 and the Raspberry Pi Camera V2? Thank you.

    • @Core-Electronics • 2 years ago +1

      All the scripts used here can be downloaded from the bottom of the article page.
      I have yet to create a virtual mouse via computer vision using the Raspberry Pi (if I do I will tell you), but a video to check out in regards to building this kind of system from first principles is this - czcams.com/video/iBwMi9iDZmQ/video.html

    • @hibach9140 • 2 years ago +1

      @@Core-Electronics thank you ☺

  • @sightellaidglass4651 • 2 years ago +1

    Hi Tim, how can I fix this error?
    from mediapipe.python._framework_bindings import resource_util
    ModuleNotFoundError: No module named 'mediapipe.python._framework_bindings'

    • @Core-Electronics • 2 years ago

      Seems to me one of the hand tracking packages hasn't installed correctly. Are you using the older 'Buster' Raspberry Pi OS version? As doing so is important to do until all the packages are updated for the newer 'Bullseye' Raspberry Pi OS version.
      Come check the bottom of the full article, lots of troubleshooting to be found there. Also if you need more troubleshooting help pop a message over there as I can help you better over there 😊 we'll get your system working.

    • @sedatdoganay4938 • 2 years ago

      @@Core-Electronics MediaPipe does not work on Raspbian Bullseye; it works with Raspbian Buster.

  • @samuelmarshall100 • 2 years ago +1

    For some reason, when I run any of the scripts they don't seem to work properly; they just have a purple filter over the camera and nothing else. I also get this error code: [ WARN:0@4.204] global /home/pi/opencv/modules/videoio/src/cap_gstreamer.cpp (1405) open OpenCV | GStreamer warning: Cannot query video position: status=0, value=-1, duration=-1
    INFO: Created TensorFlow Lite XNNPACK delegate for CPU.

    • @Core-Electronics • 2 years ago +1

      Hey mate, just to start with the simple stuff first, double check you have the camera enabled in the Raspberry Pi Configuration and that the camera is connected correctly to the CSI port. If you keep running into issues, pop me a message in the comment section of the article, as I will be able to help you more easily there 🙂

    • @SwagatKumarIndia • 1 year ago

      Gstreamer warning will go away if you use line:
      cap = cv2.VideoCapture(0, cv2.CAP_V4L2)

  • @reyroelortiz7448 • 1 year ago +1

    Can we use raspberry 3?

    • @Core-Electronics • 1 year ago

      For this task I would recommend using a Raspberry Pi 3B+ or better. Otherwise the FPS is too slow for snappy responses.

  • @dilaraburan1776 • 2 years ago +1

    Hey mate, I ran the script yesterday and it worked well. Today nothing has changed; the window pops up but won't track my hand... it's also not showing any errors. I already connected the camera again, updated my system, and downloaded and opened the script again, but it won't work. Anybody have some ideas?

    • @Core-Electronics • 2 years ago

      Double check for me that you are running the older 'Buster' Raspberry Pi OS. There is also a great resource of successful troubleshooting on our Core Electronics Forum in relation to OpenCV/machine-learned systems. Come write a post there if you keep running into issues and we'll work it out there 😊

    • @SwagatKumarIndia • 1 year ago

      Same with me. The code runs without any error but it does not track my hand. Running on Buster, Python 3.7.3, Opencv 4.5.5.

  • @roracle • 8 months ago

    How hard would this be to implement into Gnome to control the desktop? I have lots of ideas, but unfortunately I'm a visionary, not a programmer.

    • @maanuu1687 • 4 months ago

      I'm doing it in Windows; the biggest problem is that I want the app to let people change gestures.

    • @roracle • 4 months ago

      @@maanuu1687 maybe have an options GUI, make it simple, add a number of different gestures, and maybe for each hand as well.
      If you could do this in Gnome, then you could get that "minority report" interface everyone's always referencing.

    • @maanuu1687 • 4 months ago +1

      @@roracle Yeah, that's what I'm doing: using Flet for the GUI, and 16 gestures (all except the mouse and sound gestures are interchangeable, using specific parameters such as hand orientation). In the future, maybe I'll add more. My main focus right now is Windows, but maybe when it's finished I can try it on GNOME. It should work on Linux because the os library is compatible, but I might have to make some changes. If you'd like, I can let you know when it's finished.

    • @roracle • 4 months ago

      @@maanuu1687 do you have a GitHub project page?

  • @visiontest-series-wisevide8257

    can we write this data to .bvh file and import it to Autodesk Maya?

    • @Core-Electronics • 1 year ago +1

      For sure, all the python scripts are open source and you really can do anything software wise with programming 🙂

    • @visiontest-series-wisevide8257 • 1 year ago +1

      @@Core-Electronics if possible, please make a video about it.

  • @w4tchtheDAWG • 2 years ago +1

    The download scripts are not available, can you send the link please?

    • @Core-Electronics • 2 years ago

      Code should be available at the bottom of the article or in the comment section. If you can't see it, pop me a reply and we'll figure out what's happening.

  • @samuelmarshall100 • 2 years ago +1

    How would I stop the camera from lagging?

    • @Core-Electronics • 2 years ago

      If you want an instant speed boost using this hardware consider checking out the Coral USB Accelerator.
      This video shows a nice comparison between using it and not for different machine-learned Computer Vision systems on a Raspberry Pi 4 Model B. At around 6.30 is where you’d want to check out. czcams.com/video/7gWCekMy1mw/video.html

    • @samuelmarshall100 • 2 years ago +1

      @@Core-Electronics Thanks, mate

    • @samuelmarshall100 • 2 years ago +1

      @@Core-Electronics It seems there's still a chip shortage; the Coral USB Accelerator is out of stock and some people are selling them at a high premium ☹️

    • @samuelmarshall100 • 2 years ago +1

      ​@@Core-Electronics I have looked at the Intel® Neural Compute Stick 2, would this be a good option?

    • @Core-Electronics • 2 years ago

      Hey mate, don't know how good the Raspberry Pi support is for this but it is definitely in the right ballpark.

  • @cx3268 • 2 years ago +1

    Hmm, maybe sign language to text or speech might be a good application?

  • @rajasekhar5982 • 1 year ago

    How can I learn Python like yours?

    • @maanuu1687 • 4 months ago

      W3Schools is a good beginner option

  • @user-sp4dx8lf9m • 2 months ago

    When I set up OpenCV, there is an error. Should I continue the process with the error?

    • @Core-Electronics • 2 months ago +1

      Hey, if you are still having issues we have a forum topic specifically on this video that might have some helpful information. If you are having a unique issue, feel free to post in there, we have lots of maker eyes over there that can help!
      forum.core-electronics.com.au/t/hand-recognition-and-finger-identification-with-raspberry-pi-and-opencv/12705/58

    • @user-sp4dx8lf9m • 2 months ago

      @@Core-Electronics Thank you so much! Your video is very helpful to Raspberry Pi users!

  • @spacetechnology9718 • 1 month ago

    I need help with the website instruction "If it fails at any point and you receive a message like | make: *** [Makefile:163: all] Error 2 | just re-type and enter the above line | make -j $(nproc) |". The same error comes for me; I repeatedly re-entered that command but get the same error again and again 😢
    Plz reply ASAP

    • @Core-Electronics • 1 month ago +1

      Sorry to hear you are having issues, we have a dedicated community forums post that might have some helpful information, if not feel free to chuck a post there with your setup and problem, we have lots of maker eyes over there that can help!
      forum.core-electronics.com.au/t/hand-recognition-and-finger-identification-with-raspberry-pi-and-opencv/12705

    • @spacetechnology9718 • 1 month ago

      @@Core-Electronics I have already posted there (58th post) as well.

  • @JenniferEliseAtchiso • 1 year ago

    A Bit of trivia for you from an American Sign Language Interpreter… The gesture You and many others use as ‘Rock & Roll’ Really means ‘Bullshit!’ In American Sign Language. I laugh every time I see it being used.

  • @jaysoni9568 • 5 months ago +3

    Is anyone able to implement it in 2024?

    • @Maisonier • 5 months ago +1

      +1 also I want to control with voice and an assistant like chatgpt

    • @maanuu1687 • 4 months ago +1

      I'm doing an app that will allow people to control things like sound, mouse, drag, stopping video, etc., with gestures that you can change to others that suit you better. It's my TFG (final degree project).

    • @ruban92 • 4 months ago

      @@maanuu1687 How can I contact you? I need your help for my college project.

    • @stinger220 • 19 days ago

      Didn't work even though I did everything correctly.

  • @rishabhkumar7405 • 3 months ago

    Is it possible to run this on a Raspberry Pi Zero 2 W?

  • @bridgetclinch3678 • 2 years ago +1

    Made the mistake of updating my WIP robot with Bullseye, so now I can move it around but not using its camera in python, doh

    • @Core-Electronics • 2 years ago

      I made a quick guide on how to deal with the new camera terminal commands for 'Bullseye' OS. It might come in handy for you - core-electronics.com.au/tutorials/raspberry-pi-bullseye-camera-commands.html

    • @bridgetclinch3678 • 2 years ago +1

      @@Core-Electronics Yeah, saw that, had a play, still waiting on Python libraries. Might format a new card and go back to Buster. Need to play with OpenCV too, as I have the Pi cam on the robot frame pointing forward and a USB cam on a servo pan-tilt for looking around. Still such a coding n00b.

    • @Core-Electronics • 2 years ago

      I love the sound of that project! Excited to see what you come up with 😊 come make a post on our forum to show it off/if you need any help forum.core-electronics.com.au/

  • @franciszable • 2 years ago +1

    How much RAM and CPU does the RPi use for this project? Does it overheat?

    • @Core-Electronics • 2 years ago +1

      Computer vision is always very intensive for the Raspberry Pi 4 Model B. Memory usage on my 4GB Pi sits at around ~400 MB and the CPU% at around 111% for the particular script as it runs. These are the values I get from | htop | while the code is running.
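The htop figures quoted above can also be cross-checked from inside a running script using only Python's standard library. A minimal sketch (note that ru_maxrss is reported in kilobytes on Linux but bytes on macOS, and the resource module is Unix-only):

```python
# Hedged sketch: report this process's peak resident memory from
# within a Python script, as a rough cross-check on htop's figures.
import resource

usage = resource.getrusage(resource.RUSAGE_SELF)
peak = usage.ru_maxrss  # kilobytes on Linux, bytes on macOS
print(f"peak resident set size: {peak}")
```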