You can use external GPUs on the Raspberry Pi 5
- added 31 May 2024
- GitHub issue: github.com/geerlingguy/raspbe...
Thanks to EVERYONE who's helped along this long journey (and it's not quite over yet!). Will AMD or Nvidia (or maybe even Intel!?) support their cards on the Pi 5 officially? Time will tell ;)
Other things mentioned in this video:
- Pineberry Pi NVMe HAT: pineberrypi.com/products/hat-...
- Pineberry Pi uPCity: (Unavailable... I'll add a link if they make it!)
- AMD RX 460 - more info: pipci.jeffgeerling.com/cards_...
Support me on Patreon: / geerlingguy
Sponsor me on GitHub: github.com/sponsors/geerlingguy
Merch: redshirtjeff.com
2nd Channel: / geerlingengineering
Contents:
00:00 - It works!
02:10 - Wayfire works too!
03:52 - glmark2 and stability
07:05 - Where next?

Category: Science & Technology
Born too early to explore Space, born too late to explore Earth, born just in time to run an egpu on a raspberry pi.
“It’s great to be me!”
Oh glory to us. 😂……..(;´༎ຶД༎ຶ`)
Born just in time to see AI go exponential as well.
Born just in time to watch someone on YouTube manage to run an eGPU on a Raspberry Pi.
There's still plenty of earth to explore, if you don't have Thalassophobia
I think you can fix the invisible cursor by adding WLR_NO_HARDWARE_CURSORS=1 to the environment variables
That sounds like a wayland problem I've encountered with nvidia cards. Why would this happen with radeon cards?
@@youtubelisk Buggy drivers on the Pi, possibly...
I wonder if it worked, seems like a driver on the graphics card
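For anyone who wants to try the cursor workaround suggested above: wlroots-based compositors (Wayfire included) check the `WLR_NO_HARDWARE_CURSORS` environment variable. A minimal sketch, assuming your session sources `~/.profile` at login (the right file can vary by distro and session manager):

```shell
# Force software cursor rendering in wlroots compositors such as Wayfire.
# Persist it for future logins (assumes ~/.profile is sourced by the session):
echo 'export WLR_NO_HARDWARE_CURSORS=1' >> ~/.profile

# Set it for the current shell as well:
export WLR_NO_HARDWARE_CURSORS=1
```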
I'm really hoping on a cm5 variant without the rp1 so we can have a lot of lanes!
If we take off the RP1 chip, we'd have 5 lanes of PCIe, is it possible to use all 5 for a single device? Or are PCIe devices only able to use lane counts that are a power of 2?
@@YonatanAvhar PCIe only works in powers of 2, so the max you could get would be 4 lanes on one device, which is still pretty significant.
@@YonatanAvhar Thinking about it, Jeff should get himself an RX 6500 XT, not the 6700 XT. It might solve a lot of problems. Yes, the RX 6500 XT has only 4GB of RAM, but it runs in PCIe x4 mode only, which would be perfect for a CM5 variant without the RP1. At least for testing it's great, since the driver doesn't need to force the card's PCIe bus width down. And thanks to that crippled bus, RX 6500 XTs go for only slightly more than an RX 580 on eBay, since they perform worse in older PCIe 3.0 systems.
You would need 16 CM5 boards connected together to get one x16 link for a GPU to have a full x16 connection, or just 8 for x8. But keep in mind it's only PCIe 2.0. I saw Jeff push it to Gen 3 once, but at Gen 2 speeds an x16 link is only 8 GB/s, and half that at x8.
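To put rough numbers on the lane math above: effective PCIe bandwidth per direction is transfer rate times encoding efficiency times lane count. A quick sketch (nominal per-lane figures; real-world throughput is lower):

```python
# Back-of-the-envelope PCIe bandwidth per direction.
# Gen 2: 5 GT/s per lane with 8b/10b encoding (80% efficient).
# Gen 3: 8 GT/s per lane with 128b/130b encoding (~98.5% efficient).

RATE_GTS = {2: 5.0, 3: 8.0}            # raw transfer rate per lane (GT/s)
ENCODING = {2: 8 / 10, 3: 128 / 130}   # usable fraction after line encoding

def pcie_bandwidth_gbytes(gen: int, lanes: int) -> float:
    """Effective one-direction bandwidth in GB/s (treating 1 GT as 1 Gbit)."""
    return RATE_GTS[gen] * ENCODING[gen] * lanes / 8  # bits -> bytes

print(pcie_bandwidth_gbytes(2, 1))    # Pi 5's single Gen 2 lane: 0.5 GB/s
print(pcie_bandwidth_gbytes(3, 1))    # forced to Gen 3: ~0.98 GB/s
print(pcie_bandwidth_gbytes(2, 16))   # hypothetical Gen 2 x16: 8.0 GB/s
```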
but can it run crysis?
PI GPUS ARE A THING NOW!!!! (also i could never afford one but this content is amazing). Good job to everyone involved!
They are? Maybe for Jeff, but who else uses them?
@@CZcamsGlobalAdminstrator People who want low-power video transcoding for a home multimedia server, people who want a low-power personal video recorder for their home security cameras, and people who are a bit more eccentric and want to do protein folding for science.
@@CZcamsGlobalAdminstrator These things start out this way, a bit janky, very weird and not really useful in a practical setup. Then ever so slowly they might get adapted and made more practical if there is a use for them, and there are possible valid use-cases for this. So yes, getting this close to running means they are very much a thing to watch out for.
@@CZcamsGlobalAdminstrator Exactly. This would be a very niche use case. If you need a GPU and a low-power CPU, just build a low-power x86 PC.
@@Cinkodacs I could honestly see this becoming really useful. It would take a few years, but I could see this blossoming into the future equivalent of sub-$300 entry-level bricks for gaming, or workstations for students. A few new standards would need to be made, and maybe some innovative cooling solutions would be necessary, but I think a NUC or thin-client equivalent with some actual muscle to flex could be really useful for a lot of people.
Ultra-budget gaming rigs kind of suffer from the cost of shipping, so lowering their footprint and volume could be a real advantage, not to mention having an entry-level standard for students learning 3D modeling and living in small dorms and apartments. Schools could gobble them up and make certain extracurricular courses actually sort of viable. I really hope this is pursued and AMD or Intel or whoever sees a potential industry to get behind.
Wow, after so long the GPU is finally working (partially) on the Pi! That means the value of those second-hand older GPUs will definitely increase in the future lol. I'm excited for the day the GPU is fully functional!
Box86+Box64+wine+steam now
Wouldn't buy anything older than 2018. The RX 6600 is already 200 dollars, which is really cheap; anything with a worse price/performance ratio is very possibly a waste of money, especially for a Pi, since you'll want efficiency for your projects.
@@he8535 Windows on arm for me!
@@Splarkszter The RX 460 draws half the power of a 6600 XT under load, and idle power consumption is around 10W. The 6600 has greater efficiency, but personally I'll just take the $30 card.
@@butre. Might pick one up now. They might go up in price if they become a favorite for pi 5.
The video we've been waiting for! Well done on getting this working amongst your move and the seemingly endless task list that comes with it.
Most modern GPUs have a "0dB mode": when the card is under a certain temperature, usually around 40-50°C, it turns the fans off to reduce noise at idle.
They do. However, if they don't spin at all (especially under a stress test), there might be a problem interfacing with the VBIOS and fan curve, or the fans might not be getting power for some reason.
I appreciate the cut to explain the mistake about the HDMI connector, rather than just an on screen text. Often those on screen text corrections are too fast to read, or appear under regular subtitles.
Definitely! I try to get to that when I can in editing. Sometimes it's a bit difficult to actually film something later, but in this case I still have it set up on my desk right now, heh.
This is amazing! I love watching your videos where you are so ahead of things like this
I subscribed years ago to see you get a Pi working with a GPU. It was great content along the way but so satisfying that now you finally did it!
I love your content Jeff! It's great to be taken on an exploration journey of embedded pi-devices. Keep it up!
I really appreciate your work on this Jeff, as I will be using it later on but do not have the time to experiment. Thank you!
Awesome looking forward to seeing this improve. My pi 5 just shipped and I'm looking forward to tinkering with it
This is a milestone, and your videos are amazing. This is really the cutting edge of SBC technology.
0:45 that tracked yellow arrow was a thing of beauty
How is that possible? Frame by frame?
jeff this is AWESOME to see mate, well done working with the community!!!!!!!!!!
Very cool, Jeff! I think this has some possibilities for local LLMs in robotics projects. I know there are other options like Jetsons etc. But there is something lovely about the CPU being a Pi. I wish I had the skill set to contribute. But for now, I'll be following. BTW, on your recommendation a week or two ago, I ordered a couple pci adapters from the guys in Poland. Thanks for the tip.
Your patience and perseverance are amazing. Congratulations on the business; hope you're doing lots of brainstorming on monetization. You have lots of possibilities in addition to making videos.
Thank you for your hard work on this awesome project Jeff! I am a computer engineering student and this is right up my alley!
Always nice to see these off-the-cuff videos from you. Good luck, and hope you have some good results with the RX 6000-series card. I'd love to see a used Dell RX 6300 plugged in! Only 2GB-3GB of RAM, but 2GHz clocks for under $100...
It's fascinating to discover the new possibilities that open up with devices like the Raspberry Pi. Thank you for sharing your enthusiasm!
Never give up and good things might happen. Happy to see there is progress 😊
When I was a masters student (Physics) I didn't have enough money to buy a laptop. I used an RPi3 and the cheapest display in the market for my academic stuff, like coding (Fortran, C/C++, python), graph plotting (gnuplot, python matplotlib) etc. When you said if RPi5 can be a potential replacement for traditional PCs, I remembered my old days.
After joining PhD, I started getting enough scholarship money to get myself a good quality laptop with RTX 3060. I use that now for ML purposes.
It's amazing, everything you do. Bringing graphics card compatibility to Arm is something substantial that makes the switch easier for consumers.
awesome job Jeff. Would love to see you do this with the Rock 5b. Keep it going!!!!
I am really, really, really hoping that the RPF make the CM5's factory carrier board in an ITX compliant form factor like the VIA Neo-ITX. The 5 should be significantly faster than my daily driver laptop already, add a GPU and... I'm good. As well, add a flex riser and I can use it in a 1U case with a tape drive, a SAS array, a SCSI scanner, an old audio interface - all powered over the network.
Most of that is doable on the CM4 sure, but the form factor is *just* off, and somehow no-one ever managed to just... reproduce the IO board in a "corrected" form factor, only wild, expensive cluster boards. So, I'm begging that Eben does it. :D
I also wonder how it could work with the x4 PCIe link that's connected to the Ethernet and USB on the Pi 5 board.
@@niyaziugur It's a good question, but I highly doubt we'll see a PCI-E only CM5. The point is to be a SoM. If a device user has to implement a southbridge on their platform then it's not a SoM, it's a Pentium III ARM edition.
And I think we can live without it. The number of use cases where PCIe x4 is more important than Ethernet, USB, video out, video in, and GPIO is going to be vanishingly small. Yes, I know there will be edge cases, but I wouldn't bet two cents that they're enough to justify a whole special version of the board.
And personally, I'd be much more eager to see an RP1 release that we could add to say - a Honeycomb LX2 for Pi support on other ARMv8 platforms.
:)
Achievement unlocked! Great job, Jeff!
Hi Jeff,
Love all your videos. Slightly off topic: I know PiKVM is a very popular topic right now. A similar topic you might be interested in is using the Pi as a remote terminal. My homelab currently runs on a Dell micro computer with the optional serial module. I can remote into the Pi using SSH and then serial into both my Dell (Proxmox) and/or my pfSense router using screen. Very easy and cheap to set up. Anyway, I think serial is underrepresented in this content space but is incredibly useful. Figured I would throw it on your idea pile for future homelab videos. Cheers!
I knew you would finally pull this off. Good job.
Super cool, definitely going to try it out! thanks for sharing!
Community driven 🎉 so happy to see progress & would love to help
I remember a couple of weeks ago someone trashing my comment about connecting a graphics card to a Pi 5.
Thanks for not only doing it but smashing it as well. 🙂
You look different Jeff, you look fantastic.
I love these videos, for some reason they remind me of the 80s when i was a kid and we had ZX Spectrums and the school had one BBC micro
Wow! It's finally here :D After all that hard work. Amazing!
This is very exciting! Super cool to see the Raspberry Pi with neat capabilities like these!
This guy's dedication, in order to get an external GPU working on a Raspberry Pi, is absolutely admirable ❤
Very cool stuff in the works!
Honestly I don't think the pi 5 would have had a PCIE interface if it wasn't for people like you stretching the boundaries of what could (or should) be done on a credit card sized computer. Stay curious :)
Playing games on the Raspberry Pi will finally be possible, I suppose, within a year from now (I hope). So basically you buy a Raspberry Pi 5, a PCIe adapter, an eGPU and all the stuff, and you're ready to go. I have to confess I would like to see you playing The Witcher (also from Poland) on a Raspberry Pi 5 one day. :) By the way, the world is small: I knew Iwanicki, who is a RED Engine developer, when he was a kid. Good luck with this Jeff. You are doing excellent work; keep going. I can't wait to see the studio fully working. Wish you all the best. This video is a good example that shows the power of the community. Excellent job folks!
Congrats to Jeff for pushing this and to all the others who helped you making this possible.
I know I am probably really boring if I say I'd love a combo like this for a casual desktop replacement but sure it's a possibility.
Ha, guess it's good I kept my RX580 all these years after all :)
Jeff, you really are one of the most wholesome dudes. Your enjoyment of the goofy tech is inspiring.
So I am so excited to see this progress!
I saw the title and was like:
THERE'S NO WAY!!!!
There is a mouse-type variable or whatnot that can fix the invisible-cursor issue. Basically, hardware cursor acceleration is likely not working on the GPU, so you may want to just render the cursor like on a framebuffer (and there is an option to force something along those lines, which should make your mouse visible).
Wow! You have finally done it! Congrats!
This is awesome and something i hope to see more of!!!
That's exciting. I have graphics cards I'd love to use to make arcade boxes with, but having things that are cost effective to pair them with is awesome, and this is a step in that direction
In radeontop you can press 'c' for colour!
Ooooh did not realize that!
Yay! I love the jankyness. :)
Awesome work. :)
Neat milestone. Thanks for showing it working
Impressive. I look forward to seeing shenanigans with a GPU on the Pi.
Is a reduced instruction set really better on the desktop if I can cool a 230W TDP on a dual-tower air cooler?
Finally casting some light to my long running project - RPi with GPU in PS2 chassis game console. Good work Jeff!
Dreamcast chassis or bust!
@@user-xu2pi6vx7o Too bulky, no challenge at all!
Darn, I've clearly not checked in recently enough. I didn't realise GPUs on the Pi were this close to working.
If you can get a current-gen AMD card working, I wonder if the Pi could ever feed it data fast enough to saturate it, even for 'pure' GPU tasks. Either way, great job to all involved. It really does make the Pi seem even more capable of replacing a non-gaming desktop PC reasonably well. While compiling and a few other tasks might still feel too slow even on a Pi 5, most of those aren't really 'desktop user' tasks; that sort of work gets pushed off to a workstation or server in many cases, and let's face it, most folks will never touch a compiler (or any of the other really heavy CPU-only tasks I can think of) at all.
Would be great, but isn't the newest Raspberry Pi only PCI-E 3.0 x1, or x4? That would be enough for basic things, but some applications like to use the GPU to compute things, and I don't know whether that would be a bottleneck.
Since when was a 2016 AMD Radeon RX 460 current? In case you somehow misplaced your calendar, I would like to be the first to welcome you to the year of our Lord 2024! If an 8-year-old card is in your opinion a 'current' gen GPU, then this actually explains a bit. Then again, Nvidia's insane pricing isn't helping. But a 4090 an RX 460 ain't.
@@Ichijoe2112 There was an 'if' in that statement you clearly missed. Any external GPU is impressive but ***IF!!! you can get the latest generation working could the pi even keep up?
Though you can argue the card in use is 'current' in the sense that it actually uses the amdgpu kernel driver and not the obsolete radeon one. It's not even from that awkward crossover era when AMD first decided to really do open-source driver support.
@@Ichijoe2112 You also need to take into account that even if it is capable of using AMD's RX 460 - it isn't capable of fully utilizing it - therefore trying with stronger card is kind of pointless - for now.
@@jannegrey593 Thanks for validating my point: this is a nearly pointless endeavor. Again, just because WE CAN doesn't mean we should.
You are amazing dude. Great work!
Nice work, Jeff! 😍
Wow, kudos! Immense accomplishment! Congratulations! There are not enough words!
Brilliant, absolute madman. Keep at it.
Great video. Thanks for the content!
That was fast. I didn't expect a GPU to be running on the Pi 5 this soon. Good work everyone!!
Would be cool if they worked with the compute blades as well, lets you make a GPU compute cluster
You were so preoccupied with whether or not you could, you didn't stop to think if you should.
Is it possible that the Pi might render in software mode, transfer the frames through RAM to the GPU, and then have the GPU loop them back out through its video output?
Good work. Congratulations. I still think an onboard GPU makes more sense. 🥶
You are a true legend Jeff!
Pre-ordered the Pi 5. It's my first one, so I'm not too knowledgeable about them, but I've been wanting to get a Pi for about a year now. Will I be able to add a touchscreen to it? Will there be one for the 5, or could I use the same ones as the 4? I'd appreciate some info on that. Thx.
I love the total impracticability of this :-)
4K thumbs up, well done.
Nice shirt! Im sending this video to the Rocky Mattermost too. 😄
Any chance we could use this setup for NVIDIA cards / cuda? would love to have my pi5 do faster ML and inference projects
Dang this is so cool! eGPUs are big brain time amazing work G
Huh. I'm genuinely surprised at how well this works! I'll be interested to see a CM5 (or heck, even a CM6) board that has a dedicated x4 or x8 slot.
You should use the 5V output from the PSU, or buy a 12V-to-5V 4-5A DC/DC converter, to power the Pi from the same supply that powers the graphics card.
What keyboard, mouse pad, and mouse are you using?
Curious: would it be possible to run it with something like an A310 from Intel? It has around the same performance as the RX 460 while drawing much less power, and Intel's drivers for Linux are on the same level as AMD's, not closed-source junk like Nvidia's.
So, when do you think it will be stable enough to support folding at home?
I have that same RX 460 card, and using it on Gentoo, you have to bake a lot of firmware into the kernel. It's not like a DKMS module or kernel patch the way Nvidia does it. You literally have to edit the kernel config to hook the firmware directly into the kernel, not as modules but baked right in. Whoever is working on this needs to bear in mind that AMD's GPU firmware has to be firmly hooked into the kernel for it to work correctly.
I have an RX550 (not much of a difference), but it just works. The newer drivers are something else. The opensource ones work really well. (Well, also I am no longer compiling my own kernels, no longer have the time for that and the difference is not much anyways).
I haven't had any firmware issues with amd cards so far, neither on my desktop, nor on the pi. You need to have the firmware, but that can be installed by installing the firmware-amd-graphics package (or downloading and copying the files separately if your distro doesn't offer it).
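For the Gentoo-style approach described above, the relevant knob is `CONFIG_EXTRA_FIRMWARE`. A sketch of the kernel config fragment for a Polaris 11 card like the RX 460 (the exact `.bin` file list is an assumption that depends on your linux-firmware version; check `dmesg` for which blobs amdgpu actually requests):

```
# Kernel .config fragment: bake amdgpu firmware into the kernel image.
# File names are illustrative; match them to your linux-firmware tree.
CONFIG_FW_LOADER=y
CONFIG_EXTRA_FIRMWARE_DIR="/lib/firmware"
CONFIG_EXTRA_FIRMWARE="amdgpu/polaris11_mc.bin amdgpu/polaris11_pfp.bin amdgpu/polaris11_me.bin amdgpu/polaris11_ce.bin amdgpu/polaris11_rlc.bin amdgpu/polaris11_sdma.bin amdgpu/polaris11_smc.bin"
```

On module-based distros (Raspberry Pi OS, Debian), installing the firmware package mentioned above and loading amdgpu as a module is usually sufficient instead.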
I tried a Pi4 as a second desktop at work, and it functioned but even 8GB was just too limiting. The only option was to not run a browser, just use the Pi as an xterm-tiling display and a head-end for Synergy.
Why can't the Pi family address more memory? Can this be design-improved in the future ?
Graphics-only?
Or can it be used to crunch data, ie AI / ML or like the old SETI app? 🤔
Can I ask if you can build a PiKVM with a Pi Pico? Is it possible?
Did you fix the non-working fans on the graphics card?
Is it possible that those errors are memory corruption caused by PSU ground separation?
I had issues some years back with ground plane noise with a cheap external PCI case.
They're extremely consistent (in terms of when they happen), so I can't rule that out, but Coreforge also gets the same errors at the same times, and his power setup is different... but again, can't rule out anything at this stage :)
My goal is to get everything powered off one PSU at some point, but I haven't had time to work that out for the Pi yet.
This is a game changer for gpu intensive low cpu usage programs
I enjoy your passion. Don't fly too close to the Sun.
Interesting. What if the Pi 5's USB 3.0 were used for an SSD RAID, the USB 2.0 for 2x2 MIMO + 2x2 MIMO Wi-Fi adapters, and the PCIe lane divided to connect more GPUs? Software: Proxmox VE, running AI/LLM Python/CUDA learning models, probably with k3s.
Did you also get OpenCL to work on the Radeon?
Major milestone! Pretty soon we can do AI predictions with a RPI connected to an eGPU...
Can you run an M.2 PCIe SSD and a GPU at the same time?
Do you think a 1080 TI would work? Thanks!
When can we expect most GPUs to be supported by the RPi series?
Keep at it, I wait for the day when we can actually play stuff like Quake RTX on the Pi through an external graphics card.
This seems very promising. From what I can tell, there are many tasks limited by the poor internal graphics, so even a weak external GPU might be able to add a lot.
This is big! Super rad!
I wonder if this will become interesting for AI inference in mobile systems or just at home. These models often don't need a good CPU, just a good GPU.
Wow! An RX-460!!! I still use one in my Ubuntu machine, and it never messes up!
A huge step for Raspberry Pi!💪💪💪
Wow! How does the Noctua fan get power, and how is it regulated?
It's a 5v fan, so it has a USB adapter; and I have Noctua's fan speed controller so I can turn down the fan speed a bit!
So when is that uPCity getting released? I bought a HatDrive! Bottom but I'm not having any luck getting even the few PCIe devices that are detected through the m.2 to PCIe adapter to operate at all, let alone stably.
Likely power issues-the uPCity would be helpful for that! I asked but right now it sounds like they're trying to ship as much as they can before the new year. Probably in January, that's my guess.
This is great! I got lots of GPUs from the crypto good times... and Pi 5s in the mail! Should arrive just in time for Xmas break, woooo!
A really cool project. I don't know that I would want to run that external GPU without its own internal fans working correctly.
Are the fans not running on the video card or just a camera sync illusion? With no fans, that card won't last long.