A Portable GPU That Fits In The Palm Of Your Hand! Pocket AI RTX A500
- Added May 17, 2024
- The ADLINK Pocket AI is a portable GPU built around an NVIDIA RTX A500. It's about the size of a deck of playing cards, connects over Thunderbolt 3, and has 4GB of GDDR6 RAM. The ADLINK Pocket AI A500 is here to transform your AI workflow and unleash the full potential of your creative ideas. Whether you're a data scientist, AI researcher, or content creator, this compact powerhouse will revolutionize the way you work with AI.
Learn More About The ADLink Pocket AI A500 Portable GPU: www.adlinktech.com/en/pocket-...
Buy It Here: www.wdlsystems.com/adlink-egx...
Or Here: www.mouser.com/ProductDetail/...
Follow Me On Twitter: / theetaprime
Follow Me On Instagram: / etaprime
30% Off Code for software: ETA
Windows 10 Pro OEM Key($15): biitt.ly/KpEmf
Windows 11 Pro Key($21): biitt.ly/RUZiX
Windows 10 Home Key($14): biitt.ly/2tPi1
Office 2019 Pro Key($46): biitt.ly/o0OQT
Office 2021 Pro Key($59): biitt.ly/iDMHc
Equipment I Use:
Monitor: Pixio 277 Pro On Amazon: amzn.to/3PGUBwe
Elgato HD60 X Screen Capture Device: amzn.to/3GkP2AL
Tool Kit: amzn.to/3Wo8bpX
Camera: amzn.to/3XJfFoI
DISCLAIMER: This video and description contain affiliate links, which means that if you click on one of the product links, I’ll receive a small commission at no extra cost to you!
Under section 107 of the Copyright Act 1976, allowance is made for “fair use” for purposes such as criticism, comment, news reporting, teaching, scholarship, education, and research.
No Games Are Included Or Added
This video and channel are for viewers 14 years and up. This video is not made for viewers under the age of 14.
Want to send me something?
ETAPRIME
12400 Wake Union Church Rd PMB 239
Wake Forest, NC 27587 US
THIS VIDEO IS FOR EDUCATIONAL PURPOSES ONLY!
00:00 Introduction
00:20 Unboxing The ADLINK Pocket AI GPU
01:13 ADLINK Pocket AI Specs
02:19 Setting It Up
03:23 Testing The Pocket AI Portable Graphics Card
04:18 Blender Test
06:03 Benchmarks
06:37 Gaming On The Pocket AI Portable GPU
08:42 First Impressions
#nvidia #etaprime - Games
If more companies come out with stuff like this for a more affordable price it would be a game changer
I mean.... if they keep making them smaller wouldn't they just put them in the devices and skip having to carry a separate gpu?
@andrewhill4751 they aren't gonna get smaller than this. They need too much cooling.
Well, you can buy a laptop with a good CPU and an average integrated GPU when you can't afford one with a good GPU, and later go for this. There are many scenarios like this and you didn't even bother to think of one before writing your comment. @@andrewhill4751
@@TheTastefulThickness Oh they will get smaller especially when they use a cooling system like airjet pro
This has not happened, because designing a PCB that can handle 40Gbit/s performance is hard and needs extremely expensive test equipment and expensive PCBs to pull off. Completely different from a USB 3.0 gadget that you can design in half an hour with free software and get manufactured for a few bucks.
And for the MSI/Asus/Asrocks of this world it's a market that's just too niche; they can make way more money on more proven projects.
The portability is the most appealing aspect of this. It's expensive for the raw performance, but the fact you can fit it in your pocket is what makes it so impressive.
meh, gaming performance wise; you are better off with RDNA (3) alone. It's already come in many form factors.
The similar GPD G1, or the questionably portable but stronger Asus XG Mobile for Asus ROG Flow laptops, are IMHO much better choices. There is quite a large performance penalty for using the laptop's own screen, so this device will be really bad for FPS/price, while the others have stronger GPUs and the option to connect an external display directly to the eGPU for best performance if desired.
At that point there's really no reason not to just get a gaming laptop, which is way more portable and convenient than whatever this thing is.
there was nothing impressive. I don't have so many pockets for a "pocket PC", "pocket keyboard", "pocket mouse", "pocket power brick". And on top of that a "pocket GPU"? there are such things as laptops on this planet.
Still, a laptop with a built-in 4080 is going to annihilate it
$500 bucks? No thanks
Do you have another GPU in mind that fits anywhere and consumes 30 watts?
@@gormauslander no. I also won't spend 500 on a gpu with 4gb ram. The performance just isn't there for the price.
It's a cool concept, and with at least 8gb ram, might be worth it due to its size and power draw, but def not as is.
If you want to buy something like this at that price, go for it. Like I said originally, $500 bucks? No thanks
If it came with at least 8GB of VRAM I would consider this. 4GB is too small IMO. It is an excellent concept that has potential.
gpd g1 is 8gb
wut are u talking abt?
@@pinkipromise they're talking about the GPU's VRAM; high VRAM is very important for training language models and the like, and 4GB isn't much.
@@cyberplonk about how 4GB VRAM is not really enough for AI stuff.
@@jamesbrendan5170 ah ok
Good start with this type of tech. Excited to see the next few years with more efficient chips with more VRAM etc. This might not be that great especially for the price, but the idea is solid. We just have to wait for advancements. This will be huge.
Not only does it have just 4GB of memory, it also thermal throttles a lot due to being cooled passively. It's no surprise that it doesn't perform as well as an A500 in a laptop, which would be equivalent to a GTX 1660 or an RX 590
Tbh for the price I would buy a ThinkPad P50 and it would have 4GB of VRAM. Although scalpers keep selling them for too much smh. Especially the 4GB M1000M and M2000M ones.
Dang it's faster than my current pc 💀
Not to defend the thermal throttling issue but it does have a fan on the backside per the pictures they have on the website of the product itself.
It's literally the same chip as a 3050 just sold as a "Workstation" card. But agreed, this is a joke of a product.
@@nikkelitous It's cut down, though. Not quite the same.
4GB of VRAM for AI learning or generation? what a joke lol. VRAM is just the most important thing, rather than pure performance, in AI right now.
True. And then it costs 450 on top.😅 Naa hard pass
The A500 is a cut-down mobile 3050... I guess it's for tablets or ultralights. Still, I can't fathom why anyone would look at a 3050 and say "too much".
hard pass on that
@@BillLambert I expect stuff like this from Jensen, he's an expert at producing bottom barrel hardware and marking them up.
I was unaware of that RTX A500. After your previous UM790 XTX video I decided to get rid of my desktop, which had last-gen DDR4 memory/motherboard/CPU, got a UM790 Pro, and made a UM790 XT (7900XT). OCuLink really works well. Good stuff, keep up the good work, love the videos.
The 4GB of VRAM on this thing is going to be very limiting in terms of the kinds of things you can do with it. For instance, most image generation requires at least 6GB for any reasonable resolution.
I think it's geared towards edge computing (eg. LattePanda) for rapid prototyping, expand existing embedded systems in the field and sell it as a learning tool to schools.
@@Rotwold I can agree with that, but the problem is that this thing costs $500-$520. I'm sure there are narrow use-cases where that still might make sense -- particularly if the absolute smallest form-factor is a must -- but the mass-market/tinkerer appeal just isn't there. In 99% of cases, there are better options.
While I don't think running a language model or image generation model would be feasible on this device, I think some others application (for example image recognition, audio recognition, etc) could benefit a lot from this accelerator.
@@skyguardian18 Google Coral TPU does the same thing, sized like a Wireless E-key module, cost $30, and sucks 2W power at most.
You got the point... even small text-generative AI models require at least 5GB~7GB of VRAM...
Things like this would be great for Single Board PCs or anything with an ARM chip. Excited to see what comes in the future.
Excited to see what red shirt jeff will do to this nugget.
Wow, this is stunning. I want a future version of this thing plus the Legion Go. Thanks man, I've spent a lot of time watching your channel for eGPUs and video games, great channel
Next-gen handhelds with Thunderbolt would be amazing with one of these.
It's a cool little device, but for $450 you can do quite a bit better adding an external GPU to a laptop/mini PC with DIY components.
The Pocket AI is currently priced at USD $429
@@mrhassell still too expensive
It would be very hard to make anything as portable, especially with its very low power consumption. It would not be a fair comparison.
How much do you think eGPU enclosures are? Often they are in the $300 USD range. And then for $150 probably getting a video card that has similar output to this. But obviously there is a huge difference in portability.
eGPU enclosures cost you an arm and a leg and aren't as portable
It's honestly a cool idea, it just needs more juice. 8GB minimum, preferably with 12-16GB options. An output port would also be ideal.
This is totally awesome for local small LLM. I NEED THIS!!!!
This is intriguing. If units like this catch on, it could be the start of something very big.
I think if this product cost half as much it would be worth it for the current specs; however, I would still prefer another eGPU option
For AI applications 4GB is a bit low. It has one major strength though, and that is its size. There is a niche market for mini PCs where size and power restrictions are at a premium. For example I could see this as part of a truck driver's setup or in a tiny home.
Literally the only version of one of these external GPU I've seen that's worth using
as usual, great video ETA, I would love to see this on an SBC
Looks great! Can't wait to plug this on my (future) rpi 5!
...oh
Y'all need to stop saying that anything under 60 is not playable. Over 40fps is playable, over 30 can be a bit laggy depending on the game but is still playable.
I dunno bru, I just rebought a 360 for the fun of it, at 1080. 30 is painful; even if it's pumping a full 60fps it's horrid 😅 still the best system ever, but it's so hard after so many years at 144 then 165 😅 let's face it, this just didn't hold true
Between 25 and 30 I would still call playable. Hell, lots of console games capped at 30 because they couldn't hit 60.
Above 60 is nice but most people don't have a high refresh rate monitor and thus won't ever experience the benefits.
I can play 30fps on a switch all day and not notice, on the Steam deck a smooth 40fps doesn't feel any different to me than 60, but on desktop it feels like I always need a minimum of 60.
I've had a 144hz monitor before and didn't care about it, I still think 1080p 60fps is high tech. But I grew up gaming on a 2009 iMac with Windows XP installed and I use Chinese mini PC's to this day, I just don't care about the kind of games that need extremely powerful hardware.
Some people get motion sickness with the wrong amount of Hz, some people swear 60fps is garbage and 120fps should be minimum, it's really whatever works for you personally but it's not the righteous battle that people make it out to be.
Internet folks are a special breed...
This is awesome. A couple generations more and we can pretty much replace internal gpus imo, or upgrade our existing laptop with a better gpu. Super cool
It's crazy that I haven't heard about this. Thank you for bringing it to my attention. I've been watching your videos forever but just realized I wasn't subscribed. I fixed that
Unfortunately it is too small for AI use; I would consider 12GB the minimum to be really useful. Shame on AMD and Intel for lagging behind and allowing Nvidia to monopolize the AI market.
Dam
@@chrisdejonge611 it's marketed for researchers and data scientists, who are probably not training stable diffusion
AI video enhancement and some other workflows fit well within 4GB, but I agree 8-12+ GB would be better.
Pretty much this, exactly. You can get by with 8-10gb but more than 10gb these days is much easier to work with without issues.
Not everything AI is targeted toward stable diffusion goofy 💀
Maybe a video where you disassemble it? Would be nice to see the internals. Curious if it's an MXM module mounted on a thunderbolt adapter.
All things considered, it'd be cheaper to just produce a singular board that has the processor directly wired to the USB4 IC. There's no reason to add modularity to this type of product when there's really no parts on the market other than what they're already producing themselves as an AIB partner. A singular board would also cut down on manufacturing, points of failure, and QC testing, especially when this kind of product is at a much smaller business scale than you've probably considered. Also, MXM has been long dead, the modern alternative Dell introduced is a proprietary pinout, and Framework's re-pinning of these connectors probably won't see widespread adoption considering the history of modular mobile-spec components. It just doesn't make sense for this product to be more than just a singular board.
nice to see external gpu getting more portable
thanks for the Blender Render!👍
I wonder if at some point they will put one of these into a handheld device. Since it's so tiny and low power, maybe they can make it fit into something like the Steam Deck; that would be really nice. I hope we are getting closer to dedicated graphics on a handheld.
Yeah . . . You want it integrated with the CPU. Which is why the Switch has an ARM. Nobody is licensing Nvidia to make x86 cores.
I'd love to see maybe a software translation layer for x86 code in Proton for example, and maybe some kind of legal hardware acceleration for it in RISC or ARM, but who knows 🤷♂️
As far as I know, that's kinda the entire point of an APU. You're putting the CPU and GPU on the same wafer. The only limitation to how powerful the CPU and GPU combo can be is thermal concerns and production yields and it's typically more space and power efficient than a dedicated GPU.
Well, if there is ever a pocket RTX 4080 or better with at least 16GB of GDDRx RAM, I might consider it.
The best you could see with this concept would be 100W limited GPU for the PD charging. So the 4080 doesn’t make much sense. I’d like to see a desktop 4060 in a 100x100x50mm form factor with 100W PD power in, oculink AND TB4 input, and a miniDP out. Oculink is good for 60Gbps and has a pretty big performance delta over thunderbolt.
This is actually really cool!! Thanks for covering this 👍
Wow you actually got it, nice!
Very impressive size, tragic about the cost and the significantly lower performance than the 780M. If this was $150 USD it would be much more appealing than $450.
I would really really like to see USB C eGPU adapters reduced to this size
I'd love to see you test the pocket AI on an actual AI project, not just games. As a dev I would find that more appealing to see what it could do.
I think it is outside of his scope.
@@darkcoeficient Yeah, this is a gaming channel.
As cool as that would be me too he just does gaming
as others said, falls out of scope. But the thing is you don't need those tests since AI/ML and many other types of compute are not limited AT ALL by the 18-22Gbps of Thunderbolt. All you need to check the performance in AI use cases for this is to check any other A500 benchmark.
I mean it's an AI product, not a gaming product, so it's kinda pointless to demo as it wasn't all that impressive for gaming. @@RG6Snipers
love all the vids bro
@4:17 thanx for the Blender benchmark review! Please include it along with gaming reviews for others too!
Honestly the best use case for this might be something like someone who does a lot of video editing on location. They can significantly speed up rendering and AI enhancement without needing a 15+ inch laptop. Back in August I was in Japan and needed to upscale some video I had shot, and it took practically forever in Topaz Video AI on the iGPU (I have an eGPU at home, but it’s impractical to take overseas). There’s also music visualization software that supports GPU acceleration, so between that and Ultimate Vocal Remover, DJs might get some use out of this.
Interesting video - curious how you 'switch' between the Iris graphics and the A500 accelerator?
Windows 10 or 11 now has a way to do this integrated into the OS. Before, you could choose it depending on Nvidia/AMD driver support using their buggy user interfaces. Windows seems to do a better, more consistent job. I think even Linux handles this fine these days
Probably by unplugging it
This is really for AI on the edge. I would love this in a dev environment for IOT.
I've been interested in seeing this covered for a while!
All other coverage has been out of Asia and hard to get a read on
I think you need to add the MSRP to all your specs in the beginning of videos.
If more companies came out with stuff like this for free it would be a game changer.
It's a special hardware for ai training not for gaming pal
Why would it be free?
I love modules/ plugins.
I think the major asset to this is its portability in an open-plan office with hot desking and a clear desk policy. I want to be as light and compact as possible, and I'm happy to pay a modest premium for that convenience.
I've been looking for something like this to enable me to benefit from the larger design monitors available to me in my current workspace. If it had a two-monitor output it would be perfect.
This is an excellent idea and would be a game changer... With like 12-24GB of VRAM.
It also needs better cooling as it is constantly overheating...
Yeah and it would be twice the game changer with 48GB of VRAM
Amazing!! Could you please compare it with the performance given by the Asus Rog Ally? Is it possible to connect it to the Ally?
no, the ROG Ally doesn't have USB4; it has terrible eGPU support
only their proprietary egpu will work
or m.2 egpu
i love where this is going
Wow this is amazing, more companies should make this, it's a very useful gadget
It would be awesome if you could drop this in as an accelerator to enable RT/DLSS with any GPU and have the GPU dedicated to rasterization operations while the pocket AI does all of the RT/DLSS calculations.
that's not how the rendering pipeline works, mate.
@pretentious_a_ness which is why I said it WOULD be awesome IF you could... but also, just because something is the status quo doesn't mean it can't change. People made using a drop-in Nvidia GPU as a PhysX accelerator a real thing years ago... who's to say someone can't make something that injects into the pipeline, identifies the instructions requiring ray-trace math, and "hides" it from your GPU and redirects it to the Pocket AI? Obviously that's still an oversimplification for a very complex (and maybe impossible) task, but everything starts with an idea and a demand.
They should make these things for older CPUs or low-end CPUs that don't have an iGPU.
not possible; there's no interface other than Thunderbolt, which is rare in older machines. And even on the rare ones which do have Thunderbolt, it will be severely limited by an x2 PCIe connection. If you want, you can make your own eGPU over M.2 for computers where you can forfeit that, or even go mini-PCIe, but those are extra-exotic, inconvenient setups
Use "Integer Scaler" and be happy 😅 Low-res gaming on an HD display. 720p is the minimum to game on a 4K screen; 360p also works, but 360p is recommended for 2D games only. Scale 360p to 4K.
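The integer-scaling suggestion works because 720p and 360p divide evenly into a 2160p panel. A quick sketch (plain Python; the helper function is hypothetical, just illustrating the arithmetic) showing which render resolutions map to 4K by a whole factor:

```python
# Integer scaling only stays sharp when the target resolution is an
# exact whole-number multiple of the render resolution on both axes.
TARGET = (3840, 2160)  # 4K UHD panel

def integer_scale(src_w, src_h, dst=TARGET):
    """Return the whole-number scale factor, or None if scaling isn't integral."""
    if dst[0] % src_w == 0 and dst[1] % src_h == 0 and dst[0] // src_w == dst[1] // src_h:
        return dst[0] // src_w
    return None

for w, h in [(1280, 720), (640, 360), (1920, 1080), (1600, 900)]:
    print(f"{w}x{h} -> {integer_scale(w, h)}")
```

720p scales to 4K by exactly 3x and 360p by 6x, which is why the comment singles those out; something like 1600x900 would need a non-integer 2.4x and so gets blurry.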
I would love to see this used in a virtual machine. Since it's a professional/workstation series GPU it should support it.
Why? Is this a virtual machine/docker channel?
@@dj4monie because you can play video games in a VM if you have a native platform that doesn't support it? And since it is a workstation card, even shit nvidia will allow it to work with passthrough?
I just got a UM773 and plan to get an eGPU. I hate that the options are so bulky; it would be great if they came out with one for gaming with video out like you mentioned. Thanks for covering this!
seems like buying a Jetson Nano would be cheaper with similar specs . . . what do you think?
The sad thing is a lot of older PCs/laptops don't have USB4 or Thunderbolt
Up
even new ones
Yah think if I try it with Lenovo Yoga 900 via USB 3.1 Type C it would work 😅?
Like they say, nothing tried nothing done.😂
this looks really cool, imagine you can strap a cube to the back of your steam deck and make it run 25% faster
this is nice, a low power efficient option. i like hardware not being pushed to the limit all the time anymore
Would this be useful for the SteamDeck to give a boost when docked and playing on TV?
The fact that the pocket AI has no video out would make it useless for a scenario like that. If you want to dock the SteamDeck to a powerful GPU and then stream that output to a TV, consider a full fledged GPU with an EGPU enclosure.
What are they going to accelerate with only 4GB of VRAM? SD considers 8GB low and LlamaGPT requires 6GB - 42GB of VRAM depending on the model you are using.
yeah this is useless, and so expensive, like $429, while for that price you can get at least a 3060 12GB
Finally somebody is talking about this thing
What a Time to be Alive😮💻
They need to make something like this for gaming.
GPD G1, Asus XG Mobile, etc
can you give more examples that aren't $600+? And ASUS's solution is proprietary @@MrHamncheez
One alternative is to use a Thunderbolt external enclosure, that is what I did with my 2015 MacBook pro running a GTX1660ti for gaming in Windows.
Not great, but it would be a definite upgrade to onboard graphics chips like 90% of the Intel HD chips that steal your system memory to run. 👍
That used to be the case, but the recent low power APUs from AMD are starting to outpace something like this A500.
@@NutellaCrepe that's why I didn't mention AMD. 👍
on that size it's actually great. The closest eGPU to this in size was a much larger Lenovo Dock from like 5y ago, and that sported a 1050 Ti MAX-Q, while the smallest thunderbolt eGPU enclosures you can still get today are as large as a mini ITX case. Yeah, you can do your own m.2-based eGPU, or you can grab an outdated Gigabyte mini box but you will still be severely outdated and increasing the size by about 300%.
I am of course excluding some stuff like Dell's or Asus's proprietary GPU docks, because they aren't really Thunderbolt, and they only work on an expensive or outdated subset of machines.
Assuming you can use it with those Intel CPUs.
I can't recall any very low-end Intel CPU laptop which could benefit from this with USB4.
This is so awesome! It reminds me of the Intel/Movidius Neural Compute Stick (NCS2). But that is a low-powered USB stick that is meant to connect to a Raspberry Pi, which can do object recognition at 5fps-30fps.
Kinda same thing, quite powerful for that, I think that one can do instance segmentation in real time.
One advantage of the Movidius chip is the price point and power consumption
This could be really cool paired up with the MinisForum EM680, making the entire thing USB-C powered
I wish they'd make it as a pen-drive dongle, with an HDMI output. Maybe in a few more years
How does this compare to the GPD G1 using the AMD RX 7600XT?
GPD is way stronger 😅
Pocket AI:
$429 MSRP
4GB VRAM
25W TGP
6.54 TFLOPS
Thunderbolt 3.0 Connectivity
No Expansion Ports
Can be Powered by PD3.0+ (40W and up) USB Battery Banks
GPD G1:
$714 MSRP
8GB VRAM
75-120W TGP
21.54 TFLOPS
Three - USB 3.0A, One - SD Card Slot, Two - Displayport, One - HDMI Out, USB4 + Oculink Connectivity, Can Charge Devices with internal power supply.
Oculink Support (Faster than Thunderbolt 3 in Gaming performance 15-30% real world)
Needs AC Adapter plugged into wall
In completely different categories really.
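For what it's worth, the figures quoted in the spec rundown above translate into rough value metrics. A quick Python sketch (the MSRP, TFLOPS, and TGP numbers are taken as quoted in the comment, not independently verified):

```python
# Rough value comparison from the quoted spec figures.
cards = {
    "Pocket AI": {"msrp": 429, "tflops": 6.54, "tgp_w": 25},
    "GPD G1":    {"msrp": 714, "tflops": 21.54, "tgp_w": 120},
}

for name, c in cards.items():
    per_dollar = c["tflops"] / c["msrp"]   # compute per dollar
    per_watt = c["tflops"] / c["tgp_w"]    # compute per watt
    print(f"{name}: {per_dollar:.4f} TFLOPS/$, {per_watt:.3f} TFLOPS/W")
```

By these quoted numbers the G1 delivers roughly twice the compute per dollar, while the Pocket AI leads on compute per watt, which matches the "completely different categories" verdict.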
Great product...
A PC with a Thunderbolt card and that GPU would be very versatile and powerful...thanks.
Very cool idea, and priced relatively reasonably for the size and convenience, easy to chuck in a bag. As a gamer, is it overclockable? Or does it allow undervolting so that it could achieve higher clocks?
4GB is not enough for Stable Diffusion, and not much use for transformer models
The jump in performance also comes from the fact that the GPU and CPU are sharing the same TDP. But if you use that external GPU, the whole laptop TDP goes to the CPU and the GPU uses its own TDP.
This is really cool. With another USB-C port for video out and a more powerful gaming GPU, something like this would be epic!
We need this to be compatible with all of our gaming consoles and laptops.
Have to admit I'd love to see what would happen if you hooked this thing up to a Windows-installed Deck.
Can't believe he didn't do this..
@@KingmanKingman-nb4tn my guy, as he said, he had several unexpected results throughout the thing. it’s worth it just to see anyhow. he has hooked up eGPUs on less.
Nothing would happen, because this uses thunderbolt.
I hope handhelds would come up with this gpu. Having an nvidia GPU means you have dlss which is a game changer. Fsr doesn't come close.
Amd has fsr and asus rog ally has the z1 and z1 extreme chips which are amazing for a handheld
I don't care about that; we have a GPU that can connect through USB, the future is bright.
Fsr
This GPU is $450 lmao. It's meant for AI.
Unfortunately it looks like Nvidia doesn't want to or care about competing with AMD on handheld SOCs.
You should definitely make an emulation video of this system!!! 😁😁😁
I mean testing emulation on this laptop with the external A500
This is a GREAT idea. I agree we need a little TB dock with one of these inside. Handhelds and laptops that just need a LITTLE more power would be HUGE here. That being said, Intel putting low power ARC in systems could be helpful also.
Can this be used together with the Steam Deck?
Unfortunately no. The steam deck is not compatible with external GPUs over USB-C.
In this case, the steam deck does have display OUT over USB-C, but not the "thunderbolt" protocol which is "sort of" like having PCIE passthrough over the cable, which would allow a dock to be connected.
There ARE handhelds with thunderbolt support, but the steam deck does NOT have it.
For 25 watts, that's not bad performance. That's essentially a 2060
No.
It's a cut down (fewer cores) and power limited (much lower clock speed) 3050 and the 3050 is already slower than the 2060. It's maybe a 1650 Super. Before taking the additional 25% performance hit of looping the video back through TB3/USB4.
@@lewzealand4717 I think it performs like a 1660, basically a 3050
1650 .
@@lewzealand4717 Thanks for the correction!
For a graphics card to still have this performance at 25 watts, the only other option is a 1030, so look at the relative performance. For that wattage of GPU, the only thing close is a 750 Ti.
@@user-um9sl1kj6u it would be a sweet sff gpu if they made it 50 watts
With 4GB of VRAM, it is really not going to be much use for AI training, and larger models won't fit into memory for inference. It sounds like a good idea but it will only be useful for people with very specific requirements.
Yeah not for training, but for inference?
@@ghardware_3034 Most inference LLMs like Llama 2 are about 6GB with 7B parameters and 4-bit quantization (not very good results), and for 13B parameters (better results) you need at least 12. For image generation, SD 1.5 weighs between 2.5 and 4GB, but even with 2.5GB models you don't have a lot of space for upscaling or ControlNet.
@@qtsssim This thing isn't aimed at gamers.
Hope it's not aimed at people who want to do inference; it wouldn't even run LLMs, TTS, SD... this is useless, you need at the very least 8GB of VRAM @@lequack8861
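The VRAM arithmetic in this thread is easy to sketch: model weights alone take roughly parameters × bits-per-weight ÷ 8 bytes. A rough Python estimate (an approximation; real usage adds KV cache and framework overhead on top of the weights):

```python
# Weights-only VRAM estimate for LLM inference. Real usage is higher
# because of the KV cache and runtime overhead.
def weight_gb(params_billion, bits_per_weight):
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9  # decimal GB

for params, bits in [(7, 4), (7, 16), (13, 4)]:
    print(f"{params}B @ {bits}-bit: ~{weight_gb(params, bits):.1f} GB weights")
```

Even a 4-bit 7B model needs about 3.5 GB for weights alone, leaving almost nothing of a 4GB card for the KV cache, which is why the replies above land at 6GB+ as a practical floor.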
Imagine using this on the Steam Deck; it would have been huge, and the coolest thing ever
Ok, I will buy this for sure. That looks like a great portable AI card; goodbye to my old massive eGPU setup for AI :D
Edit:
This is **not** a shame on the reviewer or the company. ETA Prime is a fantastic reviewer and has done a fantastic job here. The company (ADLink) is also doing a fantastic job as well. The shame here is **specifically** on Nvidia.
End Edit
If it were cheaper I'd say this would actually be fantastic as a plex type video out.
Combined with a framework laptop this thing would absolutely crush for all things media, especially if you're a streamer on a budget as AV1 out is now the rising standard.
But at 500 USD (including tax and shipping) the answer is a resounding no. Nvidia is far too overpriced. At 450 bucks you may as well go with a Microcenter external GPU dock and a 6750 XT if you're wanting external gaming capabilities.
Or if you're into AI or datacentric compute then that 450 USD should go to a 4070 for that price.
Portability is not an excuse for price gouging.
Overall capability, capacity, performance and product quality scores a 2.5 out of 11.
The diminishing category here is price to performance (with specific regards to AI/compute) with a score of zero.
not much of a reviewer, more like an advertiser... never compares stuff with other stuff just the machine itself and they are all "great and amazing"
Way too expensive
That's my complaint. He is given these items to "review" but it always sounds like an ad read. A lot of youtubers fall into that. They refuse to say anything negative whatsoever to keep a friendly reputation with these companies.
At least he's not like Linus and using the items completely wrong like a moron.
They charge the price they want because it is a niche use case, novelty product; can't really fight that right now. Portability is VERY MUCH an excuse. You can see that on smartphones or even laptops 24/7/365, where even a slim or 14" version of the exact same Lenovo Legion, Razer Blade or Asus ROG Zephyrus commands a 500-1.5k premium. If the format gets popular we might see stuff like it cheaper and in more diverse feature sets, which I hope for.
Also, I'd argue for Plex these days you can use much more convenient hardware, like anything Intel 7th onwards NUC or 1L PC (youtube search for tiny mini micro), or even an Nvidia SBC if you want that transcoding powerhouse but that is asking for trouble. If you mean as a client plex device, the Nvidia shield has been the best target for a few years now but really any Qualcomm/Mediatek/Broadcom "smart dongle" or "cast" ort "tv" device works just fine, no need to spend over 200 bucks let alone buy a 450 buck GPU and still have to add the cost of a (Thunderbolt-compliant no less) computer on top.
@@cldpt
Disagree completely as phones, laptops, tablets, and every other portable machine can conjunctively use its total compute capacity to achieve average and beyond average results in an AI/Compute exclusive environment.
For example my Pixel 7 (non pro) is more capable for language and image generation locally- on device. While the overall performance is the same as this little box, it is at least a complete compute package while the A500 external is not. Therefore its overall capacity and capability score lower than even a Pixel 2.
There is no valid argument when it comes to a comparison to other complete devices. That's an apples to oranges comparison.
Even when comparing this box to a gtx1650 or rtx 3050 the overall compute performance is completely lacking. Even the 7600M XT can run games, ai, and compute tasks (such as cad and blender) at a reasonable baseline.
This little box is only good enough for a home server style video out and that's about it. When it comes to object recognition, stable diffusion, language processing, ai tasking, and more you're better off with a 35-40 USD tpu device- as its just as capable and costs one-tenth the price.
ugh, 4GB. not enough for using Stable Diffusion...
Yeah 6GB is the bare minimum for decent performance.
This is extremely cool. I wonder if my work would let me plug it into my work laptop
This thing is so impressive I might have to snag one of them.
It's only 4GB; while it might be great in another year, this is basically false advertisement at this point in time.
Still better than most Intel HD chips that eat system memory to work. 👍
@@rmcdudmk212 facts
Engineered for AI, so not built to render graphics
It's not false advertisement though. He described what the product is for, and it sounds like the website for it does that, too.
@@RageyRage82 yea, but if you are a delusional kid who feels entitled to everything, this is false advertisement because it isn't what they want it to be
This is innovation ! This is the right direction for mobile GPUS ❤
Bro tech is getting wild...🔥
do you think this would help the rog ally z1 non extreme version? a bit worried that the pocket ai won't do pass through charging though
I installed Windows on my phone and plugged this into it cuz my phone uses USB4.
Surprisingly, it works
Now this is a good idea
Nice!!! Battery-powered media servers get even better with an extendable GPU over Thunderbolt!! Thaz pritty handy!! 🎉
When you get your hands on the legion go this would be a nice addition to test with
Give that thing dedicated graphics out and I'm sold!!
things like this interest me a lot because I basically live off of a laptop these days, but sadly I don't think I have a USB4 port
Makes me wish Raspberry Pi had TB4 so you could do really small scale AI stuff
The power efficiency is 😮
hey man, I really enjoy your content. I'm just wondering how this pocket eGPU fares with the Lenovo Legion Go, cause I'll be traveling a lot and unfortunately have to work while traveling, but don't want to bring a huge PC. If you could, man, I'd really appreciate a video with the Legion Go paired with the Pocket AI! Cheers mate!
A pure tensor and RT add in card to support the gpu would be great as well
question: do you think this would work with the upcoming Lenovo Legion Go? If so it's a buy for me. Or how about using this on a MacBook? thanks 😎
Hmmm... interesting. Hopefully things like this become more common which results in more choices, different power levels.