Nvidia RTX A4000, Worth Buying? Gaming Benchmark Review
- Added 24 Jul 2024
- Support us on Patreon: / hardwareunboxed
Join us on Floatplane: www.floatplane.com/channel/Ha...
Buy relevant products from Amazon, Newegg and others below:
GeForce RTX 3060 Ti - geni.us/yqtTGn3
GeForce RTX 3070 - geni.us/Kfso1
GeForce RTX 3080 - geni.us/7xgj
GeForce RTX 3090 - geni.us/R8gg
Radeon RX 6800 - geni.us/Ps1fpex
Radeon RX 6800 XT - geni.us/yxrJUJm
Radeon RX 6900 XT - geni.us/5baeGU
Radeon RX 5700 XT - geni.us/L6Mb2JK
Radeon RX 5700 - geni.us/LZ2jn7
Read the written article at TechSpot: www.techspot.com/review/2342-...
Video Index:
00:00 - Welcome back to Hardware Unboxed
05:20 - RTX A4000, Stock Operation
05:34 - RTX A4000, Overclocked Operation
05:56 - RTX 3070 Ti, Stock Operation
06:30 - F1 2020
07:18 - Cyberpunk 2077
08:05 - Death Stranding
08:43 - Horizon Zero Dawn
09:23 - Rainbow Six Siege
09:57 - Watch Dogs Legion
10:19 - Power Consumption
11:27 - Average 1440p
12:27 - Average 4K
12:45 - Final Thoughts
Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
FOLLOW US IN THESE PLACES FOR UPDATES
Twitter - / hardwareunboxed
Facebook - / hardwareunboxed
Instagram - / hardwareunboxed
Music By: / lakeyinspired - Science & Technology
00:33 - Should have called the channel “Gamers Nexus”
Back to you, Steve 👐🤲🙌
This comment deserves to be pinned
Steve: Reeeeeee....
Thank you, Steve.
In this episode, our engineers Patrick and Patrick defuse a power supply and later put out a riser card fire by installing it into an airtight computer case.
@@Alpostpone yes, NZXT cases are good for something
"thanks Steve"
We're at a point where people discuss whether or not $1200+ workstation GPUs are an affordable alternative to mid-range gaming GPUs. Let that sink in.
Lol XD
mfs would do anything to not buy amd
@@butifarras mmd xD
Its an option until the stinking miners start getting their hands on it
@@butifarras MFs can't even find AMD
For the power consumption and considering it’s not technically a gaming card that is actually really impressive. Even if you wanted it just for the physical size. Cards these days are getting ridiculously big.
Sadly efficiency is not "hip" these days
@@GewelReal It's because Nvidia needs the power to get the performance, so now you don't hear fanboys complaining about power consumption anymore. Lol, they're even fine with buying ridiculous new PSUs just to power these cards and make sure they'll run. There's that.
@Logical It's pretty awful seeing GPU manufacturers not care that much about power consumption in the name of getting the performance crown.
@@KimBoKastekniv47 HD5970 was a dual GPU
@@mix3k818 why is it awful, just don't buy them and let people who don't care about the power consumption enjoy the performance...you can buy a low tier model or hell even undervolt
Love the Gamers Nexus joke. You should totally have used the opportunity to go "Back to you Steve"
You know times are desperate when people consider professional GPUs (which are usually 2x-3x the price of gaming equivalent) for their daily tasks.
Dude I bought a second hand Radeon Pro WX3100 to replace my RX 580 which died in less than 6 months.
The GPU was much cheaper and hey, it's even able to play games pretty well!
Not at max graphics, but the game quality and performance is still really good!
No one should do this.
Thanks to scalpers and miners paying whatever, they've successfully driven the price for consumer video cards to the cost of professional GPUs.....
@@GabrielTobing Well, yeah, obviously it is possible to find great deals on second-hand markets. But we're talking new cards primarily, and that's what the video was about.
I’ve looked into getting a lower end pro gpu for my station because i do graphic design work, but I also want to use it for gaming to kill two birds with one stone
Not having RGB makes this card 15% slower. Missed opportunity from Nvidia.
Such a performance loss could be easily reimbursed by 2 holographic stickers.
@@thenotsookayguy 😏 a man with taste
@@thenotsookayguy slap stickers, “this bad boi is gonna go even more brrrr-er now.”
RGB also lowers temperatures, giving more headroom for overclocking by converting heat to light. :P
Man of culture
production benchmarks would be interesting to see as well
especially for a production marketed card...
@@patricktho6546 Benchmarks are out there already. They're not really marketed as production cards anymore, because they suck at it just like with games. They're great compute cards with far superior FP64 performance.
@@glowyboi7175 sorry, for me computing etc. is also production^^
So anything that makes money with the usage of a card. Not recreational like gamers.
@@glowyboi7175 actually quadro gpus thanks to it's drivers, have a significant performance improvement on cad or viewport scenarios compared to the gaming counterpart
@@gigigigiotto1673 Only in Solidworks & Catia. That is complete crap for 3D viewports like Maya & Max.
Hmm gamers nexus has a good ring to it…. almost like I’ve heard it somewhere before…
Kinda melds all the gamers together.
"Back to you Steve, no not that Steve" lol
@@drunknmasta90 we are all glued to the content.
@@user-pn1fe6sg2w "so just buy it "
Its kind of a graphical challenge
Now we have to consider rendering cards for gaming (usually 1/3-1/2 the value for money compared with GeForce at launch prices)... Crazy world.
High powered / faster GeForce cards are the preferred choice in GPU rendering, but yeah I know what you meant. These are data science / FP64 compute cards.
@@glowyboi7175 I agree with you; unfortunately academic institutions and universities usually do not (the previous Quadro and current RTX pro product lines are usually the only choice for us; in fact, in most cases OEM workstations are the only choice...).
Gamer-proof professional card. 63 MH/s.
@@SuperSuperka So only 2 MH/s higher than a 3060 Ti..? Sheesh.
@@kimnorberg9783 That's because reliability and accuracy are far more important in those environments than speeds.
I'd be curious to see how the card's performance is affected by the ECC memory. As someone who tinkers and games on Teslas and older Quadro cards, I tend to see a lot of people ignore the penalty incurred by running the VRAM with ECC enabled. In the case of some of the Teslas I've used with GPU passthrough, I've seen performance boosts of 10-15% in some titles just from disabling ECC. Using the nvidia-smi command line tools, you can disable ECC on most of these cards pretty easily, and I believe this setting can also be adjusted per application in the Nvidia control panel.
I figured the same. So would it be equal to the 3070 when the A4000 is overclocked?
Doesn't gddr6 by default have ECC enabled anyway?
@@MrAvatin Er no, no ECC on consumer cards
Well, ECC has two primary trade-offs, which I'm gonna guess you know, since you are using a Quadro-class card...
There is an EC step for every cycle or memory transfer, and it adds about a 2-5% speed cost to memory.
And the second is about timing tightness: it can't be as tight as top-of-the-line OCed RAM sticks.
But there are two serious pros: 1. stability.
2. It allows you to log stuff rather than guesstimate from system stability, for things like overclocking or just day-to-day health checks.
For GPU VRAM the second con is irrelevant, and you can OC your memory a bit after the fact to get back some performance. So overall it's totally worth it to turn on ECC if you have it; it won't hurt much.
I do have a Quadro card. I would like to disable ECC for a try. What is the command line you use on nvidia-smi to disable ECC?
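For anyone wanting to try the ECC toggle described above, the nvidia-smi commands look roughly like this. These are standard nvidia-smi flags, but whether ECC control is actually exposed depends on the specific card and driver, so treat this as a sketch rather than a guaranteed recipe:

```shell
# Show the current and pending ECC state for GPU 0
nvidia-smi -i 0 -q -d ECC

# Request ECC disabled (-e 1 would re-enable it); needs admin rights
sudo nvidia-smi -i 0 -e 0

# The pending ECC state only takes effect after a restart
sudo reboot
```

On Windows the same flags work from an elevated prompt, minus the sudo.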
This just shows how "gaming" grade graphics cards are pushed well past the voltage efficiency point and run the lowest quality silicon. 120 watts less for what is essentially only a 6% drop in performance. I'm sure with a better cooler, and thus lower temperatures, it'd be even less, as for every 10 degrees cooler you gain about 4% efficiency.
Also, the 3070 and below should have been made with GDDR6 instead of GDDR6X. (edit: I was mistaken on this.)
@@MikaelKKarlsson 3070 and below are GDDR6, only 3070TI and above are GDDR6X.
This is exactly why I have been overclocking and undervolting for gaming forever. Under very few circumstances am I unable to get within 10% of maximum performance (based on FPS, not other metrics) at 65-75% power draw when the card is tuned properly, not to mention it lasts longer as it runs cooler at its peak temps than out of the box settings.
@@dagarath Many underestimate undervolting on GPU's. My experience with undervolting is similar to yours. I get almost the same performance as stock, but with about 70% of the original power draw.
@@dagarath I UV my GTX 1080 but by necessity as it has a crap cooler on it. 0.9v @1923 MHz and 11.6GHz memory and it runs OC'd at 135W instead of 180W. Also 70°C instead of 80+°C.
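A driver-level approximation of the undervolting described in this thread is to cap board power and lock clocks with nvidia-smi; you don't get direct voltage control this way, but the effect is similar. The flags are standard nvidia-smi options, while the 140 W cap and the clock range below are illustrative values, not recommendations for any specific card:

```shell
sudo nvidia-smi -pm 1            # persistence mode, so the settings stick
sudo nvidia-smi -pl 140          # cap board power at 140 W (example value)
sudo nvidia-smi -lgc 1400,1800   # lock core clocks to a 1400-1800 MHz range
sudo nvidia-smi -rgc             # undo the clock lock later
```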
As an IT manager, I sincerely hope people don't start buying these up for gaming/mining because I need them for our business (CAD/CAM work).
Yup. I use them for CT reconstructions.
lol you re doomed
As someone who works in enterprise market, I was always curious how this one would perform for gaming. Excellent as always at HU.
Its "enterprise", its a professional tool, for rendering CGI, for example, or for mining. Of course it's working fine when 3-7 of them in just one PC.
Crazy efficient bugger, isn't it? First time I've ever seen a card become more efficient after an OC. Almost looks like Nvidia ships it with suboptimal clock speeds out of the box. Could it be a bargain some time in future render farm fire sales?
It doesn't look that crazy when you think of it as the same thing as a laptop 3080... This has been the reality since Maxwell or so: gaming cards get pushed harder and harder, so they are way off the efficient portion of the V/F curve.
@@wewillrockyou1986 The crappy small cooler looks like they just removed it from a laptop.
For someone who plans on rendering while on battery solar power this is my best option.
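The V/F-curve point made above can be illustrated with a toy model: dynamic power scales roughly with f x V^2, and voltage has to rise to sustain higher frequency, so power climbs much faster than performance near the top of the curve. The constants below are invented for illustration and not fitted to any real card:

```python
def power_watts(freq_ghz: float) -> float:
    """Toy GPU power model: P ~ f * V^2 with an assumed linear V/F curve."""
    volts = 0.7 + 0.3 * freq_ghz          # assumed voltage needed at this clock
    return 100.0 * freq_ghz * volts ** 2  # arbitrary scaling constant

for f in (1.4, 1.7, 2.0):
    print(f"{f:.1f} GHz -> {power_watts(f):.0f} W, "
          f"{1000 * f / power_watts(f):.2f} MHz per watt")
```

In this model, going from 1.4 to 2.0 GHz is a ~43% performance gain for nearly double the power, which is the same shape as the A4000-vs-3070 numbers in the video.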
😂 Gamers Nexus ...
Next you gonna name yourself Steve too 😂
This might be a great used option in a few years when datacenters (or similar) upgrade to newer cards. Definitely need to keep an eye out for it
Very good point! 3060TI/3070 performance, ultra slim form factor, sips power...not bad if you can get it cheap in the future! oh and 16GB of ram to boot!
These don't typically end up in datacenters, these are workstation cards. The datacenter cards don't have fans.
4 sure!!😁
How often do datacenters upgrade to new cards? If it's 3-4 years, by the time this hits the second-hand market it will probably be overpriced for gaming, assuming normal GPU prices!
This isn't really a card you'd see in most data-centres, this is for high end workstations used in 3D rendering/video production
Looking at how tiny the card is once stripped down, it's a really good option for an ultra-compact Mini-ITX build that gives you performance between a 3060 and 3070.
It's not clear to me why you would want to build an "ultra compact Mini-ITX" PC, unless you live in a refrigerator box. And I'm pretty certain you're not taking your PC to LAN parties every weekend (and even so, guys regularly lugged their full towers and CRTs). Is it fashion conformity, or what?
@@bricaaron3978 I just like stuff taking up less space, thats it.
@@bricaaron3978 Well, maybe I want to use it as an HTPC in a small space, or maybe I want it on my desk and the desk is small, or maybe I have a small nook that can only fit a small PC. Just because you don't have a clear use for this doesn't mean there isn't one. I have an office, a theatre room and a living room, all with PCs in them, and I can tell you now, my wife would go mental if I plonked a clunker PC case in the living room. Instead, I have a nice compact mini-ITX build in a case that looks like a stereo component, and she has no idea it's there. So I have a clear-cut use case for such a small compact card, as I play a large number of games on my HTPCs and any performance boost is welcome.
@@NexGen-3D *"Well maybe I want to use it as HTPC "*
You didn't say that. You indicated 3060 - 3070 performance, which isn't necessary for an HTPC.
Of course an HTPC build in such a form factor makes sense. But deliberately trying to build a gaming PC in the smallest volume possible is counterproductive to the performance and upgradeability that gaming requires. I am only offering my opinion of course, but I can't help but liken this to the trend of making laptops, phones, and TVs as tiny and thin as possible for literally no good reason, even when it objectively reduces functionality and increases fragility.
It does seem from your answer that fashion is a factor...even if it's not your choice. :P
@@bricaaron3978 The fuck are you talking about. Counterproductive to what? Your personal (and shit to boot) opinion on what gaming is. Are you the sort of person that buy a 3090 if given the chance despite its ridiculous size and power draw?
I wonder how it handles multiple vms.
In my recent experience with Maxwell Quadros I've had no luck overclocking. Are Nvidia being less precious about this with the new models, or did I miss something?
I’m looking at this for an ITX low power build, but have yet to find 2 slot aftermarket air coolers for GPUs. Was there any thermal throttling?
"Quadro" sounds far better than "A series."
Odd to drop a name no one hated and everyone understood.
@@zodwraith5745 We are talking about Nvidia here. They want to be hated.
@@hellosmarty7103 It's not just Nvidia. The entire manufacturing industry has gone to goofy naming schemes. Remember when batteries were just AA, AAA, C and D?
You could walk into any store and ask for these batteries and every salesperson knew what you were talking about. Now we've got a virtual alphanumeric soup of different battery sizes and types. Walk into a 7-Eleven and ask for an 18650 and see what look you get from the salesperson.
@@krane15 Walk into any vape shop and ask for an 18650, 21700 or any other similar battery, though, and they'll know exactly what you're talking about. Batteries were never just AA, AAA, C and D. What about PP3, PP9? LR44? I bet you'd get confused looks from a non-techie if you asked for a CR2032, as well.
hardware nexus, gamer unboxed
Nexus unboxed, Hardware Gamer?
Tech nexus,Tips unboxed
Watercooling would probably allow you to raise the overclock quite a bit. There are universal GPU die blocks from EK and Alphacool as well as Aliexpress brands.
The RTX A4000 might retain a better used resell value later on though, if the gaming market unfucks itself.
It certainly will. Old Quadros lose their value really slowly; in the past they were hardly affected by new generations coming along.
@@salmiakki5638 yep. I was looking for the GV100 to see if the price ever went down. Nope, still an eye-watering 10k USD.
You mean.. it'll remain valuable if it DOESNT unfuck itself. When the market goes back to normal.. that cards value will drop a good bit just like all other cards.
@@christophermullins7163 No, I mean that compared to the 3070, the A4000 hasn't jumped as dramatically in price, at least not here. I can find an A4000 for about $150 above its initial MSRP of $1000, while the only 3070 I could find in stock was about $900 above its intended MSRP. With the MSRP of the 3070 being $600, the 3070 is currently listed at about 250% of its intended price, while the A4000 is priced at about 115%.
If the market returns to MSRP pricing, the A4000 will have lost about $150 in value (plus however much it loses for being used), while the 3070 will have lost $900, plus however much it drops below MSRP for being used rather than brand new.
@@mackan072 You said, If the market returns to MSRP pricing
Care to tell us when!!!!!
I would love to know how it performs the tasks that it was intended for.
And I would love to see RTX 3000 series cards against the A4000 in those tasks.
@@joe_ferreira I would like to see the A5000 vs the 3090 (similar price tag). I spent ~£1700 on a 3090 but wonder whether the A5000 wouldn't have been a better idea for my primary use (CAD) and secondary gaming.
@@lukenfoci Some Quadro vs Geforce tests have already been made in the past. Generally pro cards are better with the expensive CAD suite but the blower fan design makes them more suited for rendering server blades than workstations.
@@PainterVierax Can you share any links? I could find some tests but they were more render focused than CAD.
Well, if you're building a rendering beast you can put several of these in a single board. Hence why they have lower power usage and single slot coolers. (Rendering doesn't require SLI fingers)
That's what I don't get about people paying for high-end Threadripper, cause 4 of these would annihilate it in rendering (given you use a renderer that supports GPUs; most popular ones do). Even 1 A4000 will completely destroy a Threadripper in V-Ray. For the money, 4 of these would make that $4k TR laughably overpriced. TR is an expensive toy unless you have _very_ specific needs that can't be offloaded to the GPU, like 7zip. If you're doing compression that takes more than a couple of minutes, a 5950X might be worth it, but not TR. Even compressing 50GB only takes like 10 minutes on an 8/16 CPU.
This is why I skip over all the "workstation" benchmarks on CPU reviews. They're 99% meaningless to me when everything is faster on the GPU. If you're rendering on your CPU alone you're already doing it wrong.
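Taking the numbers from the comment above at face value (a $4k Threadripper, roughly $1000 per A4000 at MSRP, and one A4000 matching the CPU in a GPU renderer; all of these are the commenter's assumptions, not measured figures), the price/performance gap is easy to sketch:

```python
def perf_per_dollar(relative_speed: float, price: float) -> float:
    """Relative render throughput per dollar spent."""
    return relative_speed / price

A4000_PRICE = 1000.0   # assumed MSRP per card
TR_PRICE = 4000.0      # Threadripper price quoted in the comment

# Four cards at ~1x CPU speed each, vs one CPU at 1x
gpu_node = perf_per_dollar(4 * 1.0, 4 * A4000_PRICE)
cpu_node = perf_per_dollar(1.0, TR_PRICE)

print(f"GPU node: {gpu_node:.6f} perf/$")
print(f"CPU node: {cpu_node:.6f} perf/$")
print(f"GPU advantage: {gpu_node / cpu_node:.1f}x")  # 4.0x under these assumptions
```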
Hey, great review. I am a reseller of NVIDIA in Australia (if you are buying hardware, hit me up, happy to supply at cost or test units where possible). The RTX A series is an enterprise card that is designed and built to run 24/7 under full load, whilst gaming cards will never be under full load for prolonged periods of time. You can use enterprise cards for gaming, but you can't use gaming cards for enterprise work. E.g. if you are doing AI work or rendering that can take days, that will fry a gaming card in no time, because gaming cards are simply not designed to run under full load for days at a time.
Ummm mining?
@@alanpeirce7292 Miners downclock the core; mining stresses the VRAM much more. I run my 2080 Ti at ~125W (1110MHz). In gaming it can draw up to 370W (2050MHz). I would not run it at full clocks 24/7, but at 125W it's perfectly fine running 24/7. Jessie is correct in that you would not want to run a full-clock gaming-like workload 24/7. It would definitely shorten the card's lifespan.
@@Raivo_K primary reason for under volting and under clocking is for energy efficiency.
Not saying it won’t shorten the life span, but it won’t ‘fry the gaming card’ in no time.
@@Raivo_K A friend bought a 1080 Ti that had been mining for 4 years, running almost the whole time except for dust removal, which took about a day a month or less. We changed the thermal pads in it and it runs at 1935MHz with the same performance as day 1. It's just how you manage the card: if it sits around 50-60°C you can use it for at least 10 years. (The seller said the card was running at 1950MHz; they didn't downclock it.)
lol, that's straight up BS. Many people mine with their gaming cards and plenty of home users use consumer grade GPU's for tasks like folding at home, mining, rendering, etc.
The GTX 1080 I'm using right now was second hand from a uni student that used it in a render farm for some science research projects. I also use gaming cards for password cracking with hashcat for cyber sec studies.
Stop lying.
What are the mounting holes like on the A4000? If they're similar or the same as the 20/30 series, a Kraken G12 AIO bracket mount might work for water cooling without a custom loop.
How did you overclock this card? The sliders don't stick in MSI Afterburner when you try to overclock workstation cards.
I would absolutely love to see how this does in a mitx system, the single slot would actually be really good for those types of systems
@@DarkMikaruX Agreed there, I commented a bit before the results. It's an interesting card all the same though. I do wish they made more single-slot cards; there's sooo much you can do with them. But I know this is down to the cooling and power requirements of this gen especially. Edit: the 1070 katana was a really cool card.
@Phoenix that's a good shout, would definitely be interesting to see
@@DarkMikaruX I am building a Velkase Velka 5 build rn and I am using the A4000 in it. I'm excited!
@@DarkMikaruX I run Solidworks and Ansys for school so getting a supported card at this price was a must
@@DarkMikaruX I'm probably gonna get the 12900K when it drops late Oct/ early nov
"Should have called the channel Gamers Nexus."
Watch out, you may summon Hardcore Steve to the channel.
how many steves would we need :D
Quite an interesting video! I was wondering if you were going to end up benchmarking this card when that question was asked in the Q&A. I'm glad you guys ended up exploring this avenue! I've always wondered how workstation cards matched up to GeForce cards in terms of gaming, and now I kind of know, at least for the A4000 compared to the 3070 anyway 🙂 Love the content as well as the in-depth benchmarks, thanks for making this a great tech channel!
How did you manage to overclock the A4000? With Maxwell series Quadro cards I had no luck with the likes of Afterburner and had to resort to vBIOS mod to get an extra 25% out of my M4000.
This might be the closest successor to the legendary 1070 katana one can find out there! Such an interesting card with its 1 slot size, amount of vram and power efficiency. Damn! I wonder how deshrouding would work out... I'd also love seeing its performance in other applications if you have the time to check that out.
Every quadro in the x4000 line has had this form factor
I love the sleek design of this card. Frankly, it's not terrible for gaming after OC. Especially if it's around the same price as a 3070 given it has more RAM and it is also fully unlocked for scientific/CAD/CAM workloads if you happen to need to use it for these.
Do you guys use a specific Windows build when benchmarking, like Atlas or Revi?
is it limited by the TDP or actual hardware clock limit?
In Romania (EU) it is cheaper than or the same price as regular 3070 cards, so I would definitely consider buying it for the 16GB of ECC memory (around $1300, VAT included).
I got my Asus Tuf RTX 3070Ti in Finland for 999€ which i'm satisfied with, even though it is obviously a bit expensive.
@@Manakuski My god, you folks in the EU are getting robbed blind and here I thought $780 for my 3070 was too much
@@ij6708 24% of what i paid is tax. Which is included in the 999€ i paid. Also mine is a 3070Ti, not a regular 3070.
@@ij6708 No. The tax is included in our prices whereas in the US, you have to add sales tax to the price. Overall, prices are more or less comparable.
@@boden_staendig $780 was what I paid including tax although it is less than that other chap, 12% here. It's not only this comment, I've seen plenty of EU folk stating ridiculous prices, you guys are definitely getting gouged. Well we all are, just over there is more than most
The A2000 comes in a low profile version, FYI. Right now it's the only way to get an RTX 3050 desktop card.
So this would be 1660S class performance, but with DLSS and RT cores.
Yeap the A2000 looks very interesting.
@@raresmacovei8382 ray tracing on rtx 2060 and below doesn't make sense anyway, might as well get a 1660s
@@tyre1337 but DLSS makes a lot of sense which you don't get with the 1660s
@@samgoff5289 6GB VRAM >>>>>>>>>> 4GB.
Battlefield and FC6 at 1080p low already use more than 4GB,
and in two years 1080p low will use >5GB of VRAM, I think.
Hi, may I ask what the difference is between the laptop and desktop versions of the RTX A4000? I know they differ in TDP and memory. Is the CUDA core count different between them as well?
Is it capable of post-production work on movie film files?
Didn't you test the FC6 HD texture pack and Doom Eternal at ultra nightmare settings to see if the 16GB of VRAM helps compared to the 3070's 8GB?
I have the same question, although I hear even the 3090 is having trouble with textures in FC6 despite its massive 24gb buffer
@@MrAtthedrivein925 Probably an optimisation issue that Ubisoft and Nvidia will address.
I've loved this card since they first started coming out; it all comes down to the bad prices at the moment. I would give up the 3070 performance for 16GB of GDDR6, easy. It shows how much better the Ampere lineup could be.
Hi can I know this RTX A4000 will support for Lumion 12 version
Hold up there Steve, you just did the Far Cry 6 benchmarks, right? At 4K, how many GB of VRAM is required for the HD texture pack with ray tracing enabled? I just saw that it needs 11GB of VRAM to run both the HD texture pack and ray tracing at the same time, meaning the 3080 is not capable of ray tracing and HD textures simultaneously. Can you verify this? That said, does this mean the A4000 with 16GB of VRAM just side-stepped this issue while the 3080, 3070 & 3070 Ti just got invalidated for 4K gaming? I bought a 3080 expressly to play everything at 4K ultra, including ray tracing & DLSS. My aim is 60FPS. My first encounter with insufficient VRAM was Black Ops Cold War with the VRAM warning, then Half-Life: Alyx, now Far Cry 6. I did not expect to see insufficient VRAM less than a year after buying a 3080, much less encounter it 3 times before the first year was fully up. Really disappointing.
Would be interesting to see how the Radeon Pro fares in comparison to it's gaming equivalent as well.
Not really... AMD workstation cards are kind of a joke 😂 their gaming GPUs are good tho
I think a couple of YouTubers have done this testing and they do not perform very well haha!
radeon pro are not good as a workstation card, so i can’t see a reason to even own one in the first place
@@samgoff5289 Do you guys know why those are a joke?
@@makisekurisu4674 Sam, beast and peter, nvidia shills.
That Gamers Nexus bit caught me off guard. Good one, Steve!
Thanks Steve, back to you Steve.
What brand and model of rx 5600xt are you using in the benchmarks?
can you do some smaller productivity benchmarks like affinity suite, adobe suite, davinci resolve, capture one?
Strange times indeed; reminds me of the GeForce 560 Ti, an absolute power hog that ran stupidly hot with even the best cooler. A little while later the 750 Ti was released with a single fan, the whole card running off the PCIe slot with no external power needed, and at the same time it destroyed the 560's performance.
This card must be running at about the most efficient clock speeds for RTX 30XX cards.
For AI work it's got to be a winner
So in other words, it's good at the things it's designed for and not so much at the things it isn't (gaming)
@@tissueoflies2780 More like the clock speed was chosen for the most efficient power per unit of work done, and the bigger RAM is better for professional work.
If the RTX 3070 had more RAM it would have been faster, thanks to its higher clock speed; lots of professional apps that need VRAM will just crash or error out on cards with too little VRAM.
It's not like games, where some VRAM contents can be spilled to system RAM so you only get stutters; in pro apps you just fail.
Most AI workloads are optimised for 11GB of VRAM or higher thanks to the 1080 Ti having 11GB of VRAM. I suspect Nvidia never wanted that to happen a second time, so they moved back to 8GB of VRAM (I have no idea what happened with the 3060, as it's now the go-to GPU for AI apps on a budget).
For AI, we look into RTX A6000 48GB x2 (NVLinked) configurations. Some deep networks are very big. I was hoping AMD would enable AI on the iGPUs found in the Ryzen 5700 so we could install large system RAM to work with, but they didn't. And A100s are crazy expensive.
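The VRAM ceiling the comments above describe can be sanity-checked with simple arithmetic: weights take parameters times bytes per parameter, and training typically needs a few times that for gradients, optimizer state and activations. The 3x overhead factor and the 700M-parameter example below are illustrative assumptions, not measurements of any particular network:

```python
def model_vram_gb(params_millions: float, bytes_per_param: int = 4,
                  overhead_factor: float = 3.0) -> float:
    """Crude training-VRAM estimate: weight bytes times an overhead
    multiplier for gradients, optimizer state and activations."""
    weights_bytes = params_millions * 1e6 * bytes_per_param
    return weights_bytes * overhead_factor / 1e9

need = model_vram_gb(700)  # hypothetical 700M-parameter FP32 network
for card, vram in [("8GB (3070)", 8), ("11GB (1080 Ti)", 11), ("16GB (A4000)", 16)]:
    fits = "fits" if need <= vram else "does not fit"
    print(f"{card}: {fits} ({need:.1f} GB needed)")
```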
Can you recoup some of the cost by using it in your work station?
Great review, I came here because in a recent 'Moore's Law Is Dead' video Tom mentioned that there was a review of the A4000 in the works at 'Hardware Unboxed' since then I have been checking the video section and it was really worth it.
The A4000 is kind of $500 or less on eBay now.
I do wonder if you could scavenge a thicker air cooler off a different card? Kind of looks like it might fit an evga single fan cooler
The only problem is the power connector. It comes out the back of the cooler, and in some of the shots you can see it. The power plug is basically set up as an extension off the main board. Switching to a different cooler is unlikely to get you support for those cables. Should something break it would require some soldering to repair.
It still has a worse VRM, since it's designed for lower power consumption, and considering VRM heat dissipation isn't linear and has an efficiency curve, I don't think you could OC it much further on ambient cooling; maaaybe you could reach 3070 base-spec performance.
@@ThunderingRoar Sure, but for noise purposes at barely-above-stock frequencies it can make sense to get rid of the blower.
Could you do some tests doing video encoding using handbrake and perhaps other programs and compare it to a 3070 ?
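For the kind of comparison asked about here, a matched pair of HandBrake command-line encodes would look something like the following. The encoder names are HandBrake's standard identifiers; the file names and quality value are placeholders:

```shell
# GPU (NVENC) HEVC encode
HandBrakeCLI -i input.mkv -o out_nvenc.mp4 -e nvenc_h265 -q 24 --all-audio

# CPU (x265) encode of the same source, as the comparison baseline
HandBrakeCLI -i input.mkv -o out_x265.mp4 -e x265 -q 24 --all-audio
```

Worth remembering that NVENC quality at the same -q value differs from x265, so comparing speed and file size alone doesn't tell the whole story.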
I remember when the RTX 3070 Ti was first introduced, the difference between the 3070 Ti and the 3070 was not noticeable, but now I see major differences. Did they update the drivers to boost the Ti's performance?
I saw someone on Reddit swap the cooler on this for one from a 3060, seemed like a really nice card but just not worth it with the power limit.
This GPU is not designed for games. Test it in 3D software like Maya, Unreal and 3ds Max and you will find the difference.
I just wanted to say thanks for the in depth testing.
In the past, before today's GPU pricing madness, I have seen second-hand workstation cards on sale for less than their equivalent GeForce card, as workstation users often tend to upgrade GPUs multiple times before replacing the workstation (a 20% uplift in performance can quickly pay for itself in a commercial environment). So I have more than once purchased them and found them to offer good performance for the money when upgrading an existing older system. But it does seem strange to be looking at the current generation of cards as a possible cost-effective GPU solution for a new build :-)
How exactly did you overclock them ?
If you could unlock the voltage limiter and use a better cooler, that little card would be a beast.
Would be nice to see Radeon Pro W6800 tested also.
Steve mentioned this in the video: you have to factor in that the power delivery is only designed for 140W. It's a "Quadro", so I'd guess it's designed to a higher spec than the regular consumer cards, but it could still shorten its lifespan considerably. IMHO if you wanted a beast you'd get a 3080/Ti; you'd get the A4000 if you want a low-power card.
This cooler is absolutely perfect for it. Blower-style cooling is the only "right" cooling for a GPU. You can put 3-7 of these cards in a common computer case to have a super nice rendering station. Or a mining station (63MH/s per card). It's not for stupid things like gaming.
@@petkogeorgiev2103 I've always found the Quadro lineup to be very innovative for what it offered, especially with the ECC VRAM and the lower clocks and power draws. Granted, yes, it is a 140W card, but that doesn't mean it's outside of its theoretical potential if it could be done.
@@SuperSuperka I agree. The single-slot design is almost a lost art for desktop PCs that now have huge bulky chungus coolers, whereas this little fellow with a blower is keeping pace.
I remember one of the last single slot solutions for gaming I ever saw was a GeForce GTS 460 or something in that line up from Sapphire before they went fully AMD and Nvidia dropped the GTS and GT lineups.
@@ReaperX7 I have a 2060 Super Turbo and a 3080 Turbo in one case, just 2 slots each, rendering CGI. Both are quiet when gaming, and only with long renders (which are rare) do they become noisy. I would love to have just one GPU; this A4000 is a perfect match.
Would love to see Blender performance and some other productivity charts.
Looking to upgrade from my 1050ti at some point
There are watercooling options for the A4000 out there now. Would these help keep it cool enough to contend with higher-end cards?
Sadly there aren't any blocks yet. EK offer a 4000 block but that's a different product.
Bykski Full Coverage GPU Water Block and Backplate for Leadtek RTX A6000.
Where is the RTX A4000 :'(
Not having RGB makes this card 15% slower. Missed opportunity from Nvidia.
can you please test its capabilities for video editing or rendering? Thank you
Jesus Christ. I got my 3070 FE for $479 at launch and the prices now sound absolutely mental
Bruh, I would literally die to get a 3070 now for even $579
@@slay3rgamingyt Ik I feel so grateful now. I got that 5600x and the series X for RRP too at launch 😂
The fact that it didn't use much more wattage with the OC was interesting
This is expected, given the known power/performance curves and the fact that this card is by far limited by power consumption (rather than voltage, apparently)
Nvidia always gave dogshit tier silicon to gamers. We just don't (won't / can't) pay money out the ass.
He kept the same voltage, that's why the wattage didn't change much
Just wondering if it's possible to compare entry level with entry level? I'm thinking T400 vs GT 1030 and other cards in the GT 1030's budget realm, along with maybe asking why they don't make newer budget/entry-level stuff anymore, at least on the gaming side of things?
Great video. Want to see the new studio in action, and a second new-studio build video.
Thanks Steve!
If you're going to call yourselves "Gamers Nexus" you better grow your hair out in glorious fashion to match the vibe of the channel!
HUB steve with long hair would be a sight to behold.
There's only room for one tech jesus. I wanna see a hair battle 😂
I have seen many people saying it is thermal throttling. Is that true?
Thanks for this video. I would love to see how it performs in Blender compared to the 3070, especially scenes with multiple particle systems. Thanks!
Thanks, Steve.
The video we never knew we needed
... about the product we never knew we didn't need. Pity - but it was worth checking!
That's because you don't need it.
@@rather-reverend True. Very true. I just saw it's priced the same as a 3080 Ti here in Sweden. I guess if you own a business and need like a hundred GPUs, then this might make sense. Low power, quantity discounts and, most important, the sales tax write-off (VAT = 25% in Sweden) 😁
@@EyesOfByes Under normal circumstances these guys are considerably more expensive than the GPUs we want - what they're really good for is industrial applications. (Just don't ask me which ones!) The fact that we gamers are even thinking about buying the things is kind of crazy...
What about comparing 3D software, video editing and other programs?
Seems to me, as far as cooling is concerned, it's a great candidate for one of those silent passive-cooling case setups... a big heatsink bolted to this thing would probably fix any cooling/noise concerns, although that would obviously take away the single-slot form factor advantage it does have...
Benefits are the unlocked NVENC encoder and the RAM. Great Plex server card, too.
Ayyyyy Steve did it ❤️❤️❤️❤️
Seeing that the 3070 and A4000 share so much.. would the cooler from the 3070 fit the A4000's PCB?
What are the temps on this card?
Could be good in a small form factor machine though, looks like a fun product
So long as the noise isn't a concern for you. My thoughts immediately went to the same place, but I worry about the noise in my media centre.
@@ArtifactSkyline would make a good LAN Party box, well - if the global situation was different...
@@amirpourghoureiyan1637 no doubt. You would be wearing headphones at events like, right?
@@ArtifactSkyline was thinking more along the lines of music blasting and people being loud lol.
@@amirpourghoureiyan1637 Fair. Never been to one, not sure on the environment.
This is a very interesting investigation. If Nvidia could beef up the VRMs and give it a waterblock, this card might have amazing performance at great efficiency.
This is just stupid. This card is supposed to be run three to seven at a time in just a common PC case.
Or you could just undervolt and underclock a regular gaming GPU and get the same performance and efficiency as this thing does. With a lot better cooler.
@@Kepe A regular gaming GPU shits its pants and you call that "a lot better cooler"?
@@SuperSuperka What? Gaming GPUs have better coolers than the A4000, and if you undervolt and underclock a gaming GPU you'll get the efficiency of an A4000 with a better cooler than the A4000 has.
@@Kepe You heard right. I have two blower-style GPUs. Their fans can reach 5000 rpm. They are crazy good at cooling. Now go ahead and put 2-3 GPUs in your PC to make it a working machine and not some schoolboy-stupid gamer PC.
Extremely interesting video, thanks for taking a look at this GPU! That form factor would be great for a very thin mini ITX build, hmm..
Can you look at the RTX A2000? I want to see how much performance a low-profile card can get
If you're looking for low power consumption, buy the A4000.
It's extremely efficient
You can also reach this goal with normal cards by limiting the TDP to 140 W. This card is also not efficient at high resolutions, when it's pulling nearly a constant 140 W, because the VRMs heat up and lose over 20% efficiency compared to cold conditions. You also lose your overclock/offset curve if the card is forced to throttle under 1300 MHz. Nvidia, ever the money maker, saved money on all their FE PCBs by keeping them short, so the (main) VRMs on all their Ampere cards sit on the left side. With a blower concept this is deadly: hot air at about 70 °C preheats the VRMs, which then exceed 90 °C and continuously lose efficiency. In worst-case conditions the TDP is then only good for 900 MHz. In that case you are faster if you limit the voltage by editing the curve to a flat line at 0.70 V in Afterburner. That also forces the memory controller down to a 5000 MHz VRAM clock, which saves energy and is still faster than a 7000 MHz VRAM clock with the GPU clock crippled to 900 MHz by the TDP.
The only solution is a better aftermarket cooler, or gluing a bunch of little copper heatsinks to the back of the card where the VRMs are located.
@@kaiateisenbahn4112 Thanks, I'll try this with my 3090 Ti then, which I got 3 weeks ago
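The TDP-limiting approach discussed in this thread can be sketched from the command line. This is only a sketch, assuming a machine with the Nvidia driver's `nvidia-smi` tool, a card whose power limit is adjustable, and root access; the 140 W target is borrowed from the A4000's board power, not a recommendation for any particular card.

```shell
#!/bin/sh
# Sketch: cap a regular GeForce card at the A4000's 140 W board power.
# Assumptions: Nvidia driver installed, adjustable power limit, root access.
TARGET_W=140

# Check the permitted range first ("Min Power Limit" / "Max Power Limit"):
#   nvidia-smi -q -d POWER
# Then apply the cap (resets on reboot unless persistence mode is enabled):
#   sudo nvidia-smi -pl "$TARGET_W"

echo "target power limit: ${TARGET_W} W"
```

Note this only caps board power; the flat voltage/frequency curve in Afterburner described above goes further by also pinning the voltage.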
Honestly, Steve, we do care about video editing and 3D rendering. It may be the main reason I would consider the A4000
Huh? These are not designed for video editing or 3D. (It's already benchmarked at 4% slower than a 3060 Ti at rendering.)
These are data science / FP64 compute cards.
Especially given that it has double the memory. And yes, production rendering is different from rendering games. Pure compute and RT cores.
Hey dude, can you review the Radeon Pros for gaming? :D
I would like to see the T1000 and RTX A2000 desktop GPUs to be reviewed, both of them have the appeal for those with SFF pre-built systems looking for a more powerful GPU than the currently-available GTX 1650 low-profile options.
Please explore advantages in professional use. I always wanted to know, how much I would actually gain in a professional application by paying that little bit extra.
Keyshot 10 GPU mode would be a great example to test.
Graphics cards don't know what a "professional" application is - there is no such thing. They are slower for everything except FP64 compute tasks where they excel, like data science. There are some very niche engineering apps like Solidworks & Catia that also benefit.
@@glowyboi7175 That's not true. There are differences in driver optimization between the cards which leads to a difference in performance that is beyond the actual capabilities of the cards. The card might not know the difference between the applications, but the drivers certainly do and that makes a difference.
Also, I use Solidworks on a daily basis and I would not call it a niche application; it is one of the most widely used 3D modelling packages, and I know the drivers alone already lead to a feature difference and a performance difference as well.
@@jakob.k_design Tiny compared to the gaming market. Niche is appropriate.
@@jakob.k_design You are wrong. Driver certification does not mean optimization. The certifications are just tests which can be done with any graphics card. I've been using Max & Maya for 20 years with GPU rendering in the last 5 years, both GeForce & Quadro. The latter are always slower and there are loads of reviews and benchmarks confirming this. No 3D animators or video editors use Quadros unless their company forces them to - this is not 2007 when Quadros were actually needed lol.
@@glowyboi7175 For fuck's sake, I never said a Quadro would be faster in every professional application, I did not even say it would be faster for rendering, I just want to know the difference between the cards.
AND are you kidding me, there is more than just 3D animation and rendering that these cards are used for. Saying we don't need Quadro cards because GeForce cards are faster for rendering just shows how limited your point of view is.
Btw, I use RTX cards for rendering as well and I am aware that they are pretty much always faster than the Quadro counterparts; that does not mean there is no reason for the Quadros to exist.
What is the A4000?
Who is the A4000 for?
I'll do you one better.
Why is the A4000?
But you'll never ask:
How is the A4000?
@@thenotsookayguy Where is the A4000?!
@@StefanEtienneTheVerrgeRep At your local Woolies.
Please make a comparison between RTX A4000 and the gaming cards in REVIT
It is simply insane that RTX 3060 Tis are $1000+ at retail.
Would be nice for a super compact ITX build, especially if you consider how small the PCB is with a water block.
At 4K 60, I don't think it really matters if I'm getting 62 or 72 so long as it's a consistent 60 fps. The better question is: does the RTX A4000 support DLSS 2.0?
Wish Nvidia would make single-slot GPUs this efficient and powerful aimed at gaming again. Not everyone has room for a big card, nor the ability to afford the huge power bills that come with them. I hate how Nvidia treats everyone like idiots with more money than brains, not caring about efficiency and whatnot.
I think if you could get just 40 W more power into the A4000 it would match the RTX 3070. It's the power limit that holds the A4000 back.
And it's the stupid single-slot blower design that forces the low TDP.
@@glowyboi7175 That, and the power delivery, which is set up for 140 to 160 watts. The card's power inputs are limited to 225 watts, which is far less than the maximum of the 3070.
With a single-slot design and raw power between the 3060 Ti and the 3070, I honestly might consider it as an add-in GPU for Blender. Given that Blender doesn't care about PCIe bandwidth, I could even use the second x16-size slot (that's actually an electrical x4 slot) to put that bad boy in. I'll keep my eyes open for a potential second video of yours and for an interesting listing!
Can I combine this card with a RTX A6000 card for rendering?
It feels like people are junkies craving something to buy.