NVIDIA Making CPUs, New RTX A5000 & A6000 GPUs, & Deep Learning
- Added 2 Jun 2024
- NVIDIA announced its new CPU today, a collaboration with ARM yielding the "Grace" CPU. The company is also releasing new RTX GPUs (the A4000, A5000, A6000).
Sponsor: Buy Corsair's 5000D Airflow Case on Amazon (geni.us/cnVP60)
We're already halfway through our initial new GN 'Volt' Large Modmat inventory. Grab one on the store if you want it this round: store.gamersnexus.net/products/modmat-volt-large
You can also pick up an Original Modmat (also in stock & shipping) on the store: store.gamersnexus.net/products/modmat
NVIDIA's GTC 2021 conference saw the unveiling of a series of new GPUs for professional and data center applications, but all of these were overshadowed by NVIDIA's renewed interest in the CPU space. NVIDIA has been trying to purchase ARM since 2020 (currently pending regulatory approval), and in the meantime, the two organizations are working together to produce the new NVIDIA Grace CPU for data center and server applications. The NVIDIA Grace CPU aims to improve the bandwidth of CPU-to-GPU and CPU-to-CPU communication, outperforming traditional PCIe signaling with NVIDIA's NVLink.
The GPUs announced are part of the former Quadro line, branding that NVIDIA is slowly moving away from. The A5000 and A4000 sit at the high end of the Ampere RTX professional cards, with mobile versions also available in Max-Q laptops.
Separately from all of this, NVIDIA talked more about its Omniverse software and a few RTX games, including Black Myth: Wukong, Boundary, Naraka: Bladepoint, and Bright Memory. Cyberpunk also got a comically short pseudo-reference.
Like our content? Please consider becoming our Patron to support us: / gamersnexus
TIMESTAMPS
00:00 - NVIDIA GTC 2021 Announcements
01:03 - NVIDIA's New "Grace" ARM CPU
05:11 - New GPUs from NVIDIA: A4000, A5000
10:51 - A10, A16, & Omniverse
** Please like, comment, and subscribe for more! **
Links to Amazon and Newegg are typically monetized on our channel (affiliate links) and may return a commission of sales to us from the retailer. This is unrelated to the product manufacturer. Any advertisements or sponsorships are disclosed within the video ("this video is brought to you by") and above the fold in the description. We do not ever produce paid content or "sponsored content" (meaning that the content is our idea and is not funded externally aside from whatever ad placement is in the beginning) and we do not ever charge manufacturers for coverage.
Follow us in these locations for more gaming and hardware updates:
t: / gamersnexus
f: / gamersnexus
w: www.gamersnexus.net/
Editorial, Host: Steve Burke
Editorial: Patrick Lathan
Video: Andrew Coleman
Watch our recent IGP benchmark for a comparison of Intel's new integrated graphics performance: czcams.com/video/2H1B7ibjJZg/video.html
tater-tot jokes aside, I and I'm sure many others really do appreciate those iGPU tests, as un-glorious as 11th gen and APU must sound to some at this point.
Great vid steve and crew. B)
10k years of work? WOW, they've been working on this one since before Christ!
@@goofball1_134 ouch! lol, NICE!
burn hotline.... lol.
** As reported on my local TV channel: we may be unable to buy new chips. Manufacturers are stopping computer silicon production unless you're buying a new vehicle 😭
you and everyone else should never mention nvidia again or review anything they do after the declared intentions spoken at the keynote, which precisely describe a world where nvidia will take control of everything you do with ai... that includes you giving away your computers and being an nvidia slave forever, having to pay them whatever they want for you to do whatever is necessary online via their servers and 5G infrastructure... it is incredible you have not understood how delusional jensen huang is, and that he is basically setting up the 4th reich where ai will control all of our lives and take away any thought or memory of what freedom was.
Soon choosing a cpu is going to be like choosing your starter pokemon
The problem here is that the starting pokemon uses cunning or shady business deals to push out competition and make it worse for consumers...
Oh wait, those spending less than $1000 on a single piece of hardware aren't considered consumers...
"Pick this pokemon, it poops out rare candies!"
or "Special nvidia shades come with the choice of this pokemon, wear them and your pokemon looks sexier than the others"
Fire, water or plant type. Imo, blue is my favorite color but I still pick Bulbasaur every time.
@@Werloxali imagine picking bulbasaur lmfaooo
this comment was made by charmander gang
@@Werloxali I like Blue too, but i don't like supporting Intel, for ethical reasons.
@@dra6o0n all corporation bad and greedy, must say bad thing about corporation and demonize corporation.
Now we got RGB teams (Red - AMD) (Green - Nvidia) (Blue - Intel)
Yealol
Competition is always good for the consumer.
@@Kromiball so is rgb
Blue is my favorite color...
Ultimate fps
Nvidia: We're going to start making ARM CPUs!
Tegra: Wait... What???
tegra is just the gpu part
@@Zero11s No. Tegra is the entire SoC.
nVidia have already been making ARM CPUs for years. Look at the Jetson Nano
@John-Paul Hunt Nintendo already uses an Nvidia CPU
Server CPU ARMs race.
I'll go now.
Ha. Ha. Ha. Ha. Ha. Ha. Ha.
Ha. Ha. Ha. Ha. Ha. Ha. Ha.
Ho. Hee. Hee.
Ah-hah. Ah-Hoo. Hee. Ha. Ah-hah.
I thought my jokes were bad.
@@xwaltranx 🃏🃏🃏
Key words: server CPU. How many top 500 companies are going to buy into Nvidia's CPU?
2:15 I choose to think that this is a single immortal engineer working from the dawn of time. Prometheus, bringing humans the flame, then inventing the wheel, then single-handedly dragging humanity all the way through the stone age, bronze age, metal, renaissance, industrial revolution, and digital age to finally bestow upon us their finest creation, a culmination of their life's work: a CPU that people will use to simulate realistic dong physics.
Exactly. I had to rub my eyes and look again when I saw their claim: 10,000 engineering _years._
Has the AI multiplied human lifetime to achieve that? 🤔😂
So AMD has been making both CPUs and GPUs for a while, Intel just started with a GPU division, and now NVIDIA is starting to make CPUs?
These are strange times.
They have been making ARM CPUs for years. See their Tegra line of ARM mobile processors.
@@matthewbilker3401 Yes, but unless you are a nerd, you don't know about that, and it's not the same. You know what he's saying.
@@morpheas768 ARM designs CPUs. They license these designs to companies like Apple, Nvidia, Qualcomm, Samsung, etc... I am pretty sure they don't physically make anything you can buy. Designing a CPU is closely guarded science that will continue to be controlled by national defense agendas for many decades to come. I will be VERY SURPRISED if Nvidia is allowed to buy ARM. It raises way too many DoD flags.
@@morpheas768 It's not Arm making the CPUs... It's gonna be TSMC or Samsung actually manufacturing them. Arm only licenses its design IP; it does not make anything. Also, regardless of whether Nvidia acquires Arm or not, I think they will go ahead with this project anyway. Although... it's quite clear the deal is going through and Nvidia + Arm will fuse together. I mean, "Grace" CPU... so obvious. The next-gen Nvidia GPU is called "Hopper". So the Nvidia + Arm marriage translates to Grace Hopper. Nvidia really is good with their naming schemes.
@@KeithZim Nah, it will go through. There are no real objections from anyone, except maybe China. But they can go ahead and approve the deal even without Chinese approval. Arm China has kinda been taken over by the CCP anyway.
5:36 funny typo: Server-Class Performance *512MB* RAM for $149k :D
:)
Hahaha, how on earth did nvidia miss that lmao
HAHAHA xD
It's crazy the amount of computing power needed to simulate things that just happen IRL. Like water
Well if you wanted to truly simulate reality you would have to simulate every single atom.
@@lycanthoss if you truly simulated reality you would be making reality 🙂
It just looks like 'they just happen' to us because we don't understand its complexity
Well, it did take many billions of years iterating the laws of physics before it just happened, and who knows what came "before" that.
Lol
Nvidia making CPUs, Intel is making GPUs, cats and dogs live together...
The end is here guys! These are the signs!
Is this true?
Yes. This man has no dick.
@@jedidethfreak damn. two days too late.
you forgot the mass hysteria
Is no one talking about Megatron, their transformers ? that made me chuckle.
Is no one talking about Terminator 3?
Right? Did no one look at that and have an "are we the baddies" moment?
*M E G A T R O N*
13:56 Shattered Horizon is one of my all time favorite shooters. Seeing it get a sequel is one of the best announcements of the year.
wheeee :D
Same I loved that game
Can we choose RGB CPUs now?
Red : AMD
Green: NVIDIA
Blue: INTEL
The CPU inside one of the most popular video game consoles right now, the Nintendo Switch, is an Nvidia Tegra ARM CPU. It's always nice to see that they're doing stuff, but it's not new for them
As for the bandwidth between CPU and GPU, or host-to-device: it's almost always the bottleneck in scientific calculations not natively written for GPUs. Having a lot of it lets developers avoid rewriting their entire code to "mask" these copy times.
That is also the reason why most LBM codes are awfully slow, and why I eventually developed my own in OpenCL from the ground up. No host-device transfer, and yikes, suddenly 100% hardware efficiency and ~150x faster than the half-baked GPU codes of the competition.
Ok, but the vast majority of HPC codes or study cases don't fit into GPU memory at once, and they were also developed (worst case) back in the 60s-70s. So you can't just do that :)
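To put a rough sense of scale on the copy times being discussed in this thread, here is a back-of-the-envelope sketch. The bandwidth figures are approximate public peak numbers (not measured values), and the helper function name is just for illustration:

```python
# Ideal-case (no latency, peak bandwidth) transfer times for a large buffer
# over different host<->device links. Bandwidths are rough public peaks.
GIB = 1024 ** 3

def transfer_time_ms(num_bytes, bandwidth_gib_per_s):
    """Milliseconds to move num_bytes at the given bandwidth (ideal case)."""
    return num_bytes / (bandwidth_gib_per_s * GIB) * 1000.0

buffer_bytes = 4 * GIB  # a 4 GiB working set
links = {
    "PCIe 3.0 x16 (~16 GiB/s)": 16,
    "PCIe 4.0 x16 (~32 GiB/s)": 32,
    "NVLink (~600 GiB/s aggregate)": 600,
}

for name, bandwidth in links.items():
    print(f"{name}: {transfer_time_ms(buffer_bytes, bandwidth):.1f} ms")
```

Under these assumptions a 4 GiB copy costs on the order of 100+ ms over PCIe but single-digit ms over NVLink, so if the kernel consuming the buffer finishes in a few milliseconds, the PCIe copy dominates; hence keeping data resident on the GPU or overlapping copies with compute.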
Thank you sooo much. I do simulations and it’s sooo hard to find good reviews/news for these cards. I really do appreciate this
Man, your channel is so detailed on the specs
Was just about to go to sleep... Thanks for making it easier GN. I'll be passed out before the 5 min mark
Thank you, Steve! You truly are Tech Jesus!
So you could in the future have a Nvidia CPU, coupled with an Intel GPU.
No. It's a specialized ARM product targeted at feeding GPUs what they need when they need it. You can't run anything that is interesting to you on it.. because it's, you know, ARM and not x86. That and Nvidia isn't going to sell this as a standalone CPU. It will only be available coupled with Nvidia GPU tech.
With an amd chipset
6:41 "Nvidia Megatron Trains Transformers"
Did no one look at that and have an "are we the baddies" moment?
NVIDIA should make leather jackets.......now that's where the money is.💵
Hahaha
Probably won't be able to get it at MSRP
They would probably end up being nylon faux leather. lul
Jensen Wong cosplay show
@@tacticalcenter8658 yeah lol...
Solidworks + A4000 will be an interesting mix
I can finally crunch the static stress tests in under a day!
@@concentricmachining4636 Maybe it will finally stop crashing! /s
The A4000 is an overpriced RTX 3080 Mobile. Get an RTX 3090 instead for about the same money; it's roughly 2x faster and has 50% more memory.
@@owencoyne6223 I don't even think an act of God would make it more stable.
@@ProjectPhysX I would agree for typical GPU workstation rendering but Solidworks is one of those very rare apps that needs double precision, which GeForce doesn't have, so they run terribly :/
The promos, it just works!
Was about to go to sleep, but this is definitely more important.
Lmao same
Same
Dang, I just got up.
Time Zones n stuff 😝
2a.m here and have to work at 8. Tech Jesus' gospel will grant me strength.
Rofl. Same. I dont even care about this stuff
Just waiting for intel to counter by changing the colour of their logo or something
Or adding one useless pin to their chips. Maybe a slightly thicker IHS. One thing is for sure, they'll do as little as possible and still cash in
"By removing 20% from the core count"*
:D
Can't wait for the possibility of having an all-Nvidia PC
For rendering in Redshift or Octane, are 2x A5000s better than 2x RTX 3090s?
NVIDIA: we dont have a single current gpu in stock.
Also NVIDIA: ok cool lets move on to other adventures...
R&D can't just sit idle if production can't keep up
@@mqayyum9226 I know, but it's sad to look at
Oh is that an EVGA Kingpin 3090 card with an attached LN2 container? Oh I wonder if that's going to be the subject of a future video!
Again? Wtf for?
I would donate to GN's Patreon for Steve's sarcasm let alone the great content.
12:20 haha good one :)
The more you buy,
the more you save.
Its just works
Lol , it just works
Careful, you might trigger that self righteous hypocrite who thinks you're trying to steal someone's identity. BTW, I'm the real pikachu, the rest of you are only cheap imitations.
@@pikachu7314 Sure, I'm real too. But we all know authentic Pikachus only say "pika" and variations of "pikachu"
Just got my tool kit and wireframe mouse pad from the GN store. Awesome in every way. Can't wait to get the new modmat too. Way better than the merch from that other channel 🤔😉
Great video, as always. I watched this a couple of months ago and decided to buy an A4000 for my 2D/3D work. It is a hell of a card for the price, and I hope it will serve me well for years to come. One thing I noticed is that HWiNFO reports GPU temps as high as 85-87 degrees at full load. Is that OK, considering it is a professional GPU with a blower-style cooler? Pro GPUs tend to have higher temp tolerances. What's your opinion on the subject? Thank you.
Hi, I was looking for the Puget Systems NVLink article and couldn't find it; could you provide it? I was hoping it was in the text version of this news piece, but there is none either 😥
5:36
512MB of DDR4 with 8 dimms? Nvidia going old school.
👀 that kingpin on the table
Holy shit, the jokes about NVidia starting to make CPUs from a while back now became a reality
Nvidia have been making CPUs for ages; they have them in the Jetson single-board computers, the Nvidia Shield, and the Nintendo Switch
@@noobian3314 I'm talking about desktop/server CPUs
Any chance you'll cover the thermal pad issue/mod on 3080 & 3090? Wondering if 110C during gaming is acceptable at all for GDDR6X.
The timeline at 4:57: is the rumoured Lovelace now codenamed "Ampere Next"? Or are we only getting Lovelace in 2026? That part is extremely confusing to me.
Just a "name" change. It will get its real name closer to release...
Had a question unrelated to the video. Does the loop volume in a water-cooled build affect the pump adversely? Say 2 gallons as opposed to 20. Would the pump, under ideal conditions, be affected to any measurable degree? My thoughts are that in a closed loop it won't matter.
Maybe I'm missing something here, but what is/are the use case(s) for a mobile, Max-Q workstation GPU?
Quadro went away with A series launch in October... keep up!
14:16 It's not only Cyberpunk, the first few frames are Death Stranding. Probably not the first to notice though...
wow! amazing info and here I am with a 1660 super.
Sweet! My dreams of owning a server grade android phone are coming true.
HOLD THE PHONE. Is that BRAKE CLEANER on the shelf in the upper left portion of the backboard?!
I really like the look of the new modmat. I can't wait till mine arrives. The only thing I don't like is that I won't be able to hang it on the wall. I'm sure not many people will agree, but I think a poster version would be really cool. You could remove the GPU outline grid and just add more information. It's the information, and the way you simplify it, that interests me the most.
Something interesting to note about the supercomputer they announced with their Grace CPUs: they say 50 AI exaflops, which should be FP16. So this computer should be, roughly, around 5 exaflops. That would be around the power of the exascale machines coming up: Aurora from Intel (2022*), Frontier from AMD (2020-2021), and El Capitan, also from AMD (2022-2023). It seems that Nvidia, getting none of the DOE exascale contracts, took it personally and decided to build its own supercomputer
So we should expect them like 2 years after launch.
Retooling takes 6-9 months, but it's all done with Jensen; the mini's been on the market for 2 years.
That's a generous estimation....🤣😂
@Tano yeah its a joke.
@Tano oh dear 🤦🏽🤦♂️ you must be a right laugh on a night out.
Great, can't wait for mining-specific SKUs and handicapped processors!!
Thanks for covering the A100 launch. I hadn't come across the news yet. I use V100s and P100s for hundreds of hours a month training machine learning models in the cloud. Looking forward to the boost in performance when cloud providers upgrade, and I might think about trying to snag a P100 or a couple of K80s as they offload them on the second-hand market. Would love it if you would add a machine learning benchmark to your suite, because I really have no idea how consumer cards compare to these data center cards. I was interested when the RTX 3060 came out with 12GB, because that's about the minimum for most of the models I run. I was wondering how it would compare to a K80 or P100, because they're pretty much the only cards I could afford to invest in without saving up for quite a while. Of course, me affording a 3060 is contingent on their prices returning to list, or at least somewhere near it.
I would not expect the A10 to run quietly, usually they make a brrrrrrrrrrrt noise. Coil whine maybe?
Hasn't nVidia been doing CPUs already, such as Tegra? I suppose it's good to get into developing ARM designs, since I think we'll be seeing a shift towards them as translation layers become more robust.
Apple pushing competitors to up their efforts in that space is something nVidia will probably benefit from in the long term if they wanted to break into desktop computing. Developers are beginning to translate application interfaces for ARM rather than rely on translation performance. Can only hope we get a new generation of fast, affordable, and most importantly POWER EFFICIENT mainstream consumer processors.
Honestly I watch you guys because a friend recommended it. :-)
Edit:
How do you count the bridge spacing? Does it include both slots, one slot or neither?
From the middle of one slot to the middle of the targeted one
@@patricktho6546 So don't count a slot? A 4 slot spacing means 3 slots are unused?
@@gudenau no. I was wrong with my specifics/wording.
A slot has a fixed width that it "occupies". This ranges from a bit above the socket to a bit below. Now we have kind of a table, where all 8 or so slots sit over each other (as seen on the back of cases). If every one of these spots has a socket, then we have a spacing of 1 (not 0, as would be the other possibility). So for 1-slot GPUs we need a bridge that is 1 slot wide, since it is basically just shifted.
If you have 2-slot cards, the bridge needs to go from the 1st to the 3rd slot, so it covers 2 slots. If you leave a bit of space for air, or have a 2.5- or 3-slot card in the first slot, you need a 3-slot bridge, and so on.
I hope this was a better explanation for you.
@@gudenau To the specific question: if every slot has a socket, then with a 4-slot spacing, 3 slots are unusable between the cards you put in, since you need the extra slot to go from the border of the slot to the pins on both sides.
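The counting convention described in this thread can be sketched as a tiny helper (a hypothetical illustration; the function and parameter names are made up, not any official spec): the bridge spacing equals the width of the first card plus any empty slots left for airflow.

```python
def bridge_spacing(card_width_slots, air_gap_slots=0):
    """NVLink bridge spacing in slots: the second card sits right after the
    first card's width plus any empty slots deliberately left for airflow."""
    return card_width_slots + air_gap_slots

# Two single-slot cards side by side -> 1-slot bridge
assert bridge_spacing(1) == 1
# A 2-slot card with the next card immediately after -> 2-slot bridge
assert bridge_spacing(2) == 2
# A 2-slot card plus one empty slot for air -> 3-slot bridge
assert bridge_spacing(2, air_gap_slots=1) == 3
```

So a "4-slot spacing" bridge suits, for example, a 3-slot card with one slot of air, or a 2-slot card with two empty slots between the cards.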
Nvidia: "IT JUST WORKS"
"OUT OF STOCK"
Because the more you buy the more you save
I'm on the edge of buying either an A5000 or an RTX 3090 for deep learning (especially transformers) only.
Do we have any performance comparison between them on AI? Which one should I go with?
Their Enterprise Data Center is located in Connecticut?
Quadro RTXes are not only for certified applications. They are the only ones which can be used in 8x to 10x card servers (e.g. from Supermicro) for machine learning purposes "on a budget", because of their fan design. So, no luck with Titan RTXes for such applications; you have to go with the more expensive Quadros.
If you can't afford A100 NVLink servers, you have to go with Quadros.
Would actually like to see a full Nvidia laptop, just put it in its "Shield" class of products 🤔
Nice - and when will I get my 3080?
Next year... maybe if you are lucky ;)
Is it possible to put together a computer with nvidia cpu, intel gpu, and amd ssd? Seems to be a winning combination. :)
In Europe: wake up at 7, prepare expresso, sit down, watch a fresh new GN video.
Ready for a new day.
As a European you really should know to write "espresso"; greetings from Switzerland :D
@@Misterzen87 Expressivo
@@Misterzen87 you are damn right ;) thank God I am not italian...
@@yusepbcn that would have made nonna very sad haha
Maybe open more factories to make more GPUs?
I got to see this massive server tower that is more advanced than anything I have ever seen. 3,000 TB. That's all I can say about it.
It's not for civilian use. This thing can act before you can think it.
So Intel is making graphics cards, and Nvidia is making CPUs... it's the end of times indeed. Gentlemen, it's been an honor.
13:57 Did he say Boundary? For that space shooting game? Not a native English speaker, dude, but that game looks awesome
I look forward to the day that gamer-grade PCs *need* refrigeration units built in! 😅
@Tano Interesting. Now the question would be, are they affordable enough and worth it? 🤔
@Tano Lol. So you're saying they just need to be cheaper and more efficient? I concur! 🤔😋
In the future, GPUs will be completely forgotten and we will see integrated CPU graphics so good that they can play any game at max settings easily, with more than 50GB of VRAM
cool but will it be in stock
Is DDR5 going to use a new socket or will it be the same 288 pin, DDR4 socket we currently have?
PCIe is up to gen 4 on the same socket, so the DDR4 socket could get reused.
New. DDR has never been backwards compatible and DDR5 will be the biggest change.
Now all they need is an x86 license! (Not likely, but dare to dream)
Looking at my shield and Jetson Nano slightly puzzled.
That is jargon for it being a Mining CPU
A4000 pretty exciting too 👀
Single slot Ampere lookin mad cute
Interesting to note... Bright Memory, Boundary, Naraka Bladepoint, and Black Myth Wukong are all Chinese developed games, AFAIK they're all "indie" too
I’ve been waiting all year for the a5000 to complete my workstation
What software do you use?
14:00 was shattered horizon supposed to be anything more than a multiplayer tech demo? I played it when it came out and it was fucking AWESOME but my PC almost couldn't run it. The required specs were pretty high.
5:38 The amount of RAM for that workstation seems a bit... low.
I'm not surprised they switched back to GDDR6 memory.
They didn't switch back, 6X is not ECC capable.
@@bronzehd6212 ... but still, ECC is what's required for compute GPUs.
Having a shit day
Gamers Nexus releases a video
Having a wonderful day
There's an internal struggle within ARM China that is affecting the nVidia buy-out of ARM.
Can you please add CC (English) for all videos?
Ampere next next. The nextest of the nexts.
5:38 - 512 MB? Megabytes? That's a typo right? Surely they meant GB. Or maybe it's not in reference to system memory, but I struggle to think of another place where 512MB of DDR4 memory would make sense in a modern system.
5000D with attention paid to small detail - giggity :)
6:18 Wait a minute… You have RTX 5000 GPUs? Where did those come from?! Do you have a time machine?
How do embargos work for products that not even the manufacturer know exists yet??
He said "not the A series". The RTX 5000 is a previous-generation Quadro card.
Pretty sure the manufacturer knows about the A series if they are on nVidia's website.
The DGX Station... cooled with a custom refrigerant loop and a custom-machined CPU block, even the GPUs are in the loop... I nearly creamed my pants
Come on man I was just going to bed
Haha and i just woke up.. I am from Bangladesh 😁
@@alokff2561 me too
@@alokff2561 you play free fire?
Perfect timing
Should I buy a laptop with 10th gen i7 and QUADRO RTX 5000 or wait for 11th gen i7 and A4000??
Amogus in the top left shelf?!
OK I'm sorry, I'll shut up now.
Please review Nvidia Quadro RTX A4000
Mini cpu inside nextgen RTX graphics card????👀
The CPU industry needs Nvidia's... smart marketing tactics!
5:37 wow 512MB of memory! That ought to be enough for anyone.
7:22 I think he means 16gb
With our current GPU market, I welcome our new overlord
Haven't Nvidia been producing ARM CPUs (SoCs, strictly speaking, but Ryzen is also a SoC, so I digress) for ages? E.g. the Tegra X1, which is used in the Shield and Switch and many others for AI
ARM creates the specifications for CPUs. I don't see how Nvidia's ARM-based SoCs (which it has been creating for a decade) wouldn't be considered "Nvidia" CPUs (if you would also consider an Nvidia x86 chip its own CPU); any x86 CPU would be based on Intel and AMD specifications and instruction sets