NVIDIA Making CPUs, New RTX A5000 & A6000 GPUs, & Deep Learning

  • Uploaded June 2, 2024
  • NVIDIA announced its new CPU today, a collaboration with ARM yielding the "Grace" CPU. The company is also releasing new RTX GPUs (the A4000, A5000, A6000).
    Sponsor: Buy Corsair's 5000D Airflow Case on Amazon (geni.us/cnVP60)
    We're already half-way through our initial new GN 'Volt' Large Modmat inventory. Grab one on the store if you want it this round: store.gamersnexus.net/product...
    You can also pick up an Original Modmat (also in stock & shipping) on the store: store.gamersnexus.net/product...
    NVIDIA's GTC 2021 conference saw the unveiling of a series of new GPUs for professional and data center applications, but all of these were overshadowed by NVIDIA's renewed interest in the CPU space. NVIDIA has been trying to purchase ARM since 2020 (currently pending regulatory approval), and in the meantime, the two organizations are working together to produce the new NVIDIA Grace CPU for data center and server applications. The NVIDIA Grace CPU aims to improve the bandwidth of CPU-to-GPU and CPU-to-CPU communication, outperforming traditional PCIe signaling with NVIDIA's NVLink.
    The GPUs announced are part of the former Quadro line, branding that NVIDIA is slowly moving away from. The A5000 and A4000 sit at the high end of the Ampere RTX professional lineup, with mobile versions also available in Max-Q laptops.
    Separately from all of this, NVIDIA talked more about its Omniverse software and a few RTX games, including Black Myth: Wukong, Boundary, Naraka: Bladepoint, and Bright Memory. Cyberpunk also got a comically short pseudo-reference.
    Like our content? Please consider becoming our Patron to support us: / gamersnexus
    TIMESTAMPS
    00:00 - NVIDIA GTC 2021 Announcements
    01:03 - NVIDIA's New "Grace" ARM CPU
    05:11 - New GPUs from NVIDIA: A4000, A5000
    10:51 - A10, A16, & Omniverse
    ** Please like, comment, and subscribe for more! **
    Links to Amazon and Newegg are typically monetized on our channel (affiliate links) and may return a commission of sales to us from the retailer. This is unrelated to the product manufacturer. Any advertisements or sponsorships are disclosed within the video ("this video is brought to you by") and above the fold in the description. We do not ever produce paid content or "sponsored content" (meaning that the content is our idea and is not funded externally aside from whatever ad placement is in the beginning) and we do not ever charge manufacturers for coverage.
    Follow us in these locations for more gaming and hardware updates:
    t: / gamersnexus
    f: / gamersnexus
    w: www.gamersnexus.net/
    Editorial, Host: Steve Burke
    Editorial: Patrick Lathan
    Video: Andrew Coleman
  • Games

Comments • 642

  • @GamersNexus • 3 years ago +64

    Watch our recent IGP benchmark for a comparison of Intel's new integrated graphics performance: czcams.com/video/2H1B7ibjJZg/video.html
    We're already half-way through our initial new GN 'Volt' Large Modmat inventory. Grab one on the store if you want it this round: store.gamersnexus.net/products/modmat-volt-large
    You can also pick up an Original Modmat (also in stock & shipping) on the store: store.gamersnexus.net/products/modmat

    • @Zarcondeegrissom • 3 years ago

      Tater-tot jokes aside, I and I'm sure many others really do appreciate those iGPU tests, as un-glorious as 11th gen and APUs must sound to some at this point.
      Great vid, Steve and crew. B)

    • @goofball1_134 • 3 years ago +1

      10k years of work? WOW, they've been working on this one since before Christ!

    • @Zarcondeegrissom • 3 years ago

      @@goofball1_134 ouch! lol, NICE!
      burn hotline.... lol.

    • @philhacker1137 • 3 years ago

      As reported on my local TV channel: we may be unable to buy new chips. The manufacturers are stopping building computer silicon unless you're buying a new vehicle 😭

    • @AbyNeon • 3 years ago

      You and everyone else should never mention NVIDIA again, or review anything they do, after the declared intentions spoken at the keynote, which precisely describe a world where NVIDIA will take control of everything you do with AI... that includes you giving away your computers and being an NVIDIA slave forever, having to pay them whatever they want for you to do whatever is necessary online via their servers and 5G infrastructure... it is incredible you have not understood how delusional Jensen Huang is, and that he is basically setting up the 4th Reich, where AI will control all of our lives and take away any thought or memory of what freedom was.

  • @chumbo1948 • 3 years ago +1007

    Soon choosing a CPU is going to be like choosing your starter Pokémon

    • @dra6o0n • 3 years ago +97

      The problem here is that the starter Pokémon uses cunning or shady business deals to push out competition and make it worse for consumers...
      Oh wait, those spending less than $1000 on a single piece of hardware aren't considered consumers...
      "Pick this Pokémon, it poops out rare candies!"
      or "Special NVIDIA shades come with the choice of this Pokémon; wear them and your Pokémon looks sexier than others"

    • @Werloxali • 3 years ago +36

      Fire, water, or grass type. Imo, blue is my favorite color, but I still pick Bulbasaur every time.

    • @Perseagatuna • 3 years ago +60

      @@Werloxali imagine picking Bulbasaur lmfaooo
      this comment was made by Charmander gang

    • @tengkualiff • 3 years ago +15

      @@Werloxali I like blue too, but I don't like supporting Intel, for ethical reasons.

    • @zen608 • 3 years ago +28

      @@dra6o0n all corporation bad and greedy, must say bad thing about corporation and demonize corporation.

  • @TanTan-gl5ic • 3 years ago +85

    Now we've got RGB teams (Red - AMD) (Green - Nvidia) (Blue - Intel)

  • @antontaylor4530 • 3 years ago +96

    Nvidia: We're going to start making ARM CPUs!
    Tegra: Wait... What???

    • @Zero11s • 3 years ago

      Tegra is just the GPU part

    • @johnbuscher • 3 years ago +27

      @@Zero11s No. Tegra is the entire SoC.

    • @robvdl • 3 years ago

      nVidia has already been making ARM CPUs for years. Look at the Jetson Nano.

    • @noobian3314 • 3 years ago

      @John-Paul Hunt Nintendo uses an Nvidia CPU already

  • @JB-ym4up • 3 years ago +68

    Server CPU ARMs race.
    I'll go now.

    • @xwaltranx • 3 years ago +6

      Ha. Ha. Ha. Ha. Ha. Ha. Ha.
      Ha. Ha. Ha. Ha. Ha. Ha. Ha.
      Ho. Hee. Hee.
      Ah-hah. Ah-Hoo. Hee. Ha. Ah-hah.
      I thought my jokes were bad.

    • @swaggitypigfig8413 • 3 years ago +4

      @@xwaltranx 🃏🃏🃏

    • @evacody1249 • 3 years ago

      Key words: server CPU. How many top-500 companies are going to buy into Nvidia's CPU?

  • @Alexander-dn5up • 3 years ago +26

    2:15 I choose to think that this is a single immortal engineer working from the dawn of time. Prometheus, bringing humans the flame, then inventing the wheel, then single-handedly dragging humanity all the way through the stone age, bronze age, metal, renaissance, industrial revolution, and digital age to finally bestow upon us their finest creation, the culmination of their life's work: a CPU that people will use to simulate realistic dong physics.

    • @Carcinogenic2 • 3 years ago +2

      Exactly. I had to rub my eyes and look again when I saw their claim: 10,000 engineering _years._
      Has the AI multiplied human lifetime to achieve that? 🤔😂

  • @rikwisselink-bijker • 3 years ago +357

    So AMD has been making both CPUs and GPUs for a while, Intel just started with a GPU division, and now NVIDIA is starting to make CPUs?
    These are strange times.

    • @matthewbilker3401 • 3 years ago +61

      They have been making ARM CPUs for years. See their Tegra line of ARM mobile processors.

    • @jonathansoko1085 • 3 years ago +20

      @@matthewbilker3401 Yes, but unless you are a nerd, you don't know about that, and it's not the same. You know what he's saying.

    • @KeithZim • 3 years ago +6

      @@morpheas768 ARM designs CPUs. They license these designs to companies like Apple, Nvidia, Qualcomm, Samsung, etc... I am pretty sure they don't physically make anything you can buy. Designing a CPU is a closely guarded science that will continue to be controlled by national defense agendas for many decades to come. I will be VERY SURPRISED if Nvidia is allowed to buy ARM. It raises way too many DOD flags.

    • @teemuvesala9575 • 3 years ago +1

      @@morpheas768 It's not Arm making the CPUs... It's gonna be TSMC or Samsung actually manufacturing them. Arm only licences its design IP; it does not make anything. Also, regardless of whether Nvidia acquires Arm or not, I think they will go ahead with this project anyway. Although... it's quite clear the deal is going through and Nvidia + Arm will merge. I mean, "Grace" CPU... So obvious. The next-gen Nvidia GPU is called "Hopper". So the Nvidia + Arm marriage translates to Grace Hopper. Nvidia really is good with their naming schemes.

    • @teemuvesala9575 • 3 years ago +1

      @@KeithZim Nah, it will go through. There are no real objections from anyone, except maybe China. But they can go ahead and approve the deal even without Chinese approval. Arm China has kinda been taken over by the CCP anyway.

  • @ProjectPhysX • 3 years ago +118

    5:36 funny typo: Server-Class Performance *512MB* RAM for $149k :D

  • @nickmudd • 3 years ago +96

    It's crazy the amount of computing power needed to simulate things that just happen IRL. Like water.

    • @lycanthoss • 3 years ago +34

      Well, if you wanted to truly simulate reality, you would have to simulate every single atom.

    • @gosugirl1 • 3 years ago +33

      @@lycanthoss if you truly simulated reality you would be making reality 🙂

    • @xentiment6581 • 3 years ago +11

      It just looks like they 'just happen' to us because we don't understand their complexity

    • @roqsteady5290 • 3 years ago +5

      Well, it did take many billions of years iterating the laws of physics before it just happened, and who knows what came "before" that.

    • @Moon_Presence • 3 years ago

      Lol

  • @leinadreign3510 • 3 years ago +58

    Nvidia making CPUs, Intel is making GPUs, cats and dogs live together...
    The end is here guys! These are the signs!

    • @jedidethfreak • 3 years ago +1

      Is this true?
      Yes. This man has no dick.

    • @AC3handle • 3 years ago

      @@jedidethfreak damn. Two days too late.

    • @psycronizer • 3 years ago

      you forgot the mass hysteria

  • @21preend42 • 3 years ago +92

    Is no one talking about Megatron, their transformer framework? That made me chuckle.

  • @budthecyborg4575 • 3 years ago +14

    13:56 Shattered Horizon is one of my all time favorite shooters. Seeing it get a sequel is one of the best announcements of the year.

  • @timo1294 • 3 years ago +10

    Can we choose RGB CPUs now?
    Red : AMD
    Green: NVIDIA
    Blue: INTEL

  • @joshhardin666 • 3 years ago +1

    The CPU inside one of the most popular video game consoles right now, the Nintendo Switch, is an Nvidia Tegra ARM CPU. It's always nice to see that they're doing stuff, but it's not new for them.

  • @fabienleydier • 3 years ago +53

    Regarding the bandwidth between CPU and GPU, or host-to-device: it's almost always the bottleneck in scientific calculations not natively written for GPUs. So having a lot of it spares developers from rewriting their entire code to "mask" these copy times (a rough sketch of that masking follows this thread).

    • @ProjectPhysX • 3 years ago +1

      That is also the reason why most LBM codes are awfully slow, and why I eventually developed my own in OpenCL from the ground up. No host-device transfer and, yikes, suddenly 100% hardware efficiency and ~150x faster than the half-baked GPU codes of the competition.

    • @fabienleydier • 3 years ago

      Ok, but the vast majority of HPC codes or study cases don't fit into GPU memory at once, and they were also developed (worst case) back in the '60s-'70s. So you can't just do that :)
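
      A minimal sketch of the copy-masking idea discussed in this thread, assuming a chunkable, element-wise workload; the names, sizes, and kernel here are illustrative assumptions, not anything from the video:

      ```cuda
      // Sketch: hide host-to-device copy time behind compute by chunking
      // the data and alternating between two CUDA streams.
      #include <cstdio>
      #include <cuda_runtime.h>

      __global__ void scale(float *d, int n, float k) {
          int i = blockIdx.x * blockDim.x + threadIdx.x;
          if (i < n) d[i] *= k;
      }

      int main() {
          const int N = 1 << 24, CHUNK = 1 << 20;  // CHUNK divides N
          float *h, *d;
          cudaMallocHost((void**)&h, N * sizeof(float)); // pinned memory, required for true async copies
          cudaMalloc((void**)&d, N * sizeof(float));
          for (int i = 0; i < N; ++i) h[i] = 1.0f;

          cudaStream_t s[2];
          cudaStreamCreate(&s[0]);
          cudaStreamCreate(&s[1]);

          // While one stream computes on chunk k, the other stream is already
          // copying chunk k+1, so transfer time is largely hidden.
          for (int off = 0, k = 0; off < N; off += CHUNK, ++k) {
              cudaStream_t st = s[k % 2];
              cudaMemcpyAsync(d + off, h + off, CHUNK * sizeof(float),
                              cudaMemcpyHostToDevice, st);
              scale<<<(CHUNK + 255) / 256, 256, 0, st>>>(d + off, CHUNK, 2.0f);
          }
          cudaDeviceSynchronize();
          printf("done\n");

          cudaStreamDestroy(s[0]);
          cudaStreamDestroy(s[1]);
          cudaFree(d);
          cudaFreeHost(h);
      }
      ```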

  • @amsrremix2239 • 2 years ago

    Thank you sooo much. I do simulations and it’s sooo hard to find good reviews/news for these cards. I really do appreciate this

  • @brokejaw • 3 years ago

    Man, your channel is so detailed on the specs

  • @calebwaniger2642 • 3 years ago +24

    Was just about to go to sleep... Thanks for making it easier GN. I'll be passed out before the 5 min mark

  • @AlexanderMichelson • 3 years ago +2

    Thank you, Steve! You truly are Tech Jesus!

  • @LetsTakeWalk • 3 years ago +22

    So you could, in the future, have an Nvidia CPU coupled with an Intel GPU.

    • @andersjjensen • 3 years ago +3

      No. It's a specialized ARM product targeted at feeding GPUs what they need, when they need it. You can't run anything that is interesting to you on it... because it's, you know, ARM and not x86. That, and Nvidia isn't going to sell this as a standalone CPU. It will only be available coupled with Nvidia GPU tech.

    • @Dexter101x • 3 years ago +1

      With an AMD chipset

  • @gamerboss13 • 3 years ago +5

    6:41 "Nvidia Megatron Trains Transformers"
    Did no one look at that and have an "are we the baddies" moment?

  • @nevo6400 • 3 years ago +19

    NVIDIA should make leather jackets.......now that's where the money is.💵

    • @intech7863 • 3 years ago +1

      Hahaha

    • @rschana100 • 3 years ago +2

      Probably won't be able to get it at MSRP

    • @tacticalcenter8658 • 3 years ago +3

      They would probably end up being nylon faux leather. lul

    • @io5362 • 3 years ago +2

      Jensen Wong cosplay show

    • @nevo6400 • 3 years ago

      @@tacticalcenter8658 yeah lol...

  • @owencoyne6223 • 3 years ago +25

    Solidworks + A4000 will be an interesting mix

    • @concentricmachining4636 • 3 years ago +6

      I can finally crunch the static stress tests in under days!

    • @owencoyne6223 • 3 years ago +5

      @@concentricmachining4636 Maybe it will finally stop crashing! /s

    • @ProjectPhysX • 3 years ago

      The A4000 is an overpriced RTX 3080 Mobile. Get an RTX 3090 instead for about the same money; it's like 2x faster and has 50% more memory.

    • @concentricmachining4636 • 3 years ago +2

      @@owencoyne6223 I don't even think an act of God would make it more stable.

    • @glowyboi7175 • 3 years ago +8

      @@ProjectPhysX I would agree for typical GPU workstation rendering, but Solidworks is one of those very rare apps that needs double precision (FP64), which is heavily cut down on GeForce, so they run terribly :/
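
      One rough way to see that FP64 gap on any given card is a tiny FMA loop timed in both precisions. This is a hedged sketch with arbitrary loop and grid sizes, not a calibrated benchmark and not anything from the video:

      ```cuda
      // Sketch: compare FP32 vs FP64 FMA throughput on the installed GPU.
      #include <cstdio>
      #include <cuda_runtime.h>

      template <typename T>
      __global__ void fma_loop(T *out, int iters) {
          T c = (T)0.5;
          for (int i = 0; i < iters; ++i)
              c = (T)1.000001 * c + (T)0.999999;           // one FMA per iteration
          out[blockIdx.x * blockDim.x + threadIdx.x] = c;  // keep the result live
      }

      template <typename T>
      static float time_ms(int iters) {
          const int blocks = 1024, threads = 256;
          T *out;
          cudaMalloc((void**)&out, blocks * threads * sizeof(T));
          cudaEvent_t t0, t1;
          cudaEventCreate(&t0);
          cudaEventCreate(&t1);
          fma_loop<T><<<blocks, threads>>>(out, iters);    // warm-up launch
          cudaEventRecord(t0);
          fma_loop<T><<<blocks, threads>>>(out, iters);    // timed launch
          cudaEventRecord(t1);
          cudaEventSynchronize(t1);
          float ms = 0.0f;
          cudaEventElapsedTime(&ms, t0, t1);
          cudaFree(out);
          return ms;
      }

      int main() {
          const int iters = 1 << 18;
          float f32 = time_ms<float>(iters);
          float f64 = time_ms<double>(iters);
          printf("FP32 %.2f ms, FP64 %.2f ms, FP64 is ~%.0fx slower\n",
                 f32, f64, f64 / f32);
      }
      ```

      On a GeForce card the printed ratio would typically come out far larger than on a workstation or data center part, which is the gap being described above.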

  • @jonathan8087 • 3 years ago +3

    The promos, it just works!

  • @purpulse • 3 years ago +77

    Was about to go to sleep, but this is definitely more important.

  • @ToadyEN • 3 years ago +24

    Just waiting for Intel to counter by changing the colour of their logo or something

    • @riffsthatkill2180 • 3 years ago +1

      Or adding one useless pin to their chips. Maybe a slightly thicker IHS. One thing is for sure, they'll do as little as possible and still cash in.

    • @attractivegd9531 • 3 years ago

      "By removing 20% of the core count"*
      :D

  • @pythonl0rd584 • 3 years ago +2

    Can't wait for the possibility of an all-Nvidia PC

  • @motionsquared • 3 years ago +2

    For rendering in Redshift or Octane, are 2x A5000s better than 2x RTX 3090s?

  • @evilpotatoman9436 • 3 years ago +12

    NVIDIA: we don't have a single current GPU in stock.
    Also NVIDIA: ok cool, let's move on to other adventures...

    • @mqayyum9226 • 3 years ago +1

      R&D can't just sit idle if production can't keep up

    • @evilpotatoman9436 • 3 years ago

      @@mqayyum9226 I know, but it's sad to look at

  • @Reinhardovich • 3 years ago +4

    Oh, is that an EVGA Kingpin 3090 card with an attached LN2 container? I wonder if that's going to be the subject of a future video!

  • @alexv6324 • 3 years ago

    I would donate to GN's Patreon for Steve's sarcasm let alone the great content.

  • @MeteWayne • 3 years ago +1

    12:20 haha good one :)

  • @jensenhuangnvidiaceo9478 • 3 years ago +8

    The more you buy,
    the more you save.
    It just works

    • @texasslingleadsomtingwong8751 • 3 years ago +1

      Lol, it just works

    • @pikachu7314 • 3 years ago +3

      Careful, you might trigger that self-righteous hypocrite who thinks you're trying to steal someone's identity. BTW, I'm the real Pikachu; the rest of you are only cheap imitations.

    • @StefanEtienneTheVerrgeRep • 3 years ago

      @@pikachu7314 Sure, I'm real too. But we all know authentic Pikachus only say "pika" and variations of "pikachu".

  • @annulushoctar3900 • 3 years ago

    Just got my toolkit and wireframe mouse pad from the GN store. Awesome in every way. Can't wait to get the new modmat too. Also way better than the merch from that other channel 🤔😉

  • @doctorkj5640 • 2 years ago +1

    Great video, as always. I watched this a couple of months ago and decided to buy an A4000 for my 2D/3D work. It is a hell of a card for the price, and I hope it will serve me well for years to come. One thing I noticed is that HWiNFO reports GPU temps as high as 85-87 degrees under full load. Is that OK, considering it is a professional GPU with a blower-style cooler? Pro GPUs tend to have higher temperature tolerances. What's your opinion on the subject? Thank you.

  • @HariGtt • 3 years ago

    Hi, I was looking for the Puget Systems NVLink article and couldn't find it; could you provide it? I was hoping it would be in the text version of this news piece, but there isn't one either 😥

  • @WayStedYou • 3 years ago +5

    5:36
    512MB of DDR4 with 8 DIMMs? Nvidia going old school.

  • @type_z • 3 years ago

    👀 that kingpin on the table

  • @deadchannel5933 • 3 years ago +5

    Holy shit, the jokes from a while back about NVidia starting to make CPUs have now become reality

    • @noobian3314 • 3 years ago

      Nvidia has been making CPUs for ages; they're in the Jetson single-board computers, the Nvidia Shield, and the Nintendo Switch

    • @deadchannel5933 • 3 years ago

      @@noobian3314 I'm talking about desktop/server CPUs

  • @timaaaahhhh • 3 years ago

    Any chance you'll cover the thermal pad issue/mod on 3080 & 3090? Wondering if 110C during gaming is acceptable at all for GDDR6X.

  • @RolandsSh • 3 years ago +2

    The timeline at 4:57: is the rumoured Lovelace now codenamed "Ampere Next"? Or are we only getting Lovelace in 2026? That part is extremely confusing to me.

    • @haukionkannel • 3 years ago

      Just a "name" change. It will get a real name closer to release...

  • @marsvoltron • 3 years ago

    Had a question unrelated to the video. Does the loop volume in a water-cooled build affect the pump adversely? Say 2 gallons as opposed to 20. Would the pump, under ideal conditions, be affected to any measurable degree? My thoughts are that in a closed loop it won't matter.

  • @mahatmagandhiful • 3 years ago

    Maybe I'm missing something here, but what is/are the use case(s) for a mobile, Max-Q workstation GPU?

  • @morg444 • 3 years ago

    Quadro went away with the A-series launch in October... keep up!

  • @Taorakis • 3 years ago

    14:16 It's not only Cyberpunk; the first few frames are Death Stranding. Probably not the first to notice, though...

  • @jonpolor • 3 years ago

    Wow! Amazing info, and here I am with a 1660 Super.

  • @dhgmrz17 • 3 years ago

    Sweet! My dreams of owning a server-grade Android phone are coming true.

  • @TheJared030858 • 3 years ago

    HOLD THE PHONE. Is that BRAKE CLEANER on the shelf in the upper left portion of the backboard?!

  • @thewackerofthetally3568

    I really like the look of the new modmat. I can't wait till mine arrives. The only thing I don't like is that I won't be able to hang it on the wall. I'm sure not many people will agree, but I think a poster version would be really cool. You could remove the GPU outline grid and just add more information. It's the information, and the way you simplify it, that interests me the most.

  • @jorgeluisgalvezvallejo4680

    Something interesting to note about the supercomputer they announced with their Grace CPUs is that they say 50 AI exaflops, which should be FP16. So this computer should be, roughly, around 5 exaflops. That would put it around the power of the exascale machines coming up: Aurora from Intel (2022*), Frontier from AMD (2020-2021), and El Capitan, also from AMD (2022-2023). It seems that nvidia, getting none of the DOE exascale contracts, took it personally and decided to build its own supercomputer.
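
    For reference, the rough conversion being made here, assuming "AI exaflops" means FP16 tensor throughput at roughly ten times the FP64 rate (an assumed ratio, not a figure NVIDIA stated):

    \[ 50\ \text{EFLOPS}_{\mathrm{FP16}} \div 10 \approx 5\ \text{EFLOPS}_{\mathrm{FP64}} \]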

  • @MARTINRIGGSS • 3 years ago +11

    So we should expect them like 2 years after launch.

    • @philhacker1137 • 3 years ago

      Retooling takes 6-9 months, but it's all done with Jensen; the mini's been on the market for 2 years.

    • @oli8296 • 3 years ago

      That's a generous estimation....🤣😂

    • @oli8296 • 3 years ago

      @Tano yeah, it's a joke.

    • @oli8296 • 3 years ago

      @Tano oh dear 🤦🏽🤦‍♂️ you must be a right laugh on a night out.

  • @kgonepostl • 3 years ago

    Great, can't wait for mining-specific SKUs and handicapped processors!!

  • @violetgraham4223 • 3 years ago

    Thanks for covering the A100 launch; I hadn't come across the news yet. I use V100s and P100s for hundreds of hours a month training machine learning models in the cloud. Looking forward to the boost in performance when cloud providers upgrade, and I might think about trying to snag a P100 or a couple of K80s as they offload them on the second-hand market. Would love it if you would add a machine learning benchmark to your suite, because I really have no idea how consumer cards compare to these data center cards. I was interested when the RTX 3060 came out with 12GB, because that's about the minimum for most of the models I run. I was wondering how it would compare to a K80 or P100, because they're pretty much the only cards I could afford to invest in without saving up for quite a while. Of course, me affording a 3060 is contingent on prices returning to list, or at least somewhere near it.

  • @delilas2398 • 3 years ago +1

    I would not expect the A10 to run quietly; usually they make a brrrrrrrrrrrt noise. Coil whine, maybe?

  • @merthyr1831 • 3 years ago +2

    Hasn't nVidia been doing CPUs already, such as Tegra? I suppose it's good to get into developing ARM designs, since I think we'll be seeing a shift towards them as translation layers become more robust.
    Apple pushing competitors to up their efforts in that space is something nVidia will probably benefit from in the long term if they want to break into desktop computing. Developers are beginning to port applications to ARM natively rather than rely on translation performance. Can only hope we get a new generation of fast, affordable, and most importantly POWER EFFICIENT mainstream consumer processors.

  • @gudenau • 3 years ago

    Honestly I watch you guys because a friend recommended it. :-)
    Edit:
    How do you count the bridge spacing? Does it include both slots, one slot, or neither?

    • @patricktho6546 • 3 years ago

      From the middle of one slot to the middle of the targeted one

    • @gudenau • 3 years ago

      @@patricktho6546 So don't count a slot? A 4-slot spacing means 3 slots are unused?

    • @patricktho6546 • 3 years ago

      @@gudenau No, I was wrong with my specifics/wording.
      A slot has a fixed width that it "occupies", ranging from a bit above the socket to a bit below. Now picture a kind of table where all 8 or so slots sit over each other (as seen on the back of cases). If every one of these spots has a socket, then adjacent cards have a spacing of 1 (not 0, as would be the other possibility). So for 1-slot GPUs you need a bridge that is 1 slot wide, since it is basically just shifted.
      If you have 2-slot cards, the bridge needs to go from the 1st to the 3rd slot, so it covers 2 slots. If you leave a bit of space for air, or have a 2.5- or 3-slot card in the first slot, you need a 3-slot bridge, and so on.
      I hope this was a better explanation for you.

    • @patricktho6546 • 3 years ago

      @@gudenau To the specific question: if every slot has a socket, then with a 4-slot spacing, 3 slots sit between the two cards you put in, since you need the extra slot to go from the border of one slot to the pins on both sides.

  • @astralmaze • 3 years ago +15

    Nvidia: "IT JUST WORKS"
    "OUT OF STOCK"

  • @incxbxs • 2 years ago

    I'm on the fence about buying either an A5000 or an RTX 3090, for deep learning (especially transformers) only.
    Do we have any performance comparison between them on AI? Which one should I go with?

  • @CharlieMermelstein • 3 years ago

    Their Enterprise Data Center is located in Connecticut?

  • @_TbT_ • 3 years ago +1

    Quadro RTXes are not only for certified applications. They are the only ones which can be used in 8x to 10x card servers (e.g. from Supermicro) for machine learning purposes "on a budget", because of their fan design. So, no luck with Titan RTXes for such applications; you have to go with the more expensive Quadros.
    If you can't afford A100 NVLink servers, you have to go with Quadros.

  • @issahumps • 3 years ago

    Would actually like to see a full Nvidia laptop; just put it in its "Shield" class of products 🤔

  • @Nemcija • 3 years ago +2

    Nice - and when will I get my 3080?

  • @zorrozalai • 3 years ago

    Is it possible to put together a computer with an Nvidia CPU, Intel GPU, and AMD SSD? Seems to be a winning combination. :)

  • @yusepbcn • 3 years ago +6

    In Europe: wake up at 7, prepare expresso, sit down, watch a fresh new GN video.
    Ready for a new day.

    • @Misterzen87 • 3 years ago +8

      As a European you really should know to write "espresso", greetings from Switzerland :D

    • @cactuslietuva • 3 years ago

      @@Misterzen87 Expressivo

    • @yusepbcn • 3 years ago +2

      @@Misterzen87 you are damn right ;) thank God I am not Italian...

    • @Misterzen87 • 3 years ago

      @@yusepbcn that would have made nonna very sad haha

  • @sophiaperennis2360 • 3 years ago

    Maybe open more factories to make more GPUs?

  • @lindah6954 • 3 years ago

    I got to see this massive server tower that is more advanced than anything I have ever seen. 3,000 TB is all I can say about that.
    It's not for civilian use. This thing can act before you can think it.

  • @bladerj • 3 years ago +3

    So Intel is making graphics cards, and Nvidia is making CPUs... it's the end of times indeed. Gentlemen, it's been an honor.

  • @Helios.vfx. • 3 years ago

    13:57 Did he say "Boundary"? For that space shooting game? Not a native English speaker, dude, but that game looks awesome

  • @megamanx466 • 3 years ago +5

    I look forward to the day that gamer-grade PCs *need* refrigeration units built in! 😅

    • @megamanx466 • 3 years ago

      @Tano Interesting. Now the question would be, are they affordable enough and worth it? 🤔

    • @megamanx466 • 3 years ago

      @Tano Lol. So you're saying they just need to be cheaper and more efficient? I concur! 🤔😋

  • @GamerFan2050 • 3 years ago +1

    In the future, down the line, GPUs will be completely forgotten and we will see integrated CPU graphics so good that they can play any game at max settings, easily with more than 50GB of VRAM

  • @sensah100 • 3 years ago

    Cool, but will it be in stock

  • @RANDOMNATION907 • 3 years ago

    Is DDR5 going to use a new socket, or will it be the same 288-pin DDR4 socket we currently have?
    PCIe is up to gen 4 on the same socket, so the DDR4 socket could get reused.

    • @glowyboi7175 • 3 years ago

      New. DDR has never been backwards compatible, and DDR5 will be the biggest change.

  • @kronusaerospace8872 • 3 years ago +1

    Now all they need is an x86 license! (Not likely, but dare to dream)

  • @rugan0723 • 3 years ago

    Looking at my Shield and Jetson Nano, slightly puzzled.

  • @Sky1 • 3 years ago

    That is jargon for it being a Mining CPU

  • @Deveron4 • 3 years ago +2

    A4000 pretty exciting too 👀
    Single slot Ampere lookin mad cute

  • @denniskrq • 3 years ago +1

    Interesting to note... Bright Memory, Boundary, Naraka Bladepoint, and Black Myth Wukong are all Chinese-developed games; AFAIK they're all "indie" too

  • @peelthebananna9827 • 3 years ago

    I've been waiting all year for the A5000 to complete my workstation

  • @HimothyOHooligan • 3 years ago

    14:00 Was Shattered Horizon supposed to be anything more than a multiplayer tech demo? I played it when it came out and it was fucking AWESOME, but my PC almost couldn't run it. The required specs were pretty high.

  • @lamikal2515 • 3 years ago +1

    5:38 The amount of RAM for that workstation seems a bit... low.

  • @reaktorleak89 • 3 years ago

    I'm not surprised they switched back to GDDR6 memory.

    • @glowyboi7175 • 3 years ago

      They didn't switch back; 6X is not ECC-capable.

    • @glowyboi7175 • 3 years ago

      @@bronzehd6212 ...but it's still not ECC, which is what's required for compute GPUs.

  • @ashercampbell2571 • 3 years ago +11

    Having a shit day
    Gamers Nexus releases a video
    Having a wonderful day

  • @SiriaK • 3 years ago

    There's an internal struggle within ARM China that is affecting the nVidia buy-out of ARM.

  • @munirfurkangoren4267 • 3 years ago

    Can you please add CC for all videos (EN)?

  • @Nevy21 • 3 years ago

    Ampere next next. The nextest of the nexts.

  • @jafizzle95 • 3 years ago

    5:38 - 512 MB? Megabytes? That's a typo, right? Surely they meant GB. Or maybe it's not in reference to system memory, but I struggle to think of another place where 512MB of DDR4 memory would make sense in a modern system.

  • @johnpettit6886 • 3 years ago

    5000D with attention paid to small details - giggity :)

  • @World_Theory • 3 years ago +1

    6:18 Wait a minute… You have RTX 5000 GPUs? Where did those come from?! Do you have a time machine?
    How do embargoes work for products that not even the manufacturer knows exist yet??

    • @glowyboi7175 • 3 years ago

      He said "not the A series". The RTX 5000 is a previous-generation Quadro card.
      Pretty sure the manufacturer knows about the A series if they are on nVidia's website.

  • @wahidtrynaheghugh260 • 3 years ago

    The DGX Station... cooled with a custom refrigerant loop and a custom-machined CPU block, even the GPUs are in the loop... I nearly creamed my pants

  • @monetary687 • 3 years ago +22

    Come on, man, I was just going to bed

  • @ricki530 • 3 years ago

    Should I buy a laptop with a 10th-gen i7 and Quadro RTX 5000, or wait for an 11th-gen i7 and A4000??

  • @Deses • 3 years ago

    Amogus on the top-left shelf?!
    OK I'm sorry, I'll shut up now.

  • @syedtohfaraiyan2419 • 2 years ago

    Please review the Nvidia RTX A4000

  • @thedesk954 • 3 years ago +1

    Mini cpu inside nextgen RTX graphics card????👀

  • @D3aTHMaGNeTiC • 3 years ago

    The CPU industry needs Nvidia's... smart marketing tactics!

  • @axiom1650 • 3 years ago

    5:37 Wow, 512MB of memory! That ought to be enough for anyone.

  • @SpiritHealer1 • 3 years ago

    7:22 I think he means 16GB

  • @hyouzanren1846 • 3 years ago

    With our current GPU market, I welcome our new overlord

  • @mikhailkovalev7762 • 3 years ago

    Hasn't Nvidia been producing ARM CPUs (SoCs, but strictly speaking, Ryzen is also an SoC, but I digress) for ages? E.g. the Tegra X1, which is used in the Shield and the Switch, and many others for AI

  • @MartinChir • 3 years ago

    ARM creates the specifications for CPUs. I don't see how Nvidia's ARM-based SoCs (which it has been creating for a decade) wouldn't be considered "Nvidia" CPUs; any x86 CPU is likewise based on Intel and AMD specifications and instruction sets