NVIDIA Made a CPU.. I’m Holding It. - Grace CPU/Hopper SuperChip @ Computex 2023

  • Uploaded 15. 05. 2024
  • Try Pulseway FREE today, and make IT monitoring simple at: lmg.gg/LTT23
    I'm at the Gigabyte booth at Computex 2023 where they're showing off bonkers new hardware from Nvidia!
    Discuss on the forum: linustechtips.com/topic/15099...
    Immersion tank 1 A1P0-EB0 (rev. 100): www.gigabyte.com/Enterprise/A...
    Immersion tank 2 A1O3-CC0 (rev. 100): www.gigabyte.com/Enterprise/A...
    Big AI server (h100) - G593-SD0 (rev. AAX1): www.gigabyte.com/Enterprise/G...
    ► GET MERCH: lttstore.com
    ► LTX 2023 TICKETS AVAILABLE NOW: lmg.gg/ltx23
    ► GET EXCLUSIVE CONTENT ON FLOATPLANE: lmg.gg/lttfloatplane
    ► SPONSORS, AFFILIATES, AND PARTNERS: lmg.gg/partners
    ► EQUIPMENT WE USE TO FILM LTT: lmg.gg/LTTEquipment
    ► OUR WAN PODCAST GEAR: lmg.gg/wanset
    FOLLOW US
    ---------------------------------------------------
    Twitter: / linustech
    Facebook: / linustech
    Instagram: / linustech
    TikTok: / linustech
    Twitch: / linustech
    MUSIC CREDIT
    ---------------------------------------------------
    Intro: Laszlo - Supernova
    Video Link: • [Electro] - Laszlo - S...
    iTunes Download Link: itunes.apple.com/us/album/sup...
    Artist Link: / laszlomusic
    Outro: Approaching Nirvana - Sugar High
    Video Link: • Sugar High - Approachi...
    Listen on Spotify: spoti.fi/UxWkUw
    Artist Link: / approachingnirvana
    Intro animation by MBarek Abdelwassaa / mbarek_abdel
    Monitor And Keyboard by vadimmihalkevich / CC BY 4.0 geni.us/PgGWp
    Mechanical RGB Keyboard by BigBrotherECE / CC BY 4.0 geni.us/mj6pHk4
    Mouse Gamer free Model By Oscar Creativo / CC BY 4.0 geni.us/Ps3XfE
    CHAPTERS
    ---------------------------------------------------
    0:00 Intro
    0:22 Meet the Grace Super Chip!
    1:22 We got permission for this...
    3:13 ..but not for this.
    4:40 Now for the GPU!
    6:13 That's where the Interconnect comes in
    7:32 There's "old-fashioned GPUs" too
    8:35 Crazy network card
    11:00 Outro
  • Science & Technology

Comments • 5K

  • @RevJR
    @RevJR 11 months ago +4869

    They don't want you to know this, but the processors at the convention are free. You can just walk up and take one.

    • @maxmustermann2370
      @maxmustermann2370 11 months ago +102

      Just the reason that Nvidia squeezes the money from gamers. Now they can give away their new CPUs; the gamers already paid for the development and research.
      Btw, HP Moonshot was a failure. I see no difference here, just a bunch of desktop GPUs crunched into a 14" laptop blade. But it is AI man!!!11!1!!, the next hype hardware for failing startups.

    • @russellzauner
      @russellzauner 11 months ago +41

      *doorbell rings*
      Linus 2 weeks from publishing this clip: Why did [ship company] just pull up in a semi?

    • @HanSolo__
      @HanSolo__ 11 months ago +20

      Also, these are our available GPUs.

    • @user-go7mc4ez1d
      @user-go7mc4ez1d 11 months ago +94

      Can confirm, I went to one of these conventions and offered $1000 for one of their processors.
      Their answer? "It's not for sale".
      Snooze you lose, Nvidia, thanks for the freebie

    • @oscarsh2909
      @oscarsh2909 11 months ago +7

      I think we all know that you made a joke like this because you thought about stealing that poor CPU.

  • @Krolitian1
    @Krolitian1 11 months ago +20347

    I love the idea of Linus just going into conventions and unscrewing random tech he finds all over the walls without permission.

    • @faceboy1392
      @faceboy1392 11 months ago +1115

      sinus lebastion is just too dangerous at conventions

    • @MrJosephrandall
      @MrJosephrandall 11 months ago +375

      I was about to comment this myself; goes to show how much the companies trust him now

    • @CrustySofa
      @CrustySofa 11 months ago +109

      seems like something he does everywhere he goes

    • @Vysair
      @Vysair 11 months ago +79

      That's probably what he did in the old days. On-the-spot permission with no prior planning

    • @jaxtinpetersen62
      @jaxtinpetersen62 11 months ago +69

      the way he chuckles as well, when he actually gets permission. lol

  • @arshkhanlm3086
    @arshkhanlm3086 11 months ago +546

    Intel made a GPU, now NVIDIA made a CPU, what a time we live in

    • @i_Kruti
      @i_Kruti 11 months ago +2

      😂🤣😂🤣😂🤣😂🤣😂🤣

    • @crashnreset6987
      @crashnreset6987 11 months ago

      Yea,, what's next?..... Men making babies and women getting drunk and having tattoos ? ;p

    • @KanadeYagami
      @KanadeYagami 11 months ago +35

      NVIDIA should start making motherboards again to go with that new CPU. That would be a real trip. 😆

    • @i_Kruti
      @i_Kruti 11 months ago +10

      @@KanadeYagami and we won't be able to buy that motherboard due to its price.....😂🤣😂🤣😂🤣😂🤣😂🤣

    • @adamhassam
      @adamhassam 11 months ago

      meanwhile we "gamers" are still fine..

  • @billygilbert7911
    @billygilbert7911 11 months ago +429

    I'm not surprised they let you take it apart. 1.5 million views in less than 24 hours is more coverage than this would get anywhere. I love these types of videos.

    • @pr0f3ta_yt
      @pr0f3ta_yt 11 months ago +6

      Yeah duh. This whole video is full of shill. Do people actually think this isn't paid for by Gigabyte or Nvidia? It might as well be a marketing video for them

    • @billygilbert7911
      @billygilbert7911 11 months ago +36

      @@pr0f3ta_yt Who cares if it's marketing. It's still cool.

    • @draketurtle4169
      @draketurtle4169 11 months ago +2

      @@pr0f3ta_yt at least through Linus we get some transparency from these big tech companies.
      We actually get to see up-and-coming tech, and Linus explains its use cases etc. to us normies.

    • @nguyenhanh9479
      @nguyenhanh9479 11 months ago

      @@pr0f3ta_yt how do you expect them to make money? YouTube pay is sh*t, everyone knows that.

    • @hjf3022
      @hjf3022 11 months ago

      @@pr0f3ta_yt they would have to disclose that fact if it was.

  • @HStorm26
    @HStorm26 11 months ago +6228

    A green cpu with a blue gpu may soon be possible.
    Scary times.

    • @justinbiggs1005
      @justinbiggs1005 11 months ago +597

      Scary times with pricing and greed. But interesting times hardware/software technology wise

    • @stephenkennedy266
      @stephenkennedy266 11 months ago +117

      What the hell kind of bizarro world are we in?

    • @lueduck
      @lueduck 11 months ago +492

      Highly doubt the green goblin is interested in making a cpu for peasants like us.

    • @BudgetGamingEnthusiast
      @BudgetGamingEnthusiast 11 months ago +16

      It already is
      The world is ending

    • @SWOTHDRA
      @SWOTHDRA 11 months ago

      ​​@@justinbiggs1005 scary times indeed for the PC cucked race, hold up ya'll are now gonna get cucked the other way around Nvidia proc and intel gpu??? Damn the PC peasant race keep taking L's.

  • @Filoz
    @Filoz 11 months ago +2674

    I've never been so nervous watching Linus holding new tech.

    • @mike-tq5es
      @mike-tq5es 11 months ago +63

      the intel fab tour was more nerve-wracking lol~ even though he wasn't holding anything like here, his hand gestures and body movement so near all those precision machines after saying we shouldn't touch anything was true anxiety. (oh yea, and he actually did pat machines anyway) XD

    • @phoenux3986
      @phoenux3986 11 months ago +43

      Something tells me the display units are probably nonfunctional if they're willing to let Linus take one off the wall and open it up with little to no supervision.

    • @Xorthis
      @Xorthis 11 months ago +32

      @@phoenux3986 Nope. I'm sure they are fully functional hardware items. I'm kinda sad he didn't drop one!
      Next week: Repairing the $150,000 server we had to buy after breaking it!

    • @gloamglozer
      @gloamglozer 11 months ago +1

      @@Xorthis haha :D I need to see it! But I guess it costs much more.

    • @IngwiePhoenix
      @IngwiePhoenix 11 months ago +5

      If you've watched him for years, you get used to it.
      Gold controller, 10k Intel CPU (which he dropped) are just among the first things that come to my mind. xD

  • @landonvincent9586
    @landonvincent9586 11 months ago +70

    Linus is literally the legend of the tech industry. imagine not only being invited to a pre-show, but also being allowed to play with the displays.

  • @Flots1111
    @Flots1111 11 months ago +7

    Fantastic overview of the new NVIDIA products and a stellar breakdown on ARM procs and where they work best. I'm working through some NVIDIA certification courses and the info is all there but they provide no context other than a dizzying array of multiplier comparisons against previous gen hardware and this video brought it all into focus. Thanks so much, really helpful!

  • @macias22
    @macias22 11 months ago +2479

    I love how Linus just HAS to disassemble everything he gets his hands on

    • @superintendent1152
      @superintendent1152 11 months ago +16

      thats how he rolls

    • @grumpyratt2163
      @grumpyratt2163 11 months ago +48

      Someone somewhere was holding their breath saying don't f'ing drop it Linus don't you dare drop it 😂

    • @dilbertron2
      @dilbertron2 11 months ago +4

      thats how he rick rolls

    • @jordi95
      @jordi95 11 months ago +17

      He could not use the LTT screwdriver though! What a missed opportunity!

    • @Ander01SE
      @Ander01SE 11 months ago

      Imagine if it was GN Steve...

  • @ANeMzero
    @ANeMzero 11 months ago +1269

    For reference on the name: Grace Hopper was the US Navy computer scientist who wrote some of the earliest theory on machine-independent programming languages and is credited for writing the first compiler, two incredibly important steps towards modern computing.

    • @idova
      @idova 11 months ago +63

      yes, hearing 'grandma COBOL' mentioned did bring a smile to my face

    • @crimson-foxtwitch2581
      @crimson-foxtwitch2581 11 months ago +48

      yeah, NVIDIA names a lot of their architectures after important people in science history.

    • @markus1351
      @markus1351 11 months ago +30

      also ranked Rear Admiral on top of that

    • @legerdemain
      @legerdemain 11 months ago +7

      Grace Hopper has a Posse.

    • @cruzer789
      @cruzer789 11 months ago +43

      She was also the first person to coin the term 'bug' in computer science, because she found an actual bug in one of their systems and then taped it into the maintenance log book.

  • @williambrunelle9050
    @williambrunelle9050 11 months ago +32

    The little giggle of holding a server... a very expensive server and not dropping it made everyone's day! Like a kid in the toy store... Would love to see how hard it was for Jake to pull him away kicking and screaming.

  • @thefreebooter8816
    @thefreebooter8816 11 months ago +62

    Linus holding a ~$150,000 compute module like it's a boombox will never get old

  • @shorty685
    @shorty685 11 months ago +1986

    The confidence that some manufacturers have in Linus despite his track record is impressive.

    • @Fishmanistan
      @Fishmanistan 11 months ago +242

      That's because if Linus drops their product it's free advertising through clips for years to come lol

    • @TAMAMO-VIRUS
      @TAMAMO-VIRUS 11 months ago +141

      Also, these are display units. Meaning they either don't work at full capacity, or might not even work at all.

    • @SonicBoone56
      @SonicBoone56 11 months ago +84

      @@TAMAMO-VIRUS big companies rarely put something valuable out there in public view, sometimes it's just a dummy unit.

    • @huskycruxes7232
      @huskycruxes7232 11 months ago +15

      @@Fishmanistan I did not think of that. I have found myself watching Linus drop compilations

    • @michaelshafer2996
      @michaelshafer2996 11 months ago +6

      Walk in there with a fat wallet and/or a million dollar business insurance policy, they'd let you do it too 🤷🏻

  • @jamr3y
    @jamr3y 11 months ago +925

    Linus: we haven’t been on good terms with nvidia for a long time
    Also Linus: proceeds to dismantle latest nvidia tech

    • @damptoget9000
      @damptoget9000 11 months ago +70

      It's gigabyte's booth

    • @brandonmoss7976
      @brandonmoss7976 11 months ago +7

      @@damptoget9000 they are a third-party seller, this is NVIDIA tech tho

    • @Gabu_
      @Gabu_ 11 months ago +34

      @@brandonmoss7976 Realistically, Nvidia can't do shit if Gigabyte wants to show off their new stuff that's already available.

    • @prestonvarner611
      @prestonvarner611 11 months ago +10

      @@brandonmoss7976 Gigabyte can let Linus do what he wants... Nvidia would not stop him, let's be real here.

    • @Petrixxxxxxx
      @Petrixxxxxxx 11 months ago +7

      @@brandonmoss7976 Which does not matter?
      If you bought a car from Toyota and started dismantling it, do you think Toyota could tell you to stop?

  • @topsofwow
    @topsofwow 11 months ago +1

    Those network cards are also hypervisors allowing you to divide one system up and scale the compute needed per customer.
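
    One mechanism behind that kind of partitioning is SR-IOV, where a single physical NIC exposes many virtual functions that can each be handed to a different VM or tenant. A minimal Python sketch, assuming a Linux host and a hypothetical interface name eth0:

    ```python
    from pathlib import Path

    # SR-IOV sketch (assumes Linux; "eth0" is a hypothetical interface name).
    # A capable NIC advertises how many virtual functions (VFs) it supports;
    # each enabled VF appears as its own PCI device that a hypervisor can
    # assign to a separate VM or customer.
    dev = Path("/sys/class/net/eth0/device")

    total = (dev / "sriov_totalvfs").read_text().strip()   # hardware limit
    active = (dev / "sriov_numvfs").read_text().strip()    # currently enabled
    print(f"VFs enabled: {active} of {total}")

    # Enabling 8 VFs (root required) would be:
    # (dev / "sriov_numvfs").write_text("8")
    ```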

  • @TiagoRamosVideos
    @TiagoRamosVideos 11 months ago +4

    Incredible product! 👏 And it was great to see the confidence brands put in you Linus 👌

  • @mylestheman
    @mylestheman 11 months ago +1761

    I can’t believe they trusted Linus not to drop one of these 😂

    • @pitchradio9707
      @pitchradio9707 11 months ago +38

      I think they more trust he can compensate fairly when he does, plus it would be good advertising.

    • @TravisFabel
      @TravisFabel 11 months ago +153

      I think these are non-operational demo examples. That's why they don't care.
      You don't hang $100,000 machine on the wall of a convention. You put up the dead CPUs and mockup PSUs that are basically worthless.

    • @mattsnyder4754
      @mattsnyder4754 11 months ago +20

      I can’t believe you think that they hung functional hardware on the wall of a convention center 😂

    • @oddball_the_blue
      @oddball_the_blue 11 months ago +3

      I came here just to say the same...

    • @hw2508
      @hw2508 11 months ago +3

      Mounted to the wall? Probably just models with damaged CPUs anyways.

  • @xtr0city
    @xtr0city 11 months ago +924

    Gigabyte allowing Linus to disassemble a product mounted vertically is a level of trust I didn't know was possible, glad it worked out for them cause Jensen made it very clear how much it costs lol.

    • @hariranormal5584
      @hariranormal5584 11 months ago +21

      They got a visit to ASML... that tops everything. A visit to arguably one of the most complicated machines on earth is not an easy task.

    • @karmatraining
      @karmatraining 11 months ago +65

      That module was probably a dud or scrap part that they just used to show how it looks. Ain't nobody leaving a $100K chip hanging on a wall

    • @miroslavmilan
      @miroslavmilan 11 months ago +4

      At first I thought he was going to DELID it.

    • @yourboi1842
      @yourboi1842 11 months ago +2

      I'd imagine a sponsor spot on a Linus Tech Tips video is a few grand. But Linus making an entire video directly on your product is somehow not worth him dropping it once in a blue moon?

    • @miroslavmilan
      @miroslavmilan 11 months ago +1

      The thing is, that probability is a lot higher than once in a blue moon 😄
      Anyhow, it's mostly just banter from his loyal fans.

  • @AlwaresHUN
    @AlwaresHUN 11 months ago

    At work we're already on the ARM architecture. Switching to it was just changing the amd64 values to arm in our infrastructure config. It's like half an hour with hundreds of microservices (+ testing, validating).
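
    For readers wondering why that switch can be so cheap: portable code rarely references the ISA directly, so the changes concentrate in infrastructure values like image tags and node selectors. A minimal Python sketch of that idea (the printed strings are illustrative):

    ```python
    import platform

    # Portable code only learns the ISA if it asks. The amd64 -> arm64 switch
    # described above lives in infra config (image architecture tags, node
    # selectors), not in application source like this.
    arch = platform.machine()  # "x86_64" on amd64 hosts, "aarch64"/"arm64" on ARM

    if arch in ("aarch64", "arm64"):
        print("running on ARM; no source changes were needed")
    else:
        print(f"running on {arch}")
    ```

    Compiled services still need a per-architecture build (multi-arch container images cover this), which is presumably why the half-hour figure is mostly config changes plus re-testing.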

  • @gaborkeresztes1739
    @gaborkeresztes1739 11 months ago +17

    Mad respect to Gigabyte for letting this chip get into Linus' hands.

  • @fatrobin72
    @fatrobin72 11 months ago +632

    Fun fact... one of the first boards Acorn (the company who created ARM) made had a broken power connection to the CPU... but as ARM chips were so low powered, it was still fine

    • @someoneelse5005
      @someoneelse5005 11 months ago +96

      I watched that... the insanity was that residual power from capacitance all around the chassis managed to power the circuits!

    • @Dragoon710
      @Dragoon710 11 months ago +7

      @@someoneelse5005 that seems very interesting how can I find this video?

    • @brandonw1604
      @brandonw1604 11 months ago +11

      Then there's RISC, which ARM is built on, where the CPU was running after power was disconnected.

    • @createusername6421
      @createusername6421 11 months ago

      😮

    • @tanmaypanadi1414
      @tanmaypanadi1414 11 months ago

      ​@@Dragoon710
      You are in for a treat
      Lowspec gaming YT channel has a couple of videos covering ARM .
      czcams.com/video/gKYOjDz_RT8/video.html
      czcams.com/video/nIwdhPOVOUk/video.html

  • @idonotcomplyrevolution
    @idonotcomplyrevolution 11 months ago +493

    What I'm beginning to notice is, "compute modules" are essentially the PC and the motherboard isn't really a motherboard anymore, its just an I/O interface for the compute module. Which if you remember is how we used to make computers 40 years ago, just with wildly more advanced tech.

    • @krozareq
      @krozareq 11 months ago +37

      Yep. Everything got shoved into an ISA slot. Keyboard controller, mouse controller, VGA card, memory, etc.

    • @bassplayer3974
      @bassplayer3974 11 months ago +7

      We're reaching the limit of shrinking, and the motherboard is a bottleneck. Buy it in a fab package.

    • @brodriguez11000
      @brodriguez11000 11 months ago +7

      Still do with industrial computers. e.g. VME, etc.

    • @threalismaradona9899
      @threalismaradona9899 11 months ago +13

      Cloud is just the mainframe and time-sharing, albeit, as you said, with very much more advanced tech

    • @SonicBoone56
      @SonicBoone56 11 months ago +10

      Yep. It's going back to everything being on a card and having it all connect to a backplane.

  • @1over137
    @1over137 11 months ago +3

    I have used a system with 1440 cores and 64 TB RAM, but it was a few hundred physical commodity boxes. The latest compute-space stuff that is replacing the likes of what I used is insane.

    • @1oglop1
      @1oglop1 11 months ago +1

      And can it run Crysis?

  • @v3nomdahunta
    @v3nomdahunta 11 months ago

    I love that you don't appear to double-check that it's off or unplugged.
    I worry whenever I see you take a screwdriver to hardware.

  • @dariusz.9119
    @dariusz.9119 11 months ago +623

    Imagine if Nvidia's reps didn't know Linus has a screwdriver and he just looked around, saw the reps move away and started dismantling the showcase board before anyone could take notice 😅😅

    • @davidhanousek9874
      @davidhanousek9874 11 months ago

      he has mounted ltt screwdriver in his butt... like always 😁

    • @thewiirocks
      @thewiirocks 11 months ago +21

      If they're not used to Linus by now, that's on them!

    • @U1TR4F0RCE
      @U1TR4F0RCE 11 months ago +9

      To be fair, NVIDIA didn't let him do that; it's Gigabyte, who work with NVIDIA, that did the presentations.

    • @Landen79Foff-wc5ej
      @Landen79Foff-wc5ej 11 months ago +3

      and then when they noticed, they'd be like "HEY!" and then Linus drops it. 😏🤣

    • @8088I
      @8088I 11 months ago +1

      Get ready not for Super AI🤖,
      but for Super Duper AI👾👧!
      Westworld is only a generation
      away🤠👧(👾). ... :-))

  • @DarkSwordsman
    @DarkSwordsman 11 months ago +506

    It's ironic that in an era where we went from needing dozens of dedicated cards to having most things handled in software, we are now going in reverse: Hardware processing things with dedicated chips or cards.

    • @Ferretsnarf
      @Ferretsnarf 11 months ago +106

      About 10 years ago when I was in college for Electrical and Computer Engineering this is actually one of the things we were talking about. We're more or less hitting a brick wall in miniaturization and increasing the raw speed of individual components. How do we improve performance when we can't miniaturize our chips any more than we already have (At this point we're talking about transistors that are so small that you can count their width in atoms)? Well you offload tasks into different chips (TCP/IP on the network adapter and like Linus showed putting the encryption workload on the adapter). If you find there's a specific workload that you're constantly asking your general-purpose CPU to do, it might start to make sense to put that task on a specialist chip rather than putting it on your CPU.
      ASICs are on the rise and expansion cards are coming back.
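
      You can already see this offload trend on an ordinary Linux box. A minimal sketch, assuming ethtool is installed and a hypothetical interface name eth0:

      ```python
      import subprocess

      # List which workloads the NIC handles in hardware instead of the CPU
      # (checksumming, TCP segmentation offload, etc.) -- the same idea the
      # thread describes, predating DPUs that also take on encryption.
      out = subprocess.run(["ethtool", "-k", "eth0"],
                           capture_output=True, text=True, check=True).stdout
      for line in out.splitlines():
          if ": on" in line:
              print(line.strip())
      ```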

    • @autohmae
      @autohmae 11 months ago +17

      Do you remember some people were saying: the end of Moore's law? That's what is going on here...

    • @Ferretsnarf
      @Ferretsnarf 11 months ago +28

      @@autohmae Yeah, we were talking about that at the time as well. I avoided saying it because I kind of hate talking about Moore's law online - you almost always get some kind of blowback when you talk about moore's law being dead. On the consumer side of things I could almost see why you might think moore's law isn't dead. We're not really seeing smaller+faster all that much anymore. We occasionally barely scrape by into a smaller node, but you're not really getting faster and more efficient transistors out of it anymore, instead you're mostly cramming more stuff onto the die and subsequently aiming a firehose at it to hope you cool it enough to not explode.

    • @autohmae
      @autohmae 11 months ago +5

      @@Ferretsnarf do you know why consumers with some technical knowledge don't know it's dead? Because of the marketing with CPU node process size.

    • @JustSomeDinosaurPerson
      @JustSomeDinosaurPerson 11 months ago +8

      @@Ferretsnarf This has happened before, and ASICs have always had a need over general purpose processors. Our reasons for stagnation in tech is more of a complex problem as opposed to exclusively being down to physics. As it is, quite a few clever people in fields of research have proposed numerous workarounds that are plausible in theory, but simply not testable at the moment and not feasible on a wide scale, especially without aggressive grant funding like in the past.
      If anything, I would say that we're actually quite lucky that AI has brought about a bit of a resurgence in potential general optimization and advancement.
      Finally, Moore's law was always more of a "loose observation" and never intended to be indefinite, with Moore himself saying that he was certain the trend would not hold for long and become irrelevant to the next abstract steps in advanced design.

  • @00SEVEN28
    @00SEVEN28 9 months ago +1

    Those are some jaw-dropping performance stats, NGL AI is going to love its new playground.

  • @Lampe2020
    @Lampe2020 11 months ago +2

    5:33 WHAT?!? 4TB/s?!? That's all computer data I ever have produced - in a single second?!?

  • @billyeveryteen7328
    @billyeveryteen7328 11 months ago +615

    Hopefully Linus is still making content fifteen or twenty years from now, when you can pick these up relatively cheap to see how they perform in games.

    • @Jaroartx
      @Jaroartx 11 months ago +22

      I could imagine: let's install Batocera for running PS6 games 😂😂😂 with ease, and for you dirty otakus, create your own living A.I. waifu cat girl

    • @JavierMora1112
      @JavierMora1112 11 months ago +15

      On Ali express 10 years from now

    • @mika2666
      @mika2666 11 months ago +16

      Sadly the "100" series, previously known as Tesla, does not support any video outputs and does not support any graphics APIs, it's only for compute

    • @Xorthis
      @Xorthis 11 months ago +6

      @@mika2666 That hasn't stopped people running games on them. There's a few benchmarks out there.

    • @honam1021
      @honam1021 11 months ago +6

      @@Xorthis A100 and newer would perform very poorly in games, as only a small subset of the chip supports graphics workloads.
      From the H100 architecture whitepaper: "Only two TPCs in both the SXM5 and PCIe H100 GPUs are graphics-capable (that is, they can run vertex, geometry, and pixel shaders)." (a full H100 has 72 TPCs)
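
      That compute-only focus is visible from software too. A minimal PyTorch sketch, assuming any CUDA-capable device is present (device index 0 is illustrative):

      ```python
      import torch

      # The card enumerates and computes like any CUDA device, even with no
      # display outputs and (per the whitepaper quote above) almost no
      # graphics-capable TPCs.
      props = torch.cuda.get_device_properties(0)
      print(props.name, "-", props.multi_processor_count, "SMs,",
            props.total_memory // 2**30, "GiB")

      # The intended workload is large dense math, not rendering:
      a = torch.randn(4096, 4096, device="cuda", dtype=torch.float16)
      b = torch.randn(4096, 4096, device="cuda", dtype=torch.float16)
      c = a @ b  # runs on the GPU's math pipelines; no graphics API involved
      ```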

  • @FH1X_PROJECT
    @FH1X_PROJECT 11 months ago +157

    That totally natural scan around the room before he takes the thing apart is just brilliant.

  • @KX36
    @KX36 11 months ago +1

    Stock trading software runs on Mellanox network cards, as the latency from CPU to network card could cost a huge amount in missed trades

  • @Zensiji
    @Zensiji 11 months ago +8

    @Linus, I'm so glad you decided to step down as CEO so you could focus on the magic! Every day I tune into this channel to learn something new and you guys always manage to keep it fresh and engaging! Long Live LTT!

  • @theindependentradio
    @theindependentradio 11 months ago +15

    4:14 who else was waiting for him to drop it

  • @EposVox
    @EposVox 11 months ago +497

    I feel like we're looking at the future of consumer platforms in 5-10 years, just in BIG form

    • @z0phi3l
      @z0phi3l 11 months ago +38

      Like mentioned, Apple is there and Microsoft is close; the question is, who will do mass ARM-based consumer chips first, Intel or AMD?

    • @NostraDavid2
      @NostraDavid2 11 months ago +7

      Some powerusers, maybe. I don't see windows hardcore switching to ARM. Who knows... Maybe we'll be surprised.

    • @jonforhan9196
      @jonforhan9196 11 months ago +6

      @@z0phi3lMicrosoft is far from there with their Qualcomm chip surface laptops, maybe for a student taking notes and using a web browser but it’s basically the compute power of a phone lol

    • @wright96d
      @wright96d 11 months ago +14

      @@NostraDavid2 I think you have that switched. I’ll be surprised if 90% of consumer PCs aren’t running ARM SoCs in 10 years. And I’m talking mostly pre-builts here.

    • @z0phi3l
      @z0phi3l 11 months ago +2

      @@NostraDavid2 if this goes like I think it will, we won't have a choice. Wild guess is Intel x86 will make it to 16th gen before they kill it, same with AMD, 2-3 more x86 before they also switch to all ARM

  • @MrLickwidfoxxx
    @MrLickwidfoxxx 11 months ago +5

    I love how Linus held the Hopper GPU on his shoulder like Cloud's Buster Sword 😂

  • @Hisham_HMA
    @Hisham_HMA 11 months ago +8

    What's more amazing is them letting Linus handle those parts with his steady hands

  • @stratonarrow
    @stratonarrow 11 months ago +325

    For Linus to not drop whatever he’s holding immediately after saying “I don’t even wanna know what this thing costs” is pretty astounding to me.

    • @tombrauey
      @tombrauey 11 months ago +13

      I personally doubt that those are working chips. It's more likely that they are defective and are used for exhibition purposes.

    • @jackturner269
      @jackturner269 11 months ago

      Linus has enough money to replace what ever gets broken guaranteed

    • @Demidar
      @Demidar 11 months ago

      you really think Nvidia is going to let him disassemble working systems? one of those racks is probably 500k

    • @stratonarrow
      @stratonarrow 11 months ago +1

      @@tombrauey oh it is known. Still funny though.

    • @stratonarrow
      @stratonarrow 11 months ago

      @@Demidar I didn't say that

  • @Yoshi-ux9ch
    @Yoshi-ux9ch 11 months ago +174

    It's a very brave move to allow Linus to hold anything important

    • @trapical
      @trapical 11 months ago +7

      I mean let's be honest. This video is going to get more views than anything in the entire rest of the weekend of this convention... It's worth the risk of having to drop something when he is the headliner.

    • @Gabu_
      @Gabu_ 11 months ago +11

      They're almost certainly dummy chips that already don't work.

    • @Bimboms
      @Bimboms 11 months ago

      I kept waiting for him to drop it.

    • @aleks138
      @aleks138 11 months ago

      considering how much there is at stake if someone steals one, they're not real chips

    • @brodriguez11000
      @brodriguez11000 11 months ago

      @@trapical The people who can afford this most likely aren't in Linus's demographic.

  • @jpdj2715
    @jpdj2715 11 months ago +1

    ARM, "Acorn RISC Machine", was first used in 1983. The ARM company never made processors/chips themselves, but designed them in specialised CAD systems. The CAD logical design file then was converted into a physical design that could be "printed" (my term) by the "foundry" (industry jargon). Such a logical design actually facilitates simulation in software of how the processor will work. The first physical batch of ARMs came back to the ARM company and they had their physical test motherboards. Set the mobo up, plug the CPU in, run tests. Overnight, one of the engineers woke up and became aware there was a connection or configuration issue in the power lines and the test should have failed. It turned out the processor needs so little power that it had run off the power leaked into the processor from the I/O presented to it. That's why almost all CPUs in smartphones are derived from that first ARM, and why Apple derived their current generation of "proprietary" Apple chips from ARM too.

  • @dannydonohue2577
    @dannydonohue2577 11 months ago

    Very Cool!! Went there and saw it too... -Does it come with the Gigabyte special backdoor chip to access all your data?

  • @ZyzzEnjoyer
    @ZyzzEnjoyer 11 months ago +340

    I love seeing Linus having fun while disassembling all those things

    • @Fogolol
      @Fogolol 11 months ago +3

      yeah and he looks like a kid in a candy shop lmao

    • @tyson31415
      @tyson31415 11 months ago +3

      Me too, but it also makes me nervous. He and gravity don't always get along so well.

    • @ddbrosnahan
      @ddbrosnahan 11 months ago

      anything on Cerebras Andromeda AI wafer-exa-scale technology?

  • @jamesswaby5676
    @jamesswaby5676 11 months ago +403

    Linus: 'I didn't ask permission for this part but nobody seems to be stopping me.'
    Security: 'That's Linus... just let him do his thing. He'll put it back together... probably.' 😂

    • @RK-252
      @RK-252 11 months ago +23

      "trust me bro". 😉

    • @kameljoe21
      @kameljoe21 11 months ago +12

      he might even drop it!

    • @whitekong316
      @whitekong316 11 months ago +5

      Put it back with half the screws…

    • @KalebG
      @KalebG 11 months ago +1

      let him cook

    • @TheZanzou
      @TheZanzou 11 months ago +8

      I mean knowing how security goes for various events they probably weren't fully informed of what he could and couldn't do, just that he was allowed to mess with the display over there.

  • @Indrid__Cold
    @Indrid__Cold 10 months ago +2

    I love your contagious passion and enthusiasm for technology. I joined the PC industry as a hardware trainer/presenter in 1991. It took me months to accept the fact that I was actually getting paid to do something I was so passionate about. Best working years of my life!

  • @M.C.Turnt69
    @M.C.Turnt69 9 months ago

    Just like anything else really, you implement the new alongside the old. With a little time, tech will improve enough that backwards compatibility won't be an issue. We already have emulators to run Android on x86; not a big jump to run a Windows emulator on ARM. I wouldn't think so, anyways.

  • @VertexTuner
    @VertexTuner 11 months ago +99

    What I find amazing about ARM architecture CPUs is that the very first one was simulated and developed by Acorn Computers on an Acorn BBC Microcomputer (which used a MOS 6502 CPU). The original name for Advanced RISC Machine was Acorn RISC Machine. I'm happy to say as a Brit I saw the beginning of this CPU legacy and still own both my BBC Microcomputer Model B and my Acorn Archimedes A3010 (which featured an ARM250, the 2nd generation ARM CPU). There was an actual ARM upgrade system for the BBC Micro, but it was far out of anyone's league/access and was mainly used by Acorn to develop the Archimedes.

    • @robinpage2730
      @robinpage2730 11 months ago +10

      Fun fact: when the first ARM CPU passed its bench test, the testers went to unplug it, and realized it was already unplugged. It had passed its bench test entirely on residual stored energy. It was THAT power efficient.

    • @100daysofmeh
      @100daysofmeh 11 months ago

      The first computer I ever used was a BBC Micro. When I got to infant school and even high school we had Acorns

    • @autohmae
      @autohmae 11 months ago

      This is why the Raspberry Pi exists, to re-ignite the BBC Micro experience, as a teaching tool. And I can report: the Raspberry Pi is the most sold computer from the UK ever.

    • @Xorthis
      @Xorthis 11 months ago +1

      @@autohmae I was just reading about the shortages! Insane that it's so popular. I also just bought an RP2040 to mess about with. Incredible little devices!

    • @Xorthis
      @Xorthis 11 months ago +1

      Granny's Garden and Suburban Fox were two of the biggest games in my primary school :D Damn I miss the old BBC machines. Just before my computer lab went to x86 to keep up with the newest trends, we had one RISC machine with module cards. It could run as an Acorn, as a BBC, or even as a 486 (Each module card had the CPU to run that standard). I have no idea why this kind of system never made it.

  • @Morecow
    @Morecow 11 months ago +501

    No no no this can’t be right

    • @thischannelisforcommenting5680
      @thischannelisforcommenting5680 11 months ago +26

      technically the Nintendo Switch is an Nvidia GPU

    • @chillaxen2113
      @chillaxen2113 11 months ago +5

      That’s what I said bro it’s not real I swear

    • @skylovescars69420
      @skylovescars69420 11 months ago +24

      No! And now Intel is making GPUs! And they are good value!!

    • @Chowy2
      @Chowy2 11 months ago +2

      GOD NO

    • @FalloutProto
      @FalloutProto 11 months ago +2

      @@thischannelisforcommenting5680 isn’t the switch more of an SOC?

  • @5urg3x
    @5urg3x 11 months ago

    ARM SoCs are not automatically more power efficient due to their "lightweight instruction set" lol. Jeff G has some good videos about this. It really depends on the workload and the specifics of what you're comparing it to / against.

  • @Itsdeadfrosty
    @Itsdeadfrosty 9 months ago

    Knowing his history of dropping parts, I'm impressed he had the balls to not set the parts down but instead shoulder-mounted them

  • @TheSickness
    @TheSickness 11 months ago +1131

    Nvidia: we can connect multiple GPUs in multiple racks into one room filling huge Gpu
    Also Nvidia: SLI...yeah that don't work

    • @nhiko999
      @nhiko999 11 months ago +108

      Just in case: SLI works, but it's mainly dependent on the type of work asked of the GPU, and games don't benefit much from the multiple nodes. For scientific computation however...

    • @EmilKlingberg
      @EmilKlingberg 11 months ago +102

      Well, SLI is actually a great technology, but it requires high competency from game developers, and let's just say that's not too common. Look at simulation programs or modeling and raytracing software and you realize how awesome SLI setups are when running proper software.

    • @coccoborg
      @coccoborg 11 months ago +24

      ​@@EmilKlingberg on point! If you want to see a well optimized game for sli/cf, have a look back at crysis 2! May not have been the best in the series, but multi-GPU support in that title was wildly effective!

    • @TheSickness
      @TheSickness 11 months ago +12

      @@EmilKlingberg yeah, feels like game devs these days need 'guardrails' enforced by Sony and a 'one click - enable' button to implement the feature.
      (Thinking about the interviews on the Moore's Law Is Dead channel)
      For the mentioned use cases you can forget running that on consumer cards, as none have the connectors anymore

    • @krozareq
      @krozareq 11 months ago +18

      The difficulty with SLI is that it has to raster frames in real time for a 144+ Hz display. GPU-offloaded work, such as NN machine learning, is a much easier task to parallelize.
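
      A rough illustration of why ML shards so much more easily than frames: a batch splits cleanly across devices with no frame-pacing deadline. A minimal PyTorch data-parallel sketch, assuming two CUDA devices:

      ```python
      import torch

      # Data parallelism: split a batch across GPUs, compute independently,
      # gather the results. No shared frame buffer and no real-time deadline,
      # unlike SLI rendering.
      weights = torch.randn(1024, 1024)
      batch = torch.randn(256, 1024)

      outputs = []
      for i, shard in enumerate(batch.chunk(2)):   # one shard per GPU
          dev = torch.device(f"cuda:{i}")
          outputs.append((shard.to(dev) @ weights.to(dev)).cpu())

      result = torch.cat(outputs)                  # shape (256, 1024)
      ```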

  • @artamereenshort6610
    @artamereenshort6610 11 months ago +277

    It's a real moment of pure pleasure, to see Linus with eyes that shine, like a kid in a toy store

  • @BrunoTorrente
    @BrunoTorrente 11 months ago +1

    Grace Brewster Hopper (née Murray; December 9, 1906 - January 1, 1992) was an American computer scientist, mathematician, and United States Navy rear admiral. One of the first programmers of the Harvard Mark I computer, she was a pioneer of computer programming who invented one of the first linkers. Hopper was the first to devise the theory of machine-independent programming languages, and the FLOW-MATIC programming language she created using this theory was later extended to create COBOL, an early high-level programming language still in use today.

  • @UrMomExpressed
    @UrMomExpressed 9 months ago

    "aww shit, here comes linus" -7:14 the arm following Linus around outside the camera cleaning up after him

  • @rightwingsafetysquad9872
    @rightwingsafetysquad9872 11 months ago +398

    Nvidia has been making CPUs for over a decade now. Tegra initially for high end tablets and now for high end (~$700-$2,500) embedded systems. And they've been making Grace for AI prototyping workstations for about 5 years (if you have a spare $25,000).
    If you only have $5,000, there are a few options with the Ampere Altra if you really must have ARM.
    The power savings are very suspect, Jeff Geerling tested and found it to not be much different than Threadripper.

    • @GauravSharma-dy8xv
      @GauravSharma-dy8xv 11 months ago +27

      I seriously forgot about tegra, it was sooo long ago

    • @FilippRoos
      @FilippRoos 11 months ago +26

      and the Switch

    • @rafradeki
      @rafradeki 11 months ago +50

      ARM is no magic bullet for energy efficiency; if using ARM alone were enough to make CPUs more efficient even at high power, we would only have ARM CPUs

    • @andywolan
      @andywolan 11 months ago +10

      Is it true that ARM instructions are more energy efficient, but require more instructions to get the same task done than x86 instructions?

    • @genderender
      @genderender 11 months ago +16

      Nvidia is betting that people will use Hopper and most definitely betting that people will buy their expensive ass interconnect modules. The actual performance of these chips is probably meaningless outside of the context of "shove a shit ton of DDR5 at it", much like Apple Silicon. And plus, AMD already beat Nvidia to the punch here. MI300 is CDNA3 + Zen 4 on a single package, using their Infinity Fabric (which is literally the same technology but packaged differently)
      Epyc still exists, and is impossible to actually beat because it's much more versatile than these bespoke solutions. Until ARM can compete outside of the niche, we will keep hearing these arguments for years to come. Zen 4 is extremely efficient, as good as many ARM chips, so x86 isn't out of the game yet

  • @Glitch5970
    @Glitch5970 11 months ago +8

    4:22 dude you sounded like Ramon Salazar from og RE4 laughing like that LMAOOO

  • @EstebanAbarca94
    @EstebanAbarca94 11 months ago +1

    This is your thing man, glad to see you hyped up in your videos again!!! welcome back to the old Linus :)

  • @WaffleStaffel
    @WaffleStaffel 11 months ago

    Thanks for the infomercial, with advertising!

  • @CMoney1401
    @CMoney1401 11 months ago +179

    I don’t know what’s more impressive, the technology or Linus not dropping it!

    • @bricefleckenstein9666
      @bricefleckenstein9666 11 months ago +6

      I vote for Linus not dropping it.
      9-)

    • @davidgoodnow269
      @davidgoodnow269 11 months ago +3

      "You drop it, you pay for it.
      Don't worry, we have payment plan options available."

    • @emerje0
      @emerje0 11 months ago +1

      Linus goes into these tech booths with all the abandon of a kid with boundary issues walking into a toy store unattended.

  • @chriskaprys
    @chriskaprys 11 months ago +75

    Wild to see just how far-wide-deep the subscription model has reached. If the contemporary fiscal landscape were a chess board, the pawn could only move to a square that it's rented from the opposing king.

    • @autohmae
      @autohmae 11 months ago +2

      IBM has been doing this for decades, so no surprise

  • @Pssst.ByTheWay
    @Pssst.ByTheWay 11 months ago

    I have been out of the tech loop for 3-4 years.
    I kind of feel I might have to start from way back!

  • @therucha
    @therucha 11 months ago

    You holding the chip like that gave me flashbacks to when you accidentally dropped a card a few years ago x'D

  • @viken3368
    @viken3368 11 months ago +5

    8:33 throwing prototype/showcase tech to Linus 'Droptips' Sebastian is a very bold move

  • @raymondm.3748
    @raymondm.3748 11 months ago +297

    This is nothing short of insane, the fact that there is so much processing power with less power means that we will have much higher speeds throughout our internet!

    • @OmniKoneko
      @OmniKoneko 11 months ago +19

      Not only that but it will decrease the heat generated from it so it provides more cushions for the coolers

    • @KalebG
      @KalebG 11 months ago +6

      also cheaper hosting!

    • @JelvinCS
      @JelvinCS 11 months ago +36

      This can't happen. What will I blame my awful Counter Strike performance on?

    • @SlyNine
      @SlyNine 11 months ago +9

      Just lower the clock speed on your cpu and you'll have much better performance per watt.
      Our home chips run way beyond the efficiency curve

    • @phantomedits3716
      @phantomedits3716 11 months ago

      This won't really make your internet faster. But, there's a case to be made that it might, in a roundabout way, make your websites load faster because the website is running on this hardware.

  • @ahmadpochinki1244
    @ahmadpochinki1244 11 months ago +2

    1:05 I love how Linus just unscrews everything and the employees just ignore him like "oh, that guy"

  • @greggv8
    @greggv8 11 months ago

    Apple did an early version of offloading network processing to the network card. They made one model of NuBus Ethernet card which had a Motorola 68000 CPU on it and it used the A/ROSE system extension. Apple Realtime Operating System Extension. To find out what performance difference it made you'd have to dig through old issues of Macintosh magazines.

  • @MaxXxoUh
    @MaxXxoUh 11 months ago +155

    That Grace Hopper is a freaking piece of art. It makes you want to code an entire OS and Game just to test it. Just imagine what a crazy project that would be.

    • @TheCHEATER900
      @TheCHEATER900 11 months ago +10

      I love the homage to Grace Hopper btw. Excellent naming

    • @Allan_Stone
      @Allan_Stone 11 months ago +4

      How well can Rollercoaster Tycoon 2 run on this thing if natively translated, that's what I'm wondering

    • @chrishousby2685
      @chrishousby2685 11 months ago +4

      @@TheCHEATER900 imagine going back and telling her how small a transistor would become. The ones she developed software languages with were vacuum tubes several inches long.

    • @brodriguez11000
      @brodriguez11000 11 months ago

      We know what future Crysis developers will be using.

    • @puppergump4117
      @puppergump4117 11 months ago +2

      @@chrishousby2685 I imagine that anyone working with computers understands how quickly they will improve. The stuff shown in this video may be made for personal computers in 20 years. Just as 20 years ago, personal computers had millions of times less space and compute power.

  • @ardentdfender4116
    @ardentdfender4116 11 months ago +19

    My heart definitely skipped a beat at 8:34 when someone threw the Network Card. That would have been a hell of a drop.

    • @tormodhag6824
      @tormodhag6824 11 months ago +3

      Like that thing can cost as much as a car

    • @Vortex001_MLG
      @Vortex001_MLG 4 months ago

      @@tormodhag6824 it probably does 😮

  • @isaacrosin10
    @isaacrosin10 11 months ago

    Crossing my fingers that the clusters I'm using for my masters research will order some of these! 🤞

  • @robertpearson8546
    @robertpearson8546 11 months ago

    You have stumbled across the hardware/software design cycle.
    At any point in time, some functionality is implemented in hardware and some is implemented in software.
    When they want more performance, they move functionality to faster hardware. As the hardware becomes faster, they move functionality back to software. Then repeat.
    Designing hardware and software are the same activity. Both can be modeled using Petri Nets.
    Most computers implement Forth using a software interpreter. The Novix processor implemented the interpreter in hardware.
    Virtual machines are implemented in software. Attempts to implement Windows or the Java Virtual Machine in hardware have failed.
    At one time, floating point arithmetic was all done in software. Today, floating point arithmetic is (mostly) standardized and is implemented in hardware.
    The original GUI was implemented in software. Today, there are GPUs.
    In 1980, I tried to get TI to use more than one CPU in a computer. 68000s cost about $5, and we could offload functionality to peripheral CPUs. Management ignored this concept.

  • @miyaayazaki4273
    @miyaayazaki4273 11 months ago +44

    Can we all take a second and appreciate how casually Linus holds that GPU on his shoulder? ( 5:12 )

  • @jackoboy1754
    @jackoboy1754 11 months ago +70

    linus: *walks in*
    also linus: *randomly starts unscrewing things from the wall*

  • @raptor__.
    @raptor__. 9 months ago

    might not have wanted to take a coffee right before this

  • @ianmchale770
    @ianmchale770 11 months ago +12

    Hearing Linus saying “I can’t believe they let me take this off the wall” and proceeding to laugh like a small child made my day. Linus is the geeky adult version of a kid in a candy store 😅

  • @trapical
    @trapical 11 months ago +242

    This hardware is utterly and completely insane. If you can comprehend the slightest bit of the numbers behind this, it's just madness.

    • @MarkOakleyComics
      @MarkOakleyComics 11 months ago +37

      Those AI art programs which can produce 30 photo-real variations of, "A mountain of cookies" in under a second strongly suggests that we're living the last generation before everybody is born in pods and never uses their eyes. I'm legit alarmed by these compulsive engineers who know deep down that they should put the brakes on, but just can't stop themselves.

    • @rudisimo
      @rudisimo 11 months ago +39

      @@MarkOakleyComics you should make your tinfoil hat tighter, perhaps that will help.

    • @Leanzazzy
      @Leanzazzy 11 months ago

      @@MarkOakleyComics Only idiots think AI will overthrow the world.
      If you actually understood what AI is and how it works you wouldn't think that. It's not some magical sentient being. It's literally just mathematical models and equations used to predict future outcomes based on input datasets.
      Datasets, which, need I remind you, need to come from living, active, intelligent humans. If there aren't humans producing new, creative, informative data, AI would be useless.
      AI is a good thing. It is simply a tool to help us simplify our work and reach our goals. It can, and hopefully will, be used to ease and remove the burden of existence from mankind, so we can truly be free to do what we want and not struggle just to survive.

    • @Leanzazzy
      @Leanzazzy 11 months ago

      It's insane only if you compare it to consumer-level hardware and software.
      Remember, governments all over the world have and maintain far higher tech than the public can even dream of. They secretly use this tech for military, scientific, and usually espionage purposes.
      We get only the bottom of the barrel. Most of the tech we use today were once government secrets.
      The Internet itself started as a US military defence and research project.

    • @MarkOakleyComics
      @MarkOakleyComics 11 months ago +15

      @@rudisimo Right. Because there aren't any examples of technology getting ahead of our ability to adapt without catastrophic results. I can think of a couple items of note just from the last few years.
      Meanwhile.., Neuralink is entering human trials.

  • @Thohean
    @Thohean 11 months ago +309

    I think the most surprising thing to me is that Gigabyte has enterprise class hardware.

    • @konnorj6442
      @konnorj6442 11 months ago +29

      Ah but notice they did NOT show you the GB power supply?
      Lmfao

    • @ThyXsphyre
      @ThyXsphyre 11 months ago +1

      Pleaase dont say that about Gigabyte

    • @georgevel
      @georgevel 11 months ago +1

      My whole system is aorus bro

    • @Tehkezah
      @Tehkezah 11 months ago +14

      @@georgevel and none of it is enterprise-class hardware

    • @georgevel
      @georgevel 11 months ago +1

      @@Tehkezah I said that bc ppl are complaining about their PSUs, and I wanna note that they've got no problems

  • @first-thoughtgiver-of-will2456

    You need to be careful about apples-to-apples ISA comparisons. A lot of ARM processors aren't even superscalar, and while there are some interesting things with RISC and optimizations that can be performed, there is much variability in ARM specifications, including superscalar designs, extended instructions, etc. x86 uses more power because it switches more silicon and does a lot of superscalar, out-of-order, and predictive magic de facto that ARM may or may not attempt (or need to, at the sake of productivity per cycle). Generally modern ARM cores (Cortex-A7 etc.) implement a lot of this stuff, but I wouldn't be surprised if these very high core count parts (Ampere, Graviton etc.) silently sacrifice silicon in the name of RISC and core count.

  • @Dick_Valparaiso
    @Dick_Valparaiso 11 months ago +4

    Nice to see Nvidia making such cpu strides. Hopefully, these improvements bode well for the next Nintendo Switch (should Nintendo actually opt for a semi-modern chipset).
    Btw, I know *this* cpu won't be in the next Switch. I'm talking about general improvements that could trickle down to gaming.

  • @Posh_Quack
    @Posh_Quack 11 months ago +9

    1:27 DON'T DROP IT

  • @dannymitchell6131
    @dannymitchell6131 11 months ago +54

    I just love how excited Linus always is for new tech. Never change bro.

  • @kyletheunissen8819
    @kyletheunissen8819 10 months ago

    Future PCs etc. will be like choosing a movie or series on your favorite platform.
    Imagine choosing a platform subscription and being able to build and use an online PC just like you're able to press play on a movie or series.

  • @FunkyPants3D
    @FunkyPants3D 11 months ago

    Just waiting to see one of the clips of the next installment of Linus Drop Tips in this video

  • @egillis214
    @egillis214 11 months ago +63

    ARM is a RISC instruction set. The Hewlett-Packard PA-RISC was way ahead of its time. I worked on the first HP 3000 on MPE & HP 9000 HP-UX systems. Some of the desktop workstations like the tiny 715 systems were incredible in the 1980s.

    • @konnorj6442
      @konnorj6442 11 months ago +1

      Ah, the toys of my youth! I worked with some of that wayyyy back when, along with many other goodies that all of the winbloze babies wouldn't have any clue what it is now, never mind how to use it. And due to the millennials' and beyond's idiotic, overly entitled, arrogant bs, they don't even appreciate that which was gained to make the current toys even possible via our hard work, long before they were a wet spot on cheap hotel sheets

    • @danielwoods7325
      @danielwoods7325 11 months ago +1

      Beat me to this lol good comment

    • @kellecetraro4807
      @kellecetraro4807 11 months ago +2

      You beat me to this comment, but I made it anyway 😂
      I'm sort of scratching my head as to why he's (Linus??) acting like it's a new thing... Just for novelty I still have a SUN E450 still running and productive 😂

    • @archgirl
      @archgirl 11 months ago +1

      @@konnorj6442 Pardon?

    • @1Sonicsky1
      @1Sonicsky1 11 months ago +6

      ​​@@konnorj6442 First of all, it is this exact toxicity that completely stagnates any real intelligence... I would rather be stuck fixing Windows 3.1 and Vista installations for the rest of eternity than ever hold a mindset akin to yours. Every architecture, operating system, and programming language has its strengths and weaknesses, and it is our responsibility as technicians to learn and understand each one so that we can always provide the best for whatever our client is trying to achieve. I have met both old and young people who are kinder, more intelligent, and exhibit far more competence than you have shown here.

  • @BenjaminWagener
    @BenjaminWagener 11 months ago +35

    The way NVIDIA focuses on cloud and AI and so on and makes local gaming more and more expensive, I fear local gaming will get rarer and rarer. They want to nudge us to use GeForce Now instead, because it's more efficient for them to share the performance of their servers than to sell each of us a GPU.

    • @SWOTHDRA
      @SWOTHDRA 11 months ago +9

      True, that's the future. That's also the reason gaming companies want to go always-online, games as a service. Cause those games can easily make the transition from local to cloud without the consumer knowing, and once you're locked in you're gonna pay for renting the software and hardware

    • @brandonmoss7976
      @brandonmoss7976 11 months ago

      Pretty soon your GPUs are going to come with their own custom CPUs 😄 along with a few petabytes of storage for the AI data, so every time you play a game, the AI will be smarter every single time, customized for every single game, for every single play style, for every single player 😳

    • @ryanw7196
      @ryanw7196 11 months ago

      Honestly, with the percentage of their income that is now coming from AI, I don't really see them giving a shit about GeForce anything; they could be completely out of the commercial hardware space in 10 years. I mean, I imagine they may already have the RTX 5000 series in the wings and possibly even the 6000 series... But after that? If everything else goes according to plan then Nvidia won't care much about consumer cash anymore.

    • @Wobble2007
      @Wobble2007 11 months ago +1

      It will never be viable until the majority of the internet infrastructure is pure fibre; copper is just way too high-latency (laggy) for gaming remotely. Even 1Gb fibre is borderline; in reality, 10Gb full-fat fibre is the minimum for a good gaming experience over remote connections. Even current HDMI standards struggle to carry enough bandwidth to keep up with modern video games, so even with 10Gb fibre a heavy compression technique will need to be employed. I wouldn't ever want to use it for gaming personally.

    • @Gabu_
      @Gabu_ 11 months ago

      @@Wobble2007 What are you smoking? You can ALREADY play remotely at fairly decent latency with a regular 100 Mb connection. Barely anybody except competitive e-sports professionals cares whether you have a latency of 50ms or 10ms

  • @archgaden
    @archgaden 11 months ago

    Seeing such insane hardware posturing to take on AI really makes me appreciate the wetware we all own. ChatGPT's core is a roughly 800GB blob of static digital neurons that can't change or reorganize after the training. That's just for an LLM that is still missing a lot of functionality we have when it comes to language processing... and on top of that, we're doing so many other things, all while consuming a small fraction of the power, re-trainable in real time, and being entirely portable. To fully map the input/output behavior of a human neuron takes either very expensive calculus or a blob of transformers 5 to 7 layers deep. We've still got a couple of orders of magnitude to go on hardware to match what a brain can do; it's not such an impossible goal now.
    As an aside, I find it funny that LLMs tend to be so much bigger than visual AIs like Stable Diffusion. That old 'a picture is worth a thousand words' seems to be panning out more like 'a word is worth a thousand pictures'. Of course, Stable Diffusion still has a lot of obvious gaps in training, while ChatGPT's flaws are more subtle, so it's not really a good comparison yet.

  • @carrino15
    @carrino15 11 months ago +1

    When he threw the little processor network card my heart stopped for a second.

  • @bricoschmoo1897
    @bricoschmoo1897 11 months ago +15

    0:16 I love how the worker jumps away as soon as he realizes he's going to inadvertently go between Linus and the camera!

    • @notjux
      @notjux 11 months ago +1

      He tried so hard to bail but only drew more attention to himself. May he rest in peace. o7

  • @yoloswaggins2161
    @yoloswaggins2161 11 months ago +3

    0:16 That poor guy in orange jumping out of the shot as fast as he could

  • @viktoraggerholm5102
    @viktoraggerholm5102 11 months ago

    00:15 shoutout to that employee who saw you were filming and didn't want to be in the way

  • @HauntedTheTruth
    @HauntedTheTruth 9 months ago

    Is this what is in the DGX Quantum? Also what will it be for? What does this mean for gen consumers?

  • @henrycollins2478
    @henrycollins2478 11 months ago +66

    Their data centers have been pretty helpful for their stock

    • @genderender
      @genderender 11 months ago +8

      I find the H100's price tag nothing more than a "well, Google will pay no matter what" kind of price. A million-dollar investment into a server for a company this large is both eaten up immediately by running costs and still a blip in the total operating cost of the company. Nvidia shareholders must be happy as shit

    • @ZeLoShady
      @ZeLoShady 11 months ago

      Ya I think this is a good time to invest given their Q1 reports and all these new tech developments. Nvidia is currently operating like an innovator again and not some comfortable company (like intel was before AMD came back).

    • @ZebraGoat
      @ZebraGoat 11 months ago +4

      @@ZeLoShady their q1 financial report showed a decline in revenue and net income lol

    • @ZeLoShady
      @ZeLoShady 11 months ago

      @@ZebraGoat exactly.

    • @tjoloi
      @tjoloi 11 months ago

      @@ZeLoShady Considering they had a 26% rally yesterday over the span of a single hour, I'm gonna go and assume that you're not the only one to think this

  • @dragon2knight
    @dragon2knight 11 months ago +378

    This is NVidia's future, and they know it. Good, now we can let folks like Intel and AMD shine a bit more, especially when they get their drivers ironed out.

    • @maxjames00077
      @maxjames00077 11 months ago +24

      You think Intel and AMD can get some GPU market share in HPC?

    • @FlaMan407
      @FlaMan407 11 months ago +43

      AMD drivers are really nice. Meanwhile, Intel drivers struggle to play old games.

    • @Mr.Morden
      @Mr.Morden 11 months ago +8

      As long as Nvidia continues to fail at purchasing a CPU vendor.

    • @mcslender2965
      @mcslender2965 11 months ago +11

      I'm sure NVidia will be sad about that, as they control whatever runs the AI customer support you work with, the AI that powers your online services, the servers that render your movies...

    • @maxjames00077
      @maxjames00077 11 months ago +13

      @@FlaMan407 My Arc A770's drivers are amazing after the updates.. AMD has gone nowhere in the last 10 years 😂

  • @Lancista
    @Lancista 9 months ago +1

    *_Intel is making GPUs..._*
    *_nVidia is making CPUs..._*
    *_AMD is kicking both of them..._*
    *_We live in strange times..._*

  • @dickslocum
    @dickslocum 11 months ago

    So how many years have the military data processing and aircraft systems had this tech? I remember using 386-based systems in the F-15 back in the early 70s.

    • @xxxxxxxxxx6903
      @xxxxxxxxxx6903 11 months ago +1

      Well maybe just a wee bit exaggerated timewise?
      The Intel 8086 architecture was developed in 1978 I believe. It was the original design that eventually became the X86, 286, 386, 486, and the Pentium line of processors. Although the Mil 1553 bus/microcontrollers may have existed at a very classified level back then?

  • @KentHambrock
    @KentHambrock 11 months ago +15

    AWS has had a similar card to the last thing you showed off since around 2017. They just call them Annapurna cards inside the data centers (likely because they're made by Annapurna, a company they acquired back in 2015), but it's literally that. 1 or 2 SFP ports + an ethernet port and it gets used as the NIC inside pretty much all their servers these days. I assume the industry at large has had cards like that since 2015~ or even earlier, since I don't remember AWS being on the leading edge of anything in the data center space when I was working for them. xD

    • @minervaulthir5767
      @minervaulthir5767 11 months ago +1

      The only 'revolutionary' part of AWS was Elastic Compute, almost entirely on the software side. They do self-manufacture certain things now tho (that the company would probably actually look to terminate me for discussing. Me and my PIP are already skating on thin ice heh)

  • @Mainyehc
    @Mainyehc 11 months ago +4

    4:32 "*This* is Grace Hopper. On the one side, we've got the same 72-core Grace ARM CPU we just saw, but on the other side, the "ooooooo shiny" latest and greatest nVIDIA H100 Hopper GPU. Today I am going to review Grace Hopper, and show you all of its quirks and features."

  • @stephenclark7026
    @stephenclark7026 11 months ago

    I’ve been trying to find a water block for the Zotac 3090 ti but I can’t seem to find one, do you know where I could go to find one?

  • @alek2341
    @alek2341 11 months ago

    I've just been crushed by the CC wall of text at the start

  • @gmacdonalddc
    @gmacdonalddc 11 months ago +4

    Cool stuff! I've always wondered what cutting edge server tech stuff was capable of, enjoyed the vid!

  • @ChristianStout
    @ChristianStout 11 months ago +5

    3:19 I'm more interested in those Intel Arc MXM cards in the background. Can we get some coverage on those?

  • @takomayowasabi6491
    @takomayowasabi6491 2 months ago

    I didn't know what they actually look like, but looking at the size, now I kind of understand the price.

  • @ApfelJohannisbeere
    @ApfelJohannisbeere 11 months ago

    That was an awesome info release!