How Nvidia Won AI

  • Date added: May 14, 2024
  • When we last left Nvidia, the company had emerged victorious in the brutal graphics card Battle Royale throughout the 1990s.
    Very impressive. But as the company entered the 2000s, they embarked on a journey to do more. Moving towards an entirely new kind of microprocessor - and the multi-billion dollar market it would unlock.
    In this video, we are going to look at how Nvidia turned the humble graphics card into a platform that dominates one of tech’s most important fields: Artificial Intelligence.
    Links:
    - The Asianometry Newsletter: asianometry.com
    - Patreon: / asianometry
    - The Podcast: anchor.fm/asianometry
    - Twitter: / asianometry

Comments • 600

  • @Asianometry
    @Asianometry  Před 2 lety +110

    What would you like to see on the channel?

    • @bjliuyunli
      @bjliuyunli Před 2 lety +12

      Thanks a lot for the video! Would be great to see a video about power semis like IGBTs and Silicon carbide.

    • @2drealms196
      @2drealms196 Před 2 lety +8

      You've covered Nvidia, could you cover Nuvia and Nivea?

    • @masternobody1896
      @masternobody1896 Před 2 lety +1

      Yes, more gaming videos are what I like

    • @fulcrumR6
      @fulcrumR6 Před 2 lety +9

      Can't wait to see. You should do a video on the companies behind modern-day tanks, no matter the country. The history of many companies (General Motors, etc.) and them taking the time to design the tanks and make them functional is very interesting. I'd love to see a video on that.

    • @screwcollege8474
      @screwcollege8474 Před 2 lety +6

      Marvell technology pls

  • @okemeko
    @okemeko Před 2 lety +510

    From what a professor at my university told me, they didn't only "work closely" with researchers. They straight up gifted cards to research centers in some cases. This way, not only did Nvidia provide a good platform, but all the software that got written was naturally written for CUDA.

    • @eumim8020
      @eumim8020 Před 2 lety +43

      My master's thesis supervisor has five professors each submitting a request for a GPU. NVIDIA covers its monopolistic, anticompetitive core with a whole program for helping public university systems. If I'm lucky, my final DL model will be trained on his little office server with those GPUs.

    • @slopedarmor
      @slopedarmor Před 2 lety +16

      I think I remember that Nvidia gifted a GTX 980 Ti to the developers of Kingdom Come: Deliverance (a Kickstarter computer game), supposedly to help them with development? haha

    • @monad_tcp
      @monad_tcp Před 2 lety +24

      Ah, that old trick from Microsoft of gifting goodies. Like giving away Office licenses or the entire Internet Explorer for free if you bought Windows.

    • @steveunderwood3683
      @steveunderwood3683 Před 2 lety +26

      If you don't provide some help to early adopters, how are you ever going to build a thriving environment? Providing cards, software, training and support to academics was a good thing. The sleazy stuff they did was to cook studies to make the benefits of a GPU look much greater than they really were, in applications where the benefits of a GPU were marginal at best. GPGPU is great for some things, and weak for others. The early Nvidia-sponsored papers were so heavily rigged, it took some serious analysis to figure out where GPGPU was a real boon, and how big that boon might be.

    • @monad_tcp
      @monad_tcp Před 2 lety +11

      @@steveunderwood3683 Yeah, it's the environment. The benefits stated in those papers were rigged in nVidia's favor, but they would have been feasible at a computational level with an open environment for study.
      But the industry is too locked into CUDA/Intel x86.
      At least now things are going to change a bit, not that we could say ARM is all that different...

  • @e2rqey
    @e2rqey Před 2 lety +179

    Nvidia does a really good job of identifying new, burgeoning industries where their products could be leveraged, then integrating themselves into the industry from so early on that, as the industry matures, Nvidia's products become essential to the functioning of that industry. I remember visiting a certain self-driving car company in California about 4 years ago and seeing a literal wall of Nvidia 1080 Ti GPUs. They had at least a couple hundred of them. Apparently they had all been gifted to them by Nvidia.
    I've heard Nvidia will also send their engineers out to work with companies and help them optimize their software or whatever they are doing, to get the maximum performance out of the GPU for whatever purpose they are using them for.

    • @zerbah
      @zerbah Před 2 lety +21

      Nvidia has great support for AI and game development. When I was talking with a small indie game studio about their game, they confirmed that Nvidia sent them two top-of-the-line Founders Edition cards for development free of charge and offered to optimize drivers for their game when the final build is ready. Meanwhile, the AMD cards were crashing and showing black screens because of buggy drivers, making it a complete pain to test the development version of the game on them...

    • @aamirsiddiqui9957
      @aamirsiddiqui9957 Před 2 lety

      @@zerbah How long will AMD take to be as good as Nvidia

    • @cyranova9627
      @cyranova9627 Před 2 lety +5

      I remember one case where a game developer actually got invited to dinner with an Nvidia person to talk about developing their game with Nvidia GPUs, not AMD ones.
      All they do is sweet-talk game developers.

    • @tweedy4sg
      @tweedy4sg Před rokem +3

      True, they do... but it's not exactly successful every time. Remember how they joined the mobile AP (application processor) market with the Tegra series, which now seems to have fizzled out into oblivion.

    • @graphicsRat
      @graphicsRat Před rokem +4

      @@tweedy4sg Yes, not every bet will win. In fact most bets will fail. But the one out of five that succeeds will more than pay for the failures. That's how investments work. Venture capitalists, for example, know this too well. Not all their investments will pay off. But every now and then they invest in tomorrow's Google-scale company, and that's where they make their money.

  • @0MoTheG
    @0MoTheG Před 2 lety +30

    CUDA was originally not targeted at machine learning or deep neural networks, but at molecular dynamics, fluid dynamics, financial Monte Carlo, financial pattern search, MRI reconstruction, deconvolution, and very large systems of linear equations in general. A.I. is a recent addition.
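
    To make those pre-AI workloads concrete, here is a minimal sketch of the kind of financial Monte Carlo and large-linear-system work listed above. It assumes the CuPy library (a NumPy-like GPU array package) and a CUDA-capable GPU; the option parameters, matrix sizes, and variable names are made up for illustration.

```python
# Minimal sketch of pre-deep-learning CUDA workloads: Monte Carlo pricing and a dense linear solve.
# Assumes CuPy and an Nvidia GPU; all parameters below are illustrative.
import math
import cupy as cp

# Financial Monte Carlo: price a European call under geometric Brownian motion.
n_paths = 1_000_000
s0, strike, rate, vol, t = 100.0, 105.0, 0.01, 0.2, 1.0
z = cp.random.standard_normal(n_paths)                        # one normal draw per simulated path
s_t = s0 * cp.exp((rate - 0.5 * vol ** 2) * t + vol * math.sqrt(t) * z)
payoff = cp.maximum(s_t - strike, 0.0)
price = math.exp(-rate * t) * payoff.mean()

# Very large dense linear system: solve A x = b on the GPU.
n = 4096
a = cp.random.rand(n, n) + n * cp.eye(n)                      # diagonally dominant, so well conditioned
b = cp.random.rand(n)
x = cp.linalg.solve(a, b)

print(float(price), float(cp.abs(a @ x - b).max()))
```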

    • @TheDarkToes
      @TheDarkToes Před rokem +3

      Back in the day, we would have 64 CUDA cores and we thought we were hot shit hitting 800 MHz. Look how far it's come.

    • @christopherpearson8637
      @christopherpearson8637 Před rokem +3

      You stumble into the right choices sometimes.

  • @deusexaethera
    @deusexaethera Před 2 lety +294

    Ahh, the time-honored winning formula:
    1) Make a good product.
    2) Get it to market quickly.
    3) Don't crush people who tinker with it and find new uses for it.

    • @heyhoe168
      @heyhoe168 Před 2 lety +44

      Nvidia doesn't really follow (3), but it has a very strong (2).

    • @cubertmiso4140
      @cubertmiso4140 Před rokem

      @@heyhoe168 agree on that comment. 3) corner the market 4) raise prices

    • @peterweller8583
      @peterweller8583 Před rokem

      @@heyhoe168 (3) That's too bad, because that is where most of the honey comes from.

    • @shmehfleh3115
      @shmehfleh3115 Před rokem +4

      @@heyhoe168 Neither does Apple, unfortunately.

    • @locinolacolino1302
      @locinolacolino1302 Před rokem +12

      3* Create an accessible proprietary toolkit (CUDA) that's become mainstream in legacy content, and crush anyone who tries to leave the Nvidia ecosystem.

  • @scottfranco1962
    @scottfranco1962 Před 2 lety +483

    Nvidia is a real success story. The only blemish (as illustrated by Linus Torvalds famously giving them the middle finger) is their completely proprietary stance on development. Imagine if Microsoft had arranged things so that only their C/C# compilers could be used to develop programs for Windows. CUDA is a closed shop, as are the graphics drivers for Nvidia's cards.

    • @janlanik2660
      @janlanik2660 Před 2 lety +6

      But msvc can be used only on Windows.

    • @theairaccumulator7144
      @theairaccumulator7144 Před 2 lety +23

      @@janlanik2660 imagine using Windows, much less MSVC

    • @scottfranco1962
      @scottfranco1962 Před 2 lety +34

      @@janlanik2660 I think you misread what I said. Microsoft (or any OS maker, Apple included) could have easily made it so that only their compilers could be used on their systems, no GCC, no independent developers. That is what Nvidia has done.

    • @janlanik2660
      @janlanik2660 Před 2 lety +6

      @@scottfranco1962 ok sorry for the misinterpretation. But even so, I have only used CUDA, which is indeed Nvidia only, but I believe that there are some cross platform solutions, e.g. OpenCL, so you don’t have to use proprietary tools to run something on Nvidia, or am I wrong?
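
      For what it's worth, the vendor-neutral route mentioned here looks roughly like the sketch below. It assumes the PyOpenCL package and any working OpenCL driver (AMD, Intel, or Nvidia); the kernel, buffer names, and sizes are illustrative. The kernel language is close enough to CUDA C that the same vector-add differs mostly in spelling (__kernel/get_global_id versus __global__/threadIdx arithmetic).

```python
# Minimal sketch of a cross-vendor compute path via OpenCL; assumes pyopencl and an OpenCL driver.
import numpy as np
import pyopencl as cl

a = np.random.rand(1 << 20).astype(np.float32)
b = np.random.rand(1 << 20).astype(np.float32)

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

program = cl.Program(ctx, """
__kernel void vec_add(__global const float *a, __global const float *b, __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
""").build()

program.vec_add(queue, a.shape, None, a_buf, b_buf, out_buf)  # one work-item per element
out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)                          # copy the result back to host memory
print(np.allclose(out, a + b))
```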

    • @Ethan_Simon
      @Ethan_Simon Před 2 lety +81

      @New Moon You don't need something to be proprietary to pay your engineers to work on it.

  • @mimimimeow
    @mimimimeow Před 2 lety +69

    I think it's worth mentioning that a lot of recent advances in GPU computing (Turing, Ampere, RDNA, mesh shaders, DX12U) can be traced to the PlayStation 2's programmable VU0+VU1 architecture and the PlayStation 3's Cell SPUs. Researchers did crazy stuff with these, like real-time ray tracing, distributed supercomputing for disease mechanism research, and the USAF's space monitoring. The PS3 F@H program reached 8 petaflops at one point!
    Sony and Toshiba would've been like Nvidia today if they had provided proper dev support to make use of these chips' capabilities and kept developing them, rather than just throwing the chip at game devs and saying "deal with it". I feel like Sony concentrated too much on selling gaming systems and didn't realize what monsters they had actually created. Nvidia won by actually providing a good dev ecosystem with CUDA.

    • @dhargarten
      @dhargarten Před rokem +2

      Didn't Sony at one point encourage and support using PlayStations for science computing, only to later block it completely? With the PS4 if I recall correctly?

    • @FloStyle_
      @FloStyle_ Před rokem +15

      @@dhargarten It was the PS3 and running Linux natively on the console. Later that led to exploits and hacks of the hardware, and Sony closed the ecosystem really fast. That caused a lawsuit that took years, well into the PS4's lifespan, to conclude.

    • @Special1122
      @Special1122 Před 10 měsíci

      ​@@FloStyle_geohot?

  • @zombielinkinpark
    @zombielinkinpark Před 2 lety +53

    Even though both Google and Alibaba Cloud developed their own NPUs for AI acceleration, they are still buying large quantities of Nvidia Delta HGX GPUs as their AI development platform. Programming with CUDA is far easier than with their own proprietary hardware and SDKs. Nvidia really put a lot of effort into the CUDA SDK and made it the industry standard.

  • @yadavdhakal2044
    @yadavdhakal2044 Před rokem +30

    Nvidia didn't invent the graphics pipeline. It was invented by Silicon Graphics, or SGI. SGI developed the OpenGL language as far back as 1992. They mainly targeted the cinema and scientific visualization markets. They used to manufacture entire workstations with their own OS (IRIX) and other specialized servers.
    What Nvidia did was target the personal entertainment market. This made Nvidia competitive because of decreased overall unit cost. Later, OSes such as Linux were able to run these GPUs in clusters, and here too SGI lost out. SGI could easily have been like Nvidia if they had been on the right track.
    SGI is now reduced to a conference known as SIGGRAPH, and is mainly a research-based peer program. It still contributes to computer graphics, especially through the OpenGL and Vulkan API specifications!

    • @lookoutforchris
      @lookoutforchris Před 9 měsíci +2

      The original GeForce card was so groundbreaking they were sued by Silicon Graphics for copying their technology. SGI won and nVidia paid royalties to them. Everything nVidia had came from SGI 😂

  • @PhilJohn1980
    @PhilJohn1980 Před rokem +19

    Ah, geometry stages with matrices - I remember my Comp Sci computer graphics class in the 90's where our final assignment was to, by hand, do all the maths and plot out a simple 3D model on paper. Each student had the same 3D model defined, but different viewport definitions. Fun times.
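
    For anyone curious what that by-hand exercise looks like today, here is a minimal NumPy sketch of the fixed-function geometry stage described in the video: a vertex is pushed through a model-view matrix and a perspective projection, perspective-divided, and mapped to a viewport. The matrix values, field of view, and viewport size are made up for illustration.

```python
# Minimal sketch of the geometry stage: model-view transform, projection, perspective divide, viewport map.
import numpy as np

def perspective(fov_y_deg, aspect, near, far):
    """OpenGL-style perspective projection matrix."""
    f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    return np.array([
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ])

vertex = np.array([1.0, 1.0, 0.0, 1.0])        # one model vertex in homogeneous coordinates

model_view = np.eye(4)
model_view[2, 3] = -5.0                        # push the model 5 units in front of the camera

proj = perspective(fov_y_deg=60.0, aspect=16 / 9, near=0.1, far=100.0)

clip = proj @ model_view @ vertex              # clip-space position
ndc = clip[:3] / clip[3]                       # perspective divide -> normalized device coords in [-1, 1]

width, height = 1920, 1080                     # viewport mapping to pixel coordinates
px = (ndc[0] + 1.0) * 0.5 * width
py = (1.0 - (ndc[1] + 1.0) * 0.5) * height     # flip y: screen origin is at the top-left
print(round(px), round(py))
```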

  • @BaldyMacbeard
    @BaldyMacbeard Před rokem +18

    The secret of their success for many years was working closely with developers/customers to gain an advantage over their competitors. For instance, Nvidia would give free cards to game developers and send out evangelists to help optimize the game engines, obviously resulting in a strong developer bias towards Nvidia cards. Which is how and why they were outperforming AMD for many years. In the machine learning space, they are extremely generous in their public relations with academia, once again giving away tons of free GPUs and helping developers out. It's a fairly common tactic to try to bring students on board so that once they graduate and go on to work at tech companies, they bring a strong bias towards the software and hardware they're familiar with. In the server market, Nvidia has been collaborating closely with most manufacturers while offering their DGX systems in parallel. They also have a collaboration with IBM that solders Nvidia GPUs onto their Power8 machines, giving a ginormous boost to bandwidth between GPU and CPU and also PS5-like storage access. And don't forget about the Jetson boards. Those things are pretty amazing for edge computing use cases like object recognition in video and such. They dominate like they do by not trying to sell a single product, but by offering tons of solutions for every single market out there.

    • @409raul
      @409raul Před rokem +3

      Genius move by Nvidia. Jensen Huang is the reason why Nvidia is where they are today. One of the best CEOs in the world (despite the greed LOL).

    • @TherconJair
      @TherconJair Před rokem +3

      It's quite easy when your main competitor was nearly extinguished by the anti-competitive measures of its much larger rival, Intel, and has to stay afloat somehow while bleeding money. Nvidia made so much money with gaming cards while AMD couldn't compete, due to a lack of funds for R&D, that they had an extremely calm "blue ocean" to work in and could comparatively cheaply build up their de facto monopoly in the space. AMD will need to invest a lot of money to somehow break into the now very "red ocean" of the Nvidia CUDA monopoly.
      I don't see them being able to survive in the long term against two much larger rivals, and we'll all be losers for it.

    • @Magnulus76
      @Magnulus76 Před 11 měsíci

      Yeah, Nvidia offered a lot of support.
      I know there are a lot of fanboys who think Nvidia must have some kind of secret sauce, but the truth is that CUDA's performance isn't necessarily any better than OpenCL's. And I say that as somebody who owns an Nvidia card. Nvidia just spent a lot on support and generated a lot of influence/hype.

  • @Quxxy
    @Quxxy Před 2 lety +61

    I don't think you're right about what "clipping" means at 2:56. Occlusion (hiding things behind other things) is done with a Z-buffer*. As far as I recall, clipping refers to clipping triangles to the edge of the screen to avoid rasterising triangles that fall outside of the visible area, either partially or fully. As far as I'm aware, no one ever did occlusion geometrically on a per-triangle basis. The closest would be in some engines that will rasterise a simplified version of a scene to generate an occlusion buffer**, but that's not handled by the geometry engine, it's just regular rasterisation.
    *Except on tile-based rasterisers like the PowerVR lineage used in the Dreamcast and some smartphones, notably the iPhone.
    (Not a graphics programmer or expert, just an interested gamer.)
    *Edit*: Also, for 7:46 about the fixed function pipeline being totally gone: from what I remember this is not entirely true. GPUs still contain dedicated units for some of the fixed functionality; from memory, that includes texture lookups and blending. Reminds me of an old story from someone who worked on the Larrabee project who mentioned that one of the reasons it failed to produce a usable GPU was that they tried to do all the texturing work in software, and it just couldn't compete with dedicated hardware.
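
    To illustrate the distinction being drawn here - clipping trims triangles to the visible region, while occlusion simply falls out of a per-pixel depth test - below is a minimal Z-buffer sketch in plain NumPy. The fragment list and buffer size are made up; a real rasteriser would generate the fragments itself.

```python
# Minimal sketch of Z-buffer occlusion: per-pixel depth tests, no per-triangle geometric comparison.
import numpy as np

W, H = 8, 8
depth = np.full((H, W), np.inf)       # nearest depth seen so far at each pixel
color = np.zeros((H, W), dtype=int)   # which "triangle" currently owns each pixel

# Pretend the rasteriser already produced these fragments: (x, y, depth, triangle_id).
fragments = [
    (2, 2, 0.8, 1), (3, 2, 0.8, 1), (2, 3, 0.8, 1),   # triangle 1, farther from the camera
    (2, 2, 0.3, 2), (3, 3, 0.3, 2),                   # triangle 2, nearer: wins where they overlap
]

for x, y, z, tri in fragments:
    if z < depth[y, x]:               # the depth test is the entire occlusion mechanism
        depth[y, x] = z
        color[y, x] = tri

print(color[2, 2], color[3, 2])       # pixel (2,2) shows triangle 2; (3,2) still shows triangle 1
```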

    • @Asianometry
      @Asianometry  Před 2 lety +16

      Thx. I'll look into this and see if a clarification is needed

    • @Quxxy
      @Quxxy Před 2 lety +26

      @@Asianometry I doubt it. It's an inconsequential detail that doesn't change anything about the substance of the video. I mean, I doubt anyone is watching a video about nVidia's AI dominance looking for an in-depth technical description of the now long-obsolete fixed function pipeline. :)

    • @musaran2
      @musaran2 Před 2 lety +4

      Clipping is the general removal of what does not need rendering: view volume, backface, occlusion…

    • @tma2001
      @tma2001 Před 2 lety +3

      yeah, I was about to post the same nitpick - also, the setup and window-clipping part of the fixed-function pipeline is still there in hardware, it's just not programmable (nor should it be). The raster ops backend is not programmable either - just configurable.
      The Painter's algorithm is an object-based visibility test that clips overlapping triangles against each other, whereas the Z-buffer is an image-based, per-pixel visibility test.

    • @vintyprod
      @vintyprod Před rokem

      @@Quxxy I am

  • @CarthagoMike
    @CarthagoMike Před 2 lety +10

    Oh nice, a new Asianometry video!
    Time to get a cup of tea, sit back, and watch.

  • @TickerSymbolYOU
    @TickerSymbolYOU Před rokem +15

    This is literally the best breakdown on YouTube when it comes to Nvidia's dominance of the AI space. Love your work!

    • @409raul
      @409raul Před rokem

      Nice to see you here Alex! Nvidia for the win!

    • @prashantmishra9985
      @prashantmishra9985 Před rokem +4

      ​@@409raul Being a fanboy of a corporate won't benefit us.

    • @havkacik
      @havkacik Před rokem

      Totally agree 👍 :)

  • @hgbugalou
    @hgbugalou Před 2 lety +8

    I would buy a shirt that says "but first, let me talk about the Asianometry newsletter".

  • @RandomlyDrumming
    @RandomlyDrumming Před 2 lety +20

    A small mistake, right at the beginning - Geforce 256 had hit the market in 1999, not 1996. In the mid-90's, Nvidia was, more or less, just another contender, chipping away at the market dominance of the legendary 3dfx. :)

    • @shoam2103
      @shoam2103 Před 2 lety

      So theirs wasn't the first GPU? I think the PlayStation had an albeit very basic one..

    • @shoam2103
      @shoam2103 Před 2 lety

      Okay 5:55 clears it up a bit..

    • @RandomlyDrumming
      @RandomlyDrumming Před 2 lety

      @@shoam2103 Well, technically, it was, as it handled the entire pipeline. Interestingly, the first *programmable* graphics chip for PC was Rendition Verite v1000 (RISC-based), released back in 1995, if I'm not mistaken. :)

    • @DM0407
      @DM0407 Před rokem +1

      Yep, I bought a RIVA TNT2 to play Asheron's Call in 1999. I guess the 256 was out at that time, but I couldn't afford it, and the TNT2 was still a massive jump in performance. Going from a choppy software renderer to "hardware accelerated" graphics was amazing at the time. The paths had textures! Who knew?
      I don't remember the original GeForce being that big of a deal, but I remember lusting after the GeForce 2 AGP.

  • @Sagittarius-A-Star
    @Sagittarius-A-Star Před 2 lety +94

    I don't want to know how much effort it was to put all this information together.
    Thanks and thumbs up.
    P.S.: At Nvidia they are insane. Just try to find out which GPU you have and how it compares to others or if they are CUDA capable .....
    You will end up digging through lists of hundreds or thousands of cards.

    • @killerhurtalot
      @killerhurtalot Před 2 lety +13

      That's the thing though.
      Nvidia usually has only 6-7 actual chips that they manufacture. They don't manufacture tens or hundreds of different GPUs each generation...
      The main difference is that, due to manufacturing defects, the GPUs are binned and have different sections enabled.
      The 3090 and 3080 are actually the same chip. The 3080 just has around 15% fewer pipelines/CUs and fewer tensor cores enabled...

    • @Baulder13
      @Baulder13 Před 2 lety +10

      This man has no quit! The amount of research he puts in and how much content that has been coming out is ridiculous.

    • @Hobbes4ever
      @Hobbes4ever Před 2 lety +2

      @@killerhurtalot kind of like what Intel does with their Celeron

    • @marksminis
      @marksminis Před 2 lety +6

      @@killerhurtalot yes that is correct. A large silicon wafer is a huge investment. By testing each core, defective cores can be coded out, so you still have a working chip to sell. Throwing out a large expensive chip just for having a few bad cores would be insane. Only a small percentage of chips coming off the huge wafer are totally perfect, and those are mostly near the center of the wafer.

  • @conradwiebe7919
    @conradwiebe7919 Před 2 lety +10

    Long time viewer and newsletter reader, love your videos. I just wanted to mention that the truncated graph @ 15:28 is a mistake, especially when you then put it next to a non-truncated graph a little later. The difference between datacenter and gaming revenue is greatly exaggerated due to this choice of graph. I feel it actually diminished your point that datacenter is rapidly catching up to gaming.

  • @BenLJackson
    @BenLJackson Před 2 lety

    I felt some nostalgia, good vid 👍 Deciphering all this back in the day was so much fun. Also, I love your explanation of AI and what it really is.

  • @Matlockization
    @Matlockization Před rokem

    Thank you for explaining some of the details in the beginning.

  • @Doomlaser
    @Doomlaser Před 2 lety +8

    As a game developer, I've been waiting for a video like this. Good work

  • @ted_1638
    @ted_1638 Před 2 lety +1

    fantastic video! thank you for the hard work.

  • @dipankarchatterjee8809

    A very well researched presentation. Thank you Bro.

  • @DavidSoto90
    @DavidSoto90 Před 2 lety +1

    such a valuable video, great work as usual!

  • @Meta5917
    @Meta5917 Před 2 lety

    Great video. Keep it up, proud of you

  • @NeilStainton
    @NeilStainton Před 2 lety +1

    Thank you for your excellent work in condensing and analysing NVIDIA’s progress.

  • @ministryofyahushua3065
    @ministryofyahushua3065 Před 2 lety +1

    Love your channel, very well presented.

  • @rzmonk76
    @rzmonk76 Před 2 lety

    Subscribed, really nice presentation!

  • @ttcc5273
    @ttcc5273 Před rokem

    Thank you for this video, it was informative, digestible, and I learned more than I expected to. 👍

  • @skipsteel
    @skipsteel Před 2 lety

    Thanks, really well done; you made the complex simple.

  • @drewwollin3462
    @drewwollin3462 Před 2 lety +12

    Very good as always. A good explanation of how graphics cards work and how they have evolved.

  • @jmk1727
    @jmk1727 Před 2 lety

    man, your videos are always amazing... PERIOD.

  • @Socrates21stCentury
    @Socrates21stCentury Před rokem

    Nice job, very informative !!!

  • @MohammadSadiqurRahman
    @MohammadSadiqurRahman Před 2 lety

    insightful. loved the content

  • @christakimoto8425
    @christakimoto8425 Před 5 měsíci

    This is an outstanding and informative video. Thank you so much!

  • @Zloi_oi
    @Zloi_oi Před rokem

    This is really interesting!Thanks for your work, sir!

  • @punditgi
    @punditgi Před 2 lety +3

    First rate information well presented! 👍

  • @helmutzollner5496
    @helmutzollner5496 Před rokem

    Excellent overview. Thank you.

  • @Magnulus76
    @Magnulus76 Před 11 měsíci +1

    They had neural networks being used in computer games even back in the early 90's, to a limited extent (mostly in a few strategy games). The reason there's hype about neural nets now is that the raw computing power of a GPU allows companies to develop neural networks that can mimic human visual perception and pattern recognition.
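
    The connection to GPU horsepower is that a neural network's forward pass is mostly stacked matrix multiplies, which is exactly the operation GPUs are built to do in bulk. A minimal sketch in NumPy, with made-up layer sizes and random weights:

```python
# Minimal sketch: a two-layer network forward pass is dominated by matrix multiplies,
# so GPU matmul throughput translates directly into neural-network throughput.
import numpy as np

rng = np.random.default_rng(0)
batch = rng.standard_normal((64, 784))       # e.g. a batch of 64 flattened 28x28 images

w1 = rng.standard_normal((784, 256)) * 0.01  # first layer weights
b1 = np.zeros(256)
w2 = rng.standard_normal((256, 10)) * 0.01   # second layer weights
b2 = np.zeros(10)

hidden = np.maximum(batch @ w1 + b1, 0.0)    # matmul + ReLU
logits = hidden @ w2 + b2                    # matmul
print(logits.shape)                          # (64, 10) class scores
```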

  • @nailsonlandim
    @nailsonlandim Před 2 lety +2

    Excellent video. Funny fact: I spent the day dealing with CUDA and a CV application I'm working on.

  • @LimabeanStudios
    @LimabeanStudios Před rokem +1

    Just found this channel the other day and it's amazing.
    One thing I don't see mentioned in the comments is that Nvidia is often rated one of the best companies in the world to work at. It's a lot easier to do big things with happy employees lol

  • @ChristianKurzke
    @ChristianKurzke Před rokem

    I love this, very well researched, and the correct level of technology for the average executive,.. who isn't a math genius. ;)

  • @supabass4003
    @supabass4003 Před 2 lety +43

    I have spent more money on nvidia GPUs in the last 20 years than I have on cars lol.

    • @mspy2989
      @mspy2989 Před 2 lety +3

      Goals

    • @heyhoe168
      @heyhoe168 Před 2 lety +2

      Same. Btw, I dont have a car.

    • @wazaagbreak-head6039
      @wazaagbreak-head6039 Před rokem

      I have no reason to update my ancient corolla it's a piece of crap but it gets me to work each day

    • @LimabeanStudios
      @LimabeanStudios Před rokem

      I have only purchased one of each and same lmao

    • @prateekpanwar646
      @prateekpanwar646 Před rokem

      @@wazaagbreak-head6039Is it 750 TI / 760?

  • @richardm9934
    @richardm9934 Před 4 měsíci

    Fantastic video!

  • @jdevoz
    @jdevoz Před 2 měsíci

    Amazing video!

  • @AaronSchwarz42
    @AaronSchwarz42 Před 2 lety

    Excellent analytics on market diffusion of COTS //

  • @harrykekgmail
    @harrykekgmail Před 2 lety +6

    a classic in your stream of videos!

    • @screwcollege8474
      @screwcollege8474 Před 2 lety

      How did you post 2 months ago?

    • @2drealms196
      @2drealms196 Před 2 lety +3

      @@screwcollege8474 Patreon members get access to his videos first. Later on he makes the videos public. Another way is through his college partnership program.

  • @Aermydach
    @Aermydach Před 2 lety +13

    Another great presentation.
    Watching this got me thinking that I wasted my time studying agricultural and wine science at Uni. Instead, I should've studied computer science/engineering. . .

    • @dankuruseo9611
      @dankuruseo9611 Před 2 lety +7

      Someone has to feed us 👍

    • @pinkipromise
      @pinkipromise Před 2 lety

      didnt know farmers have degrees

    • @Aermydach
      @Aermydach Před 2 lety +1

      @@pinkipromise They typically don't. The degrees are for researchers, technical support/agronomists (for fertilisers, pesticides, crop and livestock nutrition etc) and other specialist support roles.

  • @Bianchi77
    @Bianchi77 Před 2 lety

    Nice video, thank you for sharing it :)

  • @emulegs5
    @emulegs5 Před 2 lety

    Please remember to leave a link to the previous video in a series, and to the first one as well. I would have clicked through to them based off your intro alone.

  • @GBlunted
    @GBlunted Před 2 lety +5

    This is cool content, I liked this video! I like the explanation of low-level processes as well as the history lesson of how it all evolved to where it is today...

  • @zodiacfml
    @zodiacfml Před 2 lety +5

    You beat me to this critique, which concerns the most important part of Nvidia's luck/success. I recall it took years before Nvidia finally got to CUDA support/programming. Researchers using GPUs is also the reason why AMD bought ATI. There was a whitepaper from AMD saying that computing would move its focus to graphics from then on; they were just more than a decade too early with that prediction.
    Another thing to note: it is the gamers/consumers who made all this possible, paying for the R&D of the graphics cards that would be used to sell products for the datacenter. Ray tracing hardware, for example, is currently a poor feature for gaming, but it is excellent for industrial use.

    • @markhahn0
      @markhahn0 Před 2 lety +1

      in some ways, it's remarkable how poorly AMD has done. they've never delivered on anything like a sleek cpu-gpu-unified infrastructure, even though they have all the pieces in hand (and talked about things like HSA). it'll be ironic if Intel manages with oneAPI, since for so long, they were defending the CPU like a castle...

    • @zodiacfml
      @zodiacfml Před 2 lety

      agreed. Though the hardware in the latest gaming consoles was impressive when announced, it was just OK by the time the consoles became available.
      AMD also doesn't have a foothold in Arm, where Nvidia does with the Nintendo Switch and Apple with the M1.
      My last two PCs are an Intel i3-8100 and, recently, an i3-12100, since I have some use for the iGPUs.

  • @michaelhulcy6680
    @michaelhulcy6680 Před 2 lety

    "Triangles. Triangles all the way down baby." Dat was a good one.
    Making Duke Nukem in 96 jealous man.

  • @Campaigner82
    @Campaigner82 Před 2 lety

    You make such good videos! I'm intrigued by the pictures. You're doing a good job!

  • @igorwilliams7469
    @igorwilliams7469 Před 2 lety

    Thinking about that elevator analogy a bit too much... Are there ANY elevators with level tiers midway (like bottom and top) for riders to decamp? While obviously adding complexity to a system that is almost plug and play, it could certainly be interesting!

  • @shmehfleh3115
    @shmehfleh3115 Před rokem

    This video filled in a lot of gaps for me. I work with the things and I wasn't sure how GPUs evolved into general computing devices.

  • @BartKus
    @BartKus Před 2 lety +22

    You do really good work, sir. Much appreciated.

  • @hc3d
    @hc3d Před 2 lety

    wow, amazing analysis.

  • @FrancisdeBriey
    @FrancisdeBriey Před 2 lety

    Subscribed !

  • @swlak516
    @swlak516 Před 2 lety +1

    These videos make me feel smarter than I really am. And I feel like you're one of the few YouTube content creators in the space who can do that. Thank you.

    • @Speed001
      @Speed001 Před 2 lety

      This is definitely a bit above me with tech terms I don't care to learn.

  • @jem4444
    @jem4444 Před rokem

    Extremely well done!

  • @kokop1107
    @kokop1107 Před 8 měsíci

    This is a very good and accurate explanation

  • @IntangirVoluntaryist
    @IntangirVoluntaryist Před 2 lety

    I still have several old gen cards
    TNT cards, banshee, voodoo, first gen geforce, some early gen ati cards too
    i also have some old soundblaster cards :)

  • @markhahn0
    @markhahn0 Před 2 lety +4

    Important to point out that no one really uses CUDA directly for AI - they use PyTorch or TensorFlow. That means that Nvidia doesn't have any real lock on the market - alternatives are highly competitive.
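
    In practice that division of labour looks like the sketch below (assuming the PyTorch package; the layer sizes are made up): the user writes framework code and only chooses a device string, while the framework dispatches to CUDA libraries such as cuBLAS/cuDNN on Nvidia hardware, or to another backend elsewhere.

```python
# Minimal sketch: framework users pick a device; the CUDA (or CPU/ROCm) backend stays hidden underneath.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
x = torch.randn(64, 784, device=device)

logits = model(x)                  # identical user code on any backend
print(logits.shape, logits.device)
```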

    • @Stef3m
      @Stef3m Před rokem

      That is an important point that is too rarely brought up

    • @kotokotfgcscrub
      @kotokotfgcscrub Před rokem

      ML frameworks came into existence later and were built on top of CUDA and cuDNN, and they are way more optimized for Nvidia even after starting to support other hardware.

  • @nabeelhasan6593
    @nabeelhasan6593 Před 2 lety +4

    I always wish there was a unified framework like CUDA for all platforms. NVIDIA's absolute monopoly on deep learning really makes things hard.

    • @Corei14
      @Corei14 Před 2 lety +4

      OpenCL. Now, making it work as well as the others is a different question.

    • @joshuagoldshteyn8651
      @joshuagoldshteyn8651 Před rokem

      How does it make things really hard? Simply use an Nvidia GPU with high batch sizes, or any CPU with low batch sizes?

  • @19smkl91
    @19smkl91 Před rokem

    6:24 I've seen people rubber-banding when stepping on, and even bugging out halfway up, usually getting hurt.

  • @BillHawkins0318
    @BillHawkins0318 Před 2 lety +31

    What I never understood is why NVIDIA attempted to cool the heat sink with a 3-cent fan, especially from 2001 to 2010. When said 3-cent fan goes out, the GPU can burn up. 🔥 As passive cooling has never been enough. We don't have to worry about AI; 3-cent fans will make short work of that.

    • @Tartar
      @Tartar Před 2 lety +3

      GPU fans are still very cheap these days.
      Hopefully the Asus Noctua collaboration is a sign of things to come: GPUs with premium, quiet cooling solutions.

    • @rayoflight62
      @rayoflight62 Před 2 lety +1

      A good percentage of computer failures from that period were because of the failed fan. It was that transparent plastic that melted all over the GPU heatsink. The GPU subsequently failed, usually shorting the +5 V bus...

    • @musaran2
      @musaran2 Před 2 lety +2

      By the time it happens, they consider the card obsolete and you are supposed to upgrade.
      I hate it.

    • @Bialy_1
      @Bialy_1 Před 2 lety

      "As passive cooling Has never been enough." Nope some cards got good passive cooling, and you always can replace the fan on your own when its starting to make noise...

  • @PlanetFrosty
    @PlanetFrosty Před 2 lety

    Good presentation

  • @valenganev5774
    @valenganev5774 Před rokem

    What do you think about the Fujitsu Celsius PC? Where do you place it among other PCs? What is the future of Fujitsu?

  • @birseyleryap
    @birseyleryap Před rokem

    that popping sound @8:53 from the lips

  • @estebancastellino3284

    I remember when the Nvidia software graphics accelerator card was the cheap option for those of us who couldn't afford a Voodoo card, the one that did come with a hardware accelerator. Voodoo was on about its fifth version by the time Nvidia brought out the GeForce chip.

  • @GhostZodick
    @GhostZodick Před 2 lety

    Your videos always have a low-frequency pounding sound in the background. Would you mind looking into that and trying to fix it in future videos? At first I thought it was something pounding in my house, but I later realized it was in your video, because I only hear it at certain parts of the video.

  • @green5270
    @green5270 Před 2 lety

    excellent video

  • @etherjoe505
    @etherjoe505 Před 2 lety +5

    Single Instruction Multiple Data 👍👍👍

  • @MikkoRantalainen
    @MikkoRantalainen Před rokem

    Great documentary as usual, but the bar graphs not starting from zero around 15:30 weren't very cool. The illustration made it appear as if Gaming is more than double the Data Center revenue, but when you compare the actual numbers, 3.22 vs 2.94, you'll quickly see that the difference is actually only about 9%!
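
    For reference, the arithmetic behind that figure: (3.22 − 2.94) / 2.94 ≈ 0.095, so gaming revenue was only about 9-10% higher than data-center revenue in that quarter, even though the truncated axis makes it look more than twice as large.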

  • @cfehunter
    @cfehunter Před rokem +1

    "Early graphics processing broke scenes up into triangles".... they still do.

  • @Jensth
    @Jensth Před 11 měsíci

    You were spot on with this one. After this came out everyone bought NVIDIA stock up like crazy.

  • @Palmit_
    @Palmit_ Před 2 lety +1

    thank you John. :)

  • @final0915
    @final0915 Před rokem

    12:35 haha i wonder what images they collected for non-hotdogs

  • @JohnKobylarz
    @JohnKobylarz Před 2 lety +6

    Excellent video. As someone who remembers when the GeForce 256 was launched, it’s amazing to reflect on how far they’ve come and how influential their tech has been on the world.
    Before GPU’s, PC gaming was a much different affair. Even looking at JPEG’s was an somewhat intense system task before GPU’s became the norm.
    I learned a lot from this video, and enjoyed it. It helps me connect the dots regarding how AI learning works.

  • @davidbooth8422
    @davidbooth8422 Před 2 lety

    Hi John. I love your videos! Do you have any possible connections that might want to manufacture a much better and cheaper smoke detector than is available today? I would love to explain how easy that would be to any technical person who would listen. I am not trying to make money either, just save lives.

  • @johndvoracek1000
    @johndvoracek1000 Před rokem

    I was wondering if you would mention Apple; then you did at the end, but not in the way I anticipated. Isn't Apple's M chip a move toward the same chip capabilities and architecture as Nvidia's, etc.?

  • @screwcollege8474
    @screwcollege8474 Před 2 lety +10

    Great video, now I understand why Nvidia is worth 600 billion in market value

    • @johnl.7754
      @johnl.7754 Před 2 lety +2

      Would have never thought that it would be worth over 3x Intel or any other cpu manufacturers.

  • @timswartz4520
    @timswartz4520 Před rokem

    That GeForce 256 made me very happy for a long time.

  • @bhavintoliabg4946
    @bhavintoliabg4946 Před 2 lety

    This one video made me respect NVIDIA's work more than any advert ever would.

  • @ADHD55
    @ADHD55 Před 2 lety +15

    Nvidia is what happens when the CEO is an engineer, not a short-term-thinking MBA.

    • @user-lx7kx1dd3q
      @user-lx7kx1dd3q Před 2 lety

      It's in Chinese blood. An engineer

    • @xraymind
      @xraymind Před 2 lety +5

      @@user-lx7kx1dd3q Correction, Taiwanese blood.

    • @user-lx7kx1dd3q
      @user-lx7kx1dd3q Před 2 lety +2

      @@xraymind there's no such thing as Taiwanese blood. Taiwan is a land not a race.

    • @ADHD55
      @ADHD55 Před 2 lety +1

      @@user-lx7kx1dd3q huh? NVIDIA is a American company not Chinese

    • @user-lx7kx1dd3q
      @user-lx7kx1dd3q Před 2 lety +1

      @@ADHD55 Since when did I say Nvidia isn't an American company? You talked about its CEO. And who do you think its CEO is? It's Jensen Huang, a Chinese from Taiwan who became a US citizen.
      Are you still a kid that I need to spell everything out for you???????

  • @ravindertalwar553
    @ravindertalwar553 Před rokem

    Congratulations 👏 and lots of love and blessings ❤️

  • @minecraftdonebig
    @minecraftdonebig Před 2 lety +2

    If I was in charge, all chip process engineers and associated people would be required to wear wizard hats, because this shit is insane magic.

  • @Tigerbalm338
    @Tigerbalm338 Před rokem

    To paraphrase a popular SNL skit:
    "Triangles baby! MORE TRIANGLES!"

  • @allcouto
    @allcouto Před rokem

    You guys completely forgot DOJO!

  • @gregsutton2400
    @gregsutton2400 Před rokem

    great info

  • @amdenis
    @amdenis Před 2 lety +4

    We use several multi-GPU (4-7 GPUs) AI workstations based on Nvidia's highest-end Tesla and RTX cards. Interestingly, Apple Silicon is already getting close to Nvidia from a price-performance standpoint, and is more capable per watt. It will be interesting to compare them when the pro desktop Apple Silicon based solutions are released later this year.

    • @666Tomato666
      @666Tomato666 Před rokem

      it will be more interesting when nVidia releases cards that use the same node as the M1 and M2 chips

    • @graphicsRat
      @graphicsRat Před rokem

      I think Apple GPUs have tremendous potential. An integrated processor where CPU and GPU share the same memory. No more need to copy data to the GPU for processing and sending the results back to the CPU.

    • @666Tomato666
      @666Tomato666 Před rokem

      @@graphicsRat You do know that this is basically the architecture for every game console on the market? It's not novel, and, no, it's not better for general purpose computing.

    • @graphicsRat
      @graphicsRat Před rokem

      @@666Tomato666 That may very well be the case but Apple is bringing it to PCs. I am also aware that mobile phones use the same technology. But as someone who has programmed GPUs I know how inefficient it is to shuffle data between the CPU and GPU.
      PS: I am not an Apple fan boy.
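
      The shuffling being described is the explicit host-to-device and device-to-host copy that a discrete GPU needs and a unified-memory design avoids. A minimal sketch (assuming the PyTorch package; the array size is made up):

```python
# Minimal sketch of the staging overhead discussed above: data is copied to the GPU and back.
import torch

x_cpu = torch.randn(4096, 4096)           # lives in host RAM

if torch.cuda.is_available():
    x_gpu = x_cpu.to("cuda")              # explicit host -> device copy over PCIe
    y_gpu = x_gpu @ x_gpu                 # compute runs in GPU memory
    y_cpu = y_gpu.cpu()                   # explicit device -> host copy to use the result
else:
    y_cpu = x_cpu @ x_cpu                 # a shared-memory design skips the staging step

print(y_cpu.shape)
```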

  • @ylstorage7085
    @ylstorage7085 Před rokem +1

    no offense... but the Y-axis @15:24... dude...
    That's what I call "how to make a 3 look 3 times bigger than another 3".

  • @Alphahydro
    @Alphahydro Před 2 lety

    That's pretty interesting, and only the tip of what we'll be able to accomplish with GPU horsepower.

  • @tonyduncan9852
    @tonyduncan9852 Před 2 lety

    Thanks for that. :)

  • @villageidiot8194
    @villageidiot8194 Před 2 lety +1

    Go do an article on Innosilicon, a mainland Chinese GPU maker. How far behind are they? Is there any hope for a third or fourth GPU player in the market space. Will Intel Arc be the 3rd player?

    • @justice929
      @justice929 Před 2 lety

      Lisa Su and Nvidia CEO are Taiwanese. same with TSMC

    • @villageidiot8194
      @villageidiot8194 Před 2 lety

      @@justice929 Don't know how Nvidia & TSMC entered the chat. I was asking about Innosilicon, they have Fantasy One GPU graphics card. From what I can gather, their offices are in Zhuhai, Wuhan, Suzhou, Xi'an, Chengdu, Dalian, Beijing, Shanghai, Shenzhen, London, Silicon Valley, Toronto. Note that only 3 offices outside of China (UK, US, Canada) and 9 offices in China. Their headquarters are in Wuhan, Hubei province, China.

    • @perforongo9078
      @perforongo9078 Před rokem

      I think a company like Innosilicon would do well in China itself because China is so protective of homegrown companies. But if I were to bet on a third player in the GPU market I'd bet on Intel.

  • @johnaugsburger6192
    @johnaugsburger6192 Před 2 lety

    Thanks so much

  • @Hansengineering
    @Hansengineering Před 2 lety +2

    Bees make honey by holding flower nectar in their mouths, then breathing with their mouth open until it dehydrates.

  • @Zarcondeegrissom
    @Zarcondeegrissom Před 2 lety

    cool stuff, yet not the only thing to have parallel compute. It seems a bit wasteful to require a kilowatt chip just to compute audio echo/chorus equalization, or even distance calculations for Doppler radars. Are the humble DSP chips all ancient things from the past? Hmm. It would be so nice to have a synthesizer with modern tech that isn't just a leftover from a bygone era like the Kurzweil K2000RS. Don't get me wrong, AI is cool, just like Sharon Apple from Macross; I just think some things get left behind and forgotten by new tech.

    • @rayoflight62
      @rayoflight62 Před 2 lety +1

      DSPs do calculations like a Fast Fourier Transform for noise cancellation, equalisation, or other fixed audio calculations.
      Nvidia cards do computations on matrices; they manipulate vector spaces, e.g. working out how to recognise a moving truck which is overtaking your car during a thunderstorm at night...

    • @Zarcondeegrissom
      @Zarcondeegrissom Před 2 lety

      @@rayoflight62 Can it be powered by a single 9 V battery and fit in a foot pedal for my guitar? Hmmm. I think something is left behind with everything going to Nvidia DGX rack units.

    • @muzlee7479
      @muzlee7479 Před 2 lety

      @@Zarcondeegrissom that’s two totally different things. Pointless comparison

  • @taimalik1110
    @taimalik1110 Před 2 lety +5

    This is essentially my stock market news channel! Thank you sir, I shall invest even more in NVDA :P

    • @DK-ox7ze
      @DK-ox7ze Před rokem +1

      Nvidia stock has taken a beating lately, just like many other tech stocks. I bought Nvidia a few months back for exactly the reasons mentioned in this video, but it's 42% down since then. Hope it recovers soon.