The FUTURE of NVidia

  • Published 31. 05. 2022
  • 20% coupon code: C30
    Win11 Professional 1 PC: www.ucdkeys.com/product/windo...
    Office 2021 Professional Plus CD Key: www.ucdkeys.com/product/offic...
    Windows 11 Pro + Office 2021 Pro Global Bundle: www.ucdkeys.com/product/windo...
    Windows 11 Home CD Key Global: www.ucdkeys.com/product/windo...
    Office 2019 Professional Plus: www.ucdkeys.com/office-2019-P...
    Win10 Pro: www.ucdkeys.com/Windows-10-Pr...
    Office 365 account: www.ucdkeys.com/Office-365-Pr...
    Support me on Patreon: / coreteks
    Buy a mug: teespring.com/stores/coreteks
    Bookmark: coreteks.tech
    My channel on Odysee: odysee.com/@coreteks
    I now stream at: / coreteks_youtube
    Follow me on Twitter: / coreteks
    And Instagram: / hellocoreteks
    Footage from various sources, including official YouTube channels from AMD, Intel, Nvidia, Samsung, etc., as well as other creators, is used for educational purposes in a transformative manner. If you'd like to be credited, please contact me.
    #nvidia #4090 #apple
  • Science & Technology

Comments • 590

  • @ctsd623
    @ctsd623 2 years ago +113

    Re: laptop sales shooting up in 2021, it largely has to do with the GPU shortage; it was a weird market where buying a gaming laptop was a much better bargain than buying a discrete GPU due to the greatly inflated prices on the latter. It wasn't a sustainable growth rate.

    • @phoenixrising4995
      @phoenixrising4995 2 years ago +13

      Or when a prebuilt in its entirety was cheaper than a single GPU. That's how I got my 3080 Ti and sold the parts to cover some of the GPU price inflation.

    • @itsryanski2355
      @itsryanski2355 2 years ago +3

      Though there was a GPU shortage, far more GPUs were made than in previous years (more than ever, in fact).

    • @kintustis
      @kintustis 2 years ago +2

      @@itsryanski2355 well they all went to miners anyways. And sadly a lot of them will end up as e-waste instead of being gamed on.

    • @paranoid9678
      @paranoid9678 2 years ago

      @@itsryanski2355 Well, Nvidia and online stores sold pallets of GPUs to miners; some even posted videos of it on Twitter.
      On AMD's store, two bots in Europe managed to buy 90% of all GPUs in one drop, and I can't remember how many months it has been since I last saw a 3080 Founders Edition available to buy on Nvidia's site.

    • @francopanigaia2425
      @francopanigaia2425 2 years ago

      @@itsryanski2355 That's why you're about to see 3090s going very cheap, with deflation in the GPU market. And they know it.

  • @ps3guy22
    @ps3guy22 2 years ago +47

    0:00 Imagine walking into your million-dollar data center and seeing your DGX server units start tweaking like that

    • @WhitmanEntertainment
      @WhitmanEntertainment 2 years ago +10

      Actual nightmare fuel. I thought I WAS tweaking when I started the video. What were the Nvidia presentation designers smoking?

    • @billr3053
      @billr3053 2 years ago +4

      Certainly looked bizarre. What is that supposed to represent?? It must only have meaning to marketing types and the artsy folks that came up with it.

    • @BBWahoo
      @BBWahoo 2 years ago +4

      Why imagine when you can take a high dose of mescaline and make it happen? :V

    • @swaggitypigfig8413
      @swaggitypigfig8413 2 years ago +2

      They look like the control rods in Chernobyl right before it blew up.

    • @revvilo
      @revvilo 2 years ago

      @@billr3053 It's musical. It's the latest video on their YT channel.

  • @jaredmcnamara6874
    @jaredmcnamara6874 2 years ago +68

    The reason Jensen loves the data center business is the higher margins. Much higher than anything consumer.

    • @tacticalcenter8658
      @tacticalcenter8658 2 years ago +5

      Yup

    • @davedimitrov
      @davedimitrov 2 years ago +11

      Not only margins, but guaranteed demand, fewer RMA issues, and that sweet paid software support money

    • @FirstnameLastname-py3bc
      @FirstnameLastname-py3bc 2 years ago

      Aside from top deals, enterprise is usually sold at low margins.

    • @johnsimon8457
      @johnsimon8457 2 years ago

      Amazon can charge by the hour for GPU-equipped EC2 instances, while consumer hardware is just money spent on entertainment. Of course Nvidia is going to charge whatever they like for data center hardware.

    • @ZorexZockt
      @ZorexZockt 2 years ago

      Can’t blame them.

  • @nowherebrain
    @nowherebrain 2 years ago +6

    I really dislike how they (Nvidia) count mining sales as contributing to "gaming"... horrible.

  • @dslay04
    @dslay04 2 years ago +10

    Just a note: the latest Infinity Fabric, as used in the MI250X, is 800 GB/s.

    • @jemborg
      @jemborg 2 years ago

      I thought that might be the case.

  • @slickjim861
    @slickjim861 2 years ago +58

    Every time someone said it wasn't the miners but higher demand because of the lockdowns, I rolled my eyes, knowing that it was in fact the miners, and that everyone who could (and had brains) was selling to miners because they were willing to pay double and triple the asking price for pallets of GPUs. It doesn't take a rocket scientist to figure it out.

    • @sylviam6535
      @sylviam6535 2 years ago

      There’s a ton of BS regarding microchips. They are still blaming working from home outfitting and lockdowns for the shortages, when people outfitted for that two years ago and most of the world is no longer doing lockdowns. I am getting tired of being lied to about everything.

    • @wertigon
      @wertigon 2 years ago +3

      You are correct that one of the main contributing factors to the crazy Q4 20 -> Q2 22 market was the miner surge. However, that was NOT the only reason. Other reasons include:
      * Chip shortage (pandemic -> industrial companies cancelling orders of old chips -> old manufacturing chains and nodes retired preemptively -> more strain on newer nodes)
      * The gamer market exploding due to lots of people being locked inside their homes
      * Nvidia and AMD not having foreseen the demand and simply not being able to make enough cards for a long while
      * Collapse of logistics, so both getting silicon to the factories and shipping the finished products was a nightmare
      So it was a perfect storm of low supply + broken logistics + high mining demand + even higher gaming demand + industry forcing closure of old nodes that created that situation. Did the miners contribute in large part? Yes, but the main reason is still the lockdowns, even though gamer demand is a large part of it.

    • @bjma654651651
      @bjma654651651 2 years ago

      Why did Jensen/Nvidia lie, though?

    • @Darth_Tater69
      @Darth_Tater69 2 years ago

      @@bjma654651651 To make themselves look better

    • @bjma654651651
      @bjma654651651 2 years ago

      @@Darth_Tater69 How so? They are embarrassed that their product is not being used for graphics?

  • @N0N0111
    @N0N0111 2 years ago +30

    After ramping up to many tens of billions of transistors, they are finally addressing the huge cache bottleneck!
    L2 and L3 caches will grow to some exponential sizes now, and that will bring much more performance than people knew bigger caches could bring to the table!
    The whole PC ecosystem is about to see a once-in-a-decade performance ramp-up from every component by the end of this year!
    Very exciting times for the PC Master Race 😁

    • @prashanthb6521
      @prashanthb6521 2 years ago +1

      I am more interested in CXL.

    • @elmariachi5133
      @elmariachi5133 2 years ago +4

      The whole PC ecosystem is shrinking, and will soon be back where it was in the 90s and even the 80s, where you had to pay a fortune for hardware because it was a very special thing for a very limited number of enthusiasts.

    • @Darth_Tater69
      @Darth_Tater69 2 years ago +8

      @@elmariachi5133 PC gaming is bigger than ever and exponentially more accessible, try again

  • @Stychinsky
    @Stychinsky 2 years ago +10

    27:44 >Western companies are not selling in Russia anymore.
    Idk, last I checked there's no shortage of GPUs, CPUs, or other components. Speaking as a person who built a new system last month.

    • @Frank451Martin
      @Frank451Martin 2 years ago

      Companies have probably found legal loopholes and are being subtle to avoid heavy fines. One way: the tech companies sell a lot of their stuff in China or India, and those nations pass it on to Russia to circumvent the sanctions.

    • @tacticalcenter8658
      @tacticalcenter8658 2 years ago +1

      Sucker. You probably paid too much for it.

    • @ascendantMethEnjoyer
      @ascendantMethEnjoyer 2 years ago +1

      Dumb question on my part, but is Nvidia a "western" company if everything they manufacture is made in the east? I figure things manufactured in china can just be shipped over to Russia easily. Has china jumped on the globohomo sanction bandwagon against Russia?

    • @tacticalcenter8658
      @tacticalcenter8658 2 years ago

      @@ascendantMethEnjoyer the usa used gain of function research to try and start war with Russia and China. Russia is shutting down the bio labs in Ukraine. China will do the next in Taiwan. So yes... China and Russia will get technology from each other. With that said... The next 6 months are going to be very interesting. Especially for those who deal in corruption.

  • @MrChriss000
    @MrChriss000 2 years ago +89

    The impact of ray tracing on gamers' spending was minimal; as mentioned below, it is a technology the hardware is not ready for. It might not be affordable for many years yet.
    The 'DLSS' feature is a hit with gamers, obviously, especially now it has improved, but as usual Radeon has come up with a similar feature.
    Basically, old gamers know that all they have to do when Nvidia makes something proprietary is sit back and watch. The list of failed gimmicks is now extensive.
    Did the GTX 10 series have onboard async compute hardware for DX12 / Vulkan?
    No. A deliberate move to shorten the lifespan of their cards.
    Jensen likely now has a right bum ache that a GTX 1070 can run FSR 2 and get 20% faster frames at similar quality to DLSS 2 without upgrading the card.
    Some toasts to Radeon about that from two pals who paid 340 quid for 1070s four years ago
    (mining boom probs for buying Vega).
    I would not trust a word of what comes out of Jensen's mouth.
    I have never bought a new Nvidia card (highest used spend was 17 GBP).
    I also don't invest in new games that suddenly need a $300 card to run 1080p 'high'; this is part of Nvidia's modus operandi.
    I may play them years later on newer hardware, but mostly not, 'cos they are 'a bit shit'...
    GREAT games are great games and will run on lower hardware; you don't need to go to the IMAX theatre to watch 12 Angry Men.
    Jensen AND Radeon can take their $300-$400 '60' cards and shove them up their AARHHHHHSE?

    • @samgoff5289
      @samgoff5289 2 years ago +5

      Wow dude most people would be embarrassed about wearing a tinfoil hat but you flaunt yours proudly 🤦‍♂️

    • @MrChriss000
      @MrChriss000 2 years ago

      @@samgoff5289 I built my first gaming rig out of used parts in 2003 and have been following developments in hardware ever since.
      I have 39 upvotes and you have zero, so either there are an awful lot of tinfoil hat wearers here or you don't have long-term experience of this market.
      czcams.com/video/H0L3OTZ13Os/video.html

    • @blkspade23
      @blkspade23 2 years ago +20

      @@samgoff5289 He's not wrong, and it's not tinfoil hattery. There is evidence in Nvidia's actions going back a very long time, if you've been paying attention. They employ strategies to force vendor lock-in like no other. Like when they acquired PhysX and integrated it into their GPUs: they enabled PhysX offloading to a second Nvidia GPU, but it was found to work with an AMD/ATI GPU as the main render card. Subsequent Nvidia drivers would then detect that you had an AMD GPU and disable HW PhysX. They punished their own customers for giving their competition money.
      The G-Sync module mostly only existed because investing in an expensive monitor with one ties you to buying only their cards for their closed version of adaptive sync. That module has always been absent from every laptop that offers G-Sync, because laptops just used the eDP-based adaptive sync standard. Those obviously get replaced wholesale. It took Nvidia way too many years to enable adaptive sync support for non-G-Sync desktop monitors, even though their cards were capable the whole time.
      The GeForce Partner Program might as well have been considered a page from the Intel playbook.

    • @CommandoTM
      @CommandoTM 2 years ago +1

      @@blkspade23 This is all true.
      IIRC, another thing about the PhysX block that's similar to the G-Sync situation is that they eventually gave up, sometime around the 10 series. I recall seeing a few Polaris/Vega cards paired with GTX 1050/1050 Tis for a PhysX boost in Batman: Arkham Knight.

    • @arenzricodexd4409
      @arenzricodexd4409 2 years ago

      @@blkspade23 "but it was found to work with an AMD/ATI GPUs as the main render card."
      People did not find this by accident. Initially Nvidia did allow it, but the thing is, PhysX is Nvidia tech. In PhysX's case it was better for Nvidia to block the hybrid setup before it escalated into something more complicated, like a lawsuit: if Nvidia officially blocks hybrid setups, people can't sue Nvidia when a hybrid setup doesn't work as it should. Hacked drivers to make the hybrid setup work again do exist. Plus, Nvidia only officially blocked hybrid setups when AMD showed pretty much no intention of working with Nvidia to improve them, because back then AMD was kind of banking on Bullet Physics being a PhysX killer, as Roy Taylor mentioned.

  • @kintustis
    @kintustis 2 years ago +9

    Apple's foray into shared memory between the GPU and CPU rings almost exactly like what AMD was attempting to do with HSA and APUs..... 10 years ago.

    • @stevenhe3462
      @stevenhe3462 2 years ago +2

      "Foray" is not right. The fruit company has been making phone and tablet chips for 10 years.

  • @marksapollo
    @marksapollo 2 years ago +95

    I think the time for ray tracing is yet to come. It was promised with the new consoles, yet games with it don't look a whole lot different, and it comes with a hit to FPS. Nvidia should also update the Shield TV, as it's fallen far behind the competition now.

    • @scarletspidernz
      @scarletspidernz 2 years ago +8

      Ray tracing won't hit its stride for another 2-3 generations, until mid-range GPUs can hit 1440p 60 fps with ray tracing on in even the most demanding games. We're not there yet, and that's what consoles need to hit as a baseline in their next gen.

    • @Durkhead
      @Durkhead 2 years ago +6

      Consoles have small AMD GPUs, and regular desktop AMD GPUs can barely handle ray tracing, so what did you expect? Xbox and Sony basically lied; Gran Turismo is proof: there's only ray tracing in the replays, not in actual gameplay.

    • @scarletspidernz
      @scarletspidernz 2 years ago +6

      Everyone knew ray tracing isn't feasible at mid-range; remember, Nvidia was pushing the narrative of ray tracing for everyone.
      If anyone believed consoles could do full-on ray tracing, then they only have themselves to blame for not doing their research, as the desktop GPU space showed the 3060/3060 Ti can barely do it.

    • @SilverJoystix
      @SilverJoystix 2 years ago +1

      I have yet to see a single argument for why ray tracing is needed in games *at all.* I don't even think you know why you think you need ray tracing.

    • @photonboy999
      @photonboy999 2 years ago +27

      Ray tracing is here to stay...
      The MAIN two advantages are:
      1) Game development is far easier (or will be, at least once a game dev does ONLY ray tracing and leaves baked lighting behind. ugh.)
      2) It sells GPUs
      ANYBODY who understood how games are made realized that ray tracing would be SLOW to integrate into the ecosystem. Optimally, you choose the lighting/shadow issues that ray tracing is best suited for: certain instances of SSR, for example... and as your rendering budget increases over time, you add more ray-traced features. Eventually games will require ray-tracing-capable graphics cards because developers won't spend the time to pre-bake the same elements with traditional raster techniques. It comes down to the percentage of GPUs that support ray tracing at suitable processing capability.
      So we're going to see TWO huge trends for PC gaming (and console):
      1) the PERCENTAGE of ray-traced elements will increase
      2) MACHINE LEARNING will be increasingly used for UPSCALING and to figure out ways to make the main code more efficient
      *Diminishing Returns*
      A few years ago I would have said this was a HUGE problem in just a few years. Just look at the BEST-looking game on PC today: something that merges, say, photogrammetry and a few ray-traced elements. It's getting close to photo-realism, isn't it? Sure, you can push a higher FPS, but that only gets you so far, and as game engines and AI get better, 3x the frames won't cost you 3x the performance. It's already not scaling like that due to reusing temporal data. We might EVENTUALLY see 180 FPS come down to 1.5x the GPU processing cost of 60 FPS. It will vary a lot, though.
      BUT... ray tracing has its own diminishing returns. Do we need FULL ray tracing if the traditional raster + ray tracing techniques get really good, and newer game engine tools mean rebuilding the lighting isn't inefficient for DEVELOPERS any more? We can use ray tracing to pre-bake the lighting in real time, for example, so that developers see the final product and it doesn't even affect them.
      FULLY ray-traced is probably where we're headed, but at some point, whether it's fully ray-traced or not, we're going to hit rapidly diminishing returns. There are OTHER things than just the obvious visuals, such as COLLISION detection at a finer scale, better fluid simulations, etc., but I'm not sure they will add much load to the GPU for most games.

  • @billykotsos4642
    @billykotsos4642 2 years ago +26

    How many GPUs are they planning to sell to miners this time?

    • @hexedits369
      @hexedits369 2 years ago +4

      Proof of stake is the future, so hopefully not as many.

    • @amorgan20111
      @amorgan20111 2 years ago +2

      (Dr. Evil voice) one million GPUs!

    • @exception05
      @exception05 2 years ago +5

      Most of them. They're the main buyers now. Whoever pays sets the rules.

    • @amorgan20111
      @amorgan20111 2 years ago +3

      @@exception05 In reality, mining demand for GPUs is going to dry up due to a downturn in profitability and ETH (the most profitable cryptocurrency to mine) moving to PoS. So honestly, I don't see miners making up a huge part of demand within the next 18 months.

    • @Safetytrousers
      @Safetytrousers 2 years ago +5

      Mining is currently dead. After a year of mostly making a living on it, I have totally stopped mining. Mining revenue has tanked, and this is before Ethereum proof of stake. You have no idea what is going on if you think there will be a continued high demand for GPUs for mining.

  • @korhanuran
    @korhanuran 2 years ago +4

    People underestimate AI's role in future game graphics. DLSS and denoising were just the beginning.

    • @miguelpereira9859
      @miguelpereira9859 1 year ago +2

      Wait until Nvidia starts integrating AI generated faces into 3d game models, with Ray tracing on top of that. I agree with you.

  • @NatrajChaturvedi
    @NatrajChaturvedi 2 years ago +5

    The functional design of the 4000 cards may very well be exactly the same as the 3000-series reference cards, but I very much doubt they will keep the same color scheme and not change any of the plastic and other design bits to make it seem fresh and new to casual buyers.
    Though they may be testing the 4000-series chips with previous coolers for now, so it's hard to say whether it's a genuine 4090 Ti pic or not.

    • @zeNUKEify
      @zeNUKEify 2 years ago

      True, it could be like the small change in 980 vs 1080 generations, or something totally different

  • @denvera1g1
    @denvera1g1 2 years ago +39

    26:12 Nvidia might be starting off with the 4080 Ti because it's not the top card (it's a 102 die, after all; they have an entirely untapped die for consumers in the 100 die).
    They have plenty of room to grow not only with the GeForce line but also the Titan line, with a TitanADHD, Titan Ti, and Titan Tie.

    • @andreirachko
      @andreirachko 2 years ago +10

      TitanADHD? Waiting for TitanOCD, TitanASD, and TitanBPD xD

    • @denvera1g1
      @denvera1g1 2 years ago +2

      @@andreirachko I was referencing the Nvidia keynote where the guy said 3090Tie and 3090Ti in the same sentence

    • @francopanigaia2425
      @francopanigaia2425 2 years ago

      The 4080 Ti would have been called a 4070 just a few years ago. You get a bigger number and a higher price for a lower-tier card at every release. How can you blame them? They are selling you the $500-tier card for $2000 now, and the real Ti comes out as an ultra TITAN at $3500.

    • @CMDRSweeper
      @CMDRSweeper 2 years ago +2

      Nah! Can't be a Titan ADHD... Microsoft stole the ADHD a long time ago...
      The cause of it is when you run Active Directory on a High Definition monitor! :D

    • @denvera1g1
      @denvera1g1 2 years ago +1

      @@francopanigaia2425 What used to be the 80 class sold for $499; now the full 100 die cannot be found in a GPU with video out at all, being locked into an enterprise-only role, and it costs ~$12k USD.

  • @llllllll4206
    @llllllll4206 2 years ago +2

    Small correction - Colette Kress is the Executive VP / CFO and not the CTO.
    The CTO of Nvidia is Michael Kagan :)
    Awesome vid as always

  • @ziyadt362
    @ziyadt362 2 years ago +34

    Can't imagine the amount of effort that was spent making this video...
    It feels illegal to watch it for free.
    👌.

    • @samlebon9884
      @samlebon9884 2 years ago +4

      They say there's no free lunch, but with Coreteks you get breakfast, lunch, and dinner for free.

    • @Instant_Nerf
      @Instant_Nerf 2 years ago +1

      Sounds like you're easily robbed

    • @isthisme-ct9fe
      @isthisme-ct9fe 2 years ago +1

      This is not free; they get paid by Google.

    • @fleckx
      @fleckx 2 years ago

      @@isthisme-ct9fe and affiliate marketing.

  • @ashishpatel350
    @ashishpatel350 2 years ago +22

    Really hope the GPU price collapse will bankrupt the flippers 🤣

    • @stillnotchill2560
      @stillnotchill2560 2 years ago

      The crash won't last. And most flippers already made their profit. At this point they are just cashing out.

  • @anasevi9456
    @anasevi9456 2 years ago +9

    The "not selling in Russia" is completely meaningless. China will sell Russia whatever they want: Nvidia GPUs, AMD CPUs, whatever, with next to no markup in exchange for favourable fuel rates. They chose their side, and it's not us, and no Western company can afford to cut off China.

    • @tacticalcenter8658
      @tacticalcenter8658 2 years ago

      Its because the biolabs that the USA funded and made c nine teen with.

  • @El.Duder-ino
    @El.Duder-ino 2 years ago +3

    You are very right: faster interconnects, and getting all processors (ideally as SoCs) and memory close together at speeds approaching on-chip cache, are where we will see major improvements. From a chip manufacturing perspective it's very logical, as the chiplet approach, which means improvements in packaging and interposer interconnects, is in tune with future improvements.
    Apple's M1 shows what we will see with the PC platform as well, and Nvidia will push that their own way with their Arm-based processors, GPU/AI accelerators, and Mellanox-based interconnects.

  • @Alperic27
    @Alperic27 2 years ago +5

    Unreal 5 is showing that hardware RT is not currently a big help for real-time global illumination... software RT is currently better at it. Apple's GPU cores are currently a pain in the neck: not fast enough, and harder to optimize for (TBDR means having to change the render strategy... explaining why UE5's Nanite doesn't work on it).

  • @anishrko1
    @anishrko1 2 years ago +4

    Insightful video. Not meaning to nitpick but Colette Kress is CFO not CTO

  • @justynpryce
    @justynpryce 2 years ago +8

    You're forgetting about these newer techniques that Nvidia is bringing into the space, like ReSTIR. It is an absolute game changer for ray tracing performance and could very well prepare people for the future of high-quality ray tracing on less expensive/older cards. Also things like DLSS and Nanite making rendering more complex scenes easier. I think you may find yourself very surprised with gaming very soon at least once this generation finally starts. Which IMO is when UE5 games come out while implementing ReSTIR and DLSS. I mean, this has potential to be current 3090 performance on a 2070 or lower when all is said and done. . . So, with the amount of expertise they put into developing DLSS and ReSTIR, I don't think RT is going anywhere in their minds or in ours.

  • @vfn4i83
    @vfn4i83 2 years ago +3

    With RTX cards still consuming 300+ W, no, it will never catch up.

  • @realonezx4841
    @realonezx4841 2 years ago +1

    Let's see if Jensen will continue killing it. With the current inflation and energy prices, if a 4090 pulls 600 W and a 4080 500 W, gamers will not even be able to afford any of the new cards, not only because of the higher MSRP Nvidia will set but also because of higher energy prices.

  • @michahojwa8132
    @michahojwa8132 2 years ago +6

    Remember the game F.E.A.R. and its soft shadows? They tanked half your performance; the next gen it was nearly free. I believe 2x ray tracing performance is enough to make current RT games play all right and achieve at least 1080p 60 fps in new RT titles on a 4090.

    • @MsMarco6
      @MsMarco6 1 year ago

      You can already achieve 1080p 60 in most RT titles.
      With DLSS I can play almost all RT titles at 60 fps 1600p on my 3070 laptop, the exception being Spider-Man, but that has far more to do with the sky-high CPU requirements that no laptop can realistically reach right now.
      The idea that RT makes games unplayable or barely gives a graphical difference may have been true during the 2000 series, but isn't anymore (at least not for the 3060 and up).
      I honestly don't know why Coreteks seems so against RT; improving the lighting model has always been a major pillar of 3D graphics advancement, and RT is the logical next step. You can dislike Nvidia for their pricing, but investing heavily in ray tracing seems to me to be the logical move.
      But then again, Coreteks even seems to talk down on DLSS (seriously, who doesn't like DLSS?). Coreteks might say no one's buying an RTX card for it, but it's why I chose Nvidia over AMD. So idk, maybe Jensen kicked his puppy or something.

    • @michahojwa8132
      @michahojwa8132 1 year ago

      @@MsMarco6 I also like ray tracing; I'm giving up half the performance of my 6700 XT to use it. What I meant is that this isn't full ray tracing, because those are limited effects and a lot of cheats. RT may finally become ray tracing as John Carmack described it: you don't do lighting, don't prebake shadows, don't fall back from RT reflections; all is done for you as a single effect (path tracing), and it works well no matter the load. If this becomes common, then a 4090 would give you 1080p 60 fps in every game, but I wouldn't bet it would play at higher resolutions natively. Maybe this is the tech in Cyberpunk's Overdrive RT, as the number of elements/lights/shadows may be 5000x larger, judging from Nvidia's press info. I don't like Nvidia, but I respect their work and achievements and have bought their cards multiple times. As for Coreteks: he doesn't like ray tracing that much, like MLID didn't like DLSS 2; when a game with proper support came, MLID started admiring DLSS and wants DLSS 3 in his new card. It's a matter of acknowledging the added value in the games the guy likes. I'd say it's coming inevitably :).

  • @Deeps__
    @Deeps__ 2 years ago +2

    Tbh I've just resigned myself to the fact that the hardware sector is a triad of scumbaggery. I'll buy purely on price-to-performance now, regardless of manufacturer.

  • @lokeshg9929
    @lokeshg9929 2 years ago +1

    Nvidia's CTO is Michael Kagan, dude. Colette Kress is the CFO.
    CTOs don't go around reflecting on or estimating sales and growth, especially in an investor call.

  • @username65585
    @username65585 2 years ago +1

    A quick check using the Steam hardware survey showed me that 37.64% of Nvidia GPUs are RTX models. There might be some RTX models in the "other" category, but there is no way for me to check that. Also, I might have messed something up, because I had to wrangle the data into a spreadsheet since the hardware survey data isn't exposed as raw data. Across all GPUs, 27% are Nvidia RTX.
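The share calculation described in that comment is easy to reproduce once the survey rows are in tabular form. A minimal sketch, assuming you have (model name, overall share %) pairs; the models and percentages below are made-up placeholders, not actual survey figures:

```python
# Sketch: compute the RTX share among Nvidia GPUs and among all GPUs
# from Steam-hardware-survey-style rows (model name, overall share %).
# The numbers here are hypothetical placeholders for illustration.

survey = [
    ("NVIDIA GeForce RTX 3060", 5.0),
    ("NVIDIA GeForce GTX 1650", 6.0),
    ("NVIDIA GeForce RTX 2060", 4.0),
    ("AMD Radeon RX 580", 2.0),
    ("Intel UHD Graphics 630", 3.0),
]

# Partition by vendor/model-name substrings, the same way you would in a spreadsheet.
nvidia = [(name, pct) for name, pct in survey if "NVIDIA" in name]
nvidia_total = sum(pct for _, pct in nvidia)
rtx_total = sum(pct for name, pct in nvidia if "RTX" in name)
all_total = sum(pct for _, pct in survey)

rtx_share_of_nvidia = 100 * rtx_total / nvidia_total  # RTX % among Nvidia cards
rtx_share_of_all = 100 * rtx_total / all_total        # RTX % among all cards

print(f"{rtx_share_of_nvidia:.2f}% of Nvidia GPUs are RTX")
print(f"{rtx_share_of_all:.2f}% of all GPUs are Nvidia RTX")
```

With real survey rows substituted in, the two printed figures correspond to the 37.64% and 27% stats quoted in the comment.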

  • @mylescalladine8693
    @mylescalladine8693 2 years ago +2

    You also forgot we're now in a recession and the whole stock market is crashing.

  • @gettcouped
    @gettcouped 2 years ago

    Great video!!! Imagine HBM memory gets cheap and we can have an HBM unified memory structure

  • @ridiculous_gaming
    @ridiculous_gaming 2 years ago +1

    My first Nvidia GPU was the 16 MB TNT, a desktop card with integrated 2D/3D. It was less powerful than a 3dfx card for 3D gaming, but it was convenient.

    • @a266012
      @a266012 1 year ago +1

      I remember this one; my dad's Pentium II-400 had one, too. It was just as good as the Voodoo II or Banshee in OpenGL games like Half-Life (and looked a bit better, as textures weren't messed up as often), but in Glide-optimized games like Unreal it still could not compete with the Voodoo cards of that generation. But it was a turning point of sorts: Glide-optimized games declined while OpenGL- and Direct3D-based games took over the market, and Nvidia began its rise. My first Nvidia card was the TNT2 (the Model64, with memory bandwidth reduced to a 64-bit bus; it was like the xx60/xx70 series nowadays: fast enough, but affordable), which I paid around 120 bucks for, while the fully equipped version went for around 200 bucks. Great times 😅

  • @arcadealchemist
    @arcadealchemist 2 years ago +1

    Yeah, Nvidia is really forgetting who got them where they are today.
    But that's happening with most corporations now; they forget the user base who built them and suck at the tit of whoever is paying the most.
    This is normally when companies make the most mistakes and fail.

  • @workaccount1234
    @workaccount1234 2 years ago

    Could mining-specific GPUs be grouped into data center at this point?

  • @Schroinx
    @Schroinx 2 years ago +2

    The weak link is the CPU, as Windows is dominant and gaming relies on x86 chips. Both AMD and Intel will do anything they can to keep him out of the CPU market. Jensen will have to fork out some cash to MS and the game developers to make things RISC-V or Arm compatible.

  • @ag687
    @ag687 2 years ago +10

    Nvidia has been more of a software than hardware company for like a decade now. Even before the AI stuff. It's why they've enjoyed higher margins than hardware alone for ages now.

    • @vmafarah9473
      @vmafarah9473 2 years ago +3

      Jensen knows how to set bait and catch all our pockets. He uses AI and ray tracing for that.

    • @ag687
      @ag687 2 years ago +1

      ​@@vmafarah9473 Raytracing and related stuff has a long runway of upgrades before all the possible settings it can possibly include can be turned on. The AI stuff also leaves room for performance improvements on older gen cards.

    • @estring123
      @estring123 Před rokem

      No, the vast majority of sales is still in GPUs; they're a hardware company.

    • @ag687
      @ag687 Před rokem

      ​@@estring123 That was not a comment on software sales.
      They are making their hardware more useful with software which makes the hardware command a premium price. Think DLSS, NVLink, Drivers and everything else software that gets people to want to use their GPU over a competitor. Software adds value to the hardware so they make more on the sale of the hardware.

    • @estring123
      @estring123 Před rokem

      @@ag687 amd and intel have been asleep
      nvidia is still the biggest bubble in the market, it's far worse than tesla.
      DC growth makes investors think nvidia should be priced to dominate the galaxy. but DC growth missed this quarter, and more importantly this parabolic DC/cloud explosion will fall back to earth; just like every other business cycle, this explosion in DCs/cloud is cyclical. covid and infinite money printing have resulted in extreme pull-forward demand of DC revenue.

  • @billschannel1116
    @billschannel1116 Před 2 lety

    Video card sales slow down approaching a refresh. That's one of the reasons they say Q2 will sell less.

  • @lencas112
    @lencas112 Před 2 lety +3

    You seem too optimistic about $300 cards. I doubt we get that good RTX performance for $300. $300 is now low end; mid tier is more like $400-500. And I doubt inflation will allow it to be cheaper than before.

    • @samlebon9884
      @samlebon9884 Před 2 lety

      Maybe the author is taking inflation out of the equation. The $300 of 10 years ago becomes $400, so it's time to move on.
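The adjustment this reply describes is just compounded inflation; a minimal sketch (the flat 3%/yr rate is an assumption for illustration, not a quoted figure):

```python
# Hypothetical sketch (not from the video): adjust a historical GPU
# price by compounding an assumed flat average annual inflation rate.

def inflation_adjusted(price, years, annual_rate=0.03):
    """Compound an assumed flat annual inflation rate over `years`."""
    return price * (1 + annual_rate) ** years

# $300 from ~10 years earlier, at an assumed flat 3%/yr:
print(round(inflation_adjusted(300, 10)))  # -> 403, i.e. roughly $400
```

With these assumptions, the comment's "$300 becomes $400" checks out almost exactly.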

  • @indigoskywalker
    @indigoskywalker Před 2 lety +1

    "the network is the computer" Sun Micro Systems

  • @cryptochrome_original
    @cryptochrome_original Před 2 lety +1

    When Jensen refers to "the channel", what he is referring to is distributors (who resell to resellers, who resell to consumers). So no, Jensen has in fact absolutely no idea how many GPUs went to miners. Just pointing this out.

  • @Defender2516
    @Defender2516 Před 2 lety +2

    When you realize Ukraine got a 40 billion aid package and Nvidia only makes 10 Billion a year.

  • @issacehowardjr679
    @issacehowardjr679 Před 2 lety +1

    Average gamers cannot afford the prices for the new GPUs. Nor do they wish to pay more than $300.00. Period. I'm sure they'll go a different direction... that's obvious. We'll simply go elsewhere.

  • @marcin_karwinski
    @marcin_karwinski Před 2 lety +1

    While nVidia certainly has the engineering to build a Grace-Hopper-inspired SoC for desktops/laptops, I'm afraid I need to doubt them actually doing so. Compared to Apple, which has had their own OS with their own APIs for everything already ingrained in the ecosystem developers' toolboxes, and only needed a Rosetta feature to translate/emulate X86 binaries on the ARM ISA for the MacOS ecosystem to work well and offer nearly the same app support as previous X86 incarnations, nVidia would have to devise both the hardware and the software ecosystem.
    In the DC/server market, where Grace-Hopper was born, linux and unix OSes are mainstays and a lot of apps are already packaged for ARM or available as source code for ARM-targeted compilation, so the burden of the ARM software stack is not exactly in nVidia's hands. Major SW vendors would always prep an ARM-native variant of their offerings for big DC customers if it means gaining or losing big money... And the GPGPU compute components usually use nVidia-sourced APIs, so only nVidia needs to provide a new version of the APIs that works with the hardware of their own devising.
    Desktops and laptops are ruled by different tenants currently, so in order to be successful - and nVidia won't enter a market unless it's sure to bring profits with as little work as possible on their side - nVidia would have to either compromise or provide a fuller system stack on their own. What do I mean by that? If nVidia wants to offload OS development/support somewhere, they have 3 basic options: either they go with Microsoft, or they go for some linux distro, or they utilise Android.
    In the case of the first one, bearing in mind how MS treats Win on ARM and how much support this product offers, I'd say going with Redmond as the SW-backing provider would heavily limit the laptop/desktop core clientele due to woeful gaming and other software support/availability under the Win on ARM ecosystem.
    Going with the second option, while there are ARM builds for many major players, for nVidia to appease their possible desktop clients they'd have to devise an X86-to-ARM translation/emulation solution with negligible impact on performance or stability of the running apps, something like a Proton on steroids, as it would have to be capable of both translating Windows API calls to linux and X86 binaries to ARM... And at that stage, it would probably be easier for the software engineers to simply build a linux stack from the ground up as their own distro specifically tailored for their running gear. Still, given the Proton compatibility issues Valve and linux as a whole have with just the Windows-API-to-linux-API translation and software support amongst game studios/devs, going for an even more extensive translation layer between CPU ISAs would be a very arduous task.
    That leaves us with Android/ChromeOS, with a lot/majority of software already devised with the ARM ISA in mind, but also mostly ARM GPUs in mind, wherein traditional desktop/laptop users would not feel too much at home with the limited features the ecosystem offers for desktop-like workloads. Still, it's probably the best bet for nVidia: with a power-sipping efficient design with a lot of horsepower under the hood for gaming, they might be able to spin it off into an Android-based desktop-like experience for mobile gaming on ultrabook-styled or minipc-sized gear. Though at this point they'd be fighting competition from the tablet/smartphone heavy hitters, not counting all the Android, and also Windows, "handhelds".
    On the other hand, nVidia has the potential to disrupt the market with its own OS that eg. translates Win API calls and the X86 ISA etc. and supports eg. some/a majority of application/game stores - while Epic or GOG could help with software stack creation, MS could potentially be negative with their Xbox/Microsoft store integrations, and Valve would certainly prefer customers go for their SteamDeck instead, so the Steam store might also not be on board... And that severely limits potential software availability and thus hits/cuts off the potential customer base to less than half...
    As such, I really believe that they might try to spin the Android for "desktop" experience, whether in a mobile notebook variant or an immobile, tethered-to-the-desk mini cube brick. However, at this point it's more likely the SoC actually gets built for a new "Grace(ful) Shield Ti(e)" to combine said Android features with their ShieldTV customer base and an encroachment into the Shield gaming console for the living room usage pattern, which could then be further extended through eg. nVidia's own cloud gaming services to further boost the compatibility/value factor (eg. by offering lower pricing plans for said revised Shield-like product). Once nVidia takes a foothold in your rooms, and with heavy possible DC/server market profits (either through direct sales or eg. said cloud-gaming service), they'd have both the top of the food chain and the bottom served, after which they might successfully sell the idea of a laptop solution that is quite compromised from a regular laptop/desktop standpoint but that people might want to invest in. Of course, they might also go big and do a HEDT platform for pros to utilise DC designs in a desktop with a pro-oriented SW stack prepared, thus possibly also not requiring a lot of work to pour game-compatibility aspects into the platform, just focusing on eg. media creation and AI/ML aspects, same as Apple attempts to tempt the creators into their ecosystem...

    • @ag687
      @ag687 Před 2 lety

      That wall of text... I tried to read it...
      My thoughts on the Grace/hopper thing for now is so nvidia can build systems without using AMD/Intel. That gives Nvidia more flexibility when building their custom servers and a bigger profit margin. They are already focused on building AI/software/hardware ecosystem. I think it'll be years before we see desktop/laptop processor from nvidia.

    • @marcin_karwinski
      @marcin_karwinski Před 2 lety +1

      @@ag687 TL;DR version for the reading-impaired ;) :
      1) nvidia can do regular/gamer-oriented laptop/desktop SoCs but they have to provide the software stack (either an OS or translation layers) to tempt ordinary joes into their budding laptop/desktop ecosystem, so the cost/benefit calculus is not in favor,
      2) apple had it easier cause they have had their own platforms/APIs for years enforced onto the ecosystem devs,
      3) nvidia can easily do a specialised pro-oriented HEDT variant or mobile workstation by focusing on their AI/Graphics "studio/production" stacks akin to apple's target-group marketing - think DC-/server-grade equipment in a lighter variant, but not as basic as their devkit of the Jetson family,
      4) nvidia may create such a SoC that sits in the middle between GH and Jetson, though at a far lower spec end than 3), as a beefed up/refreshed variant of the Shield line, a kind of living room pseudo media-consumption light-gaming/emulation-device Android nettop to rival small consoles, miniPCs and *TV nettops, since the software stack is largely there and the market is not yet overcrowded.

  • @Acetyl53
    @Acetyl53 Před 2 lety +4

    It just hit me, NVlink phonetically is Envy Link. Is Nvidia... Envy-dia? Huh.

    • @8est8its89
      @8est8its89 Před 2 lety +1

      Yep
      Hence why they use the eye of the green eyed monster as their logo.
      It's not a coincidence and yes.. they are a shitty evil company.

    • @samlebon9884
      @samlebon9884 Před 2 lety

      Nvlink is useless in an X86 system.

    • @Acetyl53
      @Acetyl53 Před 2 lety +2

      ​@@8est8its89 I was aware of the phrase "green with envy" and read someone's article "the green eyed narcissist" the other day. Looked up green eyed monster as well, seems to be a personification of jealousy. Interesting. I have green eyes, and never thought of this. I'll have to dig deeper into the origins.
      YouTube hid your comment by the way. I agree, they are a terrible company. All of them are. AMD has been seen as the underdog and "good guy", but as they build up again they'll just flip too.

  • @platin2148
    @platin2148 Před 2 lety +1

    So it sounds like an SGI ending?

  • @jklappenbach
    @jklappenbach Před rokem

    Everyone assumes, incorrectly, that Moore's law is based entirely on predictable, iterative updates on existing processes.
    Nope.
    In most cases, that assumption is correct. But the key moments where exponential growth is achieved are due to innovation and dramatic process changes. There are quite a number of dimensions that play a role: materials, optics, architecture, physical structures, etc. Over the historic record, these jumps are smoothed into an exponential curve. All of these jumps are due to non-linear innovation.
    There is a finite limit for processing: it's the speed of light through a computing medium. We also have a lower limit on our ability to interact with matter (store a bit): the Planck length.
    Short of that, we have a long way to go.
    "Moore's Law is dead" -- How quaint.

  • @kennyj4366
    @kennyj4366 Před 2 lety +1

    Very nice video. Lots and lots of interesting information. Thank you. 👍

  • @ut2k4wikichici
    @ut2k4wikichici Před 2 lety +33

    nvidia has done bugger all for gamers for 2 years, turing sucked, ampere sucked, but I never got to find out first hand because they're still being scalped, so I went out and got a 6900xt

    • @magnomliman8114
      @magnomliman8114 Před 2 lety +6

      ''sucked'' right... said the AMD BUYER

    • @moonw0man
      @moonw0man Před 2 lety +4

      bruh AMD can't even fix their driver issues what the hell have they done for gamers

    • @positivevibes532
      @positivevibes532 Před 2 lety +1

      If Ampere was so bad why did you decide to buy an even worse architecture with awful driver support? lol

    • @Simon-gc6uf
      @Simon-gc6uf Před 2 lety +1

      @@moonw0man Yeah keep parroting that line over and over without giving any details.. What driver issues specifically are you referring to?

    • @Simon-gc6uf
      @Simon-gc6uf Před 2 lety

      @@positivevibes532 You got the wrong name mate, you're not giving off any positive vibes, just another wannabe know it all who tries to validate their existence by worshiping at the shrine of nvidia... Woohoo great work..... lol

  • @Noir1234
    @Noir1234 Před 2 lety +1

    21:50 - Because miners needed GPUs, so they bought gaming laptops, and normal people also bought them because they were cheaper than a desktop system.

  • @StreetComp
    @StreetComp Před 2 lety

    As far as the installed RTX base being 1/3 of their cards that’s not a surprise if you’ve been watching that Steam user hardware page for years. As an enthusiast who keeps a desktop relatively up to date, I remember being stunned years ago when saw what components most people were using

  • @Chris3s
    @Chris3s Před 2 lety

    Where is ucdkeys based at and where do they get their keys?

  • @neilbrideau8520
    @neilbrideau8520 Před 2 lety +33

    NVIDIA + ARM was a frightening prospect. NVIDIA seems a little lost right now and AMD is breathing down their necks in both gaming AND datacenter applications. This should be interesting as NVIDIA has proven to be a street fighter with no morals.

    • @ut2k4wikichici
      @ut2k4wikichici Před 2 lety +5

      AMD CPU + INTEL GPU master race lmao

    • @emeraldcelestial1058
      @emeraldcelestial1058 Před 2 lety +3

      @@ut2k4wikichici lmfao

    • @Acetyl53
      @Acetyl53 Před 2 lety

      All corporations are fundamentally amoral, full spectrum dominance is the name of the game. Anything that doesn't organize itself this way is erased. The modern world and in large part every modern person is a product of this natural order.

    • @ut2k4wikichici
      @ut2k4wikichici Před 2 lety

      @@Acetyl53 mmmhmm

    • @tacticalcenter8658
      @tacticalcenter8658 Před 2 lety +2

      Arm should stay independent.

  • @Iswimandrun
    @Iswimandrun Před 2 lety +4

    Crypto on GPU is dying; let it die, it's a waste of hardware. Ethereum, in my opinion, will fail hard at moving to proof of stake.

  • @Luminasonique
    @Luminasonique Před 2 lety

    i think they were just waiting for unreal engine 5 to be ready for the next gen, i just hope they are going to contain the power consumption at moderate levels (around 200-250 W would be great).

  • @Steamrick
    @Steamrick Před 2 lety +1

    The margin on nvidia's earnings call is just insane...

  • @MozartificeR
    @MozartificeR Před 2 lety

    I think they should go with 8K 120 Hz through DLSS, and sacrifice all other space on the die for RT cores - sacrifice die space to open up the RTX pool of stuff they want to do. I think this would be a solid product.

  • @Austin1990
    @Austin1990 Před 2 lety +6

    Such a good video.
    But, every time you mentioned growth, I wondered how much would be inflation.

    • @BoltRM
      @BoltRM Před 2 lety

      Growth without stats about profit can mean anything. Growth without profit = debt.

  • @mzamroni
    @mzamroni Před 2 lety

    Gaming GPUs are the least profitable chips.
    As they rely on more FPUs to perform better, their performance spec will be stagnant until EUV becomes cheap and reaches a production rate high enough to satisfy datacenter CPU and GPU demand

  • @WhyHighC
    @WhyHighC Před 2 lety

    Background music?

  • @supking403
    @supking403 Před 2 lety

    GPU prices will be so overpriced that it will result in a cloud gaming market explosion

  • @igavinwood
    @igavinwood Před 2 lety +3

    Great vid. I appreciate the look into the good and bad that nVidia is up to. I say 'good' loosely, as their reputation is one of market manipulation. What scares me more is MS rendering the ownership of your PC meaningless, as they slowly wrap up all the avenues a typical owner of a PC has into only working with their permission, and hand the rights to personal privacy over to MS. Now that's real anti-consumer behaviour. Any chance you could cover this, as SW is the part that makes HW useful?

    • @jordanhildebrandt3705
      @jordanhildebrandt3705 Před 2 lety +1

      I so much agree. MS is such a garbage company. And on top of all their "we own your soul" behavior, windows itself is a hot mess.
      I am frankly so glad that desktop Linux is getting so much industry attention. GPU drivers are finally good. 1-to-1 trackpad gestures. overall good hardware support. And printing is in a different universe compared to windows. It's now a real alternative for many situations (though CERTAINLY NOT all). I blew windows away on my Surface Pro and installed Fedora. Windows found the most creative ways to get in my way.

    • @kintustis
      @kintustis Před 2 lety

      MS is fucking awful. Gets worse and worse every year. The company just can't seem to get out of its own way.
      If only linux wasn't complete trash. Maybe if they all got together and formed a couple of competent distros with actual QA testing and hardware support. At this point, using WSL on W11 is better than actually running Linux.

    • @igavinwood
      @igavinwood Před 2 lety

      @@jordanhildebrandt3705 The scary thing about MS is that even Linux needs permission from them to run on MS enabled pcs

    • @jordanhildebrandt3705
      @jordanhildebrandt3705 Před 2 lety

      @@igavinwood Yeah... all the secure boot crap. My next laptop will probably be a Framework machine. About as far as you can get from MS while remaining in our galaxy.

  • @gamamew
    @gamamew Před 2 lety +1

    If Nvidia really wants their raytracing adopted by the masses, they must bring RTX to the low end ($100~$250, 1050-1060 Ti prices).
    $300 isn't gonna work, that's too expensive, especially today when prices of everything are skyrocketing; not everybody lives in the U.S.A. making hundreds of dollars of salary every week.

  • @MajaroReal
    @MajaroReal Před 2 lety +1

    People have not upgraded to RTX cards because they are new, scarce and expensive lol, it's not because people don't want RTX

  • @Tigerhearty
    @Tigerhearty Před 2 lety +1

    AI can lead to brand new, previously unthought-of ways of graphics rendering and computing with much more efficiency.

  • @billykotsos4642
    @billykotsos4642 Před 2 lety

    12M is like getting a speeding ticket of 6 bucks. What a load of bs

  • @gaborkeresztes1739
    @gaborkeresztes1739 Před 2 lety

    When the 3000s came out i actually wanted to upgrade from my 970 to 3080 but the stock and the price(EU)... still waiting to upgrade but now intel is here 'any time now' and amd... i hope they don't f up the 4000s

  • @MozartificeR
    @MozartificeR Před 2 lety +2

    A 4090 Ti first might mean they are trying to spark the mining dollar again.

    • @N0N0111
      @N0N0111 Před 2 lety +4

      You'd better know Jensen is all about the mining profit first
      and the small gaming profits last; Jensen is a very sneaky boy.

  • @zekiozturk2084
    @zekiozturk2084 Před 2 lety +1

    Well, gaming consoles are way cheaper and they are better than most PCs or gaming PCs, which are on a 1050-to-1060 level. Besides, professionals maybe prefer Apple silicon over RTX

  • @lil----lil
    @lil----lil Před 2 lety +1

    Two years into RTX 3080 and they're STILL selling above MSRP. Let 'em eat cake.

    • @sylvainh2o
      @sylvainh2o Před 2 lety

      Company is doing well when everything you can produce is 100% sold and pre-ordered :P

  • @sonicareful
    @sonicareful Před 2 lety

    Colette Kress is CFO instead of CTO...?

  • @martinshelton9533
    @martinshelton9533 Před 2 lety

    Not in the circles I'm in, part time gamer and VR user, the RTX 3000 range is very popular even at the high cost for PCVR gamers and standard gamers!

  • @jedcheng5551
    @jedcheng5551 Před 2 lety

    CUDA is a gigantic moat for Nvidia. Most Nvidia competitors were trying to manually write a bunch of kernels and then put it on the marketing slides saying their hardware supports X Y Z software. The only real change in approach was from intel but it's too early to say whether they can challenge Nvidia

  • @jordanhildebrandt3705
    @jordanhildebrandt3705 Před 2 lety +6

    Excellent video. Way to be simultaneously objective and passionate about a subject. Honest critics are hard to find.
    Sounds like Nvidia is gonna do about as well as any hardware company possibly can, within the constraints of whatever economic conditions we end up with over the next year.

    • @reinhardtwilhelm5415
      @reinhardtwilhelm5415 Před 2 lety

      I do wish they wouldn’t, because they’ve been extremely ruthless for pretty much their entire history.

  • @VictorMistral
    @VictorMistral Před 2 lety

    Remember the old AMD HSA?
    Now CXL.mem and things like that might help unify memory.
    Memory is not often used by multiple processors at once, and no matter how it's shared, if it is used by multiple processors/cores, effort needs to be spent keeping it coherent. But yes, the PCI-e bus will be a bottleneck if there are high bandwidth demands.
    Localizing it to the same memory can help, but might just move the bottleneck to the memory bus.
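The bottleneck claim is easy to put rough numbers on. A sketch comparing commonly quoted theoretical peaks; treat the exact figures as assumptions for illustration:

```python
# Rough peak-bandwidth comparison: host link vs. a GPU's local VRAM.
# The figures are commonly quoted theoretical peaks, used as assumptions.

PCIE4_X16_GBPS = 32.0    # PCIe 4.0 x16, ~32 GB/s per direction
PCIE5_X16_GBPS = 64.0    # PCIe 5.0 x16 (the link CXL 1.x/2.0 rides on)
LOCAL_VRAM_GBPS = 760.0  # e.g. an RTX 3080's GDDR6X, ~760 GB/s

for name, link in [("PCIe 4.0 x16", PCIE4_X16_GBPS),
                   ("PCIe 5.0 x16", PCIE5_X16_GBPS)]:
    # Any data that must cross the host link loses this much headroom:
    print(f"{name}: local VRAM is ~{LOCAL_VRAM_GBPS / link:.0f}x wider")
```

An order-of-magnitude gap either way, which is why coherent-memory schemes help with convenience and latency but don't remove the bandwidth cliff at the link.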

    • @SuperFlamethrower
      @SuperFlamethrower Před 2 lety

      AMD won't design an APU similar to a game console. PS4 and PS5 use GDDR and unified memory to get enough bandwidth. In a regular system, the RAM controller bandwidth will be a bottleneck and limit the size of the integrated GPU you can have in an APU. Apple uses regular LPDDR5 and lots of channels. There are different solutions. AMD chooses "add a discrete GPU with GDDR" if you need extra graphics performance and won't do a high performance APU for gaming.
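The channel-count point above is easy to quantify: peak DRAM bandwidth is roughly channels x bus width x transfer rate. A sketch with illustrative configurations (assumed round numbers, not exact product specs):

```python
# Peak DRAM bandwidth ~= channels * bus_width_bits * megatransfers/s / 8.
# The configurations below are illustrative assumptions, not exact specs.

def peak_gbs(channels, bus_bits, mts):
    """Theoretical peak bandwidth in GB/s."""
    return channels * bus_bits * mts / 8 / 1000

dual_ch_ddr5 = peak_gbs(2, 64, 4800)    # typical desktop DDR5: ~77 GB/s
wide_lpddr5  = peak_gbs(8, 64, 6400)    # many LPDDR5 channels: ~410 GB/s
gddr6_256bit = peak_gbs(1, 256, 14000)  # console-style GDDR6: ~448 GB/s
```

The arithmetic shows the argument: a dual-channel desktop gives an integrated GPU a fraction of what a console's GDDR or Apple's wide LPDDR5 arrangement provides.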

    • @miguelpereira9859
      @miguelpereira9859 Před rokem

      @@SuperFlamethrower What about 3D stacking

  • @Instant_Nerf
    @Instant_Nerf Před 2 lety +1

    No. The gamers didn't reject RTX. They rejected the price

  • @digitalhouse6969
    @digitalhouse6969 Před 2 lety +1

    NVidia will play a big role in the full dive tech in the future! Remember my words!!!!!

  • @joegaffney8006
    @joegaffney8006 Před 2 lety +2

    Is data center revenue including large mining operations?

    • @xdman20005
      @xdman20005 Před 2 lety

      nah, they just buy a bulk of consumer cards; I would be surprised if they use A100s to mine

    • @Mr71peti
      @Mr71peti Před 2 lety

      This is my question too.

    • @ag687
      @ag687 Před 2 lety +1

      Most mining sales end up mixed into gaming segment sales. As most miners like the value and resale value of a gaming card and just buy the gaming cards. The rest are dedicated "CMP" mining cards which i think are mixed into the "OEM and Other" category.

  • @mrsoda6986
    @mrsoda6986 Před 2 lety +2

    Why would I pay 21$ for windows 11 when I can just buy windows 10 for 14$ and upgrade for free?

    • @toastdtech3597
      @toastdtech3597 Před 2 lety

      Why buy windows? Its so easy to bypass the activation with a few commands

  • @clownbaby7985
    @clownbaby7985 Před rokem

    I understand why, but it's still very sad that every year I can only find a handful of videos from by far the best tech channels. You are one of them; please try to squeeze out an extra video for me this year.

  • @prashanthb6521
    @prashanthb6521 Před 2 lety

    12:20 Sounded like a matrix glitch :)

  • @Mr71peti
    @Mr71peti Před 2 lety +11

    This is amazing content..thank you!
    I'm an investor and I think a huge percentage of their revenue has been crypto mining related, described by them as discrete GPU sales. With the Ethereum merge it's just going to disappear and we will see for the first time the real size of the discrete GPU business.
    But datacenter is growing very fast and they can transition into a company that has a new robust source of revenue... as long as the timing is right. But here is my question: could cryptomining be going on at a larger scale in datacenters too? I read about the 3 largest telecom datacenters in China being investigated for cryptomining... and if this is occurring I wonder how it's affecting the datacenter revenue they're currently posting, and also in the future.
    Their earnings calls, as far as I'm concerned, are extremely misleading. It's kinda shocking.

    • @Mr71peti
      @Mr71peti Před 2 lety

      Another person mentioned this already, but Colette Kress is the company CFO.

    • @ryanperrault8174
      @ryanperrault8174 Před 2 lety

      Just in time for me to get a new PC....definitely need one

    • @stillnotchill2560
      @stillnotchill2560 Před 2 lety

      Hate to be the bearer of bad news but crypto won't necessarily be down forever, and could just as easily pump to all time highs like it did less than a year ago. Also, even if ETH is no longer mineable, that doesn't mean they won't just find a new coin to mine instead of just stopping.

    • @Mr71peti
      @Mr71peti Před 2 lety

      @@stillnotchill2560 you would be the bearer of good news... at least for investors. I hope this is the case.

  • @gusmlie
    @gusmlie Před rokem +1

    That will be the end of pc gaming.

  • @NightRogue77
    @NightRogue77 Před 2 lety

    That could also be looked at as Jensen does not understand the AI sector as well as the gaming sector. The better you know something, the easier you can explain it.

  • @dierkrieger
    @dierkrieger Před 2 lety

    I am an NVIDIA investor, and one big problem NVIDIA has is high-priced GPUs that are really not worth it for gamers. NVIDIA stocks are really low right now, and we are most likely going into a worldwide recession so all these big companies are in trouble.

  • @ageofdoge
    @ageofdoge Před 2 lety +3

    Great video! I do think Tesla should be given consideration when talking about competitors in the AI data center space.

  • @____trazluz____9804
    @____trazluz____9804 Před 2 lety +1

    Ray tracing is too demanding to be used; it takes too many fps for not much difference in graphics

  • @jp34604
    @jp34604 Před 2 lety +6

    you have by far one of the best channels on YouTube, I can't believe you only have 130k Subs

  • @P4INKiller
    @P4INKiller Před 2 lety

    3:24 Is it just me, or is that Elon Musk without a hair transplant?

  • @nomad9098
    @nomad9098 Před 2 lety

    I wouldn't be surprised if they pull another Turing moment for ada-lovelace.

  • @adriankoch964
    @adriankoch964 Před 2 lety

    Steam Survey showed ~19% RTX install base 11 months ago. So NVIDIA claiming 30% now, doesn't seem so far off gaming reality.

  • @PlanetFrosty
    @PlanetFrosty Před 2 lety +2

    This is the same strategy as Fujitsu in their HBM2 memory. Apple’s platform is limited. However, we created shared memory with SPARC Java Processor in cooperation with Sun in developing a telecom fiber to the home/business the shared memory and execution. AMD gets a boost from Oakridge Deployment with HPE.

  • @evilmarc
    @evilmarc Před 2 lety

    Maybe if they made their shit affordable they'd sell more

  • @JethroRose
    @JethroRose Před 2 lety

    NV used basically the same cooler on their reference cards for several generations between what... 700-1000 series?

  • @BeyondImaginationzz
    @BeyondImaginationzz Před 2 lety

    Nvidia is in a good position to release an ARM-based PC; their Jetson line is pretty good.

  • @chrisevans4634
    @chrisevans4634 Před 2 lety

    Nvidia Shield desktop/console?

  • @hyperkid78
    @hyperkid78 Před 2 lety

    3090ti's can be found for prolly about like 800€ in like a year

  • @stanislavkimov2779
    @stanislavkimov2779 Před rokem +1

    "Democratization of RTX". A bad prediction. New GPUs don't even offer a better value than the old gen.

  • @monstercameron
    @monstercameron Před 2 lety

    All Intel and AMD SoCs with iGPUs have unified memory

  • @matthewvelazquez2013
    @matthewvelazquez2013 Před 2 lety

    Two Minute Papers Says: Hold on to Your Papers!

  • @NighthunterNyx
    @NighthunterNyx Před 2 lety +13

    The AMD APUs used in 450-euro consoles have ray tracing, FSR 2.0 (which seems better than the aging DLSS) and pretty much all you need to game. In fact I think NVidia is in trouble: AMD APUs will sooner or later make it to the open market, not just consoles. So think of a next-gen system where AMD offers performance close to NVidia in raw power, but with dedicated audio and advanced IO built in, while the CPU and GPU are integrated and work with the same memory, much like consoles - you will get twice the fun for half the price. NVidia will simply lose the whole gaming market. That is a very distinct possibility given how well the Xbox SX, PlayStation 5 and SteamDeck do.
    PS your AMD claims are very misleading. AMD has had 100% of the high performance console market in the last 10 years……

    • @f-22raptor25
      @f-22raptor25 Před 2 lety +2

      Aging DLSS? FSR 2 has come closer to DLSS but is still not as good. FSR 1.0 was crap. DLSS was always better, and still is; vs FSR 2.0 it's slightly better.
      And not sure why you bring up ray tracing on consoles, RDNA2 isn't even good at RT. RT is a gimmick at the moment

    • @NighthunterNyx
      @NighthunterNyx Před 2 lety +1

      @@f-22raptor25 I am unsure what gimmick you are referring to. On PS5 I get solid 60 fps with ray tracing. Plays pretty good. Bad part is far too few games have it

    • @f-22raptor25
      @f-22raptor25 Před 2 lety

      @@NighthunterNyx RT that runs well on RDNA 2 is very limited. RT just isn't worth it, so it has become a marketing term. Rasterization has gotten so good. True full-on ray tracing takes a huge performance hit (Cyberpunk on PC) that even a 3090 struggles with. RDNA 2 is around 30-50% weaker at RT.

    • @Darth_Tater69
      @Darth_Tater69 Před 2 lety

      @@f-22raptor25 The rasterization techniques you speak of, ptgi for example, also absolutely ate performance when they were originally implemented. Like any technology, it takes time to mature and make sense for most people. 30 series was exponentially better with RT than the 20 series, 40 series will probably be a bigger jump and may very well be the tipping point that makes turning on RT a no brainer. At some point in the future, RT cores and AMD's equivalent will render lighting faster than raster. You can only get so far with tricks.