Nvidia Should Sell GeForce.

  • Added 4 June 2023
  • Nvidia reaches a $1 Trillion market cap due to increased demand for AI chips.
    Watch the full WAN Show: • It's Time To Name And ...
    ► GET MERCH: lttstore.com
    ► LTX 2023 TICKETS AVAILABLE NOW: lmg.gg/ltx23
    ► GET EXCLUSIVE CONTENT ON FLOATPLANE: lmg.gg/lttfloatplane
    ► SPONSORS, AFFILIATES, AND PARTNERS: lmg.gg/partners
    ► OUR WAN PODCAST GEAR: lmg.gg/wanset
    FOLLOW US ON SOCIAL
    ---------------------------------------------------
    Twitter: / linustech
    Facebook: / linustech
    Instagram: / linustech
    TikTok: / linustech
    TikTok (LMG Clips): www.tiktok.com/@_lmgclips_
    Twitch: / linustech
  • Science & Technology

Comments • 1K

  • @satchell78
    @satchell78 Před rokem +2493

    I like Luke's alpha move, hiding all the headphones, forcing Linus to sit close.

    • @kaktusgoreng
      @kaktusgoreng Před rokem +239

      One step closer to a fanfic taran read a few years ago

    • @rosin_apprefix3673
      @rosin_apprefix3673 Před rokem +26

      i can't like this because it's already at 69 likes, so i want you to know i like this comment

    • @D4RKV3NOM
      @D4RKV3NOM Před rokem +5

      @@rosin_apprefix3673 nice!

    • @rettbull9100
      @rettbull9100 Před rokem +42

      multi-million dollar tech company and they're sharing wired headphones.....

    • @SomeUserNameBlahBlah
      @SomeUserNameBlahBlah Před rokem +15

      @@rettbull9100 "tech" company?

  • @kokohanahana
    @kokohanahana Před rokem +1367

    nvidia should do what the automotive industry did: have a distinct brand for each market while still under one umbrella, and have them run mostly independently of each other but still able to use each other's tech

    • @johngreen6169
      @johngreen6169 Před rokem +269

      This is what I came to write. GeForce needs its own leadership so it doesn't take a backseat to AI, but splitting it off entirely seems like a mistake

    • @zach99999
      @zach99999 Před rokem +9

      I was thinking something like this and the automotive industry is a great example

    • @e21big
      @e21big Před rokem +82

      Or like Samsung Electronics assembling a phone with parts from Samsung Display and Samsung Semiconductor

    • @Ari--d
      @Ari--d Před rokem +32

      this would not work because they are limited by fab capacity. The fabs can only make so many chips, so selling AI/data centre chips nets more profit because businesses have more money

    • @XyloWang
      @XyloWang Před rokem +77

      @@Ari--d This. Why sell an RTX 4090 for $1600 when you can sell the same chip for $40k to a business.

  • @joruffin
    @joruffin Před rokem +257

    Even after a split, Geforce would need to keep having AI features and CUDA cores because that hobbyist and student pipeline is insanely effective.

    • @wakannnai1
      @wakannnai1 Před rokem +23

      I don't think a split would necessarily mean they don't have AI hw. What it means is GeForce won't have to optimize margins.

    • @HiltTilt
      @HiltTilt Před rokem +4

      Right, but then it doesn't have to be their focus, unlike Nvidia

    • @doltBmB
      @doltBmB Před 11 měsíci +2

      CUDA cores are fine because they double as unified shader processors; AI and RT are cancerous growths that gobble up graphics performance.

  • @MrRez808
    @MrRez808 Před rokem +1141

    Imagine if EVGA could buy GeForce.

    • @Felven
      @Felven Před rokem +230

      the world would be a better place

    • @wilsondavenport6939
      @wilsondavenport6939 Před rokem +40

      Actually tho

    • @m4yd1e86
      @m4yd1e86 Před rokem +39

      Bro, if only...

    • @__-fi6xg
      @__-fi6xg Před rokem +21

      @@AndreiBYhappy i think we should go for 2 if even nvidia can reach 1.

    • @Boofener
      @Boofener Před rokem +3

      id love this

  • @alexanderkenway
    @alexanderkenway Před rokem +617

    The stock price will come down for sure, but the problem has always been that they make way more selling enterprise-grade hardware than consumer hardware. And with the rise of AI they'll want to capitalize

    • @BRBMrSoul
      @BRBMrSoul Před rokem +3

      Exactly

    • @ultimategotea
      @ultimategotea Před rokem +35

      They would only sell it for a king’s ransom; it’s more likely they spin it off into another company that they still wholly own but that has some amount of autonomy (e.g. how Google treats YouTube)

    • @jon9103
      @jon9103 Před rokem +6

      @@ultimategotea they could also spin it off and IPO while still maintaining some level of ownership.

    • @StriveForLuck
      @StriveForLuck Před rokem +1

      So what exactly are you saying then? That they will inevitably lower prices, just not anytime soon due to focus on AI chips?

    • @onepoh4680
      @onepoh4680 Před rokem +21

      @@StriveForLuck no, prices will never go down if the current strategy continues. It will be the same thing that happens with Rolex, with prices skyrocketing due to artificial shortage, and if they sell less because of it Nvidia couldn’t care less, because it is not very profitable compared with their enterprise business

  • @adam346
    @adam346 Před rokem +84

    Fun Fact: Aramco is one of F1's biggest advertisers and they actually insert advertisements over-top of existing boards (not other advertisements mind you) with real-time CGI in some instances.

    • @craigst6755
      @craigst6755 Před rokem +6

      Yeah, the barriers are always referred to as the Aramco barriers by the commentators. I'm surprised it's unfamiliar.

    • @Drenwickification
      @Drenwickification Před rokem +1

      Well it won’t be aramco doing it. They put other live graphics on the track as well. For example during the Monaco gp I saw a graphic for the driver radio on the inside wall of the tunnel

    • @BrendanP
      @BrendanP Před rokem +1

      Also Aston Martin's title sponsor for their F1 team.

    • @Requiemes
      @Requiemes Před rokem +3

      That's common practice in any top-level football match, because otherwise visibility of the LED panels would not be that great

    • @MondyTS
      @MondyTS Před rokem +1

      ​@@Requiemes also, you can put on different ads for different broadcasters

  • @willwunsche6940
    @willwunsche6940 Před rokem +275

    Ultra low motion blur is actually pretty sick tbf. Optimum Tech did a good video on it (rough persistence math in the sketch after this thread)

    • @itsprod.472
      @itsprod.472 Před rokem +7

      What Linus was saying here is that future displays will have that built in, so to speak

    • @helplmchoking
      @helplmchoking Před rokem +24

      @@itsprod.472 Yeah if your display can refresh fast enough there won't be any motion blur on the screen (or in the software, yuck) and the only motion blur will be what your *eyes* actually can perceive. You can "perceive" technically infinite fps, your eyes aren't capturing discrete images and real objects don't move in discrete hops (sorry, Planck) so the faster you can refresh the less you need visual tricks to make things look smooth. It's why VR gets more and more nauseating and disorienting as frame rates drop
      Not sure when we'll be able to run those framerates, even if monitors can refresh that fast, but it'll be cool.

    • @brandoniannn
      @brandoniannn Před rokem

      Was gonna say the same but ya beat me to it

    • @dan_loeb
      @dan_loeb Před rokem +4

      @@helplmchoking it's not insanely high frame rates, ulmb actually runs at half (or less) the framerate of your monitor. every other frame is a black out frame, which clears the screen of any possible ghosting, smearing, other frame persistence artifacts. you get equivalent motion clarity to insanely high frame rates, but you have to lock to 60/120 or etc. it doesn't work with vrr/freesync/gsync currently, and it drastically cuts peak brightness down because half your frames are black. it's pretty neat but it's a series of trade offs. when they say 1000hz, it's the equivalent motion clarity to 1000hz. it works on a couple existing 360hz gsync monitors that got firmware updates to add it, and it runs at half or less of that frame rate.

    • @TheMightyMuch
      @TheMightyMuch Před rokem +2

      It is super cool, and that video did a really good job of showing it! I still reckon I would take the very good motion clarity of an OLED and the amazing colors over an LCD. ULMB2 from what i've seen does deaden the brightness to OLED levels. I'm also still concerned for eye health and strain because of the rapid backstrobing.
      I also feel like you need high refresh rate all the time to maintain the ULMB2 where OLED clarity works at all refresh rates. Ok I'm just shilling OLED at this point, but honestly, I've tried it once, and now I can't go back: b ULMB2 is super cool though, and it's even cooler it can be implemented via a firmware update.

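A rough way to sanity-check the "1000 Hz equivalent motion clarity" claim in the thread above: for an eye-tracked moving object, perceived smear is roughly pixel velocity multiplied by how long each frame stays lit (persistence). Below is a minimal sketch with made-up numbers; the 1 ms strobe pulse width and the 2000 px/s scroll speed are assumptions for illustration, not measured ULMB 2 figures.

```python
# Rough motion-clarity arithmetic for sample-and-hold vs. strobed displays.
# Perceived smear (in pixels) on an eye-tracked object is approximately:
#   smear_px = scroll_speed_px_per_s * persistence_s
# Sample-and-hold: persistence ~= one full refresh period.
# Strobed/ULMB-style: persistence ~= the visible pulse width instead.

def smear_px(scroll_speed_px_per_s: float, persistence_ms: float) -> float:
    """Approximate blur-trail length in pixels for a tracked moving object."""
    return scroll_speed_px_per_s * (persistence_ms / 1000.0)

speed = 2000.0  # px/s, e.g. a fast pan in a shooter (hypothetical number)

for hz in (120, 360, 1000):  # sample-and-hold displays
    print(f"{hz:4d} Hz sample-and-hold: {smear_px(speed, 1000.0 / hz):5.1f} px of smear")

# 360 Hz panel strobed with an assumed 1 ms visible pulse per refresh:
print(f" 360 Hz + 1 ms strobe:   {smear_px(speed, 1.0):5.1f} px of smear")
# The strobed panel matches the 1 ms persistence of a 1000 Hz sample-and-hold
# display, which is what "1000 Hz equivalent motion clarity" is getting at.
```

The brightness trade-off mentioned above falls out of the same numbers: if the image is only lit for roughly 1 ms of every ~2.8 ms refresh, peak brightness drops roughly in proportion.
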
  • @jmtradbr
    @jmtradbr Před rokem +833

    geforce is nothing without nvidia patents

    • @jzilla1234
      @jzilla1234 Před rokem +136

      The patents would go with it, and the AI stuff would be licensed

    • @DOGMA1138
      @DOGMA1138 Před rokem +111

      @@jzilla1234 You can't spin off that much IP, NVIDIA is also dependent on cross licensing agreements with other chip makers such as Intel.

    • @Pleezath
      @Pleezath Před rokem +5

      Maybe they could use it in a few years and call it geforce-X or something stupid like that, idk. They'd still hold the rights then, if raytracing (rtx) still somehow fails to gain popularity.

    • @tranquil14738
      @tranquil14738 Před rokem +3

      More like nvidias marketing

    • @flamingscar5263
      @flamingscar5263 Před rokem +25

      nothing would stop Geforce and Nvidia from entering a partnership allowing Geforce to use Nvidia's patents; Nvidia and Geforce would still be very close

  • @LukeBroadhurstfilm
    @LukeBroadhurstfilm Před rokem +27

    Intel: 4 cores
    Nvidia: 8gb vram

  • @Thor847200
    @Thor847200 Před rokem +18

    @LMG Clips.
    I don't think Linus meant that Nvidia should "Sell" GeForce. He said "Spin Off", which means to make it its own company. Sell off and spin off are two completely different things.

  • @fohhee
    @fohhee Před rokem +31

    A company built by gamers, not for gamers.

    • @Argylleagen
      @Argylleagen Před rokem +2

      "i dont want to play with you anymore"

  • @tofu_golem
    @tofu_golem Před rokem +196

    I think Nvidia does not want me to buy their graphics cards, and I am happy to oblige them.

    • @seafafrage6934
      @seafafrage6934 Před rokem +20

      bruh they don't need your money LITERALLY 💀

    • @muhammadzinc5228
      @muhammadzinc5228 Před rokem +29

      ​@@seafafrage6934 AMD cards have come a long way and ill happily buy those instead

    • @seafafrage6934
      @seafafrage6934 Před rokem +3

      @muhammad zinc same, but i will still hold out for the new lineup because the current ones are inflated af

    • @Thezuule1
      @Thezuule1 Před rokem +5

      ​@@muhammadzinc5228 unfortunately not when it comes to VR. Headsets that rely on transcoding don't work well on AMD GPUs.

    • @ufiCz
      @ufiCz Před rokem +3

      @@seafafrage6934 I don't think you know how business works. They can't just produce GPUs and not sell them. They need his money; they won't feed one division of their company that is not profitable with another one that has a profit.

  • @gigitrix
    @gigitrix Před rokem +252

    Low margins are one thing, but as compute gets further and further centralised in the hands of a handful of hyperscaler clouds, if you're an Nvidia the last thing you want is to hand over the reins to a kingmaker that can threaten to withdraw their support at any time.
    If anything, strategically, they need this stuff to maintain more control over their own destiny even if it's at a cost.

    • @obumbrataetvelatamichiquoq9085
      @obumbrataetvelatamichiquoq9085 Před rokem +28

      The thing is they don't need the consumer GPU market anymore. The upside on the AI and corporate market is much larger. Not bringing out the best consumer gpus anymore likely won't hurt their share price more than a couple points, and thus it's not worth it for them to continue putting out consumer gpu's.

    • @rightwingsafetysquad9872
      @rightwingsafetysquad9872 Před rokem +12

      Paradoxically, the hyperscalers are perhaps more dependent on Nvidia to facilitate their own business. Google, Amazon, and Microsoft all produce their own processors, but AMD, Intel, and especially Nvidia are advancing so much faster that they can't cut off the hardware suppliers.

    • @anivicuno9473
      @anivicuno9473 Před rokem +12

      ​@@obumbrataetvelatamichiquoq9085
      Well, they don't need gamers until the autocomplete bubble pops.
      The LLMs and SD networks are really only good for derivative works, and only for works that don't require accuracy.
      So eventually, the market will realize that most of the promises are hype and grifting, and Nvidia will be exactly where they were last year when crypto bit the dust.

    • @dav356
      @dav356 Před rokem +10

      @@anivicuno9473 What you're saying only seems remotely plausible if you don't actually know anything about 'LLMs and SD networks'.

    • @cooleyYT
      @cooleyYT Před rokem +4

      ​@@dav356 which you don't either

  • @Cherokie89
    @Cherokie89 Před rokem +62

    If they split, Nvidia will eat up any potential benefit to consumers by charging GeForce more for its chips. Even if they don't, I wouldn't be surprised if GeForce claims that's what happened so they can keep prices high.

  • @maymayman0
    @maymayman0 Před rokem +157

    The 8800 GT was the first gpu i ever got, as a hand-me-down from my dad's old rig. Then I got a gtx 650 (regret) and later a 980 Ti that I have been using ever since. It's no wonder those generations get so much love.

    • @Some-person-dot-dot-dot
      @Some-person-dot-dot-dot Před rokem +7

      Those were the days. I remember my parents had a Gateway prebuilt office computer I would play Missile Command on. After that I went straight to a 760 / i7-4770s, then after that I built a computer for myself with a 980 ti. I didn't have the 980 ti for too long though. When I built this computer I was not really paying attention to the tech news outlets. I just googled the parts that I wanted to build with and made sure everything was compatible with each other. 6 months after I built that rig I was hit with a wave of hyped news centered around Pascal. Ended up upgrading to the 1080. I still have this card right now. Not sure when I'll be able to upgrade since pricing is completely bonkers right now. Hoping the 6950xt will drop at least another 80 USD in the used market.

    • @SkylerB17
      @SkylerB17 Před rokem +3

      @@Some-person-dot-dot-dot i also still have a 1080. But i started there. had it for almost 6 years now. been fantastic. definitely showing its age however. but yea im with ya, im not sure when ill be able to upgrade.

    • @Tripnotik25
      @Tripnotik25 Před rokem +4

      Used my 8800 GT 512MB till 2014 when Far Cry 4 came out and i still tested the card on it (after it ran FC3 effortlessly at 60fps/720p). Results were akin to "how on earth can it run this too". Never sold it and kept it as a museum piece along with the thermalright HR-03 i bought for it (good AIBs were not a thing).

    • @kaydnburns5935
      @kaydnburns5935 Před rokem +2

      I ran a GTX 980TI at its launch until a few months ago when I finally got a RTX 3090 and let me tell you, it’s worth it! It’s so amazing being able to play every single game on the market at max settings 1440p at 100-200fps it’s satisfying as fuck lol I also got dummy lucky and got a 3090 for $600

    • @stingrae789
      @stingrae789 Před rokem +1

      ​@@SkylerB17I jumped on the 1070 just before shit went crazy. I managed to grab a 1080 at a good price but haven't had much time for games anyway. Current pricing is prohibitive especially for those of us in third world countries.
      Who would have thought a midrange card would cost near double the price of previous midrange cards.

  • @addol95
    @addol95 Před rokem +203

    I'm quite literally forced to use Nvidia GPUs because they control CUDA cores, and the two most popular 3d render engines in my business require CUDA on windows and linux.
    Edit: ULMB is actually quite nice depending on your hardware setup. It reduces smearing and is impressive in competitive games.

    • @yasunakaikumi
      @yasunakaikumi Před rokem +28

      this is also my biggest problem right now. even tho I wanna shift to an Intel or AMD GPU, I cannot easily escape from their nightmare club. tho since im mainly on Blender, AMD actually implemented their RT cores for rendering in Cycles now, so bit by bit im moving to another GPU brand

    • @fluidthought42
      @fluidthought42 Před rokem +12

      Hopefully we start creating more and more systems that can be CUDA-independent. ML training is already going there.

    • @flamingscar5263
      @flamingscar5263 Před rokem +39

      AMD has a CUDA-equivalent compute stack called ROCm and it's pretty damn good. THE PROBLEM IS, not all AMD GPUs support ROCm, and finding the ones that do is a nightmare because AMD's documentation on it is awful. AMD is THIS CLOSE to dethroning Nvidia but they just stumble over the dumbest things (see the sketch after this thread)

    • @flintfrommother3gaming
      @flintfrommother3gaming Před rokem +18

      @@flamingscar5263 They aren't close to dethroning anyone, Nvidia is still the household name for everything. Even if AMD makes a product at the same level or even a bit better (which won't happen, at best it's gonna be the same), since Nvidia is on top of the market, everybody will choose them for guaranteed accuracy and reliability, which makes sense.

    • @flamingscar5263
      @flamingscar5263 Před rokem +24

      @@flintfrommother3gaming Nvidia is not a household name; outside the tech sphere Nvidia is largely unknown. ask any non-tech-savvy person who makes their GPU and the response will probably be "what's a GPU?"
      when we're talking broadly, people don't buy Nvidia GPUs, they buy prebuilt computers that have Nvidia GPUs, and prebuilt manufacturers will go wherever the best performance is. look at how fast prebuilt OEMs switched to AMD from Intel, so if AMD made a truly competing product they would gain significant market share

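To make the CUDA lock-in (and the ROCm escape hatch) discussed in the thread above concrete: a lot of GPU compute code is written against CUDA-style APIs, and AMD's approach is to slot ROCm/HIP in underneath the same interfaces. A minimal sketch, assuming PyTorch is installed; on the ROCm build of PyTorch the `torch.cuda` namespace is backed by HIP on supported AMD cards, so the intent is that the same script runs on either vendor.

```python
# Minimal sketch of why so much compute code is "CUDA code" and how ROCm slots in.
# On an Nvidia GPU (CUDA build of PyTorch) torch.cuda uses CUDA; on a supported
# AMD GPU with a ROCm build, the same torch.cuda calls are backed by HIP/ROCm.
import torch

def pick_device() -> torch.device:
    """Return a GPU device if PyTorch can see one, otherwise fall back to CPU."""
    if torch.cuda.is_available():
        backend = "ROCm/HIP" if getattr(torch.version, "hip", None) else "CUDA"
        print(f"GPU visible via torch.cuda ({backend}): {torch.cuda.get_device_name(0)}")
        return torch.device("cuda")
    print("No supported GPU found, using CPU")
    return torch.device("cpu")

device = pick_device()
a = torch.randn(1024, 1024, device=device)
b = torch.randn(1024, 1024, device=device)
print((a @ b).mean().item())  # one matmul, just to prove the chosen device works
```

The catch raised in the thread still applies: this only helps on the subset of AMD cards ROCm actually supports, and software that calls CUDA libraries directly (rather than a portable framework) still has to be ported, e.g. with AMD's HIP tooling.
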
  • @Grim_and_Proper
    @Grim_and_Proper Před rokem +17

    I think the one thing that makes the viewer retention on tier list videos so strange is the fact that you can skip to the parts you're most interested in, then skip to the end and see where everything placed. Then, the viewer can go back and view the rationale behind any placements they thought interesting or unusual.
    They almost incentivise skipping to the end and then rewatching in parts...

  • @Truth12345
    @Truth12345 Před rokem +45

    "You don't want to want a monopoly. " You also don't want to have a duopoly. Amdollar gracefully went along with Ngreedia with maximizing the price of their products.

    • @vicroc4
      @vicroc4 Před rokem +8

      Which is why Intel trying to break into the market is so important. Another competitor would reduce the Coke/Pepsi style duopoly.
      Though that's still nowhere near enough, especially as Intel is a huge company in its own right.

    • @cj09beira
      @cj09beira Před rokem

      @@vicroc4 Sorry, but adding intel to any market is a net loss, not a gain; they are by far one of the worst anti-consumer brands out there.
      so much so that they are competing against big pharma for who has had to pay the most in damages for anti-consumer behavior

  • @Insulinspike
    @Insulinspike Před rokem +45

    I agree. Maybe structure it something like Toyota and Lexus (except not a luxury brand) just make it a subsidiary.

    • @martin0499
      @martin0499 Před rokem

      Or like Škoda and Bentley, both owned by VW

    • @FeelMyBirdie
      @FeelMyBirdie Před rokem

      Anything that doesn't involve the Chinese like Geely and Volvo is good to me. I don't want any dirty commies to design our gpus

    • @rexthewolf3149
      @rexthewolf3149 Před rokem +2

      @@FeelMyBirdie they’re not communists they’re market socialists.

  • @zodwraith5745
    @zodwraith5745 Před rokem +81

    The problem with the idea of splitting off GeForce and AI/datacenter is that, most likely, the engineers that make great AI designs are also the ones that make great graphics cards.

    • @Jiggs224u
      @Jiggs224u Před rokem +4

      Yeah I'm confused. GeForce is software that is built for gamers. Has nothing to do with the enterprise/ai/data center argument in the video.

    • @curie1420
      @curie1420 Před rokem +1

      are you saying gpu driver updates = data center software and ai? aren't transistors and die sizes different from software in the first place? don't they have 2 different teams for that already?

    • @AlasdairGR
      @AlasdairGR Před rokem

      Those are two completely different kinds of silicon architecture. They likely have separate, distinct teams dedicated to those respective processors.

    • @mitchjames9350
      @mitchjames9350 Před rokem

      @Menphy AI research actually requires Computer Scientists, Electrical and Computer Engineers, Psychologists who specialise in Ai and mathematicians.

    • @mitchjames9350
      @mitchjames9350 Před rokem +1

      ⁠​⁠@@Jiggs224u GeForce is the gaming graphics brand name, the professional/ai/data section is all based on GPU’s which are overall based on the same architecture as the gaming division.

  • @brego129
    @brego129 Před rokem +15

    Props to Linus for remembering history. I've been spouting the lessons from the Kepler era for years now.

  • @TheSAMathematician
    @TheSAMathematician Před rokem +17

    The amount of industry software written in CUDA is annoying as well - sometimes you're stuck with nvidia even if amd/intel have competitive offerings

    • @Gelonyrum
      @Gelonyrum Před rokem

      True! As a 3D artist I want to use an Intel GPU but I CAN'T; CUDA and other AI features can't be replaced in any way. (Hello adobe, you little bastards)

    • @vicroc4
      @vicroc4 Před rokem

      And AMD refuses to move their CUDA-equivalent platform over to Windows, which would capture a lot of that market.
      Hopefully DirectML can close the gap a bit.

    • @Altrop
      @Altrop Před rokem +1

      I could be mistaken and I don't have time to google it but I think I read something about Radeon Compute working very well with CUDA software due to a translation layer, losing almost no performance? HIP?

  • @SkylerB17
    @SkylerB17 Před rokem +1

    10 series is when i finally got into PC gaming. Almost 6 years ago. Still got the same rig, besides an upgraded CPU

  • @danielcobia7818
    @danielcobia7818 Před rokem +3

    I bought my own dedicated GPU for the first time the generation before FX. That Ti 4400 actually lasted a good while. The next thing I bought was a 7800 GT.

  • @ekayn1606
    @ekayn1606 Před rokem +7

    I watched the entire video ranking every series. I enjoyed comparing how much they improved on the previous generation and seeing what improvements are still around.

  • @Exilum
    @Exilum Před rokem +8

    From my POV, the research Nvidia does on graphics would be lost to a spun-off GeForce. The number of papers they keep churning out, and the way those end up being released either as features or as software, is what makes the link between the research side of Nvidia and the GeForce division. The fact that they thought about tensor cores so long before anyone else did is because they saw the potential through their research. I really think GeForce would slowly die once it got cut off from Nvidia. As much as it is slowed down by the new horizons that opened for Nvidia, it also gains a lot from the innovations Nvidia comes up with.

  • @Likwidfox
    @Likwidfox Před rokem +2

    The way Ram trucks separated from Dodge.

  • @pandacron
    @pandacron Před rokem +31

    Baby being raised up in the pool: AI
    Drowning kid in the back: gamers
    Skeleton at the bottom: people that create with their computers

  • @mrmertah5412
    @mrmertah5412 Před rokem +8

    Exactly, this is what happens to companies. When they find their way paved by sycophants and thugs, they move to the next level and leave their old fans behind, bringing with them only their thugs and sycophants. We have seen them at their exhibitions (Computex 2023) talking about something completely different from graphics cards, haven't we, guys?

  • @fuseteam
    @fuseteam Před rokem +29

    Microsoft
    Apple
    Amazon
    Alphabet
    Nvidia
    MAAAN :v

    • @JacobFrodsham
      @JacobFrodsham Před rokem +3

      Bro look into BlackRock. They own them all.

    • @andrewehyang
      @andrewehyang Před rokem

      Manaa

    • @tradingelk6914
      @tradingelk6914 Před rokem +15

      @@JacobFrodsham BlackRock has stocks in all these companies, however they are not majority shareholders. So just saying "they own them all" is false.

    • @JackPorter
      @JackPorter Před rokem

      ​@@tradingelk6914 investment firm holds stocks? colour me surprised

  • @sivu1243
    @sivu1243 Před rokem +2

    This makes for a good case study in a business management class on the merits of either approach.

  • @XLR8bg
    @XLR8bg Před 10 měsíci

    5:40-6:10 I didn't even know geforce has its own hostname, but for me, the nvidia URL redirects to the GeForce page, so I'd say their website is still GeForce primarily.

  • @KeinNiemand
    @KeinNiemand Před rokem +3

    But if geforce is separate it will lose support for cuda and all of nvidia's AI stuff, which I need so I can mess around with stuff like stable diffusion and other AI stuff for fun without paying a ton of money going with the cloud.

  • @tallrocko7010
    @tallrocko7010 Před rokem +3

    I think they need to make a few product lines: GeForce needs to be more basic for gamers and streamers, Quadro needs to be more industry-built with more vram, and the A series needs to be purely for professional/heavy workstation/AI, which can be stupid expensive while a Quadro is the in-between. Then server and data center can be its own thing. If Nvidia does this they can reduce the price for gamers while businesses pay enterprise pricing and Quadros will be somewhat reasonable.

  • @adalaza
    @adalaza Před rokem +1

    10:51 that was the Radeon Rebellion in the run up to the Vega launch which was super cringe

  • @HolyVampires
    @HolyVampires Před rokem

    heard the same argument for samsung. margins on phones are slim; they want to be in screens and semiconductors.

  • @LG-jn5fx
    @LG-jn5fx Před rokem +28

    Second post and I almost never do this. Nvidia is now classed as a 1 Trillion Dollar company on the back of $29 billion in sales. It does not take a seer to see that it is ludicrous. Sell now, it is the new Cisco.

    • @ent1311
      @ent1311 Před rokem +1

      That's not including assets or future innovation.

    • @johnathanera5863
      @johnathanera5863 Před rokem

      And this is what happens when an ignorant amateur thinks they are a pro because they did 5 minutes of research on Google lol. What a joke.

  • @joeldoxtator9804
    @joeldoxtator9804 Před rokem +17

    Don't break off GeForce.
    Rebirth 3dfx.
    Nvidia can regurgitate 3dfx like Android 18 in the Cell saga.
    Return of working SLI and an impressive new driver set that works closely with developers, like 3dfx Glide back in the day.

    • @GoldSrc_
      @GoldSrc_ Před rokem +8

      SLI is useless when DX12 has explicit multi-GPU support; it can even be made to work with AMD and nvidia cards _at the same time_, it's just up to the game devs to implement it or not.

    • @joeldoxtator9804
      @joeldoxtator9804 Před rokem +1

      @@GoldSrc_ Ignoring the new driver set are we? Glide 2.0, full Sli all the time.

    • @ARealPain
      @ARealPain Před rokem +1

      4090 SLI would be sick!

    • @WyattOShea
      @WyattOShea Před rokem +1

      Lmao the DBZ reference is hilarious.

  • @terranceoliver8935
    @terranceoliver8935 Před rokem +1

    I actually bought a used desktop with a 460 in it. And actually spent my money on a 660 for my first GPU. Been hooked ever since

  • @visitrohitbag
    @visitrohitbag Před rokem +1

    I was playing Zelda BOTW on the cemu emulator with a 4K mod. It was not very taxing on my cpu and old gpu. I was really thinking what good gameplay it has, and is it really always necessary to make photorealistic games which need $500 GPUs to run?

  • @TrllyToys
    @TrllyToys Před rokem +4

    Linus said something that kind of made me think of a fun video idea. Have everyone play the first game they ever played on PC with whatever their first graphics card was and discuss what made them choose that graphics card and the things about it that made them fall in love with PC gaming (or what made them disappointed in PC gaming) and have it be another one of those walk down memory lane type videos.

  • @oopdigdigfig9786
    @oopdigdigfig9786 Před rokem +5

    I would love to see an AMD GPU ranking tbh

  • @mmcc2852
    @mmcc2852 Před 10 měsíci

    Also, AI is coming back to consumers in the form of DLSS and experimental algorithms that improve the manufacturing process; it is no longer a simple transfer of silicon architecture.

  • @sissyspaceship
    @sissyspaceship Před rokem +4

    It would be a similar company to Samsung Electronics and Samsung Display Technologies.

  • @porkypine602
    @porkypine602 Před rokem +5

    Nvidia targets margins in the range of 60 to 70 percent. That is an absolute money printer. Back during the more reasonable times (before the boom in the professional market) margins were in the high 40s to 50s. The mining boom just emboldened nvidia to the point that they don’t give a fuck

    • @rkan2
      @rkan2 Před rokem

      No company that is focused on making profit (most of them are) gives a shit.

  • @phipsi3922
    @phipsi3922 Před rokem

    I also thought about Nvidia splitting off geforce.
    Regarding ULMB 2: it is extremely cool tech and ulmb2 monitors perform better than oled (according to the early tests). You can probably use it on oled though to get even better results

  • @johnsamuel1999
    @johnsamuel1999 Před rokem

    There will be a lot of IP and tech used in common by both. This would make splitting it off take several years

  • @Dtr146
    @Dtr146 Před rokem +29

    People are talking like people are speaking with their wallets on this one. No literally every stock brokerage held a dividend share. Even stash held a stock party. That's why they are at a trillion

    • @MrGamelover23
      @MrGamelover23 Před rokem +26

      And it doesn't matter if consumers vote with their wallets, Nvidia can completely abandon gamers and still be bigger than AMD ever will be.

    • @neociber24
      @neociber24 Před rokem +3

      ​@@MrGamelover23 so that's what it means "diversify"

    • @flamingscar5263
      @flamingscar5263 Před rokem +6

      @@MrGamelover23 Im not a religious person, but there has to be a higher being that favors Nvidia. every time Nvidia fucks over gamers something saves their ass: 2000 series was AMD not being able to compete, 3000 series was crypto mining, now 4000 series is AI

    • @traplover6357
      @traplover6357 Před rokem +1

      Companies diversify their products so boycotting from a niche audience won't work. Nvidia getting an audience from the AI industry will leave gamers worse off when it comes to demanding a better product.

    • @issahumps
      @issahumps Před rokem +1

      I’ve made a shit ton from nvidia stock splits lol everybody holding or jumping in for a new one

  • @BenTheMiller
    @BenTheMiller Před rokem +8

    The mind share alone of professionals using their cuda tech is reason enough to keep geforce

    • @TheAkashicTraveller
      @TheAkashicTraveller Před rokem +1

      Nvidia being Nvidia is already causing at least some of these people to try and go to AMD and then just end up with GPUs sitting in cupboards because ROCm support is a mess.

  • @yumri4
    @yumri4 Před rokem

    With the hopper example you "can" game on FP32 and FP64 even though most code is assuming FP16 right now. It will run slower, as the majority of it is FP64 and FP64 takes up more physical space than FP16. You also have the Tensor cores games don't use, or don't use yet anyways. So yes you can, but the workaround you will have to do to get video out of it and the lower fps for games isn't worth it. It rocks for AI iterations in pytorch and other CUDA-based compute, while sucking for tensor compute when compared to using a series of TPUs from nVidia you put into your pi cluster board, though that is comparing 1 processor to 8 in parallel.

  • @pileshmuzma9406
    @pileshmuzma9406 Před rokem +1

    My feeling on this is that continuing with exponential gains in pure processing ability for GPUs is an inefficient way to increase gaming performance for consumers, given that the code which the processors are processing has gotten increasingly inefficient over time. In a nutshell, this is why graphics performance on equivalent graphics cards is better on consoles than PCs: optimization. As AI transformers are implemented I think the problem will be solved universally.

  • @brianjordan9446
    @brianjordan9446 Před rokem +38

    How about spinning off both Quadro and GeForce and having nvidia be _only_ a chip maker? They could focus 110% on innovation and making the best possible chips, and Quadro and GeForce could take and use what they need for their product lines. Nvidia would be able to move so much quicker without having to deal with corporate customers and consumers. Same structure as LG Display vs the rest of LG, and Samsung too.

    • @jeremydale4548
      @jeremydale4548 Před rokem

      And with any luck that will get any shitty behavior AWAY from the GPU makers

    • @santyclaws
      @santyclaws Před rokem +13

      nvidia doesn't manufacture chips

    • @Zwank36
      @Zwank36 Před rokem +4

      TSMC makes nvidia's chips...

    • @nathangamble125
      @nathangamble125 Před rokem +10

      Nvidia _isn't_ a chipmaker, they're chip designers. They don't have their own foundries, they use TSMC for manufacturing, the same as AMD.
      Without Quadro and Geforce, Nvidia would have no reason to design midrange and low-end chips that Quadro and Geforce would use, so your idea would only work if Nvidia licensed CUDA to another chip designer, which isn't realistically going to happen.

    • @brianjordan9446
      @brianjordan9446 Před rokem

      Making / designing, I didn't say manufacturing. Jesus you internet "know-it-alls" are annoying. Apple designs shit but Foxconn makes it.. you still call an iPhone an Apple product and not a Foxconn product. Grow up.

  • @SibirLupus
    @SibirLupus Před rokem +5

    The last time Nvidia really cared about consumers with price-to-value was the 10 series. Since then it's been a smaller % uplift over previous generations for as much if not more money. Good example: GTX 1080 Ti at $699 vs 2080 Ti for $1,200.

    • @christopherfortineux6937
      @christopherfortineux6937 Před rokem

      The 980ti was $650 lol, they started raising prices with the 10 series

    • @cj09beira
      @cj09beira Před rokem

      the fact people think this is the case shows just how good their propaganda is.

    • @mito-pb8qg
      @mito-pb8qg Před rokem

      The 1080ti was a panic reaction to upcoming Vega, which in the end turned out to be no threat at all. Especially with the launch drivers.

    • @SibirLupus
      @SibirLupus Před rokem

      @@christopherfortineux6937 True, but for that extra $50 you got far more performance with the 1080 Ti than the 980 Ti.

    • @SibirLupus
      @SibirLupus Před rokem

      @@mito-pb8qg Correct, it was. But Nvidia at least kept the cost down with that panicked reaction to Vega. This is why competition is always a good thing. Without real competition, we ended up with the 2080 Ti's roughly 20% performance increase for an extra $500.

  • @oM477o
    @oM477o Před rokem +1

    I think the main reason for the underwhelming 4000 series is that they don't want to cannibalize their enterprise business. Workstation and server cards have huge profit margins compared to gaming cards. By holding back on the consumer cards, they reduce the risk that enterprise customers will just buy the cheaper geforce cards for their AI projects.

  • @ChuckNorris-lf6vo
    @ChuckNorris-lf6vo Před rokem +2

    @Linus thank you for the content you are great. Nvidia can keep gaming because there is a lot of integration between that and the enterprise stuff especially for VR training purposes and networking. Think Davos, virtual Davos.

  • @panikk2
    @panikk2 Před rokem +10

    i think a factor you aren't considering is the mindshare they gain by being at the forefront of consumer GPUs. the same people that buy 800 dollar graphics cards might one day be in charge of purchasing decisions for big IT installations and NVIDIA is going to be a big consideration for that more "hardcore" and techie crowd. even if they aren't necessarily in charge, just having thousands of people being aware of the fab potential that NVIDIA is proving themselves on with consumer GPUs will definitely influence those decisions on the larger business level.

    • @TheAkashicTraveller
      @TheAkashicTraveller Před rokem +6

      And the same people learning and playing with machine learning in their spare time too. AMD certainly doesn't help by just failing to compete especially with the mess ROCm has been compared to CUDA.

  • @Flanktank2
    @Flanktank2 Před rokem +3

    Would it be too monopolistic/conflicting if Valve purchased a split off Geforce? I feel like that partnership would be amazing.

    • @mito-pb8qg
      @mito-pb8qg Před rokem +1

      The last thing we need is that sort of consolidation

    • @bruhder5854
      @bruhder5854 Před rokem +3

      Valve is too small to buy it. Its market cap is iirc 12b while Nvidia's is at 1T now, which means even their GeForce will be about 100B or something.

    • @rkan2
      @rkan2 Před rokem

      You need at least Twitter money to buy stuff like GeForce. Elon would have it though. But it makes no sense. Even less than Twitter.
      Twitter has been Elon's personal free advertising space for more than a decade at this point, so as far as he is concerned, it has already paid him back even before he bought it lol.

  • @Lee883
    @Lee883 Před 10 měsíci

    My first graphics card was a Matrox Mystique with a whopping 220MHz and 4 whole megabytes!
    That card came with MechWarrior 2 and started my love for the BattleTech franchise and pc gaming

  • @MrSofazocker
    @MrSofazocker Před rokem

    Well, if the consumer branch with consumer margins isn't that profitable,
    it would absorb all the cost, if you have other branches that make tons of money.

  • @Expendable1
    @Expendable1 Před rokem +27

    I like how Linus is splitting hairs over the GPU/graphics card term while so many non tech people refer to their whole PC as a CPU

    • @jgold2813
      @jgold2813 Před rokem +1

      the pc is a pc, it's everything, with or without a GPU. in most cases pcs don't have GPU video cards; all pcs must have some kind of CPU and graphics is mostly on the cpu. and the fact that high-end GPU cards have cost more than the gaming computers for what, 2 or 3 years? makes it ok to use the term in this space 😎

    • @galirt9196
      @galirt9196 Před rokem

      I have never heard a non-tech person refer to their PC as a Central Processing Unit

    • @DC3Refom
      @DC3Refom Před rokem

      gpu - the graphics processing unit; graphics card - includes everything (the casing, the mainboard, etc.)

    • @Expendable1
      @Expendable1 Před rokem

      ​@@galirt9196 I heard this a lot a few years ago back in college from people that wanted to sound like they knew more than they did. Maybe people have learned more now.

  • @kalmtraveler
    @kalmtraveler Před rokem +3

    funny thing about Volta not being for gamers, but then they released the Titan V which at the time was better at gaming than the Geforce-branded 1080 Ti and Titan Xp as a single-card solution. Would have loved to get Titan V's with functional NVLink just to see where a pair of them would have charted.

  • @slipoch6635
    @slipoch6635 Před rokem +1

    GeForce could use industrial 3d graphics card research to bring forward more gaming cards, but they seem to have given up on that market too.

  • @LordOfNihil
    @LordOfNihil Před rokem

    i kind of want mining back. because just having a high end video card you can only game on it a few hours a day; it sits idle the rest of the day. games are also expensive, so if you pay upfront for the video card and buy games with mining proceeds, you could really supercharge your gaming hobby. just converting the tailings of the Ethereum mining craze to steam cards has greatly increased the number of games i buy a year. gone from buying a couple games a year to a dozen or more. its also a reason to keep my gpu up to date, so game companies get kickbacks, and gpu manufacturers get kickbacks. you need to find a way to nerf bulk mining and reward base participation with diminishing returns built in to limit local scaling, probably using some kind of geohashing.

  • @robertt9342
    @robertt9342 Před rokem +9

    Gaming is a massive part of their operating income, people overlook this. Likely their focus on enterprise compute is partially profit margin, but more so the significant opportunity right now with the massive demand for Compute and AI.

    • @christopherfortineux6937
      @christopherfortineux6937 Před rokem +1

      I just pulled their latest quarterly results. The datacenter segment is growing at twice the rate of gaming. Profit margin for both is very similar. By next quarter datacenters etc will outpace gaming, going by their results for the year. While for the current fiscal year they are both performing similarly, they basically doubled their footprint in datacenters in the last year or two.

    • @mowtow90
      @mowtow90 Před rokem +2

      @@christopherfortineux6937 Yep, as someone who supports datacenters I can tell you - all of the GPU nodes in servers are Nvidia, nobody bothers with AMD. The enterprise stuff is massive and will keep growing with the rise of cloud computing. This is something Nvidia already fully grasped and went all in on. Their consumer market, even tho profitable, will just be left behind while the main market keeps outgrowing it.

  • @cancerino666
    @cancerino666 Před rokem +6

    Intel and AMD are so far behind software-wise. It's gonna take ages before they catch up.

    • @ultratronger
      @ultratronger Před rokem +7

      if nvidia stops sponsoring the software so that it doesn't optimize for amd and intel, it won't take that long

    • @KamanZ10
      @KamanZ10 Před rokem +3

      @@ultratronger And why would they do that, Nvidia and CUDA is so deeply embedded in industry and consumer applications, AMD and Intel have no chance of catching up on the software side for gaming LET ALONE anything non-gaming. Try using an AMD card for 3D modelling, Rendering applications, CAD application simulations, AI assisted software etc etc and you'll soon realise how far left in the dust AMD is compared to Nvidia not even including gaming

    • @ultratronger
      @ultratronger Před rokem +1

      @@KamanZ10 yeah, because nvidia pays blender to not optimize for amd, i thought that was obvious? nvidia and cuda are embedded in industry because nvidia paid for amd to be botched by these companies, you're literally agreeing with me.
      for example, amd's encoders aren't bad, the issue is that nvidia is a partner to obs. did you know that the "dev team" behind obs' amd encoder stuff was A LITERAL SINGLE GUY who quit his job because when he asked something about the code he was told to contact HIMSELF? meanwhile nvidia works super closely with obs to develop its implementation of their encoder

    • @cj09beira
      @cj09beira Před rokem +1

      @@ultratronger where can i see the rest of that story seems hilarious

    • @ultratronger
      @ultratronger Před rokem

      @@cj09beira if you're talking about nvidia paying companies, the fact that stuff like blender has a literal CUDA option in their settings is enough. it also wouldn't be the first time nvidia or intel paid to hinder amd. if you're referring to the obs dev, i believe he himself told that story through youtube, but this was in like 2019 maybe so i don't know how to find it, im also tired

  • @Starfishtroopers
    @Starfishtroopers Před rokem

    I like the neural link you got going there

  • @primcrew
    @primcrew Před rokem

    i can see nvidia's gaming division moving to intel chip factories to free up supply space at TSMC for more profitable enterprise customers

  • @Energyswordsunday
    @Energyswordsunday Před rokem +15

    look at them sharing headphones

  • @surft
    @surft Před rokem +10

    Selling it off completely would be a mistake. Even Intel sees the value (for now) of having a discrete GPU division.

    • @Activated_Complex
      @Activated_Complex Před rokem

      For Intel, it absolutely makes sense to capture that revenue rather than keep letting all of it go to AMD and Nvidia. And while they're clearly committed to gaming, I'm sure they're also thinking, all those kids who say they want to be "influencers" when they grow up are going to need to render video.
      Selling off GeForce would probably make financial sense for Nvidia in the short term, riding the first wave of AI interest from businesses of all sizes, whether they have much of a use case for it or not. Longer-term, I think they would end up re-absorbing GeForce if they had spun it off, or looking to buy it back if they sold it.

    • @PhilWard
      @PhilWard Před rokem

      Intel is less about capturing the GPU market and more about showing everyone "they can make" solid GPU dies. If Intel wants AMD, Nvidia etc to use their foundries for GPUs, they have to prove they can make them, thus, the ARC was born. I'm not saying Intel will never be a competitive force down the road, I just don't think that's their goal with ARC.

  • @proxybishop
    @proxybishop Před rokem +1

    People don't understand that companies that go full corporate business and ditch consumers will always make more money. Let's face it, consumers for the most part are a liability for tech companies.

  • @dgillies5420
    @dgillies5420 Před rokem

    We got back into gaming with the Rx660m, 600 megaflops on a 3620 4-core 8-thread laptop.

  • @Aerobrake
    @Aerobrake Před rokem +11

    if nvidia sells geforce and they get dumped by AI shareholders, I won't feel bad.

    • @tylerclayton6081
      @tylerclayton6081 Před rokem +1

      They’re the second most important American chip company. Them failing is bad for the country. AI is really going to help the US economy, and we pretty much have a monopoly over the technology

  • @Ktmfan450
    @Ktmfan450 Před rokem +50

    They need to diverge the brand

  • @ccvideotech
    @ccvideotech Před rokem +1

    What happens if GeForce loses the buying power from nVidia's scale and the prices go up even more?

  • @sir1junior
    @sir1junior Před rokem

    Their CEO taking a 2.5 million pay cut also helped with the stock value.

  • @E_Clip
    @E_Clip Před rokem +12

    What sells Nvidia products is their proprietary technologies, Cuda and Tensor. Cuda has a stranglehold on CGI and Tensor on AI. THE ONLY way this changes is if competitive technologies are built, as well as additional production capabilities. That is the sole reason Nvidia sells and other GPUs don't. Yes yes, AMD sells a bit more Radeon cards now; still, that's peanuts compared to what Nvidia sells. This has been a problem for over a decade and neither AMD nor Intel have even tried to do anything about it.
    Having said that, I'm not sure how spinning off GeForce will do anything to fix the underlying problem. Do you take out cuda and tensor and build strictly a gaming chip? If so, why would you sell only to gamers when those cards with cuda and tensor can be sold to a much wider consumer base? And if you do that, do you sacrifice production capability to make an inferior product that has a much narrower consumer base? Yeah, I don't think so. The only way I can see this working (spinning it off) is if they actually build a new dedicated production facility for these gaming GPUs too; otherwise it's a net loss for the company to use current production facilities to produce inferior products that sell for a much smaller profit margin. No company, especially a publicly traded one, will ever do that.

    • @TywinLannister0
      @TywinLannister0 Před rokem +1

      you make fantastic points. Geforce shouldn't be separate from NVIDIA.

  • @CarlosHernandez-wp3tr
    @CarlosHernandez-wp3tr Před rokem +4

    Imagine if nvidia sold Geforce to Evga.

    • @Gravy1255
      @Gravy1255 Před rokem +1

      That would be the best outcome: a high-quality, gaming-first company with a great track record buying them.

  • @mmcc2852
    @mmcc2852 Před 10 měsíci

    is there a possibility that it has always been consumers subsidizing the B2B R&D, and now it finally makes more money than the consumer sector? like if F1 made more money than selling consumer-model sports cars?

  • @nohemoglibin8677
    @nohemoglibin8677 Před rokem

    The headphone situation gives a Lady and the Tramp vibe.

  • @TiagoTiagoT
    @TiagoTiagoT Před rokem +5

    NVidia has done a lot to have a monopoly on certain specific tech - PhysX, CUDA, G-Sync etc - to keep consumers trapped in their ecosystem

  • @-tf-boforsho4008
    @-tf-boforsho4008 Před rokem +24

    so GEFORCE would be an AIB partner like asus and others i could see that for sure

    • @jimmified
      @jimmified Před rokem

      Except the AIBs buy from them, but yeah basically

    • @flamingscar5263
      @flamingscar5263 Před rokem +19

      no not really, Geforce would still be the GPU designers, still using Nvidia tech, but instead of it being "Nvidia Geforce RTX 3080" it would just be "Geforce RTX 3080" as Geforce would now be its own company, it might seem crazy to think Nvidia would want to step out of gaming, but with how much money is in AI, Nvidia would probably be better off

    • @pieceofschmidtgamer
      @pieceofschmidtgamer Před rokem +7

      No, Geforce would literally be the ones designing the GPUs from the ground up.
      AIBs simply take the work from Nvidia/Geforce, i.e. the chip that is the GPU, and then do the finishing work in order to get you a finished graphics card.

  • @johnhpalmer6098
    @johnhpalmer6098 Před rokem

    The 600 series of NVIDIA cards came in both a GT and GTX variant if I'm not mistaken. I have the basic GT610 that runs on Fermi, not Kepler, unfortunately, and it only has 1GB; now try to edit 1080P video with that. Not well, but doable. I know this card initially came out in 2012/13 but I didn't get mine until 2015, new in box from PNY. Yes, it's Fermi, but for now it will suffice.
    I came close to making an inexpensive upgrade by moving to a GTX760 I think, but it required a totally different driver set, and barely fit in the x16 PCIe slot with the shroud/fan attached; but remove the shroud and fan and it fit great in a Dell Optiplex SFF. I later found out that I could have run it off the smaller PCIe slot next to it, but at a slight performance penalty. Yes, it had 2GB on it, which was what I was after. Oh well. So needless to say, I still run that 610, for now, but do hope to do a total upgrade of the 10 YO Core i5 PC soon.

  • @burakozc3079
    @burakozc3079 Před rokem

    For lower and mid-tier gpus, rt and tensor cores are just stupid for gaming. They take almost half of the chip area. If they used those transistors for rasterisation we wouldn't even need dlss.

  • @DaysofKnight
    @DaysofKnight Před rokem +4

    I started gaming with a GTX 560, and upgraded to a 1080 after they came out and dropped to $500. Still sitting in my computer and runs every game I own (350+ Old to modern) on the highest settings. nVidia's newer cards are just too expensive

  • @ixortair
    @ixortair Před rokem +33

    Two multimillionaires that have access to every piece of tech in existence for free, and they are sharing a single pair of wired headphones.

    • @Default_Defect
      @Default_Defect Před rokem +15

      Its called love.

    • @cydia11centra
      @cydia11centra Před rokem +9

      multimillionaire? luke?

    • @WellKnownAlias
      @WellKnownAlias Před rokem +8

      ​@@cydia11centra In terms of net worth, maybe? I mean he's not a stakeholder in LTT or anything, but he's also been there from the beginning and has been in a senior position for much of that. He's also self admittedly extremely frugal, and much of the "stuff" that costs money like expensive tech in his life... he gets for free. Then again, Vancouver is expensive af so 🤷‍♂️

    • @ScottGrammer
      @ScottGrammer Před rokem +7

      Single pair of wired earbuds, plugged into the headphone output of a single USB microphone. It's a portable streaming rig meant to be carried halfway around the world, easily.

  • @abcpea
    @abcpea Před rokem

    the 8000 and 9000 series were very confusing. GT, GTX, GS, GTS, GX2

  • @VioletGiraffe
    @VioletGiraffe Před rokem

    I remember being so sore when 8800GT came out, it was just months after I bit the bullet and bought Radeon 2900XT for $400 (which was more than my monthly income at the time).

  • @nwheatcraft
    @nwheatcraft Před rokem

    11:35 - This needs to be a luke techno song.

  • @benjaminmiddaugh2729
    @benjaminmiddaugh2729 Před rokem +21

    As AdoredTV would tell you, the metric to watch with Nvidia is performance per dollar, with generational performance increases as a reasonable secondary metric. Their top-tier cards since the 1080 Ti have had comparatively abysmal performance per dollar and generally worse generational increases, especially when considering power efficiency. Of course, the 1080 Ti was far more powerful than necessary at the time, but those are still disappointing metrics. (rough arithmetic in the sketch after this thread)

    • @FOGoticus
      @FOGoticus Před rokem +3

      AdoredTV, whose sole purpose is trashing Nvidia and praising AMD's mediocre efforts, really isn't something you should consider a valuable news or info source.
      The 1080Ti was a big bump in performance over the 1080. It was a one-off among quite literally all generations. Taking it as a "standard" of how things should be is... quite simply put, idiotic and awfully desperate for times that will simply never come back. Nvidia, like any major corporation, wants profits. People buying these cards basically means Nvidia has no incentive to release reduced prices. The 40 series is priced so badly mostly because they wanted the RTX 30 stock to be as far gone as possible. The 40 series cards are okish but priced badly. The 50 series cards will change that. Even if they have identical pricing to the 40 series, the performance uplift will be well worth it.
      Also, if AdoredTV didn't acknowledge the efficiency of the RTX 40 series, that pretty much says everything you need to know about them.

    • @ultratronger
      @ultratronger Před rokem

      @@FOGoticus yeah adoredtv loves amd, that's why he hates rdna3 and isn't happy with ryzen. everyone hates nvidia unfairly as nvidia has done nothing wrong! the 4070 is 3 times faster than the 3090ti !!!!!

    • @benjaminmiddaugh2729
      @benjaminmiddaugh2729 Před rokem +5

      @@FOGoticus You didn't pay close enough attention, either to what I said or to AdoredTV in general. I acknowledged that the 1080 Ti was an outlier, and AdoredTV is most certainly not an AMD shill. Among other things, he's critical of them following Nvidia's pricing more than pushing for market share on the graphics side, and he's paid a lot of attention to how Intel might actually have a meaningful near-term answer to AMD's consumer CPUs.
      He's also been trying to get away from constantly criticizing Nvidia, and I'm not aware of him discussing the 40 series in terms of efficiency. I still consider the power cost of current-gen top-end consumer cards to not be worth the performance, though, and the cost/efficiency ratios tend to show that (lower-tier cards are much more reasonable, of course).

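For what it's worth, the metric the comment above points at is easy to compute yourself from any review data: frames per second per dollar, plus the generation-over-generation uplift. A minimal sketch with entirely hypothetical placeholder numbers; the card labels, FPS averages and prices below are illustrative, not benchmark results.

```python
# Toy performance-per-dollar and generational-uplift calculator.
# All figures are placeholders; substitute real benchmark averages and street prices.
cards = {
    "last-gen flagship": {"avg_fps": 100.0, "price_usd": 700.0},
    "new flagship":      {"avg_fps": 150.0, "price_usd": 1200.0},
}

for name, c in cards.items():
    print(f"{name:18s}: {c['avg_fps'] / c['price_usd']:.3f} fps per dollar")

old, new = cards["last-gen flagship"], cards["new flagship"]
uplift = new["avg_fps"] / old["avg_fps"] - 1.0
value_change = (new["avg_fps"] / new["price_usd"]) / (old["avg_fps"] / old["price_usd"]) - 1.0
print(f"generational uplift: {uplift:+.0%}, change in fps per dollar: {value_change:+.0%}")
# With these placeholder numbers the new card is ~50% faster but delivers
# roughly 12% less performance per dollar, the pattern the comment describes.
```
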
  • @bluegizmo1983
    @bluegizmo1983 Před rokem +3

    I knew it!! Just look at them two, sitting there sharing a set of earbuds like an old married couple! His marriage to Yvonne is a LIE! ....... 🤣

  • @superbibibobo
    @superbibibobo Před rokem

    It was the red rebellion. They did have some catchy slogans.

  • @MrJonFinger
    @MrJonFinger Před rokem +1

    I think this heavily underestimates how much the AI chips will benefit gaming. We are just starting the clumsy transition period from the old rendering systems into the new generative systems. Game pipelines are still a little ways from having the tech to transition, so I feel like there will continue to be a disconnect for a bit.

    • @bruhder5854
      @bruhder5854 Před rokem +1

      I don't think it'll have as much of an impact when it comes to running a game. It'll help with developing a game and improving in-game ai, but actually playing it, I'm not so sure about that.

    • @MrJonFinger
      @MrJonFinger Před rokem

      @@bruhder5854 I can see how that would feel true when generative AI is just starting to get images right. It will completely reimagine video games in both fidelity and execution. Nvidia has already done tests with GAN game engines and generative NeRFs that make the future clear. NVIDIA's CEO Jensen Huang has even said: “Every single pixel will be generated soon. Not rendered: generated.” He's not talking about DLSS, he's talking about the NeRF systems they are developing. Expect the Midjourney of games in the next 5 years.

  • @singular9
    @singular9 Před rokem +4

    Nvidia could sell the 4060ti at $299 and still make a profit

    • @sidrolf
      @sidrolf Před rokem +5

      But the shame of doing something for their customers instead of shareholders will no doubt prevent them from pursuing this.

  • @misterio10gre
    @misterio10gre Před rokem +22

    GeForce could not survive without Nvidia's strength; the consumer version of each graphics generation is subsidized by the enterprise models

    • @robertt9342
      @robertt9342 Před rokem +2

      Yes and no. It does help due to margin, but they generate a ton of revenue and operating income from gaming.

    • @RedStripeMedia
      @RedStripeMedia Před rokem

      AMD has a very small enterprise business and does fine.

    • @Gravy1255
      @Gravy1255 Před rokem

      ​@@RedStripeMedia in what regard cpu wise they are gaining a lot of traction

    • @RedStripeMedia
      @RedStripeMedia Před rokem

      @@Gravy1255 their GPU division is very separate and they make money off of it.

    • @Gravy1255
      @Gravy1255 Před rokem

      @@RedStripeMedia but not in the enterprise sector really

  • @ash0787
    @ash0787 Před rokem +2

    They made an AI super board with 600GB of VRAM, but the reasonably affordable card (after mining collapsed), the RTX 3070, only got 8GB.

    • @Notyouraveragename
      @Notyouraveragename Před 11 měsíci

      Yeah, that's a thing that bugs me. They made an 11 gb card 7 years ago, the 1080ti, that still performs on the level of a 4060/ti today while having 50% more vram.. for a 7 YEAR OLD CARD that has MORE vram.
      Yet the ps5 uses normal ram and acts as if it has 16 gb vram with integrated graphics, and is sold as a complete package for 399$.
      Nvidia COULD put 32 gb of vram on a card for 100$ if they wanted, and release a 32 gb 4060/4070 for the same surcharge as the +100$ for the 8 gb model. However, even with the ps5 and xbox providing 16 gb of vram as a complete package, nvidia charges 400-700$ for cards with sometimes LESS vram than their predecessors (11-12 gb 1080s/3060s for 150-250$ VS 8 gb 400-700$ 4060/70s).
      This completely breaks the pc master race jokes, because now you often have to spend 1000-2000$ on a computer.. to match the 16 gb vram of a 400$ ps5.. Yet if you go amd, who sell reasonable amounts of vram, you're handicapped on cuda / stable diffusion / coding uses without cuda; yet nearly everything that uses cuda wants vram, and nvidia won't sell you 16+ gb for 700$ on a 4070 without 1000$ 4080s for +27$ worth of vram..
      It's planned obsolescence.. Why sell your consumers a 300$ product that lasts for 8-10 years when you can scalp them for 700$ for a product that lasts 5?

  • @alrizo1115
    @alrizo1115 Před rokem +4

    Western tech companies rarely go for cheaper options to accommodate the Asian market. The only chance for GPUs to dramatically drop to their true price is for China or any Asian country to compete in the same market.

  • @realSirDextrose
    @realSirDextrose Před rokem +21

    Linus discovering basic economics all by himself through the PC industry is incredible

    • @PrograError
      @PrograError Před rokem +4

      It's not much; it's like being able to tell a country's defence from its defence acquisitions.

    • @UncleJemima
      @UncleJemima Před rokem +24

      I think he's well-aware of basic economics, and this is far from the first time this kind of topic has come up. He is after all a capitalist and understands what it takes to run a business and be profitable (even if he's nowhere near the scale of massive publicly-traded corporations).
      The tough thing to handle is the cognitive dissonance between "this is the system I advocate for" and "I don't like the inevitable outcome of this system where corporations make maximized and ever-increasing profits their #1 goal at the expense of the consumer".

    • @avengerulsasuke5814
      @avengerulsasuke5814 Před rokem +2

      @@UncleJemima “A business that makes nothing but money is a poor business”. - Henry Ford

    • @johnathanera5863
      @johnathanera5863 Před rokem

      ​@Avengerul Sasuke unfortunately that's not how it works in the real world.

  • @dermf9
    @dermf9 Před rokem

    i love the new background - i feel like im alive

  • @gendo1123
    @gendo1123 Před rokem

    Where is the tier list video?

  • @jcorey333
    @jcorey333 Před rokem +7

    This reminds me a lot of the innovator's dilemma

    • @jimmified
      @jimmified Před rokem

      What is that?

    • @SaltyAdvisor
      @SaltyAdvisor Před rokem +2

      Innovator's Dilemma - though large innovators have some motivation to innovate, they also have a strong disincentive to do so, as new products will undermine their existing ones.

    • @jimmified
      @jimmified Před rokem

      @@SaltyAdvisor ah, ok, thanks

    • @jcorey333
      @jcorey333 Před rokem +2

      It's a book that talks about how large companies don't innovate as much/ in the same ways as small companies, and gives the advice that if you are a large company and you want innovation, you should spin off a small subsidiary.

    • @jimmified
      @jimmified Před rokem

      @@jcorey333 oh, I’ll look into it