8 GB of VRAM is a Problem. Is 10 GB Any Better?

  • Published 1 Jun 2024
  • 8 GB of VRAM is a problem with the games coming out; it isn't even uncommon for games to want about 12. The RTX 3080 released a couple of years ago and is still a beast of a card. Can it hold up?
    HUB: • VRAM Issues, Crashing ...
    MLID: • Death of 8GB GPUs, RX ...
    Danny Boy: • Not Enough VRAM!!!
    TechPowerUp: www.techpowerup.com/gpu-specs...
    My Spotify:
    open.spotify.com/artist/3Xulq...
    0:00 - The memory problem
    1:47 - 3080 in RE4
    3:04 - 3080 in Last of Us
    3:38 - 3080 in Fortnite
    5:13 - 3080 in Atomic Heart
    5:26 - THE IRONY
    6:51 - VRAM misconceptions
    7:39 - Who VRAM actually affects
    9:36 - Does AMD's approach work?
  • Gaming

Comments • 1.9K

  • @livedreamsg
    @livedreamsg 1 year ago +1685

    If Intel can afford to put 16GB VRAM on a $350 card, I don't want to hear any excuses for Nvidia.

    • @BeatmasterAC
      @BeatmasterAC 1 year ago

      NVidia: "but...but...mUh TeNsOr CoReS...mUh DlLs...MuH FrAmE gEnErAtIoN...mUh iNfLaTioN...mUh HigHeR tSmC pRiCeS...mUh CuDa CoReS..."

    • @shanksisnoteventhatstrongbruh
      @shanksisnoteventhatstrongbruh 1 year ago +275

      Agreed. I mean, Nvidia has a $350 card with 12GB (the 3060); the 3060 Ti, 3070, 3070 Ti, and 3080 having less VRAM than the 3060 is so stupid. Ngreedia at its best.

    • @KermenTheFrog
      @KermenTheFrog 1 year ago +19

      The difference is Intel was selling those cards at or near cost.

    • @sorinpopa1442
      @sorinpopa1442 1 year ago +42

      Exactly. Puck Ngreedia, they've kept the gaming industry frozen these last few years because their GPUs severely lack VRAM (the 3090 being the only exception).

    • @karlhungus545
      @karlhungus545 1 year ago +25

      Unfortunately Nvidia couldn't care less what you think... or anyone else on YT for that matter. They own the GPU market, and will for the foreseeable future. You don't need that VRAM anyway, unless you only play crap console ports at 4K with the 1% that have a 4K monitor 🙄😂 Buy AMD then (you won't), or better yet, have a brain and just get a console...

  • @Barrydick186
    @Barrydick186 1 year ago +620

    VRAM isn't the problem. Nvidia's VRAM is the problem.

    • @ForceInEvHorizon
      @ForceInEvHorizon 1 year ago +12

      Lol more like AMD

    • @RationHans
      @RationHans 1 year ago +50

      I thought it was the game devs that don't optimize xd

    • @ForceInEvHorizon
      @ForceInEvHorizon 1 year ago +6

      @@RationHans If AMD hadn't released their FSR, we wouldn't have this problem. Nvidia's DLSS isn't the problem, since it's exclusive to RTX cards, but once AMD released FSR, which is available to every card, the devs got lazy about optimizing their games, since they know we can just use DLSS/FSR.

    • @V1CT1MIZED
      @V1CT1MIZED 1 year ago

      @@ForceInEvHorizon you sound like a child

    • @Rivexd
      @Rivexd 1 year ago +147

      @@ForceInEvHorizon that’s like saying “if guns weren’t invented, people wouldn’t kill each other”

  • @Tubes78
    @Tubes78 1 year ago +576

    I remember choosing the 6800 XT 16GB over the 3080 10GB because of this.

    • @eldritchtoilets210
      @eldritchtoilets210 1 year ago +90

      Same, I guess it's the "fine wine" effect starting to settle in.

    • @OffBrandChicken
      @OffBrandChicken 1 year ago +44

      Same, I'm starting to see my purchase was the correct one overall.
      While some people are saying "you don't need 8GB", they are also throttling their game, while I'm turning everything up to ultra.

    • @gruiadevil
      @gruiadevil 1 year ago +67

      @@OffBrandChicken It's the same people that said "YoU dOn't nEeD a 4770K. What are you gonna do with 8 threads? An E8400, 2 cores/2 threads, can run any game."
      Meanwhile, over the next 10 years they swapped 6 CPUs + mobos + RAM kits, while I held on to my i7.

    • @OffBrandChicken
      @OffBrandChicken 1 year ago +18

      @@gruiadevil Same. It's not about what tech is "technically fastest right now in this specific use case", but "is this going to fulfill my use cases for the next X years".

    • @OffBrandChicken
      @OffBrandChicken 1 year ago +26

      @@gruiadevil I just love how, when Nvidia adds more than is needed for the time, they're seen as almighty Jesus bestowing greatness, while when AMD does something extra that is frankly very beneficial,
      the Nvidia users are like, "yes, but I could get 10% more speed in games that used less than 8GB 10 years ago, and that's what truly matters", instead of thinking about the 20% they'll gain in the future.
      The copium is so hard that they don't even see it. Even the youtubers.

  • @LucidStrike
    @LucidStrike 1 year ago +111

    I mean, the latest AAA games eventually become affordable Steam Sale games, and so the same problem eventually hits you even if you're not buying at launch.

    • @77wolfblade
      @77wolfblade 1 year ago +20

      Remember no preorders!

    • @SpinDlsc
      @SpinDlsc 1 year ago +9

      True. I also think his Steam argument only has a limited degree of validity, because if you also look at the Steam Hardware Survey and see what kind of graphics cards most people have, it's 50, 60 and 70-class cards, and a lot of those are still in the 10, 16 and 20 series. A big reason many people haven't wanted to upgrade in the last couple of years is because of the recent pricing problem in the GPU space and the current recession, so by that metric, most of those people aren't exactly going to try running any of the newer, shinier games.
      Also, if VRAM not being needed is the argument we were going to make, then we also have to ask why NVIDIA is going in so hard on marketing ray-tracing and now path-tracing to begin with when they aren't adding enough VRAM to help that feature run better on some of these cards in the long term. By the point of NVIDIA "not needing" to add more VRAM than is necessary for most users, we should also argue that they shouldn't be marketing ray-tracing to begin with.

    • @OrjonZ
      @OrjonZ 1 year ago +2

      A lot of people bought and played Hogwarts.

    • @ZoragRingael
      @ZoragRingael 1 year ago +2

      Plus, there are Steam sales.

    • @DeepfriedBeans4492
      @DeepfriedBeans4492 1 year ago +1

      @@SpinDlsc idk, I think ray tracing is a good thing to push, but it's not there yet, so Nvidia should be taking a hit to their profit margins to keep prices that actually make sense, as opposed to doing the complete and utter opposite like they currently are.

  • @xEricGNx
    @xEricGNx 1 year ago +113

    I remember when 512MB was enough for gaming.
    Thank you for your service, GeForce 8800 GT.

    • @ro_valle
      @ro_valle 7 months ago +1

      I remember asking my dad for an 8800 GTS 320MB and he surprised me with an 8800 GTS 640MB. I was amazed by that amount of VRAM.

    • @davidrenton
      @davidrenton 7 months ago

      My first PC had 4MB of RAM (yep, MB, not GB), and not in the GFX card; it didn't have one, 3D acceleration didn't exist.
      I think my hard disk was 20MB.
      4MB of system RAM total, but hey, that wasn't the problem; the problem was trying to get all the DOS drivers, like CD-ROM, sound, and mouse, into the first 640K.
      Doom 1's final boss was stuttery, but then I spent an insane amount and went to 8MB; Doom's final boss, smooth as butter.

    • @m.i.l9724
      @m.i.l9724 5 months ago

      @@davidrenton That means if you'd committed to being a father when you were 18, your child could have a child my age, I guess. Damn.

    • @davidrenton
      @davidrenton 5 months ago +1

      @@m.i.l9724 Not yet; my child would have had to become a parent at 13, which is unlikely. I'm 49, so if I'd had a kid at 18, I'd have had an 18-year-old at 36. Hence my hypothetical grandkid could be 13.
      That said, it's not impossible: people have kids at, say, 15, their children at 15, and they're grandparents by the time they're 30.
      I recently watched a TV bit from the '80s about 6 generations alive in one family, so from the baby: mother, grandmother, great-grandmother, great-great-grandmother, and the great-great-great-grandmother was still alive. They were all together in the studio.

    • @valentinvas6454
      @valentinvas6454 4 months ago

      In 2012 I thought the 2GB in the GTX 660 would be enough for a decade, but even 4-5 years later it was easily overwhelmed. Nvidia has often screwed us over on VRAM capacity. The popular GTX 970 only had 3.5GB of usable VRAM, and as soon as you touched the last 0.5GB, which was much slower, you started to see huge stutters.
      With Pascal and Turing they were quite generous, but after that it's downhill once more. The 3070 Ti and 4060 Ti with 8GB are such jokes.

  • @madrain3941
    @madrain3941 1 year ago +245

    As soon as I heard that the RTX 4060 was gonna release with 8GB of VRAM, I instantly went ahead and purchased the RX 6700 XT with 12GB of VRAM, and honestly, it is a HUGE game changer, at least for me.

    • @weshouser821
      @weshouser821 1 year ago +11

      What I don't understand is that we have systems with 64GB of memory; I really don't see why we can't have a card with 32/64GB of VRAM instead of messing around with it. Why can't we just make cards with upgradable VRAM slots? I don't know... it's over my head, but I really think it's because of "planned obsolescence".

    • @LeoMajor1
      @LeoMajor1 1 year ago +23

      @@weshouser821 Yes, it is a bit over your head, because GPU VRAM and system RAM are not the same... and a card with 64GB of VRAM would be HUGE and need heavy power draw and more cooling. It's even over my head, so someone else can add to that.

    • @weshouser821
      @weshouser821 1 year ago

      @@LeoMajor1 Would it really, though?

    • @brkbtjunkie
      @brkbtjunkie 1 year ago +3

      @@weshouser821 Have you seen the prices of DDR5? GDDR6X is a whole different ballgame as well. Apples and oranges.

    • @Elinzar
      @Elinzar 1 year ago +5

      Why we don't have 64GB cards in the consumer space is simply because GDDR6 is still not dense enough. The enterprise version of the 3090 Ti had 48GB of VRAM, and I think this gen might have an 84GB one or something like that.
      Capacities like that were only achieved with HBM3 in the past gen.
      So yeah, I would say even 20GB+ midrange cards are still miles away; 16GB will become the norm, though. Something I love about what AMD did with RDNA2 (absolutely underrated cards this gen) is that everything from the 6800 up to the top tier got 16GB. I mean, the 6950 XT should have gotten at least 24, but you get the point: everything from the upper midrange to the high end got a fair bit of VRAM, and the 6700 XT got 12.
      Only the entry level got 8GB.

  • @ConfusionDistortion
    @ConfusionDistortion 1 year ago +57

    Working adult here who buys AAA games, so yeah, this affected me. It was sobering to start up Company of Heroes 3 and find I couldn't max it out due to a VRAM limit on my 2070 Super. I've had this card for 3 years, and I still like it, but yeah, sign of the times. So now it sits in a secondary PC and I dropped the cash on a 7900 XT. Problem solved, and now I am back to running everything again and not sweating VRAM issues in The Last of Us, COH 3, etc.

    • @kirbyatethanos
      @kirbyatethanos 1 year ago +7

      Same boat as me. I recently had an RTX 2060 6GB and upgraded to an RTX 4070 Ti (the 7900 XT is more expensive where I live).

    • @xTurtleOW
      @xTurtleOW 1 year ago

      Same, man, my 3080 was running out of VRAM very fast, so I dropped a 3090 Ti into my second rig and a 4090 into my main; now no VRAM issues anymore.

    • @xkannibale8768
      @xkannibale8768 1 year ago +4

      So go from ultra to high? Like, lmao. There isn't even a difference you'd notice 90% of the time, and it uses half the VRAM 😂

    • @legionscrest143
      @legionscrest143 1 year ago +1

      @@xkannibale8768 Really?

    • @JonesBeiges
      @JonesBeiges 1 year ago

      @@legionscrest143 Yes, really. I did a CPU upgrade and can still play the newest titles with my 6-year-old GPU above console graphics...
      Many people like you seem to fall for those youtube salesmen... Only idiot game devs create games for that 1% of elitist PC nerds who need 4K ultra settings at 144 Hz because...

  • @liberteus
    @liberteus 1 year ago +190

    I own a 3080 10GB, and my favorite game received tons of graphical updates, to the point where 10GB isn't enough anymore. I had to cut all settings from ultra to a medium/high mix to get it over 60fps, down from 90 two years ago.

    • @clownavenger0
      @clownavenger0 1 year ago +19

      So they added more demanding graphical settings and that hurt performance. Okay, cool.

    • @sirab3ee198
      @sirab3ee198 1 year ago +110

      @@clownavenger0 The GPUs are limited by VRAM; it has nothing to do with the demanding graphics. This is the same as the R9 290X, which had 4GB of VRAM, versus the 780 Ti, which had 3GB, and the R9 290X aged much better than its Nvidia counterpart. Nvidia is playing the planned-obsolescence game, forcing you to upgrade your GPU because it is RAM-starved, not because it is slow. I can bet you my RX 6800 with 16GB will run games smoother in 2 years than your 10GB RTX 3080. When we told people 2 years ago about the limitations of 8GB of VRAM on the RTX 3070, they called us Nvidia haters...

    • @nazdhillon994
      @nazdhillon994 1 year ago +5

      Which game?

    • @weakdazee
      @weakdazee 1 year ago

      literally same

    • @Tubes78
      @Tubes78 1 year ago +3

      @clownavenger0 That's always going to have an impact, but I can't help thinking the card would lose less performance with more VRAM.

  • @ngs2683
    @ngs2683 1 year ago +67

    I just want to say one thing. I got a 1060 6GB in 2016 and spent nearly 7 years with it. Then I finally took my hard-earned money and bought a 3080 Ti this past November, on Black Friday. 12GB of VRAM. Then IMMEDIATELY new AAA games became this crazy demanding, and devs are saying that 12GB is the minimum. On top of that, NVIDIA is effectively implementing planned obsolescence. The 4070 Ti, the superior card to my 3080 Ti, had no evolution in VRAM. It's a 12GB card. I just gotta say... it hurts to get an upgrade after 6.5 years only to immediately end up the new low tier for this future they speak of. And I do blame NVIDIA. No card above the 3060 should have only 8GB, and the 3080 Ti should have been a 16 or 20GB card. 3070 owners have all the right in the world to be mad. NVIDIA KNEW this was an issue, but they don't care. They still don't.

    • @aquatix5273
      @aquatix5273 1 year ago +5

      The cards would be completely fine with this list of VRAM. The VRAM on these cards would be how much they would actually need to stay efficient relative to their compute power:
      RTX 3050: 6 GB
      RTX 3060 + RTX 3060 Ti: 8 GB
      RTX 3070 + RTX 3070 Ti: 10 GB
      RTX 3080: 12 GB
      RTX 3080 Ti + RTX 3090 + RTX 3090 Ti: 16 GB
      RTX 4050: 6 GB
      RTX 4060: 8 GB
      RTX 4060 Ti: 10 GB
      RTX 4070: 12 GB
      RTX 4070 Ti: 16 GB
      RTX 4080: 20 GB
      RTX 4090: 24 GB

    • @dimintordevil7186
      @dimintordevil7186 8 months ago +1

      @@aquatix5273
      RTX 3070 = 12GB of VRAM (let's be honest, it's cheap for the factory)
      RTX 3070 Ti = 12GB
      RTX 3080 = 16GB
      RTX 3080 Ti = 16GB
      RTX 3090 = 20 or 24GB
      RTX 4050 = 8GB
      RTX 4060 = 10GB
      RTX 4060 Ti = 12GB
      RTX 4070 = 16GB
      RTX 4070 Ti = 16GB
      RTX 4080 = $1200 / 20GB
      RTX 4090 = 24GB

    • @ozgurpeynirci
      @ozgurpeynirci 8 months ago +1

      @@aquatix5273 The 4060 Ti is WAY MORE capable than 8GB; that's why they made a 16GB version. The 4070 should be 16 as well, if not more. As a 10GB 3080 owner, this hurts.

    • @aquatix5273
      @aquatix5273 8 months ago +1

      @ozgurpeynirci Yeah, doubt. The 4060 Ti is barely better than the 3060 Ti; neither card has the performance.

    • @dimintordevil7186
      @dimintordevil7186 8 months ago +2

      @@aquatix5273
      The 3060 Ti is faster than the 1080 Ti. The 1080 Ti was faster than the 2080 Super. Nowadays the 2070 is as fast as the 1080 Ti. Therefore, the 3060 Ti is a great card.

  • @rajagam5325
    @rajagam5325 1 year ago +89

    I got a 6800 XT for 3070 money and am really happy with it, and the video export times are really good.
    (Don't support companies; support the better product :))

    • @gruiadevil
      @gruiadevil 1 year ago +2

      This is the best attitude!
      If NZXT or Corsair made a GPU and it had good price/performance compared to the others, I'd buy it.

    • @shanksisnoteventhatstrongbruh
      @shanksisnoteventhatstrongbruh 1 year ago +1

      Yep, right now the 6950 XT is CHEAPER than the 3070 Ti while having DOUBLE the VRAM (8GB vs 16GB) and being 36% faster!!! CRAZY

    • @IchiroSakamoto
      @IchiroSakamoto 1 year ago +4

      +1. I've bought Nvidia products all my life, but I went for the 6800 XT for better value for money, as my 2070S really disappointed me in RT. I couldn't care less who the maker is, but I'm disappointed there are so many fanboys around.

    • @abdulqureshi2803
      @abdulqureshi2803 1 year ago +1

      I got a 6800 XT for MSRP right from AMD, upgraded from a 3070 (which I sold for the price of the 6800 XT during the crypto boom), and it's so much better. I ran into the VRAM problem in FH5 a few times on the 3070; I never had that issue with the RX 6800 XT, though I did get the black screen of death a couple of times when I first got it.

    • @Fluskar
      @Fluskar 6 months ago

      Facts. I will never support brand loyalty; it's plain stupid.

  • @trr4gfreddrtgf
    @trr4gfreddrtgf 1 year ago +246

    This is also why I'm going to take a 7900 XT over a 4070 Ti; 20GB seems a lot more future-proof than 12.

    • @VaginaDestroyer69
      @VaginaDestroyer69 1 year ago +40

      Yeah, and with FSR 3.0 coming out soon, AMD is really closing the gap on Nshitia. RT is still not in a place where I would be willing to base my entire GPU purchase on ray tracing performance alone. I can see AMD and Intel offering outstanding value to gamers in the future compared to Nshitia's extortionate pricing.

    • @trr4gfreddrtgf
      @trr4gfreddrtgf 1 year ago +13

      @@VaginaDestroyer69 Even the 7900 XT and 7900 XTX have made massive improvements in ray tracing; the 7900 XT is only a little behind the 4070 Ti (with ray tracing on), and I think we can expect the gap to close as the 4070 Ti runs out of VRAM over time.
      I can't wait to see how FSR3 compares to DLSS3. It probably won't beat DLSS in visual quality, but hopefully it gives a similar performance boost.

    • @CRF250R1521
      @CRF250R1521 1 year ago +10

      The 7900 XT has low 1% lows. I returned it for a 4080.

    • @chickenpasta7359
      @chickenpasta7359 1 year ago +2

      @@VaginaDestroyer69 You're acting like AMD is the hero in this situation. They literally wait until Nvidia drops their MSRP and then undercut them by a little.

    • @WackyConundrum
      @WackyConundrum 1 year ago +4

      @@CRF250R1521 Interesting! Do you remember a particular benchmark with these results?

  • @sirab3ee198
    @sirab3ee198 1 year ago +9

    So AAA games are a niche now????? :))))) RE4 sold 5 million copies, Elden Ring sold 20 million, The Witcher 3 sold 40 million, etc. I hate it when people misuse Steam charts to prove their point. Hogwarts Legacy is a single-player game, same as Elden Ring (kinda), but a month after launch people move on, because it is a single-player game!!! People finish it and move on. Nvidia giving you 8GB of VRAM on the x70 series was a slap in the face for consumers, and now they are doing the same with 12GB. People who bought the RTX 3070, and who will buy the RTX 4070, will want to play the latest AAA games.

  • @veda9151
    @veda9151 1 year ago +50

    It is very true that most people aren't actually affected by the VRAM issue right now. The real controversy is Nvidia not providing enough VRAM while pricing their GPUs as high-end models. No one is complaining that the 3050 or the 6600 only got 8GB. It's the 3070 and the 3080 (10GB) that attract all the attention.

    • @MATRIX-ERROR-404
      @MATRIX-ERROR-404 1 year ago +1

      RTX 3070 / RTX 3070 Ti / RTX 3060 Ti = 8GB VRAM

    • @infiniteblaz3416
      @infiniteblaz3416 1 year ago +6

      Because the 3050 & 6600 are entry-level cards priced accordingly. The 3070/Ti is around a whopping $500 with the same amount of VRAM as the entry-level GPUs. Hence why people are calling out Nvidia's stupidity.

    • @vectivuz1858
      @vectivuz1858 1 year ago +3

      @@infiniteblaz3416 That is exactly his point, though. Price accordingly and people will understand.

    • @r3tr0c0e3
      @r3tr0c0e3 1 year ago

      System RAM like DDR4 or DDR5, or a fast NVMe drive, can easily be used to compensate for the lack of VRAM; devs just need to implement it, but they are lazy af.

    • @vectivuz1858
      @vectivuz1858 11 months ago +1

      @@r3tr0c0e3 Uhm, yes, some games do that, and it causes major lagging.

  • @HogEnjoyer
    @HogEnjoyer 1 year ago +112

    Nvidia: “Here gaming studio, $$ to use more VRAM”
    Also Nvidia: “Higher VRAM costs $$$$$$$$$$$$$”

    • @hardrock22100
      @hardrock22100 1 year ago +11

      You do realize The Last of Us and RE4 are AMD-sponsored titles, right?

    • @gruiadevil
      @gruiadevil 1 year ago +29

      @@hardrock22100 You do realize they use more VRAM precisely because AMD packs their GPUs with more VRAM, and Nvidia doesn't.

    • @hardrock22100
      @hardrock22100 1 year ago +13

      @@gruiadevil
      1. This person was trying to claim that Nvidia is paying devs to use more VRAM in games that are sponsored by AMD.
      2. It's interesting that AMD-sponsored titles are running like hot garbage.
      3. The company that ported The Last of Us was the same one that ported Arkham Knight.
      4. The Last of Us crashes when you run out of VRAM. That should not be happening. I've seen it even BSOD some PCs.

    • @AntiGrieferGames
      @AntiGrieferGames 1 year ago

      @@hardrock22100 The Last of Us is just a piece-of-shit port; that's why you're getting BSODs.

    • @vaguedreams
      @vaguedreams 1 year ago +4

      @@hardrock22100 3. The company that ported Arkham Knight is also the same company that ported the Uncharted: Legacy of Thieves Collection.

  • @stratuvarious8547
    @stratuvarious8547 1 year ago +35

    When Nvidia released the 3060 with 12GB of VRAM, everything up to the 3070 Ti should have also had 12GB, with the 3080 and 3080 Ti getting 16GB. I just hope this is the straw that costs them enough market share to change their ways, instead of always thinking they can do whatever they want and people will just buy it.

    • @MarcoACto
      @MarcoACto 1 year ago +1

      The thing is that the 12GB version of the 3060 was obviously aimed at crypto mining, which required a lot of VRAM and was hot at the time. It was never designed with gaming in mind.

    • @stratuvarious8547
      @stratuvarious8547 1 year ago

      @@MarcoACto Yeah, it's true, but that doesn't change the fact that the SKUs above it should still have been increased. Making GPUs obsolete 3 years after their release is inexcusable, and that's all that giving those cards 8GB of VRAM has done.

    • @naturesown4489
      @naturesown4489 1 year ago

      @@MarcoACto Yeah, that crypto thing is a myth. NotABotCommenter has the correct reason.

    • @r3tr0c0e3
      @r3tr0c0e3 1 year ago +1

      The 3060 will still get 30fps less than a 3070/80 regardless of how much VRAM it has, lol.
      You people are clueless.

    • @stratuvarious8547
      @stratuvarious8547 1 year ago +1

      @@r3tr0c0e3 Of course it'd get lower FPS; it's a lower-class GPU. I was talking about the longevity of the purchase. Maybe before calling someone "clueless", look at the context of the conversation.

  • @Obie327
    @Obie327 1 year ago +39

    Very good observation, VEX. The older Pascal cards with 8 gigs of VRAM utilize only the feature sets they have baked in. The problem now is that all these new advanced DX12 features, plus higher resolutions, become more taxing on limited VRAM buffers in niche situations. There's a car analogy here: the car is fast but runs out of gas (tiny tank), or it gets to sixty really quick but tops out at 80 mph (low gearing). I really think everyone wants something that performs great and has future potential, practicality, or value, hoping their GPU will last a good while for their current pricey investment. Limiting the RAM only limits the possibilities for game developers.

    • @user78405
      @user78405 1 year ago +6

      Limiting RAM should force game developers to open doors, not milk them. That is John Carmack's philosophy: good-quality work ethic over quantity. A company always becomes sloppy when it bites off more than it can chew, and Ion Storm was a good example back then.

    • @Obie327
      @Obie327 1 year ago +1

      @@user78405 I totally agree with you. But as in Moore's Law is Dead's interview with the game developer... the modern consoles are using around 12 gigs of VRAM. I hate sloppy, lazy code, but I do like to experience everything that the developers have to offer. Maybe AI can help clean this up? I feel that if more adopt higher RAM limits, this issue won't be a problem going forward. I feel like we are in a weird transition period, and Nvidia could be more generous with their specs. Have a great weekend and Easter!

    • @David_Raab
      @David_Raab 1 year ago +1

      Some people like to buy a newly released graphics card (3070) for $600-700 and like that they have problems with already-released games because of too little VRAM. People who criticize this are obviously AMD fanboys.

    • @Obie327
      @Obie327 1 year ago +2

      @@David_Raab It's been years that we've had 8 gigs of VRAM on a premium GPU. My GTX 1080 is 7 years old, and AMD got there even earlier. The latest consoles *were* the warning sign that more RAM was going to be needed, and now they are using 12+ gigs for their new console releases. I just think it's a damn shame to pay $500+ for anything new with only 8 gigs and call it acceptable going forward. I think Nvidia just missed the mark with their current product stack. Also, Nvidia's new stuff still has the older display connector standard, which has me scratching my head, since they have the same display tech on the $1600 RTX 4090 as well. Intel's Arc A770 LE is only $350 and has the latest display connectors, the DX12 Ultimate/Vulkan/XeSS feature sets, and 16 gigs of VRAM. Is video RAM that expensive to put more of on an $800 4070 Ti? I just think the whole current rollout of GPUs is off on many levels. Time will rapidly tell how fast our cards become obsolete. Crossing fingers. Peace!

    • @David_Raab
      @David_Raab 1 year ago +1

      @@Obie327 I agree with all of that. I find it a shame currently. I've been buying Nvidia for nearly 20 years, and now I'm at the point of buying AMD instead. Nvidia now sells overpriced cards; the 4070 in my opinion should have been a 4060. I could live with such a card and 12GB if it cost 300€, but not 600€. And yeah, they can't tell me that 4GB or 8GB more of GDDR6 RAM is that expensive. Any card over 300€ should have at least 16GB of VRAM.

  • @clownavenger0
    @clownavenger0 1 year ago +33

    Hogwarts was patched and works fine on a 3070 now. RE has a bugged implementation of RT, so if you turn that off, the 3070 doesn't have any issue in that game either. TLOU has PS1 textures on medium and other graphical issues regardless of hardware. If the question were "Is 8GB on the edge for the 3070?" I would say yes, but games are releasing completely broken, which increases the need for overpowered hardware. Some very good-looking 3D titles with high-quality textures use 4-6GB on ultra (Atomic Heart, for example), while TLOU uses 8 while looking like a PS2 game at medium settings. I run a 3080 10GB myself and play everything at 1440p or 1440p ultrawide, using DLSS Quality whenever it's offered, to push about 100 FPS. I have not had a single issue, but I only buy games on sale, so a game might be fixed by the time I buy it. It seems like people just want to make excuses for game developers.

    • @DenverStarkey
      @DenverStarkey 1 year ago +2

      Well, these games were also designed around cards that have 16 gigs (the Radeon cards), so the devs got sloppy with VRAM usage.

    • @jorge86rodriguez
      @jorge86rodriguez 1 year ago +4

      Just buying the game on sale avoids a lot of headaches, hahaha. Early buyers are beta testers xD

    • @tyisafk
      @tyisafk 1 year ago

      I played through RE on an RTX 2070 and an Arc A750, both 8GB cards. I agree that RT (and hair strands on the Intel) was the main issue with the game. To be fair, though, both reflection implementations aren't good at all, so it's worth just having both RT and screen-space reflections turned off. I even used higher-quality textures than the game suggested with those disabled, as per Digital Foundry's suggestion, and the game ran flawlessly on both cards. I'm glad I don't often care for big AAA titles, and I have a PS5 if I'm that desperate to play one that isn't optimized properly on PC, but I do feel bad for regular AAA game fans who exclusively play on PC. PC used to be the go-to for long-term savings if you didn't mind paying more up front, but now a current-gen console is definitely the better option if you just want to be able to play anything decently.

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago

      @@DenverStarkey Those are high-end cards. How many people actually own such a GPU? All the talk surrounding this "not enough VRAM" is mostly, if not entirely, about max settings. Years ago I read an interview with a dev (I forget which tech outlet did it); they said that on PC they try to optimize their game even for Intel iGPUs because of how big that user base is. And back then Intel iGPUs were considered super crap.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 1 year ago +2

      10GB will soon not be enough. Dying Light 2 maxes it out easily with RT. You'll have to increase your reliance on DLSS and lower texture quality in the upcoming years. Even Flight Simulator maxes out 10GB of VRAM at 1440p. So...

  • @Doric357
    @Doric357 1 year ago +28

    The 6800 XT with 16GB seems to be a sweet spot. I'm a casual enthusiast, so I won't claim to know the ins and outs of all this, but creators have been talking about VRAM forever, and I've always believed more is better. However, I don't believe it should come at such a high premium.

    • @tokki2490
      @tokki2490 10 months ago

      If you have a 6800 XT... you are not a casual enthusiast lol

  • @tech6294
    @tech6294 1 year ago +133

    Great video! We need more people talking about this. If Nvidia and AMD tomorrow made 24GB the standard for midrange and 48GB for the high end, games would look photorealistic overnight. And no, VRAM doesn't cost that much more going from 12GB to 24GB; you're probably only talking about a $40 hike in price. These companies could easily make a $600 24GB card. They simply choose not to.

    • @Clashy69
      @Clashy69 1 year ago +23

      AMD has already put enough VRAM on their cards, even the low-end 6000 series. Nvidia should do the same and add at least 12GB of VRAM to their lower-end cards. I'd even be fine if it was just 10GB, but we'll see, since they gave the 4070 12GB.

    • @bronsondixon4747
      @bronsondixon4747 1 year ago +28

      It'd make no difference if 24GB was the minimum. It just needs to be more VRAM than the current console generation has.
      Game developers wouldn't take advantage of more than 16GB, since that's all they have available in the PS5.

    • @66kaisersoza
      @66kaisersoza 1 year ago +8

      @@bronsondixon4747 The console RAM is shared with the OS.
      Around 10GB is for games and the other 6GB is dedicated to the OS.

    • @luisgarciatrigas3651
      @luisgarciatrigas3651 1 year ago +23

      @@66kaisersoza 13.5 for games, 2.5 for the OS 👍

    • @retrofizz727
      @retrofizz727 1 year ago +14

      24GB is an overreaction, wtf. You won't need 24GB for 4K before, like, 2030.

  • @NootNoot.
    @NootNoot. 1 year ago +54

    I have to agree with you on the whole 'VRAM niche' point. I myself don't usually play AAA games that tax my GPU, although I do run workloads other than gaming that need a lot of VRAM. Still, I think this whole VRAM fiasco is a very important thing to discuss. Nvidia's planned obsolescence should be put to a stop; give consumers what they NEED for what they PAID for. Like you said, performance is what scales with VRAM.
    The 1070 doesn't need any more VRAM because of how it performs, unlike the 3070, which should be able to play at 1440p+ and shouldn't be bottlenecked by memory, causing stutters, instability, even games failing to boot at all. It's a business move, and it totally sucks. While these videos may seem 'repetitive' or 'controversial', I appreciate you making this.

    • @johnny_rook
      @johnny_rook 1 year ago +5

      Define "niche".
      AAA games sell by the millions on PC, and the new consoles have at least 12GB of VRAM addressed to a GPU of 4-year-old RTX 2070 tier. People (both devs and players) will use high-res textures if they can, regardless of GPU power, and textures are the biggest contributor to VRAM usage.

    • @Lyka-clock
      @Lyka-clock 1 year ago

      So far my 3060 Ti works well for mostly PC-type games like RTS and some RPGs. I'll use a console for AAA games, but these days most of it is trash, really; the ports haven't been that great to begin with. I tried Returnal on PC and it was stuttering no matter what settings I used, and it wasn't a VRAM issue either. I've already played Dead Space, RE4, and TLOU. Maybe there needs to be a focus on making a new IP with great gameplay. That would be a wonderful idea! Let's hope Diablo 4 is actually fun. The demo was good but not great, and it isn't a new IP or anything.

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago +2

      @@johnny_rook Millions of those PCs do not have a GPU with 12GB of VRAM. In fact, more than half of them are probably only using an iGPU.

    • @johnny_rook
      @johnny_rook 1 year ago

      @@arenzricodexd4409 Yeah, not having enough VRAM is the issue, isn't it?
      How do you know that, exactly? Isn't it funny when people pull numbers out of their asses without a shred of evidence to support them?

    • @alanchen7757
      @alanchen7757 1 year ago +1

      @@johnny_rook The proof is AMD's competing GPUs against the 3070, 3080, etc. outlasting Nvidia's due to having more VRAM.

  • @denerborba4994
    @denerborba4994 1 year ago +29

    Digital Foundry recently made a review of the RE4 remake, and there they tested a 2070 Super; from their analysis it seems that RE4 will never crash with textures at 8GB unless you are using ray tracing. I have also been playing the game on an 8GB card and haven't faced any crashes so far either, about 13 hours in.

    • @viktor1master
      @viktor1master 1 year ago

      I tested the RE4 remake with a 3070, a Ryzen 9 3950X, and 16GB of RAM, with a total limit of 12GB of VRAM and textures at 8GB. Funny though, in the settings it showed my card only has 7GB of VRAM, so I don't know; in other games it's the normal 8GB. But I was impressed: it was 135 fps down to 70/80, the lowest slightly over 60, over 30 minutes of testing the demo. Now I know I must have it; the game looks so good 🤣

    • @JoeL-xk6bo
      @JoeL-xk6bo 1 year ago +4

      It still has issues loading in high-res textures. Stop and look at certain surfaces; the textures will pop in and out.

    • @r3tr0c0e3
      @r3tr0c0e3 1 year ago

      RT doesn't look particularly accurate or appealing in this game anyway; besides, it's just another condom they slipped on a game that looks perfectly good without it, so it stayed off, just like in any other unoptimized garbage they try to sell us.
      Unless a game goes full path tracing, it's simply not worth it, and as we can see, even a 4090 struggles to do that at playable fps.

  • @Sinflux420
    @Sinflux420 1 year ago +35

    Just got a 20GB 7900 XT. Being able to run RE4 maxed out with ray tracing and have 6GB left over is pretty nice, ngl. I didn't realize this was an ongoing issue until after getting the card; glad it's well equipped!

    • @Drake1701
      @Drake1701 1 year ago

      Out of curiosity, what resolution do you play at?

    • @Austrium1483
      @Austrium1483 1 year ago

      What did you pay?

  • @franciscoc905
    @franciscoc905 1 year ago +16

    I was definitely waiting to see what you'd contribute to the discussion. I definitely see this as a negative for people wanting to play AAA games in 2023 with high details, but it will mean a fire sale of great deals on second-hand graphics cards for competitive gaming.

  • @themadnes5413
    @themadnes5413 1 year ago +26

    I had a 1080 Ti, and the main reason I did not get a 20- or 30-series card was VRAM. The 3080 has less, and the 3080 Ti had 12GB; 1200€ for a 12GB GPU is kinda stupid and in this regard a sidegrade. Now I have a 4080; 16GB is still a bit on the low side for a 1200€+ GPU, but I can live with that. I know AMD is an option too, and I was about to get a 7900 XTX, but the price of the 4080 was only like 50€ more. So I chose the Nvidia GPU; I also like RT and DLSS a lot.

    • @dededede9257
      @dededede9257 1 year ago +4

      I think 16GB is still fine. Yeah, it could be more for this price, but I don't think you'll get VRAM-limited.

    • @zdspider6778
      @zdspider6778 1 year ago +11

      1200€+ is the price of a "decent enough" second-hand car.
      The MSRP of the 1080 Ti was $699.
      Ngreedia is laughing all the way to the bank every time a schmuck buys one, lol. They're sitting comfortably on the shelves; not even scalpers are touching them. But enjoy it, I guess. LOL. You got a step down, btw, from a "Ti" to a "non-Ti" 80-class card, for much more money.

    • @paranikumarlpk
      @paranikumarlpk 1 year ago +3

      You could have easily chosen the 7900 XT, but you just made an excuse to stick with Nvidia, lol. GGs.

    • @dededede9257
      @dededede9257 1 year ago +6

      @@paranikumarlpk He made the right choice; for almost the same price the RTX 4080 is better than the XTX and doesn't have issues like 100W idle power with multiple monitors.

    • @vaghatz
      @vaghatz 1 year ago +2

      ​@@paranikumarlpk DLSS

  • @stratuvarious8547
    @stratuvarious8547 1 year ago +18

    I expected that when games designed for the current-gen consoles (Xbox Series, PS5) started releasing on PC, this was gonna start being a problem. That's why, when I was looking to upgrade my 2070, I was looking for something with a minimum of 12GB of VRAM. Since I couldn't get a new 3080 (or Ti) for a reasonable price, I went with the RX 6900 XT and its massive 16GB of VRAM. Since it was $650, it felt like the best price-to-performance in the range I was shopping in.

    • @latlanticcityphil
      @latlanticcityphil 1 year ago +1

      Man, I love my RX 6900 XT. I have no problems, and it was a great investment too. I can play all the games and have a great experience, even with Cyberpunk. 16GB of VRAM DOES MAKE A DIFFERENCE!

  • @friendofp.24
    @friendofp.24 1 year ago +28

    Heavily regretting buying the 3070 now. I camped outside a Best Buy for 20 hours and had the option to choose any 30-series card. I didn't understand at the time how much VRAM mattered.

    • @HUNK__S
      @HUNK__S 1 year ago

      😂😂 Sucks to be you

    • @soumen8624
      @soumen8624 1 year ago +16

      It's not your fault; VRAM really didn't matter until very recently.

    • @Stephan5916
      @Stephan5916 1 year ago

      @friend You live and you learn. VRAM always mattered.

    • @Stephan5916
      @Stephan5916 1 year ago +2

      @clockworknick9410 It's still Nvidia's fault. Before the 3080, their flagship card was the 2080 Ti, which had 11GB of VRAM. They should have at least matched or bettered that VRAM on the base 3080 model.

    • @naturesown4489
      @naturesown4489 1 year ago +1

      @Clockwork Nick There were people saying at the time of the 3070's release (Hardware Unboxed) that the VRAM wouldn't be enough in a couple of years. The 1070 had 8GB, the 2070 had 8GB... so why would they not have put more on the 3070?

  • @Einygmar
    @Einygmar 1 year ago +54

    VRAM is a problem, but optimization affects this issue as well. Better texture/asset streaming and more optimized BVH structures for ray acceleration would fix a lot of problems. I think the bigger issue is memory bandwidth on modern cards, which limits the throughput and affects streaming capabilities.

    • @Vasharan
      @Vasharan 1 year ago +5

      Yes, but games will continue to be unoptimized as long as every developer isn't John Carmack* or Takahisa Taura (Platinum Games), and as long as studios have deadlines and cashflow constraints.
      As a consumer, you can either not buy unoptimized games, or not buy underprovisioned hardware, or some combination of both.
      * And even Carmack's Rage was a buggy mess for years as he tried to get texture streaming to work seamlessly.

    • @gagec6390
      @gagec6390 1 year ago +4

      @@SkeleTonHammer That's just not true, though. 4K monitors, and especially TVs, have become very affordable. 1080p is only still the standard among e-sports players and extreme budget builds or laptops. Most people who are even somewhat serious about PC gaming have at least a 1440p monitor, and the fact is that anything under 12GB of VRAM just isn't enough even for the near future, much less the 4-5 years most people keep their graphics cards. If you paid more than $300 for an 8GB card recently, then you got fucking scammed. (I would know; I unfortunately bought a 3060 Ti a year ago instead of a 6700 XT.)

    • @r3tr0c0e3
      @r3tr0c0e3 1 year ago

      @@gagec6390 1440p, and even 1080p, looks playable on OLED; upscaling made it possible.
      I'm never going back to a garbage TFT panel.

  • @Tripnotik25
    @Tripnotik25 1 year ago +5

    5:52 During this interview the dev brings up stuff like "using VRAM to have high quality on all the body parts, like the eyes, in case someone looks closer", and I'm thinking we're supposed to pay $500+ for the sake of having 4K resolution on 30 body parts of an NPC. You're not kidding when you say niche. How about these devs make good games with gripping writing and stop shipping crap ports that rely on DLSS/FSR to cover up laziness. 95% of the market is going to continue using 8-12GB products, and indies will thrive.

  • @rebelblade7159
    @rebelblade7159 1 year ago +40

    I remember buying the GTX 960 4GB in 2015 for the equivalent of about $200 brand new. That amount of VRAM was considered overkill by many, but it allowed me to use the card all the way up to 2021. VRAM matters a lot if you want to use a GPU for a long time.

    • @FatheredPuma81
      @FatheredPuma81 1 year ago +3

      At the time you gained almost nothing for whatever you paid extra, and now you're gaining around 15% extra performance for whatever you paid extra.
      Just put the extra money you saved into a savings account, wait a few years, sell your 960 2GB, and get a 970 3.5GB for the exact same cost.

    • @jacobhargiss3839
      @jacobhargiss3839 1 year ago

      @@FatheredPuma81 That assumes the price actually does drop and you can find the cards.

    • @FatheredPuma81
      @FatheredPuma81 1 year ago

      @@jacobhargiss3839 Always has, always will.

    • @FatheredPuma81
      @FatheredPuma81 1 year ago

      @@DeepfriedBeans4492 Looking at the Wayback Machine, it was around $40 more. Toss that into a half-decent savings account (not your local garbage bank) and that turns into $44 minimum in 5 years.
      The GTX 970 was under $100 just before the mining craze, and I'd guess the GTX 960 2GB was at the very least above $55. I actually sold a GTX 960 that summer and bought an RX 580 8GB for $120, but I can't remember how much I sold it for. (Sold the RX 580 for $120 a year later though, lol.)
      Sucks to be you, I guess, if you're too terrified of potentially being mugged at a McDonald's in broad daylight, with people around, over $60. Sounds like you live in ghetto Siberia or something; I'd suggest moving.
      P.S. Do ghetto-Siberian shippers not let you pay $2 to print a label? Do ghetto Siberians not order things online and have loads of boxes and packing materials lying around? Does ghetto-Siberian eBay not give you free shipping for broken items?

    • @DeepfriedBeans4492
      @DeepfriedBeans4492 1 year ago

      @@FatheredPuma81 Please tell me what savings account you use that gives 110% returns in 5 years, because the only kinds of accounts I know of with that much potential are not what I would call a 'savings account', most certainly do not come without large amounts of risk, and are also called Ponzi schemes and are illegal.

  • @brandons9138
    @brandons9138 7 months ago +4

    Nvidia doesn't design these cards in a vacuum. They are talking to developers and working with them. I'm thinking that Nvidia has something cooking that will help mitigate the VRAM issues. I can't see them making 8GB the baseline knowing full well that games will not run well on it. We're all so used to having cards with 12-24GB of VRAM because, in the past, GPU performance was based solely on brute-force rasterization. With technology like DLSS and FSR, I think we'll see that change somewhat. I just benchmarked my base-model 4060 with 8GB, and it runs Cyberpunk 2077 at 1080p on ultra with DLSS at an average of 90 FPS. Even without DLSS it was over 60 FPS. I'm thinking that both Nvidia and AMD are working to make DLSS and FSR a way to keep performance high without resorting to massive, power-hungry chips that are too expensive to make and don't sell as well because of the price. The reason we are seeing some games crush these cards is that they were not developed with DLSS and FSR in mind. It may have been patched in, but who knows how well optimized it is.

  • @bladimirarroyo8513
    @bladimirarroyo8513 1 year ago +8

    Man, I'm about to buy my first GPU and all your videos are answering my doubts.
    Thank you so much 🤗

  • @Saltbreather
    @Saltbreather 1 year ago +23

    If you go into the settings for MSI Afterburner/RTSS, you can enable both dedicated and allocated VRAM readouts. That'll give you a more accurate number when looking at how much VRAM a game is actually using.

    • @r3tr0c0e3
      @r3tr0c0e3 1 year ago

      By that logic, if you have 48GB, the game will allocate that much if needed, so yeah, we need 64GB of VRAM now, lol.
      It's funny how all the recent RE games supposedly use more than 8GB of VRAM, yet the games run smooth and without any stutters, even while the settings menu indicates that the VRAM limit is exceeded.
      The RE4 remake didn't crash because of that; it was eventually fixed later. You will get small stutters and dips if you enable RT, even on a 4090, so VRAM is not the issue in this case. Far Cry 6, however, will destroy your performance if you enable ultra textures with only about 8GB of VRAM, and they kind of look just like high, lol. The RE games will use system RAM to compensate; many games actually do that, simply because recent consoles have shared RAM (GDDR6, but still). Lazy devs simply can't be bothered to port them properly, hence the VRAM rage.
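
  A side note on the allocated-versus-used distinction in this thread: the same counters can be read outside of an overlay. Below is a minimal sketch in Python, assuming the nvidia-ml-py package (pip install nvidia-ml-py) and an Nvidia GPU; it reads the card-wide totals through NVML, which shows how "allocated" covers the whole card and so overstates what a single game needs.

      import pynvml  # NVML bindings from the nvidia-ml-py package

      pynvml.nvmlInit()
      handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

      # Card-wide counter: "used" includes every allocation on the device
      # (all processes plus driver reservations), which is what most overlays
      # report by default and why it overstates one game's actual need.
      mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
      print(f"allocated {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")

      pynvml.nvmlShutdown()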

  • @Ben256MB
    @Ben256MB 1 year ago +12

    I don't think it's a problem, because games are graphically more realistic and the texture sizes are bigger too.
    Remember the tech demo of Unreal Engine 5 on PS5? I knew then that 1440p might consume all 8GB or more.
    Let's keep it extra real: in most games there is very little difference between ultra and high settings.
    Just turn the settings down to high at 1440p or 4K on an 8GB card. People are too sensitive!!

    • @OffBrandChicken
      @OffBrandChicken 1 year ago +3

      Or just add more VRAM and don't have the issue to begin with.
      You see, I'm not even remotely worried about my card anytime soon, and I'll be able to run my games at higher settings than on Nvidia because of it.
      I can crank up settings without issue.
      I just find it ironic that a supposedly worse-performing GPU is performing better now.

    • @gruiadevil
      @gruiadevil 1 year ago +4

      You can't ask people to crank down settings.
      I paid the Nvidia tax.
      I paid the 3000/4000-series tax.
      I expect to play a game at the highest possible settings at the resolution of its tier.
      You can't charge extra and deliver less just so people buy another GPU in 2 years' time, because you want to sell more GPUs.
      It's the same mentality General Motors had in the '70s-80s when they started making cars that break down in 10-15 years. And everyone followed suit.
      If you buy a BMW made before 2005, it's a tank.
      If you buy any BMW made after, it's going to start breaking piece by piece.

    • @Ben256MB
      @Ben256MB 1 year ago

      @@OffBrandChicken Lol, VRAM can't just be added; it's soldered onto the PCB of the GPU board!!

    • @Ben256MB
      @Ben256MB 1 year ago

      @@gruiadevil Lol, bruh!! Because you bought a low-end or midrange GPU, you don't get the same benefits as someone who paid $1700 for a 4090.
      You have the choice of buying an AMD GPU, which has more VRAM for less money at slightly lower performance.

    • @OffBrandChicken
      @OffBrandChicken 1 year ago +1

      @@Ben256MB Are you serious? I'm saying add more VRAM to begin with. How was that hard to understand?

  • @Skylancer727
    @Skylancer727 1 year ago +11

    I completely disagree that this is a niche issue. I do agree AMD advertising higher VRAM hasn't helped them, but this is an incredibly serious issue. People bought 3070s, 3080s, and 3060s only a couple of years ago and have the power to play newer games, yet it won't work. Remember that some even bought 3070s for over $800 during the mining gold rush. That's an incredibly rough place to be in, especially since most people seem to keep a GPU for over 2 generations, and even the 40 series is low on VRAM. Even at MSRP the 3070 was $500, the same as the next-gen consoles that are running these games, and again, this GPU is objectively faster. This could scare people away from PC gaming just after it recently took off again.
    And yes, it's only AAA games, today. But when games start adding features like DirectStorage (which they most likely will), even 12GB will be in a tough spot. Hell, even Satisfactory announced moving to UE5 for new map-loading systems and Nanite. More games are going to continue to do this. And many people play at least one AAA game. Did you see the player counts for Hogwarts Legacy? They announced over 3 million sales on PC alone in the first week. And games like COD will also likely become similar in the next 2 years after dropping the PS4 and Xbox One, likely with this next COD game.

    • @DenverStarkey
      @DenverStarkey 1 year ago +2

      I just bought a used 3070 in October of '22, and I already feel like Nvidia is standing on my dick. RE2R and RE3R both go over the VRAM limit and crash when ray tracing is on.

  • @seaspeakss
    @seaspeakss 1 year ago +10

    Tbh, I was expecting this. When Nvidia decided to put 8 gigs into the 1070, I was amazed and looked forward to the future. But after the release of the 1080 Ti, Nvidia got really comfortable and hasn't really come out with a great card considering the cost-to-performance ratio (the 90-series cards are powerful, but super expensive, unlike what the 1080 Ti was back in its day). The 3070 STILL having 8 gigs of VRAM is what holds it back, and the 3080 only having 10 is also a major dealbreaker for me.

    • @Mr.Stalin116
      @Mr.Stalin116 9 months ago

      Tbh, I feel like 10GB is enough for the 3080 given its performance. I recently got a 4070 Ti 12GB, and I'm playing at 4K just fine. It does run out of VRAM when I'm playing 4K ultra RT in some games, like Cyberpunk with the RT Overdrive mode, but those games would run at 20-30 fps anyway with more VRAM, so there is not really any point in having more. And it sucks that there aren't many other options if you wanna experience RT; AMD just doesn't run well with RT. After trying RT in Cyberpunk, I was amazed by how much better it looks.

    • @y0h0p38
      @y0h0p38 6 months ago +1

      @@Mr.Stalin116 Right now, 10GB is plenty. What about the future, though? It's a higher-end card; you should be able to use it for years without any issues.

  • @metroplex29
    @metroplex29 1 year ago +8

    That's why I preferred to go for the 6800 XT with 16GB of VRAM.

  • @c523jw7
    @c523jw7 1 year ago +5

    You bring up some good points here. All that matters is your personal experience and your card fitting your purpose. 10GB has been more than enough in the games that I play, and I'm really happy with my experience. Now, I do think Nvidia sucks for not giving their cards enough VRAM; there's no excuse for that. 4-year upgrades seem about right; it's just a shame VRAM usage has really spiked in the last few games. Best to future-proof any card purchase moving forward, though.

  • @ElladanKenet
    @ElladanKenet 1 year ago +6

    I upgraded from a GTX 960 4GB to a 3060 Ti in early 2021, and went from 720p to 1080p. The improvements were staggering, and it's still mostly impressive two years later, but there are a few games, like HL, that punish my system.

    • @jayclarke777
      @jayclarke777 1 year ago +1

      Went from a 1050 Ti to a 3060 Ti. It was like going from VHS to Blu-ray.

    • @Trainboy1EJR
      @Trainboy1EJR 1 year ago

      @@jayclarke777 As someone who had a 2GB GT 640, I gotta say that textures high, shadows low, everything else off looked really good. It never went past 40% GPU usage because of a CPU bottleneck. XD Played Lost Planet 2 totally cranked at 17fps. Had the game memorized from the PS3; it looked AMAZING on PC. Just Cause 2 got 24fps in the city at only like 15% GPU usage. XD
      Upgraded to a 12GB 2060. The most VRAM at a minimum price; high textures are what matters. Can't wait to finish going through all my favorites in 4K 120fps! XD

  • @terkiestorlorikin5958
    @terkiestorlorikin5958 1 year ago +8

    7:39 The people that get affected by VRAM are the ones that like to run games on max settings. Seriously, in 90% of games the differences between High and Ultra are barely noticeable. Alex from Digital Foundry does amazing videos showing optimized graphical settings, and most of the time you have to zoom in 200% to spot the difference between High and Ultra. I understand that VRAM might be an issue in the future, but some people should chill a little bit and ask themselves, "Do I really need to run clouds at Ultra settings? Do I really need to run this specific setting at Ultra?"

  • @stephenpourciau8155
    @stephenpourciau8155 1 year ago +4

    One little flaw: you did not turn on the "memory usage / process" setting. This one in Afterburner/RTSS will show the ACTUAL VRAM usage of the application, and not what is allocated on the whole card.
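
  The per-process readout this comment describes can also be approximated in code. A minimal sketch, again assuming the nvidia-ml-py package; note that NVML may report per-process memory as unavailable on some Windows (WDDM) systems, in which case the Afterburner/RTSS counter is the practical option:

      import pynvml

      pynvml.nvmlInit()
      handle = pynvml.nvmlDeviceGetHandleByIndex(0)

      # Per-process view: closer to what a single application actually
      # occupies, as opposed to the card-wide allocation total.
      for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
          used = proc.usedGpuMemory        # None where the driver hides it
          mib = used / 2**20 if used else 0
          print(f"pid {proc.pid}: {mib:.0f} MiB")

      pynvml.nvmlShutdown()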

  • @VDavid003
    @VDavid003 1 year ago +7

    I'm just glad that the 3060 I bought used has 12GB of VRAM.
    I actually wanted as much VRAM for the money as possible, since last time I went with a 3GB 1060, and in the end that 3GB bottlenecked the card in some cases.

  • @ComradeChyrk
    @ComradeChyrk 1 year ago +6

    I have a 3070, and I was concerned at first about the 8GB of VRAM, but so far I haven't had any issues. I play at 1440p, but I was never really interested in things like ray tracing or playing at ultra settings. As long as I can play 1440p at 100 fps, I'm happy with it.

    • @David-ln8qh
      @David-ln8qh 1 year ago

      I bought my 3070 for $1000 deep into the card-pocalypse, when 3080s were in the $1500-$1600 range. I'm frustrated about the situation, but I still feel like I didn't really have many options and was probably better off pocketing that $500-600 for my next card, which I'm hoping is at least a couple of years out.
      For the record, I also play at 1440p at 80-120 fps and haven't yet run into problems.

    • @ComradeChyrk
      @ComradeChyrk 1 year ago

      @@David-ln8qh I'm glad I waited, because I got my 3070 at sub-$600. I was holding out with a 970 until the prices dropped. I got my 3070 about a year ago.

    • @Trainboy1EJR
      @Trainboy1EJR 1 year ago

      @@ComradeChyrk Still, wouldn't it have made more sense to go with a 16GB AMD card if you weren't going to bother with ray tracing?

    • @ComradeChyrk
      @ComradeChyrk 1 year ago

      @@Trainboy1EJR I wanted the DLSS. Plus it was in my price range. The AMD equivalent (6700 XT) was roughly the same price but didn't have as good performance.

  • @dalebob9364
    @dalebob9364 1 year ago +4

    The main thing is, no one's telling everyone that 90% of the stuff that's eating up GPU memory is settings you don't need to have on, and that wouldn't even be on in the console version of the game you're playing!

  • @ner0718
    @ner0718 Před rokem +5

    Spending a lot of time in Blender (3D modelling and rendering), being stuck on a 6gb card is incredibly frustrating, as I can't render most of my scenes on the GPU because my VRAM fills up. I can't upgrade, as I am still a student and don't have the money to buy a new GPU.
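
    For anyone stuck in the same spot, Cycles does have knobs for trimming VRAM use before giving up on GPU rendering. A minimal sketch using Blender's Python API, assuming Blender 3.x (property names can differ between versions):

    ```python
    # Sketch: reduce Cycles VRAM pressure from Blender's scripting tab.
    import bpy

    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'
    scene.cycles.device = 'GPU'

    # Simplify clamps texture resolution at render time; often the biggest
    # VRAM win, since 4K/8K textures are rarely needed at typical output sizes.
    scene.render.use_simplify = True
    scene.cycles.texture_limit_render = '2048'  # cap textures at 2048 px

    # Smaller tiles trade some speed for lower peak memory.
    scene.cycles.use_auto_tile = True
    scene.cycles.tile_size = 1024

    # Last resort if the scene still doesn't fit:
    # scene.cycles.device = 'CPU'
    ```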

  • @konstantinlozev2272
    @konstantinlozev2272 Před rokem +3

    The real problem with high VRAM requirements, and even with raytracing requirements, is not (!) that they tax your hardware. It's that the visual output is very underwhelming for the uptick in hardware requirements.
    You seem to be younger, but I do remember the Crysis WOW moment, when we were seeing what kind of visual fidelity was possible.
    I fired up Titanfall 2 yesterday and on high it is a stunning game. Fake reflections and all, but you know what? It runs on 6-7 year old mid-range hardware. And looks just gorgeous.

  • @Jerad2142
    @Jerad2142 Před rokem +7

    One bright side about laptop gaming is a lot of these chips have way more VRAM on them, for example my 3080 laptop has 16GB of VRAM.

    • @dededede9257
      @dededede9257 Před rokem +4

      So for the first time in history, laptop gaming will age better than desktop

    • @spaghettiupseti9990
      @spaghettiupseti9990 Před rokem +2

      @@dededede9257 Probably not; 3080 mobile GPUs don't perform like 3080 desktop cards, not even close.
      A 3080 mobile is about 40-50% slower than a 3080 desktop.

    • @whohan779
      @whohan779 Před rokem +1

      Correct, @@spaghettiupseti9990, their naming is hugely misleading. Even a 3080 Ti mobile may be trounced by a 3060 Ti desktop, depending on clocks (it's realistic).
      This is mostly because the RTX 3080 mobile is almost identical to the 3080 Ti mobile, so they need the same memory bandwidth. While I'm sure Nvidia could explain away 8 GB for all 3080 mobiles (as they do for some), this wouldn't fly for the Ti models, hence they always have 16 GB on mobile.
      The mobile 3070 and up are (according to the somewhat unreliable UserBenchmark) just 20% apart vs. 38% on desktop, so the only reason to pay up for an 80(👔) SKU (apart from the higher power limit) is the additional VRAM.

    • @UNKNOWN-li5qp
      @UNKNOWN-li5qp Před rokem

      But a laptop with a 3080 will be like 3000 dollars, and at that point just buy a 3090 or 4090 lol

    • @Jerad2142
      @Jerad2142 Před rokem

      @@UNKNOWN-li5qp Don't even know if you can get "new" ones with a 3080 anymore; one with a 3080 Ti and a 12800HX is about $2,149.99 if you went Omen, though. My 4090 laptop was about $3,300. But yeah, definitely paying a premium for portability.

  • @ChusmaChusme
    @ChusmaChusme Před rokem +1

    9:00 I'm a pretty heavy DaVinci user and I usually cap out my 16gb card, with 4gb getting paged to RAM. This is usually with something like 6K BRAW footage with frame interpolation, noise reduction, and a couple of other nodes applied. When it came to choosing between the RX 6800XT and the RTX 3070 at the time (this was during the mining boom, so prices didn't make sense), the 16gb made sense for my purpose of use.

  • @karlhungus545
    @karlhungus545 Před rokem +2

    Exactly, it's not a problem for gaming at the resolution these cards were intended for... 1440p. Big thumbs up to this guy for having some common sense! All of these 'sky-is-falling' not-enough-VRAM videos are using crap, unoptimized, poorly coded console ports as 'evidence'. First, NOBODY (less than 1% of gamers) has a 4K monitor. Of those, most have a 4090, which is the only truly capable 4K card on the market. Second, if you play nothing but crap, boring, run-around-with-a-wand/gun/stick console games... uh, GET A CONSOLE!! Why would you spend so much cash to try and play console games on a PC at 4K?! 🙄🤣 Also, learn the difference between 'allocated' and 'used' VRAM. Afterburner is NOT accurate. My sons have a 3060ti and 3070 respectively at 1440p and have experienced ZERO issues playing at Ultra settings in actual PC games (Total War: Warhammer 3, Sons of the Forest, Squad, etc). DCS World is far and away the most demanding PC game you will ever play, and I didn't have any issues with it when I had the 3070 (gave it to my son), other than having to turn AA down because it's such a system hog. If you want so much useless VRAM, buy AMD! (you won't) Nvidia will do what they want and couldn't care less what YT thinks, because you'll keep buying no matter what...

  • @Lev-The-King
    @Lev-The-King Před rokem +18

    Hope AMD takes the opportunity to clown Nvidia for this... They probably won't.

    • @nombredeusuarioinnecesaria3688
      @nombredeusuarioinnecesaria3688 Před rokem +8

      They did it at the time of the gtx 970 and its 3.5GB of Vram.

    • @lotto8466
      @lotto8466 Před rokem

      @@nombredeusuarioinnecesaria3688 My 6750xt has 12 gb and plays Hogwarts Legacy at ultra perfectly

    • @Jacob_Overby
      @Jacob_Overby Před rokem +1

      16gb on 6900xt, yum yum, newer 7xxx pushing 24

  • @j.rohmann3199
    @j.rohmann3199 Před rokem +4

    I actually never had problems so far with my 3060 ti... it does amazing for me at 1080p and decent on 1440p. I will still be using it in like 4 years (if I live that long)

    • @j.rohmann3199
      @j.rohmann3199 Před rokem +1

      @@VulcanM61 damn, I was going to do the same thing!
      From a 5600x to a 5800X3D... but maybe I will just go for a 5900x instead. I haven't decided yet.

    • @j.rohmann3199
      @j.rohmann3199 Před rokem +1

      @@VulcanM61 Epic!
      Yeah the X3D versions are crazy good. And pretty future proof!

  • @VisibleVeil
    @VisibleVeil Před rokem +10

    VRAM is not a niche issue, because those AAA titles will be on sale for the rest of us in 1-2 years' time. When that happens, how will we play with the limited VRAM on these cards?

    • @gruiadevil
      @gruiadevil Před rokem +1

      You won't. Lol. You'll buy a new, better, much more expensive card and thank big Daddy nVidia for giving you another shite product.

  • @DeadlyKiller54
    @DeadlyKiller54 Před rokem +7

    Just seeing this makes me super glad I got my 6700XT for $360, with the 12 GB of VRAM it has. Yeah, not an Nvidia card, but she still performs well and streams decently.

    • @Trainboy1EJR
      @Trainboy1EJR Před rokem

      Seeing this makes me even happier to have a $240 12gb RTX2060. XD Although it is almost certainly going to be my last Nvidia card, Intel is looking super good with ARC right now. Hopefully they teach Nvidia the lesson AMD hasn’t been able to. And with EVGA out of the scene, I completely understand not wanting to touch the 40xx series! Honestly I’m surprised more board partners didn’t “nope” out of this generation. XD

    • @GrainMuncher
      @GrainMuncher Před měsícem

      ​​@@Trainboy1EJR VRAM doesn't matter if the card is too weak to even use it. There's a reason the 3060 Ti 8gb destroys the 3060 12gb

    • @Trainboy1EJR
      @Trainboy1EJR Před měsícem

      @@GrainMuncher Destroyed? HA! 65fps vs 71fps is just the 192bit bus vs 256bit bus. I've learned to go with the most-VRAM card and have never been disappointed! 1gb GT220, 2gb GT640, 4gb 1050ti (laptop), 12gb RTX 2060. Let me repeat that, NEVER DISAPPOINTED!!! I will never sacrifice textures to play a game, because textures have zero impact on performance if you have the VRAM for them.

  • @MrGalax00
    @MrGalax00 Před rokem +3

    I wanted a 3070 or 3060 Ti for my HTPC but ended up getting a 6700XT because it was cheaper and had more VRAM. On my main PC I'm using a 3090 FE, so I don't have to worry about VRAM usage.

  • @lukasbuhler1359
    @lukasbuhler1359 Před rokem +7

    Planned obsolescence go crazy

  • @Electric_Doodie
    @Electric_Doodie Před rokem +2

    Unpopular opinion: VRAM isn't the issue, the consumer is.
    We can all talk about how bad Nvidia's pricing is, and it is bad. But so is AMD's with the 7000 series currently.
    Back in the day, far more people thought of Nvidia's different 10xx/20xx cards in "tiers".
    And while the 3070 Ti was pretty powerful, so was the 3060 Ti and so on, but by that measure they are all low-to-mid-tier cards. We are now ~2 years past their release; playing a AAA game at ULTRA settings at 1440p/4K isn't their purpose anymore, even if some influencers still test them that way.
    Does the extra VRAM of AMD's 5000/6000 series do them any favors, though? I mean, sure, you can probably set them to 4K/Ultra unlike Nvidia and not run into the VRAM bottleneck in some games, but at what frame rates?
    The 6750 XT, even with more VRAM than a 3070 Ti, performs worse in HUB's Hogwarts Legacy testing (average and low FPS), and the 6800 XT performs worse than a 3080, their rough "counterparts" on the green side.
    There are obviously other interesting things to note: power consumption, performance in things besides gaming (DaVinci, Blender, etc.), drivers.
    12GB of VRAM on a 4070 Ti doesn't seem like plenty, but it's a 1440p card rather than the 4K card so many people hype it as, which they completely forget.
    If you don't care about anything besides gaming or about special features (DLSS, Reflex, ray tracing, etc.), nor about power consumption (which means more $$$ spent over time), AMD might be the right choice for you. Especially if you can get the cards cheaper (like here in Germany, where a 7800XT is ~100€ cheaper than a 4070 Ti).
    Always vote with your wallet, not for Red or Green because someone said so / influenced you.
    People just blindly follow their favorite influencer's choice and don't think much anymore, it seems.

  • @StraightcheD
    @StraightcheD Před rokem +4

    10:33 It obviously depends on what you want to play, so I think you're technically right. I think people worry because nobody likes to end up in a niche by accident and be told that the game you want to play next just happens to be one of the dozen titles to avoid on your $600 card.

    • @dashkataey1740
      @dashkataey1740 Před rokem +1

      This. When you spend that much on a card, you kind of expect it to last you a few years and be able to handle new titles for a while.

    • @r3tr0c0e3
      @r3tr0c0e3 Před rokem

      @@dashkataey1740 Problem is, it's not 2016 anymore and big corps don't care, if they ever did; they will just double down at this point.
      Consoles might be the best option for AAA-title players. It will run like a turd, but at least you didn't pay $1000 instead of $500 for a fake-4K 30fps experience. All this DLSS and FSR is fake resolution and fake frames at this point.
      The rest can be played on a potato anyway.

  • @ItsFreakinJesus
    @ItsFreakinJesus Před rokem +4

    Adjust your settings and it's a manageable issue even with AAA games. Shadows and lighting have massive VRAM hits with little to no visual difference at the higher settings, for example.

  • @simon6658
    @simon6658 Před rokem +7

    It's always good to force GPU makers to add more VRAM.

  • @laszlomiko9085
    @laszlomiko9085 Před rokem +2

    I'm running Hogwarts Legacy on an RX 6600XT at 1080p. No upscaling, high preset with RT off; VRAM usage was around 7.x GB. Changed the textures to medium, and now it's usually under 7 GB. At 1080p you can't notice it anyway; the game was made with 4K in mind. That's where you need textures at ultra (and also where you need a high-end GPU); high is probably fine at 1440p.

    • @TheTeremaster
      @TheTeremaster Před rokem +1

      TBH I feel like if you're shelling out for a high-refresh 4K monitor, you can afford an XTX or a 90

  • @WebbTech1
    @WebbTech1 Před rokem

    I think we're "very much present" here in 2023 on the vram issue...which I have commented upon in various tech channels (Vex, glad I discovered you, I dig your content!). This issue is not going away. To keep my comment short (which I don't do very often...lol) I have a 3070 Asus Tuf Gaming OC, and looking very seriously at upgrading to the 7900xt (20GB vram) which is around the same price as the 4070TI (12GB vram), which I initially was interested in. I want to play ANY game at ANY resolution without issues!

  • @StuffIThink
    @StuffIThink Před rokem +5

    Just got a 6650xt. Don't really care if I can run games on ultra in the future. As long as it holds out a few years playing on medium or higher I'll be happy.

    • @gruiadevil
      @gruiadevil Před rokem +4

      Yes. But you bought it cheap.
      Look at how much a 3070, 3070Ti, 3080, 3080Ti cost. Those are the ones discussed here.
      Not the cheap products. When you buy cheap, you say to yourself "If I can play new games at a combo of Medium/High Settings, so I can enjoy the game, I'm satisfied. If it lasts me for the next 2-3 years, I'm satisfied."

    • @StuffIThink
      @StuffIThink Před rokem +6

      @@gruiadevil he asked people with 8 gb cards what they thought. Just answering his question.

    • @vanquishhgg
      @vanquishhgg Před rokem

      Haven't had any issues with my 6650xt Hellhound. It will last me another year until I do a full rebuild.

  • @Hakeraiden
    @Hakeraiden Před rokem +5

    At this point I'm scared to buy the 4070, which will be released soon. Not sure if I should wait for reviews of AMD's 7800xt or 7700xt. I recently completely upgraded my PC after more than 9 years; I finally want to play games and not rely on streaming (which is still awesome as a stopgap). For now I will play at 1080p, but I'm considering upgrading to 1440p in a year or less.

  • @voteDC
    @voteDC Před rokem +2

    The VRAM issue is only for those who absolutely must be running at the top settings. I have a secondary PC I use just for media. It's an i7-4770K, GTX 970, and 8GB of DDR3 (in single channel), so not exactly a modern system. It runs Hogwarts Legacy at 1080p, all low settings (draw distance on high) with Quality FSR, and rarely strays from 60FPS. Sure, it doesn't look as good as it does on my main gaming PC, but it still looks and runs great.

  • @hyena-chase2176
    @hyena-chase2176 Před rokem +7

    I find 12gb of VRAM about right, and would not go any lower if buying a new GPU in 2023. I play a few games like Rust that use 10+ gb at 1440p, so the more VRAM the better imo

    • @Pimp_Shrimp
      @Pimp_Shrimp Před rokem

      Jesus, Rust got chunky. I used to play it on a 970 (albeit at 1080p) just fine many years ago.

  • @capnmoby9295
    @capnmoby9295 Před rokem +4

    It's probably the devs becoming more and more complacent and the games becoming more and more complicated

  • @Sybertek
    @Sybertek Před rokem +6

    No regrets with the 6800XT.

  • @Aleksander50853
    @Aleksander50853 Před rokem

    @vex Listening to HUB and other channels, they have pointed out multiple times that the VRAM number you are looking at does not give the correct answer for VRAM usage; it is in fact only VRAM allocation, which is usually a bit higher than the actual usage.

  • @MrWasian
    @MrWasian Před 10 měsíci +1

    As someone that has been building PCs for two decades: it's insane to me that the vocal minority has caused the majority to think that the VRAM issue is major. It isn't a factor for most gamers out there; it isn't going to impact them AT ALL. Most people aren't trying to game with everything maxed and at 4K.
    We've been so spoiled with consistent releases that people think you absolutely have to have the most of everything. Someone who purchased a 3080 or even a 3060Ti can still get 5 years out of their card easily, especially if they OC for up to 20% more, all while still having settings at high (ultra is negligible and should be used just for screenshots).
    I've built PCs for many customers and it's honestly tiring having to quell these unfounded claims, as I'm able to save them literally HUNDREDS on their build so they can buy a better monitor or something else, instead of them thinking they absolutely have to buy an 80Ti/90 variant in order to have "future proofing". Yet most of them don't care about gaming at 4K, and A LOT of people can't even tell the difference between mixed medium/high vs. high settings in most games. Just wild the amount of hogwash there is around the PC building world. Great video though! It's channels like this, Gamers Nexus, and HU that keep consumers properly informed and educated!

  • @OffBrandChicken
    @OffBrandChicken Před rokem +4

    You know, the only people that tend to even remotely justify 8gb nowadays, even on lower-end graphics cards, tend to be Nvidia users.
    You notice that? I think some people are coping, hoping they didn't make a bad decision going with Nvidia this time.

    • @gruiadevil
      @gruiadevil Před rokem

      The amount of Copium is high in these comments :))
      I noticed that too.
      And they're not just nVidia users. They're nVidia users who bought a 3070-3080 card during the boom. Ofc they are going to lower settings. They are still paying the loan they took to purchase that card :)))

    • @OffBrandChicken
      @OffBrandChicken Před rokem +1

      ​@@gruiadevil It's crazy how predictable their responses are. Like "Most top Steam games don't even require it."
      As if the reason those games are still the most played couldn't be that the players' graphics cards can't handle more.
      Most "gamers" are gaming on laptops/prebuilts with lower-end graphics.
      People that are building/upgrading are doing so with the intent of playing modern games, because you wouldn't waste that kind of money otherwise.

    • @David-ln8qh
      @David-ln8qh Před rokem

      Don't you need 8gb to say 8gb works for you?

  • @afgncap
    @afgncap Před rokem +3

    AMD's approach worked for me in the past, when I could not afford a top-shelf GPU and upgraded once every 6 years. Their cards have aged fairly well until now. However, I agree that having a ridiculous amount of VRAM at the moment of buying doesn't really help you. I now have a 7900 XTX and I doubt I will ever be able to use 24 GB of VRAM before I upgrade.

    • @scythelord
      @scythelord Před rokem

      I've already used the 24 gigs of VRAM my 3090 has. It isn't difficult to do.

    • @afgncap
      @afgncap Před rokem

      @@scythelord in a gaming scenario, unlikely. Workload sure.

  • @nyrahl593
    @nyrahl593 Před rokem +2

    The EVGA 3090s had their VRAM in clamshell mode, and it's frustrating, because those cards clearly did not have issues running that way. So I really wonder how much more it would cost vendors to double the VRAM on current chips, since clearly the x070 series must (JEDEC standards and all) support up to 16GB, and the x080s 20GB.

  • @DeusVault
    @DeusVault Před rokem +2

    Yes, it is difficult to optimize massive games to use limited vram, especially in a limited time frame.
    Yes, for 1440p and 4k and Ray Tracing, more vram is going to help you in the long run.
    BUT, if you are not an obsessed setting slider and can deal with your games being on Medium or High instead of Ultra settings (which is barely noticeable), then 8GB vram is FINE.
    People running out and buying a 16GB vram card for 1080p drank the panic koolaid. There is zero reason you need that much vram right now unless you are going to be doing 4k, tracing rays, or doing some very demanding workload.
    I'd recommend 12gb just to not deal with issues on AAA games at launch or if you want to do machine learning, ray tracing, or max out all the settings. (except don't get a 3060, pretend it does not exist. 6700XT is just better value.)

  • @admiralcarrot756
    @admiralcarrot756 Před rokem +9

    Meanwhile, the AMD 6000 series offers you VRAM based on card tier: 6500 4GB, 6600 8GB, 6700 12GB, 6800 16GB, and lastly the 6900 16GB too.
    The Nvidia 3000 series be like... 3050 8GB, 3060 8GB, 3070 8GB, 3080 10GB, 3090 24GB. See the problem there?

    • @user78405
      @user78405 Před rokem +2

      Having 8gb can feel like having 12gb in games, and 10gb can feel like 16gb; the difference comes down to methodology, i.e. conservative memory usage that doesn't tax your system, and to which developers are careful versus lazy, taking more VRAM to cover up a game's flaws. Forspoken I find embarrassing for a "quality" AAA title, while the Doom brand is still carefully respected and runs on all cards today, even 4gb GPUs, thanks to the genius work Carmack put into id Tech. I wish every developer were like him and had the brains to make games that way. With Forspoken, I can tell a rude, super lazy developer can bring an entire team down with the bad energy they spread; it speaks of employees who don't like their job.

    • @VDavid003
      @VDavid003 Před rokem +1

      Actually, the regular 3060 is 12gb which is even weirder.

    • @silvershines
      @silvershines Před rokem

      Overall the line-up isn't too weird once you realize there was originally meant to be an RTX 3080 20 GB. But then the crypto boom happened and Nvidia decided to chase sales volume for that sweet crypto cash. Better to produce more cards than a good product.
      The past few crypto booms (which were isolated to AMD cards) also showed that regardless of what you do, you will have to deal with a bunch of cheap second-hand used cards cannibalizing your new card sales. So whatever happens, your company is going to be the baddie anyway, so you might as well raise the price and condition people to accept higher prices.

  • @16xthedetail76
    @16xthedetail76 Před rokem +5

    My GTX 980ti will hopefully keep going for another 2 years...

  • @andreigroza12
    @andreigroza12 Před rokem

    I also wanted to get a 4070ti a bit after its release, but thinking long term, it has just 12gb of vram, and that might be limiting for new games if I want all the fancy stuff in a year or two, so I decided to go for the RX 7900xtx despite spending a bit more.

  • @dazdaz2050
    @dazdaz2050 Před rokem +1

    @Vex Nice video; I was hoping for this, especially as I have the 3080 10GB.
    OK, sorry if this has already been mentioned below, but I don't have the patience to read all the comments to check, and I have a lot to try to get across, so this post might be a bit disjointed lol. The lack of VRAM is only half the story, and people with powerful cards like the 3080 and 3070 Ti and so on should consider these facts well before upgrading:
    1. memory bus/controller size on the GPU, and 2. PC system RAM speed.
    When you run out of VRAM, all that happens is that the extra data needed for game assets spills over into system RAM, THAT'S IT. Yes, system RAM is traditionally a lot slower than VRAM, but most cards with a large memory bus can swap data in and out of VRAM so fast it somewhat compensates for the lack of VRAM capacity, and if you tune your PC's system RAM that will also help dramatically, and it's free to try.
    The 3080 has two things going for it: GDDR6X and a bus width of 320 bits. I personally overclocked my DDR4 dual-rank, dual-channel system RAM from 3200 CL16 to 3766 CL16 (the maximum my CPU's IMC could handle with four sticks), which I recommend everyone try well before upgrading something as expensive as a graphics card.
    If I was on an Intel system my RAM speed could be in the 4000+ range. Faster RAM would set you back maybe 100 once you factor in the sale of your old kit, compared to a new high-end GPU with 20gb+ of VRAM, which could be an extra 600+.
    Finally, cap your FPS with RivaTuner to 60, or at least 10fps under what you see your card hit consistently, and you'll most probably find you're still good to go.
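
    A rough back-of-envelope of the bandwidth gap being described here, using published figures for the parts named above (nominal peaks, not measurements):

    ```python
    # Sketch: why assets that spill out of VRAM are slow to reach the GPU.
    def bus_gbps(bus_bits: int, data_rate_gtps: float) -> float:
        """Peak bandwidth in GB/s for a memory bus."""
        return bus_bits / 8 * data_rate_gtps

    vram = bus_gbps(320, 19.0)   # RTX 3080: 320-bit GDDR6X @ 19 Gbps -> 760 GB/s
    sysram = 2 * 8 * 3.766       # dual-channel DDR4-3766 -> ~60 GB/s
    pcie4_x16 = 31.5             # PCIe 4.0 x16, one direction, nominal

    print(f"VRAM:         {vram:6.1f} GB/s")
    print(f"System RAM:   {sysram:6.1f} GB/s")
    print(f"PCIe 4.0 x16: {pcie4_x16:6.1f} GB/s")
    # Spilled assets are limited by the slower of system RAM and the PCIe
    # link (roughly 1/24th of VRAM bandwidth here), which is why faster
    # system RAM helps but can't fully hide running out of VRAM.
    ```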

  • @aeneasoftroy9706
    @aeneasoftroy9706 Před rokem +3

    Next gen is here, and that's it. Moving forward you're just going to need high-end hardware to play the latest games on PC. However, that doesn't mean you can't PC game; all it means is you need to have proper expectations for your hardware.

  • @user78405
    @user78405 Před rokem +3

    A lot of people said that adding an extra 16 GB of system RAM to their 3070 system fixed the majority of frame stuttering in THE LAST OF US and the RE4 remake, and I did notice the game using 24gb of system RAM... what a shocker. Now I know why the 3070 dips to 20fps when 16gb of RAM bottlenecks the data to process; with 32gb it stops, there's no more shader loading during gameplay, and my minimum FPS went up to 41.

    • @OffBrandChicken
      @OffBrandChicken Před rokem +3

      When your computer can't fit more in VRAM, it uses system RAM. It's still a VRAM issue at the end of the day.

    • @Kage0No0Tenshi
      @Kage0No0Tenshi Před rokem

      My system can get up to 16gb of "VRAM" usage, but only 8gb comes from my RTX 3070. Games crash or drop to low FPS when VRAM is not enough, and in your case your VRAM was not enough, so it spilled into RAM xD

  • @Tzalmavet
    @Tzalmavet Před rokem +2

    Definitely experiencing this issue with my RTX 3070Ti. The GPU itself runs fantastic, but it's handicapped by its lack of VRAM while playing games at 1440p. So I am in this dilemma: do I sell my EVGA RTX 3070Ti FTW3 and put the funds toward a Radeon 7900XT, or do I hold out and hope Nvidia will actually offer sufficient VRAM on next-gen cards that are marketed for high-resolution gaming?

    • @devashishjn
      @devashishjn Před rokem

      Same here. It sucks to have a 3070 with all the graphical prowess I need for 1440p handicapped by VRAM limitations. I have decided to sell it and buy a 6800xt instead.

    • @blissseeker4719
      @blissseeker4719 Před rokem

      Do your games stutter when they are at a high vram usage?

  • @mechanicalpants
    @mechanicalpants Před rokem +1

    I bought an RX 6700 10GB and I'm only going to be playing at 1080p. Will this be enough for these newer games with their super-detailed textures, and into the future? Because in games like Red Dead 2, The Last of Us, RE4, etc., going down to medium textures really starts to drop off in quality drastically (really muddy textures), and there doesn't seem to be a nice middle point anymore in some of these newer games.

    • @itsJPhere
      @itsJPhere Před rokem

      Like my mom used to say: At 1080p large textures are a waste of vram.

  • @pointlessless2656
    @pointlessless2656 Před rokem +3

    I would argue that Nvidia isn't a full gen ahead of AMD in ray tracing, more like half a gen if anything. The 7900xtx isn't that far behind a 4080; the only card you could argue has a huge ray-tracing advantage is the 4090, at a $600 markup in price.

    • @debeb5148
      @debeb5148 Před rokem

      Meanwhile ps5s just play games.

    • @m.h.4907
      @m.h.4907 Před rokem

      ​@@debeb5148 Yeah with medium settings and limited to 30fps. GG

    • @debeb5148
      @debeb5148 Před rokem

      @@m.h.4907 Your shitty 8 year old pc might, but ps5s go up to 120 fps lmfao

  • @nixboox
    @nixboox Před rokem +2

    The issue isn't that deep. You're talking about needing... let me rephrase that... "needing"... to upgrade your graphics card when you have one from the LAST generation because it doesn't "quite" run all the latest games at their maximum settings. But to most of the real world this idea is insane. I have a pair of Nvidia GeForce 980s running in SLI that I have used since new. They have run every. single. game. I've wanted to play for the last ten years. It is only in the last year that I've found games that I am incapable of playing, because the 4GB of RAM on the cards is too little. No one who has a 30-series card is in need of upgrading for any valid reason. Those cards will run the latest games for the next 8 years with no problems. At this point, I would consider upgrading to a 40-series card because my 980s are, what, five generations old? The problem you ALL have is that chasing the next newest thing will always leave you unfulfilled. Learn to be happy with what you have and be thankful you don't have less. That's a general rule of thumb for living your best life.

    • @3rd.world.eliteAJ
      @3rd.world.eliteAJ Před rokem

      Amazingly put! These people want all the latest and greatest RTX features with 20GB of VRAM for the lovely price of $200... while simultaneously ignoring the fact that developers are just milking consumers with terrible ports and remakes year after year.
      These same people say that developers are now targeting console VRAM budgets. Yet even new console games are running into performance issues: Redfall is running at 30fps on PS5 & XSX without a 60fps performance mode available at launch. LOL... Somehow this makes 8-10GB obsolete? No, the developers are making PC gaming obsolete... Requiring 400W+ GPUs just to run games at 1440p 60fps is absolutely hilarious.

  • @justinworley6538
    @justinworley6538 Před rokem

    Running a 1080 8gb and not seeing any issues, but I've tapped out of the AAA market for 2023, possibly 2024, as releases have been buggy messes. Working through my backlog and adding indies and AA games, so far with no issues, and it's probably the most fun I've had gaming in a long time. Dredge was a fun time, and I'm working through the Legend of Heroes series currently. Happy gaming.

  • @jorge86rodriguez
    @jorge86rodriguez Před rokem +2

    Thank you, and no, you are not being controversial, you are being honest. The VRAM issue only affects "AAA" (whatever that means) games on high settings. There is a huge catalogue of games that do not demand much VRAM, and in the ones that do, turning off or lowering some of those graphically intensive settings alleviates the issue.
    That does not mean it is fine, and it sucks that NVIDIA puts a limitation on a customer that spent so much money on a premium product; however, it is not the general PC gaming issue a lot of people have made it look like.

  • @axxessdenied
    @axxessdenied Před rokem +10

    Seeing how things are unfolding makes me pretty happy that I picked up a 3090. I've managed to hit 23+gb usage in CP77 with a bunch of texture mods.

    • @scythelord
      @scythelord Před rokem +5

      Yep, same here, but I knew this was coming. You can't stagnate at VRAM levels that were available nearly a decade ago; the Radeon R9 290X had 8-gig variants back in 2014. 8 gigs was good for 2016. Today it's practically minimum tier. Double or triple that is just more sensible.

    • @r3tr0c0e3
      @r3tr0c0e3 Před rokem

      @@scythelord Depends on what you play; 8gb is still enough for games from 2017/18, and most people play older games, like CS:GO, MMOs, etc.,
      so only a minority play unoptimized garbage triple-A titles

  • @Verpal
    @Verpal Před rokem +9

    Considering there is only one single GPU that was released with 10GB of VRAM, if the devs have already decided "SCREW 8GB, let's put in more textures!", I don't see how or why they would stop themselves at 10GB instead of the much more popular 12GB threshold.

    • @scythelord
      @scythelord Před rokem +6

      They literally didn't even consider PC graphics cards when making the game. It's made to use what the Xbox Series X and Playstation 5 can use, which is a lot more than 8. This isn't a result of them upscaling anything for the PC release or intentionally screwing anyone. They're just straight porting the same game over.

  • @hyxlo_
    @hyxlo_ Před 11 měsíci +2

    Devs are not optimizing their games and we are blaming GPU manufacturers 🤦‍♂️

  • @7lllll
    @7lllll Před rokem

    I use an old laptop, and I notice specific problems: low screen resolution, low CPU core count, and slow storage. Those are the problems; other things, such as the amount of RAM and GPU performance, are fine. This experience, and this VRAM fiasco, tell me that the specific weak point of the next laptop I buy will likely be low VRAM capacity.

  • @tylerstokes6722
    @tylerstokes6722 Před rokem +3

    I don't see VRAM being a huge problem, but I do feel all new GPUs this generation should have been 12gb at least, or even 10gb for entry level

  • @brkbtjunkie
    @brkbtjunkie Před rokem +4

    Something to be aware of is cached memory vs. actual memory use. Many games load up the VRAM to the hilt, but only a portion of it is actually being used. 8GB was fine for me at 1440p/165Hz and 4K/60Hz on my 2070, and 10GB is fine on my 3080. I have zero issues with VRAM, and the hitching I do get sometimes is not a VRAM issue; it's a shader compilation issue or an engine frame-time issue.

  • @trulycursed-_-2356
    @trulycursed-_-2356 Před rokem +1

    I plan on building a budget PC and was going to buy an RX 6600 with 8gb of VRAM, but after seeing all these videos I am thinking of going for the 3060 with 12gb. What do you recommend I do, since the 3060 is overpriced?

  • @aerions
    @aerions Před rokem

    I have actually run 8gb of vram since 2015, when I got my first serious gaming rig with an r9 390. Avoided the gtx 970 because of that whole 3.5gb/4gb vram stuff back then. I now have a 3070, coming from a 1070 and 1080, so I have pretty much only ever known 8gb of vram. I recently switched to a 1440p monitor and oh man, it feels punishing to have to tweak settings to avoid vram limitations at 8gb in 2023. Really noticed this in titles such as Hogwarts Legacy and rdr2.
    I'm also an amateur astrophotographer when time allows, and I use a few editing programs, such as Adobe Photoshop and one non-astronomy people haven't heard of, which is PixInsight. I'd love to get an AMD card for 16gb of vram, but it's so unfortunate that PixInsight has Nvidia CUDA acceleration features baked into it, with even more GPU-acceleration integration coming in the future. I'd be missing all of that with an AMD card; for example, there's an AI suite that makes 10-minute, multi-step workflow processes complete with just a click of a button. Running on my 3070, star removal takes 30 seconds, but on a 5800x CPU, easily over 4 minutes. I have also noticed that PixInsight takes up a ridiculous amount of vram pretty quickly, but I don't see vram requirements on their website, so it may just be utilizing it ahead of system ram.
    I really want a 4070 ti, but even 12gb of vram is like, oooh, idk. Then the 4080 at 16gb is lacking in performance for the price. Then the 4090 seems like a better deal and boom, I've bought into Nvidia's trap. Not happening hahah

  • @soup-not-edible
    @soup-not-edible Před rokem +3

    When I bought a 16GB RX 6800, I wasn't thinking much about its VRAM, but this is a godsend.
    I prefer having more RAM that's slower than less RAM that's faster.
    (Definitely laughing at Nvidia for the backlash of this "supposed" planned obsolescence)

    • @r3tr0c0e3
      @r3tr0c0e3 Před rokem

      Ironically a 3080 with less RAM will still be 30% faster than your 6800, and no amount of VRAM will change that lol
      30 fps with stuttering is just as bad as 30 fps without; when the time comes to that, these cards will just be e-sports cards lol

  • @AlchemyfromAshes
    @AlchemyfromAshes Před rokem +4

    I'm on the fence on this being a niche problem. I would agree saying it's a bit more of a niche that people buy AAA PC titles on day one or immediately after release vs. console. I think there are a fair number of PC gamers who are like me though. I won't pay the initial AAA price for a game. They are bound to have serious bugs and I'm a bit older and used to older pricing, so the current standard AAA price just seems crazy to me. I will look at them seriously 1-2 years from release though when the price is cut in half in a steam sale etc. If Hogwarts Legacy has an OK sale any time this year for example I'll be picking it up. Definitely at half off, but maybe even if it just drops down to 75% retail. I picked up CyberPunk as soon as it went to 50%.
    I upgrade my video card every 4-6 years which I think is relatively common, so there is a fair chance that VRAM issues are going to impact me within the next year, and before I upgrade again (picked up a 3080 about a year and half ago). So, to me, the problem isn't niche as much as it's just delayed a bit. AMD has shown that VRAM cost doesn't have to be a serious factor in being competitive pricing wise. NVidia is just making up their own rules at this point and testing what the market will bend over and accept it seems. AMD is happy enough to do the same. Just my opinion, but at the obscene prices that NVidia and AMD are charging for cards right now, RAM shouldn't have ever been an issue. It should be plentiful on anything but a bargain bin card. It's like buying a luxury car and the dealer trying to stick the crappiest tires and no accessories on the thing, all while telling you that you should just shut up and be happy they're selling it to you. You don't expect the dealer to skimp on bells and whistles when you're paying well. It seems the video card manufacturers have lost touch with the average consumer and don't realize or care that, for most of us, video cards ARE luxury items. Very few of us can treat them like throw away electronics and just upgrade every time a new model comes out to keep up.
    From experience with friends and acquaintances, I would venture to say there are a fair number of people this already affects, even at the AAA price. For example, to people who are also console gamers, or coming to PC gaming from consoles, the AAA price is just the normal price of a game. It's not expensive. Most people probably aren't buying a AAA title a month, but it seems likely that a large number of gamers would pick up a AAA title or two a year, especially if they have been waiting on the title for a long time. I think this could at least impact momentum for interest in PC gaming in the near future.

  • @kijuanadams5650
    @kijuanadams5650 Před rokem

    This is a good video, and you made a great point: a lot of people don't play the most demanding AAA games, and if you don't, you won't experience as much VRAM consumption.

  • @T0pN0tchSoldi3r
    @T0pN0tchSoldi3r Před rokem +1

    My question is: what card is better for streaming, AMD or Nvidia? And does VRAM size matter?

    • @Zxanonblade
      @Zxanonblade Před 10 měsíci

      Late reply, but go with Intel or Nvidia. They both destroy AMD with their encoding in the most common formats (H264, H265), and between Intel and Nvidia I would personally go Intel, because their encoding (Quick Sync + Arc Hyper Encode) is pretty much equal for way cheaper, and they also have AV1 (while being way cheaper than Nvidia's good options for AV1, the 4000 series).
      VRAM doesn't matter much; you can easily use a basic card like Intel's A380 to do the encoding on its own while gaming off a second GPU (so basically, encoding doesn't need a ton of VRAM).
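
      To illustrate the point that the encode runs on a dedicated block rather than the 3D engine, here is a minimal sketch driving ffmpeg's hardware encoders from Python (assuming ffmpeg is on PATH and was built with the chosen encoder; the file names are placeholders):

      ```python
      # Sketch: hand the encode to the GPU's fixed-function encoder via ffmpeg.
      import subprocess

      ENCODERS = {
          "nvidia": "h264_nvenc",    # NVENC block on GeForce cards
          "intel": "h264_qsv",       # Quick Sync on Intel iGPUs / Arc
          "intel_av1": "av1_qsv",    # AV1 on Arc
      }

      def encode(src: str, dst: str, vendor: str = "intel", bitrate: str = "6M") -> None:
          """Re-encode src -> dst on a hardware encoder; audio is passed through."""
          subprocess.run(
              ["ffmpeg", "-y", "-i", src,
               "-c:v", ENCODERS[vendor], "-b:v", bitrate,
               "-c:a", "copy", dst],
              check=True,
          )

      encode("input.mp4", "out.mp4")
      ```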

  • @TheWretchedWorld
    @TheWretchedWorld Před rokem +7

    I said 10gb wouldn’t be enough for a 3080, considering the next Gen consoles have a pool of 16gb of vram. I bought a 3090 and was told by other gamers that I wasted my money and 10gb is plenty. The writing was on the wall, you can predict the future hardware requirements just by looking at the new consoles.

    • @TheEnd0117
      @TheEnd0117 Před rokem +4

      Well they don’t have dedicated VRAM they have 16gb of unified ram. Consoles share their ram with the system OS and other processes. So games don’t have access to 16gb of VRAM. Consoles are also easier to optimize so it’s not really an issues.

    • @albertoalves1063
      @albertoalves1063 Před rokem +1

      It's not 16GB of VRAM, it's 16GB of shared memory. But even if it was 16GB of VRAM, this should not be a problem for any RTX 3000, since we all saw cards like the GTX 1050 with 2GB of VRAM and the GTX 1060 with 3 and 6GB of VRAM; the consoles at the time were the PS4 and Xbox One, both with 8GB of shared memory, and those cards were able to run every game. The GTX 1050 was weaker but ran almost like the consoles, and the GTX 1060 was a lot better. The amount of VRAM is not the problem, the problem is optimization, because if a game like Red Dead Redemption 2 can run on a GTX 1050 2GB with textures on ultra at 1080p 30fps, these new games should run on any RTX 3000 without problems. And just to be clear, when I say "run without problems" I'm not saying an RTX 3050 4GB should run The Last of Us at ultra, but it should run at least on medium/high.

  • @rubsfern
    @rubsfern Před rokem +3

    So glad that I went with AMD. 16gb is king

    • @Deernailt
      @Deernailt Před rokem

      AMD has 24GB now it’s 7900 XTX

  • @OnikMod
    @OnikMod Před rokem +1

    I have concerns now about my RTX 4070 Ti. Anyway, we will see as time passes. It is already obvious that minimum and recommended settings are not matching the graphics; take Gollum, for example.
    Now it looks like the 7900 XT is a better choice.

  • @DonreparD
    @DonreparD Před rokem +1

    I noticed the Obsidian dev that was on MLID wasn't complaining.
    It's typically the VFX developers who want more VRAM in the hopes that more people will be impressed with their work on the explosions and fire effects. As far as I'm concerned, those developers would be happier working for _Scanline VFX_ .
    If people want to be "immersed" with visual effects, watch a movie or play one on a PlayStation.
    *This is just my opinion. I feel like the "EVERYONE & EVERYTHING NEEDS MORE VRAM" narrative is getting a bit ridiculous.