Is 8GB VRAM Enough For The Latest Games? If Not, Why Not?

  • Uploaded 16. 05. 2024
  • ► Watch the FULL Video: • DF Direct Weekly #162:...
    ► Support us on Patreon! bit.ly/3jEGjvx
    ► Digital Foundry YouTube: / digitalfoundry
    ► Digital Foundry Merch: store.digitalfoundry.net
    ► Digital Foundry at Eurogamer: eurogamer.net/digitalfoundry
    ► Follow on X/Twitter: / digitalfoundry

Comments • 335

  • @boaether
    @boaether Před 22 dny +69

    8 GB should be enough, but it isn't. I got tired of my 3070 Ti as the VRAM amount couldn't keep up with the processing power of the card. It COULD process RT and high settings in many games, but it WOULD often run out of VRAM bringing performance and sometimes stability crashing down.
    Now I have a 16 GB card and I don't have to worry about VRAM at all. I would not in any way get an 8 GB card in 2024, if you plan on playing AAA games.

    • @Philo229
      @Philo229 Před 22 dny +8

      Not if you want to play them at max settings, with RT and at a high resolution. But, you can easily get away with 8GB at 1080p with lower settings.

    • @CaptainScorpio24
      @CaptainScorpio24 Před 22 dny +3

      Same here. I went from a 3-year-old 3070 Founders Edition to a Gigabyte RTX 4070 Ti Super Eagle OC 16GB at launch in Jan 2024.

    • @Pikkoroo
      @Pikkoroo Před 22 dny +5

      And there's that guy saying "4GB is enough if devs optimize." Come on man, devs have been waiting for 16GB to become the norm for VRAM, and for even more in the future.

    • @TheIndulgers
      @TheIndulgers Před 21 dnem +13

      VRAM should never be the limiting factor. The card should age from lack of performance before anything.
      Having to lower texture quality just to fit in the vram budget is unacceptable imo. Texture quality is literally free visual quality if you have enough vram - and imo has the largest impact on visuals. Much more than RT, or shadows, or SSR.

    • @tomthomas3499
      @tomthomas3499 Před 21 dnem +6

      They knew full well what they were doing when they gave the 3070 and 3080 only 8GB and 10GB of VRAM respectively: forcing you to upgrade to the newer gen, which you did. It's probably one of the many reasons Nvidia keeps its monopoly grip on the GPU market.

  • @Pawnband
    @Pawnband Před 21 dnem +35

    John didn't have to flex so hard with that 12mb Voodoo.

  • @returningwhisper
    @returningwhisper Před 21 dnem +28

    Nvidia cheaping out on the RAM solutions for their cards is not great.

    • @DeadPhoenix86DP
      @DeadPhoenix86DP Před 21 dnem +3

      They want you to buy their overpriced GPU's instead.

    • @johnarigi6274
      @johnarigi6274 Před 21 dnem

      If the rumors of the RTX 5080 having 16GB are true, then that's just Nvidia being very stingy. Even accounting for their stinginess I'd expect at least 18GB; I don't see them putting in 20GB.

    • @rustyshackleford4117
      @rustyshackleford4117 Před 16 dny +2

      @@johnarigi6274 16GB is going to be plenty for any game for quite some time. I have never seen my 4090 hit even close to 16GB usage in any game, even maxed at 4K with all RT settings on Ultra. More memory in this scenario is only warranted if you're doing AI workloads. Besides, due to the memory bus size, it's not possible to have a card with, say, 18GB of VRAM; it's either 8, 16, or 24.
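
(The bus-width arithmetic behind that last point, as a rough illustration rather than anything from the video: GDDR6/GDDR6X chips have 32-bit interfaces, so the bus width fixes the chip count, and the per-chip density, commonly 1GB or 2GB, fixes the capacity steps. A 192-bit bus also allows 12GB, which is why some tiers ship with that amount.)

```python
# Minimal sketch, assuming standard 32-bit GDDR6 chips at 1 GB or 2 GB per chip.
def possible_vram_gb(bus_width_bits, densities_gb=(1, 2)):
    chips = bus_width_bits // 32              # one 32-bit chip per 32 bits of bus
    return [chips * d for d in densities_gb]  # capacity options for this bus

for bus in (128, 192, 256, 384):
    print(f"{bus}-bit bus -> {possible_vram_gb(bus)} GB (double that in clamshell mode)")
# 128-bit -> [4, 8], 192-bit -> [6, 12], 256-bit -> [8, 16], 384-bit -> [12, 24]
```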

    • @hannes0000
      @hannes0000 Před 16 dny

      @@DeadPhoenix86DP They also want people to upgrade every generation.

  • @manna6912
    @manna6912 Před 22 dny +160

    Damn, here we go again. The PS3 with 256MB of RAM was able to run Watch Dogs and GTA 5. That was one hell of an optimization job back then.

    • @johnconnorpliskin7184
      @johnconnorpliskin7184 Před 22 dny +15

      Not to mention, low-spec indie games continue to come out and sell really well.

    • @OwtDaftUK
      @OwtDaftUK Před 22 dny +32

      It is shocking how much they were able to do with the PS3's little RAM.

    • @gothpunkboy89
      @gothpunkboy89 Před 22 dny +59

      Run those games is being generous. They could barely maintain a frame rate in the upper 20s.

    • @KrazzeeKane
      @KrazzeeKane Před 22 dny +31

      ​@gothpunkboy89 you tell no lies, my friend. I bought a ps3 to jailbreak and my god, do a lot of the ps3 games run like absolute crap.
      So many titles sub 30fps, or bad frame pacing and stutters and such. I ended up upgrading my pc so I can run the rpcs3 emulator and play ps3 games at a stable fps and a proper resolution lol

    • @manna6912
      @manna6912 Před 22 dny +5

      @@KrazzeeKane Yeah, agreed. But that time had its own golden-age vibe. Back then games used to be smaller and affordable. After The Witcher 3, every game became 100 hours of grinding, and the backlog keeps getting bigger and bigger. It's just kind of overwhelming. Back then games used to chase players; today we are chasing to finish every game. It seems impossible with RPGs.

  • @FMBriggs
    @FMBriggs Před 21 dnem +10

    20 years ago I bought a gaming laptop with 2GB of shared system memory, during a time when 128-256MB was standard. That laptop lasted for well over a decade as a netbook. It's not something I'd recommend though, especially for gaming. Don't buy a 4090 or a 7900XTX for the large memory buffer, because by the time you actually need that much VRAM a mid-range card that uses 1/3 the power will outperform your space heater.

  • @TheIndulgers
    @TheIndulgers Před 21 dnem +67

    VRAM should never be the limiting factor. The card should age from lack of performance before anything.
    Having to lower texture quality just to fit in the vram budget is unacceptable imo. Texture quality is literally free visual quality if you have enough vram - and imo has the largest impact on visuals. Much more than RT, or shadows, or SSR.
    Stop defending a trillion dollar company. It is weird.

    • @OG-Jakey
      @OG-Jakey Před 21 dnem +1

      DF has arguably helped corporations more than they realise. Pushing 2560x1440 has been the biggest contributing factor when you look at the big picture.

    • @timothyjn100
      @timothyjn100 Před 21 dnem +3

      If you compare path tracing to 4K vs 1080p textures, textures actually have FAR less visual impact. Not even close.
      Not sure how you can say textures have the largest impact when objectively that is not true in the slightest.
      I agree with everything else you state. I agree that it's a free improvement for nothing more than having enough VRAM. But let's not get it twisted: it's night and day comparing path-traced lighting to higher-res textures. Take a game like Minecraft, for example. You can mod in crazy realistic high-res textures, but doing so alone is hardly game-changing. Now path-traced Minecraft with the standard 16x16 textures? That's game-changing. I rest my case; that is the simplest comparison I can put forth to express the point.

    • @Hayden2447
      @Hayden2447 Před 21 dnem +4

      @@timothyjn100 Minecraft is an awful example, as the 16x16 textures are literally the art style of the game; making them higher-res completely changes the Minecraft charm. Take a realistic game instead: if the textures are visibly low-res versus a higher setting that makes them look nice and detailed, then better textures are an all-round win. RT can be big, but only when you crank all parts of it or use path tracing, which tanks frame rates, whereas textures are free if you have the VRAM. Higher resolutions do make higher-quality textures matter more, since there are more pixels to represent them and more detail to see, such as at 4K or even 1440p. At 1080p, high-resolution textures are hidden more by the lack of rendered pixels, so you need to get closer to see the difference, and RT gives a relatively bigger boost in fidelity there.

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro Před 21 dnem +1

      @@timothyjn100 That's not what the OP meant at all. Compare low, medium, high and ultra textures in nearly any game and it'll always be the biggest impact on visuals, with no performance impact if you have the VRAM. Most of the time those texture quality settings also don't state what "resolution" they are. "4K" or "HD" textures is just a label applied to texture packs you download as mods or for games like Far Cry 5. Games like GTA 5 have been said to have 4K textures, yet that is meaningless since it doesn't specify what texture quality that is, and it also has worse textures than newer games that are technically the same resolution.

    • @rdc3830
      @rdc3830 Před 16 dny

      That’s not fair,
      Defend what deserves to be defended at all times.
      You're definitely right about the rest.

  • @BatCorkill
    @BatCorkill Před 22 dny +21

    I noticed the Avatar game seemed to be using 10+GB of my 12GB card.

    • @zackmandarino1021
      @zackmandarino1021 Před 21 dnem

      It used 13-15GB on my 20GB card lol, at 1440p ultra. With Unobtanium mode enabled it was 17GB; I was like wtf.

    • @SimonBuchanNz
      @SimonBuchanNz Před 21 dnem +6

      Avatar is a little bit of an odd case, as it will upload as much as it can to reduce how much it needs to swap in and out.
      Since it's an open world game, this means it can fill up pretty much any amount of memory until most of the install is on the card!
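
(A practical aside, not from the thread: numbers like the ones above are *allocated* VRAM, which an engine that opportunistically fills spare memory can push far beyond what it strictly needs. On an Nvidia card you can watch the allocation yourself with the stock nvidia-smi tool; a minimal polling sketch, assuming nvidia-smi is on your PATH:)

```python
import subprocess, time

def vram_used_mib():
    # nvidia-smi reports memory currently allocated on the GPU, not what a game
    # strictly requires to run without stutter.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = (int(x) for x in out.strip().splitlines()[0].split(", "))
    return used, total

while True:
    used, total = vram_used_mib()
    print(f"VRAM allocated: {used} / {total} MiB")
    time.sleep(5)
```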

    • @adlibconstitution1609
      @adlibconstitution1609 Před 19 dny +2

      Rtx 3070ti can run avatar ultra 1080p

    • @ymiround
      @ymiround Před 18 dny

      Funnily enough, this particular game runs pretty great on 8GB cards according to different testers. But Horizon Forbidden West and Ghost of Tsushima, for example, use around 8GB, so on a 4060 or 4060 Ti it's fine except that you can't enable FG, because FG also takes VRAM. I mean, you can enable it, but it starts swapping VRAM into RAM and you'll see the same fps at best.

  • @user-cd3je7lo2z
    @user-cd3je7lo2z Před 21 dnem +5

    Badly optimized games are a plague.

  • @deathshade777
    @deathshade777 Před 22 dny +27

    It's not enough; I run out in most current-gen games with my 4070 laptop. It should be 12GB minimum.

    • @desmond3245
      @desmond3245 Před 21 dnem +8

      Wow 4070 mobile is such a scam. 8GB 128bit Vram with the performance of a 3060 desktop. Even if it's called 4060ti mobile it's still bad. I thought with better power efficiency mobile chips would be much better than last gen. But then I forget it's Nvidia.

    • @filippetrovic845
      @filippetrovic845 Před 16 dny

      @@desmond3245 You have no idea what you are talking about. With FG the 4070 laptop has 120% higher fps than the shitty 3060 in all FG-supported titles, which are exactly the most demanding games. I have a 4060 laptop and couldn't be happier; I've never run out of VRAM. However, if you bought a 1440p laptop with 8GB of VRAM you don't know what you are doing.

    • @rustyshackleford4117
      @rustyshackleford4117 Před 16 dny

      @@desmond3245 I want to upgrade from my aging GTX 1080 laptop, but if you ignore ray tracing and DLSS as selling points, the 4070 mobile is about on par with the old school 1080 mobile in terms of pure raster performance. Back then, the mobile cards like this were full-fat GPUs with the same specs as the desktop parts.
      Now, we are power limited on cards with the highest-end mobile 4090 being functionally more similar to a desktop 4070, not even a 4080. Steam Deck has been plenty fine for 90% of my mobile gaming though, so getting a new laptop is more about having DLSS and an OLED screen...while 4090 on my desktop continues to provide more performance than I'll need for some time.

  • @sapphyrus
    @sapphyrus Před 22 dny +18

    8GB can still make do if you bought it in a previous gen, but it's a bad idea to buy it in the current gen.

  • @FormerHumanX
    @FormerHumanX Před 21 dnem +2

    8GB VRAM definitely isn't enough for what Nvidia is charging.

  • @EastyyBlogspot
    @EastyyBlogspot Před 22 dny +1

    Thing is, say you had a new card with 8GB or an older card with 12 or 16GB... would you pick the higher VRAM, or the newer card with less VRAM that I would assume has better performance? Though why they could not even make them 10GB still puzzles me.

  • @mondodimotori
    @mondodimotori Před 21 dnem +14

    Again with this debate... Some people in the field already warned about this limit way back when the 3070 launched.
    8GB is obsolete in 2024 for resolutions higher than 1080p with max-quality textures.

  • @clinten3131
    @clinten3131 Před 22 dny +7

    I would prioritize a 16GB card if I were making a new rig. But people can do what they want, of course; it's not my system or money. I just don't want the headache of having a potent GPU whose game settings I have to scale down because of VRAM.

    • @KrazzeeKane
      @KrazzeeKane Před 22 dny +3

      Thats what I did when I built my pc last year. I wasn't going with any less than 16gb, as I wanted to be able to play games for the next 5 years minimum before having to turn down settings and upgrade.
      At that time the only card that fit was an rtx 4080, as this was long before the 4080 super or 4070 ti super existed.
      Managed to nab a 4080 for $950 and it has been treating me very well--paired it with an i7 14700k and 32gb ddr5 6400mt cl32 ram, so it will definitely last me the next 5 to 6 years of high settings + high fps gaming at 1440p before I need an upgrade.

  • @iambrd1543
    @iambrd1543 Před 21 dnem +4

    What I don't understand is why people expect 8GB to be able to run games at 1440/4k. That should have never been the expectation. 8GB is still viable at 1080p and will be for some time.

    • @_syedalimurtuza_
      @_syedalimurtuza_ Před 14 dny +2

      The 3070 was marketed for 1440p by Nvidia, but I still agree with you; that said, devs can still optimize games.

  • @f-35x-ii
    @f-35x-ii Před 22 dny +8

    I've got an Asus TUF RTX 3060 Ti OC Gaming 8GB paired with an AMD R7 5800X3D and a curved 144Hz 1440p monitor. I am having no issues with this at all, even in UE5 games; of course you always need to optimize the settings, and I'm also getting high FPS. I'm not switching video cards anytime soon either, but when I do it will definitely be a 16GB one!

    • @Pikkoroo
      @Pikkoroo Před 22 dny

      Play rust, it’s an old game at this point.

    • @Dhrazor
      @Dhrazor Před 22 dny

      UE5 uses the least amount of VRAM out of any advanced engine. Try running Alan Wake 2 with RT at 1440p, or 4K... if you had Frame Gen, that would use even more VRAM.
      At 1440p max + PT + FG the game uses 14GB of VRAM; at 4K it's almost 18GB... even at 1440p + RT the game can use 12-13GB, so the 3060 beats the 3070 Ti. 16GB should be fine for now, but I hope there will be a 24GB 5080 Ti at around $1000 next year; I'd really like to move to 4K.

    • @filippetrovic845
      @filippetrovic845 Před 16 dny +1

      @@Dhrazor Telling a guy to play a certain game he wouldn't normally play, only to prove that 8GB of VRAM is not enough, is exactly what I expect from a poor AMD fanboy. BTW, 8GB cards are meant for 1080p, so you shouldn't expect to max every game at 1440p with 8GB, but you can prove to yourself that even at 1440p you mostly won't be disappointed.

  • @spawnkiller9779
    @spawnkiller9779 Před 21 dnem +2

    With DLSS and FSR, I feel this plays a part in scalability for 8GB cards more than for higher-VRAM cards, like the 2070 and 2080.

    • @rustyshackleford4117
      @rustyshackleford4117 Před 16 dny

      This is a good point, as the lower internal resolution will keep 8Gb of VRAM relevant for much longer than it would be otherwise. That said, if we're down to the point of needing DLSS to upscale to 1080p final resolution we're getting to a bad state, as even DLSS quality with this output resolution has a very noticeable degradation in image quality over native 1080p. FSR2 is far worse, as it's basically useless unless upscaling to 4k from 1440p, and even then it still completely jacks up hair effects and fast motion.
      XESS is a good middle ground though, as it doesn't have the same hair/motion issues nearly as apparent as FSR2 does. At this rate, unless AMD pulls something amazing with the next version of FSR, XESS is set to be the universal replacement for DLSS that doesn't require Nvidia's proprietary AI cores as it's progressing much more quickly than FSR and is a significant upgrade over it at this point.
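
(For context on how much the internal resolution drops, a minimal sketch using the standard per-axis scale factors of DLSS/FSR2-style upscalers; the factors are the commonly published ones, not figures from this thread:)

```python
# Internal render resolution per upscaler mode (per-axis scale factors).
SCALE = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 0.333}

def internal_res(out_w, out_h, mode):
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALE:
    print(mode, internal_res(1920, 1080, mode), internal_res(3840, 2160, mode))
# Quality at a 1080p output renders at roughly 1280x720, which is why image quality
# degrades noticeably there; render targets shrink, but texture memory largely does not.
```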

  • @D.Enniss
    @D.Enniss Před 21 dnem +10

    IMO 12GB VRAM is the sweet spot for up to 1440p, but at 4K no less than 16GB VRAM is the new standard

    • @roguetwice469
      @roguetwice469 Před 19 dny

      I'm definitely cutting it close, but my 3080 ti (12GB) is actually doing just fine at 4K in every game I've tested (LoTF, Tsushima, Starfield, Dragon's Dogma 2, etc) I certainly wouldnt recommend buying a GPU with any less than 16GB for 4K if you were building a PC today, but if you're on a very tight budget and need 4K, it's very doable on a 4070 Ti/3080 Ti

    • @adlibconstitution1609
      @adlibconstitution1609 Před 19 dny

      So you're saying 8GB of VRAM is not enough at 1080p?
      Cards like rtx 3070, Rx 6650xt/6600xt, 3060ti, Rx 6600, Rx 5700xt and rtx 2080super = all 8gb cards

  • @DJBV
    @DJBV Před 21 dnem +4

    12GB of VRAM should be the minimum to match or exceed current consoles' texture quality.

  • @vac59
    @vac59 Před 21 dnem +3

    The new Nvidia 5070 class of cards should have 16GB of memory. But leaks are saying 12 🙄

  • @MACROSS2KYTB
    @MACROSS2KYTB Před 20 dny +1

    Can DirectStorage resolve the VRAM hunger issue?

  • @sengan2475
    @sengan2475 Před 21 dnem +19

    Ghost of Tsushima allocates 8GB at anything above medium textures. This isn't a "future proof" thing; you will have trouble right now in many games unless you're planning to play at 1080p medium with a budget rig, which there's nothing wrong with. But if your expectations are any higher, 8GB isn't enough.

  • @DigitalVirusX2
    @DigitalVirusX2 Před 21 dnem +3

    I regret my 3070 Ti so much: $600, and at least in 2077 it's impossible to maintain 60fps with any sort of ray tracing on, because the VRAM gets eaten up so quickly and performance TANKS after an hour of play. I'll probably buy a 4070 Ti Super down the line when it's at a more reasonable price; hopefully the 50 series makes the prices drop on the 40s.

  • @Hatecrewdethrol
    @Hatecrewdethrol Před 21 dnem +1

    I thought the title just said "RAM" and i did a double take that you guys uploaded it today

  • @virtualpilgrim8645
    @virtualpilgrim8645 Před 22 dny +25

    "640K ought to be enough for everyone." ~Bill Gates

    • @VicharB
      @VicharB Před 21 dnem +1

      Lol ... them times!

  • @GilbertGuilford
    @GilbertGuilford Před 21 dnem

    For people more informed than me. What config (CPU, GPU and amount of RAM) would I need to run modern games at 4k highest possible settings, ray tracing where available and solid 60fps?
    Games I had in mind are stuff like Deus Ex Mankind Divided, GTA5, Horizon Zero Dawn, Metro Exodus, new Tomb Raider games, Control and Witcher 3.
    Is it really 4090 or GTFO, or can something else actually run those games properly?

    • @rustyshackleford4117
      @rustyshackleford4117 Před 16 dny

      For CPU, 60FPS is a very easy target -- an AMD 5800x is more than sufficient, but it's hard to recommend when the 7800x3D is only $350. You could technically get by with a Ryzen 3000-series and 60FPS most titles, but some newer games are getting CPU bound on the 3000 series now. In either case, a cheaper 7000-series would make much more sense now.
      For the GPU, you don't need a 4090 for 60FPS 4K maxed -- the 4090 is for if you want between 120 and 160FPS in all games maxed at 4K. A 4070 could probably get you by for 60FPS in all of the games you mentioned, but maybe minus 2023/24 titles like Alan Wake or Horizon Forbidden West where you may be close to 60 but not locked in demanding areas. So I'd lean towards a 4080 as it simply has a lot more grunt and will let you continue to hit 60FPS at 4k for the games of the next year or 2 at minimum. With the 4080 you'll be able to hit around 100FPS in a lot of games as well, or near 120 in any of the older titles you've mentioned except maybe Witcher 3, as even that game needs frame gen and the 4090 to hit close to 120FPS with the RT update enabled at 4k.

  • @bogstandardash3751
    @bogstandardash3751 Před 21 dnem

    Will games demanding enough to need more than 8/12GB of VRAM also need DLSS and frame gen to run 4K high settings on the same card?
    Pretty as it'll look, the ~1200p balanced mode will effectively be rendering at a lower resolution anyway, which solves the VRAM issue at the same time?
    Happy to be wrong as I'm not sure; obviously if you were buying a card today you'd go AMD with a 6750 XT or above.

  • @adamkallin5160
    @adamkallin5160 Před 21 dnem +1

    I agree with Alex. Devs will use the PS5 hardware as a baseline and may or may not spend time to optimize for 8 GB cards on PC. Even Switch 2 is expected to have 12 GB (total RAM).

  • @nabiltaylor2012
    @nabiltaylor2012 Před 16 dny

    Upgrading from a 1660 Super to a 3060 this week. Should I do the 12GB 3060 or the 8GB 3060 Ti?

  • @melody_lane5081
    @melody_lane5081 Před 20 dny +1

    I wouldn't buy a new card with 8, but would buy used with 8 if it's a good deal. I have 8 now and don't have any problems at 1080p on ultra in everything. I have a 5700 XT. Cheers!

  • @KriegersBeasts
    @KriegersBeasts Před 22 dny +26

    My RTX 3080 has 10GB of VRAM. I've not had any real problems on my 4K screen yet. I don't mind turning settings down for now. When I upgrade next I want at least 24.

    • @minecraftmike5193
      @minecraftmike5193 Před 22 dny +6

      I run into issues but usually I just turn the textures down and it’s fine, hard to really even notice the difference at times.

    • @minecraftmike5193
      @minecraftmike5193 Před 22 dny +3

      Especially at 4k

    • @tonebalone607
      @tonebalone607 Před 22 dny +9

      Same. People be acting all catastrophic and i’ve never had a vram issue at 1440p on my launch 10gb 3080. Certain outlier titles will always exist but it’s not like 30% of all new games don’t work. My homie still has a 1050ti and refuses to upgrade. He still plays every game with the boys every night. Like I get the argument, but like.. life is good. stop believing everything you see on the news and go enjoy life with the abundance of whats available. People just love complaining tho aha

    • @robo_bravado
      @robo_bravado Před 21 dnem

      Cyberpunk will overcommit 10GB at times at 1440p and Horizon Forbidden West will do it a LOT at very high textures necessitating turning them down, to say nothing of 4K.

    • @TheAtomoh
      @TheAtomoh Před 21 dnem +2

      Rx 6800xt would have been a better choice

  • @plume...
    @plume... Před 21 dnem

    I've never had a problem on my 3070TI laptop with 8gb vram (32gb ram) @1440p with high settings on every AAA game I have played (aim for, and do achieve 60fps). I set most things to high (not ultra), no RT (big caveat there, I know), apart from shadows to middle, no bloom, no DoF, middle volumetric, no chromatic aberration, no motion blur (all the usual rubbish that you immediately turn off if you can). BUUUUUUTT, I do believe I will start to get vram warnings very soon with newer games. I have recently ordered a 4090 laptop with 16gb vram and 64gb ram (i9 13980), so hopefully should be safe for a few years now. This 3070TI 8gb has been amazing though.

  • @manostororosso2364
    @manostororosso2364 Před 22 dny +1

    Man, what are you talking about? All games since the PS3 are the same, just with better graphics.

  • @fredsorre6605
    @fredsorre6605 Před 21 dnem

    I had to play Horizon Forbidden West at medium textures because my RTX 3080 runs out of VRAM. Everything else was kept at high or ultra, and I even had DLSS Quality on, yet my 10GB was almost always full, causing stutters I didn't want to deal with.

    • @rustyshackleford4117
      @rustyshackleford4117 Před 16 dny

      Lack of texture optimization and compression is a major reason these newer games eat both VRAM and disk space for breakfast. We should start to see reductions within the next year or two, as Nvidia and AMD are looking to offload texture decompression from the CPU and stream directly from the SSD to the GPU, such as with DirectStorage-type tech. The next phase would be to have AI processing further speed this up and allow much higher compression rates in real time, whereas now that's not possible without introducing stutter/hitching while loading in new textures.

  • @rowsdower84
    @rowsdower84 Před 22 dny +40

    I blame both lazy game devs and Nvidia being stingy with VRAM.

    • @kphuts815
      @kphuts815 Před 22 dny +1

      This might be a bad take but I don't think Devs are that lazy considering they give lower textures settings to allow users with less vram to play smoothly, of course there are some outliers but they aren't the majority of vram demanding games from what I've seen

    • @CheckmateStallioN
      @CheckmateStallioN Před 22 dny

      As if AMD cards are any better. In fact AMD cards still have a long way to catch up to nvidia. Look how terrible AMD video drivers are. They are almost caught up in CPUs but have a long way to go for their GPUs to catch up with nvidia. AMD offering slightly more VRAM while at the sametime downclocking their GPUs and having subpar driver support compared to nvidia GPUs isnt helping the situation either. Pick your poison

    • @mitchjames9350
      @mitchjames9350 Před 21 dnem

      @@CheckmateStallioN There isn't much difference in performance between AMD and Nvidia; the only advantages Nvidia has are DLSS, which is fake frames, and ray tracing, which tanks frames and which no one uses. At least AMD future-proofed their cards with memory. Also, AMD didn't offer slightly more; they offered a lot more memory than Nvidia.

    • @ChiekoGamers
      @ChiekoGamers Před 21 dnem

      Blame the management and the publishers, not the workers 🤦‍♂️

  • @Otzchieem
    @Otzchieem Před 21 dnem +1

    The current-gen consoles honestly should have only 3-4GB for the UI and a straight 16GB minimum for VRAM. I was one of those hoping for 24GB of RAM.

  • @hrayz
    @hrayz Před 18 dny +1

    8GB should run any game at 1080p. The GPU's processing power should be the limit then, not VRAM.

  • @kazumakiryu7559
    @kazumakiryu7559 Před 22 dny +11

    I only really saw vram related issues on my 3070 Mobile in titles ported by Nixxes (Horizon Forbidden West, Ratchet and Clank and The Last of Us). I feel the issue is related with the way they are performing memory management and people are bashing the hardware for no reason. Games like Cyberpunk 2077, RDR2, Forza Horizon 5 and Metro Exodus are proof that you do not need the latest and greatest hardware for graphics to look good.

    • @donofriocarlo46
      @donofriocarlo46 Před 22 dny +5

      Small correction for ya, TLOU1 was NOT ported by Nixxes sadly 😭
      Iron Galaxy Studios ported it and that's one of the many reasons (at least imo) why tlou1 was such a bad port. Hopefully Nixxes will handle the TLOU2 port.

    • @JBrinx18
      @JBrinx18 Před 21 dnem

      But you DO need time... Optimization takes time. RDR2 is a 7 year development game. So is Cyberpunk, though it arguably needed more. If we want games to have more variety, more assets, and more speed on release we have to grow VRAM

  • @scepticskeptic1663
    @scepticskeptic1663 Před 21 dnem

    My laptop has a 3060 6GB and it seems fine; a few games require backing textures down from ultra/very high to simply high, and it's fine.

    • @filippetrovic845
      @filippetrovic845 Před 16 dny

      No!! Amd fanboys think you are not fine! You need to max out textures (which you cant even see on 1080p) or you cant possibly play the game. How dare you accept anything less than max settings. You are beta.

    • @mrman6035
      @mrman6035 Před 15 dny

      Tbh I think the only thing that makes or breaks a game's picture quality for me is good shadows. I already game on a 1650 Super, and shadows are the one setting I like high.

    • @Shatterfury1871
      @Shatterfury1871 Před 14 dny

      Try Hogwarts Legacy and then comment on 6GB of VRAM being enough.

    • @scepticskeptic1663
      @scepticskeptic1663 Před 14 dny

      @@Shatterfury1871 Not interested in Harry Potter, and it doesn't look any better than many of the other games I play.

    • @scepticskeptic1663
      @scepticskeptic1663 Před 14 dny

      @@Shatterfury1871 Also, naming one game doesn't define reality. If 99.99999999999 percent of games are fine, then so what? If you want to be super picky, that's fine by you.

  • @christopherbohling5719
    @christopherbohling5719 Před 21 dnem

    I don't mind PC ports getting more demanding VRAM-wise, but what I do want is that if I need to set textures to medium in a 2024 game to get it to not stutter on my 2070 Super, I would hope those medium textures look at least as good as high-ultra textures on games from a few years ago. The texture quality available for 8GB cards, whatever it's labeled in the menu, shouldn't be visually worse than it is for games from a few years ago.

  • @stonejito
    @stonejito Před 19 dny

    Tbh, it's enough if you know what to expect out of the card. A 3070 or 3070 Ti is a really decent card power-wise, but the 8GB of VRAM holds it back for 1440p gaming. If someone wants that card for 1080p at "high" rather than "very high" with some DLSS, it will still be a decent card. Now, I think the minimum should be at least 12GB, or maybe even 10GB of VRAM. I know most PC gamers want the highest performance at the highest resolution, but really knowing your card's limits, and what games and settings it can realistically handle, is key.

  • @nintendoconvert4045
    @nintendoconvert4045 Před 22 dny +21

    Nope! 12GB is the minimum standard.

    • @Dhrazor
      @Dhrazor Před 22 dny +3

      Unless your game uses UE5 long term even 12GB won't be enough, Frame Gen and RT both use extra VRAM and you will run out of VRAM even at 1440p native if you turn everything to max... when the 1070 with 8GB released, games used 3-4GB VRAM, now that games are using 8-12 probably even 16GB won't be enough 4 years from now

    • @mianlo2624
      @mianlo2624 Před 22 dny +1

      Nvidia: Best we can do is 8GB of GDDR6X & 4GB of GDDR6.

    • @JulliusTheGreat
      @JulliusTheGreat Před 22 dny

      @@Dhrazor This is crazy; PC gaming is on the verge of being impossible for middle-class gamers...

    • @tonin7228
      @tonin7228 Před 22 dny +1

      @@Dhrazor Nvidia releasing the 4060 and the 4060 Ti even though they can't effectively utilize FG (the main selling point of the 4000 series) was super nasty.

    • @Aleksey-vd9oc
      @Aleksey-vd9oc Před 21 dnem

      @@Dhrazor Frame generation uses memory, but the base frame rate matters for the generator, and you'll most likely want to raise it by leaning on the upscaler. In that case the upscaling greatly reduces video memory consumption, enough to run the generator.

  • @mukiex4413
    @mukiex4413 Před 15 dny

    You're basically looking at it on a per-game basis. Most games will be fine. Some games have a ridiculously large texture cache and will "choke" on 8 gigs at max settings. But real talk, new cards at $500+ should really have at least 16GB.

  • @CheckmateStallioN
    @CheckmateStallioN Před 22 dny +1

    All those needing to upgrade to 4090s had better chop off part of your full-tower chassis to fit it in, if your chassis is from the 2010s. That's what I did with mine. I have a full-tower Thermaltake Chaser MK-I blue LED version from 2013 and don't want to get rid of it, as it is still one of the best-looking towers out there, combining an old-school and modern look. Most chassis now are too bland and have no personality; they are literally just simple angled boxes. I took out the metal side panel with the little glass window and swapped it for a full-size acrylic panel. But even at full-tower size it couldn't fit a 4090 because of the metal frame holding all the hard drive bays, so I had to chop off the whole metal frame so the 4090 could fit.

  • @colaboytje
    @colaboytje Před 22 dny +2

    No, not if we expect high-resolution games with good anti-aliasing.

  • @mikelreborn3254
    @mikelreborn3254 Před 16 dny

    "I am currently utilizing a 2023 Lenovo Legion Pro 5i, equipped with an RTX 4070 and 8GB of VRAM. The performance at 1080p resolution is satisfactory, and the visual quality meets my requirements. I acquired this laptop at a competitive price, and it primarily serves as a platform for my digital audio workstation (DAW). Although it's intriguing that VRAM can be allocated from the SSD to the CPU, it's not possible to do the same for the GPU, nor is there an option to expand the GPU's VRAM using a microSD card. Nonetheless, as long as the graphics are pleasing and the frame rate is stable, I am content with the setup."

  • @Punisher6791
    @Punisher6791 Před 19 dny

    As someone who went from a 3060 Ti with 8GB of VRAM to a 4070 Ti Super with 16GB of VRAM, I can say it makes a world of difference.

  • @rdc3830
    @rdc3830 Před 16 dny

    8GB is perfect for most GPUs because the power needed to use more is typically not present.
    You can have a 16GB 3050, but it's not designed to handle a heavy title using UE5 or the like.

    • @rdc3830
      @rdc3830 Před 16 dny

      Not always but for now mostly

  • @Louis_Bautista
    @Louis_Bautista Před 22 dny

    I'm really not worried about the 12GB on my 4070 Super. I play at 4K60 and even though people say you need higher VRAM especially for 4K, what they forget is that once you're at 4K, DLSS works so well that at quality mode you're getting a better than native image while using less VRAM because you're only rendering at 1440p internally.

    • @Wolfos530
      @Wolfos530 Před 22 dny +2

      12 should really be fine for any current-gen PC port. I get how 8 can get a little tight for lower budget ports - consoles do have 16 gigs shared and that's a bit more flexible - but 12 is significantly more.

    • @Sp3cialk304
      @Sp3cialk304 Před 21 dnem +1

      Same thing applies at 1440p. DLSS quality looks better than native in every game released in the last year+. Last 2 years if you don't mind taking 5 seconds to update the DLSS version being used.

  • @Brent_P
    @Brent_P Před 18 dny

    The bare minimum for discrete GPUs should be *16GB* of VRAM.

  • @jrmjrm2764
    @jrmjrm2764 Před 22 dny

    With the boost in L2 cache, there is no need for 16GB of VRAM, as the 4070 vs 7800 XT comparison demonstrates even at high resolution.

    • @Pikkoroo
      @Pikkoroo Před 22 dny

      Boots up Rust at 1440p: 15GB of VRAM being used.

  • @gonzalomarsilli5741
    @gonzalomarsilli5741 Před 21 dnem +4

    70% of gamers have less than 8gb. make better games, not tech demos

  • @wakuwaku798
    @wakuwaku798 Před 15 dny

    I have a gaming laptop with a 4060, so only 8gb vram. It has a 1080p screen and 8gb is good for 1080p so it should be fine for years to come.

  • @PSx1991
    @PSx1991 Před 18 dny

    Uncharted 1, 2, 3, The Last of Us, Beyond: Two Souls and GTA V ran with 256MB of VRAM and 256MB of RAM.

  • @TheUAProdigy
    @TheUAProdigy Před 18 dny

    I hate hearing developer slander, especially here, where people should understand that developers don't control anything; the execs do.

  • @xxbiohatchetxx
    @xxbiohatchetxx Před 18 dny

    I think it depends on the game and what kind of quality you're really looking for. If you're okay with a bit less quality because of less VRAM, then yeah, it's fine. It really comes down to the person, because one person may be okay with a little less quality, and not everyone cares about top-end quality: look at Minecraft, look at Grounded, look at even Killing Floor 2 or Skyrim/Fallout 4; the list goes on. There are a lot of games that aren't exactly perfect or really high quality but are still very popular. Of all of those I think the highest-quality one is Grounded, though most people won't notice that because they aren't looking in the right spots either. But yeah!

  • @OwtDaftUK
    @OwtDaftUK Před 22 dny +3

    If you're buying a PC today, halfway into the PS5's life cycle, I think it's unwise to assume it has enough video RAM to play any PS5 game ported to PC.

    • @CheckmateStallioN
      @CheckmateStallioN Před 22 dny +2

      PS5 is x86 architecture so porting its games to PC is not much of an issue. We're not dealing with exotic architecture like the PS3's Cell BE

    • @vitordelima
      @vitordelima Před 22 dny

      @@CheckmateStallioN Somehow everything becomes a mess along the way even if all platforms are almost the same.

    • @KrazzeeKane
      @KrazzeeKane Před 22 dny +1

      ​​@@CheckmateStallioNpeople who don't know anything about programming always say this, like because they are both x86 it's just some easy auto convert button they push to port ps5 to PC. There is SO much more involved in it, even though both are x86. It is absolutely an issue still, and requires careful planning and development to make a proper port of ps5 to pc. Things like unified memory complicate direct conversions of ram requirements on pc and such, it's not a simple equation here

    • @CheckmateStallioN
      @CheckmateStallioN Před 22 dny

      @@KrazzeeKane of course theres finetuning to be done when these consoles have pooled memory and everything is on an APU but compared to an architecture like the PS3 where you had to account for 8 different SPUs it is much easier and 1 less major problem to deal with when you have a similar architecture to work with. This is why most 3rd party devs had major issues developing PS3 games because they had barely any training and experience working with that architecture unlike the 1st party devs. The architecture itself posed a far bigger problem to solve than achieving "parity" between unified memory in consoles and dedicated memory allocation (VRAM and system RAM) that we see in PCs

  • @GorillaKong
    @GorillaKong Před 21 dnem

    12 to 8 is a bit mad tho like why not just drop to 10 ahhhhhh

  • @onomatopoeia162003
    @onomatopoeia162003 Před 21 dnem +2

    Should be 12-16-20-24

  • @JoeWayne84
    @JoeWayne84 Před 14 dny

    Show the games released this year and the Vram use.

  • @MA-jz4yc
    @MA-jz4yc Před 21 dnem +4

    There's no reason why graphics cards should come with less than 12-16GB of VRAM in 2024. GDDR6 is dirt cheap (less than 2 dollars per GB); the only reason manufacturers do this is to upsell you more expensive products.

  • @HiddenAdept
    @HiddenAdept Před 19 dny

    At least with my 7900 XT (20GB) I probably won't have to worry for a long time.

  • @iansteelmatheson
    @iansteelmatheson Před 22 dny +2

    it also depends what resolution and quality settings you're comfortable with. I'm still happy at 1080p and medium-to-high settings, so 8GB is probably enough for 4 years for me, especially since I do 90% of my gaming on console anyways.

    • @Pikkoroo
      @Pikkoroo Před 22 dny

      Exactly you can’t run high to ultra textures even at 1080p

  • @brucevandermescht6957
    @brucevandermescht6957 Před 11 dny

    Couldn't care less, I only play games pre 2016. Everything after that is all style and fuckall substance.

  • @SireDragonChester
    @SireDragonChester Před 21 dnem +7

    No, 8GB of VRAM is not enough for current-gen games. Current consoles have 16GB; that is the standard and what devs are targeting as of 2024. Game demands will only go up, IMO, over the next 2+ years. 8GB of VRAM may be fine for older or less demanding games and indie games, but it's not enough for future games that push 4K and RT harder. UE5 games are already showing this.

    • @Katsuchiyo
      @Katsuchiyo Před 21 dnem +3

      With 10 to 13 GB allocated to video memory, since 2020.

    • @SireDragonChester
      @SireDragonChester Před 21 dnem +2

      @@Katsuchiyo
      It's true. When the Horizon Zero Dawn PC port came out, at max settings it was already using 10.5GB of VRAM at just 1080p on my GTX 1080 Ti. Granted, it wasn't running well, and we all know it was a very bad port. Games are only getting more demanding on PC, and the old 8GB VRAM GPUs aren't going to cut it much longer. Even 10GB GPUs may start having issues. My next GPU is going to have at least 12GB of VRAM. IMO 16GB of VRAM will become the bare minimum in a few years. Nvidia may say VRAM doesn't matter, but that is bullshit, considering all the lies and BS they've been pulling. IMO the whole RTX 4xxx line is a scam, since it's all cut down at a premium price.

    • @dr.sivavignesh664
      @dr.sivavignesh664 Před 21 dnem +2

      That's shared memory. Watch some more df videos. The 16 gb memory is for vram plus RAM.

    • @SireDragonChester
      @SireDragonChester Před 20 dny +1

      @@dr.sivavignesh664
      I know it's shared GDDR6 memory. Devs have access to roughly 12.5GB of it for games; the rest is for the OS and background tasks. The XSX has a somewhat odd memory config, and the XSS has only 10GB, leaving devs probably 7.5 to 8GB for games, which isn't enough.
      I've been gaming since the Intel 8088/86 days and building PCs since the 90s, and I'm an ex beta tester/server admin/ex moderator from a small indie dev team.

  • @TurboPikachu
    @TurboPikachu Před 22 dny +2

    Meanwhile, I'm still running my 4GB Radeon RX 480, and I have yet to own a 4K TV because the Nintendo Switch looks disgustingly pixelated or soupy when upscaled on the panels I've tried it with. Having yet to own a 2K/4K TV has effectively milked a lot more viability out of the 2016 mid-range "4K ready" GPU.
    To this day, I haven't run into any real issue in the latest games (except for the i5-6400 causing judder in VR experiences), still achieving 60fps in practically every new game I've thrown at it around 800p-900p low settings, and a perfectly locked 30fps at 1080p medium in those same games. And it's especially been a delight for newer games like Stray and Hi-Fi Rush to be optimized enough to maintain a rock-solid 60fps at 1080p/high on the 7-year-old card.
    The only game I’ve seen so far that can’t get at least 720p/30fps at minimum is Alan Wake 2. Even after the requirement reduction for the game, it’s still a travesty of unoptimization even on newer cards.
    Once the Switch’s successor comes out, that’d probably be the day I get a 2K/4K TV and build a new PC with a Ryzen 5 and a 16GB Radeon RX 7600 XT

    • @CheckmateStallioN
      @CheckmateStallioN Před 21 dnem

      The mClassic works wonders upscaling Switch games to 1440p, and I heard that if you daisy-chain two mClassics you get even better results. A must-have device for 1440p and 4K monitors.

  • @mxyellow
    @mxyellow Před 21 dnem +5

    We've been held back by 8GB VRAM for too long. It's time to move on.

  • @pf100andahalf
    @pf100andahalf Před 19 dny +4

    I got tired of "not enough vram" and bought a 4090 and now I live under a bridge in a refrigerator box and eat sparrows cooked on a curtain rod over a pile of burning car tires. But I have enough vram now.

    • @a-animal1206
      @a-animal1206 Před 17 dny +1

      Oh, so that was you the other day. I'm under the same bridge after buying a 4090, except I don't even have a curtain rod... so when you're done, let me know so I can cook with it.

  • @kaseyboles30
    @kaseyboles30 Před 21 dnem

    8GB is good enough for near-term 1080p (one or two years); 10 should add a year to that, and 12 should be enough if you upgrade GPUs more than twice a decade. At 1440p, 8GB will do, but not with some settings (texture res maxed), whereas 10 buys you a year or two and 12 two to three years. For 4K, less than 12 is likely a mistake, and 16 will get tight in about 3-5 years. That is not taking into account the raw power of the GPU in question relative to the game.

  • @adriannavarro9962
    @adriannavarro9962 Před 20 dny

    I mean, I heard all of that back in '21 when I miraculously stumbled upon a 3070 Ti in the middle of the crypto crisis, and three years later it still beats the 12GB RX 6700 XT/6750 XT at 1440p in most games. By the time the RX 6700 XT overcomes it I'll have already upgraded. Does that mean I'd buy an 8GB card in '24? Heck no. But I think we're at the same point with 12GB cards right now, and I reckon they will be enough for many gamers who upgrade every 3-4 years, just like the 8GB 3070 Ti has been enough for me up to this point. Of course, if you plan on holding on to the card longer, then get as much VRAM as possible, but in my experience most regular PC gamers don't do that.

    • @Loundsify
      @Loundsify Před 20 dny

      Tbf the RX 6800 was its main competitor, and there are articles now that show the RX 6800 comfortably beating the 3070; I can't remember how much faster the Ti is over the standard 3070.

    • @adriannavarro9962
      @adriannavarro9962 Před 20 dny

      That's not the point though. The RX 6800 was already faster at rasterization than both 3070s at launch. In those days MANY people were arguing the RX 6700 XT would also quickly overtake them because of its 12GB of VRAM, and it still hasn't happened. Has it performed better in an anecdotal number of games? Yes. Has it quickly overcome both 3070s and beaten them across the board? No. Not at all.
      I was merely translating that situation into the current one, when 12GB cards are what 8GB cards were back then. I'd bet things are gonna turn out similarly, with, say, the 4070 Super beating the RX 7800 XT throughout their prime, only to be overtaken when most gamers purchasing at that tier of GPU have already moved on to new ones.

  • @coleshores
    @coleshores Před 21 dnem

    8GB of VRAM will be more than enough for AA and indie gaming for the foreseeable future, so there isn't a shortage of content that will operate under those limits. It's only the massive AAA games like GTA 6 where 8GB of VRAM is going to become an issue.

  • @djtomoy
    @djtomoy Před 22 dny +3

    You need way more than 8gb, I shit 8gb 😅

  • @dr.sivavignesh664
    @dr.sivavignesh664 Před 22 dny +37

    VRAM requirements have lately been overstated, especially due to lazy, piss-poor ports. Cyberpunk 2077 and RDR2 maxed out still use far less VRAM while looking better than anything released lately.

    • @muratssenol
      @muratssenol Před 22 dny +11

      Cyberpunk 2077 as of version 2.12, at Hardware Unboxed optimized settings at 2K, uses around 7.7GB of VRAM with DLSS Quality mode; there is not enough headroom for ray tracing or to install high-quality texture mods. I have an RTX 3070 Ti and I can't use ray tracing just because the GPU does not have enough VRAM for it at 2K. This card, for example, should have been a 12GB card imo.

    • @dr.sivavignesh664
      @dr.sivavignesh664 Před 22 dny +1

      @@muratssenol You have a valid point, but the point about high-quality textures is invalid. If we had something like 24GB of VRAM we could talk about high-quality texture mods, because we'd have the headroom and nothing uses it. The point is that Cyberpunk had excellent textures to begin with; I don't know if you would notice much difference from the highest-quality textures on a 2K display unless you are zooming in on every little detail. 2.12 is probably the final patch (they said they are done optimizing the game), so it's reasonable they gave us scalability; PC has always been about scalability, and that's what the wide range of cards is for. What I'm trying to say is that here you are talking about the highest possible setting being unusable with ray tracing, while the comparison games, obviously shitty ports, can't even load tolerable textures (even medium to high, not ultra) yet use 12GB and above of VRAM without any ray tracing. Just like Alex said, we can't trust developers to make it a great experience for 8GB VRAM users. It's not that we can't have great textures at 2K that still use less than 8 gigs; it's that the devs don't seem to think it's their problem to fix, and they expect the consumer to fix it by buying cards with more VRAM. Obviously, if it progresses like that, even 24GB won't be sufficient. Also, nobody could have predicted this when the 3070 Ti launched or when you bought it, but Nvidia rolling back to 8GB on the 4060 is an absolute blunder.

    • @vitordelima
      @vitordelima Před 22 dny

      And they still use the common, terrible data structures for storing geometry, textures, light maps, G-buffers... which lack proper compression. They also lack any smarter method of managing memory use and statically transfer everything that seems to be needed for rendering (according to simpler algorithms such as visibility tests) even when it isn't.

    • @1ch0
      @1ch0 Před 21 dnem +1

      RDR does not look that good anymore and this whole "unoptimized game!!!11" is just a very simplistic view.

    • @robo_bravado
      @robo_bravado Před 21 dnem +4

      Cyberpunk will spike north of 10GB of committed vram at 1440p with ray tracing.

  • @Funnylittleman
    @Funnylittleman Před 22 dny +1

    I really wish I could get more than 8gb VRAM but I can’t afford a 4080 😢
    Maybe a 4070 Super but that’s not much better honestly.

    • @KrazzeeKane
      @KrazzeeKane Před 22 dny

      4070 ti super is the card to go with if you need 16gb vram, it's what I would have bought if it had existed last year but I had to save up extra to go for a rtx 4080.
      The 4070 ti super isn't cheap that's for sure, but it will absolutely treat you right and last you the next 4 to 5 years easy at high settings and high fps. I wouldn't go with a 12gb card like the 4070 super unless you plan to upgrade in 2 to 3 years, or you wish to have to turn down texture settings.
      The 4070 ti super will have no such issues, and the 16gb vram will handle ANY possible titles you throw at it for the next 5 years at least.

    • @CaptainScorpio24
      @CaptainScorpio24 Před 22 dny

      ​@@KrazzeeKaneexactly i have upgraded from 3yr old 3070 founders edition to gigabyte rtx 4070 ti super eagle oc 16gb at launch jan2024 😊

    • @Pikkoroo
      @Pikkoroo Před 22 dny +1

      I got a 16gb 6800xt back in 2023 for $399, you’re asking the wrong company to provide

    • @mitchjames9350
      @mitchjames9350 Před 21 dnem

      Why not a 7800 XT or 7900 XT? Both are cheaper and better than their Nvidia counterparts.

  • @Orlyy
    @Orlyy Před 15 dny

    I'm glad I upgraded to my 4080 from a 3070. I've never once gone AMD, but if you're on a budget, and are looking at an 8gb Nvidia card just go get an AMD card.

  • @sonusmeister2325
    @sonusmeister2325 Před 21 dnem

    Honestly I just play games on medium or even low textures; it's not like I'm streaming or anything...

  • @adlibconstitution1609
    @adlibconstitution1609 Před 19 dny

    So you're saying 8GB of VRAM is not enough at 1080p?
    Cards like the RTX 3070, RX 6650 XT/6600 XT, 3060 Ti, RX 6600, RX 5700 XT and RTX 2080 Super, all 8GB cards, will become obsolete at 1080p?

    • @bp-it3ve
      @bp-it3ve Před 19 dny

      At 1440p it is already a problem, and eventually it will be the same at 1080p.

  • @brtcobra
    @brtcobra Před 21 dnem

    Red Dead Redemption 2 uses only 6-7GB at 4K max settings. This limit is down to devs not optimising PC games.

    • @mrbobgamingmemes9558
      @mrbobgamingmemes9558 Před 21 dnem

      Yeah, a lot of these games that use a stupid amount of VRAM look worse than RDR2.

    • @Hayden2447
      @Hayden2447 Před 21 dnem +1

      rdr2 is from 2018 tho, 2019 for PC. It has no RT and the textures are lower res than some newer games. It is an amazing game for its time but of course anything new should run a 5 to 6 year old game made for last generation consoles. Optimization still matters of course but vram requirements will be higher than they were 5 years ago, that is a fact.

    • @mrbobgamingmemes9558
      @mrbobgamingmemes9558 Před 21 dnem +1

      @@Hayden2447 True, but I would rather textures look like last-gen games if a small texture improvement uses 2-3 times more VRAM. In the world of tech, once you've hit the sweet spot, going beyond it results in a big compromise for a tiny improvement.

  • @maplegamer4736
    @maplegamer4736 Před 21 dnem

    To correct misinformation, they should have addressed the claim that DLSS alleviates VRAM requirements. This is tied to the myth that resolution is the main culprit for VRAM usage. It's actually higher-res textures, which DLSS cannot address.
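
(A back-of-the-envelope illustration of that point, with rough but plausible numbers rather than anything measured from a specific game: render targets scale with output resolution and cost tens of MiB, while a resident texture set costs gigabytes, so texture quality dominates.)

```python
MiB = 1024 * 1024

def render_target_mib(w, h, bytes_per_pixel=8):        # e.g. one RGBA16F buffer
    return w * h * bytes_per_pixel / MiB

def texture_mib(size, bytes_per_texel=1.0):            # BC7 compression ~1 byte/texel
    return size * size * bytes_per_texel * 1.33 / MiB  # plus ~33% for the mip chain

print(f"4K RGBA16F render target: {render_target_mib(3840, 2160):.0f} MiB")   # ~63 MiB
print(f"One 4096x4096 BC7 texture: {texture_mib(4096):.0f} MiB")              # ~21 MiB
print(f"500 such textures resident: {500 * texture_mib(4096) / 1024:.1f} GiB")
# Lowering the internal render resolution (what DLSS does) saves tens of MiB per target;
# lowering texture quality saves gigabytes, which is why DLSS barely moves VRAM usage.
```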

  • @alanreynolds4262
    @alanreynolds4262 Před 21 dnem

    I had a 4070 that was hitting the 12GB limit in Hogwarts Legacy, which would cause the game to crash from time to time. I ended up returning the card and went to an RX 7900 GRE with 16GB of VRAM. The game never crashes now. I hate how stingy Nvidia is with VRAM.

  • @chris42069
    @chris42069 Před 21 dnem

    8GB is okay for me on the 3070 because of DLSS, and only because of DLSS. It's a hardware dongle for the best tech in gaming.

  • @jorismak
    @jorismak Před 22 dny

    To properly use DLSS upscaling, you need the textures you would use at your _output_ resolution. So, for textures at least, playing at DLSS 4K Performance still requires loads more VRAM than native 1080p.

  • @cezarstefanseghjucan
    @cezarstefanseghjucan Před 20 dny

    Even 12 GB of VRAM isn't enough for 4K.

  • @acardenasjr1340
    @acardenasjr1340 Před 21 dnem

    For now, yes.

  • @myatko9246
    @myatko9246 Před 21 dnem

    I'm still using my GTX 1660 Ti

  • @Payload25
    @Payload25 Před 20 dny

    So I can throw my 2060 in the trash for future games. Just because the devs no longer optimize their game.👍

  • @asmodeusml
    @asmodeusml Před 20 dny +3

    The cope of the bro saying "NVIDIA showed that 8GB is enough" is unreal.

  • @killermoon635
    @killermoon635 Před 22 dny +2

    I have not played any game that has issues with 8GB.
    The Last of Us Part 1 runs fine on 8GB; I just have to drop the texture setting to high instead of very high and it runs perfectly fine. I don't see any visual difference between high and very high. Forspoken has no issues with the latest patches either...
    Horizon Forbidden West, Ratchet etc. all ran fine.

  • @Radek494
    @Radek494 Před 21 dnem

    You can't even max settings in Ghost of Tsushima at 1440p with 8 GB VRAM, it will swap to RAM

  • @playneverends1228
    @playneverends1228 Před 21 dnem

    "hello founders"
    that's a neat name for old people who criticize games and gaming hardware for a living

  • @danny200mph5
    @danny200mph5 Před 19 dny

    Good thing I bought a 4080 with 16gb.

  • @masterquake7
    @masterquake7 Před 21 dnem

    8 to 10GB is perfectly fine for the average gamer. The 3080 is 10GB and still kicks ass.

  • @virtualpilgrim8645
    @virtualpilgrim8645 Před 22 dny +2

    I have the top of the line factory overclocked ASUS 2080 Ti with 11 GB VRAM and can play any game at 1440P at max settings with frame generation. 5800X CPU. 32 GB windows 11 RAM.

  • @_ProBro_
    @_ProBro_ Před 18 dny

    I have 4 😭😭😭😭

  • @m-copyright
    @m-copyright Před 22 dny +14

    Having 8GB VRAM in 2024 (especially for the price the cards go for) is like having a dual core processor in 2024.
    Can it be used? Yes.
    Will it be a good experience? No.

    • @Threewlz
      @Threewlz Před 22 dny +1

      Idk man, I've had a great experience with my 3070 so far, secret is to not max out settings and stay away from shitty ports

    • @KrazzeeKane
      @KrazzeeKane Před 22 dny +3

      ​@@Threewlzstaying away from AAA games (which are generally crappy ports) and having to run on low settings isn't what most people consider a "Good experience". It's a "making do" experience imo

    • @Pikkoroo
      @Pikkoroo Před 22 dny +2

      @@Threewlz Exactly, you can't max out textures due to the limited VRAM on your card.

    • @Threewlz
      @Threewlz Před 21 dnem

      @@KrazzeeKane I'm not running anything on low settings, though. Any game released in the last couple of years is manageable at high textures/settings. Horizon FW is one of the best-looking games of this gen; guess what? High textures. Alan Wake has got the best geometry of this gen; guess what? High textures. This VRAM stigma is way overblown.
      An RTX 3070 delivers a better experience than your average PC consumer hardware or console is going to, I can assure you of that.
      Edit: even that mess of a port that was TLOU Part 1 is playable at high settings/textures.

    • @Threewlz
      @Threewlz Před 21 dnem

      @@Pikkoroo When there isn't a visual difference between high and max textures, it doesn't make a difference. Most of the time it's just a VRAM streaming pool with no visual discrepancies whatsoever.

  • @virtualpilgrim8645
    @virtualpilgrim8645 Před 22 dny +1

    Alex looks very German\Scandinavianesque

  • @nologig
    @nologig Před 13 dny

    256MB, 512MB, 1GB, 3GB, 6GB, 12GB... next time I need to upgrade my graphics card to newer technology beyond the current RTX feature set, it will have to have 24GB too.
    Until that is available at a reasonable price point, my current RTX 3060 12GB will be good enough for me, as was the GTX 1060 6GB before it and the GTX 660 Ti before that.
    Only shills are defending 8GB in 2024 for a GPU upgrade at an over-200-dollar price point, when VRAM usage has always gone up along with every other hardware demand of the latest games.

  • @OG-Jakey
    @OG-Jakey Před 21 dnem

    I'd argue it's not just PC that devs don't give a shit about; it's consoles as well. Games are generally made poorly across all platforms at this point, and the only explanation I can think of is a lack of skill in the industry. A veteran dev at a big studio has brought this up too; he didn't explicitly say this and may be too kind-hearted to come to that conclusion, but he has mentioned frustration that code which should take 45 minutes takes devs 4 weeks, which does point to a lack of skill plaguing the industry imo.

  • @soralifestory
    @soralifestory Před 21 dnem +1

    Me with my 4GB of VRAM, happy at 1080p low settings and enjoying games. If you want the power-user experience you've got to pay the price, no ifs, ands or buts about it; there's a price to pay for sticking with budget cards, I understand that, and these companies should gear more towards the consumer, but alas, here we are. As a child I'd have cried tears of joy for the performance I get on a 4GB card; the day I can afford a high-end GPU, idk what I'd do.

  • @luiscantu4968
    @luiscantu4968 Před 21 dnem

    Size matters