Wait, Did They Fix 8GB GPUs?

  • Uploaded 13. 06. 2024
  • After looking at Hardware Unboxed's latest RX 7600 and RTX 4060 Ti reviews, I wanted to see for myself if 8GB GPUs were as broken as they claimed. Turns out, a lot has changed in the past few months, and in most cases, these graphics cards are fixed!
    I still don't recommend buying either of these GPUs, but if you have an 8GB card, rest assured that it's still relevant in 2023.
    Memes and Tech News: / theterk
    Join the TerkForce (Discord): / discord
    My Tom's Hardware Articles: muckrack.com/theterk
    #nvidia #amd #pcgaming
    00:00 VRAM Monster
    1:39 Overall 8GB Performance
    3:26 Measuring VRAM
    5:54 "Ultra Settings are Dumb"
    8:47 They Fixed the Glitch
    10:50 8GB is Mainstream Now
    13:53 Most Commenters Didn't See This
  • Science & Technology

Comments • 920

  • @AlexCrMaReLiLukyanenko
    @AlexCrMaReLiLukyanenko 1 year ago +154

    The point is that games today don't look THAT much better, yet they require THAT much more memory. And when you turn settings down, they start to look worse than some older games that consume 3-5x less memory and compute power at HIGHER settings.

    • @frozby5973
      @frozby5973 1 year ago +31

      THIS. I say this all the time: yes, there should be more VRAM, and it's not that expensive to add 4 more gigs, especially on new-gen cards... we had an 8-gig 1080 release in May 2016, THAT'S 7+ YEARS AGO. BUT there is no excuse for games to look worse than games from 5+ years ago while running so horribly and using so many resources. It's just laziness and greed from both sides.

    • @BigHeadClan
      @BigHeadClan 11 months ago +10

      Aye, Halo Infinite looks much worse than most of the remastered series and Halo 4-5 while performing awfully.
      Granted, that too is due to poor optimization from the developer.

    • @andremalerba5281
      @andremalerba5281 11 months ago +6

      On the latest Broken Silicon, Tom showed a tweet from someone blaming DX12, then got in touch with some devs and confirmed the situation.
      DX11 had a built-in feature that managed which textures should stay in memory and which shouldn't.
      On DX12 and Vulkan the developer has more access to the hardware, since it's closer to the metal, so the devs themselves have to build a system to manage which textures get flushed and which don't. And devs spend most of their time optimizing for the mass market, which is PS4, PS5, and 12GB VRAM GPUs; that's what one of the devs told him, everything else is an afterthought. The devs themselves said, "On PC you just lower settings or brute-force it and the game runs; we need to ship a game that runs! Optimizing for 8GB of VRAM nowadays would add two more years to game development." So yeah, that's the harsh truth: publishers push devs to release games in a working state ASAP, the devs only have time to optimize for so much, and they now have to manage texture residency themselves instead of relying on the API. That's the present and future situation.
      On the flip side, the devs said DX12 and Vulkan allow games to scale better and run on even lower-end setups, but they may or may not improve VRAM usage for 8GB cards in post-launch patches.
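      To make that residency management concrete, below is a minimal, hypothetical sketch (all names invented; not any engine's or D3D12's actual API) of the kind of least-recently-used texture bookkeeping that DX11's runtime handled automatically and that DX12/Vulkan leave to the developer:

```python
from collections import OrderedDict

class TextureResidencyManager:
    """Toy LRU residency manager: keep textures inside a fixed VRAM budget,
    evicting the least-recently-used ones when a new upload will not fit."""

    def __init__(self, vram_budget_bytes: int):
        self.budget = vram_budget_bytes
        self.used = 0
        self.resident = OrderedDict()  # texture_id -> size in bytes, LRU first

    def request(self, texture_id: str, size_bytes: int) -> None:
        """Called for every texture the next frame's draw calls need."""
        if texture_id in self.resident:
            self.resident.move_to_end(texture_id)  # mark as recently used
            return
        # Evict LRU textures until the new one fits in the budget.
        while self.used + size_bytes > self.budget and self.resident:
            _, evicted = self.resident.popitem(last=False)
            self.used -= evicted              # real engine: free the GPU memory here
        self.resident[texture_id] = size_bytes  # real engine: upload to VRAM here
        self.used += size_bytes

mgr = TextureResidencyManager(vram_budget_bytes=8 * 1024**3)  # an 8GB card
mgr.request("rock_albedo_4k", 64 * 1024**2)   # hypothetical assets
mgr.request("hero_face_8k", 256 * 1024**2)
```

      The bookkeeping itself is simple; what reportedly eats the development time is tuning budgets, priorities, and streaming heuristics so they hold up across wildly different VRAM sizes.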

    • @tiagomorgado3798
      @tiagomorgado3798 11 months ago +9

      Yup. Remember The Witcher 3: it still has some good textures, and the game uses 1GB of VRAM.

    • @DenverStarkey
      @DenverStarkey 11 months ago +7

      @@frozby5973 The 70-class cards got screwed the worst. At least they slowly upped the VRAM on the 80 class going into the 2000 and 3000 series, while the 2070 and 3070 both got stuck with 8 gigs.

  • @Teapose
    @Teapose 1 year ago +184

    Also be careful with average frame rates. I guarantee you will not remember the averages; you will remember those 1%/0.1% lows.

    • @Micromation
      @Micromation 1 year ago +30

      So much this. People are looking at the wrong numbers in benchmarks.

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago +10

      It depends on how frequently those 0.1% lows happen. And sometimes it is not a hardware issue, but simply a game-engine issue.

    • @CyberneticArgumentCreator
      @CyberneticArgumentCreator 1 year ago +20

      @@arenzricodexd4409 They happen 0.1% of the time...? So a stutter, on average, every 10 seconds in a game running at a 100 fps average?
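      For reference on that arithmetic, here is a rough sketch of how 1%/0.1% lows are commonly derived from a frame-time log (exact methodology varies by reviewer; this version averages the slowest slice of frames):

```python
def percentile_lows(frame_times_ms, fraction):
    """Average fps over the slowest `fraction` of frames (0.01 -> 1% lows)."""
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst) * fraction))         # size of the slow slice
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms                         # ms per frame -> fps

# 60 seconds at ~100 fps with one 100 ms stutter every 10 seconds:
trace = [10.0] * 5994 + [100.0] * 6
print(f"average  : {1000.0 / (sum(trace) / len(trace)):.1f} fps")  # ~99 fps
print(f"1% lows  : {percentile_lows(trace, 0.01):.1f} fps")        # ~52.6 fps
print(f"0.1% lows: {percentile_lows(trace, 0.001):.1f} fps")       # 10 fps
```

      With that trace the average still reads ~99 fps while the 0.1% lows collapse to 10 fps, which is exactly the once-every-10-seconds stutter described above.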

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago +4

      @@CyberneticArgumentCreator Just look at the frame-time graph. Sometimes it only happens once, or when the game loads a new area, yet the on-screen display still reports it in the lows. Ultimately, even when the frame-time graph shows spikes, they are not always felt and do not always affect smoothness. That's why the numbers we get from this kind of measurement cannot be definitive proof of how good the experience is.

    • @xerxeslv
      @xerxeslv 1 year ago

      Daniel Owen just made a video about those, and somehow the 4060 Ti is doing fine...

  • @yasminesteinbauer8565
    @yasminesteinbauer8565 1 year ago +178

    If a €/$400 graphics card is not able to match a two-year-old console in terms of texture resolution, that's not hysteria. With less than 12 GB you will simply run into more and more problems in the near future. So the problem is by no means solved.

    • @terraincognitagaming
      @terraincognitagaming 1 year ago +29

      Or you can be smart and not play on Ultra. Reminder: PC gamers were supposed to be smarter and more educated about settings than their console counterparts. But these days, zoomers gonna zoom and must toggle everything to max, then complain WHY DOESN'T IT RUN WELL ON MY MACHINE!?!? while getting ZERO visual gain from Ultra.

    • @DarkDorel
      @DarkDorel 1 year ago +36

      @@terraincognitagaming Or you can be smarter still and not buy an 8GB GPU at $400 in 2023.
      It's clear this generation is just a refresh of the 4-year-old 3000-series cards, with 5-10% more performance; with 8GB, don't expect to play new UE5 games in 2024.
      This whole 7000 & 4000 series feels like a scam move from both companies.
      We aren't getting any real upgrade with these GPUs.
      And I think the real upgrade will come in 2024 with the "real next-gen" GPUs... better to save money until that day comes.

    • @yasminesteinbauer8565
      @yasminesteinbauer8565 1 year ago +9

      @@terraincognitagaming In fact, in many last-gen titles High and Ultra often had barely visible differences. However, that is no longer true for current titles. In Diablo 4, for example, you can see very clear differences in texture resolution between High and Ultra. And if you want to achieve the same texture quality as on consoles, you need a 16GB card to play smoothly. And again: we are not talking about homeopathic differences but clearly visible ones.

    • @headoverheels88
      @headoverheels88 1 year ago +5

      That's not entirely fair, considering one benefit of consoles is that there is ONE model (give or take), which means game developers can optimize for a single platform with predictable hardware. It's a similar dynamic to developing apps for Apple vs. Android: the diversity of models makes Android harder to optimize for.

    • @terraincognitagaming
      @terraincognitagaming 1 year ago +5

      @@yasminesteinbauer8565 The texture-resolution difference is due to poor optimization, something The Last of Us proved. Look at version 1.0: the Medium textures looked absolutely horrible. Guess what, patch 1.0.5 drops and WHAT! The texture quality practically quadruples AND the VRAM usage goes down by as much as 2GB depending on the scene. CRAZY, right? Yeah, Blizzard is doing the same thing.

  • @PropaneWP
    @PropaneWP 1 year ago +130

    This is very much a case of "when the only tool you have is a hammer, every problem looks like a nail". A lot of hardware enthusiasts have developed a bad habit of looking at the problem from the wrong end. The whole philosophy of brute-forcing the performance of PC games is extremely backwards: just throwing more hardware at software problems is a huge waste of resources and very quickly yields diminishing returns. Yet most gamers seem to default to "my hardware isn't fast enough" whenever they encounter problems originating from a badly coded game.

    • @frozby5973
      @frozby5973 1 year ago +15

      It's true on both sides, really. VRAM isn't that expensive; they should add more. But at the same time, that's no excuse for devs to make very badly made games, and recent releases are a prime example of that: games done very quickly, greedily, and lazily for the most profit.

    • @veduci22
      @veduci22 11 months ago +14

      When you buy a 32GB RAM kit for $30 more instead of 16GB, nobody is concerned about "brute-forcing", but when you want more VRAM on lower-midrange cards that cost $300-600, while 8GB of GDDR6 costs ~$27, suddenly it's a problem...

    • @andremalerba5281
      @andremalerba5281 11 months ago +1

      @@veduci22 Ah, another fellow follower of Broken Silicon.

    • @RevDrCCoonansr
      @RevDrCCoonansr 11 months ago +3

      It's not badly coded. Computer Science 101: 35% of programming is done first, the other 65% is bug fixing. That was not a joke; it was an actual hard rule, and optimization is 65% of a project 9 times out of 10. The issue we have been seeing is a game coming out at the same time on all systems being more important than it actually running well on all systems. Due to the install base and the nature of consoles, they get programmed first. And consoles use mainly the same code as PCs now, so finding talented programmers who aren't already in a studio cranking out console games is slim these days. So we wait for the PC patches. What do they do? Allocate resources correctly for PC instead of defaulting to what this gen's consoles do.
      In the past it was always about throwing more horsepower at it; right now it's about optimization. And the only people who should be worried about VRAM are those playing at 4K or higher, especially high-refresh 4K. I max out 8GB at 4K 120. I definitely needed a 3080 10GB, but ended up with a 3070 when I saw the window of opportunity to buy a GPU at retail disappearing as I clicked links, and I have had to live with reduced settings since, which is what this video recommends. That makes me wonder: is the real issue the glut of noobs who hit the PC scene since COVID, trying to figure out something they have no relation to after decades as console fans?
      But I digress. Conspiracy theories aside, what people should be primarily concerned with is the time it takes to optimize for PC. Take that into consideration and hope the industry starts releasing only when ALL ports are polished and ready to go. Thing is, they never did before. And just because consoles and PCs share the same tech does not mean it works the same, nor does it mean AAA software houses will start cranking out PC games FIRST unless they already do. No, all the old rules apply, and I do believe they are confusing people.

    • @mickmoon6887
      @mickmoon6887 11 months ago +3

      It's true that modern game-software standards have gone lenient and bloated over the years, using more system resources like VRAM, RAM, and CPU, from lazy developers and company culture.
      It's also true that modern games need more VRAM to push good quality at higher resolutions. It's been almost two decades and the vast majority of gamers are still stuck at 1080p because GPU manufacturers refuse to increase VRAM capacity on their most common low- and mid-range GPUs. That's simply inexcusable: two decades of slow progress because of the GPU duopoly mafia.
      Just 2-3 generations ago (a console generation ago) you could get an 8GB GPU, the max at that time, for under $300. Now you can't get the same performance and VRAM capacity at that price, or even at double it; the worst part is you have to pay double for what you could have gotten back then. Compare the recent RX 6600 to the RX 580. Shame, AMD.
      VRAM, i.e. DRAM, has always been cheap, only expensive during shortages. Someone here mentioned it's cheaper to increase your RAM capacity than your VRAM capacity; way back, you could even upgrade VRAM just like RAM.
      Both the lazy devs plus industry culture and the GPU companies' greed are at fault here, as both benefit from this mess at the expense of consumers.

  • @Jkend199
    @Jkend199 1 year ago +44

    For me this is very simple:
    1. On a low-end card (under $200 USD, IMHO), yes, I absolutely expect to turn down settings, even at 1080p 60 FPS.
    2. On a midrange card ($200-400), expectations are determined by price... at $399.99 USD I expect 1440p 60 fps with no compromises; the closer I get to $200 USD, the more compromises I am willing to accept.
    3. On high-end cards (over $400 USD), I expect 1440p 60 fps with no compromises, and once you hit $600 USD I want 4K 60 fps, yes, with settings turned down, but I expect 4K 60 fps from a card that by itself costs more than a current-gen console... I don't think that's even slightly unreasonable.
    Expectations are set by the price. If Nvidia/AMD want outrageous price premiums, I want outrageous performance.

    • @divanshu5039
      @divanshu5039 1 year ago +3

      Totally agree with you.

    • @Z3t487
      @Z3t487 10 months ago +1

      "If Nvidia/AMD want outrageous price premiums, I want outrageous performance." How about more performance at the same price as last gen? You know, I was maybe interested in an RTX 4080, but not for a minimum of $1200. Maybe the price scales well against a last-gen RTX 3080's performance, but I don't care; I'm not going to submit to the new logic, which is basically: goodbye better performance per price, hello "the more you pay, the better the performance, scaling (almost) perfectly with price".

    • @Jkend199
      @Jkend199 10 months ago +1

      I was thinking more of the 4070 coming up to 4080 performance while staying at 4070 prices. Not that Nvidia will ever do that.

  • @badbasic
    @badbasic 1 year ago +131

    I think the biggest issue is that texture quality is one of the settings, if not THE setting, that impacts visual quality the most. And even though it has such a huge impact, if you have enough VRAM it does not affect your performance at all.
    It's simple: if you have enough VRAM, you can go for the highest texture pack and it won't affect your performance; if you don't, it will either kill your performance or mess up the visuals, like in Hogwarts and Halo.
    Almost every other visual setting has a performance price no matter the hardware; unless you are CPU-bottlenecked, you are trading performance for higher settings.
    So what you can do, even with an older card that has enough VRAM, is turn down other settings to get good enough performance, crank up textures for free, and the game will still look very good because of how important textures are.

    • @Zero939
      @Zero939 1 year ago +25

      This. His ultra-settings reference is misplaced. Textures are unrelated to performance until the VRAM limit is hit, which is the origin of the anger in the first place.

    • @diegochavira5815
      @diegochavira5815 1 year ago +28

      Totally agree. This whole debate is just an attempt to justify 8GB cards... The ultra settings HUB is referring to are the ones that actually impact performance, and as you said, textures have no impact on performance whatsoever, but if you don't have a big enough frame buffer, everything goes south.
      Given the prices Nvidia and AMD are asking for these "midrange" cards, they should have at least 10 to 12 GB of VRAM.

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago +4

      In the majority of games you can lower the texture setting from ultra and still not see the difference; even in side-by-side still pictures you really need to look for it. In RE4R, for example, you can set textures to High 2GB/3GB and it can still look the same as High 8GB unless you really play with the camera angle and hunt for the differences.

    • @georgebessinski2122
      @georgebessinski2122 1 year ago +2

      @@arenzricodexd4409 Textures are not the only setting in the RE4 remake that eats up VRAM. If you want to max it out and use RT, you will be left using the 1GB texture pack.

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago

      @@georgebessinski2122 With the RE Engine I have always thought the game asks for more than it needs; in reality you can still play smoothly even without the recommended amount of VRAM. In RE4R I can push everything to max except shadows and volumetric lighting at medium (I did not see a difference above that), no motion blur, hair strands off, RT off, texture High 6GB, and my 3060 Ti gets mid-60s to mid-70s fps in open areas. Indoors it's usually 80 to 90-plus, sometimes over 100. And this is at 1440p.
      Many people see the game tested on cards with more VRAM, see how much VRAM it ends up using, and conclude it must run poorly or have issues on cards with less.

  • @titan_fx
    @titan_fx 1 year ago +163

    Nvidia thought 8 GB of GDDR6 cost $100. What a time to be alive.

    • @Micromation
      @Micromation 1 year ago +17

      They probably want to force power users to pay up and didn't really think about gamers at all. A 4060 Ti with 16GB would be the go-to card for many 3D artists who in the past kept investing in Titans and learned to optimize their scenes around VRAM limitations because Quadros were too damn expensive. Yeah, the 4090 is faster, but speed is not that big of an issue for most projects (heavy projects get pushed to a render farm anyway, because it's faster and cheaper than rendering at home). VRAM is, because it limits the complexity of your scene, and you want that viewport render preview and the ability to churn out quick sample renders before you commit to the render farm.

    • @BenedictPenguin
      @BenedictPenguin 1 year ago +15

      @@Micromation Aren't you forgetting it only has a 128-bit bus, among other cut-down specs? The more of that VRAM buffer you use, the slower it gets; the memory bus simply isn't enough for such capacity.
      I do agree it will be a pretty suitable 3060 12GB replacement for budget-oriented artists until something better comes out.

    • @WayStedYou
      @WayStedYou 1 year ago +17

      No, they want to keep their 70%+ margin on what costs 30 bucks.

    • @Micromation
      @Micromation 1 year ago

      @@BenedictPenguin It matters very little for this particular application.

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago +5

      @@WayStedYou If a company sells its hardware near cost, how is it going to finance R&D and pay its engineers' salaries?

  • @thestoicwhinger
    @thestoicwhinger 11 months ago +5

    In an age where developers release unfinished games, is it really that surprising that efficient memory management isn't exactly a priority?

  • @Pidalin
    @Pidalin 11 months ago +8

    8GB would be more than enough if games weren't broken and developers had heard of optimization.

  • @0nyx11
    @0nyx11 1 year ago +21

    The main problem is that my GTX 1070 already has 8GB of VRAM, and I feel scammed paying so much money for something I can't count on being good for more than a few years, since it has the same amount of VRAM.

    • @BigHeadClan
      @BigHeadClan 11 months ago +6

      To be fair, your 1070 came out in 2016, and you most likely still have another year or two left in the card.
      That's an excellent run for a GPU.

    • @mannyvicev
      @mannyvicev 11 months ago

      @@BigHeadClan Yeah, on low settings.

    • @aquatix5273
      @aquatix5273 11 months ago +1

      @@mannyvicev That is entirely because the GTX 1070 is an old card without good throughput in the modern era, not because of its VRAM. GTX 1000 cards simply don't perform well in modern games anymore, unless you have the GTX 1080 Ti, which had amazing throughput for its time.

    • @aquatix5273
      @aquatix5273 11 months ago

      @tomrobinson2914 Yup. Amazing that the card was able to keep up for so long, though; now its performance sits at the low end of modern generations.

    • @mitu123Copper
      @mitu123Copper 11 months ago

      @@BigHeadClan I don't have another year left in my 1070; heck, I'm at 3440x1440 and badly need a stronger GPU. I hope to get one by the end of this year.

  • @michaelkennedy320
    @michaelkennedy320 1 year ago +340

    In 2023, any GPU that costs $200 or more should have at least 10GB of VRAM.

    • @MrBlacksight
      @MrBlacksight 1 year ago +40

      16

    • @Micromation
      @Micromation 1 year ago +22

      Snatched a second-hand 6700 XT in perfect condition for 250 a few days ago for my secondary PC: sweet, sweet 4060 Ti performance and 12GB of VRAM :D

    • @curie1420
      @curie1420 1 year ago +27

      Good luck getting a $200 GPU... the best you can get is a 4050, which is probably just a 3050 with a narrower bus.

    • @Mayeloski
      @Mayeloski 1 year ago +3

      @@MrBlacksight 20

    • @Taluvian
      @Taluvian 1 year ago +12

      Low end 12GB and midrange 16GB.

  • @27Zangle
    @27Zangle 11 months ago +6

    I've always started with high settings and then fine-tuned each game. Ultra is hard on nearly any system, and over time I learned that going down a setting and fine-tuning not only increases the lifespan of the system but also saves some money on utilities, AND I could never really tell the difference graphically.
    I honestly think many people, even those who know how to build and throw tons of money at components, just set their games to ultra and call it a day. They never really spend time in the settings, or in the system settings as a whole.

    • @mitu123Copper
      @mitu123Copper 11 months ago +1

      Yeah, I agree with you on that. I normally just play on High settings for the most part, set textures to Ultra, and call it a day.

  • @Dulkh4n
    @Dulkh4n 1 year ago +77

    The thing is not that you should play on High because it's the rational thing to do; it's that a mid-tier card in 2023 shouldn't force you to do it for 1080p at 60 fps... it's a $400 card that would work far better with 12GB of VRAM.

    • @TheTerk
      @TheTerk 1 year ago +1

      That's why I classify the 4060 Ti as a mainstream card. It's way too expensive for where it sits in the market.

    • @aaren117
      @aaren117 1 year ago +12

      It's not just 2023: these cards are being reviewed now, but they will be on sale for about two years. So someone could go "oh, it's tight, but it'll do" before pulling up VRAM Punisher 2025 and just feeling cheated. This is especially problematic when memory right now is cheaper than it has ever been.

    • @zushikatetomotoshift1575
      @zushikatetomotoshift1575 1 year ago +3

      @@TheTerk I am waiting to see where AMD goes in the next year or two: an RX 7900 XTX for 500 US dollars.

    • @zushikatetomotoshift1575
      @zushikatetomotoshift1575 1 year ago +9

      @@TheTerk Also, the 4070 is a rip-off too.

    • @Magnulus76
      @Magnulus76 1 year ago

      1080p as a resolution is still relatively demanding: over 2 million pixels rendered dozens of times per second. If you add additional effect passes like screen-space reflections or ray tracing, it gets that much more demanding. That's all about having adequate numbers of shaders, not necessarily VRAM.

  • @Sp3cialk304
    @Sp3cialk304 1 year ago +39

    Games developed for the current consoles are going to need a lot of VRAM on PC. Not only do the consoles have a decent amount of VRAM themselves, they can also stream massive texture files directly from the SSD, saving on memory and CPU usage. PC needs to start using DirectStorage more and requiring decent NVMe SSDs.

    • @tohro6969
      @tohro6969 1 year ago +1

      Wasn't Windows 11 supposed to solve this issue? And then we probably need the games themselves to support it as well.

    • @zues287
      @zues287 1 year ago +9

      @@tohro6969 DirectStorage has been out for a few months, but game developers need to design their games around it. Diablo IV is the first game to fully support it (Forspoken had half-baked partial support). When all new games start supporting DirectStorage, it should mostly fix the low-VRAM issues.

    • @Sp3cialk304
      @Sp3cialk304 1 year ago +1

      DirectStorage will help. From interviews I've heard with game devs, they are using over 8GB even without the textures, but streaming the textures directly would definitely help. It will take a few devs saying you have to use a fast SSD for this game, and that's scary for a dev, because any time they release something with hardware requirements, people lose their minds and start screaming about optimization.

    • @CyberneticArgumentCreator
      @CyberneticArgumentCreator 1 year ago +2

      You're not making sense.
      The consoles have shared memory, so games made for them use... all of the memory. On PC, these ports are unoptimized: instead of using system memory and VRAM together efficiently, like PC games do, they try to smash everything into both RAM AND VRAM. Think about a game that takes 10GB of system memory and 5GB of VRAM, which is under the PS5's 16GB. Then imagine that game was programmed against unified memory, and the port shoves the system-RAM-ish stuff into system RAM and then the entire thing AGAIN into GPU VRAM, so it thinks it needs 15GB of VRAM. This is way oversimplified and the numbers are made up, but this is conceptually what is happening with many of these ports.
      There's some porting process these companies are using that shoves the square console peg into the round PC hole. Whatever it is takes fewer labor hours but has a shitty result. They save money and get what they pay for in the final product: a shitty game that runs poorly.
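      A toy accounting sketch of the duplication described above (all asset names and sizes invented, in the spirit of the comment's own made-up numbers):

```python
# Hypothetical asset budget (GB) for one game, purely for illustration.
assets = {"textures": 5.0, "geometry": 2.0, "audio": 1.5, "cpu_data": 1.5}
gpu_visible = {"textures", "geometry"}  # what the GPU actually samples from

# Console: one unified pool, each asset stored exactly once.
unified_gb = sum(assets.values())

# Naive port: everything lands in system RAM, and the GPU-visible data
# is then copied AGAIN into VRAM instead of being split between pools.
ram_gb = sum(assets.values())
vram_gb = sum(size for name, size in assets.items() if name in gpu_visible)

print(f"console unified pool: {unified_gb:.1f} GB")                # 10.0 GB
print(f"naive port: {ram_gb:.1f} GB RAM + {vram_gb:.1f} GB VRAM")  # 10.0 + 7.0
```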

    • @Sp3cialk304
      @Sp3cialk304 1 year ago +1

      @@CyberneticArgumentCreator You do understand that a lot of the data that goes to VRAM also goes to DRAM on PC? The consoles having shared memory means the CPU and GPU can pull from the same pool instead of putting the same data in two pools. That, along with direct asset streaming from their SSDs, makes them very efficient.

  • @ImperialDiecast
    @ImperialDiecast 1 year ago +5

    It's funny how ALL cards are affected, not just the 8GB cards people like to dunk on.
    For instance, today's 8GB-at-1440p issues are also today's 12GB-at-4K issues on more expensive cards. You buy them because they are powerful enough for 4K, but at 4K VRAM usage takes another jump and occupies between 9 and 11 GB as of today; tomorrow it will exceed 12GB.
    That being said, it doesn't change the fact that the biggest culprits here are the games themselves, as the updates show: reducing the gap between allocated VRAM and actually used VRAM, and fixing VRAM leaks. Halo Infinite, Hogwarts Legacy, and Forspoken still have lazy devs, but The Callisto Protocol and The Last of Us have been fixed, and Forza 5 also works fine despite high VRAM requirements.
    So at the end of the day, exceeding VRAM shouldn't automatically gimp your framerate or cause texture pop-in everywhere; it depends on the game. Don't rush to buy a slower card with more VRAM over a faster card with less VRAM: there are enough benchmarks out there proving that, e.g., a 3060 Ti still outperforms a 3060 even in games that want more than 8GB of VRAM.

    • @noway8662
      @noway8662 1 year ago

      The 3060 is also cheaper. I don't really see any real argument there; the 3060 just has the extra perk of having more memory.

  • @goldfries
    @goldfries 1 year ago +1

    Yup. In many cases you can't even tell the difference unless you screenshot the scene and do a picture-by-picture comparison.

  • @Viking8888
    @Viking8888 1 year ago +17

    I read something interesting recently (I can't find the article again, though). It said that with DX12, game devs have to do a lot more coding when bringing assets into VRAM, which made dev time way longer. What the devs decided to do was just load as much into VRAM as possible to make things easier on themselves. This willy-nilly approach definitely makes things less optimized, especially if the wrong assets are loaded into VRAM when others are needed: those assets then need to be purged and the correct ones loaded.

    • @TheTerk
      @TheTerk 1 year ago +8

      Sounds like we should hold developers as accountable as the GPU manufacturers, no?

    • @Some-person-dot-dot-dot
      @Some-person-dot-dot-dot 1 year ago +3

      @@TheTerk No, not necessarily. Developers have been developing around 8GB for a while; it's reasonable to think they need a little extra headroom after 7-8 years. I believe at least 10 gigabytes of VRAM is the new appropriate minimum for any gaming card; anything lower should be considered entry level. There will always be a market for 8GB cards, but you have to price those cards properly. Nvidia says "the more you buy, the more you save..." Why is that not allowed to be true for game developers?

    • @Viking8888
      @Viking8888 1 year ago +3

      @@TheTerk To a certain degree, I would say yes, because for DX12 games at least, they are taking an easier route that could well be a cause of unoptimized games. I'm sure there is a whole lot more to it than just what I mentioned, though.

    • @Greenalex89
      @Greenalex89 1 year ago +1

      I think this trend will continue, and more L3 cache will be beneficial too. Usually the publisher is to blame for not giving the devs enough time to optimize.

    • @stangamer1151
      @stangamer1151 1 year ago +1

      The biggest problem here is that some modern games are way too demanding on VRAM for what they offer in return. Texture quality has not really improved much since 2020-2021, so why do these games run so badly on 8GB cards?
      And judging by the current state of the games that were patched, developers could have released them in a much better state at launch. If that had happened, there would not be any big complaints about the 8GB buffer being obsolete.
      To play at console settings these days you really need a 10GB GPU, but 8GB ones should deliver a good gaming experience as well, without any major issues.

  • @nightlyfrost
    @nightlyfrost 1 year ago +10

    If you are spending over £350 on a gaming card, you want to make sure it's somewhat future-proof. You also want to enjoy current and future games at their best settings. Graphics cards aren't cheap, and people aren't willing to spend on a new one every year. My 1060 6GB is still going strong; granted, the settings are at their lowest, but I can still play games on it. These days I would say 10 or even 12GB can justify a £350 price, but 8GB is definitely not worth it. It also doesn't help that gaming companies rush to release games that aren't fully optimized, which sets gamers ranting.

    • @christinaedwards5084
      @christinaedwards5084 11 months ago

      You missed a trick here.
      What you should have done is upgrade to the 2060, then the 3060, then the 4060 as they released.
      You could have gotten a decent amount back; I'm seeing a 3060 go second-hand on eBay for £240 (should be £220 used, imo).
      Say you bought each of those cards for £300 and sold for £220:
      going from a 1060 to a 4060, or 1060 to 2060 to 3060 to 4060, would still have cost you £300 either way, but you'd be far less worried about this whole situation you find yourself in.
      Trying to sell a 1060 in 2023 will be a tough sell.

  • @DJ_Dopamine
    @DJ_Dopamine 1 year ago +2

    I always use my own optimized graphical-fidelity settings, unless they are simply not needed to hit the fps cap.
    The difference versus Ultra/Max is basically unnoticeable to me anyway.
    Texture quality is something I try to keep as high as possible; it has the biggest effect on the look of a game.
    The rest I dial down until I notice significant degradation.

  • @JustGaming24
    @JustGaming24 1 year ago +3

    So 12GB of VRAM for 1440p should be fine for at least 3-4 years?

    • @GmodArgentina
      @GmodArgentina 1 year ago

      12GB will likely be fine throughout the PS5's lifetime.

  • @nastytechniquez9685
    @nastytechniquez9685 1 year ago +5

    I've got 8GB on a 3070 OC and can get great frames at 1440p with lower settings, especially when I turn off ray-tracing options, or use the minimum if the settings allow it. I don't mind it right now, but eventually I'll want a high-end card with lots of VRAM for 4K, once monitors are more affordable.

    • @DenverStarkey
      @DenverStarkey 1 year ago

      I've had a 4K TV as my monitor since 2018. I suffered through 4 years of not being able to run any new games at 4K; then in October 2022 I finally got a 3070, because all the benchmarks showed it could pull off 4K decently well at high or ultra settings (in some games), and less than 3 months later there were 4 games whose VRAM demands made the card inadequate for 4K high settings. Now there are almost a dozen. Long before I get my 450 bucks' worth out of this card, it won't even be a good card for high settings at 1080p, let alone at the native 4K of the TV I've now owned for 5 years. You bet I'm pissed off and wish I had gotten a Radeon 6800 XT; they were both roughly the same price (used on eBay) when I got my used 3070.
      EDIT: Decent-quality 4K TVs can now be had for less than many 1080p-1440p video cards and much less than actual 4K-capable cards (some Samsung models cost 330-450 bucks). When your 4K TV costs 2-3x less than the hardware to run it, that is batshit and just wrong. For that matter, good-enough 4K monitors aren't much more expensive than the TVs either.

    • @valorantpro-zi4yd
      @valorantpro-zi4yd 1 year ago

      @@DenverStarkey amd is BULLSHIT'S SHIT

    • @frozby5973
      @frozby5973 11 months ago

      What do you play that you need lower settings? I play everything on max settings at 1440p on a 3070, apart from RT in some games where it doesn't make any difference, like Elden Ring, and get 70+ fps everywhere.

    • @DenverStarkey
      @DenverStarkey 11 months ago +1

      @@frozby5973 Yeah, I play everything maxed out as well, usually at 4K on a 3070; there are just a few games I can't do this in, and a few others where I run DLSS to get 60 fps at upscaled 4K. Don't know what the OP is on about... "low settings" LOL

    • @frozby5973
      @frozby5973 11 months ago

      @@DenverStarkey The guy is probably CPU-bottlenecked and blames it on GPU VRAM, lol.

  • @XieRH1988
    @XieRH1988 1 year ago +16

    None of the Crysis games ever came close to needing 8GB of VRAM. Honestly, I think game devs these days just don't know how to make a AAA game look good without brute-forcing it with insanely high-res textures or other unoptimized visual assets. Conveniently, it works in favour of the GPU cartel, who are more than happy to sell you all that VRAM you suddenly need.

    • @HuntaKiller91
      @HuntaKiller91 1 year ago

      True. Even Crysis 3 looks better than some games and performs well, even on my Deck.
      Ryse is another good-looking game.

    • @TBKJoshua
      @TBKJoshua 1 year ago +7

      True, but in 2023, texture squeezing and vertex optimization should be much less of a burden. Devs shouldn't still have to do all of that manual optimization just because Nvidia doesn't want to give GPUs basic modern VRAM specs. I see where you're coming from, though.

    • @XieRH1988
      @XieRH1988 1 year ago +2

      I definitely don't think Nvidia is off the hook. Their business practices are so shady, I'll bet they'd try to sell a 4GB VRAM card in this day and age if they didn't have any competition.

    • @gozutheDJ
      @gozutheDJ 1 year ago

      Crysis 3 is a DECADE OLD.

    • @tumultoustortellini
      @tumultoustortellini 1 year ago

      GTA 5 came out a decade ago and still looks better than some games today, and runs far better than others. Metro Exodus came out in 2021 and it's the same thing.

  • @Razor2048
    @Razor2048 11 months ago

    One thing to also look at is shared memory use. Windows Task Manager will also display the amount of shared memory in use for the card. Some games may allocate 500+ MB less than the available VRAM, yet allocate 1+ GB of system memory. Modern video cards assign priorities to different memory pools, so a game can allocate some shared memory without a large performance hit. The times you get massive frame-time issues, stutters, and other major problems are when bandwidth-intensive parts of the game cannot entirely fit into dedicated VRAM. You can identify these scenarios from PCIe bus usage (apps like GPU-Z list PCIe bus usage on the Sensors tab). Games that do not need to put bandwidth-intensive data in shared RAM will often see PCIe bus usage move from the 10-15% range to the 20-30% range on an x16 4.0 card; saturating the bus fully in one direction reads as a 50% load. When a game actively needs to use system RAM for bandwidth-intensive work, you will notice bus usage jump to the 40-50% range, at which point the game also begins to perform horribly.
    The solution thus far in many games, including Hogwarts Legacy, has been simply not to load additional textures when you run out of dedicated VRAM.
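    As a rough way to watch that dedicated-VRAM pressure over time, here is a small sketch assuming an NVIDIA GPU with nvidia-smi on the PATH and a single GPU; it reports dedicated usage only, so the shared-memory and PCIe figures described above would still come from Task Manager or GPU-Z:

```python
import subprocess
import time

def vram_used_mib():
    """Poll dedicated VRAM usage via nvidia-smi; returns (used, total) in MiB."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = (int(x) for x in out.strip().split(", "))  # assumes one GPU
    return used, total

# Log once per second while a game runs (stop with Ctrl+C). Usage pinned at
# the total is the point where spillover into shared system memory, and the
# PCIe bus traffic described above, becomes likely.
while True:
    used, total = vram_used_mib()
    print(f"{time.strftime('%H:%M:%S')}  {used}/{total} MiB")
    time.sleep(1)
```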

  • @alexguild
    @alexguild 1 year ago +2

    Thank you. Very well argued, with substantiated results. Multiple causes go into all the noise about this.

  • @eliadbu
    @eliadbu 1 year ago +4

    The issue is more about what game developers will want to do in the future. I've heard two interviews already with devs saying that if they want to push graphical fidelity further, say more detailed hair or facial features (eyes, mouth, etc.), they will need more VRAM. There is a point where they can't optimize any further for low-VRAM cards, because it takes time that may not be worth it. Sure, ultra settings are stupid if they offer little improvement for a huge performance cost, but I can't say that for texture quality: if you have the VRAM, the performance penalty is usually low and the gains are impactful. The issue is that Nvidia (and AMD to some degree) standardized 8GB on midrange cards that cost up to $400 or $270. If a $500 console that came out 2.5 years ago has 16GB of VRAM, I'm sure they could match or at least improve the VRAM offerings on their midrange cards. You can blame users for running ultra settings, you can blame devs for not optimizing their games, but the fact that a $230 card from 7 years ago has the same memory capacity as a new $400 GPU today points to an issue with today's cards.

    • @Serpic
      @Serpic 1 year ago

      "Console that came out 2.5 years ago has 16GB of VRAM": not VRAM, just RAM (though technically it is video memory), i.e. RAM shared between the system and the GPU. If games on PC consume about 6-8GB of RAM, I can assume that in consoles this shared 16GB is divided roughly evenly between VRAM and RAM. For example, Hogwarts Legacy (the PC port) uses about 10 GB of RAM, with about 7 GB more of that RAM allocated as cache; that's 17GB of RAM alone "required" by the game, not including VRAM. Now remember how much TOTAL memory is in the console.

    • @eliadbu
      @eliadbu 1 year ago

      @@Serpic Technically it does not matter. The point is there is 16GB of GDDR6 memory in these machines. It is about cost: what you pay and what you get. These cards should have had more memory, all things considered, both technically and economically.

    • @sparkythewildcat97
      @sparkythewildcat97 1 year ago

      I'm fine paying $250 (the current selling price of AMD's only 8GB GPU from the new gen) for a GPU with 8GB. Sure, ideally it would be 10GB, but it's fine. However, 8GB on a $400 GPU (a completely different class of product) is just offensive, especially when you can buy a last-gen GPU that is nearly as strong with 12GB for about $100 less.

    • @eliadbu
      @eliadbu 1 year ago

      @@sparkythewildcat97 TBH I don't really get the point of the RX 7600 at a $270 MSRP; it would make sense at $200-230. You get a bit better performance than a 6650 XT and some other small RDNA3 improvements, and that's it. With the 7600 you get ripped off a bit less, but it's no great improvement anyway. Currently you can get it under $260; I wouldn't be surprised if it's available for $230 or less in several months.

  • @donatedflea
    @donatedflea 1 year ago +8

    The problem is that privileged people these days expect a midrange card to run ultra settings at higher resolutions.

    • @gozutheDJ
      @gozutheDJ 1 year ago +1

      yeah

    • @Keloot
      @Keloot 1 year ago

      And they should.
      If you make something exclusive, fewer people buy it; fewer people buying it means less money made...

    • @gozutheDJ
      @gozutheDJ 11 months ago

      @@Keloot Take me back to when PC gaming was niche and wasn't full of entitled babies who know nothing about PCs.

    • @darshankumar2717
      @darshankumar2717 11 months ago +1

      Midrange cards should be priced like midrange. Imagine paying $400-500 for a card and still having to compromise on current games.

    • @gozutheDJ
      @gozutheDJ 11 months ago

      @@darshankumar2717 Imagine having your ego so fucking wrapped up in an ultra-settings boner that you refuse to adjust anything.

  • @gaijinkuri684
    @gaijinkuri684 8 months ago +1

    This was a great video, exploring the issue from many directions.

  • @TechMage299
    @TechMage299 1 year ago

    I doubt it's my phone, but a lot of the footage had screen tearing, very noticeable in the Spider-Man footage. That said, I'm assuming vsync is off. But could that also be an effect of stuttering from low framerate, if it is? I'm high af, so I can't even tell if I'm wording this right.

    • @TheTerk
      @TheTerk 1 year ago

      Nope, vsync is off for the capture.

  • @Simsationxl
    @Simsationxl 1 year ago +5

    1080p was cool when I was in high school.

    • @zdspider6778
      @zdspider6778 1 year ago +1

      It became "mainstream" around 2009-2010, maybe even 2011.

    • @FilthEffect
      @FilthEffect 1 year ago

      @@zdspider6778 Try 2008.

  • @TheSanien
    @TheSanien 1 year ago +3

    90% of the time it is just bad coding and optimization, most likely because developers have tight deadlines.

  • @EdToml
    @EdToml 8 months ago

    I have a 6600 XT, used on Linux with Proton on an AM5-based system. There is a single (experimental) kernel patch that clears VRAM as it is released. It makes a big difference in some games, for instance Cyberpunk 2077, where it really helps stabilize frame rates (1% lows very much improved). With FSR 2.x enabled in auto mode, at ultra settings, there is no problem maintaining 60+ FPS (1440p, FreeSync, vsync enabled). Think I'll wait before investing in an RX 7800 XT...

  • @JuanBG10
    @JuanBG10 11 months ago +1

    I have a 3060 Ti and I'm playing The Last of Us at 1080p. I limit my fps to 60 and put everything on ultra; it passes the VRAM limit, but I'm playing without any problems; not a straight line on the frame-time graph, but nothing that bothers me. I set some texture settings to high so as not to pass the VRAM limit, and I don't see any difference in frame times with fps locked to 60, so I'll stick with ultra.

  • @jeffphilp7430
    @jeffphilp7430 1 year ago +3

    Great reporting, balanced and well considered. Subscribed. 👏

  • @emanueldumea8217
    @emanueldumea8217 1 year ago +4

    I have an RX 6600 and a Ryzen 5 3600, and I can play many games at high settings at 60 fps. For Red Dead Redemption 2 I use a mix of high and medium settings with textures on ultra to get 60 fps. There are a lot of great games that everyone can enjoy with 8GB of VRAM. Developers are just too lazy nowadays.

    • @valenrn8657
      @valenrn8657 11 months ago

      The RX 6600's 8GB is below the PS5's VRAM allocation for the GPU. The RX 6700 XT is a good approximation of the PS5 and XSX.

  • @gamers-generation
    @gamers-generation 11 months ago +1

    Another nice one, Terk! Great point bringing up console VRAM as well, since it'd be easy to assume that's the metric/standard developers go by to "make sure things run properly"... although FFXVI's recent performance would suggest otherwise. I'm completely speculating at this point, but it does seem, as you put it, that "brute force" is the way devs are going, with higher VRAM on their development systems/kits, and then, well, hey, if it doesn't work day 1, they'll just patch it later, right? You already bought the game anyway... 💀

  • @PorckChop-mk3pw
    @PorckChop-mk3pw 10 months ago

    So can I play at max settings with a 4060 (non-Ti version) at 1080p? I'm building a new PC and my 1050 Ti obviously won't cut it anymore.

    • @resistancerobert
      @resistancerobert 3 months ago

      I would get a 7800 XT; it would be better than the RTX 4060 or the Ti.

  • @tapioorankiaalto2457
    @tapioorankiaalto2457 1 year ago +3

    I just bought a used RTX 3070 with a warranty for €300 because of this panic, so I'm not complaining. They used to cost €450 just a couple of months ago, and this one was €1050 new in 2021... 😅

    • @nadantagaming2276
      @nadantagaming2276 10 months ago

      A steal! I bought a used one a week ago for about $370, and I don't see 8GB as a big issue since I play at 1080p. I also don't need to max everything out.

  • @ForTheOmnissiah
    @ForTheOmnissiah 1 year ago +10

    I'll just be dead honest: as an indie game developer, I'll say the VRAM usage issues are completely and utterly a problem of developers being lazy and not optimizing. Period. Consoles like the PS5 run games just fine with 16GB of shared memory (system and video combined). Meanwhile, on the PC side, we now have minimum requirements of 16GB of VRAM. Why? Starfield will be released on Xbox Series X and Series S, and the Series S has a grand total of 10GB of memory, system AND video *combined*. If the Xbox Series S can run this game in 10GB of total memory, why can't a PC with 8GB of VRAM and 16GB of RAM? The serious issue here is developers not putting much time into PC ports. The minimum requirement for Starfield is 16GB of VRAM, which is more than the Xbox Series S's entire memory. That's a developer problem, not a hardware problem.

    • @Thakkii
      @Thakkii 11 months ago +1

      Hold on, where did you see a 16GB VRAM minimum requirement for Starfield? Are those requirements even out? Because the Steam requirements for the game are a placeholder and don't mention VRAM, at least as far as I remember.

  • @Bill-lt5qf
    @Bill-lt5qf 1 year ago +2

    I'm confused as to why even 8GB is being saturated. Depending on the size of the game, it would suggest a huge amount of the game is just sitting in VRAM: say the game is 64GB, then that is one eighth of the game, or 12.5%. That seems like a lot more than necessary.

  • @rohanchooramun7288
    @rohanchooramun7288 1 year ago +1

    It all depends on the game and resolution, and on whether the game requires more than 8GB of VRAM.
    But a few new games are an issue now, and if we have this issue today, imagine what will happen in 2 to 4 more years.
    Some games appear to mitigate the issue, showing little to no stutter. But not really: the game is actually eating your RAM instead of VRAM, which lowers your overall fps instead of causing a sudden drop, aka a stutter. And some games simply don't load the textures, or have texture pop-in.

  • @djdalgiux
    @djdalgiux 1 year ago +6

    For some reason, people don't understand that 7-year-old cards like the RX 580 can still run The Last of Us at 40+ fps with fairly high-quality graphics... 8GB will always be enough; you just need to tweak the settings a bit.

    • @AAjax
      @AAjax 1 year ago +7

      Not very long ago, some people were saying that 4GB would always be enough for any game.
      TheTerk has it right: if you care about AAA games, you need to compare your GPU to the consoles. 8GB GPUs will "always" be enough, until consoles offer significantly more than 8GB.
      The Xbox Series S is the only thing keeping AAA developers from abandoning 8GB, and developers are complaining hard about having to support the S.

    • @damazywlodarczyk
      @damazywlodarczyk 1 year ago +1

      @@AAjax What you're saying is nonsense. I played Ghostwire: Tokyo (a PS5 exclusive) and The Last of Us remake, also PS5, on a GTX 1060 6GB at 50-60 fps. It's not the Series S; these consoles have 16GB of shared RAM, which isn't much. PC ports are just weak.

    • @djdalgiux
      @djdalgiux 1 year ago +1

      @AAjax It will be at least 5 more years before they cut support for 8GB cards, and new triple-A titles are also more or less CPU-heavy now, so personally I'm really not worried.

    • @curie1420
      @curie1420 1 year ago +5

      @@AAjax The point was that 8GB on a $400 card today isn't good, since we saw 8GB GPUs under $200 seven years ago!!

    • @JackieRipper_AKA_Legs
      @JackieRipper_AKA_Legs 1 year ago +2

      I guess the only argument for 8GB being too low is that you have to care about AAA games. Most people don't, and for good reason: nothing good has come out in the past 5 years.

  • @kaptennemo1043
    @kaptennemo1043 1 year ago +5

    Please don't show that Nvidia CEO; I hate his face. It makes my wallet scream.

    • @Kitten_Stomper
      @Kitten_Stomper 1 year ago

      😂😂

    • @kaptennemo1043
      @kaptennemo1043 1 year ago +3

      @@saltee_crcker2363 Because I care about GPUs, so the worst goes to Jensen.

    • @Micromation
      @Micromation 1 year ago

      @@saltee_crcker2363 Always Bobby Kotick...

    • @zues287
      @zues287 1 year ago

      @@saltee_crcker2363 Zuckerberg is just creepy. Jensen is cringey as hell; his entire speech at Computex was full of cringe. On the upside, he gave us plenty of ammo for memes.

  • @SeifSoudani
    @SeifSoudani 1 year ago +1

    Testing frame rate alone is not enough; you also have to test frame times / frame pacing, another source of lag in VRAM-buffer-constrained setups.

    • @mitu123Copper
      @mitu123Copper 11 months ago

      Yep, these are important as well, especially when they capture stuttering.

  • @wahidpawana424
    @wahidpawana424 1 year ago +2

    The reason GPU makers get away with overcharging for high-VRAM GPUs is that we do not call out game developers for poor optimization of video games. Disproportionately diminishing returns when increasing graphical fidelity from medium to high, and then to ultra, are unacceptable.

    • @saricubra2867
      @saricubra2867 1 year ago

      Meanwhile, memory usage increasing from almost 1GB to 3.5GB in 2007's Crysis is glorious. I finally managed to experience that game properly at 1080p (maxed out on Intel UHD 770).

    • @gozutheDJ
      @gozutheDJ 1 year ago +2

      Wtf are you even talking about?
      Medium settings these days look a million times better than they did 10 years ago. "Disproportionately diminishing returns"? THAT'S WHY THE SETTINGS ARE THERE FOR YOU TO TWEAK YOURSELF.

    • @wahidpawana424
      @wahidpawana424 1 year ago

      @@gozutheDJ Why are you arguing the same point we agreed upon? Please read my comment again. Going from medium to high is a diminishing return in most games; it does not justify the performance and VRAM cost.

    • @gozutheDJ
      @gozutheDJ 1 year ago

      @@wahidpawana424 I wouldn't say medium to high is that diminishing, but if you think so, feel free to set your games to medium. No clue what you are trying to say; the configurable settings are there for a REASON.

    • @wahidpawana424
      @wahidpawana424 1 year ago

      @@gozutheDJ The point is that game developers need to optimize their games better.

  • @vygantaspetrauskas911
    @vygantaspetrauskas911 10 months ago +2

    I believe the 6650 XT's 8GB of VRAM is sufficient. Ultra settings seem excessive to me regardless. I opted for the 6650 XT over the 6950 XT because I'm not a heavy gamer, and I'm hopeful something better might emerge in the future, like the 7800 XT.

    • @lalas3590
      @lalas3590 9 months ago +2

      The 6650 XT has a pretty insane clock speed for its price, TBH. I expect it to stay relevant for as long as you keep a game's settings low enough to stay below the full 8GB of VRAM.

  • @richardfarmer6570
    @richardfarmer6570 1 year ago +37

    I said this from the beginning: just because some recently released, terribly optimized games require over 8GB of VRAM doesn't mean you suddenly need 16GB to ever game again. The improvement these games got from updates shows that. Obviously, proper settings are equally important.

    • @ashkanpower
      @ashkanpower 1 year ago +1

      You are right, but as we progress, game development gets harder and we can expect more unoptimized games.
      TLOU Part 1 looked terrible on high and medium for a couple of months; the experience was frustrating. Only ultra settings looked good, but my 3070 couldn't handle them.

    • @SweatyFeetGirl
      @SweatyFeetGirl 1 year ago +7

      That doesn't mean you won't need more than 8GB in the near future, though... 8GB VRAM cards shouldn't cost more than $200 in 2023.

    • @JABelms
      @JABelms 1 year ago +2

      @@SweatyFeetGirl I played at 4K on my 980 Ti from 2014 to 2020 with just 6GB of VRAM: I played the unoptimized Kingdom Come: Deliverance on high just fine, even Skyrim with 500 or so mods. I own a 7900 XT now, and even I wonder about all the VRAM screaming. Games are supposed to be more efficient now, and VRAM doesn't contribute much to GPU prices; it's cheap... Nvidia is just pushing its DLSS agenda with all this expensive crap.

    • @SweatyFeetGirl
      @SweatyFeetGirl 1 year ago +1

      @@JABelms 8GB of GDDR6 memory costs $25. Nvidia sells the 8GB 4060 Ti for $399 and the 16GB 4060 Ti for $499; they quite literally make 3x profit on that, lmao.

    • @Battleneter
      @Battleneter 1 year ago +1

      Even though longer-term PC gamers, myself included, are very used to tweaking settings, the root of the argument is that you should not "need to" from a VRAM-usage perspective. No, 8GB cards are not acceptable in 2023, especially given the price hikes of recent years and how dirt cheap VRAM now is.

  • @Magnulus76
    @Magnulus76 1 year ago +1

    Very rarely do I use ultra settings. People seem confused about what "ultra" is for: it's about making screenshots, not necessarily what developers expect you to use for a good play experience.
    Unless you are playing above 1080p, I don't think there is an issue with 8GB of VRAM. It's more than sufficient for a well-written game.

    • @mitu123Copper
      @mitu123Copper 1 year ago

      Textures on ultra with the rest of the settings on high is normally how I play.

  • @educational8389
    @educational8389 1 year ago +2

    Very interesting video, thanks! It's amazing how much performance can be had just by dialing the settings down a bit.
    I personally just use whatever settings GeForce Experience suggests, and it's quite good at it. Maybe that's a good no-brains-needed way to land on optimal settings.

    • @StopBlurryGames
      @StopBlurryGames 1 year ago +1

      The problem is he is not a UE5 developer, which means he doesn't know next-gen features like Lumen, which look good and are fast to implement, but at a VRAM cost, and will soon be forced-on settings in games.

  • @paintitblack4199
    @paintitblack4199 1 year ago +5

    The problem is that people in 2023 expect to play at ultra-high settings in 4K on low and midrange cards, when it has never been that way before.

    • @WayStedYou
      @WayStedYou Před rokem

      Yeah, it was. Two years after the PS4 came out you could run PS4 games at double the frames at max settings with a GTX 970 or 1060.

    • @Magnulus76
      @Magnulus76 Před rokem

      @@WayStedYou That was a long time ago. Modern games are a lot more graphically demanding, especially in terms of the number of shaders used.

    • @zdspider6778
      @zdspider6778 Před rokem +1

      In 2017 you could run Titanfall 2 (one of the best-looking games at the time) at 4K medium on a GTX 1070.
      Wtf happened?
      It seems like we're going backwards. Now Nvidia wants you to play at 1080p and use their shitty upscaler ("DLSS"). I blame ray tracing. 3 generations later and people still turn that shit off.

  • @kaiservulcan
    @kaiservulcan Před rokem +8

    Hi, and thanks for this piece of information. So in the end, if games are well optimized, I can play them with my $500 4060 Ti at 1080p ultra; otherwise, I have to lower the settings presets. High runs better than ultra and medium better than high. BTW, I was already aware of these things, and game optimization has improved over time... Sorry to say it, but I don't understand what the video's purpose is.

    • @Greenalex89
      @Greenalex89 Před rokem +2

      The gist of the vid is: 1.) Peeps were ranting over 8 gigs not being enough. 2.) Games weren't optimized at launch but can be handled with 8 gigs once they are. 3.) Ultra settings are stoopid anyways. Conclusion: Don't panic, tune down settings and wait for a patch (or buy AMD for much cheaper - same performance but much more VRAM).

    • @gozutheDJ
      @gozutheDJ Před rokem

      lol well, pc gamers are generally dumb as fuck these days.

    • @TheTerk
      @TheTerk  Před rokem +1

      Neither of y'all watched the end of the video. 13:53. You shouldn't spend $500 on a mainstream card; the $400 6700 XT should be the de facto GPU for this era of gaming.

    • @Greenalex89
      @Greenalex89 Před rokem

      @@TheTerk Friend, that's my GPU.

  • @doomtomb3
    @doomtomb3 Před 10 měsíci +1

    Good job differentiating the mainstream vs. midrange vs. high-end cards.

  • @parker4447
    @parker4447 Před rokem +2

    I've got an RTX 3060 Ti and I always use optimized settings, even though I can hit 1440p 60fps ultra (I'm on a 1440p monitor) with or without DLSS Quality. I think ultra/max settings are dumb; I'd rather take the extra 30+ fps from optimized settings and play at 90+ fps. In some games, ultra shadows, volumetrics, AO or what have you look almost the same as high or even medium but cost like 20 fps lmao.

  • @Hman9876
    @Hman9876 Před rokem +6

    The VRAM hysteria has caused people to recommend the 12GB 3060 over the 3060 Ti. It's a mantra.

  • @Brian-dd2df
    @Brian-dd2df Před rokem +4

    Glad to see others talking about this. UFD Tech did a similar video showing that yeah, we aren't utilizing all of the available RAM in newer titles. I feel like a lot of this hyperfocus on VRAM started with the launch of the 4060 cards, which do appear to have serious VRAM-related issues. But most, if not all, cards more powerful than the 4060s don't seem to have trouble with VRAM bottlenecking.

  • @mikeDeSales943
    @mikeDeSales943 Před 11 měsíci +2

    The interesting thing is that even if you have enough VRAM, if your GPU isn't powerful enough to make use of it, there isn't much of a difference - look at the 1650 Super vs. the RX 580, with half the RAM. Same with the 6700 XT: in most of the cases where it does pull ahead of the 4060 Ti at 1440p and 4K, the result still isn't really playable, though there is the occasional game where the 6700 XT genuinely beats it.
    Of course, there is a good point to be made about how future games will perform, but if they only benefit slightly from more VRAM - what's an extra 10 fps when you're moving from 30 fps to 40 fps? - then it's not worth the company adding the extra VRAM.

  • @romangregor4552
    @romangregor4552 Před rokem

    Best review I've heard so far for entry-level and midrange, thanks :)

  • @HuntaKiller91
    @HuntaKiller91 Před rokem +6

    I won't pay more than $349 for a 4060 Ti with 16GB🤣🤣
    My $199 6700 XT has done a better job for 6+ months now.

    • @Micromation
      @Micromation Před rokem +3

      199? That's a steal. I thought I got a good deal recently at 250 :O

    • @curie1420
      @curie1420 Před rokem +1

      ​@@Micromation You guys got yours used? I got mine 3 months ago for $360 lol

    • @Micromation
      @Micromation Před rokem

      @@curie1420 Yeah, I grabbed it used. It's for my 2nd PC, for when my main is rendering (Blender). Sapphire Pulse in pristine condition :D

    • @HuntaKiller91
      @HuntaKiller91 Před rokem

      @@curie1420 Just did a quick check in my region: the ASRock Phantom Gaming 6800 XT is $329. Crazy deal, but I'd wait for Navi 32 and see how it fares.

    • @Steven_Daniel
      @Steven_Daniel Před rokem

      @@HuntaKiller91 that's an insane price, you wouldn't want to miss out on that after it sells out. My 6800xt is really worth every penny at $375.

  • @Fortzon
    @Fortzon Před rokem +3

    Digital Foundry also recently dunked on HUB's VRAM claims with their TLOU revisited video. Using bad PC ports as a basis for your thesis is pretty bad.

    • @TheTerk
      @TheTerk  Před rokem +3

      Yep, that single setting (texture streaming rate) fixed it for me

  • @jimmyhughes5392
    @jimmyhughes5392 Před rokem +2

    I'm running an 8GB Sapphire Nitro RX 570 and it's still a 1080p beast; most games run at high settings with a 100 fps average.

  • @mythicenchilada8371
    @mythicenchilada8371 Před 10 měsíci +1

    Shouldn't the mid-to-low range be at 1440p by now, and no longer 1080p? It's been at 1080p for way too long.

  • @badass6300
    @badass6300 Před rokem +3

    What you forget is that consoles have unified shared memory - the memory is used by both the CPU and GPU. The maximum RAM allowed for the GPU is 8-10GB; the CPU gets 6-8GB.

    • @belliebeltran4657
      @belliebeltran4657 Před rokem

      That doesn't make any sense.

    • @badass6300
      @badass6300 Před rokem

      @@belliebeltran4657 Yes it does... the consoles have an APU; their GPUs are integrated and share memory with the CPU just like a desktop APU would. Meaning that out of those 16GB of RAM, the CPU uses 6-8GB and the GPU 8-10GB. Sony's developer guidelines also state that devs shouldn't allocate more than 10GB of the memory to the iGPU.
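
      A minimal sketch of the split this thread is describing, using only the figures in the comment above (the exact OS reserve isn't public, so the CPU-side bucket here also covers OS and game logic):

      ```python
      # Unified-memory split described above: one 16GB GDDR6 pool shared
      # between CPU and GPU, with the GPU capped around 10GB per the
      # claimed developer guideline.
      total_unified = 16          # GB shared pool (PS5 / Series X class)
      gpu_share = (8, 10)         # GB the comment says the GPU typically gets
      cpu_share = tuple(total_unified - g for g in reversed(gpu_share))  # remainder

      print(f"GPU: {gpu_share[0]}-{gpu_share[1]} GB, CPU/OS/game logic: {cpu_share[0]}-{cpu_share[1]} GB")
      # -> GPU: 8-10 GB, CPU/OS/game logic: 6-8 GB
      # On PC the GPU share must fit in dedicated VRAM, so a console's "16GB"
      # maps to roughly a 10GB VRAM card, not a 16GB one.
      ```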

  • @dancomando
    @dancomando Před rokem +3

    I'm happy with my 8GB 3070 Ti; it handles every new game really well, paired with my 10th-gen i5. Some games will allocate more than 8GB, but at any given moment they don't need that much to run, and the bus just replaces memory with new data when needed.

  • @TheMichaelparis
    @TheMichaelparis Před 9 měsíci

    Noob question lol - is DDR5 too slow to help out the GPU's VRAM?

    • @car1o
      @car1o Před 3 měsíci +1

      Roughly 5+ times slower, and the GPU can't access it as efficiently as VRAM.
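
      Rough numbers behind that estimate; the specific parts chosen here are illustrative assumptions, and exact bandwidth varies by kit and card:

      ```python
      # Why system DDR5 can't stand in for VRAM: raw bandwidth comparison.
      ddr5_6000_dual = 6000e6 * 8 * 2 / 1e9   # MT/s * 8 bytes * 2 channels -> ~96 GB/s
      gddr6_4060ti = 288.0                    # GB/s, RTX 4060 Ti memory bandwidth
      gddr6x_4070ti = 504.2                   # GB/s, RTX 4070 Ti memory bandwidth

      print(f"DDR5-6000 dual channel: ~{ddr5_6000_dual:.0f} GB/s")
      print(f"4060 Ti VRAM: {gddr6_4060ti / ddr5_6000_dual:.1f}x faster")
      print(f"4070 Ti VRAM: {gddr6x_4070ti / ddr5_6000_dual:.1f}x faster")
      # -> roughly 3-5x depending on the card. On top of that, the GPU only
      # reaches system RAM over PCIe (~16-32 GB/s), which is the real bottleneck.
      ```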

  • @34Berky
    @34Berky Před 11 měsíci

    Wait, why do you suggest the 3700 instead of the 5600 or 5700X?

  • @supremacy98
    @supremacy98 Před rokem +3

    I wholeheartedly feel that the problem is optimisation. Adding more VRAM to GPUs temporarily stalls the problem of running out of VRAM, but it doesn't solve it. There have been games that look amazing without consuming 10-12GB of VRAM. Games should come properly optimised, and it's not impossible either, so that's the best solution, though an unlikely one.

    • @Xenotester
      @Xenotester Před 11 měsíci +1

      It can be solved with good asset streaming, but... the new 8GB VRAM cards also have only x8 PCIe lanes, not 16 - so it's not just the small amount of VRAM, but also a slower link for streaming new data into VRAM.
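
      Approximate link bandwidth for the configurations this comment worries about - theoretical peaks after encoding overhead; real-world throughput is lower:

      ```python
      # Theoretical PCIe bandwidth: why x8 cards on older platforms hurt streaming.
      GBPS_PER_LANE = {3: 0.985, 4: 1.969, 5: 3.938}  # GB/s per lane, one direction

      def link_bandwidth(gen: int, lanes: int) -> float:
          """Peak one-direction PCIe bandwidth in GB/s."""
          return GBPS_PER_LANE[gen] * lanes

      for gen, lanes in [(4, 16), (4, 8), (3, 8)]:
          print(f"PCIe {gen}.0 x{lanes}: ~{link_bandwidth(gen, lanes):.1f} GB/s")
      # -> PCIe 4.0 x16: ~31.5 GB/s
      # -> PCIe 4.0 x8:  ~15.8 GB/s
      # -> PCIe 3.0 x8:  ~7.9 GB/s   <- an x8 card on a PCIe 3.0 platform (e.g. a
      #    Ryzen 5500 board) streams assets at a quarter of the 4.0 x16 rate.
      ```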

    • @redslate
      @redslate Před 11 měsíci +1

      Optimisation is one piece of the puzzle, as it allows games to run more efficiently. The other piece is better hardware (in this instance, more memory), as that allows game development to keep advancing.
      A lot of the features of higher-end cards aren't 'trickling down' as rapidly as they have in the past, leading the mid-tier (core) market to stagnate.

  • @Devilion901
    @Devilion901 Před rokem +3

    That's a pretty bad take considering more and more titles will consume more VRAM. 8GB has been the standard for half a decade now; time to move up.

  • @AmCanTech
    @AmCanTech Před rokem

    Isn't that why we have Resizable BAR?

  • @PackardKotch
    @PackardKotch Před 11 měsíci +1

    Nah. If I already paid almost as much for the GPU alone as a PS5 costs, and I can't match the texture settings the PS5 uses at 4K while I'm running at 1080p, then I sure af ain't paying 1.5x as much to have the same issue on another generation of GPUs.

  • @photonboy999
    @photonboy999 Před rokem +3

    8GB VRAM - not enough now for some current, and definitely some upcoming, games
    12GB VRAM - mostly fine for the next couple of years, but will increasingly require some THOUGHT to manage at times
    16GB VRAM - rarely going to be an issue over the next couple of years
    GAME DEVS have spoken about how INCREASINGLY DIFFICULT it is to get games to work with 8GB of VRAM. Put simply, the NEW CONSOLES have a larger memory pool than the old ones, and newer game engines/techniques are using that VRAM in newer, more memory-intensive games - and NOT just for textures.
    There's almost certainly a lot of OPTIMIZATION yet to be done with these newer techniques. Streaming TEXTURE ASSETS from an SSD (or copying those textures to SYSTEM MEMORY) is one huge, meaningful way to reduce the VRAM footprint, so you can stream them "just-in-time", but we're not there yet. So there are LOTS of reasons to believe VRAM usage will go UP for some games and lots of reasons to believe it will go DOWN for others.
    In the end, the PS5/XBSX will largely drive what direction the PC market goes. I suspect we'll need 16GB+ VRAM cards in the near future, but that the software will get sorted out so that streaming from SSD/SYSTEM MEMORY drastically reduces the upward climb in VRAM - and with diminishing visual returns we may simply see hardware requirements stagnate. I suspect we'll end up with something very SIMILAR to the current PS5 on PCs: a SoC/APU with 64GB of shared (VRAM+SYSTEM) memory that doesn't even require a fast SSD, because games would just COPY their assets to shared memory first (so no need to STORE every frikking game on a fast SSD just to run well) - by then a standard 4GB-per-second SSD can transfer 24GB of game assets in SIX SECONDS.
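
    The closing claim is simple transfer-time arithmetic; a sketch using the comment's own figures (drive speed and asset-set size):

    ```python
    # Transfer-time math from the comment above: staging game assets into a
    # big shared-memory pool instead of demand-streaming from a fast SSD.
    ssd_gbps = 4.0        # GB/s, a mainstream PCIe 4.0 SSD
    asset_set_gb = 24.0   # GB of assets to stage in shared memory

    seconds = asset_set_gb / ssd_gbps
    print(f"{asset_set_gb:.0f} GB at {ssd_gbps:.0f} GB/s -> {seconds:.0f} s")  # -> 6 s
    # A one-time ~6s load would let a PS5-like PC APU with a large unified
    # pool get by without a fast SSD for in-game streaming.
    ```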

    • @damazywlodarczyk
      @damazywlodarczyk Před rokem +1

      It's complete nonsense. You will not need more than 8GB at 1080p until the end of the console's life. The PS5 has 16GB of shared RAM.

    • @JackieRipper_AKA_Legs
      @JackieRipper_AKA_Legs Před rokem +1

      Devs can't even make an original or fun game to begin with now, let's not get into how poorly optimized they actually are.

    • @aqwandrew6330
      @aqwandrew6330 Před rokem

      @@damazywlodarczyk Bullshit - 16GB of shared memory still means a minimum of 12GB purely for VRAM.

    • @kealeradecal6091
      @kealeradecal6091 Před rokem +1

      @@aqwandrew6330 How are you sure? That would leave 4GB for game logic and for the PS5's OS. It's just easier to create and optimize games for console, as everyone has the same hardware.

    • @damazywlodarczyk
      @damazywlodarczyk Před rokem +3

      @@aqwandrew6330 You don't know that. Only devs know that. You think everything else works in 4GB? When PC ports list 16GB of RAM as the minimum? Just think a little. It's 16GB of RAM versus 24GB (16GB system RAM + 8GB VRAM on PC). PC ports are shit because they are badly ported, not because of low VRAM.

  • @MrAckers75
    @MrAckers75 Před rokem +3

    8GB at 1080p was never the problem... the problem is devs pushing out substandard software.

  • @Mayeloski
    @Mayeloski Před rokem +1

    Top-quality video here. I'm currently saving for a new GPU, cus I want to play the new Cyberpunk 2077 DLC at high settings with RT. Dunno if 1440p - I've already got a 1080p 144Hz monitor - and I'm not sure whether to pick up a 4070 or 4070 Ti for that.

    • @Mayeloski
      @Mayeloski Před rokem

      @@tabrizi2000 Dang, thank you for your reply; I totally understand what you're saying. I already have an i3-10100F with an RX 6600 XT, so I badly want to upgrade. I've been playing Cyberpunk at high settings at 50-ish fps - not bad, but I really want a great experience in other games too. I mostly game, but I also use software like Premiere, Blender, Unity and that stuff, and I'm willing to spend up to $1500 on a GPU, CPU, mobo, and RAM update by the end of this year. I've got a nice case with 4 fans and a 700W PSU, and I don't want to upgrade again for 3 or 4 years, maybe moving up to 1440p 144Hz. Do you think an RTX 4070 Ti paired with an i5-13600K is the best option for my budget? Also going for 32GB of DDR5; it has gotten really cheap tbh.

    • @gozutheDJ
      @gozutheDJ Před rokem

      @@Mayeloski dog if you are getting only 50 fps with a 6600 xt you are fuckin up.

    • @johnathanera5863
      @johnathanera5863 Před rokem

      ​​@@gozutheDJ No lol. The 6600 is fucking weak dude.
      Especially trying to run a juggernaut like Cyberpunk.

  • @matthewloughlin970
    @matthewloughlin970 Před 7 měsíci

    So with this info, would I be better off getting a 3060 or an RX 6650 XT?

    • @TheTerk
      @TheTerk  Před 7 měsíci +1

      Neither - the 3060 doesn't have the horsepower to utilize its 12GB, and the 6650 XT doesn't have the VRAM to use its horsepower. Go with, at minimum, a 6700 XT or equivalent GPU.

    • @matthewloughlin970
      @matthewloughlin970 Před 7 měsíci

      The problem here is that, at least in my region, there is a considerable price difference between the RX 6650 XT and the 6700 XT @@TheTerk

  • @RetroVillager
    @RetroVillager Před rokem

    Just play it on a CRT monitor - even at 720p it looks beautiful: the color, contrast, response time. Unfortunately it's so damn hard to find a widescreen CRT.

  • @zeljkoklepac3180
    @zeljkoklepac3180 Před rokem

    8GB of VRAM is good midrange for many things; the problems occur with drivers and software. Also, in-game settings can be tuned so an 8GB GPU runs a game without problems or being maxed out, at least for my needs. The key is in the game settings and driver choice :)

  • @seanc6754
    @seanc6754 Před 11 měsíci +1

    If you don't realize that Nvidia put 8GB of VRAM in almost all the 30-series cards, and even in some of their 40-series cards, on purpose, then I would say you're not seeing the forest for the trees. Pretty convenient to have to buy another video card to run new-age games at 1440p and max settings when more and more people are moving to 1440p monitors, and games are getting bigger and more complex.

  • @iWillMakeYouTea
    @iWillMakeYouTea Před rokem

    How do I play on my RTX 2080 at 1440p?
    High settings + ultra draw distance (and ultra foliage, but only when there is in fact a big difference).
    If that's not enough to hit 60fps at least most of the time, then G-Sync Compatible does its job - or if not, I turn on DLSS Quality.

  • @cyberpunk59
    @cyberpunk59 Před rokem

    With the Vega 64 Nitro+ that I bought for €400 in early 2019, I am still running games perfectly well on my 1440p high-refresh-rate monitor. A handful of games need some heavy tweaking, but those are not what I play most; almost everything else runs at high/ultra and gives respectable performance for the type of game (60-90 in Fortnite at ultra with shadows on medium, for example; a bit higher in Deep Rock Galactic).
    I will replace this card soon so I can give it to my brother. But it's more about making my brother and myself a luxury gift than actually needing to replace my card.

  • @mannyvicev
    @mannyvicev Před 11 měsíci +1

    People take this to an extreme, though, and act like these lower-VRAM cards will be unusable very soon. The reality is it's a very gradual change that can largely be mitigated just by lowering VRAM-hungry settings - but of course that means a compromise in graphics quality.

  • @nukez88
    @nukez88 Před rokem +1

    I'm watching this video with the captions turned on and laughing my ass off, because every time you mention A Plague Tale: Requiem it shows up in the captions as "plate tail", "piketail", "plugtail"...

  • @jelly8594
    @jelly8594 Před 11 měsíci

    Very good video!

  • @exmerion
    @exmerion Před 10 měsíci

    While I agree most people realistically won't be using ultra (or at least shouldn't), the setting is used to benchmark consistently across the entire GPU range to make price comparisons. If you made high the standard, then the 4090/4080/7900 XTX would be CPU-bound in almost every test outside of 4K in a handful of games. Also, mentioning VRAM on the consoles is a bit misleading, I think, since they don't really have to split their memory allocation between CPU and GPU.

  • @BlGDaddyRob
    @BlGDaddyRob Před 9 měsíci

    My 1070 has 8GB lol. Amazing to me that we are having this discussion 6+ years later about new cards that cost what mine did back then (not accounting for inflation) and still have the same VRAM.

  • @slc9800gtx
    @slc9800gtx Před 10 měsíci +1

    Excellent review

    • @TheTerk
      @TheTerk  Před 10 měsíci

      Glad you liked it!

  • @tech4u2022
    @tech4u2022 Před rokem +2

    And on top of that, they (Nvidia) also dumped SLI.
    So the option of running 2x 8GB of VRAM is taken away.
    Seems they planned this all along...

    • @johnbilodeau6296
      @johnbilodeau6296 Před rokem +1

      For gaming (and in general), SLI or NVLink doesn't increase VRAM. The data is mirrored across the GPUs.

    • @tech4u2022
      @tech4u2022 Před rokem

      @@johnbilodeau6296 That's one way you CAN use it.

  • @danlt1497
    @danlt1497 Před rokem +2

    Optimization should be the hysteria

  • @factsoverfeelings1776
    @factsoverfeelings1776 Před 11 měsíci

    The PC game market is fairly unique in that games, more often than not, offer you numerous settings you can change to make the game "playable" on your PC.

    • @factsoverfeelings1776
      @factsoverfeelings1776 Před 11 měsíci +1

      That being said, games today are not that graphically impressive-looking. So what are they using all that RAM for???

    • @floycewhite6991
      @floycewhite6991 Před 11 měsíci

      @@factsoverfeelings1776 Seems like the textures are authored at 4K and have to be downscaled.
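
      For a sense of scale, a sketch of what a single 4K-authored texture costs in memory; the format choices here are illustrative assumptions:

      ```python
      # VRAM footprint of one 4096x4096 texture, with and without block
      # compression. A full mip chain adds roughly 1/3 on top of the base level.
      def texture_mb(size: int, bytes_per_pixel: float, mips: bool = True) -> float:
          base = size * size * bytes_per_pixel
          return base * (4 / 3 if mips else 1) / 2**20

      print(f"4K RGBA8, uncompressed: ~{texture_mb(4096, 4.0):.0f} MB")  # ~85 MB
      print(f"4K BC7-compressed:      ~{texture_mb(4096, 1.0):.0f} MB")  # ~21 MB
      # A few hundred unique 4K materials adds up to gigabytes either way -
      # which is how games can eat VRAM without looking obviously better.
      ```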

  • @UltraMicroBudget
    @UltraMicroBudget Před rokem

    Got a 6750 XT with 12GB of VRAM last month, on sale from Newegg for $330.

  • @chrisclarke8054
    @chrisclarke8054 Před rokem +1

    Great work. Fantastic explanations. I really enjoyed your breakdowns; you've covered in this vid what's taken me months to fully research. Salute, sir. And no, 8GB GPUs are not "fixed" - hell, right now I'm on 4GB and these games still work. Low settings, admittedly. But I'm watching for the coming replacement...

  • @WayStedYou
    @WayStedYou Před rokem +2

    Remember when GPUs with half the VRAM of the consoles used to play at higher settings, at double the frames of the PS4, for two-thirds of the cost?
    Now you have to pay as much as the console for less-than-console settings/frames, with the same half VRAM.

    • @gozutheDJ
      @gozutheDJ Před rokem +2

      may have found the dumbest comment on here

    • @Keloot
      @Keloot Před rokem

      ​@@gozutheDJ Explain, bootlicker. Because my GTX 1050 Ti is better than the PS4.

  • @rzkrdn8650
    @rzkrdn8650 Před rokem

    Whoa, I'm glad someone is digging into this - very thorough video. It's hysteria and a valid concern at the same time. Like, I can still play things, but we're definitely on a treadmill in terms of the standard of gaming under Nvidia's domination.

  • @ErgonomicChair
    @ErgonomicChair Před rokem

    That.... yeah that makes sense. That was actually really obvious, and I never even thought of it even as a game dev lol

  • @Xenotester
    @Xenotester Před 11 měsíci

    It's not "just a 8Gb vram" - it also only x8 PciE lines instead of 16 for streaming textures - what about budget cpu/mb without pcie 4.0 like 5500 ?
    And 8Gb can be tolerated for $150 -200 or maybe evenm 250 price range, but totally not for over $ 300

  • @melexxa
    @melexxa Před 11 měsíci

    What you might be forgetting is that people will buy the 4060 Ti over something like a 6700 XT, for more money as well, purely for the ray tracing performance and DLSS 3 frame gen. However, even at 1080p without RT and DLSS 3, modern games are consuming more than 6.5GB of VRAM, so there really isn't any room left to turn on RT along with DLSS 3 and expect good performance, because both features together genuinely need more than 8GB of VRAM.
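
    To illustrate the squeeze this comment describes, a rough budget sketch; the 6.5GB baseline is the commenter's figure, and the RT/frame-gen overheads are loose assumptions, not measured numbers:

    ```python
    # Illustrative 1080p VRAM budget on an 8GB card.
    baseline_gb = 6.5        # modern game at 1080p, no RT/frame gen (comment's claim)
    rt_overhead_gb = 1.0     # BVH + RT buffers - rough assumption
    framegen_gb = 0.8        # DLSS 3 frame-gen buffers - rough assumption

    total = baseline_gb + rt_overhead_gb + framegen_gb
    print(f"~{total:.1f} GB needed vs 8.0 GB on the card")  # -> ~8.3 GB
    # The very features the 4060 Ti is bought for are the ones that
    # can push it past its own VRAM pool.
    ```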

  • @doc7000
    @doc7000 Před rokem

    I believe you missed the entire point. While the bad stuttering and crashing were abnormal effects of not having enough VRAM, other effects remained: when you look at RAM utilization, what you missed is that those games are spilling GPU data into system RAM, which results in a reduction of performance.

    • @TheTerk
      @TheTerk  Před rokem

      I've mentioned that in a Short I posted on the channel. Long story short, you don't see performance truly tank due to PCIe traffic until about 30% utilization. The failures you see in this video occur far before that.
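
      For anyone who wants to sanity-check this on their own machine, a minimal monitoring sketch using the NVIDIA pynvml bindings (`pip install nvidia-ml-py`); the per-lane bandwidth table is an approximation, and the 30% rule of thumb is the video's claim, not something this script enforces:

      ```python
      # Watch VRAM use and PCIe traffic while a game runs (NVIDIA GPUs only).
      import time
      import pynvml

      pynvml.nvmlInit()
      gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

      gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(gpu)
      width = pynvml.nvmlDeviceGetCurrPcieLinkWidth(gpu)
      peak_kbs = {3: 0.985e6, 4: 1.969e6}.get(gen, 1.969e6) * width  # rough peak, KB/s

      for _ in range(30):  # sample once a second for ~30 seconds
          mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
          rx = pynvml.nvmlDeviceGetPcieThroughput(gpu, pynvml.NVML_PCIE_UTIL_RX_BYTES)
          print(f"VRAM {mem.used / 2**30:.1f}/{mem.total / 2**30:.1f} GiB, "
                f"PCIe RX {rx / 1e6:.2f} GB/s ({100 * rx / peak_kbs:.0f}% of link)")
          time.sleep(1)

      pynvml.nvmlShutdown()
      ```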

    • @doc7000
      @doc7000 Před rokem

      ​@@TheTerk In this video you pointed out VRAM allocation while completely missing that point. You devoted an entire section of the video to it without mentioning it. Yes, you do see a performance reduction when that happens, and it would be even worse if you were low on system RAM. Because of this, the 8GB 4060 Ti not only suffers reduced performance in those situations - in some cases it even gets beaten by the 12GB 3060.

  • @jtnachos16
    @jtnachos16 Před 7 měsíci

    'Developers are designing games to use more VRAM.'
    Meanwhile, the visuals don't match the additional resources being demanded. That is the actual crux of the issue - especially when most of that increased cost comes from obscenely expensive volumetrics or ray tracing that just don't look good enough to justify their demands. The number of times I've blind-tested people with screenshots, or even video, of a game at high vs. ultra settings, or RT on vs. RT off, and had them decide the lower settings are actually the higher ones, or that no RT looks better than RT, is quite disturbing.
    We've seen this problem CONSISTENTLY as time has gone on: the more 'resources' game devs think they have, the less they focus on efficiency. That's why modern games can use 2-3x the resources of older games with a similar level of visual quality and a similar overall world/gameplay structure. God forbid we talk about bloat in storage requirements, often from obscenely bloated textures, or text/audio assets for languages we aren't using.

  • @dankeplace
    @dankeplace Před rokem +1

    I have 2000K fps on my 1660 2GB VRAMs, I downloaded some moar VRAMs and made my GPU pretty stronk.
    Here is the details
    8K
    2000K FPS
    GTA V all settings Ultra
    1TB VRAMs
    CPU I3

  • @Exotic_Pate
    @Exotic_Pate Před 11 měsíci +1

    Me watching this knowing I will never upgrade my old 2080.

  • @fridaycaliforniaa236
    @fridaycaliforniaa236 Před 11 měsíci +1

    Meanwhile, if you pay 70 dollars for a game, you'd better be able to use *all* the graphics quality rather than sit on a lower preset just "because 8 gigs". I find it stupid af that we now have to feel guilty for trying to get the most out of the games we buy. Like it's our fault we're not happy with a card that would have been considered a scam 2 years ago, and we have to pay full price for it... 😏🤦‍♂

  • @danborg1228
    @danborg1228 Před 4 měsíci

    You forgot to list the 5700 XT, still rocking hard today; same goes for the 1070 and up at 1080p and 1440p. So good, in fact, that in my new computer I never worried about the GPU - the 1070 and 5700 XT gave me a 60% boost on the semi-old AM4 platform.

  • @EfendyTube
    @EfendyTube Před rokem

    If you're playing at the same resolution throughout the years, eventually you have to turn the settings down, as newer games demand more resources.