How long should you wait before upgrading your GPU?

  • Published 23. 03. 2023
  • Join us on Patreon: / hardwareunboxed
    Join us on Floatplane: www.floatplane.com/channel/Ha...
    Buy relevant products from Amazon, Newegg and others below:
    Core i5-11600K - geni.us/jlEP
    Core i7-11700K - geni.us/4s6Flp6
    Core i9-11900K - geni.us/pq4sO
    Core i5-10600K - geni.us/Rrku
    Core i5-10600KF - geni.us/IH7l5FP
    Core i7-10700K - geni.us/hNx1
    Core i9-10900K - geni.us/tH9y
    Ryzen 9 5950X - geni.us/NsJEp
    Ryzen 9 5900X - geni.us/h38MFeB
    Ryzen 7 5800X - geni.us/AkRU1R9
    Ryzen 5 5600X - geni.us/Vyz5
    Ryzen 5 3600 - geni.us/w5Q4k6F
    Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
    Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
    FOLLOW US IN THESE PLACES FOR UPDATES
    Twitter - / hardwareunboxed
    Facebook - / hardwareunboxed
    Instagram - / hardwareunb. .
    Music By: / lakeyinspiredg. .

Comments • 867

  • @CuttinInIdaho
    @CuttinInIdaho 1 year ago +227

    It could last forever considering the hefty backlog most of us have of games that will run on what we have already

    • @moccagriselda
      @moccagriselda 1 year ago +10

      Agreed. The GTX 1080 Ti is still usable even these days, especially with the help of FSR.

    • @imqqmi
      @imqqmi 1 year ago +5

      Same here. The 4070 Ti I recently bought buys me a couple of years of future 1440p gaming, plus anything released in the past, which is a LOT of great games! In 6-10 years I'll probably do the same again.

    • @semp224
      @semp224 1 year ago +1

      This is such a 💯

    • @Spike-qh2bb
      @Spike-qh2bb 1 year ago +9

      I must have 100+ games from the last 10 years that I haven't even started playing yet.
      I own about 300 games across PC, console and tablet.
      Zero interest in 4K gaming, so my 3080 Ti should last a long time.
      All you need is 1440p and a massive game collection.

    • @user-gz4ve8mw9l
      @user-gz4ve8mw9l 1 year ago +3

      Even a 1050 Ti still works just fine these days for most older games.

  • @Shaggy77777
    @Shaggy77777 1 year ago +62

    I bought a 970 in early 2015. It survived up until this year, when Halo Infinite, God of War, and games like Cyberpunk 2077 started showing the card's age.

    • @BlazeShorts820
      @BlazeShorts820 1 year ago +5

      I've got a similar card, a GTX 1060 3GB that I bought in 2017. It plays everything I have at 1080p. I'll look at the lower-spec RDNA3 cards from AMD later this year.

    • @freefall_910
      @freefall_910 1 year ago +2

      @@BlazeShorts820 Smart move. I'm looking to buy a 6700 XT, or waiting to see what the 7700 XT brings to the table. If it's expensive, I'll buy a 7600 XT and call it a day, provided that GPU is something like a 6600 XT.

    • @shiroka1371
      @shiroka1371 1 year ago

      That's pretty brave when you consider that Nvidia scammed all 970 users: the last 0.5GB of the GPU's RAM runs slower than the rest of the 3.5GB. I bought a 980 instead and it lasted me more than 5 years. G-Sync works wonders too, keeping fps steady even in fast FPS games.

    • @squirrelsinjacket1804
      @squirrelsinjacket1804 1 year ago

      DLSS and frame gen on the latest 40 series cards are going to help extend their lifetime as well.

    • @dragani2330
      @dragani2330 1 year ago +1

      Same here. Upgraded from a 970 at 1080p to a 2070 at WQHD. It runs all games fine with some adjustments. For Unreal Engine 5 games and RT at 4K, I'll upgrade in 2024 to a used 4080/4090 or a 5070, depending on specs.

  • @chriswaine4101
    @chriswaine4101 1 year ago +126

    I'm still using a GTX 1080 while looking to upgrade this year. Rather than playing at lower settings, I'm effectively extending my GPU's lifetime by shifting my habits around what to play: I've been holding off on games that seem like they'd be a significantly better experience once I upgrade. There are still plenty of good, less demanding games to play instead, so I feel OK doing this; otherwise I would have upgraded already.

    • @E5rael
      @E5rael 1 year ago +4

      Good point! I play both old and new games, so if my GPU's running out of steam for new games, I play more old games with custom content, such as Doom and Quake. Their modding/mapping communities are still going strong, so I'm not running out of stuff to play even without upgrading to a new graphics card.

    • @georgii3041
      @georgii3041 1 year ago +2

      Lucky you, my 1080 is starting to reach the end of its lifespan. Originally upgraded to it from a 1060 in 2019. It gets pretty loud now as it goes over 80 degrees. I play mostly CPU bound games, but a few games like OW really ramp up the fans on my 1080. No complaints on performance though, but it does get louder now.

    • @DETERMINOLOGY
      @DETERMINOLOGY 1 year ago +1

      I was using a GTX 1080 at 40-60 fps and it wasn't enough any more; 1440p was shaky. Upgraded to a 4070 Ti with a 1440p/165Hz monitor and it works great. The 1080 is a roughly 6-year-old GPU, so the upgrade was definitely needed; even at 1080p it can get a bit tricky in some games. DLSS is a game changer as well for games like Call of Duty. Bread and butter.

    • @slob12
      @slob12 1 year ago +4

      Could play WoW at low-to-medium settings on an HD 6870 last year and this year at 50-60 fps 😂... I think it all depends on the age of the games you play... older games may not look as pretty, but the playability is still there 😊

    • @DETERMINOLOGY
      @DETERMINOLOGY 1 year ago +2

      @@slob12 Yeah, if I'm getting 60 fps in certain games I'm fine. If I'm getting over 120 fps in games like CoD I'm solid, as I play on a high refresh rate monitor.

  • @zbirdzell
    @zbirdzell 1 year ago +30

    A GPU typically lasts as long as you want it to. For me, I make it last 5-6 years. I have a 3080 now, but my old rig had two 980 Tis in SLI. I buy the best at the time to make it last, and I save $80 a month to accomplish this goal: $80 * 72 months = $5760 (see the budgeting sketch after this thread). I plan to build a new rig in 2026. Sure, you lose performance over the years, but fps is usually fine for many years if you target 60. With DLSS this goal is even easier to reach.

    • @Tony_Calvert
      @Tony_Calvert 1 year ago +1

      This is a good way to look at it. I may adopt this model.

    • @Kreege
      @Kreege 1 year ago

      idk about that. My GPUs have typically died after around 8-10 years; I've never had one last longer than 10 years (assuming consistent usage). The most recent one to die on me was a GTX 760 that I got in 2013; it died in 2021.

    • @T1tusCr0w
      @T1tusCr0w 1 year ago

      @@Tony_Calvert Yeah, I like this idea too. I've had to drastically compress my saving to get a new rig: been putting back £500 a month for 6 months now & I'm doing £700 for the next 2 months (hardly living lol). Should get my watercooled 4090 rig by June, hopefully.
      But I'm not doing this again lol. It's hellish 🤣

    • @smoggert
      @smoggert 1 year ago +2

      @@Kreege I only switched PCs last month, after running my current PC with a GTX 660 for almost 11 years.
      The card is actually still fine, it just can't really run anything new anymore.
      So the PC will still get some use as a basic non-gaming desktop after this.
      If you take good care of your card's temps over the 10 years, it can last very long.

    • @Lyu-Phy
      @Lyu-Phy 1 year ago

      This is how we do it 💕
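
A minimal sketch of the saving plan described in the thread above, in Python. The $80/month and 72-month figures are the commenter's own; the function names and the $2000 example target are made up for illustration.

```python
import math

def total_saved(monthly: float, months: int) -> float:
    """Total put aside after saving `monthly` for `months` months."""
    return monthly * months

def months_to_afford(target: float, monthly: float) -> int:
    """Whole months of saving needed to cover a target build price."""
    return math.ceil(target / monthly)

print(total_saved(80, 72))         # 5760.0 -- matches the comment's math
print(months_to_afford(2000, 80))  # 25, for a hypothetical $2000 build
```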

  • @scip11
    @scip11 1 year ago +24

    My 1080 Ti aged really well with its 11GB of VRAM for 1440p. The 4070 Ti is twice as fast, but I'll have to upgrade it sooner for sure.

    • @bigdaddyt150
      @bigdaddyt150 1 year ago +6

      That's my problem with this generation's mid-tier cards. They give you more VRAM but then choke it with the memory bandwidth, so you'll still be turning down settings sooner as the card ages. I just don't understand Nvidia.

    • @yourfavoritelawnguy2722
      @yourfavoritelawnguy2722 1 year ago +4

      @@bigdaddyt150 They like money and don't care what we think/want. It's simple.

    • @bigdaddyt150
      @bigdaddyt150 1 year ago +3

      @YourFavoriteLawnGuy You're right; as much as I want to like Nvidia, I can't. What they're doing with their mid-tier and low-tier products, for the prices they're asking, is purely out of touch with reality for those of us who understand the actual specs and how they affect performance long term.

    • @curties
      @curties 1 year ago +2

      @@bigdaddyt150 mining is more or less dead so they need to choke their retail customers harder than ever before lmao

    • @Enigma1336
      @Enigma1336 1 year ago +2

      @@bigdaddyt150 If you start calling them Ngreedia and learn about what planned obsolescence is, then it will all make sense to you. They can no longer make more money by providing better products, so they start to try to push worse products for more money in the pursuit of endless increased profits.

  • @Verpal
    @Verpal 1 year ago +72

    If you just want to stretch the usable time a bit, VRAM size might become the dominant limiting factor, especially for newer NVIDIA GPUs.

    • @aightm8
      @aightm8 1 year ago +25

      Only because they've limited it on purpose as an upselling strategy. It's the Apple approach: here's a base MacBook Air with 8GB of RAM and 256GB of storage for $900. Want 16GB of RAM and 1TB of storage? $1600, please.

    • @rawdez_
      @rawdez_ 1 year ago

      @@aightm8 Ngreedia don't really care about selling GPUs at all anymore; all they care about is selling their laggy GaaS subscriptions. That explains everything they do. The servers they build can be used for AI/GaaS and bring in money constantly, and they sell server time to multiple users simultaneously, so the same silicon is basically leased (not even sold; Ngreedia keeps it) 10, 20, 50 times. GPUs, in contrast, are sold once and that's it: no more profit, and they have to share it with the AIB manufacturers.
      So Ngreedia kills PC gaming by not making faster GPUs with more VRAM at affordable prices, and sane gamers are stuck playing games on morally obsolete engines or morally obsolete low-res screens, because affordable GPUs are way too slow.
      1440p is currently the norm in PC hardware, and Ngreedia GPUs are way too slow for 1440p gaming, even the 4090. Ngreedia is killing PC gaming on purpose to sell GaaS, which doesn't make sense to PC gamers due to latency.
      And AMD is happy to support GPU overpricing to sell more console chips to Sony/MS (and continue selling overpriced GPUs, of course). The only thing that looks weird is that AMD apparently doesn't understand that without affordable, much faster GPUs, nobody really needs new CPUs; they shot themselves in the balls by supporting Ngreedia's overpricing in the GPU space, and now their CPU sales are way down.
      But consoles are just THAT profitable for AMD, so they just don't care.
      Until there are much better-value GPUs, people should just play great old games on USED GPUs and not give any money to greedy corporations: don't buy consoles made by AMD, and don't buy ANY GPUs from Ngreedia, AMD or Intel (buggy Arc is price/perf matched to Ngreedia's overpriced GPUs).
      Don't pay for GaaS, don't defend overpricing from ANY corporation, and don't defend any hardware, old or new, that's overpriced and starved of VRAM. It IS ALREADY OUT OF DATE.
      This is the way.

    • @aightm8
      @aightm8 1 year ago

      @Blue Well, at least the base M1 MacBook Air is OK with 8.

    • @FrostyBud777
      @FrostyBud777 1 year ago +4

      ESPECIALLY for Unreal Engine 5 games with ultra photo-realistic textures. Onward on Quest 2 easily uses 16GB of VRAM with my settings; even a 4080, for more money, wouldn't handle that. GOODBYE NVIDIA. I'm glad I went AMD this build for both CPU and GPU.

    • @aightm8
      @aightm8 1 year ago

      @Blue Unified CPU and RAM reduces the requirement.

  • @navid617
    @navid617 1 year ago +21

    This is basically the reason I used to target the $350 price range (970, 5700 XT). You got incredible value and could get consistently good performance by upgrading every 3-4 years. Unfortunately, that midrange value is dead now.

    • @navid617
      @navid617 1 year ago +4

      @man with a username 6800xt/6950xt ... or waiting is also an option

    • @jamiekelly6851
      @jamiekelly6851 10 months ago +1

      @@navid617 Doesn't excuse stupid pricing on both brands at the moment.

  • @stevecade857
    @stevecade857 1 year ago +20

    Monitor resolution and refresh rate play a major part in how necessary people feel a GPU upgrade is. There's no point having a 4090 and playing on a 1080p/60Hz monitor. If you're looking at upgrading to a faster GPU, you also have to ask whether it's worth it with the monitor you intend to play on; the two really should be considered together (see the sketch after this thread). If people are also considering VR, they should take account of VR support and their headset's resolution and refresh rate. My recently replaced 1080 Ti was well balanced in my previous setup, offering good frame rates (80+ fps) for many years, but since upgrading to ultrawide 1440p and 144Hz it couldn't quite deliver the smooth frame rates I liked, so it was time for the upgrade.

    • @xblfanatic360
      @xblfanatic360 1 year ago +1

      Yeah, I think if you're satisfied with just 1080p 60-120Hz gaming, then something like a GTX 1080 still has quite a few years left in the tank.

    • @stevecade857
      @stevecade857 1 year ago +1

      @@xblfanatic360 I've always chosen my GPU with regard to my monitor's capabilities. High-spec monitors are reasonably priced now compared to GPUs, though.

    • @HappyBeezerStudios
      @HappyBeezerStudios 1 year ago

      And obviously the games that you play on them. I went from a 660 Ti to a 1060 at 1080p and noticed the difference even at that res back in 2017.
      Since it's easier to just slot in a new GPU and they get replaced more frequently, I tend to go a bit overboard on the CPU side, knowing that I'll use the platform for a while.

    • @stevecade857
      @stevecade857 1 year ago +4

      @@HappyBeezerStudios There's a lot more adaptability in the games, though. Going from ultra settings to high can gain a lot of fps with a loss in fidelity you don't really miss anyway. Dropping to medium isn't that major either, but it does tend to leave you feeling you could gain that extra fidelity if only your gfx card were that bit better. Great games mainly come from the story they tell and how they play, more than from amazing graphics with no depth.
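
As a rough way to put numbers on the monitor/GPU pairing point above: comparing displays by pixels drawn per second gives a first-order feel for how much more GPU a monitor upgrade demands. A minimal sketch; the ultrawide figures match the 3440x1440/144Hz upgrade mentioned in the thread, and real performance does not scale perfectly linearly.

```python
def pixel_rate(width: int, height: int, hz: int) -> int:
    """Pixels per second a display asks the GPU to deliver at full refresh."""
    return width * height * hz

old = pixel_rate(1920, 1080, 60)   # 1080p / 60Hz monitor
new = pixel_rate(3440, 1440, 144)  # ultrawide 1440p / 144Hz, as in the comment

# ~5.7x more pixels per second: a first-order hint at the GPU gap,
# not an exact fps prediction, since scaling is not perfectly linear.
print(f"{new / old:.1f}x")
```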

  • @PLDrums1
    @PLDrums1 1 year ago +52

    I sold my 1080 Ti, which lasted 6 years, for £200 late last year and picked up a 3090 for £700. I think buying used from trusted sellers and sites is the way to go in terms of value. Stop rewarding these greedy GPU companies.

    • @Real_MisterSir
      @Real_MisterSir 1 year ago +9

      I mean, I would, but I also can't justify spending 700 bucks on a used 3090 when even the awfully priced 4070 Ti offers better performance and much better efficiency for pretty much the same money new. The only major difference is the VRAM, of course, but the 3090's prices have always been ridiculous and so are the used prices. If it's used, it's got to be a truly great deal, or the card has to match some very specific spec requirements for how you're going to use it.
      My take is to simply not buy any GPU at all until prices improve overall. And if that means I skip another GPU gen, then so be it.

    • @starvader6604
      @starvader6604 1 year ago +12

      @@Real_MisterSir Just buy used mid-range AMD cards; they're affordable and have more VRAM.

    • @CaptainScorpio24
      @CaptainScorpio24 1 year ago

      yess

    • @diysai
      @diysai 1 year ago +3

      I narrowly missed out on a used 6800xt that sold for $350😢

    • @Anankin12
      @Anankin12 1 year ago

      Yeah nah, used low end costs as much if not more than new

  • @1nown
    @1nown 1 year ago +224

    The fact that my 1070 has 8GB of RAM and entry-tier cards in 2023 only have the same or less gives my wallet a happy face. Before that, I had a 2GB card from the very first Radeon GCN generation. There are too many games to play that don't need a new three-slot card that weighs and costs more than an entire laptop for me to upgrade.

    • @wiziek
      @wiziek 1 year ago +4

      @@MAKIUSO Uhm, the 3060 didn't have 12GB for the customer's benefit, just to jack the price up even more during the crypto boom and shortage. The 3060 Ti had 8GB, and look how Nvidia is introducing a new 3060 with 8GB (or was it 6GB?) of RAM.

    • @teardowndan5364
      @teardowndan5364 1 year ago +5

      @@MAKIUSO A game developer isn't running GPUs at 100% load 24/7. The GPU is idle while writing and debugging code between test runs, same for graphics design where software is waiting for user input and the GPU will be sitting at 15-30W refreshing a static screen most of the time.

    • @retrofizz727
      @retrofizz727 1 year ago +8

      Honestly, the problem with Nvidia isn't the RAM but the price. A 4070 Ti 12GB for 900€ is bad, but for 500€ it would be good. 12GB is enough for 1440p, and will be enough for a few years. Yes, it's going to struggle at 4K in some games, but it isn't advertised as a 4K card, though it's priced like one. It's a matter of price in my opinion, and it's the same for the 4060: 300€ for a 4060 8GB to play at 1080p is good enough in my opinion.

    • @tsjeriAu
      @tsjeriAu 1 year ago +8

      VRAM was a big part of why I went with AMD instead of Nvidia when I was looking for a new card to replace my 1070. Some games were close to maxing out 8GB; the one I had "issues" with was Monster Hunter World, whose hi-res textures used just over 7GB of VRAM.
      I knew it wouldn't be long before games would start to bump up against that limit, and even exceed it, so I wanted more VRAM on my next card, as I typically wait 2-3 generations between upgrades. My previous graphics card was a 680, and the main reason I replaced it was its VRAM limitations.
      When I saw the 3070 only had 8GB of VRAM, I decided against it purely for that reason, especially since I had gotten a 1440p monitor. AMD had 16GB on their 6000 series at the time, which made it a more attractive purchase. Frankly, until Nvidia stops putting artificially low VRAM on their cards, I don't see myself considering them for my next purchase.
      In the end I got a 6900 XT, as my 1070 was dying (originally I was going to wait for the 7000 series). While looking for a card I contemplated getting something like a 6600/6700 until the new cards launched and then selling it off, but I found a good deal on this XFX Merc 319 6900 XT. It cost something like 10% more than the 6700 XT and less than all 6800 XT models in my country, so it wasn't a hard sell. So far it's chugging along quite nicely in all the games I play, with relatively little usage and low temps (it typically sits at ~50C in games with a VERY lax fan curve; even playing Final Fantasy XIV at a locked 60fps it's 50C at some 25-30% fan speed). The only downside to my card is some transistor noise, but I don't think it's bad, though your mileage may vary.

  • @jovanpejic
    @jovanpejic 1 year ago +12

    All hardware stays as it was one day after release (malfunctions aside). There are also graphics drivers and program (game) optimizations, and those even improve over time: Quake 2 on a GeForce 4 runs as well as it did 20 years ago. It's a matter of new software, and no one knows how software will develop... I roughly hope that my RX 6700 XT will serve me for everything that has come out so far, and for the future for at least another 2-3 years.

  • @Zalionn
      @Zalionn 1 year ago +6

    I passed down my 970 to someone else and we were surprised at how well it handled some new games like New World (low-medium settings) for a 9-year-old card! We were expecting it to catch on fire as soon as the game loaded lol.

  • @Bassjunkie_1
    @Bassjunkie_1 1 year ago +92

    I'd hope the people who bought into RTX 3000 get to use their GPUs for a good few years before Nvidia makes them feel obsolete.

    • @JayzBeerz
      @JayzBeerz 1 year ago +9

      Yep, my 3080 Ti is still going strong in 2023.

    • @janbenes3165
      @janbenes3165 1 year ago +52

      Nvidia already made them obsolete because of VRAM. They are still usable, sure, but so is RX 6500 XT if you keep convincing yourself long enough.

    • @shaneeslick
      @shaneeslick 1 year ago +12

      @@janbenes3165 Yeah, I have a 1080 Ti, so while a 3070/3070 Ti may be a core improvement, anything less than a 3080 is a backwards step for me, as I play adventure games at the highest possible details.

    • @hassosigbjoernson5738
      @hassosigbjoernson5738 1 year ago +9

      Limiting VRAM to 8GB on the 3060 Ti, or below 16GB on the 3080, is already a limiting factor for a high-tier product.
      But I also hope that many of these RTX 3000 customers don't buy a 40 series soon, otherwise Nvidia will carry on with overpriced cards with too little VRAM on them. We'll see how much VRAM the 4060 is going to have (keeping in mind that 12GB on the 3080 and 4070 Ti already hits a limit).

    • @youtubevanced4900
      @youtubevanced4900 1 year ago

      @Jan Beneš Meh, the 10GB on my 3080 works great on my ultrawide.
      I do think the 10GB version shouldn't ever have been released, and the 12GB version should have been the $700 3080.
      Nvidia are cunts though.

  • @ChristianSandviknes
    @ChristianSandviknes 1 year ago +25

    It's worth mentioning that the majority of the large jumps in performance needs come from the transitions from 1080p to 1440p to 4K - so unless people are jumping to 8K, I don't think VRAM needs will increase at the same speed as before.

    • @ArthurM1863
      @ArthurM1863 1 year ago +10

      Not really; VRAM requirements have increased outside of resolution increments. Games have become bigger and more complex; add ray tracing and poor optimization to the mix and you get the high VRAM requirements of today's games.

    • @ChristianSandviknes
      @ChristianSandviknes 1 year ago

      @@ArthurM1863 I *did* say the *majority* and *large* jumps 🙂

    • @aapje
      @aapje 1 year ago +1

      Very few people are at 4k right now.

    • @ChristianSandviknes
      @ChristianSandviknes 1 year ago

      @@aapje Very few people also have the highest models of GPU, though

    • @conyo985
      @conyo985 1 year ago +1

      Yes. I currently have a 4K display. No way I'm going to upgrade to an 8K display anytime soon; it's just too much of a muchness.

  • @E5rael
    @E5rael 1 year ago +5

    I wonder if this clip was published right now because you guys had set up a poll asking us at what percentage performance increase we'd be willing to pull the trigger on a GPU upgrade.

  • @malkmusRules
    @malkmusRules 1 year ago +4

    My GTX 970 is 8 years old and still going strong. I use it for a 4K television box/email/some Photoshop. I don't play modern games on it (I got a new 5900X system with a 3070 2.5 years ago), but that said, I got 5 years of gaming out of it with some compromises I was fine with (though often dropping to 1440p for some games before I got the new system). Mostly play racing games in VR these days and will probably wait until the 50 series for the next upgrade (I use Nvidia over AMD for production software reasons).

  • @spawntohell
    @spawntohell 1 year ago +5

    😂 good on Tim around 3:40

  • @cloaca1234
    @cloaca1234 1 year ago

    I'm loving these clips!

  • @TeaTimeMr
    @TeaTimeMr 1 year ago +8

    Indie games, hidden gems of the last few years, and FSR (no DLSS here) make my 1070 Ti shine like a little gem itself.

  • @nauroticdax
    @nauroticdax 1 year ago +3

    It depends largely on 2 factors. The main one is the games you play / are likely to play: if you're always playing the latest games on release, you'll more than likely need a new GPU more frequently than someone who plays the same games regularly (i.e. mostly the big online multiplayer games).
    The other factor is your screen size/refresh rate. Weirdly enough, as my eyes are getting worse I thought "screw it" and got a 48-inch 1080p TV for cheap. Unfortunately, that also means I constantly have to force an oversized resolution (supersampling) or jagged edges are EVERYWHERE, and because of the nature of the issue AA kind of makes it worse - so much so that I can't play some games, since my 1070 can't really go higher than 1080p and the jagged edges are headache-inducing. It is worth noting that in fairly lightweight games like Counter-Strike, LoL and Valorant, I can set it to 1440p, which gets rid of most if not all of the jaggies and actually gives a bit of an advantage in shooters, because the target is physically bigger. So if you do go that route, you'll need a beefier card to deal with the supersampling (see the sketch after this comment), even more so at a high refresh rate, since your acceptable framerate floor is much higher.
    Again though, if you're generally into the most popular online games on a 28-inch or smaller monitor, you're probably fine with anything from a 1070 to a 2070.
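
A rough sketch of why the supersampling trick above needs a beefier card: rendered pixel count grows with the resolution, and fps falls off very roughly in proportion. The 1080p-to-1440p figures match the comment's scenario; the 100 fps baseline is a made-up example, not a measurement.

```python
# Supersampling cost sketch: rendering at 2560x1440 for a 1080p panel
# multiplies the pixel workload; fps falls very roughly in proportion.

native = 1920 * 1080    # the 1080p panel
rendered = 2560 * 1440  # forced oversized (supersampled) resolution

workload = rendered / native
print(f"{workload:.2f}x the pixels")          # ~1.78x

baseline_fps = 100  # hypothetical fps at native 1080p, not a measurement
print(f"~{baseline_fps / workload:.0f} fps")  # ~56 fps, a first-order guess
```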

  • @rebelblade7159
    @rebelblade7159 1 year ago +2

    My GTX 960 4GB that I got in 2015 for like $200 lasted up to late 2021. I mostly played at 1366x768 before I upgraded my entire system in 2021, where I ran 1080p for a while before getting my 3070. That 4GB of VRAM, which many called overkill in 2015, helped my 960 last so long. It's a shame we aren't getting that much value these days.

  • @I_am_ENSanity
    @I_am_ENSanity 1 year ago +5

    Honestly, when I built my newest PC back in 2021 (6800xt/5800x) I gave my brother my old PC (2080/i7 10th gen), and he gave his wife his old PC (1070/i7 7700k) and we all are doing fine. I play games at 1440p so I'm happy, my brother plays at 1080p and can now play at 120hz no problem, and his wife tends to play games like The Sims so 1080p/60fps is perfect for her.

  • @Mainstayjay
    @Mainstayjay 1 year ago

    Gave my brother an XFX RX 580 8GB 3 years ago and he's still using it. FSR keeps it capable enough for 1080p at mixed settings. I have a 2060 12GB (from 2021) that he's getting this year, so hopefully that will let him bump settings up a bit for another year.

  • @CallsignTrike
    @CallsignTrike 1 year ago +5

    Last month I upgraded from a GTX 960 to a 6800 XT. Planning to use it for 5+ years.

  • @dontsupportrats4089
    @dontsupportrats4089 1 year ago +13

    I went from an RX 580 8GB to an RX 6700 XT. Used the 580 for 6 years; it still played all the games I play pretty well.

    • @tohro6969
      @tohro6969 1 year ago +1

      I just got a 580 for my new PC and it's not bad.

    • @mxwans1832
      @mxwans1832 1 year ago

      Same here

  • @corneliusantonius3108
    @corneliusantonius3108 1 year ago +2

    I had my MSI GTX 980 for 7 years, from 2014 to 2021; it came with a free voucher for The Witcher III. I upgraded to an ASUS RTX 3060 Ti OC (non-LHR).

  • @minibleau
    @minibleau 1 year ago +15

    Up until this current generation I'd say 4-6 years, but now... I'd say 6-8. We're reaching the top of the curve in relative performance gains; the 4090 is just an example of throwing power at it until it's better. The average gamer has been priced out of the market.

    • @RicochetForce
      @RicochetForce 1 year ago +3

      Err, what? The 4090 is one of the biggest gen-on-gen improvements we've ever seen in the graphics card market. It's also one of the most power-efficient cards out there (only the 4080 is more efficient).
      That said, you are right that something like the 4090 (which is actually CPU-limited at resolutions below 4K) has a fucking ridiculous lifespan ahead of it. By the time a 4090 needs DLSS to hit 60fps in contemporary games, it would be around the timeframe you mentioned, 7-8 years.

    • @cash_banooca17
      @cash_banooca17 1 year ago +1

      Actually, Ada Lovelace is super fucking efficient and retains native performance with basic undervolts; it's honestly strange. My 4070 Ti dropped to 220W absolute peak (usually 170-180W) and the clock is still slightly higher than reference. It's more NVIDIA pushing power for tiny percentile gains for marketing, rather than "throw power at it until it's better". It's a poor decision, but it's not the fault of the architecture.

    • @RicochetForce
      @RicochetForce 1 year ago +1

      @@cash_banooca17 Exactly. The TDP figures and coolers for the Ada Lovelace series are wildly overbuilt.

  • @Real_MisterSir
    @Real_MisterSir 1 year ago +6

    Had my old 980 Strix for about 5 years, before I swapped to a 2070S which I am still running. I would want to upgrade now, but prices are idiotic and I don't mind waiting out another GPU gen to leverage the performance diff. The 2070S is perfectly capable with 3K UW, no I don't max out raytracing but I am also satisfied with the visual quality overall in most modern AAA games so what's the point. Most AAA publishers have doubled down on live-service games that rarely get major overhauls and thus don't push visual fidelity and gpu demand like they used to 4-5 years ago. And with no major new titles on the horizon that are expected to push gpu performance demands significantly, there isn't a big reason to not wait for Nvidia 5000 series (or a mid-gen 4000 refresh).

  • @PaulDickson7
    @PaulDickson7 1 year ago +16

    I waited 6 years with my GTX 1080 Ti on my 34" 1440p ultrawide. It still plays most games well on low quality, with a bit of lagginess in newer games. I had my heart set on an RTX 4080 but settled for an RTX 4070 Ti to save $600.

    • @Neburion
      @Neburion 1 year ago +2

      I'm in the exact same situation, but I really don't wanna pay like 900-1000€. A 4070 Ti would double the performance, but idk man. I guess I'll wait a few months for prices to maybe drop.

    • @Neburion
      @Neburion 1 year ago

      @@blue-lu3iz That's a good price. But I have a G-Sync monitor, a fairly old one, I think from 2016, a PG348Q; I don't know if it's FreeSync compatible. For testing I once turned G-Sync off because I was running 3DMark, and I forgot to turn it back on afterwards - even at 60-70 fps you notice a stark difference when G-Sync isn't active. It feels like the game constantly has micro-stutters. Ray tracing I honestly couldn't care less about. The RX 7900 XT's raw performance is better there, but it also sucks down 50-70 more watts, and I only have a 650W PSU, so with the 7900 XT I think I'd have to upgrade that too. I'm not an Nvidia fanboy, but I've always been very satisfied. My 1080 Ti has slowly had its day though 😂. I'll keep watching a little longer; I really don't want to spend more than 900.

  • @gohachi5313
    @gohachi5313 1 year ago +7

    I'm a 980 Ti user; I bought it to use with my Oculus CV1. It still runs great, but it's certainly feeling long in the tooth. I'm looking to upgrade this year, but it's crazy to think I bought my 980 Ti for $650 and thought "this is way too expensive". Now that price barely buys you into entry level.

    • @upfront2375
      @upfront2375 1 year ago

      Dude, it's the goddamn inflation too. In 2015, $650 was like $900-1000 today (see the sketch below for the math). Prices are still high, but only by about $150-200 😂 Seems like we have to pay for inflation affecting the mistresses and boats owned by those rich fks too! SMH
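
For the inflation point above, the standard adjustment is price × (CPI now / CPI then). A minimal sketch; the CPI values are approximate US CPI-U annual averages, and the result lands a little under the $900-1000 the reply guesses.

```python
# Inflation adjustment: price_then * (CPI_now / CPI_then).
# CPI values below are approximate US CPI-U annual averages.

CPI_2015 = 237.0
CPI_2023 = 304.7

price_2015 = 650.0  # the 980 Ti purchase price from the comment above
adjusted = price_2015 * (CPI_2023 / CPI_2015)

print(f"~${adjusted:.0f} in 2023 dollars")  # ~$836
```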

  • @JustSomeDinosaurPerson
      @JustSomeDinosaurPerson 1 year ago +4

    If you are buying a 4090, you are probably buying it for the VRAM anyway and/or productivity, as opposed to the expectation it is going to remain a high-end product for almost a decade. And even then, most people who bought the 1080 Ti are still getting use out of that and okay with it being a great option for 1080p - 1440p high settings.

  • @kleinemonnik
    @kleinemonnik 1 year ago +1

    I currently have a Radeon R9 290 that's 9 years old (and since two of the three fans have flown off, it's ready for replacement). Before that, I had a Radeon X1950 XTX for around 7 years. Soon I'm going to build a new rig, and I hope my new graphics card will also last around 7-9 years.

  • @twinklenutz1449
    @twinklenutz1449 1 year ago +7

    I just recently bought a 4090 to replace my 1070 and the only reason I did that was because I went from a 2k monitor to a 2k ultrawide monitor. Otherwise it was still going strong with no real reason to upgrade. I expect this 4090 to last me another 7-9 years.

    • @Adri9570
      @Adri9570 1 year ago

      Yeah, the longer you hold onto a high-end one, the longer you can enjoy its power and reduce its "price per year of use" (see the sketch below). That's also a double-edged sword, because if it lasts too long, maybe in the second half of its life you could have bought an equal or better model for less than half of what you paid for the high-end setup. The good news with the 4090 is that if there's another storm of bad prices in these years (crypto, scalping, Ngreedia, etc.), it shields you longer against the need to pay again in those bad times, thanks to its 24GB of VRAM and overall performance.
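
The "price per year of use" idea in the reply above is just purchase price divided by years of service. A tiny sketch; the prices and lifespans are hypothetical examples, not figures from the thread.

```python
def price_per_year(price: float, years_of_use: float) -> float:
    """Amortize a GPU's purchase price over its years in service."""
    return price / years_of_use

# A pricey flagship kept a long time can cost the same per year as a
# cheaper card replaced more often (all figures hypothetical).
print(price_per_year(1600, 8))  # $1600 flagship over 8 years -> 200.0
print(price_per_year(600, 3))   # $600 midrange over 3 years  -> 200.0
```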

  • @BKJ-88
      @BKJ-88 1 year ago

    Is it better to buy overkill for your resolution frame target so that when games catch up the hit won’t be as bad?

  • @MK-hi4jy
    @MK-hi4jy 1 year ago +1

    I bought a GTX 660 in 2013 and I'm STILL using it. Many people aren't fans of Nvidia, but they've supported Kepler for 12.5 years, and they have GeForce Now to help you run certain demanding games. I even did a 1080p stream with it... with my CPU, that's only possible because of NVENC. I'll upgrade in like 1.5 years or so at the latest, but I certainly got my money's worth out of the card. I'm happy.

  • @tommyg3031
    @tommyg3031 1 year ago +8

    I bought an RX 580 near launch day in 2017. Six years later, it's still going strong.

    • @MrTHORN74
      @MrTHORN74 1 year ago +2

      I'm in the same boat, RX 590 here, but I was looking to upgrade this gen. After seeing prices and performance, I'm currently considering an RX 6950 XT at, say, $600.

  • @fturla___156
    @fturla___156 1 year ago

    I only upgrade based on the game or program I want to use it for, which means if I choose to play games such as StarCraft 1 or 2 for the vast majority of gaming, then the current video card I have will last me more than half a decade before I consider a new card. The primary reason I will upgrade is the utility of work a new video card will give me, which for me will likely be a GPU that can process multiple streams of tasks such as encoding, decoding, and video output to more than one monitor.

  • @RAM_845
    @RAM_845 1 year ago +3

    @HUBClips, I'm still rocking my 1080 Ti, coming up on 6 years old lol. It struggles with 21:9 3440x1440 gaming, though. Regarding upgrading: for 21:9 gaming, which GPU should I upgrade to from my 1080 Ti? One issue is that my current PSU only has 2x 8-pin PCIe connectors. Also, FSR and DLSS are improving.
    I forgot to mention my other specs: CPU - R9 3900X; RAM - G.Skill Ripjaws 2x16GB (32GB) 3600MHz; 2x 2TB NVMe Gen 4 drives.

    • @shaneeslick
      @shaneeslick 1 year ago +4

      G'day RAM_845, I have a 3900X + 32GB DDR4-3200 + 1080 Ti in my editing/transcoding PC, which I sometimes game on when it's not working. I'm like Tim in that I play adventure games, so high textures are what I'm after, and I don't need to upgrade, as my 1080 Ti easily maxes out my 27" 1440p 144Hz Predator monitor with my games. If I were to upgrade, I wouldn't want less than the 1080 Ti's 11GB of VRAM, so the 3070 Ti is out; I'd want a 6800 / 3080 12GB at least.

    • @RAM_845
      @RAM_845 1 year ago +1

      @@shaneeslick I have the same processor as you, mate: DDR4 32GB 3600MHz RAM, 2x 2TB Gen 4 NVMe drives... I play fast-paced games on ultrawide. Do you think a 4070 Ti would do the job? I was considering a 4090 lol

    • @Real_MisterSir
      @Real_MisterSir 1 year ago +1

      If you don't wanna upgrade the PSU, then I would recommend the 4070Ti from a performance/efficiency pov. Depends on your local prices tho, they are overpriced and some vendors do try to scalp them on top - but from a pure spec pov I think that card would offer exactly what you're looking for, and match your existing build specs nicely. You'll get about double the raster performance, and solid Raytracing too if you're into that - and still at an efficiency level where 2x8 pin is all you need (with the 16pin adapter ofc or custom cable).

    • @RAM_845
      @RAM_845 1 year ago

      @@Real_MisterSir Yeah, here in Australia our component pricing is off the charts; the 4090, for example, is nearly AUD$3k+.
      As for my PSU, it's a Corsair TX850M 80+ Gold.

    • @shaneeslick
      @shaneeslick 1 year ago +1

      @@RAM_845 Sorry, I'm not really the person to ask, but my train of thought generally stops at these stations...
      1. How much of a priority is $$$$? You don't want to spend too much & stress your finances if you have an OOPS.
      2. In FPShooters, for big fps, yeah, a 4070 Ti would be a huge jump, so it's a good starting point for an upgrade, as anything at this level or higher will be a very noticeable improvement.
      3. (I don't know FPS games, so this depends on which ones you play.) With that huge core performance jump, would you also want to increase details for better visibility of enemies? If so, depending on how much VRAM you're using now, will 12GB (basically no increase) be enough, or would the 4080's 16GB / 4090's 24GB be a much better choice? The other option is a 7900 XT 20GB / 7900 XTX 24GB. (The XTX cooler problem was only with some reference models, not all of them, & definitely not partner models like PowerColor, Sapphire, XFX & so on, despite the scare some are trying to make of it.)
      While it will lower your fps, maybe put up with some testing using RivaTuner Statistics to see how VRAM use increases as you raise details/textures on the 1080 Ti, to get an idea. At worst you'll learn about what Steve & Tim go through to provide us all their Awesome content 🙃👎.
      4. If you're still not sure after Steve's reviews comparing performance, I'd recommend watching some of the side-by-side gameplay comparison videos that include the games you play.
      It's a big decision, & it doesn't seem there's anything likely to spike prices in the near future; plus, as Tim has said, there's plenty of stock, so don't rush.

  • @jmanultra
    @jmanultra 1 year ago +19

    This is the reason why I'm heavily considering an AMD card over a 4070 Ti. The 6800 XT is already proving to age better than the 3070, and I think the 7900 XT will age better than the 4070 Ti because of the VRAM. By "age" I mean over only a couple of years:
    for an $850-tier card, the 4070 Ti is already running into a couple of games that max out its 12GB of VRAM.

    • @JoeWayne84
      @JoeWayne84 1 year ago +2

      You've got way more than a couple of years before 12GB of VRAM is an issue at 1440p, considering it's way more dedicated VRAM than the consoles have. Neither card is a 4K card, so having the extra VRAM is pointless.

    • @JoeWayne84
      @JoeWayne84 1 year ago

      I thought the 6700 XT was the 3070's competitor, and from everything I've seen the 3070 has aged better than the 6700 XT...

    • @JoeWayne84
      @JoeWayne84 1 year ago +3

      @Blue Yeah, you're in Germany though; that's like the US vassal state where everything good costs more.

    • @JoeWayne84
      @JoeWayne84 1 year ago

      @Blue I don't know, I just watched Hardware Unboxed's updated video where the 3070 was 19% faster than the 6700 XT in a 50-game benchmark (at 4K, mind you; it was like 14% at 1440p).

  • @TigerTT
    @TigerTT 1 year ago

    I have a GTX 1080 inside a huge Thermaltake Core V71 ATX case that I bought for my first build back in 2015, and I'm still using that same case for every upgrade lmao. It's like the perfect house for new components due to its insane airflow: it has 2 front fans, 1 top fan, and I use a strong 3000RPM Noctua exhaust fan.

  • @ifanmorgan8070
    @ifanmorgan8070 1 year ago +8

    Just upgraded to a 3440x1440 display, and I'm pleasantly surprised to discover my 5-year-old 1080 Ti plays everything buttery smooth at max or near-max quality settings.

    • @Mcatz7
      @Mcatz7 1 year ago +3

      My 1070 is still able to push high settings on ultrawide monitors, but I still want to upgrade soon to get new features like DLSS.

    • @jodiepalmer2404
      @jodiepalmer2404 1 year ago

      What PC components do you have to go with the 1080 Ti? I want to upgrade the rest of my PC (4790K and 16GB 1600MHz RAM) to a current CPU and motherboard, plus buy a gaming monitor (no OLED - I use a lot of static objects on screen) instead of using my 12-year-old 60Hz LCD TV.

    • @ifanmorgan8070
      @ifanmorgan8070 1 year ago

      @@jodiepalmer2404 I have an i7-5820 with 16GB 3200. What I also have - and I believe this is the component that really makes a big difference to the speed of the machine's operation and my perception of its "smoothness" and responsiveness - is a Samsung 960 Pro NVMe drive.

    • @jodiepalmer2404
      @jodiepalmer2404 1 year ago

      @@ifanmorgan8070 What's the display you mentioned?

    • @ifanmorgan8070
      @ifanmorgan8070 1 year ago

      @@jodiepalmer2404 The new Samsung 34" OLED G8. It's beautiful for gaming and photo/video editing, but there's a colour-fringing issue with text due to the shape of the pixels; you get used to it after a while, but initially it's quite jarring.
      My previous monitor was an Asus MX299Q, which I definitely recommend as a budget option - a nice wide desktop and gaming field of view, good for productivity apps and great for gaming. It even has a good contrast ratio for a monitor of its age, though the blacks aren't quite as inky, and of course the response time is slow by today's standards. Tbh, the only real quality-of-life differences for me with the OLED G8 vs the MX299 are the enlarged desktop space / gaming FOV, deeper blacks, and a minor sense of improved smoothness and responsiveness, probably due to the 175Hz vs 60Hz increase. It's not that big a deal though. Depending on your financial situation, I might be tempted to wait for the next gen to get rid of the colour-fringing issue; without that, my new G8 really would be the ideal monitor.

  • @bullbagaren1590
    @bullbagaren1590 1 year ago +1

    Got a 1070 in 2016. Finally starting to look into upgrading my rig, but current pricing makes me gag. I'll probably just upgrade my platform and keep the graphics card a while longer.

  • @BlackChicken710
    @BlackChicken710 1 year ago +3

    For me personally, if I can't lock 60 fps anymore regardless of graphics detail, that's when I start to look for a new GPU/platform.

  • @luk2480
    @luk2480 1 year ago

    I'm playing at 3440x1440; before that, I was gaming at standard 1440p.
    Well, I bought a GeForce 970 in 2014 and it lasted me until 2019, when I upgraded to a 2070 SUPER; just now I bought a used 3090 (which premiered in 2020). I'm happy with the purchase and expect this 2-year-old card to last me 3+ years.

  • @silvanbarrow86
    @silvanbarrow86 1 year ago

    Coming up on 4 years for my 1660 Ti this summer. Might grab an upgrade either in summer or on Black Friday/Cyber Monday.

  • @danielpindell1267
    @danielpindell1267 1 year ago

    Bought a 3070 FE on release and enjoyed it for 30 months, playing at 1440p high-to-ultra graphics. I recently purchased an MSI RTX 4090 Suprim Liquid X, and it was a hefty purchase at $1750 ($1860 including taxes); I weighed heavily on whether to purchase it or not because of the cost. But with its massive CUDA core count and 24GB of VRAM, I personally believe it's good for at least 5 years or more before I need another upgrade. The real question is: will the power connector melt before then? 🤔 That's my biggest concern.

  • @photonboy999
    @photonboy999 1 year ago +1

    HD 5870 -> the 1GB VRAM buffer was the main problem.
    GTX 680 -> the 2GB VRAM buffer was the main problem.
    GTX 1080 -> the 8GB VRAM buffer is fine for me... but mostly because DIMINISHING RETURNS means most games I already own run at 1440p/60FPS on HIGH(ish) settings (previously, just cranking up the RESOLUTION alone from, say, 1600x900 to 2560x1440 would make a huge difference). I wouldn't buy a NEW card with less than 16GB of VRAM, but I also feel no urge to pay a LOT for a new card just so I can buy new games and/or enable ray tracing. If I buy a NEW game, I'll get one that runs great on my GTX 1080; meanwhile I've got a back catalogue of great games that are fun, run well and look great!

  • @Swiss4.2
    @Swiss4.2 1 year ago +16

    I'm on a 3090, which I bought used about 6 months ago. I think the 3090 is probably good for another 1-2 years as a high-end product, just because of its VRAM and still-decent RT and 4K performance. The 6900 XT and 6800 XT will probably also be fairly solid cards, even at 4K (no RT), just because AMD had the decency not to choke their cards with 8-10GB like Nvidia did.

    • @kerotomas1
      @kerotomas1 1 year ago +2

      Yeah, I think the last time I bought a brand-new card from a store was back in 2014-15? A GTX 980. Store prices are just ridiculous. You can get a used 3090 for 2/3 the price of a 4070 Ti; it's on par with it at 1080p and 1440p and destroys it at 4K, thanks to the 4070 Ti's crap 192-bit bus. The only selling point of the 40 series is frame generation; no wonder Nvidia made it exclusive to the 40 series, otherwise they'd be pointless...

    • @filthyhanzomain7917
      @filthyhanzomain7917 1 year ago +7

      Wtf do you mean by 1-2 years lmao. It will last at least the next 5 years

    • @Alex-bl8uh
      @Alex-bl8uh 1 year ago +3

      @@filthyhanzomain7917 I was confused as well.

    • @zokuva
      @zokuva 1 year ago +2

      @@filthyhanzomain7917 1-2 years as a high-end product.
      Sure, it can most likely last for 5 years, but at that point it wouldn't be high-end anymore.

  • @Madness801
    @Madness801 1 year ago

    Depends on the games and monitor. I'd say upgrade if you aren't getting enough performance and a BIOS update can't fix the low fps, or if you see weird graphical glitches.

  • @ChasingAthleticism
    @ChasingAthleticism 1 year ago

    My GPU did last 6 years; now the thermal compound needs replacing and the cooler needs more than compressed air to get clean. The GPU is not broken YET, and I keep core temps under 67C; anything above 72C and my VRAM starts producing artifacts during stress testing. Thermal paste can lose function a lot faster in systems that are powered off for extended periods.

  • @hiimdrewdlr
    @hiimdrewdlr 9 days ago

    I think something that also needs mentioning is the kinds of games you play and the resolution you play at. 1080p competitive shooters at 240-360Hz can still easily be handled by a midrange card from 2020-2021, and probably will be for the foreseeable future!

  • @iTechGamingLOL
    @iTechGamingLOL 1 year ago +3

    Honestly, it mostly comes down to whether high-resolution monitors drop in price enough to become the norm over the 1080p monitors we currently have. Besides that, it's up to you what you demand to play. I told myself that the online games I want to play should run at 120-144 fps, and heavy single-player games at a minimum of 60 fps on mixed medium/high settings.
    My 3070 does exactly that, even though I run it with a somewhat outdated CPU (3600) that I plan to replace with a 7800X3D. I doubt the GTX cards can do that, except the 1080 Ti, while the 2000 cards don't hold a candle to the 3000 series, which can be found extremely cheap, especially if you don't mind used.

    • @rawdez_
      @rawdez_ 1 year ago

      A 4090 can't even run all games on a midrange 1440p 144Hz monitor with all the eye candy turned on and without upscaling trickery.
      And yes, 1440p 144Hz+ monitors ARE midrange if not budget already; look at their prices. There are great MODERN 1440p 144Hz+ monitors for 250-300 bucks; now try to find any great 1440p-class GPU that can run MODERN games at 144+ fps for 250-300 bucks. 1080p is obsolete, extreme-budget territory by now.
      1440p is the starting point in 2023, just like 12GB of VRAM should be. 1080p GPUs should literally be VERY budget GPUs priced around 100 bucks, and a 500-buck GPU with 8GB of VRAM is laughable and SHOULD BE RIDICULED as a morally obsolete, dead-on-arrival, overpriced piece of crap.

  • @Hushoo
    @Hushoo Před rokem

    The beauty of PC gaming is that there's usually a way to play games even on dated hardware. I'm still running a 1660 Super with an R5 3600, and my total power draw is less than some GPUs on the market alone! I can play the games I want at 1080p and have yet to run into those fringe cases where I need to set graphics all the way down to low.

  • @petersonfamily6223
    @petersonfamily6223 Před rokem

    Still using three Nvidia Titan X cards in SLI since 2015 at 1440p, 60 Hz, with no issues on high settings. I'll adjust some items like shadows, water, and sky to lower settings, and I'm playing everything. I was tempted to upgrade to a new system with a 4090 GPU and a 13th-gen Intel i9, but I'm still able to play everything I want without issue, so I held off.

  • @OverMixed2000
    @OverMixed2000 Před 7 měsíci

    I have a 1660 Ti I've been thinking of upgrading; any recommendations? All my games, like RDR2 and all that, still run fine. I get 100 fps or more in most of my games.

  • @diegoaccord
    @diegoaccord Před rokem +6

    I just upgraded from a black Strix 4090 to a white Strix 4090. I swear the white is +5% FPS.

  • @raresmacovei8382
    @raresmacovei8382 Před rokem +1

    Have had the 5700 XT for over 3 years now and it hasn't missed a beat. Either 60 fps, or 120 fps with upscaling, in anything.
    Only Forspoken might've given it some trouble based on the demo, but otherwise rock solid.
    I'm skipping the 7900 cards and I'll get an 8900 or 9900 flagship. Probably the 8900. What's 2-4 more years?

  • @korysovec
    @korysovec Před rokem +2

    Funnily enough, last year was when I upgraded from the RX 580, which I've since given to my younger brother, and he's enjoying it quite a bit. For 1080p low, maybe with some FSR, it's still kinda awesome. That was a really good buy IMO.

    • @upfront2375
      @upfront2375 Před rokem

      You either have a high fps expectation (90+) or IDK how an RX 580 can't play higher than low at HD🤔 It should easily manage medium-high on all games, unless you want crazy fps.

    • @korysovec
      @korysovec Před rokem

      @@upfront2375 "Next gen" games are coming out and they're getting kinda taxing on the old 580: A Plague Tale, Spider-Man, The Last of Us. But again, I think it's an absolutely great GPU and one that will, with the help of community drivers, last for many more years of gaming.

    • @upfront2375
      @upfront2375 Před rokem

      @@korysovec it's poor optimization, man.
      I wish I could buy one for my old build in the other house, but it can't be found new, and it's super difficult to find one that wasn't mined on... literally a "workhorse", coming from an Nvidia user since forever!

  • @AshtonCoolman
    @AshtonCoolman Před rokem +27

    I went from a 2080 Super to a 4090. I like my upgrades to be 100+% faster so that I really feel the gains 😂

    • @farhanmuzakki9879
      @farhanmuzakki9879 Před rokem +1

      How much did you get the 4090 for?

    • @yosifvidelov
      @yosifvidelov Před rokem +1

      @@MAKIUSO Well, it's not the only good 40 series card; mostly it's just overkill.

    • @rawdez_
      @rawdez_ Před rokem +2

      a "top tier" overpriced RTX 4090 for 1600+ bucks can't even run new games at 1440p 144fps with all the eye-candy turned on without upscaling or glitchy fake frames. the 4090 isn't really a 4k gaming card, and every slower GPU is morally obsolete crap that can't run anything but patched-through-the-roof, morally obsolete 10-year-old game engines.
      it's crazy to me that stupid people and shills think there shouldn't be any progress in the industry, and that people shouldn't be able to play any game at 1440p 144fps on MIDRANGE (if not budget) 1440p 144Hz MONITORS that cost 250-300 bucks, with a MIDRANGE GPU that costs 100-200 bucks.
      an RTX 4090-level card should cost 300 bucks max, and there should be WAY faster GPUs by now for 4k gaming. top tier cards for 500-600 bucks should be able to run anything at 4k 240fps. the RTX 4090 is a midrange-level card if you judge it by what it can actually do and not by its price, and everything cheaper is totally obsolete crap for obsolete game engines and obsolete 1080p 60Hz monitors. relative performance IS OBJECTIVE. CPU/monitor prices ARE OBJECTIVE. the fact that 1080p is extreme budget now IS OBJECTIVE. 1440p is the NEW NORM at those prices. GPUs are OBJECTIVELY TOO SLOW for their price and for the reality we have in 2023.
      there SHOULD BE MIDRANGE CARDS TO PLAY GAMES ON MIDRANGE MONITORS, and in 2023 1440p 144Hz+ is the new norm. 1080p is already ultra budget, so GPU expectations SHOULD BE adjusted accordingly.
      but instead of making actual progress in GPUs, ngreedia sells GaaS, so they make crappy fake-frame technology and milk the market with overpriced, obsolete GPUs with laughable VRAM. by doing so they cause stagnation in the PC industry: sane devs don't make games for overpriced hardware, so all games are made for affordable hardware, i.e. for morally obsolete game engines that look like crap. zero progress.
      PC gaming is dead. zero progress. some might be fine with all of that, but that doesn't make the games/cards/gaming NOT morally obsolete.
      remember when 70 series cards were actually 70 series? now they're 60 series silicon, named 70 series and priced like 80 series.

    • @yosifvidelov
      @yosifvidelov Před rokem +5

      @@rawdez_ Top Tier trolling! I am impressed by the ultra high end trolling skills. :)

    • @rawdez_
      @rawdez_ Před rokem +3

      @@yosifvidelov a 4090 die costs 300 bucks MAX to make, according to wafer calculators and TSMC prices per wafer. and even that 300-buck number is too high, because on the same wafer you can make smaller dies like a 4080 or a 4070 Ti, which ngreedia gets "for free" if you account for them (wafer calculators don't), so they lower the cost of a wafer significantly.
      $1600-$2000 for a 4090 die that costs less than 300 bucks? just lol. I'd understand if they wanted 800 bucks for it, max, and even that would still be at least 200 bucks overpriced.
      a 4080 die costs 150 bucks MAX, again without accounting for the smaller dies made "for free" on the same wafer. $1200+ for that?
      and people are telling me it's even cheaper than that: "ngreedia is not paying 20,000 per wafer, that was the launch price of the node, and the node is already 2 years old. the 4090 isn't even the most egregiously priced GPU either; the 4080 and below are all going to have much higher yields and much lower cost per chip. I would be surprised if the 4070 die is costing ngreedia more than $50".
      and ngreedia is trying to sell 60-class cut-down silicon as the 4070 Ti for 800 bucks with 12GB VRAM; they even tried to sell it as 80-class silicon for 900 bucks at first. 12GB VRAM on 800-900+ buck cards, Carl! 12GB VRAM... in 2023... on 800-900+ buck GPUs!!!111
      12GB VRAM is fine for a 3060 performance level; a 4070 Ti should have at least 16GB, or 24GB like a 3090. otherwise it's a morally obsolete GPU, especially at 800-900+ bucks.
      the 4070 Ti is basically a morally obsolete piece of crap suitable only for extreme-budget 1080p monitors, or for old games at 1440p.
      and ngreedia wants 800-900+ bucks for that ridiculous GPU with 12GB VRAM. the 4070 Ti sucks at higher resolutions, as it has less memory bandwidth than even a 3070 Ti. it's an extremely crippled card.
      the RTX 4070 Ti's main selling point is milking dumb boiz' wallets with a stupidly overpriced, low-VRAM GPU that's already obsolete because it can't run anything natively without upscaling and glitchy fake frames.
      DLSS 3 is glitchy fake frames that don't improve anything; they just sell morally obsolete hardware that can't run games natively to stupid ngreedia fanboiz.
      ngreedia is killing PC gaming on purpose to sell its crappy, laggy GaaS subscription for 20 bucks/month.
      that's why it releases slow, overpriced, morally obsolete GPUs with low VRAM that make no sense.
      the problem is AMD does exactly the same: killing PC gaming on purpose to sell more morally obsolete APUs to sony/ms for their consoles.
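
      The die-cost arithmetic in that comment can be sanity-checked with a standard dies-per-wafer estimate. Below is a minimal Python sketch; the ~$20,000 wafer price, the ~608 mm² AD102 die size, and the 70% yield are assumptions drawn from public estimates, not confirmed figures.

      ```python
      import math

      def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
          """Classic approximation: gross die count minus wafer-edge losses."""
          radius = wafer_diameter_mm / 2
          return int(math.pi * radius ** 2 / die_area_mm2
                     - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

      # Assumed inputs (public estimates, not confirmed numbers):
      WAFER_COST_USD = 20_000   # rumoured TSMC 5nm-class wafer price
      DIE_AREA_MM2 = 608.5      # AD102 (RTX 4090) die size estimate
      YIELD = 0.7               # guess; defective dies are often sold as cut-down SKUs

      gross = dies_per_wafer(300, DIE_AREA_MM2)          # 300 mm wafer
      good = int(gross * YIELD)
      print(f"gross dies: {gross}, good dies: {good}")   # ~89 gross, ~62 good
      print(f"cost per good die: ${WAFER_COST_USD / good:.0f}")  # ~$323
      ```

      Under those assumptions the silicon alone does land near the ~$300 figure cited above, but note that this covers only the die: memory, board, cooler, validation, R&D and margin all sit on top, so die cost by itself doesn't set the retail price.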

  • @diysai
    @diysai Před rokem +16

    I'm perfectly happy at 40 to 45 fps on 1080p with HUB optimized settings. I'm hoping this will help me keep my RX 6700 for at least 5 more years. I prefer lower power consumption over fps or ray tracing.

  • @jako997
    @jako997 Před rokem

    I've got a 2070 Super that I've been using since late 2019; I guess I'm still good for a while, right? I was thinking of upgrading to a 4070 Ti or the 7900 XT, but these prices are beyond highway robbery.

  • @bigtimebucksfan_6887
    @bigtimebucksfan_6887 Před 8 měsíci

    I've had my 3060 Ti for two years and I'm debating upgrading to the 40 series. I still get around 110 fps at 71 degrees. Is it worth upgrading after 2 years?

  • @tk72231968
    @tk72231968 Před 11 měsíci

    I still have a 1080 Ti w/ 11GB VRAM... it's in a backup system and no longer my primary gaming machine. However, it still works great for 1080p gaming. I have moved to higher resolution and refresh rate gaming... but that doesn't mean I couldn't still play my games on a 1080p monitor. I purchased that 1080 Ti back in 2017, so it's around 6 years old. I purchased an EVGA watercooled RTX 3090 in the 4th quarter of 2021 for retail price, with the expectation that it will also last me at least 6 years.

  • @Alpha___00
    @Alpha___00 Před rokem

    My upgrades went like this:
    Riva TNT 2 to 580 to SLI 680 (notebook) to 1080 to 1080 to 3080 (notebook) to 4090.
    I would advise everyone using a 10xx series card to go for the upgrade; DLSS is a big thing now. But from 30xx to 40xx... the 40xx cards are better, but not by that much, unless you are gunning for a 4K maximum-settings top-end gaming machine.

  • @Lykar
    @Lykar Před rokem

    My 1070 still kicks; I can play any game at medium-high settings in 1080p and I am happy with it. I am only now considering upgrading to a 70-class card, maybe the 7900 XT.

  • @Obie327
    @Obie327 Před rokem

    Maxwell and Pascal were true performance champions and are still very serviceable to this day. Actually, just a few minutes ago I upgraded a Kepler GTX 660 2GB to a GTX 1650 Super. They last as long as you have use for them?

  • @animegeek6118
    @animegeek6118 Před rokem +3

    I have a 5600X and a 6800 XT; I'll end up upgrading my CPU before my GPU. The 6800 XT's drivers have been friendly to me ever since I got it about a year ago. I see zero point in upgrading just to deal with newer driver issues.

    • @Shatterfury1871
      @Shatterfury1871 Před rokem +1

      If you use that 6800 XT at 1440p, the 5600x is more than enough.
      At 1080p the CPU might bottleneck the GPU but you will have well over 100 FPS.😊

    • @animegeek6118
      @animegeek6118 Před rokem +1

      @@Shatterfury1871 I play in 4K on a 55in LG C1, so technically I can justify an upgrade. But I get over 100 fps in everything I play using FSR, and gameplay still feels nice and smooth. So it's still a waste of money. Even if I had a better GPU, I'd still be using FSR or DLSS.

    • @Shatterfury1871
      @Shatterfury1871 Před rokem

      @AnimeGeek611
      You certainly can justify it; at the end of the day you decide what you do with your money.
      But at 4K, the bottleneck is 100% the GPU, not the CPU.
      At 4K, the 5600X can handle even the RTX 3090 Ti.
      So going for a 5800X3D (I guess you are on AM4), you'll be lucky to see a 5 FPS increase at 4K.

  • @Multimeter1
    @Multimeter1 Před rokem +1

    USE GPU RISERS and GPU BRACKETS. They prevent the PCB from bending and the solder from cracking under the strain.

  • @Vivicect0r
    @Vivicect0r Před rokem +7

    I was using an RX 580 from its release up until recently (no money, then no cards, then waiting for a new gen). Those 8 GB of VRAM helped a lot. It's still usable now; most online FPS games work OK, and games like Atomic Heart run quite nicely too. My friend now uses that old PC of mine. Hell, that card still holds its OC levels just like a new one.
    I believe the VRAM amount on mid-tier cards today is pathetic. 8 gigs was overkill for the RX 580 at release but helped a lot in the long run. 8 gigs on the 3070 is just a joke. Darktide uses 9.5 gigs at 1080p already!
    So I have gone with the RX 7900 XTX, because I have no idea when, or if, I'd be able to buy a new card, so I need staying power. RT features are secondary and the first to go if performance drops. Nice textures contribute much more to picture quality, and 24 gigs should be fine for a few generations.

  • @David_Quinn_Photography

    The fact that my 1080 can still do 1440p at a mix of high/medium settings is enough for me. Most times I just launch the game and play it, unless shadows or lighting look a little soft and the hardware still has wiggle room for playable FPS.

  • @ApocalypseGuy1
    @ApocalypseGuy1 Před 10 měsíci +1

    My GTX 1070 has been going strong for about 6-7 years now, at 1080p & 1440p. It still gets 60+ fps in most titles, except unoptimized AAA trash, which I rarely play anyway. I was gonna finally replace it with an RTX 4060 Ti 16GB, but that card turned out to be *censored*, so I guess it's time to start using Nvidia Image Scaling in more demanding titles until a better card comes along. :D

  • @rexxlono
    @rexxlono Před rokem

    Still running a 1080 Hybrid; looking for an upgrade this year or next.

  • @photonboy999
    @photonboy999 Před rokem +1

    *GPU 3x rule*
    My general rule USED to be to look at TechPowerUp charts and pick whatever GPU got roughly 3x the FPS of my current one. I wouldn't buy a card that only got 50% more, because with a little tweaking my current GPU would usually deliver roughly the same look at the same performance. Or if a game NEEDED a new GPU due to more VRAM or whatever, I would just not buy that game. Plus, CPUs were changing a lot years ago, so I figured I'd build a new PC every five or so years and donate the old one. That worked well, but modern GAMING is so complicated now due to ray tracing, cost and other reasons. Still, I wouldn't UPGRADE unless two things are true:
    a) the PERFORMANCE difference on paper is at minimum 2x higher, and
    b) I think it would make a noticeable difference in my gaming experience (if the game you play already runs great at 1440p/60FPS or whatever, do you REALLY need a better graphics card?)
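
    That two-part test is easy to express concretely. Here's a minimal sketch, where the FPS figures stand in for relative-performance numbers you'd read off TechPowerUp-style charts (the values below are placeholders, not measurements):

    ```python
    def should_upgrade(current_fps: float, candidate_fps: float,
                       min_ratio: float = 2.0,
                       happy_already: bool = False) -> bool:
        """Two-part upgrade test: the on-paper gain must be at least
        `min_ratio`x, and the game must not already run well enough."""
        if happy_already:  # condition (b): no noticeable difference expected
            return False
        return candidate_fps / current_fps >= min_ratio  # condition (a)

    # Placeholder chart numbers for illustration:
    print(should_upgrade(60, 95))                      # False: only ~1.6x on paper
    print(should_upgrade(60, 130))                     # True: ~2.2x on paper
    print(should_upgrade(60, 130, happy_already=True)) # False: already runs great
    ```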

  • @TheHighborn
    @TheHighborn Před rokem

    2:00 I don't think that will necessarily be true. When I bought my Vega 64, everything ran at 1080p fully maxed (except for AA & motion blur, of course). Now I'm lowering settings, of course; Darktide especially is full low with FSR. However, Darktide at full low kinda looks like older games fully maxed... so...
    My main gripe is actually VRAM. I can't watch 1080p videos on my second monitor while gaming without something lagging its life out.

  • @rozzbourn3653
    @rozzbourn3653 Před rokem +8

    When I build a system for myself, I select parts that I think will get me through 5-7 years without having to upgrade in order to do what I want to do. It's nice to be able to play at the highest level now, but what I'm after is longevity and stretching my dollar. I think it all comes down to being able to play at the level you want, and there is no set amount of time.

    • @syncmonism
      @syncmonism Před rokem +1

      You would get better value AND, on average, better performance if you upgraded every 2-3 years but only bought cards with optimal performance per dollar. When you consider how much you can recoup by selling the older card each time, the net upgrade cost ends up very low, and you always have very good performance if you never wait longer than 3 years to upgrade from a good midrange card. You certainly can get more life out of a midrange card, but you're going to have to lower your expectations as time goes on. 7 years seems really ridiculous to expect a card to last, though, especially a high-end one: why on earth would you pay a premium for cutting-edge performance if you're planning to keep the card for 5-7 years?! You're not going to have cutting-edge performance for most of that period. That being said, periods of GPU shortage and/or very high prices are a great time to put off upgrades and just make do with whatever you happen to have.

    • @rozzbourn3653
      @rozzbourn3653 Před rokem

      @@syncmonism If you were to buy a 3070 Ti for $600 and then upgrade to a 4070 Ti for $800, it would have been $100 cheaper than getting the 3090 for $1500, but you would have gone 2-3 years without 3090/4070 Ti-level performance. So in that instance, I'm not sure I would have gone the midrange route.
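
      Worth noting that resale value, which the parent comment leans on, changes that math. A minimal sketch using the same prices, with an assumed resale figure for the 3070 Ti (a placeholder, not market data):

      ```python
      # Strategy A: two midrange cards, selling the old one at upgrade time.
      PRICE_3070TI = 600
      PRICE_4070TI = 800
      RESALE_3070TI = 350   # assumed used-market value; placeholder only

      net_midrange = PRICE_3070TI + PRICE_4070TI - RESALE_3070TI  # $1050

      # Strategy B: one high-end card kept for the whole period.
      net_highend = 1500    # 3090 launch-era price from the comment above

      print(f"midrange path net cost: ${net_midrange}")
      print(f"high-end path net cost: ${net_highend}")
      # Trade-off: strategy B delivers 3090-class performance for the full
      # 2-3 years; strategy A is cheaper overall but slower until the second buy.
      ```

      With resale included, the midrange path widens its cost advantage; whether that's worth years on the slower card is exactly the trade the two comments are weighing.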

  • @deimosok2003
    @deimosok2003 Před rokem

    You need more subs; you're uploading very useful Q&A content.

  • @lexterlongares
    @lexterlongares Před rokem

    Monitor resolution and fps are what drive the demand for these newer GPUs. Basically, if you have a 30 series you could play any AAA game forever at 1080p or some 1440p, but if you upgraded your monitor to a higher resolution and refresh rate, future games might be a challenge without considering GPU upgrades.

  • @twiggsherman3641
    @twiggsherman3641 Před rokem

    I was using a 1070 TI at 1440p. With the newest FPS games like BF 2042, I had to lower all the details to low and run it in Nvidia NIS mode, to get decent frame rates. I upgraded to a 6950 XT. I should be able to get the same amount of time out of this, if not more, even if in a few years I have to lower details.

  • @nihren2406
    @nihren2406 Před rokem +1

    I've owned 6 GPUs in the past decade. My first was the GTX 670, which I upgraded two years later to the GTX 980 because I was convinced not to buy the 980 Ti... which I then ended up buying months later anyway, so as of right now the GTX 980 is my most short-lived GPU. And I ran with that 980 Ti for the next 5 years before upgrading to an RTX 2070 Super, which I only did because my computer was starting to die after 7 years and I was forced to build a new one earlier than I had planned. One year later I upgraded to the GPU I always intended to get, the RTX 3090, which I bought used off of someone that didn't want it anymore. And now, on the cusp of building another new rig because I made some mistakes on the last one, I managed to get an RTX 4090 for retail price. The only thing that really convinced me that a 40 Series GPU in general was a worthwhile upgrade from a 3090 is DLSS 3.0. It's locked to 40 Series cards and it does make a noticeable impact on perceived framerates, albeit with mixed results sometimes, but it's obviously actively being worked on. The real question for a graphics junkie like me is whether they'll have anything exclusive to the 5090 next year. Otherwise I'll probably stick with the 4090 this time around.

    • @msg360
      @msg360 Před rokem

      I was literally going to sell my 3090 (I was going to get $800 for it) and put that towards a 4090, only spending $1k out of pocket... keep in mind I got the 3090 for free in 2022.
      I ended up not selling the 3090 and decided to keep it for now, because:
      1. While the 4090 is sometimes double the performance of a 3090, I still can't justify spending $1700 for a card that does not max out every game at native 4K with all the bells and whistles.
      2. In The Witcher 3 with all of the next-gen upgrades, the 4090 averages 49-50 frames, even with patches 6 months later. Same thing with Cyberpunk 2077 with all the settings turned up: you still can't brute-force your way to high frame rates; you have to use upscaling and/or DLSS 3. So what is the point of going all out and spending $1600-1800 on a 4090 when, in several games old and new, even the 4090 struggles to keep 60 or above without the help of AI?
      3. If this is the case, I might as well keep my 3090 and use DLSS 2 in the games that need it. It seems that whether you're on the 30 series or the 40 series, you will still need to use AI to get playable frame rates, and it's only going to get worse once more real Unreal 5 games start to roll out next year. So I'd rather wait, since I got my 3090 for free, and see what the 50 or 60 series will bring in price to performance. DLSS 3 is a selling point/gimmick that Nvidia made to justify the super overpriced 40 series. And here's the silver lining: 30 series owners will not be locked out of frame gen, because AMD is releasing FSR3 soon, which will work with any GPU just like FSR2.
      So there's no need to run out and get a 40 series because you feel like you're being locked out by Nvidia; they just want your money.

  • @jamesgodfrey1322
    @jamesgodfrey1322 Před rokem

    With current GPUs I think VRAM size is going to have a major effect on a card's lifespan, and 8GB is not going to be enough over time. I think 12GB is going to be the new minimum, and 16GB or above is what I'll be looking for in my next GPU.
    I always remember that it takes 2 years on average to create a game, so games effectively target hardware that is 2 years old; by the time a game comes out it was built for midrange GPUs from 2 years earlier, and new GPUs are normally a lot better.
    I've found in the past that the sweet spot is a top-end card from the previous generation, if the price is good (which it currently isn't). A card like that takes around 5+ years before I have to start playing with visual settings, turning heavy options like shadows, water, etc. down, though this heavily depends on the type of game I'm playing and the FPS I'm after; a lot of the time I'm happy with 45 to 60 FPS. As an example, when I played Red Dead Redemption 2 I maxed out the visual settings on my old RX 580 8GB and got around 45+ fps most of the time with the very odd dip (at 1920 x 1080).

  • @josefknechtable
    @josefknechtable Před rokem

    I bought my first GPU in 2003. Since then I've changed GPUs yearly, just adding some money and buying a better used card. I've spent no more than $2500 on all upgrades over 20 years. Now I'm rocking an RTX 3080; we'll see what to buy next year.

  • @ironhelix45
    @ironhelix45 Před rokem +5

    You mention how developers target midrange systems. So by buying a top end system you will be within the range targeted by developers for much longer. I went from a 1080ti to a 7900xtx. I never felt the need to upgrade my gpu until this year as I could always play my games at high - max settings and get 60-120+fps. This year is the first time I had to turn anything to low to enjoy a game. That is what pushed me to upgrade. I also play at 1440p.

    • @nike2706
      @nike2706 Před rokem +1

      This is the only reason anyone should consider upgrading their GPU: if you need to drop settings from mid to low, that's your red flag to start searching for a new toy. You can squeeze more out of your GPU by going from high to mid settings, but I feel games look shitty at low settings. Lower your resolution or upgrade your GPU. That's it.

    • @notyourbusiness4254
      @notyourbusiness4254 Před rokem +1

      Damn I had to endure a gt 730 till 3 weeks ago when I got my XTX 💀

    • @jessefisher1809
      @jessefisher1809 Před rokem +1

      @@nike2706 Really? I feel like resolution should be the last thing you change. Unless we are talking dlss/fsr. Anything non native looks... not great imo.

    • @adiach3nko
      @adiach3nko Před rokem

      @@jessefisher1809 For the sake of example, let's assume we're talking about a system built for 4K gaming. If my video card isn't up to snuff, I'd rather play at 1080p than at low settings, because lower resolution primarily hurts fonts, while low-quality textures and shadows hurt everything.

    • @mikfhan
      @mikfhan Před rokem

      If you're willing to drop settings from ultra to high to medium, the card will definitely last longer before sub-45 FPS happens. If you always want near-ultra settings just for the heck of it, that's another story, of course; then it gets expensive buying new top-end cards just as often as midrange cards. I am quite used to tweaking settings, so I could probably make a high-VRAM top-end card last a long while (half render scale at 1440p/720p might still show 100%-size HUD elements in some games, etc.), but I could also make my current RX 480 last quite long on medium/low settings for much less money invested up front. It just depends on how low you're willing to take the graphics settings and whether the CPU is bottlenecking.

  • @smokeythetear342
    @smokeythetear342 Před 6 měsíci

    I'm going to get a 4090 soon for 1440p ultra settings, targeting the 165 Hz cap of my monitor. Is it unreasonable to expect 5-6 years of hitting that target?

    • @lamadoo
      @lamadoo Před 5 měsíci

      For 1440p? 10+ years, man... particularly with DLSS. Expect 5 years if using it for 4K.

  • @oetlets
    @oetlets Před rokem

    I bought my 1060 6GB in December 2016 for ~250 USD. It's still running multiplayer games at 60+ FPS on mid settings.
    Singleplayer games, like Hogwarts Legacy, run at ~50 FPS on low/mid settings.
    It's time to upgrade to a 6650 XT or 6700 XT for the next ~5 years.

  • @kilroy987
    @kilroy987 Před rokem

    Resolution isn't the only thing increasing. Game developers are adding more polygons, more complex shaders, more situations that require high computation, and larger textures.
    So even 1080p 60fps can be harder to achieve on an older card with some new AAA titles.
    AAA games will continue to require beefier graphics cards.
    But some titles will run fine on older cards, too.
    I play flatscreen at 1080p. I had a 1060 6GB, then upgraded to an RTX 2060 SUPER to prep for PCVR. Then I got a 3080 to make that even better (but the improvement didn't really blow me away).
    I'll likely get a 50 series when they come out, assuming a 5080 will render over twice as fast as a 3080, and I don't need to commission a new power grid to do so.

  • @marshghoul
    @marshghoul Před rokem +1

    "Going from today 4K RT games to 1080p LQ settings in 7 years"
    Here I am still playing old games on my gtx970

  • @brynthomas9566
    @brynthomas9566 Před rokem

    I'm looking to upgrade my GPU now; I have an R9 285 which is finally giving up (although it's still within the minimum specs for Diablo 4!).
    When people like me ask how long a part lasts, I'm not buying a GPU to play at ultra. I'm buying it to physically be able to play the games I want to play. Turning settings down over time is, to me, essentially saving money. And I'm looking for expert advice to build something that lasts as long as possible.
    So I'm looking for recommendations from reviews on what hardware is the best value for money over a long time frame. The mindset of upgrading so often is wasteful to me. It's the same with recommending CPUs based on upgrade potential on the same platform.
    It's fine if that's what you enjoy and you can sell your used parts, but surely there are more people who think like me than not, and it always surprises me how many people on youtube/forums advocate parts with the mindset of upgrading in 2 years.

  • @gaminginstilllife9429

    I just got an RTX 3070 Ti. Will I be OK with 8GB of VRAM, since even the newer cards have the same amount of memory? I don't need to run anything on ultra.

    • @Extreme96PL
      @Extreme96PL Před rokem

      8GB should be enough for 1080p; 8GB is only just starting to appear in recommended specs, and the XSS with its low memory exists and needs to play games too. Optimized settings are also more worthwhile than setting everything to high/ultra, so I think it should be OK. Keep in mind that games often have settings that are resource hogs but offer nothing, like Forza's extreme settings vs high/ultra, Doom Eternal's ultra nightmare vs ultra/nightmare, or the RE games where the higher-VRAM texture options (+2GB or +3GB over high) look the same.

    • @CaptainScorpio24
      @CaptainScorpio24 Před rokem

      Yes, if you don't play at ultra.
      Treat high settings as your ultra,
      and you will be fine for years.
      I have a 3070 8GB;
      talking from experience.

  • @jobro724
    @jobro724 Před rokem +2

    My rule of thumb is pretty much 2 generations, which is around once every 4 years. I bought a GTX 1070 6 years ago and replaced it only last year with an RTX 3070 to keep playing at 1440p on high quality settings and to try ray tracing and DLSS. I'm certainly gonna skip the current generation, but I'll check whether the following one is worth it for that 1440p sweet spot, depending on how demanding games become.

    • @grlmgor
      @grlmgor Před rokem

      6 years for flagship
      4 years for high end (Can extend to 6 years if you compromise)
      2 years for low end (Can extend to 4 years if you compromise)

  • @professionalnoob7900
    @professionalnoob7900 Před rokem

    My 1660 Super has been carrying me from 2020 till this day. It can still easily do the medium preset. Though I am thinking of an upgrade.

  • @PedroLauridsenRibeiro

    I did get my 9900K + 2080 Ti system in 2019 with a 5-year-plus timespan in mind, but back then I already had an ultrawide 75Hz 1080p monitor and wasn't satisfied with the 1440p and 4K options on the market, which meant I was bottlenecked by my monitor even with everything maxed out until the end of last year, when I finally managed to get a 4K 144Hz monitor. By then the 4090 and 4080 had already launched, and now my 2080 Ti (which has 3070-level performance) can only keep 4K over 60 fps in maxed-out current AAA titles such as God of War with DLSS (quality mode) and a mild GPU overclock. I hope I can keep this going for at least another year (thus making the planned 5-year mark), because even a 4080 (which would give me about a 50% performance gain over the 2080 Ti, I reckon) is still too expensive compared to what I paid for my 2080 Ti.

  • @StaelTek
    @StaelTek Před rokem +1

    It’s quite simple:
    As long as it gets the job done for your needs, keep using it for as long as possible.
    It saves money on unnecessary purchases and reduces our overconsumption of electronics.

  • @shauns28
    @shauns28 Před rokem

    I went from a 570 4GB to the 3080; 1440p doesn't seem to have gotten any harder to drive since the card's release. (One of the lucky few that got a pre-order straight away, haha)

  • @annettesurfer
    @annettesurfer Před rokem

    I still have about 2 months of warranty left on the used EVGA RTX 2080 Ti XC Ultra I bought on eBay for a high bid of $796 on 11-21-2020, and on high settings COD Warzone seems to average around 130 FPS, so I see no reason to upgrade yet. Of course, I'm on an older PCIe 3.0 Z490 system (i9-10900K), so that may be why the PCIe 4.0 MSI RTX 3060 Ti Gaming X Trio (12/2/20, $489.99, MicroCenter) had slightly lower FPS here but seemed far better on my son's MSI MAG X570 Tomahawk WIFI (Ryzen 7 5800X). So much so that I thought the Gigabyte RTX 4090 Gaming OC was a waste of $1699.99 (3-9-2023, BestBuy); I may return it.

  • @RichardJNeo
    @RichardJNeo Před rokem

    I mostly play older games 10+ years old. Just ‘upgraded’ to a 1080Ti running old games on max settings. In a couple of years I’ll get a 3090Ti when they’re pretty old and play the games that came out in the last 5 years (now)

  • @andrecoetzee8629
    @andrecoetzee8629 Před rokem

    Where can one go to ask you guys a question, if I may ask?

  • @Mirsab
    @Mirsab Před rokem +1

    Got the RX 580 in summer 2020 when it was already years old, and I'm still using it about 3 years later!
    1080p at 60-75 fps.
    Now I want to upgrade to a 4090, so 4K 120+ preferably.

  • @HappyBeezerStudios
    @HappyBeezerStudios Před rokem

    Right now I'm still on my 1060 6GB; before that I had a 660 Ti, before that an HD 4870, and before that a 7300 GT.
    I'd say it all comes down to what your usual budget is and what you play. And so far I haven't found a single game that I really want to play that won't run on it. (My CPU is a different topic; I see it being the limiting factor much more frequently.) Usually I upgrade when my card can't keep up anymore, which so far has happened around every 3-5 generations, or about every 4-5 years, for a new upper-midrange card.

  • @marcuslind3108
    @marcuslind3108 Před rokem +4

    I have an RTX 3070 and I'm going to stick with it as long as possible. I play at 1440p, but it's doing fine in the games I play.

  • @greyport
    @greyport Před rokem

    I bought a 2080 Ti when it was released and it's still chugging along quite well. I don't plan on upgrading anytime soon since I mostly play online games now; no real AAA titles.