RTX 40 series Price Analysis - How Much Should You Pay?

  • Published 8 Jun 2023
  • GPUs are grossly overpriced this generation, leading many to skip it entirely. How much would the price have to come down before you would even consider buying this generation? What is a reasonable price?
    #gpuprices #gpuprice #gpupricedown
  • Science & Technology

Comments • 349

  • @obi0914
    @obi0914 Před rokem +139

    Never underestimate AMD's ability to completely screw up.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem +16

      That sums it up!

    • @obi0914
      @obi0914 Před rokem +7

      @@ImaMac-PC The last time I was excited for a Radeon was nearly 20 years ago, and only because they genuinely offered a good card... damn, I'm old.

    • @alvarg
      @alvarg Před rokem +21

      AMD literally had the ball handed to them this year with Nvidia's horrible launch prices. AMD could have swept in and just given us a 7900 XTX at like 700 bucks and the 7900 XT for 600, and they would have outsold Nvidia by miles.

    • @Icureditwithmybrain
      @Icureditwithmybrain Před rokem +10

      @@alvarg They snatched defeat from the jaws of victory.

    • @tomtomkowski7653
      @tomtomkowski7653 Před rokem +14

      Duopoly. Nvidia is setting stupid prices and AMD follows.
      Another Massive Disappointment.

  • @Z4d0k
    @Z4d0k Před rokem +31

    It’s easy, the more you buy, the more you save! Thanks leather jacket.

  • @magnusnilsson9792
    @magnusnilsson9792 Před rokem +39

    Just shift the prices one more time and they will become decent:
    4090 $799
    4080 $599
    4070Ti $499
    4070 $399
    4060Ti 16GB $299
    4060Ti 8GB $249
    4060 8GB $199

    • @Krenisphia
      @Krenisphia Před rokem +9

      I would agree except for the last three. They should be:
      4060Ti 16GB $249
      4060Ti 8GB $199
      4060 8GB $149

    • @jasonking1284
      @jasonking1284 Před rokem +1

      Looks about right...

    • @XX-_-XX420
      @XX-_-XX420 Před rokem +7

      Those are actual decent prices

    • @curie1420
      @curie1420 Před rokem +4

      people will say "bUt inFLatIoN"

    • @OneTrueMomo
      @OneTrueMomo Před rokem +1

      You know, the 10 series cards were priced like this.

  • @Sly_404
    @Sly_404 Před rokem +20

    As much as I agree with your recommended prices, there is no way the 40 series will drop to these levels outside of stock clearance after the 50 launch. Nvidia simply has 0 pressure to move units. If people don't pay at the set price levels, they shift production resources to commercial use products that make them the intended profit margins.

    • @AmericanCock
      @AmericanCock Před rokem

      Exactly. They are no longer catering to gamers; they have reached Apple level and moved on to catering to companies and commercial businesses.

    • @nowaywithyoueveragai
      @nowaywithyoueveragai Před 11 měsíci

      Pray for a miracle of AMD, Intel, or a Chinese GPU company coming out with some breakthrough technology...

  • @L0rd_0f_War
    @L0rd_0f_War Před rokem +20

    Love your content, but what's with the 18% increase for the 4060 Ti? That's higher than what Nvidia claims (15%) over the 3060 Ti, and meta-review averages are around 8%.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem +17

      That is a good catch. I swapped the 4060 & 4060 Ti numbers. The 4060 Ti should be 15% while the 4060 is 18%.

    • @zdspider6778
      @zdspider6778 Před rokem

      The 4060Ti is a whopping *_2 PERCENT_* "faster" than the 3060Ti (at 4k, on average, meaning it actually LOSES in some games to the 3060Ti).
      Nvidia gimped it too much with a smaller memory bus, less VRAM, PCI-Express 8x instead of 16x, smaller die size than it should have had, and for much more money considering that this is _4050 in disguise,_ which should be like $119-149. They're trying to sell it for $400 ($500 if you want an extra 8GB VRAM). Good fucking luck with that! What a bunch of clucking clowns! 🤡

  • @conza1989
    @conza1989 Před rokem +10

    I completely missed the small/medium die strategy video; I'll have to go look for that, it sounds brilliant. For this video, I largely agree: well researched, well put, every card is priced one tier too high. I am curious, from your perspective, is it really common for people to upgrade every generation? I've now skipped three generations so far, which feels like a long time. I thought most people keep a card for two generations, skip, then buy. Are people buying more frequently, usually? E.g. buy a GTX 1060 / RX 580, skip the 20/5000 series, buy the 30/6000 series, skip the 40/7000 series. Obviously the last part is thrown off for some by shortages.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem +1

      Thank you. Most PC gamers don't buy every gen. They buy a GPU and then, when it can't play the game they want, jump back into the market and buy another, which, as you say, is every 2-3 generations. Only enthusiasts like to buy the latest and greatest. But even then, the upgrade has to be worth it from a price-to-performance perspective.

  • @zdspider6778
    @zdspider6778 Před rokem +5

    The 4060Ti is a 4050 in disguise, which should cost around $119-149. It has *_less than half_* the die size of the 3060Ti, which means they can cut 2x more chips from the same wafer and double their profits. And they're trying to sell that for $400 ($500 if you want an extra 8GB VRAM). Really, Nvidia? 😠 How about "no"?
    The 4070 is a 4060 in disguise, which should cost between $199-249, at most. Ideally, under $199. This should have been "the people's choice", with a decent performance boost and value, interesting enough and affordable enough that you can upgrade every generation, if you wanted to. Gatekeeping it behind an artificially inflated price is anti-consumeristic bullshit.
    The 4080 is what the 4070 should have been, and the "4070Ti" ("4080 12GB") should not have existed in the first place (in its current form). Launching a "Ti" should come _after,_ not first. Wtf are you doing, Nvidia? The 4060 isn't even out yet and there's a "Ti" version of it? Dafuq?
    Why such small memory buses on all these cards? And why so little VRAM? Gimped cards for ridiculous amounts of money. 16GB VRAM should have been standard LAST gen, and it's not even THIS gen? Anyone buying an 8GB card in 2023 is making a terrible, terrible decision, considering that some games TODAY use more than 8GB at 1080p. Those charts showing off the cache size and texture compression are just blowing smoke up our asses.
    If they don't sort out their shit pricing with the 5000 series, that's gonna be another skip generation.
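
    A back-of-the-envelope sketch of the dies-per-wafer argument above, in Python. The die areas, the 300 mm wafer, and the standard gross-die approximation are illustrative assumptions, not figures from the video, and per-wafer cost differences between nodes are ignored:

        import math

        def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
            # Rough gross dies per wafer, ignoring defects, scribe lines and edge exclusion.
            radius = wafer_diameter_mm / 2
            return int(math.pi * radius ** 2 / die_area_mm2
                       - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

        # Approximate die areas (assumed): ~392 mm^2 for a 3060 Ti-class die,
        # ~188 mm^2 for a 4060 Ti-class die.
        print(gross_dies_per_wafer(392))  # ~146 gross dies per wafer
        print(gross_dies_per_wafer(188))  # ~327 gross dies per wafer, roughly 2.2x as many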

  • @user-lk5kn2tr7k
    @user-lk5kn2tr7k Před 9 měsíci

    Love your graphs man.

  • @rocketman3770
    @rocketman3770 Před rokem

    Those open-box 4070s that you mentioned with the steep price cuts are the Gigabyte models, which had that recent drama about the PCB cracking.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem +1

      Those Gigabyte models use massive quad-slot air coolers, and the weight alone puts massive stress on the PCIe slot. From the JayzTwoCents video, the board seems rather weak. At least the 4070 has a two-slot air cooler that is lightweight by comparison.

  • @4sightfilmsLLC
    @4sightfilmsLLC Před 11 měsíci

    Fantastic breakdown and video!

  • @joh2805
    @joh2805 Před rokem

    I own a G-Sync-only monitor, so together with DLSS, the GeForce feature set is just substantially better than Radeon (or Arc) and I'm kind of stuck with them. Additionally, PC hardware has ridiculous pricing and sometimes poor availability in my region, and there are more green GPUs to choose from.

  • @jerryquesenberry2520
    @jerryquesenberry2520 Před rokem

    Thank you for doing this.

  • @SFO195
    @SFO195 Před 10 měsíci

    Can you do this same video but for AMD cards now that we have a 7900 and an upcoming 7800 XT? They're also doing the naming shenanigans

  • @peterkovacs184
    @peterkovacs184 Před rokem +1

    I would rather use transistor count for each GPU, as performance mostly correlates with that number. Die size is not really a good performance indicator: the smaller the lithography, the more likely defects are in a GPU of the same (iso) size, so prices naturally go up for the same "size", as you need to start more wafers and use more material to get the "same amount" of GPUs.

    • @zdspider6778
      @zdspider6778 Před rokem +2

      Sure, the smaller the node process, the more defects a wafer will have. But only up to a certain point, and only in the beginning. Once the process is mature enough, they make up those losses (by cutting more chips from the same wafers) and it all evens out.
      It's not like they can go back to 16nm or 28nm. That's not how it works. It would ironically cost them MORE to revert to an older process once everything is set up for the smaller one. And the number of transistors doesn't tell the whole story. Performance isn't infinite; they are still limited by the laws of physics for how fast a capacitor can charge and discharge, meaning how fast registers can receive data from memory.
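
      The maturity point here can be illustrated with the classic Poisson yield model, where yield depends on die area and defect density and recovers as the process matures. A minimal sketch in Python; the die area and defect densities below are assumptions for illustration only:

        import math

        def poisson_yield(die_area_cm2, defect_density_per_cm2):
            # Classic Poisson yield model: expected fraction of defect-free dies.
            return math.exp(-die_area_cm2 * defect_density_per_cm2)

        die_area = 1.9  # cm^2, i.e. a ~190 mm^2 die (assumed)
        for d0 in (0.5, 0.2, 0.1):  # defects per cm^2, falling as the node matures (assumed)
            print(f"D0={d0}: yield ~ {poisson_yield(die_area, d0):.0%}")
        # D0=0.5: ~39%, D0=0.2: ~68%, D0=0.1: ~83%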

  • @dreamlocker
    @dreamlocker Před 9 měsíci

    So if I'm looking for an upgrade, what model should I look for? I've got a 1080 Ti (buggy card, works only in debug mode) with an i7 7700K on a Z270 chipset (PCIe 3.0).
    I wanna stay on 2K 144 Hz gaming if that's realistic =)

  • @4hedron852
    @4hedron852 Před rokem +9

    I'm going for a 3060 because AI support on Radeon isn't where I'd want it to be, considering one of my main motivations for upgrading is messing around with ai. Otherwise the 6700 or 6800 would be looking amazing. In fact I think very similarly, maybe even better, about the A770 16GB and the Pro A60. I might even want to try underclocking the 3060 to reduce power draw making it similar to a more expensive A60.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem +2

      The Pro A60 looks interesting; however, it only has half the cores of the A770.

    • @XX-_-XX420
      @XX-_-XX420 Před rokem

      @@ImaMac-PC I'd personally get intel just to spite Nvidia.

  • @pinakmiku4999
    @pinakmiku4999 Před 11 měsíci

    Love this kind of analysis!! Great video! Based on this video I tried to shop for a 4090 at around $1200. I couldn't reach $1200, but managed to get a brand new 4090 FE for $1300 after taxes and offers. This is my first time paying so much for a GPU, but I also managed to get a used 3070 with 2.5 years of warranty for $200 and a new 3060 Ti for $250. Gonna flip those for profit to bring my cost for the 4090 closer to $1200!

    • @ImaMac-PC
      @ImaMac-PC  Před 11 měsíci

      Thank you, and nice work on scoring that 4090!

  • @DC3Refom
    @DC3Refom Před rokem

    I agree with what you said, Mac. Also, to those people who spout "I got an open-box deal": in the majority of Europe, including the UK, we do not have that option.

  • @MARKELL00F
    @MARKELL00F Před rokem +7

    Amazing work. For me, from the beginning I saw Nvidia increasing prices and playing with GPU names, and that's misleading. Let's hope we see good performance and prices in the next RTX 5000 series, or RIP gaming GPUs.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem +4

      Thanks! There is always next gen to look forward to...

    • @bfish9700
      @bfish9700 Před rokem +3

      If the AI bubble bursts before the 5000 series, they'll have to deliver value. Just like Ampere came after Turing (but that value creation was killed by mining), the 5000 series might deliver better value.

  • @mbmk6607
    @mbmk6607 Před rokem +4

    Another great analysis video from you. The 40 series is an easy skip, but you know plenty will fall for Nvidia marketing, unfortunately. It's a shame AMD colluded with Nvidia. It was a good opportunity for AMD to gain market share.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem +3

      Thank you. AMD could have gained market share but they need to make an effort. Sadly, they just don't seem to want to compete with nvidia.

  • @mistermcluvin2425
    @mistermcluvin2425 Před 11 měsíci

    Love your charts bro, some of the best I've seen. Not sure if we'll see your price structure anytime soon, but... I'd love to get a 4090 at 4080 prices.

    • @ImaMac-PC
      @ImaMac-PC  Před 11 měsíci

      Thanks. As with every GPU ever made, the price will come down...but it may not be until the 5090 is announced next year 😆

  • @TheDrMihai
    @TheDrMihai Před rokem +1

    I bought a 4090 because I can totally afford it, but I still bought it right before the scalping stopped, and I got it AT MSRP and not a penny more, because even if it's a HALO top-tier product, in no shape or form is it a good deal =)) Totally agree with you!

  • @cheetahfurry9107
    @cheetahfurry9107 Před rokem +5

    I just grabbed a Gigabyte 4070 to replace my 2060 6GB. I would not have done it, but I got it open-box from Best Buy for $479 this week.

  • @jrddoubleu514
    @jrddoubleu514 Před rokem +4

    I bought a 1080 Ti when it was like 700 quid back in the day. I've since bought one more, plus at least 5 Quadro M2000 cards, so I can run multi-touchscreen arrays on various work-focused setups.
    That second 1080 Ti cost me 270 about a year ago.
    I've got a 1650 in a Razer Blade Stealth,
    and a 2080 Ti Super in a Razer Blade Advanced I picked up a couple of years back.
    I bought my first 1080 for a main gaming rig build to replace my old build, so I could justify the price then, as I was upgrading from a GTS OC card (so long ago I can't remember what it was, but it was basically a lubricated poo pipe, so it did the job for its time).
    TBH, the prices these companies are asking are a bit outlandish.
    A 1080 Ti will do most anything on ultra, so unless you're specifically looking to go large with 4-8K ray tracing, or need those high frames to compete, you can save upwards of a grand just buying a second-hand GTX 1080 Ti.
    I seldom use the RBs with their 2080s.
    I've got an 86-inch 8K TV, so one day I will upgrade, but not till those prices get a lot more reasonable.
    I mean, 11GB of VRAM. If I were getting 24GB of VRAM on a single-chip card, I'd be willing to pay 700 quid, but... not now. Even then, I'd be more inclined to price it at 500 or less, considering you can pick up an M40 or K80 with 24GB of VRAM for less than 200 quid apiece.

    • @4hedron852
      @4hedron852 Před rokem

      Those workstation GPUs, IIRC, don't have video output. However, unless you really need the performance of a 4070 or more, it would be cost-effective to buy a 6800 XT / 3060 Ti and grab a graphics accelerator for VRAM-heavy toying around.

  • @DETERMINOLOGY
    @DETERMINOLOGY Před 11 měsíci

    @ 2:10 Decent chart, but I always look at where I'm coming from, where I'm going to, what the cost difference is between the two, and whether it's worth it.
    For me, I had the GTX 1080, which was $599 at launch, and upgraded to a 4070 Ti, which was $799 at launch: a $200 difference, but the performance gains were well worth it. Upgrading from the GTX 1080 to a 4080 would have been $600 or so to gain 16GB of VRAM overall but only 15-30 fps more, which wasn't worth it, so I'm happy with the 4070 Ti by far.
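
    A minimal sketch of the upgrade math in this comment, in Python. The frame-rate deltas and the hypothetical 4070 Ti to 4080 price step are assumptions used only to illustrate the marginal-cost-per-frame reasoning, not benchmark results:

        def dollars_per_extra_fps(price_delta_usd, fps_gain):
            # Marginal cost of each additional frame per second gained by an upgrade.
            return price_delta_usd / fps_gain

        # Assumed, illustrative numbers:
        print(dollars_per_extra_fps(200, 60))   # GTX 1080 -> 4070 Ti at launch prices: ~$3.3 per extra fps
        print(dollars_per_extra_fps(400, 22))   # hypothetical 4070 Ti -> 4080 step: ~$18 per extra fps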

    • @ImaMac-PC
      @ImaMac-PC  Před 11 měsíci

      That 1080 was from more than 5yrs ago...you got your money's worth out of that one!

    • @DETERMINOLOGY
      @DETERMINOLOGY Před 11 měsíci

      @@ImaMac-PC Yeah, that's how I look at upgrades. For me to upgrade from the 4070 Ti, it would have to be roughly the same kind of performance gain.
      I remember back when I had the GTX 1080 and got the RTX 2080, which wasn't worth it at all, so I kept the GTX 1080. Small performance jumps like that really don't cut it for me.

  • @BPMa14n
    @BPMa14n Před rokem +1

    Reasonable analysis

  • @03chrisv
    @03chrisv Před rokem +7

    I took advantage of the Diablo 4 promotion with Nvidia, so my 4070's effective price was $530. Not amazing, but it was a pill I was willing to swallow. So far the 4070 has exceeded my expectations and runs everything I throw at it. I know it's seen as a 1440p card by youtubers and even Nvidia themselves, but it can run most games at native 4K 60. In fact, the only game I own so far where it can't hit 4K 60 is Portal RTX, which I drop to 1440p and use frame generation on. It looks absolutely gorgeous, and even though it's 1440p it still looks very sharp on my 4K OLED TV, so if in the future I have to drop the resolution to 1440p in other games even with DLSS, I know it'll still look good on my TV. The 4070 is definitely at a different tier compared to my PS5; it's almost like my PC is a turbocharged PS5 Pro by comparison 😆

    • @SIPEROTH
      @SIPEROTH Před 10 měsíci

      The 4070 could have been a major hit if it was just a little lower in price. It's not the card that's bad but the pricing.

  • @fdjw88
    @fdjw88 Před rokem +1

    Radeon can't do stable diffusion right now. And SD is important to me.

  • @jaxonswain3408
    @jaxonswain3408 Před rokem +4

    I hate the 4070 Ti. 12GB of VRAM is the bare minimum for ultra-quality PC gaming; people who are paying $800+ should not be getting the bare minimum. 😑

    • @ImaMac-PC
      @ImaMac-PC  Před rokem +1

      Getting a gimped GPU for 800$+ just feels wrong...

  • @SigmaGrindset-vg4oh
    @SigmaGrindset-vg4oh Před 11 měsíci

    Should I wait for RDNA 4 / the 5000 series, given I have a PS5 and it feels like enough for now?
    Sure, I'd love to max out my LG C2, but I am not happy to be scalped by Nvidia. Obviously, I can afford it even at the scalped price of 1600 USD, but money doesn't grow on trees and I need to be sure it's worth it.

    • @ImaMac-PC
      @ImaMac-PC  Před 11 měsíci

      If you don't need to, wait.

  • @Beehj84
    @Beehj84 Před 9 měsíci

    I agree with your pricing and thesis entirely. I currently have an RTX 3070. The only card I would buy is an aftermarket 7900 XT under 700 with Starfield included (because I'll be buying it, no doubt). Everything else is terribly priced. If the 4080 were appropriately priced, it would be the most attractive card in terms of power, features, efficiency, VRAM, etc.

  • @Chejov1214
    @Chejov1214 Před rokem +3

    Cool vid, thanks. The only reason I opt for Nvidia is because I do some 3D stuff, and in Blender the green team works better :/ If it wasn't for that, I would go with AMD.

  • @GamerZHuB512
    @GamerZHuB512 Před 11 měsíci +2

    I've had a weird GPU lifecycle. As my 1060 breathed its last, I went from it to a 6600, and then I got bit hard by the upgrade bug. Two GPUs and one monitor later, I tried out the RTX 3070, and to be fair, I didn't find myself missing Nvidia's features all that much. Furthermore, I kinda found RT to be a mixed bag, since most of my games don't have it. But DLSS was absolutely fantastic in the few games I had where it worked.
    I remember an article that was done about two months ago, that was titled "Nvidia isn't selling graphics cards, it's selling DLSS" which is 100% true, considering it looks great in most games, provided you tweak the version and all.
    However, I'd go a step further to say that Nvidia may wind up getting rid of their GPUs and start investing more in GeForce Now for the midrange sector. Seems like the most logical choice since it cuts costs and provides a steady source of income every month.

    • @ImaMac-PC
      @ImaMac-PC  Před 11 měsíci +1

      I agree. They would like to sell more GeForce NOW and it will grow if they keep raising their prices. However, it really is dependent on your internet connection and is not great for some games due to the latency issue.

  • @KinngMe
    @KinngMe Před rokem

    Just bought an open-box 4080 for $980 at Best Buy! They still have some, go buy ASAP.

  • @Eiden193
    @Eiden193 Před rokem

    I would have liked to see where Intel fits into this whole lineup comparing performance and price vs nvidia

  • @vaghatz
    @vaghatz Před rokem

    I think the best metric is mean fps in same gen games, not die size

    • @ImaMac-PC
      @ImaMac-PC  Před rokem +1

      Normally, that is all you should care about. Unfortunately, that only works in a competitive market.

  • @tbx59
    @tbx59 Před rokem +2

    The price gouging and wattage have soured me on getting a new graphics card from Nvidia anytime soon, and AMD doesn't seem interested in providing a compelling alternative, so I guess I'll wait and see what breaks first: my card or my will.

    • @4hedron852
      @4hedron852 Před rokem

      Honestly, if you don't need massive performance and are okay with beta drivers, the Arc Pro A60 doesn't seem like an awful pick. 12GB of VRAM, albeit barely better than the 3050. It does, however, have the same TDP.
      I don't know what card you're running, though. For all I know you're running a 1080, and the A60 would be a downgrade gaming-wise.

    • @Elvewizzy.
      @Elvewizzy. Před rokem

      if you need a new GPU, the only real option is the 4090 currently. You can undervolt it quite a bit for some impressive efficiency gains. IIRC it's like 30% power consumption decrease at the cost of 5% performance.

  • @dh8203
    @dh8203 Před rokem +36

    Toward the latter half of 2022, I was thinking I would do a build with an AMD GPU when their new cards came out. Then AMD managed to torpedo themselves so hard at every opportunity that I just haven't been able to bring myself to buy AMD.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem +19

      I am utterly amazed at how AMD just torpedoes themselves over and over again.

    • @alvarg
      @alvarg Před rokem +6

      @@ImaMac-PC Because instead of undercutting Nvidia and making a profit via sales volume, they try to price-match Nvidia on performance. But since Nvidia has a greater market share, as well as more features (or at least people believe they do), people don't buy AMD when they price similarly to Nvidia.
      Which surprises me, since Nvidia has always charged a premium for their products, so naturally, when the "budget for the gamers" brand price-matches Nvidia on performance, no one is going to buy it.

    • @SIPEROTH
      @SIPEROTH Před rokem +2

      @@alvarg They undercut Nvidia every time, actually. They just don't undercut them hugely like people want when announcing MSRPs, but their cards are cheaper and get even cheaper after a while, unlike Nvidia's.
      Even now the best offers out there are AMD cards, and some are amazingly priced, actually.
      And they can't undercut Nvidia hugely when announcing MSRPs, because that would start a price war. Nvidia would then react and bring better products out cheaper.
      You might be happy about that, because then you would buy the cheaper Nvidia card you wanted, but AMD wouldn't get a sale; they would just be helping you buy the Nvidia card you wanted while losing even more profit due to low prices, and they'd end up closing shop because they get no value from making GPUs.
      AMD is really struggling in the GPU market, with Nvidia having over 85% of the market share.
      It's almost a monopoly. So how can they start challenging Nvidia in a price war when they have so little market share and profit in comparison, while their GPU tech isn't crushing Nvidia either?
      Obviously Nvidia would have a much better ability to offer better cards at lower prices if they wanted to crush AMD. If they wanted, they could close them down and then increase prices again once they're the only ones left. Nvidia is just letting AMD exist because they don't want to be left alone and have government authorities bother them about being a monopoly and try to split up their corporation.
      So before nagging about AMD, realize that it all starts with people not buying their cards and putting them in such a weak place that they can't even fight a price war. The only thing AMD can do is sell as they sell and hope to find some magic bullet that gives them a GPU that performs much better than Nvidia's (or that Nvidia screws up), then gain market share through that and be better able to hit Nvidia.

    • @REgamesplayer
      @REgamesplayer Před rokem +1

      @@SIPEROTH AMD doesn't undercut Nvidia: they launch their GPUs at ridiculous prices, gather bad reviews, and then proceed to lower them. Even when it comes to performance, something like the RX 7900 XTX is matched by the RTX 4080. Considering regional pricing, they are priced within 100 euros of each other in my region. Who is ever going to buy the RX 7900 XTX then?

    • @shanez1215
      @shanez1215 Před 10 měsíci

      @REgamesplayer That's another issue entirely. I have no idea why (likely volume), but regional distributors charge a lot more to ship/import AMD cards, and the costs end up getting passed on. In the US the 7900 XTX is about $200 or more cheaper than the 4080.

  • @garethwoodall577
    @garethwoodall577 Před rokem +1

    Spot on.

  • @mariotop742
    @mariotop742 Před rokem +1

    Very good analysis... Stupid is not the one who asks; the sucker is the one who gives. In the end, if you want to pay the ultimate price, do it!!

  • @Icureditwithmybrain
    @Icureditwithmybrain Před rokem

    I'm currently on an RTX 3070. I was planning on getting an AMD 7900 XT when the price reached $700, but now I'm not sure, because I recently got into local AI LLMs, and from what I've read, Nvidia GPUs are better at running AI. But Nvidia cards are super expensive right now. I'm considering buying a used RTX 3090, but I'm worried about their overheating memory. I don't have a clear path forward.

    • @CyberneticArgumentCreator
      @CyberneticArgumentCreator Před rokem +1

      Overheating memory was an issue for people mining.

    • @XX-_-XX420
      @XX-_-XX420 Před rokem

      That's a tough one. I occasionally see "cheap" 3090 Tis where that issue is fixed, although I think it was worst on EVGA cards. It could probably also be fixed if you can work on the card yourself. Another option could be to try to snipe a 3080 12GB for about $500. They do sell for that much every so often.

    • @rekoj8085
      @rekoj8085 Před rokem +1

      Try selling your 3070 for like $350-400, then buy a 4080. I sold my 3070 Ti for $500 and then bought the 4080 FE from Best Buy, so really I only paid $700. At 1440p the jump from a 3070 Ti to a 4080 is incredible: performance, VRAM, thermals, etc. The reason I didn't go for a 4090 is that with the 4080 I am able to keep all other components the same, and I only game at 1440p 240Hz.

    • @xNickerishx
      @xNickerishx Před rokem

      @@rekoj8085 I'm building a new PC. I have an i9-13900K. What GPU should I get?

  • @mickaelsflow6774
    @mickaelsflow6774 Před rokem +2

    To your last question: VR on AMD is tough. It works in most cases, but every so often there is a problem with the encoder, or with game support (RRRE), or just plain performance.
    As for the price tiers you call "reasonable", that's... pretty much what I've been saying all along :D Except the 4090. Couldn't care less about its price. I'll give them the $1500, same as last gen. It's stupid, but all the "it's expensive", "it's top of the line", "it's for the prosumer" works for me to justify it. Anything else? Hell to the no. I see you, Nvidia. I see you!

  • @cyraxvisuals6203
    @cyraxvisuals6203 Před rokem +1

    I'm probably gonna end up with a 4070 Ti since I need it for a new build.

  • @kingcrackedhen3572
    @kingcrackedhen3572 Před rokem

    I got a 3070 with an Alphacool waterblock for $305.
    I was originally looking for a 3070 for $280 or less, and I'm still happy with the deal I found, albeit a used card.
    I use Nvidia simply for Minecraft RTX (somehow my fps goes up when I turn it on).

    • @oskan8522
      @oskan8522 Před rokem +1

      Wow, that's insane. I got mine at first release for 650 euros 🥲

  • @OrcCorp
    @OrcCorp Před rokem +3

    Need Nvidia for my work as a content creator and graphic designer, but I'm not willing to pay more than 900€ for my card. This is the price point I came up with by comparing the performance of my 3080 to the new 4000 series. And having any sense in my spending.
    I got my 3080 at release for 740€, which was a good and proper price for it, so even 900€ is a tad high to be honest. I'd be willing to spend that extra 160€ for a 4080 if it had nearly double the performance of a 3080 (let's say around 170+%), but it doesn't even come close to that.
    So this generation is truly a skip-gen, as even with a much lowered price it's not a good upgrade.
    Perhaps the 5000 series will do better. A 5080 at a price around 800€, and with over double the performance of a 3080, would be a good deal that I'd happily pay for. We'll see how bad they f*ck up the 5000 generation 🤷🏻‍♂️

    • @XX-_-XX420
      @XX-_-XX420 Před rokem +2

      I'm afraid the 5080 will, at the very best, be about as fast as a 4090. If this trend continues, it'll probably not even beat a 4080, much like how the 4060 Ti loses to the 3060 Ti a bunch of times.
      And it'll probably be like $1500 at this rate. I think they really only care about GPUs for companies anymore, not normal people, even if AMD releases a good product. Which, let's face it, AMD always manages to ruin for themselves one way or the other. So I think you're just dependent on Intel to give you a good GPU for the money, as they at least try, as with the A750.

    • @OrcCorp
      @OrcCorp Před rokem

      @@XX-_-XX420 a 5080 at a performance level of a 4090 at 800-900€ would be a proper upgrade from my 3080.
      We will see how it goes when it's released. I'm happy to roll with my 3080 a year longer. Maybe catch a 4080 from the second hand market next year at around 700€. Or not 🤷🏻‍♂️

    • @XX-_-XX420
      @XX-_-XX420 Před rokem +1

      @@OrcCorp A used 4080 at 700 might happen, but that's the best I see happening with how things are going. Greedflation is insane, so I'm not sure what prices will do in a few years if governments won't step in (which they probably won't).
      And I mean for everything. Food prices have gone up insanely in just this past half year. Like, everyone is gonna be eating one meal less a day at this rate within a year, if not sooner.

  • @kkrolik2106
    @kkrolik2106 Před rokem

    Second hand is the only way. I got a GPRO X080 for $100 (an RX 6700 without display ports); I'm using my motherboard outputs via the iGPU :)

    • @ImaMac-PC
      @ImaMac-PC  Před rokem

      Now that's a way to get price to performance!

  • @dannymitchell6131
    @dannymitchell6131 Před rokem +4

    Sadly, I need Nvidia cards and ideally Intel CPUs (but some AMD ones would be OK) because I do 3D modeling for work, and CUDA has been an option for so long that you can guarantee it will work with your chosen software.
    Intel chips are great for using the viewer when working with anything like 3D or even media editing (the viewer is the window you do your work in, the render is the output from said work).
    I'd LOVE to build an AMD rig but I have no idea if their products will work with what I'm doing...hell I don't know if Intel cards would work either.
    For gaming I'd be fine with anyone, but for work? Time is money, and if there's a single piece of software that's not going to work, it's worth it to just spend more upfront.
    I've even thought about buying the cheapest current gen AMD and Intel options just to see.
    I wish hardware reviewers tested for this. Like have a gaming review and a workstation review in separate videos, separate channels even.

    • @TheLinkedList
      @TheLinkedList Před rokem +1

      What's the software you are using? I strongly doubt there is anything an Intel CPU can do that an AMD option can't. They have the same instruction sets and features, and they're both x86. The only thing that comes to mind that differs with Intel is their QuickSync encoder.

    • @dannymitchell6131
      @dannymitchell6131 Před rokem

      @@TheLinkedList
      AMD chips can do it, but the strong single thread lets you brute force a lot of geometry on screen all at once, just moving the model around will push a few cores and the GPU to 100%, especially with shadows and textures turned on.
      I use SketchUp to model and Thea to render but I also have vRay and Unreal Datasmith to drop models into Unreal Engine. I have Solidworks and AutoCAD to learn too but my job is done in SketchUp and rendered with Thea...it's always good to learn new things when you can. 😉
      My laptop has a 13th gen i7 and a Studio RTX 3060 (1080p) and even plugged in I can bog down the CPU with the viewport, it feels like playing a game at 25fps.
      My desktop is a delidded 4790K at 4.5GHz with a GTX 1080; the laptop is much faster in the viewport and in render time, which is why I want to build a new rig... but my desktop is 1440p, so don't take that comparison too seriously.
      Obviously AMD CPUs will work just fine, but I know I could have gone Intel and made the viewer a lot faster (this is one of the things I do have some benchmarks for from the modeling community).
      I'm more than happy with the CPUs out there, I just don't know what sort of compatibility I'll have with any GPU other than Nvidia. 😔
      OpenCL is available for Thea but some GPUs are just Beta support.

    • @TheLinkedList
      @TheLinkedList Před rokem

      @@dannymitchell6131 if you're getting by with a mobile CPU you'll be just fine with an AMD desktop. Intel mobile CPUs only boost single cores for really short periods anyway (even when plugged in) and it sounds like you'll be viewing models/renders for way longer than the single core speeds will stay at their highest. A desktop CPU will maintain its boost indefinitely if cooled correctly. Rendering also is best done with massive cache sizes, a 7800 3D cache or greater would suit that workload well.
      Have a look on anandtech for reviews, they tend to benchmark the scientific and professional rendering software. Going Intel or AMD for the CPU won't be much different from each other if at all for what you do. Just get whatever is best value for money and has a potential upgrade path if that's what you think you'll be doing. Upgrade paths however are not the be all and end all though, it's just a nice to know/have.
      For a GPU, you could look into the ML series or whatever it's called from AMD (Vega architecture professional stuff). Or play it safe and get Nvidia. Unfortunately, they're all expensive. Might be worth pricing them all up alongside a Mac Studio, it might actually be the cheapest and best option in the end

  • @DKTronics70
    @DKTronics70 Před rokem +7

    AMD have got complacent, not just because of Nvidia, but because they have all that nice console money, considering their hardware is in both the Xbox and PS5 consoles and will probably be in Pro versions of both, should Sony or MS go that route.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem +6

      I wonder what AMD would do if Intel decided to go after that sweet console money?

    • @InternetListener
      @InternetListener Před rokem

      Every angry Nvidia customer who can't find an AMD discrete GPU but still wants to play will probably buy a PS5 / Xbox... Guess what: instead of a deceiving €290 GPU or a not-yet-launched €349 4060 (the 4060 Ti starts at €419, €449 FE), they go and buy a full AMD-based system at €549/€499. Instead of selling a few hundred thousand per year, or a few million like Nvidia has done, imagine selling hundreds of millions of consoles from the PS4/Xbox era through the current models... Are we talking 20 million per year, easily? On the same nodes as RX 6000/7000... and no need to spend die area on cache, because you already have high-bandwidth shared memory (losing some latency, but...). The 1440p rendering, RT, 8 cores, and 10-12 TFLOPS still have five years of headroom before they're pushed to the level the Zelda developers achieved on their latest release with the 0.7 TFLOPS GPU of an old cut-down Tegra X1... Any current Nintendo game ported to Apple Silicon would be a 1080p-to-Retina digital work of art.
      Intel should clearly start pairing Celeron (the efficient line) with a PS4 Pro / Series S-class iGPU, and i3/i5 with at least an A730M... But Xbox or even PS5 could always offer, for a fee (like Apple's Parallels or the old dual boot), Windows S (or Windows 11) alongside the custom OS (or a Linux boot) and jeopardize the cheap PC market with one BIOS and driver release. The AMD 4700S kit already has the first working drivers finished... copy-paste some Ryzen iGPU and discrete-GPU source, and it's done in weeks or days... They could also add all the DXVK support for DX9/DX11 and improve OpenGL etc. with the Linux open source work... and make Intel sweat...

  • @darthpaulx
    @darthpaulx Před rokem

    You are completely right with the pricing.
    And your pricing already has inflation factored into it.
    Normally I would like to see the x80 series at no more than 700 dollars.
    AMD also added the XTX version to raise prices.
    But it's a good thing that AMD didn't raise the price compared to the RX 6900 XT.
    What AMD did is add a 900 dollar card that should be a 600-700 dollar card, like the RX 6800 XT.
    I think AMD should have just called the 7900 XTX the 7900 XT,
    and the 7900 XT the 7900 non-XT, because there never was an RX 6900 non-XT.
    I hope that AMD will keep the RX 7800 XT at 650 USD like last gen.
    Seeing their 7600 not making any sense, I don't have high hopes.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem

      Thanks. Don't get your hopes up on the 7800xt. I covered in a previous video what should be the 7800xt. Instead AMD will give you the same performance for the same price. Not looking good for RDNA3.

    • @darthpaulx
      @darthpaulx Před rokem

      @@ImaMac-PC Yes, because the coming 7800 XT should have been a 7700 XT, while the 7900 XT should have been the real 7800 XT.
      I think it will at least have the performance of the 6900 XT. Still a little bit better than the 6800 XT, by 10% or so.
      Well, let's see what happens.

  • @istgyash
    @istgyash Před rokem

    The reason I bought the 4070 Ti when I still had the option of a 7900 XT is simply that I wanted it for productivity tasks as well as gaming. For me, DLSS 3 isn't as important as Nvidia's CUDA for application support; it's much, much more useful than AMD in Adobe applications and in 3D modelling. If AMD had brought better competition to the productivity market, I would have really ditched Nvidia...

    • @ImaMac-PC
      @ImaMac-PC  Před rokem

      Thanks for the feedback. Many people need CUDA for productivity and that just seals the deal against AMD Radeon GPUs.

  • @gedeuchnixan3830
    @gedeuchnixan3830 Před rokem

    My last GPU was a Radeon, my current one is a Radeon, and my next might be Intel's second generation, but certainly not an nGreedia card. Also, Adrenalin is a way better driver than nGreedia's ancient WinXP control panel and the malware called "Experience".

  • @dat_21
    @dat_21 Před rokem +1

    Yep, same observation. It's priced 1 tier too high. Almost like a well thought, purposeful thing to do.

  • @australienski6687
    @australienski6687 Před rokem

    I purchased a used 3090 for $500. I have tried AMD GPUs in the past and I kept having issues with drivers and random crashes.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem +1

      Thanks for the feedback. Enjoy that 3090, you got a great deal!

  • @patb3356
    @patb3356 Před 11 měsíci

    great chart

  • @bharatdabhi1044
    @bharatdabhi1044 Před rokem +2

    I agree with your pricing. I wanted to upgrade from my GTX 1650 to an RTX 4060, but those bastards rebranded it as the RTX 4060 Ti and are charging almost $100 more. I wouldn't pay more than $300 = INR 24k-25k for an RTX 4060 Ti, and here in India they charge $70-80 more.

  • @Vi3nix
    @Vi3nix Před 11 měsíci

    In my market/country there are only one or two AMD products; they are not providing enough products :/

  • @MWcrazyhorse
    @MWcrazyhorse Před rokem

    I have a 2060 overclocked.
    It's basically as fast as the 4060.

  • @lexluthor4156
    @lexluthor4156 Před rokem +4

    In the Chinese market, some lower end RTX 4080 are already 7299 CNY, which is about 1000 USD tax included.

  • @1oglop1
    @1oglop1 Před 11 měsíci

    I was considering an AMD GPU myself, but then I saw people selling AMD to get Nvidia for VR and AI training. Also, driver issues and the lack of "GeForce Experience" convenience from AMD.

  • @catriona_drummond
    @catriona_drummond Před rokem

    For 1080p, get an Intel card. They need some sales.

  • @Drebin2293
    @Drebin2293 Před rokem +1

    There are a lot more AMD users than most think. A decent chunk of NVidia users on things like the steam hardware survey comes from net cafes. Then you have the laptop legion that is almost exclusively NVidia because AMD hasn't had a real presence there in a while. But when it comes to desktop users? I know quite a few people with AMD cards. Myself included. All you have to do is look at the gaming division sales numbers to know how many cards are shipping.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem

      I looked into the net cafe claim. When the pandemic struck, all the net cafes closed. All of them. So you would think, that after a couple of years, the numbers would have made a huge shift. Except they didn't. They were very consistent. So consistent that the "decent chunk" coming from them was insignificant. The problem with gaming division sales with AMD is that you need to remove the sales of the chips supplied to consoles. That's a lot of sales.

  • @davefroman4700
    @davefroman4700 Před 11 měsíci

    Screw that. I bought a recycled 5700 XT with 12 gigs for $200 off the Chinese recycling masters. I've had it for over a year now with no issues, and it will play everything just fine. If I get another year or two out of it, I'm happy with that. I can always get another one if it dies, for less than I would pay for any new card today.

  • @JesManVP
    @JesManVP Před rokem +1

    I disagree with you about AMD. Yes, the 7000 series is not worth buying at all, but I'll argue the 6000 series is a way better deal than Nvidia: you could get a 6700 XT for around the same price as a 3060. The 6000 series is damaging Nvidia's lower-end 40 series.

  • @theultimategamer8537
    @theultimategamer8537 Před rokem

    I guess I feel slightly more validated that I got my 4070 for effectively 500 with that MicroCenter deal, wouldn’t have considered it otherwise.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem +2

      That really was the deal...for a short period of time.

    • @theultimategamer8537
      @theultimategamer8537 Před rokem

      @@ImaMac-PC yah I literally managed to pick it up the last day it was available too

  • @shasha023
    @shasha023 Před rokem

    I still have a GTX card. I tried to upgrade to a 2080, but they were too expensive at the time; then came the 3080 and you could never find them; then came the 4080, and I had my money ready, but it cost $1200. I told myself I'll wait for the 5080; hopefully it will have a reasonable price.
    I'll never buy AMD. Maybe if Intel had a higher-class card I would buy it, but the A770 is too weak.

  • @rudybomba2
    @rudybomba2 Před rokem

    I had a 1660 Super and went with a 4070 Ti... I really didn't want to mess with my almost 10-year-old 850W power supply. Any superior AMD model would require me to upgrade both the power supply and cooling.

    • @stangamer1151
      @stangamer1151 Před rokem

      I'd recommend replacing a 10-year-old PSU anyway. I had a very good PSU which served me ~11 years, and once, when I tried to turn my PC on, one of the capacitors just exploded. Thankfully this didn't hurt any other PC components, but I'd seriously recommend not waiting until a PSU fails.
      The 4070 Ti is an expensive GPU and it should be treated well.

  • @longview3k69
    @longview3k69 Před rokem

    You are literally the first content creator I've heard say that AMD has screwed up their reputation and that the RX 6000 series should be ignored (paraphrasing). Every other CC has said that you should get a 6000 series card instead of an Nvidia card. Is there a video of yours that explains how AMD screwed up? I'm genuinely curious.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem +1

      They screwed up their reputation. I personally think the RX 6000 cards are good GPUs and do recommend them. Lots of creators recommend them. But nobody is buying, and general gamers just ignore them. And it's because AMD really has messed up the name of the RADEON brand. I'll have a video on that in the future.

    • @longview3k69
      @longview3k69 Před rokem

      @@ImaMac-PC I can't wait then, I'll be keeping up

  • @zaplizard
    @zaplizard Před 11 měsíci

    I got a 6750 XT a few months ago for $400 USD; there weren't any 6700 XTs in stock where I was, and honestly it's a pretty nice card. Shame AMD doesn't know how to make anything close to a good deal on hardware that isn't two years old.

    • @ImaMac-PC
      @ImaMac-PC  Před 11 měsíci

      Compared to the original $550 price for the 6750xt, that's an ok price and it does have 12GB of VRAM.

  • @HDJess
    @HDJess Před rokem

    Well, the only reason I wouldn't consider a Radeon GPU nowadays is the lack of DLSS; the other stuff is just gimmicks. Actually, I liked my RX 6600 more than my current 3060 due to owning a Freesync Premium display; it was just so smooth, with zero stuttering. With Nvidia, even if the monitor is G-Sync compatible, it somehow works worse.
    About upgrading, I'll most probably skip this gen, the prices are insane in Romania. RTX 4060Ti = 505 EUR, 4070 = 685 EUR and so on.
    Plus, I don't have any hard reasons to upgrade, the most demanding games I play right now are PoE and D4, and DLSS does wonders in D4 at 1440p on this 3060, I get 100+ FPS with max details.

  • @alfonskick5939
    @alfonskick5939 Před 3 měsíci

    I nearly share your opinion concerning prices.
    But I do not really agree with your line of argument. You only compare the die sizes of GPUs, but you do not consider that the cost per unit of die area has increased with the switch from the Samsung N8 to the TSMC N4 process.

  • @taylorshin
    @taylorshin Před rokem

    I just purchased an RX 7600 for my secondary PC, and I'm still using a 6700 XT in my gaming (and main) rig. If opportunity allows, I will also try out the Intel Arc (or Battlemage) series. Unless you have a specific reason to use CUDA-based applications, i.e. AI acceleration or media production, or you're a professional gamer who needs 4090 performance with DLSS 3, there is no reason to pay the premium to Nvidia at all.
    The prices of AMD cards are dropping and stabilizing as their real performance becomes clear. The driver fiascos don't last months; they at least fix critical bugs, and most of them have at least some workarounds.
    People avoiding AMD/Intel is just religious. Intel? Sure, there are still a few fiascos left, but they're catching up quite fast. And the faithful should be pissed off at their pope, since he started charging them the 'correct' price for being fanatics.

  • @pweddy1
    @pweddy1 Před rokem

    I feel like AMD's 6600(XT)/6650xt are currently the new "rx580s". They got crap reviews because they had a bad MSRP and were panned, but everyone recommended them after the prices dropped.
    The 7600 will also likely fall into that category.
    The big issue with Navi 3 is they seemed to fail to achieve the design goals. The 7600 is performing like an RDNA 2 refresh, not like a new generation. You have 32CUs that have a marginal improvement over the last 32CU card. This is more like the difference between the rx580 and the 590 than say the difference between Navi 1 & 2. There was a major technical fail somewhere and we may never be told what happened.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem

      Something majorly went wrong with RDNA3 and it's not the fault of "chiplets". I also think of the RX 6600/6600XT/6650XT as the new RX 570/580/590.

  • @paulschaaf8880
    @paulschaaf8880 Před 11 měsíci +1

    The 40 series should shift a full performance tier down in pricing because when you look at their specs and the die used for each of them, that's where they should have launched to begin with. Then people would actually be buying their stuff. That means the 4090 should be $1200. The 4080 should be priced where the 4070 is, and so on.

  • @IrrationalDelusion
    @IrrationalDelusion Před rokem

    11:15 I would buy only Nvidia for now because of support for DSR desktop upscaling. I want fonts to look like 4K on 1080p monitors, and DSR provides a slider to fix the blurriness issue that arises from this upscaling. Even on a laptop, despite Optimus, with a MUX switch that feature can be made permanent for use on mains power with a monitor. AMD didn't provide font smoothing and it's just blurry, or they didn't say they did, because it's a niche feature. Nvidia is selfish shit, but we don't have any other option. Maybe Intel, who are no less selfish, should also make GPUs... Nvidia just offers more really useful features like DLSS, RTX, and DSR. Also Nvidia Broadcast.

  • @JulesJuno737
    @JulesJuno737 Před 11 měsíci

    I'm upgrading from an R9 390 (brand loyalty is dumb). I want a white Radeon, but I don't want to deal with the headache of special settings in VR and emulation 😭
    And my junk is starting to flicker in Bayonetta 1; time to retire my girl 😢

    • @ImaMac-PC
      @ImaMac-PC  Před 11 měsíci

      That R9 390 is a classic from 2015...and yes, it's time.

  • @NormanTiner
    @NormanTiner Před 11 měsíci

    Of the bad Nvidia options, the 4070 is the most decent. If you can snag a used or Micro Center returned card then it could even be a good deal.

    • @ImaMac-PC
      @ImaMac-PC  Před 11 měsíci

      I agree that with openbox prices, the 4070 is the "least bad" this generation.

  • @purpledream9492
    @purpledream9492 Před rokem

    If I wanted to get the 4090, I'd have to pay $1599 + $366.30 shipping.

  • @marymartin2167
    @marymartin2167 Před rokem

    We need a new GPU maker that can provide the same or more performance than Nvidia and AMD; Intel needs to step up more.

  • @starofeden9277
    @starofeden9277 Před rokem +1

    TSMC announced another price hike the other day.
    GPU prices are going up again for next gen,
    and Nvidia is helping with that.
    Every 5 years the prices go up;
    check the chart.
    It ain't coming down.
    The 4000 series is the new floor for prices.

  • @phlips11
    @phlips11 Před 11 měsíci

    bought a 6950xt for $630. It's great. No issues at all. Everything max settings 1440p, no RT. Very happy with this purchase. I will be using this card for several years

  • @theultimategamer8537
    @theultimategamer8537 Před rokem

    I just hope that Intel has something to say about the state of the market with their next gen of gpus

    • @bfish9700
      @bfish9700 Před rokem +2

      They need to figure out how to write functional drivers and get a product out on time first.

    • @XX-_-XX420
      @XX-_-XX420 Před rokem

      @@bfish9700 Their drivers are better than Nvidia's, tbh. Nvidia has the worst drivers. I'm having a good amount of issues with RTX 4000 vs no issues with Arc and few issues with RX 6000.

  • @bananaboy482
    @bananaboy482 Před rokem

    I really like DLSS, I can't tolerate FSR 2.2 or especially TSR. It's a night and day difference at lower resolutions like 1440p and no competition at 1080p. I like ray tracing a lot, but also DLSS frame generation intrigues me. I would buy an AMD GPU if the games I play (mainly Battlefield 2042) supported FSR or ideally XESS 1.1 DP4A (1.1 is better than FSR 2.2 even in DP4A! Look it up! Amazing!). I'm waiting to see if FSR 3 frame generation is any good and actually gets adopted into lots of games, then I'll decide. I'm really looking forward to Intel Arc, and battlemage is supposedly going to occupy the tier I'm looking to buy (4080 performance, double my 2080 ti from 2018). Ray tracing, DLSS, and frame generation are just things I think are really cool and if I'm going to drop $500 on a graphics card, might as well add the extra $100 Nvidia tax so I can play with all the features that interest me so greatly. Looking online, it seems most savvy consumers are opting for AMD regardless of FSR and ray tracing being objectively worse. People just don't care about that like I do it seems, which is good. In the reddit community AMD seems to be winning but my out of the loop friend recently built a new PC and didn't even consider AMD because of the brand being bad for the past 10 years. Got a 4070 over the 6950 xt I tried convincing him to get. Marketing is everything

    • @CyberneticArgumentCreator
      @CyberneticArgumentCreator Před rokem +1

      In 4 years, when this generation of cards is getting some gray hairs, he'll be able to run frame gen and DLSS on some newly released console port and get like 2x the FPS, with RT on, compared to people with 6950 XT cards.

    • @bananaboy482
      @bananaboy482 Před rokem +2

      @@CyberneticArgumentCreator Or he could have gotten 4070 Ti levels of real frames, more VRAM, and a lower price. He doesn't care at all about hardware RT; the only game he uses RT in is Fortnite, and he uses software RT anyway because his CPU can't build and maintain the BVH well.

  • @TheCoolLama
    @TheCoolLama Před rokem +4

    Very good video, well researched and put together, thanks!
    Nvidia GPUs are generally (much) cheaper than AMD here in Europe (NL) and South East Asia in price/perf and are always in stock. We don't get the sweet deals the US has, and this is most likely the reason you see Nvidia has the most market share and is overrepresented in the Steam HW Survey. The 6800 still goes for €540 and €400 second hand, while the 4070 goes for €600 and the Intel A770 for €400. The same scaling applies for lower-tier cards (AMD is cheaper at the high end). Nvidia has much better perf/W and features (a much better encoder vs AMD's utter shit quality at the same bitrate, DLSS, etc.). It's a no-brainer which is the (much) better value for money here, and even if price/perf were the same, I'd pick Nvidia over AMD (innovation vs copy-pasting half-baked "chinese" clones). Pure raster performance, e.g. WH3 Ultra @ 1440p UW in €/fps: A770 €11.88, 6800 €10.35, 4070 €8.04.
    That being said, there is no price/perf improvement over last gen, 4070 is about on par and the rest are worse. Rdna3 is a complete disaster. Clear skip here. It seems Nvidia got a tick-tock strategy Intel used to employ. Uneven numbers are pretty good value for the money, so let's see what 5000 brings. Regardless, I've expected it to be this way for over a year now vs all the hopeful sheep (hope = delayed disappointment) of current affairs. I'll upgrade once either release 3nm cards with >= 16GB VRAM with proper performance (>= 60 fps 1% lows WH3 Ultra @ 1440p UW), much better perf/W and at least 15% better price/perf over the 4070.
    Note, as an interesting observation comparing stats in my spreadsheet, the fab factor gen-on-gen scales pretty well within 10% margin of expected perf/W improvement, regardless of transistor density. 3nm should be interesting not only in perf/W, but theoretically also in price/perf, as the N3 node is only 15-20% more expensive with over 50% more density and better yields. Also, gddr6(x) is at an ATL price of $25/8GB (not even counting staple discount). If they still up the premium and therefore margins on the 5000 series, that will be my last straw and quit the occasional AAA games that I play on pc. I noticed that I enjoy and play indie (Rimworld and co) and paradox games way more, which can easily utilize a weak old gpu. I always buy console when a new Rockstar comes out (because I love rdr and gta series). I can afford the 4000 series and 4090, but I refuse to be milked, those experiences and memories will last a lifetime (you always remember the bad purchases you have ever made, i.e. "I should have" and people seem to never learn, not me).
    I wish amd could and even tried to compete, but nvidia is just too far ahead and can price w/e they want if they feel people will buy it and still make a good profit from. I already saw this with the 1000 series, nvidia's hevc encoder was actually really useable and perfected it with adding b-frames on the 2000 series (the fixed encoder is still the same on the 4090), whereas amd's encoder was pure shit quality for the same bitrate and not useable for either archiving or plex usecases. This technology got used in DLSS and now DLSS3, the latter which is pretty complex and I don't think AMD will be able to match that (based on their history) any time soon (say 6-8 years, if ever). Amd can only compete in the price/perf department and they REFUSE to do so and are copying Nvidia's strategy (too high pricing, delaying like Nvidia does, absolute shit marketing for over a decade (pure lies), the failure that is rdna3 (~100W idle power consumption with multi monitors), etc.). If you're also in this market for decades like me, you know this shit happened before just before the 2008 crash where AMD and Nvidia pricefixed the market. I dare to bet this has been happening here again since the crypto mining craze.
    Thing is, if people keep buying, prices will go up out of proportion again next gen. Corporations have one goal: make more profit. I already knew smartphones would end up priced the way they are for the same reason. I know quite a lot of people who buy the latest and greatest iPhone every year; the price keeps going way up and they still buy it, though Apple at least has reasonable margins of about 25-30% on the quality components they use. This is why democracy never works: the masses of idiots will ruin everything. Remember that the first days of something new are always the best experiences, like the early internet or a new online game; later on it gets ruined by hackers and trolls. It will only get worse unless there is a saving grace from those corporations (the IRONY). Don't expect it from the sheeple, it will never happen; you'll never get the numbers. Just like in any revolution, it's only a few percent who make the change and have to enforce it on, or indoctrinate, the rest with daily propaganda, else nothing gets done. This is why history keeps repeating itself: the same morons keep reincarnating over and over again.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem +1

      Thank you! I echo your comments on the 4090... I also don't like being milked! People don't seem to learn from history very well, and over time it tends to repeat itself.

    • @SIPEROTH
      @SIPEROTH Před rokem

      Lol, no. Nvidia isn't cheaper in Europe, and AMD isn't cheaper only at the high end. The RX 6600 is literally the best-value card in Europe right now, and so are other cards like the RX 6700, etc.

    • @TheCoolLama
      @TheCoolLama Před rokem

      @@SIPEROTH Nah... Like I said, Nvidia is cheaper. You pay more upfront, but you get more fps/€ here in NL (Europe). Just because €200 looks "cheap" doesn't mean it is. Do the math first. WH3 Ultra @ 1440p UW in €/fps: 6600 (€200) at €8.46 (19.69 fps/100W) vs 4070 (€600) at €8.04 (37.14 fps/100W).
      Per frame, the 6600 is 5.2% more expensive and uses 88.6% more watts than the 4070. It also only gets 23.63 avg fps @ 1440p UW vs the 4070's 74.66, so it's a useless card for me; I'd never buy your "cheap" card in the first place, it would literally be €200 pissed away, as I have no use for it.
      With the currently capped electricity price of €0.40/kWh (the cap ends in '24; the market price has varied between €0.76 and €1.12/kWh over the last 6 months), the 4070 is MUCH cheaper to run as well, on top of already being cheaper per fps (rough math sketched after this comment). You also get more features that I actually use (like the encoder).
      Yes, big "lol" to all people who don't use their intellect.
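      A minimal sketch of the efficiency and running-cost math referenced in the comment above; the fps-per-100W values and the €0.40/kWh price come from the comment, while the yearly gaming hours and the 60 fps target are illustrative assumptions.
      ```python
      # Sketch of the power/running-cost comparison above.
      # fps-per-100W and €0.40/kWh come from the comment; hours/year and the 60 fps target are
      # assumptions (the 6600 can't actually reach 60 fps in this title, so its figure is a
      # hypothetical "same frames delivered" cost).

      def watts_per_fps(fps_per_100w: float) -> float:
          return 100.0 / fps_per_100w

      def yearly_energy_cost_eur(fps_per_100w: float, target_fps: float = 60.0,
                                 hours_per_year: float = 500.0, eur_per_kwh: float = 0.40) -> float:
          watts = watts_per_fps(fps_per_100w) * target_fps
          return watts / 1000.0 * hours_per_year * eur_per_kwh

      rx6600_eff, rtx4070_eff = 19.69, 37.14  # fps per 100 W (WH3 Ultra, 1440p UW)

      extra = watts_per_fps(rx6600_eff) / watts_per_fps(rtx4070_eff) - 1
      print(f"RX 6600 draws {extra:.1%} more power per frame")  # ~88.6%
      for name, eff in [("RX 6600", rx6600_eff), ("RTX 4070", rtx4070_eff)]:
          print(f"{name}: ~€{yearly_energy_cost_eur(eff):.0f}/year in electricity")
      ```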

    • @9210069235
      @9210069235 Před 11 měsíci

      @@TheCoolLama Which is the best value card right now?

    • @Name-tn3md
      @Name-tn3md Před 11 měsíci

      ​@@TheCoolLama lol smart geek wannabe

  • @floorgang420
    @floorgang420 Před rokem

    Remember that you don't need an overly powerful GPU now, thanks to upscaling (DLSS, FSR).
    If you've got a 4K 120Hz display like an OLED TV, your real render target is roughly 1440p at 90 fps (70-110 with VRR), and the upscaler fills in the rest.
    For that reason, if you don't need RT, the RX 6800/6900 are good enough; and if you do need RT, the RTX 4070/4070 Ti are the best for the price.

  • @bfish9700
    @bfish9700 Před rokem +2

    AI saved Nvidia this gen; thankfully no one is buying these GPUs at these prices. I think the AI bubble will burst in a year or two, and at that point Nvidia will have to deliver value to gamers or explain to investors why their revenue tanked.

  • @CreateNowSleepLater
    @CreateNowSleepLater Před rokem

    Good video, and I think the entire PC gamer AND content creator space agrees with you. Nvidia really is the only choice if you do content creation; AMD fails miserably at Blender and Premiere Pro. So what choice does anyone have? It's not like the 50 series is going to be priced lower. You might get some value in terms of the performance increase over the 40 series, but they are not lowering prices. Sales are terrible and Nvidia doesn't care. They will likely sit on the 50 series launch, because what motivation do they have for a next gen when their current gen isn't selling?

    • @ImaMac-PC
      @ImaMac-PC  Před rokem

      Thanks. You may be right about them delaying the 5k launch, because... why would they need to rush it?

    • @CreateNowSleepLater
      @CreateNowSleepLater Před rokem

      @@ImaMac-PC They still have the 16GB 4060 Ti (which won't help much because of the die size and memory bus) and the 4050, and that's before you even get to the inevitable 4080 Ti and 4090 Ti. As we speak I am agonizing over getting a 4080 or a 4070 Ti. One thing to note: you don't get dual AV1 encoders unless you go 4070 Ti and up. If you stream and record at the same time, this is huge. I know the AMD cards have this, but as mentioned, their benchmarks for 3D work are garbage, and you also have to hope that programs support AMD's implementation (I believe OBS and DaVinci do now). To your point, if the 7900 XTX was $800 it would be flying off shelves, but why would I spend $1000 when for $100 more I can get a 4080? This gen sucks lol.

  • @Humanaut.
    @Humanaut. Před rokem +3

    None of the above.
    Vote with your wallets.
    You can wait to buy, they can't wait to sell.

  • @ceuser3555
    @ceuser3555 Před rokem

    $500 is my limit for the highest end gpu. Anything above that is crazy.

  • @snapcount321
    @snapcount321 Před rokem +4

    I'll be the first to say it: any GPU from this point forward that costs $399 or more should have a minimum of 16 gigs of VRAM, or GTFO.
    For years we had games that didn't use up 8 gigs of VRAM, and yet even cheap cards like the RX 5700 and the 2060 Super launched with 8 gigs. All this VRAM skimping from Nvidia makes me especially annoyed because it makes the poor price/perf and generational uplift that much worse.

  • @CostasLilitsas
    @CostasLilitsas Před rokem +1

    Well, I personally have two reasons to avoid AMD. 1 - After all these years of raster graphics I simply got tired of them; I fell in love with ray tracing when I first saw it in Control! It was the first time in 20+ years that I got excited about graphics again. And 2 - My monitor's vertical resolution is 1200 (3840x1200). I need that relatively low pixel density because I use my computer for Cubase too; higher ppi makes UI elements too small. At that low vertical resolution, the only upscaler that works well is DLSS; FSR needs a higher pixel count to do a good job. So the only card I'm considering right now is the 4080 (I'm currently on a 2070). It's good at ray tracing, it supports DLSS, and it has a chance of being adequate on VRAM (fingers crossed). On the other hand, Nvidia is stealing my money, no doubt about that, and I don't see next gen being any cheaper, thanks to AMD, plus it's a year and a half away. I recently played Dying Light 2, and in order to have global illumination active (day and night difference) I had to play in DLSS Performance. Should I wait and play nothing until next gen? I don't know, man. All the deals are crap everywhere I look... Damn it.

    • @bansheezs
      @bansheezs Před 11 měsíci

      Wait till FSR 3. It will be almost equal to DLSS and it will work on everything.

  • @brianmcleanjr
    @brianmcleanjr Před 11 měsíci

    The general advice has been don't buy the reference 7900 XTX, buy an AIB card. Okay, there are technically some at ~$999, but for another $50-$100 I can buy a better card from a more reputable brand... and at that point, why not just drop the extra and get a 4080, where the Founders Edition sits comfortably at $1200 and has the extra features AMD can't match? RT, DLSS, etc. are not going away as new games come out, and if anything Nvidia is already working on whatever comes next, and their drivers have rarely failed me. If you want me to consider AMD, you'd better drop the price of that XTX to a hard $800 max, like the 4080 should be.

    • @ImaMac-PC
      @ImaMac-PC  Před 11 měsíci

      That's a great perspective!

  • @jeremysohan
    @jeremysohan Před rokem +1

    Jensen: money.....money....moneeeyyy

  • @MarvRoberts
    @MarvRoberts Před rokem +4

    I just said screw it and replaced my 1660 Ti with a used 3080 for $400 from eBay.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem +1

      That's a deal that will last for years. Best way to spend $400 in this market!

  • @RudolfSikorsky
    @RudolfSikorsky Před rokem

    Well, I bought a 3080 12GB last summer for 600 euros and won't buy the next one for at least a couple of years.

  • @Uraim
    @Uraim Před rokem

    I got an XFX RX Vega 64 8GB used for €110 + €5 shipping; I think it was worth it.
    I upgraded from a dang stinky Radeon R7 240 2GB.

    • @ImaMac-PC
      @ImaMac-PC  Před 11 měsíci

      Much more usable GPU and that price is good. My recommendation is to watch the temps and undervolt as needed to keep it cool.

    • @Uraim
      @Uraim Před 11 měsíci

      @@ImaMac-PC At max load with +25% power, the temps are 70-72°C.

    • @Uraim
      @Uraim Před 11 měsíci

      @@ImaMac-PC Temps are good. Running at +25% power, it draws around 240W and still stays at 70-72°C, even after hours at maxed-out load with no vsync. No undervolt needed.

  • @ZeroSleap
    @ZeroSleap Před rokem

    I skipped too many gens, not just last gen. I've been holding out with an 11-year-old GPU.

    • @ImaMac-PC
      @ImaMac-PC  Před 11 měsíci +1

      I could tell you that GPU is at the end of the road, but I'm sure you've heard that one before 😆

    • @ZeroSleap
      @ZeroSleap Před 11 měsíci

      @@ImaMac-PC Yeah, I know, that's why I'm considering getting something like the 7900 XTX, so it stays relevant for a long time too :P

  • @julianB93
    @julianB93 Před rokem +2

    The 4080 should cost what the 4070 Ti's MSRP is right now.

  • @mc_sim
    @mc_sim Před 11 měsíci

    I want to support AMD, and I did that by buying a 5700 XT and two 6600 XTs over the past few years, but gosh, I wish they'd fix their software. I hate it. I'm looking forward to building a new rig, and honestly I'm not ready to pay MSRP for any new GPU, be it AMD or Nvidia; they all seem overpriced or underperforming. The used market gives nice opportunities with good deals on ex-mining cards. That's the only way not to feel robbed.

    • @ImaMac-PC
      @ImaMac-PC  Před 11 měsíci +1

      At this point, the deals on the used market are the only thing that keeps me interested.

  • @sanzyboy3952
    @sanzyboy3952 Před rokem +3

    I'd say the 4070 is probably the best deal. Not perfect, but the best of the 40 series.

    • @xxNATHANUKxx
      @xxNATHANUKxx Před rokem +1

      Me too, there are plenty of 4070s below MSRP already. If this continues, we might be able to pick one up around that $500 mark.

  • @arsenyratnikov7725
    @arsenyratnikov7725 Před 11 měsíci

    AMD has an oversupply of last-gen cards at very low prices. For their new cards to fit into the market, the older stock has to sell through first, which is why AMD has no new Radeon cards to show at the moment.

    • @ImaMac-PC
      @ImaMac-PC  Před 11 měsíci

      AMD has been shipping its lowest GPU volume in 23 years for the past three quarters. If they still have an oversupply of GPUs, why would they launch the 7600 before the 7800 or 7700? Or launch any new cards at all until the old stock is depleted?