RTX 4000 vs RTX 3000 - Real FPS (Rasterization) Performance vs Price Comparison is Shocking!

  • Published 22 Sep 2022
  • Nvidia announced the RTX 4000 series and was vague about rasterization performance improvements. I break down what they did provide to get a sense of the actual rasterization performance increase you can expect without DLSS or ray tracing enabled. A comparison of performance and price is provided, and the results are shocking.
  • Science & Technology

Comments • 884

  • @Many_Mirrors
    @Many_Mirrors 1 year ago +683

    I think it's disgusting that they decided to name the 4070 a "4080 12GB" instead. We're basically paying $899 for a card tier that cost $350 just a few generations ago. That's a 150%+ price increase.

    • @BDRmongoose
      @BDRmongoose 1 year ago +88

      More like 250%

    • @deneguil-1618
      @deneguil-1618 1 year ago +49

      It's even worse than that: the 4080 12GB has about 48% of a 4090's performance when comparing cores and clocks, and that performance range is generally reserved for the 60 Ti card.
      Basically Nvidia is trying to sell us a 4060 Ti for $900.

    • @Ubreakable-lr2dk
      @Ubreakable-lr2dk 1 year ago +18

      @@JoeWayne84 Dude, no one bats an eye if cards go up in price. But over 200% is just ridiculous. If a card cost $350 a few years back, it should be around $500 by now, not $1,900. It's insane.

    • @WayStedYou
      @WayStedYou 1 year ago +5

      $329 for the GTX 970, versus $399 for the 770, which was a rebranded 680.

    • @cyllananassan9159
      @cyllananassan9159 1 year ago

      True... but fanboys will buy them by the millions, just like Apple phones. Personal identity has been replaced by mega-corp identity, as if they were your friends. Pathetic.
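The price-increase percentages thrown around in this thread are easy to check. A minimal sketch, using the $350, $899, and $1,900 figures quoted above:

```python
def pct_increase(old: float, new: float) -> float:
    """Percentage increase going from an old price to a new price."""
    return (new - old) / old * 100

# Figures quoted in the thread above.
print(round(pct_increase(350, 899), 1))   # 156.9 -> the "150%+" claim checks out
print(round(pct_increase(350, 1899), 1))  # 442.6 -> well past the "over 200%" mark
```

So both commenters are in the right ballpark; the difference is just which new price (US MSRP vs. a European street price) they start from.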

  • @Icureditwithmybrain
    @Icureditwithmybrain 1 year ago +689

    A couple of years ago Nvidia said the 3080 was 2 times faster than the 2080. In reality it was about 35 to 40% faster. Everyone should keep that in mind when Nvidia makes claims.

    • @ImaMac-PC
      @ImaMac-PC 1 year ago +90

      Nvidia's claims are just that, claims. But the data they provided is gold.

    • @thecatdaddy1981
      @thecatdaddy1981 1 year ago +83

      It's becoming a tradition at this point. They also did it with the 2080 vs the 1080. They always cherry-pick some special scenario that gives a huge advantage to the newer card, with some workload that is inefficient on the old one. That time it was some 4K HDR scenario which, if I remember correctly, Pascal was really inefficient at, and/or it was straining the memory bandwidth of the 1080. Then for 3080 vs 2080 it was Doom Eternal with a texture setting that used 9GB of VRAM (10GB vs 8GB on those cards). Now it's Cyberpunk with a new "RT Overdrive" setting, which I can bet is really inefficient on Ampere compared to Ada.

    • @Snakss-yo1ju
      @Snakss-yo1ju 1 year ago +4

      That is in the best case, so it is “kinda” true

    • @Afterthought_
      @Afterthought_ 1 year ago +32

      *68% on average. Still not double, but noticeably more than 35-40%.

    • @philiplubduck6107
      @philiplubduck6107 1 year ago +8

      If I recall, the 3080 was more than 2x a 2080, not in raster but in ray tracing. The 30 series was a crazy upgrade for RT.

  • @Ujaah
    @Ujaah 1 year ago +99

    An increase in performance is NOT a reason to increase the price. It is a reason to buy a new unit when yours is still working.
    If the price hiked with every increase in performance, we would have $10K GPUs by now.

    • @a64738
      @a64738 1 year ago

      There is most likely not much raw performance gain; they inflate the 2x and 4x "performance" numbers by adding artificial frames that pad the frame counter...

    • @larion2336
      @larion2336 1 year ago +16

      Actually they would cost millions or billions each, considering the exponential performance improvements in silicon chips over the years. Imagine you paid $50 p/m back in the 90s for a 56k dial-up internet connection. Today you can get a 1 Gbps fiber connection, which is 17860 times faster. Should you then reasonably expect to pay (17.8k x $50) = $893,000 p/m for it? An increase in performance should only be a factor _within_ a generation, i.e. it's reasonable for a 4080 to cost more than a 4070. For a 4080 to cost more than a 3080 is either price gouging or inflation.

    • @dracer35
      @dracer35 1 year ago +2

      @@larion2336 Exactly! This seems so simple. Just like when flat-screen TVs were thousands of dollars, but now they have higher resolution and better technology and only cost a few hundred bucks. Nvidia is just price gouging to the absolute extreme.
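The dial-up analogy above can be checked with the same kind of arithmetic. A quick sketch, assuming the commenter's figures of 56 kbps, 1 Gbps, and $50/month:

```python
# Speedup from 56 kbps dial-up to 1 Gbps fiber (both expressed in kbps).
speedup = 1_000_000 / 56
print(round(speedup))        # 17857, matching the "17860 times faster" claim
print(round(speedup) * 50)   # 892850, close to the "$893,000 p/m" in the comment
```

The point of the comment stands either way: if GPU prices scaled with performance across generations, the naive extrapolation produces absurd numbers.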

  • @sinfatus
    @sinfatus 1 year ago +325

    When the 90 series card is the only one that makes sense, you see that things are very wrong.

    • @reviewforthetube6485
      @reviewforthetube6485 1 year ago +8

      I've got a funny feeling Nvidia is holding back a budget lineup they will launch a bit later.

    • @2ndAveScents
      @2ndAveScents 1 year ago +1

      I got my 1080 mini on sale for 399 and it still holds up well, but I wouldn’t mind springing for a 3080 FE

    • @bricaaron3978
      @bricaaron3978 1 year ago +6

      @@reviewforthetube6485 *"I've got a funny feeling nvidia is holding back a budget lineup they will launch a bit later"*
      What about the notion that these prices are deliberately high in order to sell the remaining stock of 30 Series cards, while still being able to claim that a new generation has been released on a "normal"** schedule, with the intent of later reducing prices (if demand isn't driven back up by $!#@ miners)?
      ** That is, the _new_ normal of ~24 months between generations (2016 - 2022), as opposed to the 13.7 month average from the GeForce 2 (2000) to the 10 Series (2016).

    • @Snoop_Dugg
      @Snoop_Dugg 1 year ago +1

      @@reviewforthetube6485 I think they're holding back a ti lineup, and waiting for these to be bought.

    • @reviewforthetube6485
      @reviewforthetube6485 1 year ago

      @@Snoop_Dugg of course they are

  • @tuga4684
    @tuga4684 1 year ago +138

    I just love how he named it "the 4080 imposter". That's what I'm calling it from now on 😂

    • @BDRmongoose
      @BDRmongoose 1 year ago +13

      4080 ti (total imposter)

    • @averageadventure200
      @averageadventure200 1 year ago +8

      It would be better if Nvidia was honest about it and simply called it a 4070, rather than giving in to marketing anxiety xD

    • @tuga4684
      @tuga4684 1 year ago +2

      @@averageadventure200 Totally agree with that statement. I hope people don't think it's only a VRAM difference; very shady tactics used by Nvidia. This will work great for AMD. It reminds me of the 3.5GB GTX 970: I bought an R9 390 8GB instead, and it served its purpose until last year.

    • @zkilla4611
      @zkilla4611 1 year ago +6

      I just simply call it a 4070. I will call out anyone who buys it.

    • @Dr.WhetFarts
      @Dr.WhetFarts 1 year ago

      @@zkilla4611 What will you call the 4070 then? It should have been the 4070 Ti, though, since it's not cut down at all.

  • @I.C.Weiner
    @I.C.Weiner 1 year ago +56

    My list of possible reasons for the price, in no particular order:
    1. Cleaning out old inventory. They want to sell the abundant 3000 series, so the high 4000-series prices will push people that way.
    2. They saw what people were willing to pay and are now rolling with it.
    3. They don't believe AMD will be competitive.
    4. They know sales will be lower and investors will expect similar returns.

    • @Hi-levels
      @Hi-levels 1 year ago +14

      It wasn't people, it was miners.

    • @bricaaron3978
      @bricaaron3978 1 year ago +2

      @@Hi-levels *"It wasn't ppl. It was miners"*
      **blink** ... **blink** ...

    • @legendp2011
      @legendp2011 1 year ago

      Nvidia is being charged almost 30% more for silicon by TSMC, and the card costs almost 50% more to make. Nvidia would be losing money selling it below $700; $800 is a more realistic price. (Prices have risen everywhere: the PS5 went up in price, the Quest 2 went up in price, and Apple didn't even put the A16 in the base iPhone 14.)

    • @Chrissy717
      @Chrissy717 1 year ago

      @@bricaaron3978 Well, I think what the other guy wanted to say is that those buyers were miners, not gamers.
      But there are no miners left, so all Nvidia can do is sell their GPUs to gamers, and gamers definitely won't pay these prices.

    • @Emeric62
      @Emeric62 1 year ago +3

      They saw what Apple manages to get away with, so now they sell their GPUs at the same price as an iPhone, and they never decrease the price during the lifetime of the GPU, just like the iPhone.

  • @georged5420
    @georged5420 1 year ago +263

    I always bought Nvidia GPUs, but after the 3000 series, and now the prices of the 4000 series, forget it. I'm more inclined to spend my money with AMD, even out of spite, at this point. I've been holding on to my GTX 1080 for 5 years now. I really just wanted to upgrade. Ridiculous.

    • @cyllananassan9159
      @cyllananassan9159 1 year ago +14

      Problem is AMD isn't any better; it sucks....

    • @neklmae3375
      @neklmae3375 1 year ago +62

      @@cyllananassan9159 Actually it's much better, performance- and price-wise.

    • @cyllananassan9159
      @cyllananassan9159 1 year ago +13

      @@neklmae3375 Not really: ray tracing is horrible, and so are the drivers. They also consume a lot of power, are really expensive, and have low availability.
      Not going to enter the BS fanboy arena fight. I looked at reviews of all the cards on both sides, so I'm not a fanboy of anybody. I went for a cheap 3060 Ti at around $500 in Europe; the equivalent AMD card was more expensive, more power hungry, and had crap ray tracing.

    • @Penguin-gq7mp
      @Penguin-gq7mp 1 year ago +78

      @@cyllananassan9159 You've never used an AMD card lol

    • @taxxzempt3576
      @taxxzempt3576 1 year ago +4

      Feel you 100%, my Windforce 1080 is getting old! I might even consider an AMD crapcard just because fk these prices.

  • @ClayWheeler
    @ClayWheeler 1 year ago +47

    The only major GPU leap was from the GTX 980 Ti to the GTX 1080: the performance uplift was more than 2x, while the wattage was slightly lower.

    • @TheWretchedWorld
      @TheWretchedWorld 1 year ago +7

      1080 Ti*

    • @debore3654
      @debore3654 1 year ago +5

      The 1080 Ti was the better investment imo.

    • @onyachamp
      @onyachamp 1 year ago

      LOL, I'm old.
      The 8800 GTX vs the 7900 GTX was a bigger leap back in 2006.
      The 8800 Ultra was even more.

    • @Dr.WhetFarts
      @Dr.WhetFarts 1 year ago

      A 980 Ti with an OC on a custom card vs a non-Ti 1080 was almost a side grade. 980 Ti to 1080 Ti was a bigger jump, but the 1080 Ti did not OC well compared to the 980 Ti. A 980 Ti at 1500 MHz (which custom cards all hit) is like 30-40% faster than a stock 980 Ti at 1150-1200 MHz. In NO WAY OR FORM is a 1080 Ti twice as fast as a 980 Ti when both are overclocked / custom cards. My 980 Ti ran at 1580 MHz and performed like my friend's GTX 1080 in pretty much all games at 1440p; hell, some I even won slightly. I upgraded to a 1080 Ti and could barely OC the GPU 5%, and the upgrade was like 50-60% at most. The 980 Ti was the most insane overclocker ever released, and if you didn't OC it you simply did it wrong. Even reference cards hit 1450 MHz with ease. That's a 300 MHz bump for free and still a 30-35% performance gain.

    • @TheWretchedWorld
      @TheWretchedWorld 1 year ago

      @@Dr.WhetFarts This is true, the 980 Ti overclocked like crazy on water. Can verify: stock vs stock it was equal to a 1070. If you pushed it to the limits with cooling, the thing was a beast.

  • @worstcat8489
    @worstcat8489 1 year ago +14

    Another thing you'll note is that the 80s and 90s in each series always have at least a 256-bit memory interface. This holds true going back to the granddaddy 1080, which STILL has impressive rasterization compared to some of the 20-series cards. The 4080 "imposter" is every bit a 4070, just named deceptively. If you ever wanted an ironclad metric for what an "actual" 80-series card is, 256-bit is the magic number for the memory interface (that number or higher).
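The "256-bit rule" in this comment can be sanity-checked against published bus widths. A small sketch; the figures below are from memory of public spec sheets, so worth verifying:

```python
# Memory bus widths (bits) for recent 80-class cards, per public spec sheets.
bus_width = {
    "GTX 1080": 256,
    "RTX 2080": 256,
    "RTX 3080": 320,
    "RTX 4080 16GB": 256,
    "RTX 4080 12GB": 192,  # the "imposter": the only one below the 256-bit floor
}

real_80_class = [card for card, bits in bus_width.items() if bits >= 256]
print(real_80_class)  # every card above except the 4080 12GB
```

By this metric, the 4080 12GB's 192-bit bus puts it in 70-class (or lower) territory, which is exactly the commenter's point.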

  • @devindykstra
    @devindykstra 1 year ago +46

    Great analysis of the data we were given. It will be interesting to see if these numbers are representative of 3rd party reviews or if Nvidia is cherry picking these results as well.

    • @cremer2027
      @cremer2027 1 year ago +1

      My thoughts as well on the cherry-picking. I haven't looked at past Nvidia reveals and graphs, but those 3 rasterization games could well be best-case scenarios, meaning a bigger list of tested games could bring those average gains down. This really has me curious to see what third-party reviewers end up finding.

    • @CyberneticArgumentCreator
      @CyberneticArgumentCreator 1 year ago

      Every GPU manufacturer ever has cherry-picked results for marketing materials. AMD is the worst offender here, so this isn't an area where you can knock Nvidia. If you want to knock them, say they are pricing their GPUs too high.
      But if you're doing that, compare apples to apples: compare the current Nvidia generation against the AMD cards closest in performance.

  • @Dellerss
    @Dellerss 1 year ago +13

    The prices seem much worse in Europe. The thing is, the dollar is extremely expensive right now. Over the last week it has become 1%+ more expensive every single day, on top of already being 80% more expensive than in 2013 in some countries. The 4080 is actually twice as expensive as the 3080 in several places. Here in Norway, for instance, because of the exchange rates the price is now comparable to the card costing $2,200+. Hardly anyone is interested in paying that much to play games, especially these days.

  • @GTFour
    @GTFour 1 year ago +4

    That straight to the point, concise and well presented analysis just gained you a subscriber 👌

  • @reddead2067
    @reddead2067 1 year ago +1

    Thank you for this video. Easy to understand.
    I'm very interested in the rasterization improvements from the 3080 to the 4080 16GB.

  • @RainSandAndFlowers
    @RainSandAndFlowers 1 year ago +6

    If I remember correctly, the Titan RTX was about $2,500 back in 2018. Few seem to remember.

  • @flink1231
    @flink1231 1 year ago +42

    Great video! As expected, the fake 4080 is very expensive for what it is. Labeled as a 4070 at $600, in a parallel universe with no excess-stock issues, this fake 4080 would have been a huge success...

    • @ImaMac-PC
      @ImaMac-PC 1 year ago +3

      You're right!

    • @legendp2011
      @legendp2011 1 year ago

      Nvidia is being charged almost 30% more for silicon by TSMC, and the card costs almost 50% more to make. Nvidia would be losing money selling it below $700; $800 is a more realistic price. (Prices have risen everywhere: the PS5 went up in price, the Quest 2 went up in price, and Apple didn't even put the A16 in the base iPhone 14.)

    • @formulaic78
      @formulaic78 1 year ago +1

      @@legendp2011 Doesn't Nvidia have 80% margins on their GPUs? They could, you know, absorb some of that price increase.

    • @legendp2011
      @legendp2011 1 year ago

      @@formulaic78 If that were the case, then AMD or Intel should easily be able to undercut Nvidia's pricing by a significant margin. However, they don't: AMD GPUs are only slightly cheaper, and the new Intel GPUs look like they will also be only slightly cheaper than Nvidia's.

    • @joeperry2065
      @joeperry2065 1 year ago

      @@legendp2011 What do you mean by the PS5 and Quest 2 prices going up? Those are complete setups, not just one component. They literally sell those with no room for profit, and they still cost a lot less than the fake 4080. They make their money from games and apps. I doubt it costs them $600 to make any of their GPUs.

  • @jettsom
    @jettsom 1 year ago +7

    I remember just a few years ago you could have a top-tier GPU for only $649 (GTX 980 Ti). Now, 6-7 years later, prices have more than doubled for the same tier.

    • @jonni1988darthify
      @jonni1988darthify 1 year ago +1

      This. It was like that for 12 years, from 2004 to 2016. Using Nvidia's nomenclature, you'd usually pay $150-250 for an xx60 card, $300-400 for an xx70, $500-650 for an xx80, and up from there. I remember in 2009 I bought my GTX 285 for 480 euros, in 2013 my 760 OC for 360 euros, and my current card, a 1070, for 400 euros. WTF happened? And when you talk with people, they act like the smartphone crowd; they think this is normal. WTF.

  • @Themisterdee
    @Themisterdee 1 year ago +11

    Great bit of pixel counting 👍. I know I would have gone cross-eyed doing that, and also run out of fingers.

  • @mayaalshami2811
    @mayaalshami2811 1 year ago +12

    It's not just the 4080 imposter, it's the real 4080 16GB too: it has a price increase of $500, from $700 to $1,200!
    On top of that, the 3080 had almost the same performance as the 3090, but now the gap between those two cards is crazy.
    It's like they are justifying paying $1,600 for a GPU!

    • @larion2336
      @larion2336 1 year ago +6

      Nvidia have turned into the scalpers.

    • @violent_bebop9687
      @violent_bebop9687 1 year ago +2

      Yeah, the top-end prices will always be crazy. Check back this time next year:
      the 4090 will be on sale like the 3090.

  • @aaronriggs4430
    @aaronriggs4430 1 year ago +8

    This is exactly what I thought when I saw those graphs. The fact that DLSS 3 could be made to run on 20-series cards means you're buying paywall-inhibited software instead of real hardware upgrades with the 40 series to obtain "2-4x performance".

    • @larion2336
      @larion2336 1 year ago +1

      And it only works in games that support it, afaik.

    • @sakracliche
      @sakracliche 1 year ago +1

      DLSS 3 most likely cannot run on older cards; the tensor cores are supposed to be improved by a lot. Yes, fuck Nvidia, but sometimes newer things just mean incompatibility.

    • @aaronriggs4430
      @aaronriggs4430 1 year ago +1

      @@larion2336 Yeah, from what I've read, the graphs on the left are games that don't support RTX, which means the new 4080 is visibly slower than the 3090 Ti. Also, DLSS 3 performance in Cyberpunk is cool, but 22 fps at native res?? 😬

    • @paullasky6865
      @paullasky6865 1 year ago +1

      Correct. It's a software launch dressed up as a hardware launch.

  • @jolly7188
    @jolly7188 1 year ago +1

    Your video is incredibly useful and well done. Keep up the good work, man. Thank you.

  • @everythingrc2324
    @everythingrc2324 1 year ago +1

    Dude I’ve watched a lot of these videos and THIS one is the best breakdown. Thanks man.

  • @williammurphy1674
    @williammurphy1674 1 year ago +4

    Sticking with my 3080 Ti for a long time; playing happily at 1440p works for me! I've always been interested in keeping up with what's new, but less so lately, with the price gouging, chasing deals, waiting on inventory, and upgrading my PSU or mobo to take advantage of the latest tech. I know it's the way, but I have an X470 mobo with a 5800X (2700X as a backup), an 850W PSU (600W backup), and a 4-year warranty on my 3080 Ti (2070 Super FE as a backup). So yeah, now I'm more concerned with which game to play next instead of trying to keep up with the Joneses, and I'm having more fun doing it as well!

    • @marinated5425
      @marinated5425 1 year ago +1

      Yeah, I had to upgrade to B550 because of stability issues. It was either that or reinstall my operating system and spend hours messing with my RAM. If your system works well with your older-gen motherboard, that's great.

    • @williammurphy1674
      @williammurphy1674 1 year ago

      @@marinated5425 Very true! Knock on wood, I've been very lucky so far with my MSI X470 Gaming Plus mobo. It's been so good to me, not one minute of problems, even though most people say it's a meh mobo! I heard the X570 Gaming Plus isn't too good though, which makes me wonder. I built my son a PC with an MSI X570 Tomahawk WiFi a few months back and he's very happy with it so far as well. It's all a roll of the dice, with a dash of luck, when it comes to electronic components actually playing nice with other electronic components! Good luck with your new mobo, and may it give you many years of peace of mind!

    • @Martin23Kings_
      @Martin23Kings_ 1 year ago

      @@marinated5425 What issues were you having? I have a 3080 Ti on an MSI B450 with no issues.

    • @mouwa7793
      @mouwa7793 1 year ago

      Me, I have the 3090 and I'm more than happy with it. I just built my PC recently, so I have no reason to upgrade.

  • @DarthFurball
    @DarthFurball 1 year ago +3

    It will be interesting to track this over the next few months, given the downturn in GPU market demand, coupled with AMD and Intel cutting in with new offerings. This could be a big mistake for Nvidia, especially with the bad press and current overstocking issues with the 30 series...

  • @geeknproud321
    @geeknproud321 1 year ago +5

    5:36 I just did that exact thing: a Gigabyte RTX 3080 Gaming OC, $575 shipped from eBay. I kept a 1070 since release, and it was a great 1440p card for years, until things like Metro Exodus and RDR2 came out. It lived through FIVE CPUs before being replaced and allowed meaningful gains from each. My brother gave me his 1080 Ti in May 2022 when he upgraded, and it's a big jump from the 1070, but just not enough anymore. A base 10GB 3080 is nearly twice as fast as a base 1080 Ti, and for $550 before shipping I'm finally happy with the price. I mean, you always paid this much for a regular 1080 when they were current, and people were happy with them. The performance jump from the 3000 series is big. While they DO suck power, that's why I chose the 10GB 3080: it has the lowest TDP.

    • @vMaxHeadroom
      @vMaxHeadroom 1 year ago +2

      Enjoy! The RTX 3080 is a mighty GPU and should see you through for a long time, and at that price it's absolutely great! I managed to pick up a new ASUS TUF RTX 3080 12GB for GBP 699 a while back and I'm over the moon. Most won't even need a 4080 unless they're hardcore 4K high-refresh and RT gamers; for the rest, playing at 1080p/1440p high refresh or even 4K 60Hz, the 30 series is stellar, and I'm not even including DLSS yet.

    • @geeknproud321
      @geeknproud321 1 year ago +1

      @@vMaxHeadroom It seems the 4000 series isn't that great anyway, so the 3080 stays relevant for at least one extra generation... The same thing happened with the 1000 series: 2000 wasn't a big jump, so 1000 stayed relevant as the choice that made financial sense. Likewise, I jumped from Haswell to Coffee Lake and hit the same wall, haha. I keep hitting sweet-spot hardware.

  • @josetrigueiro5978
    @josetrigueiro5978 1 year ago +3

    I think you are mistaken. The leftmost graphs are from games that use RTX, but not DLSS. This means that even these graphs do not represent the rasterization gains of the RTX 4000 series over the RTX 3000 series, since we know the new RTX 4000 generation will see big performance improvements when using RTX. I could be wrong, but the pure rasterization gain of the RTX 4000 series is probably much smaller than the numbers in these graphs.

    • @dhaumya23gango75
      @dhaumya23gango75 1 year ago

      AC Valhalla and The Division 2 on the left side lack both RT and DLSS implementations, so that invalidates your point. It's a gain in pure rasterization.

    • @josetrigueiro5978
      @josetrigueiro5978 1 year ago

      @@dhaumya23gango75 It looks like you are right! Apparently, for those two games only reshade mods exist that simulate ray tracing. Sorry, my mistake.

  • @sneedtube
    @sneedtube 1 year ago +1

    Finally, a very informative video giving raw numbers on performance increase vs price point. Really appreciated.

  • @brianjones6465
    @brianjones6465 1 year ago +13

    I will never, ever call that 4080 12GB a real 4080. It's a 4070, and Nvidia is violating the RICO Act.

    • @totallynotthebio-lizard7631
      @totallynotthebio-lizard7631 1 year ago +6

      When you’re rich, you don’t follow laws, you follow guidelines.

    • @johngray3449
      @johngray3449 1 year ago

      4080 16gb = 4070
      4080 12gb = 4060
      Nvidia really knows how to game people.

  • @averageadventure200
    @averageadventure200 1 year ago +7

    There's seriously little that can compete with the 30 series on price.
    I bought my 3080 10GB for $722, 4 seconds after it became available for purchase, and at 10% under the MSRP here in Norway, where the cheapest was $800 MSRP.
    Sadly the GPU never came, and I had been waiting for so long that I totally forgot I had even purchased the thing, until I got a surprise parcel at my door. I thought I had bought something while drunk to warrant such an unknown parcel.
    When I opened it I almost jumped through the roof! Finally, in front of me, a shining 3080! Only scalpers had them at that time; the GPU took exactly 4 months to arrive. When I installed it, the performance jump blew me away. Coming from an old 1070 SLI setup that had low fps at 1440p, with few games having SLI support, I was ecstatic. I really couldn't afford the price of the GPU as a student with barely any loan income, so I decided that when I wasn't gaming I would let my PC run a mining program; my $722 came rolling back in just 71 days. The new GPU doubled as a heater when mining, so it let me turn off my portable heater; the 3080 created just the right amount of heat to keep my room at a comfy 21°C during the cold months. In the end, the 30 series got me a free GPU which then made me money; the 3080 paid for itself 6 times! That's a price-to-performance mark I doubt the 40 series can beat.

    • @Isaax
      @Isaax 1 year ago +1

      I have a similar story. Truly the right GPU at the right time.

    • @a64738
      @a64738 1 year ago +2

      You were very lucky... I only got my hands on an RTX 3060 and paid the same price. But then again, I sold my old GTX 1080 Ti for the same as I paid for the new card, and it has made more in crypto mining than it cost :)

    • @larion2336
      @larion2336 1 year ago

      I guess that's a major perk of living in Norway. If I leave my GPU on at 100% usage in Australia my room becomes an oven.

    • @fr33kSh0w2012
      @fr33kSh0w2012 1 year ago

      @@larion2336 Same here; we need liquid nitrogen / liquid helium to get the cards down to ROOM TEMPERATURE levels!

  • @Michealxlr
    @Michealxlr 1 year ago

    Very good analysis. Pretty informative!

  • @abelarunus3727
    @abelarunus3727 1 year ago +3

    Exactly what I was studying about the latest 4000 series. The comparisons are good, but when you go to the used market for the 3000 series, the value is just incredibly high right now. Bottom line: the 4000 series is too costly for its performance, even in a new-to-new comparison, so the used market is the best bet now!!!!

  • @Kxrben
    @Kxrben 1 year ago +3

    I just bought an 11600K and an MSI Z490 mobo for about $250, and an EVGA 3080 Hybrid for $550. As far as upgrades from my 1060 go, I'm more than satisfied for a while; $1,200 for a 4000-series card seems steep af, even with the better RT and DLSS 3.

  • @notvin
    @notvin 1 year ago

    Excellent breakdown. Thank you so much!

  • @pfln
    @pfln 1 year ago +8

    Nvidia is going to hurt after this launch. People aren't spending as much, scalpers are out of business, and mining is at an all-time low right now. Their sales are going to tank along with the MSRP. Hopefully this changes for the 50 series.

  • @simonfil2
    @simonfil2 1 year ago +4

    I think I'll be happy with my 3080 for a long time. With even more performance you start to venture into 4K at higher than 60 fps, and sure, that sounds really nice. But I'm very happy to tweak some settings, set the resolution scale to about 83%, and get basically the same experience.
    But we're in a golden age of CPUs, so my next upgrade is definitely going to be a CPU and motherboard switch, and maybe a move to DDR5 RAM.

    • @error_nr-1232
      @error_nr-1232 1 year ago

      Why would you want to upgrade your CPU if the bottleneck is clearly the GPU (at 4K)? Wasted money in my opinion, unless you're a content creator of some sort.

    • @simonfil2
      @simonfil2 1 year ago

      @@error_nr-1232 The RTX 3080 is powerful enough to actually bottleneck my 8700K in a lot of games, because it can push really high fps even at 4K. I also play a lot of VR games where I want 144 fps at as high a resolution as possible, and there my 8700K is reaching its limits. I'm also doing a lot of content creation.

  • @Snoop_Dugg
    @Snoop_Dugg 1 year ago +5

    This is actually really good value. You're getting a free heater with your card. Perfect for this winter.

    • @sakracliche
      @sakracliche 1 year ago +3

      To be honest, I'm kinda hoping to heat the place up a bit with a new GPU.

    • @kosajk
      @kosajk 1 year ago

      The card isn't that cheap to start with, and it takes some wattage to generate that heat, not to mention the PSU you'd need.

  • @bonbondesel
    @bonbondesel 1 year ago

    Super video!!! Thanks for that!

  • @FaisalGH002
    @FaisalGH002 1 year ago

    Wait, what about those Warhammer 40K and Microsoft Flight Simulator numbers? Were those DLSS or raw fps comparisons?

  • @_google_user_
    @_google_user_ 1 year ago

    Where is Racer X available to download?

  • @Onb3k3nd3
    @Onb3k3nd3 1 year ago

    So which card is the best to get or have???

  • @Dybvad1992
    @Dybvad1992 1 year ago

    This video deserves a thumbs up and to be shared. Loads of valid information.

  • @taxxzempt3576
    @taxxzempt3576 1 year ago +10

    When I got my 1080 it was a nice upgrade from my 780, and I paid about $620 a month after release.
    Still using that GPU. I'd love to upgrade to a new card, but I'm not buying anything at a stupid price.
    I hope they crash and burn for raising prices so high.

    • @loki76
      @loki76 1 year ago

      I'm in the same boat, but I need a new GPU; the GTX 1080 isn't viable anymore, and I don't want to play games at a shitty resolution and graphical settings. So I have to get a new card.
      I have the money to do it.
      I'm waiting for reviews and for what AMD announces on performance and cost. As it stands, only the 4090 has sparked my interest, even though it's very costly.
      I'd go with it if AMD has nothing better.

    • @marinated5425
      @marinated5425 1 year ago

      The 3080 is a worthy upgrade.

    • @taxxzempt3576
      @taxxzempt3576 1 year ago +3

      @@marinated5425 When I can find a brand new 3080 for $600, I'll happily pay.

    • @larion2336
      @larion2336 1 year ago +1

      Agreed. I upgraded from a GTX 580 to a 1070 Ti, and that's what I'm still using. I was hoping to upgrade again this generation, but Nvidia has gotten so damn greedy that I'm more inclined to look at AMD's offerings at this point.

    • @sakracliche
      @sakracliche 1 year ago

      They crashed; that's why Lovelace is expensive and why it's purposefully not a good deal. Get a 3000-series card, it's the best bang for your buck atm :)

  • @WayStedYou
    @WayStedYou 1 year ago +7

    The shader-count gap between the 3080 and 3090 Ti is smaller than the gap between the 4080 12GB and 16GB.

    • @nossy232323
      @nossy232323 1 year ago

      Nvidia didn't want to make the same mistake again, so now there is a huge gap between the RTX 4080 16GB and the RTX 4090.

  • @emp1985
    @emp1985 1 year ago +18

    Just be careful with second-hand cards equipped with GDDR6X. Those VRAM ICs run extremely hot under mining, so in these particular cases, avoiding mined cards is a good idea.

    • @CsFanatiker
      @CsFanatiker Před rokem +1

      So basically miners spoiled the used GPU market, because people are not honest enough to tell you the truth about the GPU they are selling you. We are buying a pig in a poke with all these used 3000-series cards.

    • @bionic1matt
      @bionic1matt Před rokem

      mining cards aren't even that bad. most of the time you'd just need to replace the thermal paste and you're good to go. The life may be reduced but they're still fine

    • @Azuria969
      @Azuria969 Před 6 měsíci

      lol bs nice try tho, those used cards are worthless

  • @aquaneon8012
    @aquaneon8012 Před rokem

    How much cheaper can we expect these graphics cards to get by Black Friday, assuming there are no shortages?

  • @iceburglettuce6940
    @iceburglettuce6940 Před rokem +4

    You seem to have missed the main issue with these numbers.
    These numbers are skewed by the small print. The small print notes DLSS is used where it can be.
    Therefore the biggest wins for the new gen may well be using DLSS 3.0 against DLSS 2 for the 30 series (as they are not allowing 30 series to run DLSS 3.0).
    We know that DLSS is actually adding in unrendered frames to make the fps appear faster.
    This may or may not work well.
    This will account for the “2x” or “4x” performance.
    Nvidia have misled before about new card performance.
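The comment's point about DLSS making fps "appear faster" can be sketched with a toy model (a simplification for illustration only, not NVIDIA's actual pipeline; the function name and numbers are invented):

```python
def effective_fps(rendered_fps: float, generated_per_rendered: int = 1) -> float:
    """Displayed frame rate when each truly rendered frame is followed by
    N AI-generated frames (the DLSS 3 frame-generation idea, simplified)."""
    return rendered_fps * (1 + generated_per_rendered)

# A card rendering 60 real frames per second shows 120 fps with one
# generated frame per rendered frame; input latency still tracks the
# 60 real frames, which is why "2x fps" charts can mislead.
print(effective_fps(60))  # -> 120.0
```

This is why comparing a 40-series card with frame generation against a 30-series card without it inflates the apparent generational gap.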

  • @treymtz
    @treymtz Před rokem +16

    AMD has an opening to drive a knife through Nvidia this coming generation. I highly doubt they'll over take in popularity but they can make significant market share progress. Currently AMD has around 20% and team green 80% of the gpu market. I can see this shifting as high as 40/60 realistically if this is the kind of pricing Nvidia wants.

    • @sephirothcloud3953
      @sephirothcloud3953 Před rokem

      Yes, 45-50%, because many won't buy AMD anyway and are waiting for lower Nvidia prices, or skipping this generation and waiting for the 50 series. Moreover, there won't be AMD shortages, because it uses the same node as Ryzen, so AMD prices will be steady.

    • @newyorknewyork933
      @newyorknewyork933 Před rokem

      You have to understand that AMD would not take large shares (think 40% or more) of the market, because majority of consumers of medium to high end cards are tech corporations, content creators, research organizations, universities, they need the CUDA cores of Nvidia cards to do their jobs, renderings, Data searching, AI testings, etc.
      They don't care how the price of cards increase, because what ever the price would be, they need to buy it.

    • @newyorknewyork933
      @newyorknewyork933 Před rokem

      AMD GPUs don't have CUDA cores, and AMD doesn't have the funds to spend on the industry ecosystem, where Nvidia has spent over 20 billion dollars.

    • @sephirothcloud3953
      @sephirothcloud3953 Před rokem

      @@newyorknewyork933 Exactly, I'm one of them, since I use neural computing, but it all depends on the price, and I'm telling you, I don't want to be robbed. I'd prefer to buy a used 30-series card, and I'm seriously looking at alternatives from AMD. I never thought I would be forced to do this, but I am. Every preview on the internet is saying this is a robbery; at these prices, even for small companies, Hopper is better priced. Unless a company has bad IT management, at these prices you risk getting fired because the ROI is too low. Only the 4090 makes sense. It all depends on AMD's performance and price now. PC gamers are still the majority of the pool (80%+), and they don't like to be losers either: when they see the price and see AMD is the same for $300 less, it's fuck Nvidia, or their friends will call them stupid. If Nvidia only loses 30% of the pool it's a victory for them, and it could be even higher. In the end, all our reasoning may be wrong, because Nvidia will cut prices for one reason only: TSMC already told Nvidia that they won't cut production, and Nvidia must pay for every order it made. It's better for Nvidia to earn 10% than to lose 100% on every unsold card.

    • @BigYella
      @BigYella Před rokem

      @@newyorknewyork933 I suspect gamers who are not also doing production work with their rig would gain the most from switching then? I personally like AMD products. I'd rather buy a Mac Mini/Studio for editing and rendering than purchase marked-up Nvidia cards.

  • @Flightcoach
    @Flightcoach Před rokem

    amazing video. thanks for going so deep. coming from a fellow youtuber i take my hat off: you managed to do what so many youtubers fail at: finding a niche and ROCKING IT!

  • @biggnasty989
    @biggnasty989 Před rokem

    Thumbs up for the slick outro music!
    I feel like I am going to get hosed either way. I paid 1400 for my shady back-alley 3080 purchase, and now I need more to run my Pimax 5k at 120 fps. I feel like I'll have to sell my car to finally be able to recreate the level of fidelity in the U.S. Air Force V-22 Osprey simulator I got to try out all the way back in 2005.
    It is true that the civilian world is about 20 years behind the military in simulation/gpu power.

  • @ngreediagraphics2858
    @ngreediagraphics2858 Před rokem +1

    Excellent job! Subscribed!

  • @mpointe1978
    @mpointe1978 Před rokem +2

    Best video on this topic I've seen. And I already watched A LOT (including the "big" channels)

  • @greenbow7888
    @greenbow7888 Před rokem +1

    4:50 which is down to rampant inflation.
    5:48 ... what's worrying about these specs is that they were chosen for this purpose. Across more games, the 4080 16GB vs 3090 Ti comparison might not favour the 4080 as much.

    • @averageadventure200
      @averageadventure200 Před rokem +1

      The 4080 12GB should be compared to the 3070, which was a $499 GPU. Inflation might be high, but not 80% in two years. Nvidia got used to higher profits and seems to want to keep their new and improved salaries; this is just like the 20-series launch all over again. xD
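The arithmetic behind that "80% in two years" point can be sketched as follows (launch MSRPs are taken from the thread; the annual inflation rate is an assumed round figure, not an official statistic):

```python
def pct_change(old: float, new: float) -> float:
    """Fractional change from old price to new price."""
    return (new - old) / old

msrp_3070 = 499        # RTX 3070 launch MSRP (2020), per the comment
msrp_4080_12gb = 899   # RTX 4080 12GB announced MSRP (2022)

price_jump = pct_change(msrp_3070, msrp_4080_12gb)
# Assume roughly 8% annual inflation compounded over two years.
two_year_inflation = 1.08 ** 2 - 1

print(f"Price jump: {price_jump:.0%}")              # -> 80%
print(f"Two-year inflation: {two_year_inflation:.0%}")  # -> 17%
```

Even under a generous inflation assumption, the price jump dwarfs the inflation adjustment, which is the commenter's point.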

  • @user-lk5kn2tr7k
    @user-lk5kn2tr7k Před rokem +1

    Amazing analysis.

  • @rickgerritsen8972
    @rickgerritsen8972 Před rokem

    I guess announcing high prices for the newer 40xx generation is a marketing strategy to sell more of the 30xx series prior to the start of 40xx sales.

  • @flukve3574
    @flukve3574 Před rokem

    Great analysis, thank you!

  • @RyviusRan
    @RyviusRan Před rokem

    I got a 3090 at release for msrp to replace my 1080ti. Performance from the 1080 ti to 3090 was more than double which made me happy. If they can achieve the same doubling or better going from a 3090 to 4090 in normal rasterization performance then I wouldn't mind going for a 4090.
    I have my doubts about a doubling of raster performance because that would be the single biggest jump in one generation for a GPU since early 3d graphics accelerators of the 1990s.
    The 980 Ti to 1080 Ti was only around 65-70% better, as was the 7800 GTX to 8800 GTX.

    • @coffee7180
      @coffee7180 Před rokem

      You're telling me you had to wait for another generation and pay more than double the price to get a card worth an upgrade?

  • @timothygibney159
    @timothygibney159 Před rokem +5

    Glad I got the 3080 Ti. The leaked benchmarks are with DLSS 3.0, which cheats in favor of the 4xxx series, so I'm guessing it's about as fast as the 4080 12 gig.

    • @superpulaski9767
      @superpulaski9767 Před rokem +5

      I’ll take the 384-bit bus width over the renamed 192-bit “4080”.

    • @Dark88Dragon
      @Dark88Dragon Před rokem +1

      Yeah, it should be around this performance; more than a bad joke for an expected €1.2k in Germany at release... thank you Nvidia lol

  • @mogadanez
    @mogadanez Před rokem

    Am I missing something, or is performance compared to the 3080 Ti while price is compared with the regular 3080?

  • @KaiThouKing
    @KaiThouKing Před rokem +7

    The thing is these new cards only matter if you want to use DLSS 3.0 + RTX that's the only huge difference in fps

    • @zkilla4611
      @zkilla4611 Před rokem

      They are still 25-35% Faster without DLSS3

    • @KaiThouKing
      @KaiThouKing Před rokem +8

      @@zkilla4611 That is not a large jump for something so expensive

    • @zkilla4611
      @zkilla4611 Před rokem +1

      @@KaiThouKing I agree. Hopefully AMD brings the heat this generation. It would be nice to see them put Nvidia in their place.

  • @RollyPolly74
    @RollyPolly74 Před rokem

    What is the "world-building" software?

  • @Ph0neutria
    @Ph0neutria Před rokem +2

    Damn, very likely I'll just buy an RTX 3080 12GB.
    I hope there will be Black Friday deals.
    I might look at used cards, but in Germany, for some reason, people bid very high prices for used cards; I'd say it doesn't make sense to buy used.

  • @martinhsl68hw
    @martinhsl68hw Před rokem

    I guess you have to include the current general levels of inflation also

  • @sameerpharma4906
    @sameerpharma4906 Před rokem

    great breakdown!

  • @ranakirti
    @ranakirti Před rokem

    Subscribed for this video.... U r a confirmed Master.

  • @Kyleplier
    @Kyleplier Před rokem +1

    For me personally going to the 4090 would actually be a massive increase in performance. Currently I'm on an RTX 2070 laptop connected to my LG CX OLED via a Mini DisplayPort 1.4 to HDMI 2.1 cable. If I'm not mistaken a laptop 2070 is equivalent to an RTX 2060 desktop GPU. Going to an RTX 4090 likely will be quite a jump in performance.

    • @Zalkryn
      @Zalkryn Před rokem

      The ports on your LG CX are 2.1, but imagine how pissed I was when I bought a 2080 Ti only to find out it was a non-A chip, so I can't flash other cards' BIOSes onto it, the power limit is capped (manual OC only), AND the card is HDMI 2.0b!?!? I know Nvidia knew and could have been ready to give 2000-series GPUs the 48 Gbps... err, 40 Gbps... I have the CX as well. Then a month ago it stopped working. The BIOS and Device Manager do not detect my $1200 2080 Ti brick!! I don't know the hardware ID for it either... forcing me to buy a 3000-series card if I want the 48 Gbps that HDMI 2.1 bandwidth offers... :(

  • @NETopham
    @NETopham Před rokem +1

    Your video has perfectly captured how I felt about this launch, big price increase and generational speed increase doesn't look that great. Looks like I'm saving my money for team red or 2nd hand bargain.

    • @mikaelborjesson3339
      @mikaelborjesson3339 Před rokem

      I'm with you. I'm also looking at Intel, since I don't need an upgrade anytime soon.

  • @redcobra1246
    @redcobra1246 Před rokem

    The 3 benchmarks on the left could be using dlss 3.0 on the 4000 series. Dlss 3.0 works with any game that has TAA, not just games that support dlss

  • @84jdgregory
    @84jdgregory Před rokem +1

    There are people out there that think the 4x-as-fast claim is correct. When you look at Cyberpunk though, the 3090 was running with ray tracing and no DLSS, while the 4090 was running with DLSS 3. They're fudging the numbers a lot on these new cards.

  • @SparlanxD
    @SparlanxD Před rokem +1

    Soooo, should I buy a 3090 now, or should I wait for the 12GB 4080? (Same price between these two here in Germany.) What would you guys do?

    • @noThing-wd6py
      @noThing-wd6py Před rokem +1

      Thinking about getting a 3090 Ti... or waiting until after Christmas and seeing where the 4080 16GB lands then. I still have a 1080.

    • @SparlanxD
      @SparlanxD Před rokem

      @@noThing-wd6py I got myself a 3090 now; I have to say I'm very happy with it. I think I'll wait for the 5000 series; until then my 3090 should be enough (I got it for €1049). The performance together with the Ryzen 7 5800X3D is simply great :)
      I don't think the 4080 16GB will get significantly cheaper in the coming months :D

  • @heatnup6899
    @heatnup6899 Před rokem

    You deserve more subs!

  • @birdsoup777
    @birdsoup777 Před rokem

    Did you used to have a video game review channel?

  • @zackw4941
    @zackw4941 Před rokem

    I bought a 3090 FE for MSRP back in January of 20... It had about the same price to performance ratio as my outgoing Vega 64.

    • @ImaMac-PC
      @ImaMac-PC  Před rokem +1

      Now that's an upgrade!

    • @zackw4941
      @zackw4941 Před rokem

      @@ImaMac-PC It was noticeable ;)

  • @user-mm7ot5zq2c
    @user-mm7ot5zq2c Před rokem

    To be more precise: in addition to upscaling, DLSS 3 also uses Optical Multi Frame Generation to generate entire individual frames based on the frames prior to upscaling. Optical Multi Frame Generation still operates at reduced resolution; without first reducing the amount of data that needs to be processed (i.e., the resolution), the acceleration would be impossible. In other words, DLSS 3's upscaling technology is better, but the image-quality defects that DLSS 2 introduced will still exist; the degree of the defects just decreases in DLSS 3.
    DLSS 3's AI is mainly based on analyzing and predicting the motion of every object in the game, but whenever there is an unexpected input, DLSS 3's image defects will appear. That's why the DLSS 3 footage nVidia has shown is all moving straight ahead: the inertia of straight-ahead motion is unlikely to be mispredicted. After holding a straight line for a while and then making a sudden turn, however, the AI will make mistakes, and changing direction is perfectly normal for a player.
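The sudden-turn failure mode described above can be illustrated with a toy linear extrapolator (a deliberate simplification for illustration, not NVIDIA's actual optical-flow algorithm):

```python
def extrapolate(prev: float, curr: float) -> float:
    """Predict the next position by linear motion extrapolation,
    the simplest analogue of motion-based frame generation."""
    return curr + (curr - prev)

# Steady straight-line motion: the prediction is exact.
positions = [0.0, 1.0, 2.0]
print(extrapolate(positions[0], positions[1]))  # -> 2.0, matches reality

# Sudden turn: the player reverses, but the extrapolator keeps going straight.
actual_next = 1.0                   # player turned around
predicted = extrapolate(1.0, 2.0)   # -> 3.0, a visibly wrong generated frame
print(predicted, actual_next)
```

Any prediction-based frame generator faces this same tradeoff: smooth inertial motion predicts well, abrupt direction changes do not.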

  • @matejjenko4209
    @matejjenko4209 Před rokem

    the game in the end name?

  • @kevykevTPA
    @kevykevTPA Před rokem

    What is that model car racing game?

  • @snakeplissken1754
    @snakeplissken1754 Před rokem

    The weird part: if you look at the nVidia page for the RTX models, it also shows two different 3080 GPUs, 12GB and 10GB RAM sizes, with different shader counts (8,960 / 8,704).
    That said, I've never checked whether actual cards have that difference or if it was just a typical OEM ripoff.

  • @boronat1
    @boronat1 Před rokem

    Perfect analysis, thanks!

  • @tomduke1297
    @tomduke1297 Před rokem

    did the same thing after the livestream, good to have confirmation. 4090 it is, thnx! ;)

  • @RainSandAndFlowers
    @RainSandAndFlowers Před rokem +3

    It is all well and good to talk about the flagship 4090 GPU all the time, everywhere, when in reality most people are struggling to justify $350-450 GPU purchases and a lot of people are still very happy with 1080p gaming.
    Also, every year there is talk of MSRP, and it seems that people mostly get disappointed by reality.
    We might yet see a $2000 4090!

  • @jrgensollie2641
    @jrgensollie2641 Před rokem

    Great job!

  • @jonathonschott
    @jonathonschott Před rokem +2

    Regarding this launch, I called one thing spot on: that they would put more emphasis on ray tracing, and for that I commend them; work that architectural advantage over your competitor. Everything else about this launch is just plain poorly thought out, down to the omission of DisplayPort 2.0. I remember being jealous of Maxwell until I finally got one, being blown away by Pascal, petulantly hopeful when they added ray tracing with Turing, and, despite fighting the shortages, over the moon when I finally got my 3090 (24GB on a consumer card, what?). But since Turing they have been getting greedier and greedier; just look at their NASDAQ ticker to see that. There was a good chance I'd be sitting out this generation anyway after just getting a 3090, but after this dumpster fire of a launch that decision has been made for me.

  • @netomilt
    @netomilt Před rokem +2

    Quality work. That's why I bought a 3080 this week brand new for $600 bucks. Still got an eye on AMD to see what they come up with.

    • @chazcarpenter1559
      @chazcarpenter1559 Před rokem

      Where from? I will probably snag one as well

    • @jimmyramos1989
      @jimmyramos1989 Před rokem

      @@chazcarpenter1559 Newegg seems to be your best bet. Micro center has tons of stock but not at that price to my knowledge.

    • @chazcarpenter1559
      @chazcarpenter1559 Před rokem

      @@jimmyramos1989 Preciate that!

  • @drumsmix33
    @drumsmix33 Před rokem

    Great analysis!

  • @perigil
    @perigil Před rokem

    I was confused: the NVIDIA chart was comparing the 40 series to the 3090 Ti... right?

    • @ImaMac-PC
      @ImaMac-PC  Před rokem

      nVidia chart did compare to the 3090 Ti. I compared to the replacements, the 3080 and 3090.

  • @primary7208
    @primary7208 Před rokem +1

    I'm happy with my 6950 XT. Maybe AMD will drop some bangers with the 7900 XT and I'll sell my 6950, but if it's not a big increase in performance, I have no issue getting 250 fps at 1440p on high settings in every game I play (Apex and Warzone, mostly).

  • @WayStedYou
    @WayStedYou Před rokem

    6:57 actually it's smaller if you go by Nvidia's own website specs for the 3090 cooler.

  • @diegosilang4823
    @diegosilang4823 Před rokem +2

    If you add DLSS 3.0 on 30 series, the 40 series FPS advantage is much smaller.

    • @BigYella
      @BigYella Před rokem +1

      Do you think that is why they will not offer it on the 30 series GPU?

  • @SilverHand-fu1jn
    @SilverHand-fu1jn Před rokem

    Should I buy a 1-year-old, probably-mined card for $400 with a 2-year warranty? (A new one is $650 here due to tax.)

  • @VakmanCA
    @VakmanCA Před rokem +2

    With this level of pricing, insane power requirements, and 5G technology becoming broadly available and cheaper, we are near the end of graphics cards for gaming. Cloud gaming will soon become the preferred way for most gamers. Give it a few years and the performance/latency ratio will be on par with current "physical" offerings for a fraction of the price. Nvidia knows it very well and is trying to capitalize on its remaining consumer products. Three or four generations down the road, Nvidia will provide graphics cards mostly to datacenters, and that will be a lot less lucrative than the current model. The "pay as you play" model, or even the subscription model, will be a tough one for Nvidia, because they do not dominate the cloud market.

    • @Voklesh85
      @Voklesh85 Před rokem

      Personally, I absolutely disagree.
      Cloud gaming, especially for PC gamers, is not yet remotely able to replicate the performance of physical hardware, and as Stadia has shown, it is unattractive and expensive even for a company like Google to maintain, which translates into high costs for players.
      Not only that: PC gaming is also about passion, not just performance. I "want" to have my PC to assemble, modify and care for.
      The last factor is the experience tailored to the user. If you play on PC and not on a PS5 or One X, it is because you want to play in 32:9, or maybe at 320+ Hz, or in 8K, etc.
      The best thing about PC gaming is that everyone plays the way they want and with what they want, and a streaming service won't be able to give you that for many years to come.

  • @billwiley7216
    @billwiley7216 Před rokem

    A big part of the cost increase is because of the extremely high cost per wafer from TSMC on the node this card is built on.
    This has been covered fairly extensively from other sources.
    Another is the need to keep 30 series cards prices more stable and relevant, again these cards cost more to produce than AMD's cards so they cannot sell them as low without taking huge losses.
    Even at the current under-$1100 pricing of the 3090 Ti, the 4090, with a performance improvement of roughly 70% over the 3090 Ti, is in line at its $1599 MSRP.
    What kills this for a gamer doing an upgrade rather than a new build is that this card requires the new-standard PSU; without the sensor communication to the PSU, the card will throttle its power draw, affecting its performance, as I understand it.
    For someone upgrading, this adds roughly another $300 to the cost, making the 4090 a $1899 upgrade, not a $1599 one.
    Not to mention that even though you may already own a less-than-a-year-old $300+ PSU, for this upgrade it is basically a paperweight if you want the maximum available performance out of your 4090.
    As I understand it, without the communication standard the card will revert to a preset minimum power-draw level.
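The value math in that comment can be sketched in a few lines (the 70% uplift, the $1100/$1599 prices, and the $300 PSU figure are all taken from the comment itself, not verified benchmarks):

```python
def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Relative performance divided by price; higher is better value."""
    return relative_perf / price

# Assumed figures from the comment: 3090 Ti = 1.0x perf at ~$1100,
# 4090 = 1.7x perf at $1599, plus ~$300 for a new-standard PSU on an upgrade.
base = perf_per_dollar(1.0, 1100)
new_card = perf_per_dollar(1.7, 1599)
upgrade = perf_per_dollar(1.7, 1599 + 300)

print(f"4090 vs 3090 Ti perf/$: {new_card / base:.2f}x")   # -> 1.17x
print(f"...including a new PSU: {upgrade / base:.2f}x")    # -> 0.98x
```

Under these assumptions the 4090 is a modest value win as a bare card, and roughly a wash once the PSU cost is folded in, which matches the comment's conclusion.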

    • @mpadlite2925
      @mpadlite2925 Před rokem

      Yea... high BOM + high margin == ludicrous pricing. :)

    • @dracer35
      @dracer35 Před rokem

      @@mpadlite2925 Exactly. Nvidia not willing to give up even a tiny bit of that 60% margin they require. 40 series is a hard pass for me. They can keep their high priced GPUs and not get a penny out of me.

  • @trumpingtonfanhurst694
    @trumpingtonfanhurst694 Před rokem +3

    I can't wait until the 4000 series cards come out - not so I can buy one, but so I can maybe finally afford to buy a 3000 series!

    • @floppydisk1137
      @floppydisk1137 Před rokem

      I'll wait until AMD announcement of 7000 series.

  • @enzzz88
    @enzzz88 Před rokem

    Nice analysis; leaning more and more towards buying a 3080 and retiring my 1080 Ti.

    • @MrZodiac011
      @MrZodiac011 Před rokem

      Depends on what you're doing; you might be better off with a 3080 Ti or 3090, since the 1080 Ti has 1GB more VRAM than the 3080, and 11GB is already running out in a few games at 4K and getting low at 1440p; 12GB probably won't last long either.

  • @stevoone342
    @stevoone342 Před rokem

    Personally, I've just picked up a brand new 3090 Ti for around £900, which I'm happy with. I'll probably look at the market this time next year to save a lot of money 💰. I'm also concerned about the actual performance of the 40 series, since the comparisons didn't state the graphics settings used, nor did they compare the current ray tracing and DLSS to gen 3. In short, I have a lot of questions, and little will be known until the reviews start showing real-world performance. Yes, it will be better, but at what cost to performance?

    • @a64738
      @a64738 Před rokem

      Considering the RTX 2080 Ti never dropped under £1000 here in Norway, I'd say an RTX 3090 Ti at that price, with as much RAM as it has, is an okay price... Here in Norway you cannot even find an RTX 3080 Ti at that price...

    • @zachoons
      @zachoons Před rokem

      I also picked up a little 'ol 3090 TI ;) Good purchase!

    • @bigdaddyschin5692
      @bigdaddyschin5692 Před rokem

      @@zachoons yup about to do the same. I have a 3060 now. But I wanna play games in 4K

    • @noThing-wd6py
      @noThing-wd6py Před rokem

      @@a64738 In Germany you can get an RTX 3090 Ti for €1250 incl. tax. What are the prices in Norway?

  • @davids8345
    @davids8345 Před rokem

    lol @ "for those that live in the omniverse"
    so... basically, if you have more money than sense, then buy the 4090; otherwise pick up a 3080 (and if you want to give Nvidia the middle finger, buy second hand)
    great analysis - you earned a sub!

  • @Elue-mtb
    @Elue-mtb Před rokem

    Can you make a video with the European prices, covering whether it's worth buying or not?

    • @ImaMac-PC
      @ImaMac-PC  Před rokem

      Whether something is worth is to buy is based on a multitude of factors (e.g., use case, disposable income, do you require that GPU to do your job faster, etc) of which one of them is price. All I can do is provide the information and each buyer has to determine if it is worth it to them. My advice is that, until you know it is absolutely worth it for your use case, don't buy.

  • @TABSF1
    @TABSF1 Před rokem

    If you're already getting 60+ fps at 4K in rasterisation, why would you be disappointed with a limited rasterisation upgrade?
    I'm more interested in RT performance.

  • @maxstepaniuk4355
    @maxstepaniuk4355 Před rokem

    Answering the question of why the new cards are so expensive: this was done to force you to buy from the huge stock of 3xxx cards. Since the 4xxx release is still far off and can be postponed, these high prices motivate you to buy old stock until it is completely depleted. Once the old stock is depleted, they can announce lower prices for the 4xxx cards. These high prices for the next generation may also stir up prices on the secondary market and for new 3xxx cards, since people have no choice about what to buy before the 4xxx release.

  • @johnb2572
    @johnb2572 Před rokem

    Good video! enjoyed it

  • @towsonfox6964
    @towsonfox6964 Před rokem +1

    I already knew the prices would go up but not this much. I feel like if the prices of the 4000 series was $200 lower than it is now, it’d be less of an uproar. Nvidia can easily sell their 3080 for $500 but they’re too greedy.

  • @rfx8459
    @rfx8459 Před rokem +2

    For gaming I’d stick with my 3090 FE, but unfortunately I’m also a VFX artist that heavily uses NVidias optix renderers etc and they’re lowkey amazing…. I guess I will be up early attempting to snag a 4090 hopefully 🤞

    • @ffwast
      @ffwast Před rokem

      And the 3090 can't do that? Because if it can unless it's a hell of a lot faster you'd be wasting your money.

    • @bigdaddyschin5692
      @bigdaddyschin5692 Před rokem

      Yeah I’m thinking about grabbing a 3090ti LC in the next week or so. The 4090s were melting PSUs during testing…

    • @rfx8459
      @rfx8459 Před rokem

      @@ffwast It can, but the 2-4X performance increase claims with the 4090 over the 3090 are specifically in Raytraced scenarios. Which obviously doesn't matter for most people but for me that's an insane value increase.
      My job is built around the idea ''time is money'' so its a little different for my case.

    • @rfx8459
      @rfx8459 Před rokem

      @@bigdaddyschin5692 Yeah, the new cards require ATX 3.0 PSUs. There's a chance that if you use a certain adapter with a current-gen PSU, it'll start to melt the port on the PSU.

    • @ffwast
      @ffwast Před rokem +1

      @@rfx8459 The key word is "claims" and who's making those claims has a reputation for similar claims being marketing lies. Real benchmark results never live up to nvidia marketing.

  • @ZaRaven
    @ZaRaven Před rokem

    At least in the EU right now, I don't see how the 4090 is just 6% more money. A new 3090 is priced at around €1150 (even €1100 for some models), and a 3090 Ti goes for €1250. A new 4090 is around €2000. So that comparison may have been valid when the 3000 series arrived, but right now? It seems like 70% more performance for a 35-40% price increase. Maybe I am wrong, idk.