RTX 4000 vs RTX 3000 - Real FPS (Rasterization) Performance vs Price Comparison is Shocking!
- Published 22 Sep 2022
- nVidia announced the RTX 4000 series and was vague about rasterization performance improvements. I break down what they did provide to get a sense of the actual rasterization performance increase you can expect without DLSS or Ray Tracing enabled. A comparison of performance and price is provided, and the results are shocking.
- Science & Technology
I think it is disgusting that they decided to name the 4070 a 4080 12GB instead. We're basically paying $899 for a card tier that used to cost $350 just a few generations ago. That is a 150%+ price increase
More like 250%
It's even worse than that: the 4080 12GB has about 48% of the performance of a 4090 when looking at cores and clocks, and that performance range is generally reserved for the 60 Ti card.
Basically Nvidia is trying to sell us a 4060ti for $900
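A quick sketch of where that ~48% figure could come from, using the publicly announced core counts and boost clocks. Assumption: throughput scales with cores times clock, which ignores memory bandwidth, cache, and power-limit differences, so it is only a rough proxy for real-game performance.

```python
# Rough core-throughput comparison from the announced specs.
# Assumption: perf ~ CUDA cores x boost clock (ignores bandwidth, cache, power).
specs = {
    "RTX 4090":      {"cores": 16384, "boost_ghz": 2.52},
    "RTX 4080 12GB": {"cores": 7680,  "boost_ghz": 2.61},
}

def throughput(card: str) -> float:
    s = specs[card]
    return s["cores"] * s["boost_ghz"]

ratio = throughput("RTX 4080 12GB") / throughput("RTX 4090")
print(f"4080 12GB is ~{ratio:.1%} of a 4090 by cores x clock")
```

Running this lands just under 49%, which lines up with the ~48% figure quoted in the comment above.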
@@JoeWayne84 dude, no one bats an eye if cards go up. But over 200% is just ridiculous. If a card cost $350 a few years back it should be like $500 by now, not $1,900; it's insane
$329 for the GTX 970, which replaced the $399 GTX 770, which was a rebranded 680
true.....but fanbois will buy them by the millions.....just like Apple phones.... personal identity has been replaced by mega corpo identity....as if they were your friends....pathetic....
A couple years ago nvidia said the 3080 was 2 times faster than the 2080. In reality it was about 35 to 40% faster. Everyone should keep that in mind when nvidia makes claims.
nVidia's claims are just that, claims. But the data they provided is gold.
It's becoming a tradition at this point. They also did it with the 2080 vs 1080. They always cherry-pick some special scenario that gives a huge advantage to the newer card with some workload that is inefficient on the old one. That time it was some 4K HDR scenario which, if I remember correctly, Pascal was really inefficient at and/or it was straining the memory bandwidth of the 1080. Then for 3080 vs 2080 it was Doom Eternal with a texture setting that used 9GB of VRAM (10GB vs 8GB on those cards). Now it's Cyberpunk with a new "RT Overdrive" setting which I can bet is really inefficient on Ampere compared to Ada.
That is in the best case, so it is “kinda” true
68%* on average. Still isn't double, but noticeably more than 35-40%
If I recall the 3080 was more than 2x a 2080. Not in raster but ray tracing. The 30 series was a crazy upgrade for RT.
Increase in performance is NOT a reason to increase the price. It is a reason to buy a new unit, when yours is still working.
If price hiked with every increase in performance, we would be having $10K gpus by now.
There is most likely not much raw performance gain, as they cheat the 2x and 4x "performance" numbers by adding artificial frames that inflate the frame counter...
Actually they would cost millions or billions each considering the exponential perf improvements in silicon chips for a long time. Like imagine you paid $50 p/m back in the 90s for a 56k dialup internet connection. Today you can get a 1 gbps fiber connection, which is 17860 times faster. Should you then reasonably expect to pay (17.8k x $50) = $893,000 p/m for it? Increase in performance should only be a factor _within_ generations, ie it's reasonable for a 4080 to cost more than a 4070. For a 4080 to cost more than a 3080 is either price gouging or inflation.
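The dial-up analogy in the comment above checks out arithmetically. A minimal sanity check, using the comment's own figures (a 56k modem at $50/month in the 90s vs a 1 Gbps fiber line today):

```python
# Sanity-check the dial-up analogy: if price scaled linearly with raw speed,
# what would a gigabit line cost at 1990s dial-up pricing?
# All input figures are the comment's own, not verified market data.
dialup_bps = 56_000          # 56k modem
fiber_bps = 1_000_000_000    # 1 Gbps fiber
dialup_price = 50            # $/month, 1990s figure from the comment

speedup = fiber_bps / dialup_bps          # how many times faster
implied_price = speedup * dialup_price    # $/month if price scaled with speed
print(f"{speedup:,.0f}x faster -> ${implied_price:,.0f}/month")
```

The result is roughly 17,857x and about $893,000/month, matching the comment's ~17,860x and $893,000 figures.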
@@larion2336 Exactly! This seems so simple. Just like when flat screen tvs were thousands of dollars but now they have higher resolution, better technology and only cost a few hundred bucks. Nvidia is just price gouging to the absolute extreme.
When the 90 series card is the only one that makes sense, you see that things are very wrong.
I've got a funny feeling nvidia is holding back a budget lineup they will launch a bit later
I got my 1080 mini on sale for 399 and it still holds up well, but I wouldn’t mind springing for a 3080 FE
@@reviewforthetube6485 *"I've got a funny feeling nvidia is holding back a budget lineup they will launch a bit later"*
What about the notion that these prices are deliberately high in order to sell the remaining stock of 30 Series cards, while still being able to claim that a new generation has been released on a "normal"** schedule, with the intent of later reducing prices (if demand isn't driven back up by $!#@ miners)?
** That is, the _new_ normal of ~24 months between generations (2016 - 2022), as opposed to the 13.7 month average from the GeForce 2 (2000) to the 10 Series (2016).
@@reviewforthetube6485 I think they're holding back a ti lineup, and waiting for these to be bought.
@@Snoop_Dugg of course they are
I just love how he named it "the 4080 impostor". That's what I'm calling it from now on 😂
4080 ti (total imposter)
It would be better if Nvidia was honest about it and simply called it a 4070 rather than giving in to marketing anxiety xD
@@averageadventure200 Totally agree with that statement. I hope people don't think it's only a VRAM difference; very shady tactics used by Nvidia. This will work great for AMD. It reminds me of the 3.5GB GTX 970: I bought an R9 390 8GB instead and it served its purpose until last year
I just simply call it a 4070. I will call out anyone who buys it.
@@zkilla4611 What will you call the 4070 then? It should have been the 4070 Ti though, since it's not cut down at all.
My list of possible Reasons for price in no particular order.
1. Cleaning out old inventory. They want to sell the abundant 3000 series, so the higher-priced 4000 series will push people that way.
2. They saw what people were willing to pay and are now rolling with it.
3. They don't believe amd will be competitive.
4. They know sales will be lower and investors will expect similar returns.
It wasn't ppl. It was miners
@@Hi-levels *"It wasn't ppl. It was miners"*
**blink** ... **blink** ...
NVIDIA is being charged almost 30% more for silicon by TSMC, and the card costs almost 50% more to make. Nvidia would be losing money selling it below $700..... $800 is a more realistic price (prices have risen everywhere: the PS5 went up in price, the Quest 2 went up in price, and Apple didn't even put the A16 in their base iPhone 14)
@@bricaaron3978 well, I think what the other guy wanted to say is those buyers were miners, not gamers.
But there are no miners left, so all Nvidia can do is sell their GPUs to gamers, and those definitely won't pay these prices.
They saw what Apple manages to get away with, so now they sell their gpu the same price as an iphone, and they never decrease the price during the lifetime of the gpu just like the iphone
I always bought Nvidia GPUs. But after the 3000 series, and now the prices of the 4000 series, forget it. I'm more inclined to spend my money with AMD, even out of spite, at this time. I've been holding on to my GTX 1080 for 5 years now. I really just want to upgrade; it's ridiculous.
Problem is AMD isn't any better, it sucks....
@@cyllananassan9159 actually it's much better in terms of performance and price
@@neklmae3375 not really: ray tracing is horrible, and the drivers too; they also consume a lot of power, are really expensive, and availability is low..
not going to enter the BS fanboi arena fight. I looked at reviews of all the cards on both sides, so I'm not a fanboi of anybody. I went for a cheap 3060 Ti at like $500 in Europe; the equivalent AMD card was more expensive, more power-hungry, and had crap ray tracing
@@cyllananassan9159 You've never used an amd card lol
Feel you 100%, my Windforce 1080 is getting old! I might even consider an AMD crapcard just because fk these prices
The only major leap in GPU performance was from the GTX 980 Ti to the GTX 1080. The performance uplift was more than 2x, while the wattage was slightly lower.
1080ti*
1080 ti was the better investment imo
LOL I'm old.
The 8800gtx vs 7900gtx was a bigger leap back in 2006.
The 8800 ultra was even more.
980 Ti with OC on a custom card to 1080 non-Ti was almost a side grade. 980 Ti to 1080 Ti was bigger, but 1080 Ti did not OC well compared to 980 Ti. A 980 Ti at 1500 MHz (which custom cards all hit) is like 30-40% faster than a stock 980 Ti at 1150-1200 MHz. In NO WAY OR FORM is 1080 Ti twice as fast as a 980 Ti when both are overclocked / custom cards. My 980 Ti ran at 1580 MHz and performed like my friends GTX 1080 in pretty much all games at 1440p, hell some I even won slightly. I upgraded to 1080 Ti and could barely OC the GPU 5% and the upgrade was like 50-60% at most. 980 Ti was the most insane overclocker ever released and if you did not OC you simply did it wrong. Even reference cards hit 1450 MHz with ease. Thats a 300 MHz bump for free and still 30-35% performance gain.
@@Dr.WhetFarts This is true the 980ti overclocked like crazy on water. Can verify stock v stock it was equal to a 1070. If you could push it to the limits with cooling the thing was a beast.
Another thing you will note is that the 80s and 90s in each series always have at least a 256-bit memory interface. This holds true going back to the granddaddy 1080, which STILL has impressive rasterization over some of the 20 series cards. The 4080 'imposter' is every bit a 4070, just named deceptively. If you ever wanted an ironclad metric for what an 'actual' 80 series card is, 256-bit is the magic number when it comes to the memory interface (as in that number or higher).
My 980ti is 384bit. From 2015
Great analysis of the data we were given. It will be interesting to see if these numbers are representative of 3rd party reviews or if Nvidia is cherry picking these results as well.
My thoughts as well with cherry picked. I haven't looked at past Nvidia reveals and graphs, but those 3 games for rasterization could possibly only be best case scenarios. Meaning, a bigger list of tested games could bring those average increased values down. This really has me curious to see what third party reviewers will end up finding.
Every GPU manufacturer ever made has cherry-picked results for marketing materials. AMD is the worst offender here, so this isn't an area you can knock NVidia. If you want to knock them, say they are pricing too high for their GPUs.
But if you're doing that, compare apples to apples by comparing the current generation of NVidia against AMD with the cards which cards closest in performance across the two manufacturers.
The prices seem much worse in Europe. The thing is - the dollar is extremely, extremely expensive right now. The last week it has become 1%+ more expensive every single day, on top of being 80% more expensive than in 2013 in some countries already. The 4080 is actually twice as expensive as the 3080 several places. Here in Norway for instance, because of the exchange rates the price is now comparable to the card being 2200$+. Hardly anyone is interested in paying that much to play games, and especially so these days.
That straight to the point, concise and well presented analysis just gained you a subscriber 👌
Thank you for this video. Easy to understand.
I’m very interested in the rasterisation improvements from the 3080 to the 4080 16GB.
If I remember correctly, the Titan RTX GPU was about $2,500 back in 2018. Few seem to remember.
Great video! As expected, the fake 4080 is very expensive for what it is... labeled as a 4070 at $600, in a parallel universe with no excess stock issues, this fake 4080 would have been a huge success...
You're right!
NVIDIA is being charged almost 30% more for silicon by TSMC, and the card costs almost 50% more to make. Nvidia would be losing money selling it below $700..... $800 is a more realistic price (prices have risen everywhere: the PS5 went up in price, the Quest 2 went up in price, and Apple didn't even put the A16 in their base iPhone 14)
@@legendp2011 don't Nvidia have 80% margins on their GPUs? They could, you know, absorb some of that price increase.
@@formulaic78 if that where the case than AMD or intel should easily be able to undercut NVIDIA pricing by a significant margin.............however they don't. AMD gpu are only slightly cheaper, and the new intel gpu seems like it will also be only slightly cheaper than NVIDIA
@@legendp2011 what do you mean by the PS5 and Quest 2 prices going up? Those are complete systems, not just one component. They literally sell those with no room for profit and they still cost a lot less than the fake 4080. They make their money from games and apps. I doubt it costs them $600 to make any of their GPUs
I remember just a few years ago you could have a top-tier GPU for only $649 (GTX 980 Ti). Now, 6-7 years later, prices have more than doubled for the same tier of performance.
This.. It was like that for 12 years, from 2004 to 2016.. Using Nvidia's nomenclature, you would usually pay $150-250 for a xx60 card, $300-400 for a xx70, $500-650 for a xx80, and up from there.. I remember in 2009 I bought my GTX 285 for 480 euros.. In 2013 my 760 OC for 360 euros, and my current card, a 1070, for 400 euros.. WTF happened.. And when you talk with people they act like the smartphone crowd; they think this is normal.. WTF
Great bit of pixel counting 👍. I know i would have gone cross eyed doing that and also run out of fingers.
😁
It's not just about the 4080 impostor, it's about the real 4080 16GB too: it has a price increase of $500, from $700 to $1,200!
On top of that, the 3080 had almost similar performance to the 3090, but now the gap between these two cards is crazy.
It's like they are justifying paying $1,600 for a GPU!
Nvidia have turned into the scalpers.
Yeah, the top-end prices will always be crazy; check back this time next year.
4090 on sale like the 3090
This is exactly what I thought when I saw those graphs. The fact that DLSS 3 could be made to run on 20 series cards means you're buying paywall-inhibited software instead of real hardware upgrades with the 40 series to obtain that "2-4x performance".
And it only works for games that support it afaik.
Dlss 3 most likely cannot be run on older cards. The tensor cores are supposed to be improved by a lot. Yes, fuck nvidia, but sometimes newer things just mean incompatibility.
@@larion2336 yeah, from what I've read, the graphs on the left are games that don't support RTX.. which means the new 4080 is visibly slower than the 3090 Ti. Also, DLSS 3 performance in Cyberpunk is cool, but 22 fps at native res?? 😬
Correct. It's a software launch dressed up as a hardware launch
Your video is incredibly useful and well done. Keep up with the good work man. Thank you.
Thank you!
Dude I’ve watched a lot of these videos and THIS one is the best breakdown. Thanks man.
Sticking with my 3080 Ti for a long time; playing happily at 1440p works for me! I've always been interested in keeping up with what's new, but less so lately with the price gouging, chasing deals, waiting on inventory, or upgrading my PSU or mobo to take advantage of the latest tech edge! I know it's the way, but I have an X470 mobo with a 5800X (2700X as a backup), an 850 watt PSU (600 watt backup), and a 4-year warranty on my 3080 Ti (2070 Super FE as a backup).. So yeah, now I'm more concerned with what game to play next instead of trying to keep up with the Joneses, and I'm having more fun doing it as well! ...
Yeah I had to upgrade to b550 because of stability issues. It was either that or reset my operating system, and spend hours messing with my ram. If your system works well with your older gen motherboard then it's great.
@@marinated5425 .. Very true! Knock on wood, I have been very lucky so far with my MSI X470 Gaming Plus mobo. It's been so good to me not 1 minute of problems from it even though most people say it's a meh mobo! I heard the X570 Gaming Plus is not too good tho, which I have to wonder. Built my son a pc with a MSI X570 Tomahawk Wifi a few months back and he's very happy with it so far as well. It's all a roll of the dice with a dash of luck when it comes to electronic components or them actually playing nice with other electronic components! Good Luck on your new mobo and may it give you many years of peace of mind!
@@marinated5425 what issues were you having? I have a 3080 Ti with an MSI B450 with no issues
Me, I have the 3090 and I'm more than happy with it. I just built my PC recently, so I have no reason to upgrade
Will be interesting to map this over the next few months given the downturn in GPU market demand, coupled with AMD and Intel cutting in with new offerings. This could be big mistake for Nvidia, especially with the bad exposure and current overstocking issues with the 30 series...
5:36 i just did that exact thing. Gigabyte RTX3080 Gaming OC $575 shipped from Ebay. I kept a 1070 since release and it was a great 1440p card for years until things like Metro Exodus and RDR2 came out. It lived through FIVE CPUs before being replaced and allowed meaningful gains from each. My brother gave me his 1080Ti in May 2022 when he upgraded and it's a big jump from the 1070 but just not enough anymore. A base 10GB 3080 is nearly twice as fast as a base 1080Ti. And for $550 before shipping I'm happy with the price finally. I mean you always paid this much for a regular 1080 when they were current and people were happy with them. The performance jump from 3000 series is big. While they DO suck power, that's why I chose the 10GB 3080: It has the lowest TDP.
Enjoy! The RTX 3080 is a mighty GPU and should see you through a long time, and at that price it is absolutely great! I managed to pick up a new ASUS TUF RTX 3080 12GB for GBP 699 a while back and I am over the moon... Most will not even need a 4080 unless they are hardcore 4K high-refresh and RT gamers... for the rest, playing at 1080p/1440p high refresh and even 4K 60Hz, the 30 series is stellar, and I am not even including DLSS yet..
@@vMaxHeadroom It seems as if the 4000 series isn't that great anyway, so the 3080 remains relevant for at least one extra generation... The same thing happened with the 1000 series: the 2000 series wasn't a big jump, so the 1000 series stayed relevant as the choice that made financial sense. Likewise I jumped from Haswell to Coffee Lake and hit the same wall, haha. I keep hitting sweet spot hardware.
I think you are mistaken. The leftmost graphs are from games that use RTX, but not DLSS. This means that even these graphs do not represent the gains in rasterization of the RTX 4000 series over the RTX 3000 series, since we know that with this new generation the RTX 4000 will see big improvements in performance when using RTX. I could be wrong, but the increase in pure rasterization on this new RTX 4000 series will likely be much smaller than the numbers you see in these graphs.
AC Valhalla and Division 2 on the left side lack both RT and DLSS implementations, so that invalidates your point. It's a gain in pure rasterisation
@@dhaumya23gango75 It looks like you are right! Apparently, for these two games what exist are ReShade mods that simulate ray tracing. Sorry! My mistake.
Finally a very informative video giving raw numbers in terms of performance increase vs price point, really appreciated
My pleasure!
I will never ever call that 4080 12GB a real 4080. It's a 4070, and Nvidia is violating the RICO act.
When you’re rich, you don’t follow laws, you follow guidelines.
4080 16gb = 4070
4080 12gb = 4060
Nvidia really knows how to game people.
there's seriously little that can compete with the 30 series on price.
I bought my 3080 10GB for $722, 4 seconds after it became available for purchase, and at 10% less than the MSRP here in Norway, where the cheapest was $800 MSRP.
Sadly the GPU never came, and I had been waiting for so long that I had totally forgotten I even purchased the thing, until I got a surprise parcel at my door. I thought I had bought something while drunk to warrant such an unknown parcel.
When I opened it I almost jumped through the roof! Finally, in front of me, a shining 3080! Only scalpers had them at that time; the GPU took exactly 4 months to arrive. When I installed it, the performance jump blew me away. Going from an old 1070 SLI setup that had low fps at 1440p, with few games having SLI support, I was ecstatic. I really couldn't afford the price of the GPU as a student with barely any loan income, so I decided that when I wasn't gaming I would let my PC run a mining program, and my $722 came rolling back in just 71 days. The new GPU had a dual function as a heater when mining, so it allowed me to turn off my portable heater. The 3080 created just the right amount of heat to keep my room at a comfy 21°C during the cold months. In the end, the 30 series allowed me to get a GPU for free which then made me money; the 3080 paid for itself 6 times! That's a price-to-performance mark I doubt the 40 series can beat.
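The payback arithmetic in that story is easy to check. A back-of-envelope sketch, using only the commenter's own figures ($722 recouped in 71 days) and assuming a constant mining rate, which real mining revenue never actually is:

```python
# Back-of-envelope on the mining payback story above.
# All input figures are the commenter's own; the constant-rate
# assumption is ours and is optimistic for real mining income.
card_price = 722     # USD paid for the 3080
payback_days = 71    # days until mining covered the purchase

daily_revenue = card_price / payback_days   # implied $/day from mining
days_for_six = 6 * payback_days             # days to "pay for itself 6 times"
print(f"~${daily_revenue:.2f}/day; 6x payback needs {days_for_six} days at a constant rate")
```

That works out to roughly $10/day, and about 14 months of continuous mining for the claimed six paybacks, which is plausible for a 3080 bought early in the 2021 mining boom.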
I have a similar story. Truly the right GPU at the right time.
You were very lucky.... I only got my hands on an RTX 3060 and paid the same price. But then again, I sold my old GTX 1080 Ti for the same as I paid for the new card, and it has made more in crypto mining than it cost :)
I guess that's a major perk of living in Norway. If I leave my GPU on at 100% usage in Australia my room becomes an oven.
@@larion2336 Same here we need Liquid Nitrogen / Liquid Helium to get the cards down to ROOM TEMPERATURE levels!
Very good analysis. Pretty informative!
Exactly what I was studying about the latest 4000 series... the comparisons are good.. but when you go to the used market for the 3000 series, the value of used 3000 cards is just incredibly high right now... Bottom line: the 4000 series is too costly for its performance, even in a new-to-new comparison, so the used market is the best bet now!!!!
I just bought an 11600K and an MSI Z490 mobo for like $250, and an EVGA 3080 Hybrid for $550. As far as an upgrade from my 1060 goes, I'm more than satisfied for a while; $1,200 for a 4000 card seems steep af even with the better RTX and DLSS 3.
Excellent breakdown. Thank you so much!
You're very welcome!
Nvidia is gonna hurt after this launch. People aren't spending as much, scalpers are out of business, and mining is at an all-time low right now. Their sales are going to tank along with the MSRP. Hopefully this will change for the 50 series
I think I'll be happy with my 3080 for a long time. With even more performance you start to venture into 4k at higher than 60fps. And sure that sounds really nice. But I'm very happy to tweak some settings set res scale to like 83% and get basically the same experience.
But we're in a golden age of CPUs, so my next upgrade is definitely gonna be a CPU and motherboard switch, and maybe a move to DDR5 RAM.
Why would you want to upgrade your CPU if the bottleneck is clearly the GPU (at 4K res). Wasted money in my opinion, unless you're a content creator of some sort.
@@error_nr-1232 The RTX 3080 is powerful enough to actually bottleneck my 8700K in a lot of games because it can push really high fps even at 4K. I also play a lot of VR games where I want 144 fps at as high a resolution as possible, and there my 8700K is reaching its limits. I'm also doing a lot of content creation.
This is actually really good value. You're getting a free heater with your card. Perfect for this winter.
To be honest, I'm kinda hoping for heating the place up a bit with a new gpu
The card is not that cheap to start with, and it takes some wattage to generate that heat, not to mention the PSU you need
Super video !!! Thanks for that !
wait, what about those Warhammer 40K and Microsoft Flight Simulator numbers? Were those DLSS or raw fps comparisons?
Where is the Racer X available to download?
So which card is the best to get or have???
This video deserves a thumbs up and to be shared. Loads of valid information
Much appreciated!
When I got my 1080 it was a nice upgrade from my 780, and I paid about $620 a month after release
Still using that GPU. I would love to upgrade to a new card, but I'm not buying anything at a stupid price
I hope they crash and burn raising prices so high
I'm in the same boat. But I need a new GPU; the GTX 1080 is not viable anymore, and I don't want to play games at shitty resolutions and graphical settings. So I have to get a new card.
I have the money to do it.
I am waiting for reviews and to see what AMD announces for performance and cost. As it stands, only the 4090 has sparked my interest, even though it's very costly.
I would go with it if AMD has nothing better.
3080 is a worthy upgrade
@@marinated5425 when I can find a brand new 3080 for 600 I’ll happily pay
Agreed. I upgraded from a GTX 580 to a 1070 Ti and that's what I'm still using. I was hoping to upgrade again with this 4th gen, but Nvidia has gotten so damn greedy that I'm more inclined to look at AMD's offerings at this point.
They crashed; that's why Lovelace is expensive and why it's purposefully not a good deal. Get a 3000 series, it's the best bang for your buck atm :)
The shader count gap between the 3080 and 3090 Ti is smaller than the gap between the 4080 12GB and 16GB.
Nvidia didn't want to make the same mistake again, so now there is a huge gap between the RTX 4080 16GB and the RTX 4090.
Just be careful with GDDR6X-equipped second-hand cards. Those VRAM ICs run extremely hot when mined on, so in these particular cases, avoiding mined cards is a good idea.
So basically miners spoiled the used GPU market, because people are not honest enough to tell you the truth about the GPU they are selling you. We're buying a pig in a poke with all these used 3000 series cards.
mining cards aren't even that bad. Most of the time you'd just need to replace the thermal paste and you're good to go. Their lifespan may be reduced, but they're still fine
lol bs nice try tho, those used cards are worthless
How much cheaper can we expect these graphics cards to get by Black Friday, assuming there are no shortages?
You seem to have missed the main issue with these numbers.
These numbers are skewed by the small print. The small print notes DLSS is used where it can be.
Therefore the biggest wins for the new gen may well come from running DLSS 3.0 against DLSS 2 on the 30 series (as they are not allowing the 30 series to run DLSS 3.0).
We know that DLSS 3 actually adds in generated, unrendered frames to make the fps counter appear faster.
This may or may not work well.
This will account for the “2x” or “4x” performance.
Nvidia have misled before about new card performance.
AMD has an opening to drive a knife through Nvidia this coming generation. I highly doubt they'll overtake them in popularity, but they can make significant market share progress. Currently AMD has around 20% and team green 80% of the GPU market. I can see this shifting as high as 40/60, realistically, if this is the kind of pricing Nvidia wants.
Yes, 45-50%, because many won't buy AMD anyway and are waiting for lower Nvidia prices, or skipping this generation and waiting for the 50 series. Moreover, there won't be AMD shortages, because it uses the same node as Ryzen, so AMD prices will be steady.
You have to understand that AMD would not take a large share (think 40% or more) of the market, because the majority of consumers of medium to high-end cards are tech corporations, content creators, research organizations, and universities; they need the CUDA cores of Nvidia cards to do their jobs: rendering, data searching, AI testing, etc.
They don't care how the price of the cards increases, because whatever the price is, they need to buy them.
AMD GPUs don't have CUDA cores, and AMD doesn't have the funds to spend on an industry ecosystem, something Nvidia has spent over 20 billion dollars on
@@newyorknewyork933 exactly, I am one of them, because I use neural computing, but it all depends on the price, and I'm telling you, I don't want to be robbed. I'd rather buy a used 30 series, and I'm seriously looking at alternatives from AMD; I never thought I would be forced to do this, but I am. Every preview on the internet says this is robbery; at these prices, even for small companies, Hopper is better priced. Unless your IT management is bad, at these prices you risk getting fired because the ROI is too low. Only the 4090 makes sense. It all depends on AMD's performance and price now. PC gamers are still the majority of the pool, 80%+, and they don't like to be losers either; when they see the price and see AMD is the same at $300 less, fuck Nvidia, or their friends will call them stupid. If Nvidia loses only 30% of the pool it's a victory for them; it could be even higher. In the end all our reasoning is wrong, because Nvidia will cut prices for one reason only: TSMC already told Nvidia that they won't cut production and Nvidia must pay for every order it made. It's better for Nvidia to earn 10% than to lose 100% on every unsold card.
@@newyorknewyork933 I suspect gamers who are not also doing production work with their rig would gain the most from switching, then? I personally like AMD products. I'd rather buy a Mac Mini/Studio for editing and rendering than purchase marked-up Nvidia cards.
amazing video. thanks for going so deep. coming from a fellow youtuber i take my hat off: you managed to do what so many youtubers fail at: finding a niche and ROCKING IT!
I appreciate that! Thank you!
Thumbs up for the slick outro music!
I feel like I am going to get hosed either way. I paid $1,400 for my shady back-alley 3080 purchase, and now I need more to run my Pimax 5K at 120 fps. I feel like I'll have to sell my car to finally be able to recreate the level of fidelity of the U.S. Air Force V-22 Osprey simulator I got to try out all the way back in 2005.
It is true that the civilian world is about 20 years behind the military in simulation/gpu power.
Excellent job! Subscribed!
Awesome, thank you!
Best video on this topic I've seen. And I already watched A LOT (including the "big" channels)
Wow, thanks!
4:50 which is down to rampant inflation.
5:48 .. what's worrying about these specs is that they were chosen for this purpose. Across more games, the 4080 16GB vs 3090 Ti comparison might not favour the 4080 as much.
The 4080 12GB should be compared to the 3070, which is a $499 GPU. Inflation might be high, but not 80% in two years. Nvidia got used to higher profits and seems to want to keep their new and improved salaries; this is just like the 20 series launch all over again. xD
Amazing analysis.
I guess announcing hi prices for the newer 40xx generation is a marketing strategy in order to sell more of the 30xx series prior to the start of 40xx sales
Great analysis, thank you!
You're welcome!
I got a 3090 at release for msrp to replace my 1080ti. Performance from the 1080 ti to 3090 was more than double which made me happy. If they can achieve the same doubling or better going from a 3090 to 4090 in normal rasterization performance then I wouldn't mind going for a 4090.
I have my doubts about a doubling of raster performance because that would be the single biggest jump in one generation for a GPU since early 3d graphics accelerators of the 1990s.
The 980 Ti to 1080 Ti was only around 65-70% better, as was the GTX 7800 to GTX 8800.
you're telling me you had to wait for another generation and pay more than double the price to get a card worth an upgrade?
Glad I got the 3080 Ti. The leaked benchmarks are with DLSS 3.0, which cheats on the 4xxx series, so I'm guessing it's as fast as the 4080 12 gig
I’ll take the 384-bit bus width over the renamed 192-bit "4080"
Yeah should be around this performance, more than a bad joke for expected 1.2k Euros in Germany at release...thank you Nvidia lol
Am I missing something, or is performance compared to the 3080 Ti but price compared to the regular 3080?
The thing is these new cards only matter if you want to use DLSS 3.0 + RTX that's the only huge difference in fps
They are still 25-35% faster without DLSS 3
@@zkilla4611 That is not a large jump for something so expensive
@@KaiThouKing I agree. Hopefully AMD brings the heat this generation. It would be nice to see them put Nvidia in their place.
What is the "world-building" software?
Damn, very likely I'll just buy RTX3080 12GB.
I hope there will be Black Friday deals.
I might look at used cards, but in Germany for some reason people bid very high prices for used cards; I'd say it doesn't make sense to buy a used card.
I guess you have to include the current general levels of inflation also
great breakdown!
Subscribed for this video.... U r a confirmed Master.
For me personally going to the 4090 would actually be a massive increase in performance. Currently I'm on an RTX 2070 laptop connected to my LG CX OLED via a Mini DisplayPort 1.4 to HDMI 2.1 cable. If I'm not mistaken a laptop 2070 is equivalent to an RTX 2060 desktop GPU. Going to an RTX 4090 likely will be quite a jump in performance.
The ports on your LG CX are 2.1, but imagine how pissed I was when I bought a 2080 Ti only to find out it was a non-A variant, so I can't flash other cards' BIOSes onto it, the power limit is capped (manual OC only), AND the card is HDMI 2.0b!?!? Nvidia knew, and could have been ready to give 2000-series GPUs the 48 Gbps bandwidth of HDMI 2.1. I have the CX as well. Then a month ago the card stopped working; the BIOS and Device Manager no longer detect my $1200 2080 Ti brick, and I don't know its hardware ID either, forcing me to buy a 3000-series card if I want the 48 Gbps that HDMI 2.1 offers... :(
Your video has perfectly captured how I felt about this launch, big price increase and generational speed increase doesn't look that great. Looks like I'm saving my money for team red or 2nd hand bargain.
I'm with you. Also looking at Intel, since I don't need to upgrade anytime soon.
The 3 benchmarks on the left could be using DLSS 3.0 on the 4000 series. DLSS 3.0 works with any game that has TAA, not just games that support DLSS.
There are people out there who think the 4x-as-fast claim is correct. But look at Cyberpunk: the 3090 was tested with ray tracing and no DLSS, while the 4090 was tested with DLSS 3. They're fudging the numbers a lot on these new cards.
Soooo, should I buy a 3090 now, or should I wait for the 12GB 4080? (Same price between the two here in Germany.) What would you guys do?
I'm considering getting a 3090 Ti... or waiting until after Christmas to see where the 4080 16GB lands by then. I still have a 1080.
@@noThing-wd6py I went ahead and got a 3090; I have to say I'm very happy with it. I think I'll wait for the 5000 series now; my 3090 should last until then (I got it for 1049 euros). Performance together with the Ryzen 7 5800X3D is simply top-notch :)
I don't think the 4080 16GB will get significantly cheaper in the coming months :D
You deserve more subs!
Did you used to have a video game review channel?
I bought a 3090 FE for MSRP back in January of 20... It had about the same price to performance ratio as my outgoing Vega 64.
Now that's an upgrade!
@@ImaMac-PC It was noticeable ;)
To be more precise, in addition to upscaling, DLSS 3 also uses Optical Multi Frame Generation to generate entire new frames based on the frames that came before. Optical Multi Frame Generation still operates at the reduced render resolution; without first reducing the amount of data that needs to be processed (i.e., lowering the resolution), the speed-up is impossible. In other words, DLSS 3's upscaling is better, but the image-quality defects DLSS 2 brought in the past will still exist; the degree of the defects just decreases with DLSS 3.
The AI in DLSS 3 is mainly based on analyzing and predicting the motion of all objects in the game, so whenever there is an unexpected input, image defects from DLSS 3 will appear. That's why the DLSS 3 footage nVidia showed is all straight-ahead movement: the inertia of moving straight ahead is unlikely to be predicted wrong. But after holding a straight line for a while and then turning suddenly, the AI will make mistakes, even though changing direction is perfectly normal for the player.
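The failure mode described above can be sketched in a few lines. This is a toy illustration only, assuming nothing about the real DLSS 3 internals: it linearly extrapolates an object's next position from its last two positions (a crude stand-in for motion-based frame prediction), which is exact for straight-line movement and wrong the moment the player turns.

```python
# Toy stand-in for motion-based frame prediction: assume each object keeps
# its last velocity. Exact for straight movement, wrong on a sudden turn.

def predict_next(p_prev, p_curr):
    """Linear extrapolation of the next (x, y) position from the last two."""
    return (2 * p_curr[0] - p_prev[0], 2 * p_curr[1] - p_prev[1])

# Straight-ahead motion: the predicted position matches the actual one.
straight = [(0, 0), (1, 0), (2, 0)]
pred = predict_next(straight[0], straight[1])
print(pred, "vs actual", straight[2])   # (2, 0) vs actual (2, 0)

# Sudden 90-degree turn: the extrapolated frame puts the object where
# it never went, which is the kind of artifact the comment describes.
turning = [(0, 0), (1, 0), (1, 1)]
pred = predict_next(turning[0], turning[1])
print(pred, "vs actual", turning[2])    # (2, 0) vs actual (1, 1)
```

The real system works on motion vectors and optical flow rather than point positions, but the underlying limitation is the same: a generated in-between frame can only be as good as the motion prediction behind it.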
What's the name of the game at the end?
What is that model car racing game?
The weird part: if you look at the Nvidia page for the RTX models, it also shows two different 3080 GPUs, 12GB and 10GB, with different shader counts (8,960 / 8,704).
That said, I've never checked whether actual cards have that difference or if it was just a typical OEM rip-off.
Shader count has nothing to do with FPS, though.
@@joestuehmer7967 That alone is a wrong statement on your part.
Perfect analysis, thanks!
Glad you liked it! Thank you.
did the same thing after the livestream, good to have confirmation. 4090 it is, thnx! ;)
It is all well and good to talk about the flagship 4090 all the time, everywhere, when in reality most people are struggling to justify $350-450 GPU purchases, and a lot of people are still very happy with 1080p gaming.
Also, every year there is talk of MSRP, and it seems people mostly end up disappointed by reality.
We might see 2000 usd 4090 yet!
For me 1500 is a new pc😎
@@tmoshamo To most it is :)
Great job!
Regarding this launch, I called one thing spot on: that they would put more emphasis on ray tracing, and for that I commend them; work that architectural advantage over your competitor. Everything else about this launch is just plain poorly thought out, down to the omission of DisplayPort 2.0. I remember being jealous of Maxwell until I finally got one, being blown away by Pascal, petulantly hopeful when they added ray tracing with Turing, and, despite fighting the shortages, over the moon when I finally got my 3090 (24GB on a consumer card, what?). But since Turing they have been getting greedier and greedier; just look at their NASDAQ ticker to see that. There was a good chance I'd be sitting out this generation anyway after just getting a 3090, but after this dumpster fire of a launch, that decision has been made for me.
Quality work. That's why I bought a 3080 brand new this week for $600. Still keeping an eye on AMD to see what they come up with.
Where from? I will probably snag one as well
@@chazcarpenter1559 Newegg seems to be your best bet. Micro center has tons of stock but not at that price to my knowledge.
@@jimmyramos1989 Preciate that!
Great analysis!
Thanks!
I was confused; the NVIDIA chart was comparing the 40 series to the 3090 Ti... right?
nVidia's chart did compare to the 3090 Ti. I compared to the cards they replace, the 3080 and 3090.
I’m happy with my 6950 XT. Maybe AMD will drop some bangers with the 7900 XT and I’ll sell my 6950, but if it’s not a big increase in performance, I have no issue getting 250 FPS at 1440p on high settings in every game I play (Apex and Warzone, mostly).
6:57 Actually, it's smaller if you go by Nvidia's own website specs for the 3090 cooler.
If you add DLSS 3.0 on 30 series, the 40 series FPS advantage is much smaller.
Do you think that is why they will not offer it on the 30 series GPU?
Should I buy a 1-year-old (probably mining) card for $400 with a 2-year warranty? (A new one is $650 here due to tax.)
With this level of pricing, insane power requirements, and 5G becoming broadly available and cheaper, we are near the end of graphics cards for gaming. Cloud gaming will soon become the preferred way to play for most gamers. Give it a few years and the performance/latency ratio will be on par with current "physical" offerings for a fraction of the price. Nvidia knows it very well and is trying to capitalize on its remaining consumer products. Three or four generations down the road, Nvidia will provide graphics cards mostly to datacenters, and it will be a lot less lucrative than the current model. The "pay as you play" model, or even the subscription model, will be a tough one for Nvidia because they do not dominate the cloud market.
Personally, I absolutely disagree.
Cloud gaming, especially for PC gamers, isn't yet remotely able to replicate the performance of physical hardware, and as Stadia has shown, it is unattractive and expensive even for a company like Google to maintain, which translates into high costs for players.
Not only that, PC gaming is also about passion, not just performance. I "want" to have my own PC to assemble, modify, and care for.
The last factor is the experience tailored to the user. If you play on PC and not on a PS5 or One X, it is because you want to play in 32:9, or maybe at 320+ Hz, or in 8K, etc.
The best thing about PC gaming is that everyone plays the way they want and with what they want, and a streaming service won't be able to give you that for many years to come.
A big part of the cost increase is because of the extremely high cost per wafer from TSMC on the node this card is built on.
This has been covered fairly extensively from other sources.
Another is the need to keep 30 series cards prices more stable and relevant, again these cards cost more to produce than AMD's cards so they cannot sell them as low without taking huge losses.
Even at the current under-$1100 pricing of the 3090 Ti, the 4090, with a performance improvement of roughly 70% over the 3090 Ti, is in line at its $1599 MSRP.
What kills this for a gamer doing an upgrade rather than a new build is that this card requires a PSU on the new standard; without the sensor communication to the PSU, the card will throttle its power draw, hence affecting its performance level, from my understanding.
For someone upgrading, this adds roughly another $300 to the cost, making the 4090 a $1899 upgrade, not a $1599 one.
Not to mention that even if you already own a less-than-a-year-old $300+ PSU, for this upgrade it is basically a paperweight if you want the maximum available performance out of your 4090.
That is how I understand it: without the communication standard, the card will revert to a preset minimum power draw level.
Yea... high BOM + high margin == ludicrous pricing. :)
@@mpadlite2925 Exactly. Nvidia not willing to give up even a tiny bit of that 60% margin they require. 40 series is a hard pass for me. They can keep their high priced GPUs and not get a penny out of me.
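The upgrade math in this thread can be sanity-checked in a few lines. All figures below (the $1599 MSRP, the assumed $300 PSU, the under-$1100 3090 Ti street price, and the ~70% performance gain) are the commenters' numbers, not official or verified pricing:

```python
# Rough price-vs-performance check using the figures quoted in this thread
# (commenters' assumptions, not official pricing).
msrp_4090 = 1599       # quoted 4090 MSRP
psu_cost = 300         # assumed extra cost of a new-standard PSU for upgraders
price_3090ti = 1100    # "current under $1100 pricing" of the 3090 Ti
perf_gain = 0.70       # "roughly 70% more than a 3090 Ti"

effective_cost = msrp_4090 + psu_cost
increase = effective_cost / price_3090ti - 1

print(f"effective upgrade cost: ${effective_cost}")    # $1899
print(f"price increase vs 3090 Ti: {increase:.0%}")    # 73%
print(f"performance increase: {perf_gain:.0%}")        # 70%
```

By these numbers, an upgrader who also needs a new PSU pays roughly as much extra (about 73%) as the performance they gain (about 70%), which is the thread's point.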
I can't wait until the 4000 series cards come out - not so I can buy one, but so I can maybe finally afford to buy a 3000 series!
I'll wait until AMD announcement of 7000 series.
Nice analysis; leaning more and more towards buying a 3080 and retiring my 1080 Ti.
Depends on what you're doing; you may be better off with a 3080 Ti or 3090, since the 1080 Ti has 1GB more VRAM than the 3080, and 11GB is already running out in a few games at 4K and getting low at 1440p, so 12GB probably won't last long either.
Personally, I’ve just picked up a brand-new 3090 Ti for around £900, which I’m happy with. I’ll probably look at the market again this time next year to save a lot of money 💰. I’m also concerned about the actual performance of the 40 series, since the comparisons didn’t state the graphics settings used, nor compare current ray tracing and DLSS against gen 3. So I have a lot of questions but little to go on until the reviews start showing real-world performance. Yes, it will be better, but at what cost to performance?
Considering the RTX 2080 Ti never dropped under £1000 here in Norway, I would say an RTX 3090 Ti at that price, with that much RAM, is an OK price... Here in Norway you can't even find an RTX 3080 Ti at that price...
I also picked up a little 'ol 3090 TI ;) Good purchase!
@@zachoons yup about to do the same. I have a 3060 now. But I wanna play games in 4K
@@a64738 In Germany you can get an RTX 3090 Ti for €1250 incl. tax. What are the prices in Norway?
lol @ "for those that live in the omniverse"
so... basically, if you have more money than sense, buy the 4090; otherwise pick up a 3080 (and if you want to give Nvidia the middle finger, buy second-hand)
great analysis - you earned a sub!
Can you make a video with the European prices? Whether it's worth buying or not.
Whether something is worth is to buy is based on a multitude of factors (e.g., use case, disposable income, do you require that GPU to do your job faster, etc) of which one of them is price. All I can do is provide the information and each buyer has to determine if it is worth it to them. My advice is that, until you know it is absolutely worth it for your use case, don't buy.
If you're already getting 60+ FPS at 4K rasterisation, why would you be disappointed with a limited rasterisation upgrade?
I'm more interested in RT performance.
Answering the question of why the new cards are so expensive: this was done to force you to buy from the huge stock of 3xxx cards. Since the 4xxx release is still far off and can be postponed, these high prices motivate you to buy old stock until it is completely depleted. Once the old stock is gone, they can announce lower prices for the 4xxx cards. These high next-generation prices may also stir up prices on the secondary market and for new 3xxx cards, since people have no choice of what to buy before the 4xxx release.
Good video! enjoyed it
I already knew the prices would go up, but not this much. I feel like if the prices of the 4000 series were $200 lower, there'd be less of an uproar. Nvidia could easily sell their 3080 for $500, but they're too greedy.
For gaming I’d stick with my 3090 FE, but unfortunately I’m also a VFX artist who heavily uses Nvidia's OptiX renderers etc., and they’re lowkey amazing… I guess I will be up early attempting to snag a 4090, hopefully 🤞
And the 3090 can't do that? Because if it can unless it's a hell of a lot faster you'd be wasting your money.
Yeah I’m thinking about grabbing a 3090ti LC in the next week or so. The 4090s were melting PSUs during testing…
@@ffwast It can, but the 2-4X performance increase claims for the 4090 over the 3090 are specifically in ray-traced scenarios. Which obviously doesn't matter to most people, but for me that's an insane value increase.
My job is built around the idea that "time is money," so it's a little different in my case.
@@bigdaddyschin5692 yeah, the new cards require ATX 3.0 PSUs. There's a chance that if you use a certain adapter with a current-gen PSU, it'll start to melt the port on the PSU.
@@rfx8459 The key word is "claims" and who's making those claims has a reputation for similar claims being marketing lies. Real benchmark results never live up to nvidia marketing.
At least in the EU right now, I don't see how the 4090 is just 6% more money. A new 3090 is priced around €1150 (even €1100 for some models), and a 3090 Ti goes for €1250. A new 4090 is around €2000. That comparison may have been valid when the 3000 series arrived, but right now? Seems like 70% more performance for a 35-40% price increase. Maybe I'm wrong, idk.
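Plugging in the EU street prices from that comment (the commenter's observations, not list prices) makes the ratios explicit:

```python
# Price ratios from the EU street prices quoted above (commenter's figures).
price_3090 = 1150     # EUR, new 3090
price_3090ti = 1250   # EUR, new 3090 Ti
price_4090 = 2000     # EUR, new 4090

vs_3090 = price_4090 / price_3090 - 1
vs_3090ti = price_4090 / price_3090ti - 1

print(f"4090 vs 3090:    +{vs_3090:.0%}")    # +74%
print(f"4090 vs 3090 Ti: +{vs_3090ti:.0%}")  # +60%
```

At these street prices, the 4090 costs 60-74% more than the cards it replaces, which is nowhere near a 6% premium; the comparison depends heavily on whether you use MSRPs or current street prices.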