RTX 4000 Is All Smoke and Mirrors
- Added 22 Apr 2024
- In today's video, I discuss the new RTX 4090, RTX 4080 16 GB, and the RTX 4080 12 GB. What is Nvidia hiding? Where is the RTX 4070?
CONNECT WITH ME
---------------------------------------------------------------------------------------------------
EMAIL ErockOnDeckYT@Gmail.com
DISCORD / erockontech
TWITTER / erockontech
INSTAGRAM / erockontech
-----------------------------------------------------------------------------------------------------------------------
The videos you will find on this channel are similar to videos you will find on channels like Alpha Gaming (Harris Heller) and JayzTwoCents. Also, there are a lot of themes and influences from creators like Nutty, BitWit, and Linus Tech Tips.
-----------------------------------------------------------------------------------------------------------------------
The music in this video is from Harris Heller's StreamBeats. You may have heard about it from Alpha Gaming.
-----------------------------------------------------------------------------------------------------------------------
Chapters
0:00 - RTX 4090 and 4080 Price and Release Date
0:20 - Where is the RTX 4070?
0:48 - RTX 4080 12 GB Specs
1:52 - Nvidia is hiding something
3:15 - RTX 4090 Ray Tracing and DLSS
3:25 - DLSS 3 is exclusive to RTX 4000 Series GPUs
3:53 - DLSS 3 uses fake frames and creates more latency
5:12 - Where is the raw rasterization for 4090 and 4080?
5:25 - What is DLSS?
7:00 - DLSS 3 is the main selling point of the RTX 4090
8:28 - Is the RTX 4090 better than RTX 3000?
9:35 - AMD Radeon 7000 Series GPU Release Date
10:00 - How can AMD beat Nvidia?
-----------------------------------------------------------------------------------------------------------------------
This video addresses all the following topics:
RTX 4090
RTX 4080
RTX 4080 12 GB
RTX 4080 16 GB
RTX 4070
RTX 4090 price
RTX 4090 release date
RTX 4080 price
RTX 4080 release date
Nvidia
DLSS
DLSS 3
AMD
AMD FSR
AMD Radeon 7000 Release Date
AMD vs Nvidia
RTX 4000 Series GPUs
RTX 4000 vs RTX 3000
RTX 4090 is a rip off
RTX 4080 is a rip off
RTX 4090 is overpriced
RTX 4080 is overpriced
4080 12 GB is really a 4070
#Nvidia #RTX4090 #RTX4080
Are you getting an RTX 4000 Series GPU? Let me know below.
GET DISCORD ACCESS FOR ONLY $1: www.patreon.com/ErockOnTech
Follow me on Twitter: twitter.com/ErockOnTech
@No Scope GG You are the most enlightened!
As long as the 4080 16GB is better than my 3070 Ti, yes. Otherwise I'm likely getting a 3090 Ti. I game and I use Stable Diffusion, so I'd like to have at least 16GB of video memory to render AI images; 24GB is kind of overkill for me. So that is my plan, and I already have everything my PC needs to support a 4080. That has been my plan since getting this PC. But I admit there's a lot about the 4000 series launch that either pisses me off or worries me.
Nah, probably not. Currently have a 3060, but I might try to snag a cheap used 3080 in the future
Yes, a 4090. Already sold my 3080 at a profit.
I'll keep my 3080 til the 13080 comes out.
NVIDIA: also don't miss the GT4030 starting at $499.99
Lol, true
NVIDIA: Oops, did I say 4030? I meant 430.
I think you mean the 4080 4GB
Lol!
@@theinfinity2994 Nice!
I have been screaming that in the Discords I'm active in. DLSS and FSR are OK for taking a lesser card and giving you a bit more. But when you need them to toot your own horn about 4X the performance, you are only kidding yourself.
FSR is for the budget cards 100%.
I'm an AMD fan, and when it works, it's magic, but yeah. It isn't meant to deliver quality graphics
I'm mostly interested in DLSS/FSR and other tech that fakes/imitates full-scale resolution/framerate (interpolation etc.) for truly low-power systems like handhelds, tablets and ultraportable laptops, which stand to benefit a lot from the low power consumption of simulated frames through lower heat dissipation and longer battery life. If FSR 3.0 and/or DLSS 4.0 can give 1080p 60fps on a tiny handheld while upscaling from 480p, that's truly a great win. In the past a lot of games supported multi-GPU setups for those who wanted the ultimate high-resolution, high-fidelity, high-framerate experience, but last-gen consoles pretty much rendered that useless, and Nvidia has zero incentive to go that route given how power hungry their high-end GPUs have been lately. There's no way anyone can host a dual 3090 (Ti) setup without a significant cooling solution and/or compromising on noise levels. This is the only way they can pretend to be pushing toward a future they promised with each generation but failed to deliver, like with ray tracing support. Even the highest-end GPUs can't render a fully ray-traced game as simple as Quake at 4K in real time at high framerates.
If you're limited to 100 fps and already get 80 fps natively, turning DLSS 3 on would just fill your 100 fps cap with fake frames, an unwanted effect. It's safe to say DLSS 3 only makes sense if your card is delivering half or less of the fps your monitor can handle
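That rule of thumb can be written down directly. A tiny sketch of the commenter's heuristic (the function name and the example numbers are mine, for illustration only):

```python
def framegen_makes_sense(native_fps: float, monitor_hz: float) -> bool:
    # The commenter's heuristic: frame generation roughly doubles displayed
    # frames, so it only helps if native fps is at or below half of what the
    # monitor can show; otherwise the cap just fills up with fake frames.
    return native_fps <= monitor_hz / 2

print(framegen_makes_sense(80, 100))   # False: 80 fps native already nearly fills a 100 Hz cap
print(framegen_makes_sense(45, 144))   # True: 45 -> ~90 fps still fits under 144 Hz
```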
DLSS at 1440p is like magic. Set it to Quality instead of Performance and it looks great.
@@eleventy-seven and then drops frames and is unstable. Idk, not sure it's worth the price.
Perfect take.
Nvidia wants to charge us based on DLSS 3.0 numbers, while it looks like fluff performance that is not real framerate in practice
Thank you! Yes, exactly. It's a rip-off. This is anti-consumer.
So many people, and I assume they are Nvidia fanboys, have said the 40 series performance is impressive. But it's really not. Nvidia hardly showed any apples-to-apples benchmarks because they are trying to hide something. They are getting all of this fake performance from DLSS 3. It's not real performance, and that's exactly how Nvidia is scamming gamers. They are charging ridiculously high prices for their GPUs while giving them such a minor performance boost. Nvidia likes to minimise performance and maximise the prices on their GPUs. They are trying to hide the fact that the 40 series is a very minor performance increase over the 30 series. I highly recommend that everyone avoids buying a 40 series GPU. They are nothing but a huge scam.
Plus, Nvidia is making these false 2 to 3x faster performance claims. They shouldn't be allowed to make these claims unless the GPU is actually that much faster than its predecessor, apples to apples. DLSS 3 is a cool feature, but they can't just rely on that.
Why is nobody understanding that Raytracing and DLSS is the future.. not rasterization? There's a reason they're pushing dlss..
@@jeremycovelli Simple: give me native.
As an RX 6900 XT owner, I am happy to know that FSR 3.0 will probably still support my card, not be slapped in the face like Nvidia owners.
Maybe it will also support Nvidia's cards, like FSR 1.0/2.0? Who knows...
Are you sure about that?? FSR 3.0 is rumored to function like DLSS, thus requiring dedicated hardware that the 6900 XT simply doesn't have... still, these are all rumors
@@beretteres4031 That's why I said probably. But if they do it, they will look just like Nvidia. A plausible solution for them would be to offer FSR 3.0 for RDNA 2; it wouldn't provide such low latency/high frame rates, but it would still bring a better experience in less latency-sensitive titles.
@@BartoszBielecki Not really, because Nvidia is locking down a feature that is possible on older generations; AMD with FSR 3.0 is just trying to catch up with XeSS and DLSS, and in order to do so new hardware is needed, without lies, just trying to advance
@@beretteres4031 I think AMD will be smart. I think FSR 3.0 will function one way for the 7000 series, but a basic version that's superior to 2.0 will be made available on non-7000 series GPUs. That should be the move.
The only creator that's not licking Nvidia's boot to get a free 4090. Didn't miss anything. Thanks a lot. Excellent work
Thank you. I appreciate it!
DLSS is set to _Performance_ mode in their benches.
I absolutely agree with you. A $1600 card that only gives you 22 FPS with RT turned on is a terrible investment. So the major improvement of the 40 series is the software, not the hardware.
Exactly, and they could activate that for the 30 series
I'm afraid they can't go much further in terms of raw power, so they have to look for improvements from this kind of technique... even at 450W! Could we be close to being stuck on performance? Is this going to be the end of GPU development?
The improvement is bigger than Ampere's was, lol, wdym?
Actually, a key part of DLSS 3.0 is built into the 40 series hardware; it's why only 2 of its 3 features will be available to 30 series cards.
@@chiyolate 100% right on the money where nVidia is concerned, it's because they didn't innovate in the same way AMD has with chiplet design. They're demonstrating in real time why silicon is dead. AMD is utilizing the only approach left before we have to move on to graphene chips, or whatever tech is next.
Basically "Jensen Black Jacket" is saying: buy our old rotten 30 series, because I need more jackets and I am all out of jacket.
I told people that Nvidia stock was priced so high because of crypto card sales. I said when Ethereum crashes the stock price will drop back to normal and it did. Jensen and the investors had a sugar high with the high stock prices and now that stock prices are back down to normal... mad scramble begins.
Consumers have the power at this point. Follow EVGA and tell Nvidia no and they will have to clean their act up. Buy the junk and Nvidia will not learn anything.
Did you short-sell Nvidia when the Ethereum merge date was announced?
@No Scope GG nah, most gamers can't afford the ridiculously priced 40 series cards
@@kerbodynamicx472 I don't own Nvidia stock; the dividend is too low. 98% of my stocks have high dividends and I hold them long. Only a few stocks are for day trading.
Donât forget they also have an abundance of 30 series to offload as well. So they are also pricing the 40 series high to make the 30 series seem like a good deal so they can keep charging exorbitant prices for last gen cards.
There's basically 0% IPC lift; it's all in the node shrink and clock speed increase. In other words, all the improvements are from TSMC, not Nvidia engineering.
Even with the node improvements, they focused so hard on tensor core performance that they forgot to improve anything else, and need 1.21 jigowatts just to power it.
Which is fine, if the price is fair. Maxwell and Pascal were also very similar in terms of architecture but there was a large performance increase for only a $50 price increase
@@x8jason8x If only your PC could bring you back in time.
@@PoisoningShadow671 Right? lol
What's up! Can I still use my Corsair HX1000, or do I need to upgrade to a Gen 5 PSU? Or can I use an adapter?
Great analysis. 22 frames for a 4090 is shocking. I know Cyberpunk is the new Crysis, but still. Also, a new sub :)
Thank you! I really appreciate it! Welcome!
I just checked a YouTube video testing a 3090 Ti with Cyberpunk at 4K native, no DLSS, and it got 20 FPS! So... it's only a 2 fps difference!?
@@0rrchids At 9:28 he said he confirmed that they're both the 4090. I still can't believe it.
@@0rrchids This was the new RT overdrive mode, so not comparable at all
@@0rrchids Holy shit lol wow
22 fps on an RTX 4090, bruh
Pretty sure the 40 series is really a software update, not a hardware update. The only changes to the hardware look to be ones that enable their new software magic, plus massive increases in power utilization.
The extra power and clock speeds seem to at least help rasterization for the 4080 and 4090, but not by anything we'd really call a generational increase. Especially if you look at it in terms of perf per watt.
I think the benchmarks next month will be truly telling.
Lol yes you are correct! It's a $1600 software update. Ridiculous.
@@ErockOnTech The "purely software 5nm process node update" will net in a minimum 50TFlops performance, uplift from 40TFlops on top silicon die. It also seems you will be able to buy better perf/$ than last gen too. If you do some napkin math, you'll realize you get rtx20 series perf/$ level, that actually great considering world situation and market condition.
As for gaymers, you can opt out of real time lightning in any/all games and you'll have plenty extra rasterization compared to previous gen and no need to enable DLSS. Stop bitching.
@@aoolmay6853 no point arguing with these raster-cultist AMD fanboys.
@@kingyogesh441 Why call anyone a cultist fanboy? I use exclusively AMD GPUs for professional work; the price for AMD compute was significantly lower than Nvidia products. This may or may not change after both companies show new products this year.
I'm against exaggerating drama around rumors. Just wait for independent reviews and make your decision.
It's not a software update, as I doubt Nvidia is stupid enough to try and pull that over our eyes lol, but something is off about the numbers Nvidia is showing us.
After all, 22fps in Cyberpunk with DLSS is really low for a card that's supposed to be a lot faster than this gen. There is no way that card should be getting such a low frame rate unless it's 8K, but if it were, Nvidia would make sure we knew about it, and even then I suspect it should be able to do 8K at 30fps if not a little more.
Like many are starting to think, it looks more like Nvidia is trying to convince us of the performance leap by using DLSS 3 to bump up the frame rate, which tells us very little about the raw real performance of the card, which is what most gamers are interested in.
They took the 4070 and slapped a 4080 sticker and price on it. Took an actual 4080, put 4 extra gb of ram on it and charged the price of the 3080TI. They will release a 4080 ti priced around $1600 next year.
With its 192-bit memory bus, it's rather a 4060.
Glad to see someone else noticed the GPU launch was just tacked on and more of a "fund us please" plea
Indeed. My guess is the price of leather jackets has gone up significantly.
This might be their worst launch since the FX5800, but let's wait for reviews.
I agree 100%!
By far worse, that was a Quadro. lol
@@x8jason8x u are too young :P he is not talking about the Quadro but the FX 5800 Ultra from 2003 that got spanked by 9700 pro from ATI (now part of AMD)
@@gametime4316 LOL I'm not too young to remember that at all. There's literally 5 years of difference between the two, but yeah I started using nVidia around 2004 exclusively.
Thanks though, made my morning.
It will be interesting to see the actual performance % uplift in RASTERIZATION at 4K for the RTX 4090 over the 3090 Ti... without DLSS or RT
Look up "how fast is rtx 4090 without fake frames"; that video has some solid speculation based on Nvidia's graphs.
It's about 70% per Nvidia's charts
@@Lightsaglowllc 61 percent when you combine the 3 games. Definitely not 2-4 times faster lol, but it's decent. The 4080 is around 20%, which isn't good at all for the price they want. The 4070 was 10 percent slower in Resident Evil and slightly behind in the other games, right where you'd expect a 70-class card to be compared to the previous gen's top-tier card. I refuse to call it anything other than a 4070 lol
Thank you for pointing out exactly what DLSS is actually doing. Everyone saw the marketing for higher framerates, but IDK how many people understand what it is actually doing.
I can't wait to see what independent reviewers will say about pure rasterization with the three 40-series cards.
I used to be a fan of DLSS when I misunderstood it. I confused their meaning of "native" with what it means to me as a tech, e.g. 4K native is purely reliant on your monitor specs. When I looked further into it, I finally understood they downscale, then upscale back. Using "native resolution" in this context is something of a misnomer; it's really just AI upscaling, and it isn't true 4K either. It's motion interpolation, which TVs have been doing for years.
The RTX 4090 is 80-90% faster than a 3090 and 60-70% faster than a 3090 Ti! Keep this in mind until a bit after launch and tell me I was right! To me this is just fine for a 6% price increase from the previous gen! The numbers I just mentioned, if it needs to be said, refer to rasterization ONLY!
@@cdiff07 LOL! No. If you believe that I have a bridge to sell you.
@@x8jason8x Come back to this comment after launch!
@@x8jason8x czcams.com/video/vt8Z8iM0aO0/video.html
Wow, great video, with some very interesting points I haven't seen anyone else talk about. You've definitely earned a new sub here; I've got the feeling this channel is going to become one of my main sources of tech news from now on. Keep it up!
Didn't they kind of say that they want the 30 series and 40 series to co-exist for a little while? It's assumed there is quite a lot of stock left. I wouldn't expect these cards to render the old ones obsolete. Maybe they cost a little more but come with a couple of new features like DLSS 3.0 and better ray-tracing performance.
Good take. I'm not interested in the 40 series cards. Looking forward to your video on AMD's presentation of the 7000 series. It'll be interesting to see what AMD offers and how they compare.
Same; my overpriced 3080 works just fine.... I did an R7 1700 16gb 1060 1x970pro samsung in 2017; Then in fall of 2021 - 3x2tb samsung 980pro, 5950x, 3080, 128gb ram; Thus - I like doing 4-5 year upgrade increments... I find graphics cards hilarious - it reminds me of the car world where, when every new car model comes out, they use this term: "Structural rigidity improved 15%" lol... So a car built in 2022 vs one in 2010 is a 5000% difference?! The "marketing" morons amuse me... but! Bill - the low IQ morons buy into it because they're lost. They think material goods will make them happy, when - there's other problems that are more meaningful and rewarding in life to solve. Computers should be lower on the list.
They'll do the same and increase GPU prices. Corporate greed has no boundaries.
Agree with most of this. The only thing that hits me is the 22fps Cyberpunk stuff. It sounded like there is a new, higher-quality ray tracing mode being added to the game that they are using. I'd like to know how current cards handle that for comparison.
Exactly. There's no apples-to-apples comparison. It doesn't mean anything.
Ray tracing seemed an interesting gimmick. So few games I play actually have it as an option, though. The few that do, it's barely noticeable in normal gameplay - like a reflection in a puddle is more accurate. But, DLSS was practically invented to offset the lower frame rate caused by enabling ray tracing. Since I don't use one, I don't need the other.
I see DLSS as a "nice to have" for 4K gaming with the 30 series, but for NVidia to release fps benchmarks for the 40 series based on DLSS 3 being enabled without a proper benchmark for all the bells and whistles off, I just can't compare apples to apples.
It looks like they're just letting me know not to expect 4K 120 fps with the 40 series without the smoke and mirrors and best luck next gen.
In Cyberpunk 2077 there is a huge difference in how it looks with RTX on, but it is the only game where I see any difference... Just too sad that with my RTX 3060 it drops to 30fps with RTX on, but it looks so much better that I run it at that fps anyway.
@@a64738 I enjoy watching streamers play Cyberpunk 2077 on Twitch, but it's just not for me.
I have a 3090, but my favorite games - Subnautica and Subnautica Below Zero don't even have RTX capability, nor are there any plans to include it even though they still update the games, backport tech to the older one, and occasionally update the entire game engine. They claim not enough players even have RTX cards to bother with it. I think the latest Unity engine supports DLSS by default, so when they backport it, I might get that at least, but without RTX, I don't really need it.
Same goes for a lot of games, so while I'm glad NVidia is innovating, The marketing speak of fps on only certain titles that include their proprietary tech is pretty shady and I look forward to youtubers giving us proper benchmarks once they get cards to test. :)
Word
Good report Erock! Glad you didn't drink the Ngreedia Kool-Aid. For the first time in years, I'm seriously considering purchasing a Radeon 7xxx series card when available (or, if it becomes possible, a new RTX 3090 for $400-600).
I concur. That release will be a major mess. Regarding DLSS 3.0, I think Nvidia will add it to the 3000 series down the road.
I hope you're right!!
They can't, it's a hardware limitation
@@sephirothlv100 The VP at Nvidia said it was theoretically possible. However, the performance uplift wouldn't be the same as on the RTX 40 series. So they could add it.
Should I go for a 3080 Ti or a 3090 Ti? I've got a 360Hz 1080p monitor and a 144Hz 1440p monitor, and I'm stuck with an RTX 2070 Super.
Depends on pricing.
Totally agree with everything you said Erock, Nvidia want to sell a service and that presentation was a shareholder event. This launch gives me Fermi vibes (GTX480).
Right on my man. There is something going on here with the raster numbers. I think they are fine and we will know about everything in 3 weeks, but nVidia wants to focus on the new shiny thing over what 99.9% of games use.
Here are the facts: the die stack has just been renamed. I've yet to see any tech channel catch up to this fact. It's right there in the numbers.
102 is still 102 (384-bit)
104 is now 103 (256-bit)
106 is now 104 (192-bit)
The "4080 12 GB" is a 60-class card. Just look at the bus width / number of memory PHYs on each chip. Going down the stack it's 384, 256 and 192 bit or 12, 8 or 6 memory modules. Also look at the die sizes; The "4080 12 GB" chip is under 300 mmÂČ, so clearly a 60-class die.
900 USB, or a lot more realistically, for a 4060 with a fake name and price. So I just bought a used Asus Strix RTX 2080 ti, because I'm done waiting. I'm not getting anything at these prices.
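A quick sanity check of the memory-module arithmetic above (a sketch: each GDDR6X package uses a 32-bit interface, and the die names and class labels follow the commenter's claims, not any official Nvidia tiering):

```python
# Back-of-the-envelope check of the bus-width argument above. Each GDDR6X
# module sits on a 32-bit interface, so modules = bus_width / 32. The class
# labels are the commenter's claims, not official Nvidia tiers.
GDDR_INTERFACE_BITS = 32

cards = {
    "AD102 (4090)":          384,   # bus width in bits
    "AD103 ('4080 16 GB')":  256,
    "AD104 ('4080 12 GB')":  192,
}

for name, bus in cards.items():
    modules = bus // GDDR_INTERFACE_BITS
    print(f"{name}: {bus}-bit bus -> {modules} memory modules")

# 192-bit / 6 modules is the same bus width Nvidia shipped on 60-class
# cards like the RTX 3060, which is the core of the argument above.
```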
This has been said a lot and I'm betting it will only get worse over the next month.
Absolutely everyone calls it anything between a 4060 and a 4070 Ti. I for one cannot wait to see benchmarks from Gamers Nexus and Hardware Unboxed, as it's going to be a bloodbath. Even Jayz is not impressed.
I saw a video by Graphically Challenged, and per the rumour mill, Radeon is going to be right up there on raw rasterization, and prices were leaked as lower.
That being said, I think when AMD soak in all this backlash, and the even greater one from the benchmarks, they may drop prices a little, knowing they could wipe the floor with nVidia for market share.
If AMD have any sense.
Doesn't matter how much cheaper your cards are if they don't sell and increase market share. In this current and unprecedented climate it's going to take a fair amount of wisdom.
@@saintallnights7239 Exactly! I'm glad someone said AMD will need wisdom. They seem to have opportunities and sometimes can't take them.
@@VGFightSchool Been following this too long.
They have had their moments and they have had their misses. This is a moment they wont want to miss.
70-class, not 60-class
@@user-ki1wq6pt9u Nope. 60-class. 192-bit bus is a 60-class card.
GPU architecture is obviously hitting a hard performance limit. They cannot achieve better performance anymore through design or material improvements or any other means, so they just added more CUDA cores and more power by upping the clock frequency. Pretty much just afterburning the engine.
And that's fine. As time goes on, the materials will improve. So why make this launch? They could just keep the 30 series going. Or if they needed to do the 40 series, then cancel the 30 series and make the 40 series a small refresh. Market it as a small refresh. Don't market it as 2x-4x the performance.
@@ErockOnTech OR actually wait until you have the card before foaming at the mouth about performance. Otherwise you come across as unprofessional and Unprofessionalism leads to the dark side, Fanboyism. No sane person is going to buy the 4090 until AMD have blown their load. History suggests AMD will balls this up, like last gen, but only a fool would buy until the raw numbers are out. But wild speculation only serves to muddy the waters and fan the idiot fanboy flames.
@@ErockOnTech A refresh like the "Super" brand with the 20 and 16 series, I would assume. I can imagine Nvidia doing that kind of thing.
@@ErockOnTech I think they have AMD on their back, making a claim they'll take the raster crown this generation. They had to respond with something...
@@ErockOnTech Exactly. nVidia really didn't need a 40 series right now; the 20 series hasn't been fully maxed out except when done on purpose, not through normal usage. The average gamer still plays on 900 and 10 series cards and has a great time. Competitive gamers are a small 1% of the total gamers out there. They should be selling the 20 and 30 series, raking in the dough, and working on a proper, fully ground-up new-gen 40 series for 2025, maybe a tad sooner.
The bigger your monitor, the blurrier the DLSS.
I use a 42-inch TV, so anything not native, be it DLSS or any image scaling technique, will result in softer or blurrier image quality.
Besides, the fake frames, aka frame interpolation, will introduce artifacts, and at best it just ignores the hard-to-process fast-moving scenes. So in fast-moving games such as FPS or action games, you'll see a lot of unprocessed parts of the image due to the lack of data between frames A and B. The less raw FPS you have, the worse the fake frame generation performs.
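To illustrate the point about missing data between frames, here is a minimal toy sketch of the naive blend-two-frames interpolation TVs use (my own example in plain NumPy, nothing to do with Nvidia's actual optical-flow pipeline):

```python
import numpy as np

def naive_interpolate(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Blend two frames 50/50: the crudest form of frame interpolation."""
    blended = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2
    return blended.astype(np.uint8)

# Toy 1-D "scanline": a bright pixel jumps from index 2 to index 7 between frames.
frame_a = np.zeros(10, dtype=np.uint8)
frame_a[2] = 255
frame_b = np.zeros(10, dtype=np.uint8)
frame_b[7] = 255

mid = naive_interpolate(frame_a, frame_b)
print(mid)  # a 127-bright ghost at BOTH positions, not one object mid-flight

# Real interpolators estimate motion vectors to avoid this, but where motion
# is too fast, or data is occluded between frames A and B, they degrade
# toward exactly this kind of smearing.
```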
So may I ask: I currently have an RTX 2060 OC 6GB. Do I need to wait for the RTX 4080 (12GB, actually an RTX 4070), or just get an RTX 3070 Ti for 1080p? Because Nvidia's strategy is not like anyone else's in the world :) I need it to handle 1080p for 5-6 years
If you're doing 1080p, then get the 3070 Ti or even a 3060 Ti and save the money. Those cards destroy at 1080p. :-)
The 22 fps is because they used a special new ray-tracing setting in Cyberpunk called "overdrive". This increases the ray-tracing fidelity to the max.
The point of showing this, and the reason why DLSS 3.0 was showcased to such an extent, was explained in the presentation, where they stated that NVIDIA can't battle the ever-increasing demands of gaming graphics with extra cores only; smarter solutions are necessary. Therefore: DLSS 3.0.
The problem with showing off a new generation of video cards is you want to show they can tackle future games. But future games don't exist yet. So guesses are made about where future games are going. NVIDIA thinks this will be ray-tracing galore, and it comes up with a "possible future game" scenario that fits this.
That's all very nice, but you've got to believe what they're telling you. First, that the future is indeed going to be ray-tracing galore, and second, that DLSS 3.0 is brilliant and only possible on these new cards. And to be honest: I'm not convinced by either. What bothers me the most is they won't enable DLSS 3.0 on older cards, so they essentially deprive us of the opportunity to verify their claims. That in itself makes you doubt everything that's been said. I hope AMD's presentation will be less disappointing.
DLSS3.0 is NOT a smarter solution though. This tells me that NVIDIA completely missed the mark, because the reason why people want higher framerates is because input feels more responsive. That's the whole reason games push for framerates beyond what their monitor refresh rate supports. DLSS3.0 increases the framerate while making the input feel *less* responsive. That's a big mistake.
Besides, this started *because* NVIDIA is pushing a very inefficient rendering method onto gamers. Rendering based on rasterization is fine. It saves everyone a lot of performance while still giving a sufficient level of graphical fidelity. Ray tracing is a parody of its time, in the sense that NVIDIA has to make giant leaps in power consumption for only marginal improvements on the graphics side. And now they have to compensate by giving us *fake frames in which we have no control over what happens in the game*.
The chief technical engineer at NVIDIA that created the principles that DLSS3.0 uses stated that older cards simply cannot cope with DLSS3.0, because of the fake frames that are being created in between. The input lag would simply be too much. That is if the cards aren't struggling at creating these fake frames to begin with.
DLSS3.0 also isn't really benched yet from what I have seen. I've seen plenty of cases where using it crashed your computer.
Suspicious that Nvidia, who makes DLSS, doesn't specify which quality mode they are using in their benchmarks...
Yes exactly! What are they hiding?
Lol
I can't wait to see what AMD can bring to the table. This is their chance to dominate if they play it smart
It would be funny though if they have something similar..
I don't think AMD will get more than 20% market share even if their cards are phenomenal price to performance. There are way more Nvidia fans that will buy this regardless if the raster performance sucks.
Thank you!!! I've been asking this since I saw the 22fps when it was live!
Yeah, more people should be focused on this. This is very telling. It's all the proof we need that these cards aren't as much of an improvement as we thought.
People who recently bought 30 series cards at, or below, MSRP are feeling vindicated right about now.
I have a feeling the 40 series is really just the 30 series pushed as far as possible, with a software upgrade
They're basically overclocked 30XX. They pushed frequency and power limits to the max. That's why the 4090 is so thick; it needs a humongous heatsink in order not to melt.
It's just a rebranded 30 series on a better process node. TSMC 4N is definitely much more efficient than the previous Samsung 8nm, and I can bet Nvidia didn't do shit and just ported the existing Ampere arch to the new node.
Easy to check by removing the cooler and looking at the chip and chipset of the card, though they may have renamed everything to cover their butts, who knows. A good look at the layout would be a dead giveaway, as no 2 generations look identical; the layout and traces have to change as well.
Nvidia has a habit of only pushing the performance envelope every other generation, and in the weak generations just tacking on gimmicks (which are sometimes deemed useful).
Yeah, I think when I finally upgrade my PC I'm going to stick to the 30 series. I just can't really get behind those prices, especially since I still feel like the current hardware we've got is being underutilized
Thanks bro, for checking out the video! I agree, it does seem like the 30 series cards aren't even that old yet. I am excited for the new AMD cards being announced later this year, though.
Can't wait to hear Erock drop some heat!
Haha thanks man! It's coming!
@@ErockOnTech Heat you say? Like enough heat to melt ATX 12V pins?
@@x8jason8x lol that's funny!
The thing about DLSS is that it's Nvidia's prerogative to implement it whereas rasterization simply requires a compatible driver for the API so if I buy a GPU that is more powerful with rasterization, it'll be theoretically faster for all games past and future.
Exactly! Just another reason why we should focus on raw rasterization more and DLSS less.
Eyy finally get a notification on time. You know what the scummiest thing of all in this lineup is? They're counting on people to not know which 4080 is which. They use different dies, and the 4080 12gb uses a smaller bus size than any version of a 3080 or 3080ti (3080 carries a 320bit bus, 4080 12gb carries a 256bit bus). I'd say this is at least a purposeful attempt to bamboozle consumers, if not outright bait and switch.
Yeah this is definitely unfair. The 4080 12 GB is 100% a 4070. But people would freak out if a 70 series card cost $900. Nvidia is being greedy.
Actually it's a 192 bit bus which is even worse
@@HeavyGun1450 Ooof hell, yeah you're right. BOTH "4080" have a smaller bus than the 3080 10gb.
I just don't have the words to describe how scumbag this is.
@@HeavyGun1450 Indeed! This 4080 12 GB is more a 4060 Ti than a 4070. NVIDIA are masters of illusion, because the fake 4K gaming card (4080 12 GB) will outsell the true 4080 (16GB).
May I ask, what is the performance of the 6950 XT with RT on? It needs FSR to reach 51, and the Nvidia 3090 Ti with DLSS 2 reaches 59. So 22 frames with RT at ultra quality without any DLSS is not so bad.
I see the point you're trying to make here. However, I love AMD cards and I think they are underrated these days. With that being said, they are way behind Nvidia in terms of RT, so it's not really an apples-to-apples comparison. My ultimate point here is this is supposed to be the next generation. This is supposed to be the latest and greatest. Also, the 4090 is $1600 USD. So if a 3090 Ti isn't so far off, that makes the 4090 look even worse. I'm not paying $1600 USD for a software upgrade.
All I am saying is you are talking about rasterized performance while you bring RT on numbers to justify your point of view.
@@AI23973 I'm saying the only thing that Nvidia wants to talk about is RT and DLSS. Where are the raw rasterization numbers? When they did show RT with no DLSS, the 4090 looked pathetic.
The 4000 cards must be passed on unconditionally.
If you buy a 4090 and have to use DLSS, something seems wrong...
I don't personally see anything wrong. I drive a 5120x1440p monitor. When I want pure immersion in a game, I go widescreen with DLSS. It's fine to want pure rasterization, but you're going to get to a point where, if you want more performance, you either crank up the power for it or you start playing smart. IMO DLSS is playing smart instead of brute forcing.
@n n But that the 3090 CAN actually do 8K, even if it's 30 FPS, is a testament to the advancement of GPUs. 4K was the pinnacle and now 8K is actually achievable. Stop pretending that it's impossible.
@n n Your comment was that 8K isn't doable, which is obviously not true, as it's doable even at 30fps and even at 300 watts. When you find a GPU that can push 8K and NOT be a power draw, please let me know.
@n n Well, when we're there and you're pulling 900 watts to push 8K 60fps, maybe you'll have another reason to complain then too.
Nvidia: we got 100 fps on Cyberpunk in a straightaway (although 20 or 30 of those frames are fluff)
Lol, so true
I'd like to know if these fake frames change hitboxes in games with DLSS 3. For example, you shoot at someone using DLSS 3 tech but what you're seeing isn't their actual position due to DLSS sending incorrect data back to the server that refers to their position.
If you want one of these cards, you will definitely have to buy a new power supply to accommodate the power usage and the wiring harnesses supplied.
Just waiting to see independent reviews of both AMD and Nvidia before I make any decisions
Bought the EVGA 3090 FTW3 Ultra at $990, which I think was fairly reasonable... and it will surely last for a while.
These are a lot of amazingly good points. And I hadn't realized most of this before... Their entire graph setup was just "These games ran with way higher frames when run under cherry picked settings for the new cards."
There was no "This GPU will run Ultra in these games 1080p at 144fps while barely making a sound" Or "Enjoy a smooth 90fps in 4k Ultra with no fancy cooling needed' Just a lot of "They did a lot better with upscaled resolutions and false frames"
AMD has the floor at this point... And I am wondering if they're waiting for a full reveal just so they can watch Nvidia stumble first.
It's poor strategy to show your competition the cards you're holding well in advance of having product on the shelves.
@@ctsd623 It's good strategy to show consumers a better/cheaper product so they hold off
When you said "where are?"
I immediately subscribed to your channel
Well I appreciate that! Thank you!
Fire video!!! Won't be buying one of these bad boys soon, however maybe in the near future on my ultimate build! Thanks Erock! A++++!
DLSS 3 will only work on the 4000 series just like how LHR was unlockable
Lol, I'm open to a mod/hack for it. I hope the 30 series gets it. But that's detrimental to Nvidia for sure.
I hope Americans realize that in countries like Canada and the UK, across all of Europe, and of course worse in places like South Africa and Australia, a GPU like the 4090 will be about $2300 at the cheapest.
At that point just fly to America to buy one
Why should Americans care about the pricing of luxury items OUTSIDE of their home? It's not exactly my or anyone else's fault that the prices of EXPENSIVE LUXURY ITEMS during inflation are high. If the prices are an issue, wouldn't it be best to not spend money you don't have? No point in going into debt for a LUXURY ITEM to play video games.
@@SSoul0 I am not sure what your point is, or where I indicated I was interested in getting the 40 series. I was just giving a perspective on what the rest of the world is facing when it comes to 40 series pricing compared to the United States. Yes, I am very clear on how debt works; that's why I am mentioning the high prices.
9:28 how did you confirm? Some say the 22fps is a 3090/Ti
Dude, your channel is the best. Thank you for the information. If for whatever reason I choose to upgrade my 3070, it'll be with AMD.
As soon as Jensen mentioned AI can "predict" the frames, I thought this was wrong. No one can predict first-person shooter movement
They can't predict player movement at all. FPS will suffer worst though, everything is gonna look like you're playing on a VA panel. Smearing everywhere.
If you punch me in the face, I can't predict it. How can AI?
It's not actually "predicting" a future frame; it's inserting a new AI-generated frame using information from a past and the latest rendered frame. At worst it's adding slight latency.
@@kingyogesh441 Jesus, the nVidia withdrawals are strong in you. In this kool-aid lies disappointment, but you do you.
@@kingyogesh441 yes it is taking frame data from the previous frame and predicting what the next frame will be using AI. It then shoehorns that predicted frame between the previous frame and the real newest frame to make it appear like a new frame was created. Ghosting/artifacting is going to be a major issue with this tech.
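A rough timeline of why interpolation adds latency, as a toy model (my own simplification of the mechanism described above, with an assumed 60 fps native rate; not Nvidia's actual scheduler):

```python
# Toy timeline showing why inserting an interpolated frame delays the real one.
native_fps = 60.0
ft = 1000.0 / native_fps                 # ~16.7 ms between real frames

# Native: real frame N+1 hits the screen the moment it finishes rendering.
native_display = 2 * ft                  # N+1 done at t ~= 33.3 ms

# Frame generation: the GPU needs frame N+1 before it can synthesize the
# in-between frame, shows the fake frame first, then the real N+1 half an
# output interval later, so the real frame is held back.
framegen_display = 2 * ft + ft / 2       # ~41.7 ms

print(f"native:   N+1 shown at {native_display:.1f} ms")
print(f"framegen: N+1 shown at {framegen_display:.1f} ms (+{ft / 2:.1f} ms of lag)")
# Doubled on-screen fps, but your latest input reaches the screen later.
```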
Wondering why DLSS 3.0 is exclusive to the 40 series? Maybe to artificially inflate the 40 series improvement over the 30 series; the real-life FPS gain is probably about 10-15 FPS for $1000 MSRP. And you need a new PC build to accommodate the 40 series.
They say it's because DLSS 3.0 uses the new type of cores inside the 40 series. The vice president also said that in theory they could get it to work with 20 and 30 series cards, but the performance uplift would not be the same. But I'm also inclined to agree with you and say they are doing this to give some type of value to the 40 series cards. Because if the 30 series cards had DLSS 3.0, there would be virtually no reason to buy these cards.
@@ErockOnTech DLSS 3.0 on the 30 series would help the AIBs clear their stock
@@gamingreflections6476 Good point.
Regarding your comment at 5:47, they did specify the preset. If you check the fine print again, Nvidia used DLSS Performance, so they upscaled from 1080p to 4K.
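For reference, a small sketch of the commonly documented DLSS render-scale factors (approximate per-axis ratios; treat them as reference values, not a spec):

```python
# Commonly documented per-axis render scales for DLSS quality modes.
DLSS_SCALE = {
    "Quality":           2 / 3,   # 4K output -> 2560x1440 internal
    "Balanced":          0.58,
    "Performance":       1 / 2,   # 4K output -> 1920x1080 internal
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
```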
To be fair, the Xbox One S had native 4K, and it was beautiful. I took advantage of it in a few titles like RDR2 and Resident Evil 2 Remake.
The PS4/PS4 Pro, on the other hand, at most got an upscaled resolution from 2040p or something.
Happy the community is angry and telling Nvidia "This isn't acceptable", so they'll be forced to make a great step forward with the 50 series, so the prices can be justified or brought down. Ngl, I still believe the future of gaming will be streaming video games on a virtual machine, because at some point, as the technology evolves, it's gonna cost so much money that having your own hardware will be something luxurious. Remember the first laptops, computers, TVs... At some point we won't be able to evolve easily. Down to hear your thoughts...
I agree about the community. I'm happy to see the community isn't just accepting this like mindless zombies. This is truly great!
As for the streaming, I see the potential. The issue is the tech is still a ways off. Anytime you want someone to adopt a new way of life, you have to convince them that the new way is superior. Right now if you stream a game, you can easily see the issues with it. So until they can make it feel native, physical equipment will still be around. But what you propose is totally possible and likely probable at some point in the future.
If hardware evolution trended like TVs, we'd all be playing on graphene-based chips and rocking 8K 144Hz. The key point being that I can get a 55" LG OLED Evo for less than the 4090, with enough change left over to buy a couple of new games, and dinner.
PC gamers DO NOT WANT streaming, we like the hardware, the building, the tinkering. Console being replaced by streaming? of course. PC? no way in fucking hell.
Exactly what I was thinking about raw performance. DLSS should be a feature to improve performance on old Nvidia cards, let's say on Ampere cards, but it should not be the sole selling point. Maybe Ampere card owners sacrifice some visual quality for the sake of higher framerates, but someone who is buying a 4090 for $1600 should get real 4K with a framerate uplift compared to the previous gen with RT on. This is bad news, and I see no significant value in upgrading from a 3090 to a 4090 for gamers, at least at this ridiculous price.
The thing is, when has raw performance ever been an issue in the last 5 years? All everyone gets fixated on is frame rate, as if we haven't had more than good enough frame rates at any resolution for years now. You still play at 1080p? Well, there is a stack of cards that can do max settings at 1080p at stupidly high frame rates; same for 1440p, and 4K was basically job done with the 30 series. Heck, a 1080 Ti will do the job at 4K in most games if you don't insist on ultra settings and are happy with high or even medium.
I for one am glad that Nvidia is at least focusing on advancing the visuals. Games have looked great using rasterisation for the last 10 years, so at what point do we want them to stop focusing on making rasterisation faster and faster and start actually pushing the visuals on? RT, as we should all know by now, is that next step, and DLSS makes it playable. Nvidia knows it, AMD knows it and Intel knows it, and all have created their own technique.
So many people want games to look like Toy Story but run like CS:GO, and yet here we are, still trying to get close in real time to what movies could do 25 years ago. It's that difficult to do, and we still cry about framerates.
The RTX 4090 is an 80%-100% uplift over the 3090 in pure rasterization! That is huge. Also, it's a 60%-70% uplift over a 3090 Ti. The price increase is only 6% over the previous 3090 MSRP! That is really good! Whatever you're watching, listening to or SMOKING, please, my friend, stop now, good sir!
@@jinx20001 I never said raytracing is not worth it, it's actually awesome, but why on earth do you have to rely on DLSS upscaling tech to achieve 60+ fps in 4K? And if RT is not possible at 4K 120 fps, why would anyone want to pay $1600 for a card that can't do 4K with your new RT tech? Or else just do fair pricing. Locking DLSS 3 to only the 4000 cards to make them look good compared to Ampere is also suspicious.
@@cdiff07 Those uplifts are without raytracing, my friend. Once more I am saying, which you didn't take note of earlier: can an RTX 4090 do NATIVE 4K with RT on at 60+ fps? Without raytracing even AMD can do native 4K with those uplifts you are talking about.
@@forgot46 What I read and replied to from what you said was referring to real 4k with framerate increase. Not to mention other people saying that the 4090 is going to barely be higher than a 3090ti in pure rasterization. All the outrage is OVER rasterization, which is not RT ON. Like I said, RTX 4090 is the Only card to buy, the 4080 16gb and 12gb model are way overpriced for what you get!
I wanted to be excited for the new cards, but seeing this info about Nvidia's new pricing along with the 4080s kind of discourages me. I want to see what AMD is going to release; hopefully it's something good. I own both AMD and Nvidia cards
I remember back in 2014 the Titan X (basically the most expensive GPU you could buy as a consumer, and not many bought it anyway because of its ridiculous price back then) was $1k, and $600 GPUs were considered high end...
Now we have this BS...
7:40 Look at Microsoft Flight Simulator; it's all you need to know that they use DLSS 3 for these benchmarks. This game is CPU-bound, so they get more performance compared to the 3000 series thanks to fake frames, but between the 4000 series cards there is not much difference, because they are not faster while the CPU is still holding them back. So they are comparing apples to oranges.
It would be funny if Crysis 4 released right now and the RTX 4000 wasn't able to run it
Good stuff, subbed
Awesome! Thank you!
They hide the pan and an egg. After someone buys it, Jensen comes out and says "you can fry your egg now"
After countless videos, you're the first one to actually touch on these very important topics. Thank you. Waiting on AMD's event; otherwise I'll be looking for a reduced-price 3000 series card
I bought a 3080 a few months ago. I'm glad I didn't wait. I'll be skipping this gen
It reminds me of software that boosted performance by using the CPU's integrated graphics in conjunction with the discrete graphics card; it turned out it was only adding blank frames to get higher fps. DLSS 3 is not the same, but it looks like instead of a blank frame it's an extrapolated frame. It sounds sketchy anyway... I saw a benchmark of a game with DLSS 3 hitting >400 fps but with 9ms latency; it doesn't make sense, since it should be 2.5ms
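The commenter's arithmetic, spelled out as a quick sanity check using only the numbers quoted above:

```python
# Frame time vs. measured latency, using the numbers quoted above.
reported_fps = 400
measured_latency_ms = 9.0

frame_time_ms = 1000 / reported_fps          # 2.5 ms per displayed frame
frames_of_lag = measured_latency_ms / frame_time_ms

print(f"frame time: {frame_time_ms} ms")
print(f"latency:    {measured_latency_ms} ms = {frames_of_lag:.1f} displayed frames")
# ~3.6 frames of lag: the display runs at 400 fps, but inputs only affect
# the real (non-generated) frames, so latency tracks the lower native rate.
```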
If DLSS 3 is required to achieve good frame rates with RT turned on, then I'll wait for something better than the rtx4090.
I bought a 1600w EVGA Supernova just for the rtx4090.
Now I'm hearing about burnt adapter cables on some power supplies... and that the 4090 will require a version 3 (ATX 3.0) power supply.
And that DLSS 3 is required to be turned on if you want those great frames...
I'm currently running an rtx3070 on 3 x 28" 4k screens. (I7-6950 extreme o/c 4.2Ghz & 64GB ddr4 3200, 970 EVO plus nvme 500GB & 1TB for games)
I'm expecting the RTX 4090 to run at least 60fps with all settings maxed... without DLSS 3 and with RT turned on... for those games that support RT.
Said all this and more.
And everyone now needs a full tower case again for this new graphics card... and also a midi tower for everything else, with a long cable. Lol.
I call dlss3 the Psychic Algorithm.
When GN and HU get a hold of these, it's going to be a bloodbath of epic proportions.
Oh I can't wait for their videos. It's gonna be amazing!
@@ErockOnTech Every now and then, having done this a long time, you have those moments.
This very much feels like several all rolled into one between now and January.
Just said to someone on another channel: AMD have had their moments and their misses, and this is a moment they really don't want to miss.
It stands out like a cherry on a Bakewell tart, or at least will over the coming weeks and should be pretty hard to miss.
All the tech reviewers are chattering about the backlash and hate, and it should stand out as a golden opportunity.
I for one didn't see this elephant coming.
@@saintallnights7239 The 2022 disappointment tour T-shirts are gonna be great, that's for sure.
@@x8jason8x lol
Let us wait for the test results :)
I like your style of making vids brah. subbed
Thank you very much! I appreciate that!
As someone who is transitioning from console (Series X) to PC and Steam Deck, I can't help but wonder if I made the right choice when I see moves like this and EVGA no longer dealing with Nvidia. Just a lot of red flags all around.
You're fine. On PC, get Game Pass. Then get a PS5 and a Switch. The Xbox is a PC, game-wise.
Lisa Su will send Nvidia back to the Shadow Realm
She will eliminate Jensen's life points with her trap card.
You nailed it bro. Earned my sub today
Thank you very much! I appreciate it!
I hope FSR 3 will be improved with optical flow acceleration in addition to the FSR 2 features, but still without ANY machine learning hardware. Improving shader FP16/INT8 performance may be more important for raw upscaling and downscaling performance. I would like to see FP16 on the shader cores reach up to 8x FP32 FLOPS performance. Maybe DirectML would help FSR even further.
Also, remember ray tracing matters most only for lighting, reflections, shadows, and metallic surfaces. Rasterization is still used for most textures.
Improving the ray accelerator core should offload much of the BVH work to it.
I haven't heard much about FSR 3 so far.
This is a very good analysis. Nvidia is again up to its old tricks. They went from raw power to fake speed; a marketing ploy to harvest sales from consumers. AMD and Intel need to capitalize.
DLSS is GARBAGE. It is a crap ai program that makes your games look blurry. I would NEVER use it. For those of us with good vision DLSS is HORRIBLE. It will make your game blurry and laggy.
word
so true
Just bought an MSI 3080 12GB Suprim. Seeing the stats of the 4080 12GB, I'm just losing out on the next-gen CUDA core speed: my card clocks 500/600 MHz slower but has like 1000 more cores
I have a 3080 10GB. I'm perfectly happy with it!
Good job and great info. Has some good points to consider.
RTX 4090 gonna be a beast...!!!! Can't wait to get one
It amazes me that people still don't understand the concept of native resolution; this conversation has been around since the Xbox 360/PS3 started upscaling games.
Trust me, I know.
The amount of Lovelace in the 4080 16GB and 12GB almost makes me think they are holding back for next gen. There's enough cores missing between the 4080 16GB and the 4090 for 3 more cards; the 16GB card itself has 60% of the cores and resources, leaving room for cards with 70, 80 and 90% of Lovelace. It's disingenuous. The base 3080 had 8704 cores vs 10496 for the 3090. We are nowhere near that ratio here.
8:43 Exactly. I wanted to get a new GPU, but now I think I'll wait for the real specs. I don't care about DLSS on or off. I want raw performance specs.
Yes exactly! I'm excited for AMD RDNA 3!
You make an excellent point that I hadn't noticed before about the DLSS off/RT on etc - I was shocked. Hell, I'm playing CP77 right now and it looks incredible - I'm getting more FPS at 4k with RT on and DLSS off on my 2070 SUPER. My old 970 could compete!
Yeah, but maybe they had all settings at ultra and ray tracing on psycho or something... which is not an excuse, but it could mean that performance in general was just left at the same levels, and maybe they improved perf per watt compared to the 30 series...
Which would make sense considering they went from an 8nm node to 4nm. Power efficiency :)
They did have everything at ultra, and the ray tracing 'overdrive' setting in Cyberpunk.
Still, from the benchmarks on the 3 games which don't have DLSS 3, the 4070 (I'm not gonna call a 104 chip a 4080) was performing about on par with a 3090. And at $900 it's about the same price as a 3090 is right now. That would make the 4080 (the real one, 16GB) a little faster than a 3090 Ti (at a little higher asking price), and the 4090 should be the fastest Nvidia card, at $1600. There's no generational cost:performance gain, which makes the 4000 series quite a dud. If past generations had charged for every perf gain, cards would cost $100,000 now.
Is AMD releasing or just announcing RDNA 3 on Nov 3rd? I guess products are often released a month after the announcement
To the best of my knowledge, they are releasing on November 3rd.
Correct me if I am wrong, but I think the 22fps in Cyberpunk was the "RT overdrive" mode.
You're correct. But 22 fps looks bad no matter how you look at it, in my opinion. Also, why make a new, more demanding RT mode? We never mastered the last mode.
After being forced to use DLSS in Cyberpunk, I will never use it again. Yes, it's good at faking the look of the resolution you've designated, but only if you're the sort of person who is blind to the artifacting. I am the opposite of that sort of person. 1: Any motion that has more than a single vector, such as riding an elevator (static) while the outside scenery scrolls up (moving) causes a smear of artifacts like you're wiping oil on your screen. Also when your first-person arms move too fast in front of your face. 2: Billboards and other "pure texture" elements: These are 1440p (in "quality") or 1080p ("performance"), *period,* not 4K. Because they are flat elements instead of 3d meshes, and DLSS has nothing to "upscale".
interesting and solid sounding analysis, gj looking through the smoke and mirrors
I was saving for a 30 series but heard AMD slashing prices on current stock, getting ready for new. Now I'm saving for an AMD.
Judging by the second chart they released, the AC Valhalla numbers seem to be the only ones we should set our expectations by. And from what I see, they just added more cores, overclocked them and increased the power limit, and the 4090 gets about 50% more performance compared to the 3090 Ti when DLSS and RT are out of the equation. Also, the extra core count does not scale all too well, because the 4090 has more than double the CUDA cores of the 3080 12G, and double everything else, but the performance gain seems to be about 60%.
That aside, 2600 MHz is about 45% faster than 1800 MHz, and 10752 CUDA cores is 40% more than 7680 CUDA cores. But across the board the 3090 Ti is faster, so the 4000 series has slower CUDA cores than the 3000 series. So probably the sweet spot card in terms of core count and power scaling is the 16GB version of the 4080. And the 4090 is probably doomed to fail at random because of its super high power budget. That card will be notorious for solder balls melting and shorting a power stage due to the extreme heat, possibly even burning a hole in the PCB.
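That napkin math in runnable form (the clocks and core counts are the commenter's figures; the "expected" uplift naively multiplies the two ratios, which real workloads rarely achieve):

```python
# The commenter's napkin math: naive expected uplift from clocks x cores.
clock_new, clock_old = 2600, 1800      # MHz (commenter's figures)
cores_new, cores_old = 10752, 7680     # CUDA cores (commenter's figures)

clock_ratio = clock_new / clock_old    # ~1.44  (+44%)
core_ratio = cores_new / cores_old     # 1.40   (+40%)
naive_uplift = clock_ratio * core_ratio

print(f"clock: +{(clock_ratio - 1):.0%}, cores: +{(core_ratio - 1):.0%}")
print(f"naive combined uplift: +{(naive_uplift - 1):.0%}")   # ~+102%
# The observed ~50-60% gain is well short of this, which is the commenter's
# point: extra cores and clocks don't scale anywhere near linearly.
```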
Good points. Thanks for sharing. Overall, I'm sticking with my 3080.
I agree with all your points. But I will always be a 4K purist and don't use any upscaling in my games. I too wonder what the true raw specs will be, and from what I have been seeing there isn't much gain over the 30 series in raw data. Can't wait until AMD's RDNA 3 announcement.
Rasterization and current ray tracing are also full of hacks. If you want "pure" 4K you better not buy a GPU for at least the next 10 years, as it won't be purely ray-traced to a sufficient accuracy
In Cyberpunk 2077, native 4K, RT on:
20 fps using a 3090
22 fps using a 4090
This is unacceptable.
A 10% increase in performance at that price???
After 2 years. Like, what?????
DLSS is being pushed solely to combat the performance hit that comes along with ray tracing. All these 40 series cards, as well as the 30 series cards, already have rasterization taken care of. Yes, the Cyberpunk numbers shown at native 4K are abysmal, but that is mainly down to running the insane setting for all ray tracing effects. Bottom line: until they create a new memory system to accommodate the massive demand that RT puts on the GPU, adding more RT and Tensor cores isn't really accomplishing anything. Maybe instead of adding more CUDA cores and slapping a ridiculous price tag on these "next gen" GPUs, Nvidia should innovate to solve the performance-sucking issue that comes along with ray tracing. Then maybe these Tensor cores could be used for something groundbreaking (like enemy AI or world simulation) instead of having the sole purpose of being a band-aid for the performance gash caused by ray tracing.
I like this comment. You're not just complaining about the GPUs or my video lol. You're actually proposing potential solutions to these problems. You're right. Nvidia should have focused on solving the RT problem.
I had a Samsung TV that inserted fake in-between frames; it was really horrible, a fake smooth motion that just looked so wrong. That kind of tech should be left to the monitor, not the GPU, to implement.
The 4080 12GB is 100 percent a 4060/4060Ti class card, since it is the fully unlocked version of the fourth largest die from this generation. This is based on the idea that Nvidia would usually reserve their largest die (possibly the AD100, you bet they'll introduce it later) for their server products, their second largest (AD102) for x080 and above cards, and the third largest (AD103) for their x070 and above. And so, logically speaking, that makes the AD104 the fourth largest die of this generation, which puts it in the x060 class. The fifth largest would go into lower end products like the x050 class and below.
Instead, what they're doing this time is reserving the entire AD102 die for the 4090 class, pushing the 4080 class down to the AD103 (renamed from 104) and adding the 104 (actually the 106) to the same class while they're at it.
You could argue these are just names, and Nvidia can call them whatever they want as long as the performance and hardware value is there. Except it isn't.
Smaller die sizes also mean smaller memory buses, so we go from 384-bit on the AD102 to 256-bit on the AD103 to 192-bit on the AD104. For context, Nvidia has been using the 192-bit bus for x060 class cards since Pascal, all the way up to Ampere. That's why we've been stuck at 6GB on this class of cards, with the occasional 2060 12GB and 3060. Die-size-wise, the AD104 die is only 295mm^2, which makes it the smallest x060 class die since the 1060 (200mm^2), with previous x060 class dies being 394mm^2 on the 3060 and 445mm^2 on the 2060. Yes, silicon prices have gone up, but you need to understand that the die doesn't constitute 100% of the price of the whole card, meaning even a 100% hike in silicon price shouldn't warrant a $100 price increase, let alone $500, assuming the 4060 Ti would be priced at $400. And TSMC reported a 20% price increase, not 100%.
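To make the cost argument above concrete, a hedged sketch: every dollar figure below is invented to show the shape of the argument, not a real bill of materials.

```python
# Illustrative only: all dollar figures are made up to show the shape of
# the argument, not real BOM numbers.
card_price = 400.0        # hypothetical x060-class MSRP
die_share = 0.25          # assume the GPU die is 25% of the card's cost

die_cost = card_price * die_share                 # $100
silicon_hike = 0.20                               # TSMC's reported ~20% increase
new_card_price = card_price + die_cost * silicon_hike

print(f"die cost share: ${die_cost:.0f} of ${card_price:.0f}")
print(f"after a {silicon_hike:.0%} silicon price hike: ${new_card_price:.0f}")
# A 20% wafer-price increase on a 25% die share moves the card ~$20,
# nowhere near a $500 jump.
```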
In conclusion, AD104 - small die, small bus width, big price. 4060Ti class is now 4080 class. Because Nvidia has a massive RTX3000 inventory to clear.
Their justification is DLSS 3.0, thinking we'd forget that rasterization is a thing. With this generation they're literally selling us a software feature, which explains their lockdown of DLSS 3.0 to RTX 4000 only.
You're 100% correct! I've been saying the 4000 series cards are an expensive software update. People get mad at me about it. But it's true.