RTX 4000 Is All Smoke and Mirrors

  • added 22 Apr 2024
  • In today's video, I discuss the new RTX 4090, RTX 4080 16 GB, and the RTX 4080 12 GB. What is Nvidia hiding? Where is the RTX 4070?
    CONNECT WITH ME
    ---------------------------------------------------------------------------------------------------
    EMAIL ErockOnDeckYT@Gmail.com
    DISCORD / erockontech
    TWITTER / erockontech
    INSTAGRAM / erockontech
    -----------------------------------------------------------------------------------------------------------------------
    The videos you will find on this channel are similar to videos you will find on channels like Alpha Gaming (Harris Heller) and JayzTwoCents. Also, there are a lot of themes and influences from creators like Nutty, BitWit, and Linus Tech Tips.
    -----------------------------------------------------------------------------------------------------------------------
    The music in this video is from Harris Heller's StreamBeats. You may have heard about this from Alpha Gaming.
    -----------------------------------------------------------------------------------------------------------------------
    🎬Chapters
    🕒 0:00 - RTX 4090 and 4080 Price and Release Date
    🕒 0:20 - Where is the RTX 4070
    🕒 0:48 - RTX 4080 12 GB Specs
    🕒 1:52 - Nvidia is hiding something
    🕒 3:15 - RTX 4090 Ray Tracing and DLSS
    🕒 3:25 - DLSS 3 is exclusive to RTX 4000 Series GPUs
    🕒 3:53 - DLSS 3 uses fake frames and creates more latency
    🕒 5:12 - Where is the Raw Rasterization for 4090 and 4080
    🕒 5:25 - What is DLSS
    🕒 7:00 - DLSS 3 is the main selling point of the RTX 4090
    🕒 8:28 - Is the RTX 4090 better than RTX 3000
    🕒 9:35 - AMD Radeon 7000 Series GPU Release Date
    🕒 10:00 - How can AMD beat Nvidia
    -----------------------------------------------------------------------------------------------------------------------
    This video addresses all the following topics:
    RTX 4090
    RTX 4080
    RTX 4080 12 GB
    RTX 4080 16 GB
    RTX 4070
    RTX 4090 price
    RTX 4090 release date
    RTX 4080 price
    RTX 4080 release date
    Nvidia
    DLSS
    DLSS 3
    AMD
    AMD FSR
    AMD Radeon 7000 Release Date
    AMD vs Nvidia
    RTX 4000 Series GPUs
    RTX 4000 vs RTX 3000
    RTX 4090 is a rip off
    RTX 4080 is a rip off
    RTX 4090 is overpriced
    RTX 4080 is overpriced
    4080 12 GB is really a 4070
    #Nvidia #RTX4090 #RTX4080

Comments • 1.1K

  • @ErockOnTech  1 year ago +22

    Are you getting an RTX 4000 Series GPU? Let me know below.
    GET DISCORD ACCESS FOR ONLY $1: www.patreon.com/ErockOnTech
    Follow me on Twitter: twitter.com/ErockOnTech

    • @TheCudder4life  1 year ago +3

      @No Scope GG You are the most enlightened!

    • @MrArrmageddon  1 year ago +2

      As long as the 4080 16GB is better than my 3070 Ti, yes. Otherwise I'm likely getting a 3090 Ti. I game and I use Stable Diffusion, so I would like to have at least 16GB of video memory to render AI images. 24GB is kind of overkill for me. So that is my plan; my PC has already met all the requirements to support a 4080 since I got it. But I admit there's a lot about the 4000 series launch that either pisses me off or worries me.

    • @brandonmitchell2374  1 year ago +2

      Nah, probably not. I currently have a 3060 but might try to snag a cheap used 3080 in the future.

    • @Zoppaz  1 year ago +1

      Yes, a 4090; I already sold my 3080 at a profit.

    • @yourhandlehere1  1 year ago +5

      I'll keep my 3080 til the 13080 comes out.

  • @hemzaonline  1 year ago +123

    NVIDIA: also don't miss the GT4030 starting at $499.99

  • @billlawson3467  1 year ago +172

    I have been screaming that in the Discords I'm active in. DLSS and FSR are OK for taking a lesser card and giving you a bit more. But when you need to use them to toot your own horn about 4X the performance, you are only kidding yourself.

    • @Thundah_Dome  1 year ago +13

      FSR is for the budget cards, 100%.
      I'm an AMD fan, and when it works, it's magic, but yeah. It isn't meant to deliver quality graphics.

    • @ufukpolat3480  1 year ago +13

      I'm mostly interested in DLSS/FSR and other tech that fakes/imitates full-scale resolution/framerate (interpolation etc.) for truly low-power systems like handhelds, tablets and ultra-portable laptops, which stand to benefit a lot from the low power consumption of simulated frames through lower heat dissipation and longer battery life. If FSR 3.0 and/or DLSS 4.0 can give 1080p 60fps on a tiny handheld while upscaling from 480p, that's truly a great win. In the past a lot of games supported multi-GPU setups for those who wanted the ultimate high-resolution, high-fidelity, high-framerate experience, but last-gen consoles pretty much rendered that useless, and Nvidia has zero incentive to go that route given how power-hungry their high-end GPUs have been lately. There's no way anyone can host a dual 3090 (Ti) setup without a significant cooling solution and/or compromising on noise levels. This is the only way they can pretend to be pushing toward a future they promised with each generation but failed to deliver, like with ray tracing support. Even the highest-end GPUs can't render a fully ray-traced game as simple as Quake at 4K in real time at high framerates.

    • @alissonvieira6903  1 year ago +3

      If you're limited to 100 fps and already have 80 fps natively, turning DLSS 3 on would fill your 100 fps limit with fake frames, an unwanted effect. It's safe to say DLSS 3 only makes sense if your card is delivering half or less of the fps your monitor can handle.
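
Editor's note, to make the arithmetic above concrete: a minimal Python sketch, assuming a strict 1:1 alternation of real and AI-generated frames (a simplification of DLSS 3's actual scheduling; the helper name is made up for illustration).

```python
# Model: with frame generation on, every rendered frame is followed by one
# AI-generated frame, so uncapped display output is 2x the native rate.
def real_fps_under_cap(native_fps: float, cap_fps: float) -> float:
    shown = min(2 * native_fps, cap_fps)  # the fps limiter caps what is displayed
    return shown / 2                      # under 1:1 alternation, half the shown frames are real

# 80 fps native under a 100 fps cap: the cap fills up, but only 50 real frames survive.
print(real_fps_under_cap(80, 100))   # 50.0
# 40 fps native under the same cap: all 40 real frames survive, plus 40 generated.
print(real_fps_under_cap(40, 100))   # 40.0
```

Under this model, frame generation only adds frames without displacing real ones when twice the native rate fits under the cap, which matches the commenter's "half or less" rule of thumb.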

    • @eleventy-seven  1 year ago +4

      DLSS at 1440p is like magic. Set it to Quality instead of Performance and it looks great.

    • @Thundah_Dome  1 year ago +1

      @@eleventy-seven And then it drops frames and is unstable. Idk, not sure it's worth the price.

  • @KryssN1  1 year ago +223

    Perfect take.
    Nvidia wants to charge us based on DLSS 3.0 numbers, while it looks like fluff performance that is not real framerate in practice.

    • @ErockOnTech  1 year ago +23

      Thank you! Yes, exactly. It's a rip-off. This is anti-consumer.

    • @jeevan1198  1 year ago

      So many people, and I assume they are Nvidia fanboys, have said the 40 series performance is impressive. But it's really not. Nvidia hardly showed any apples-to-apples benchmarks because they are trying to hide something. They are getting all of this fake performance from DLSS 3. It's not real performance, and that's exactly how Nvidia are scamming gamers. They are charging ridiculously high prices for their GPUs but giving them such a minor performance boost. Nvidia likes to minimise performance and maximise the prices on their GPUs. They are trying to hide the fact that the 40 series is a very minor performance increase over the 30 series. I highly recommend that everyone avoids buying a 40 series GPU. They are nothing but a huge scam.

    • @jeevan1198  1 year ago +7

      Plus, Nvidia are making these false 2 to 3x faster performance claims. They shouldn't be allowed to make these performance claims unless the GPU is actually that much faster than its predecessor, apples to apples. DLSS 3 is a cool feature, but they can't just rely on that.

    • @jeremycovelli  1 year ago +5

      Why is nobody understanding that ray tracing and DLSS are the future, not rasterization? There's a reason they're pushing DLSS...

    • @liesandy291  1 year ago +8

      @@jeremycovelli Simple: give me native.

  • @BartoszBielecki  1 year ago +101

    As an owner of an RX 6900 XT, I am happy to know that FSR 3.0 will probably still support my card, and I won't be slapped in the face like Nvidia owners.

    • @absurdgaming00  1 year ago +4

      Maybe it will also support Nvidia's cards, like FSR 1.0/2.0 did? Who knows...

    • @beretteres4031  1 year ago +5

      Are you sure about that? FSR 3.0 is rumored to function like DLSS, thus requiring dedicated hardware that the 6900 XT simply doesn't have... still, these are all rumors.

    • @BartoszBielecki  1 year ago +1

      @@beretteres4031 That's why I said probably. But if they do it, they will look just like Nvidia. A plausible solution for them would be to offer FSR 3.0 for RDNA 2; that wouldn't provide such low latency/high frame rates, but it would still bring a better experience for latency-sensitive titles.

    • @beretteres4031  1 year ago +3

      @@BartoszBielecki Not really, because Nvidia is locking down a feature that is possible on older generations. AMD with FSR 3.0 is just trying to catch up with XeSS and DLSS, and in order to do so new hardware is needed; no lies, just trying to advance.

    • @marshinz5696  1 year ago +4

      @@beretteres4031 I think AMD will be smart. I think FSR 3.0 will function one way for the 7000 series, but a basic version that's superior to 2.0 will be made available on non-7000 series GPUs. That should be the move.

  • @robertobarrios2008  1 year ago +6

    The only creator that's not licking Nvidia's boots to get a free 4090. Didn't miss anything. Thanks a lot. Excellent work.

    • @ErockOnTech  1 year ago +3

      Thank you. I appreciate it! 🙏

  • @RobinHood-yk8og  1 year ago +20

    DLSS is set to _Performance_ mode in their benches.

  • @icantlivewithoutwifi  1 year ago +59

    I absolutely agree with you. A $1600 card that only gives you 22 FPS with RT turned on is a terrible investment. So the major improvement of the 40 series is the software, not the hardware.

    • @xkeepersvk  1 year ago +3

      Exactly, and they could activate that for the 30 series.

    • @chiyolate  1 year ago +5

      I'm afraid they can't go much further in terms of raw power, so they have to look for improvements from this kind of technique... even at 450W! Could we be close to being stuck on performance? Is this going to be the end of GPU development?

    • @sql64  1 year ago

      The improvement is bigger than Ampere's was lol, wdym

    • @x8jason8x  1 year ago +1

      Actually, a key part of DLSS 3.0 is built into the 40 series hardware; it's why only two of its three features will be available to 30 series cards.

    • @x8jason8x  1 year ago +5

      @@chiyolate 100% right on the money where nVidia is concerned, it's because they didn't innovate in the same way AMD has with chiplet design. They're demonstrating in real time why silicon is dead. AMD is utilizing the only approach left before we have to move on to graphene chips, or whatever tech is next.

  • @gamerfly3064  1 year ago +6

    Basically "Jensen Black Jacket" is saying: buy our old rotten 30 series, because I need more jackets and I am all out of jacket.

  • @ragingmonk6080  1 year ago +12

    I told people that Nvidia stock was priced so high because of crypto card sales. I said when Ethereum crashes the stock price will drop back to normal and it did. Jensen and the investors had a sugar high with the high stock prices and now that stock prices are back down to normal... mad scramble begins.
    Consumers have the power at this point. Follow EVGA and tell Nvidia no and they will have to clean their act up. Buy the junk and Nvidia will not learn anything.

    • @kerbodynamicx472  1 year ago

      Were you short-selling Nvidia when the Ethereum merge date was announced?

    • @kerbodynamicx472  1 year ago +4

      @No Scope GG nah, most gamers can’t afford the ridiculously priced 40 cards

    • @ragingmonk6080  1 year ago

      @@kerbodynamicx472 I don't own Nvidia stock; the dividend is too low. 98% of my stocks have high dividends and I hold long. Only a few stocks are for day trading.

    • @NickyNicest  1 year ago +3

      Don’t forget they also have an abundance of 30 series to offload as well. So they are also pricing the 40 series high to make the 30 series seem like a good deal so they can keep charging exorbitant prices for last gen cards.

  • @marcgallant5270  1 year ago +13

    There's basically 0% IPC lift; it's all in the node shrink and clock speed increase. In other words, all the improvements are from TSMC, not Nvidia engineering.

    • @x8jason8x  1 year ago +2

      Even with the node improvements, they focused so hard on tensor core performance that they forgot to improve anything else, and need 1.21 jigowatts just to power it. 😅

    • @monotoneone  1 year ago

      Which is fine, if the price is fair. Maxwell and Pascal were also very similar in terms of architecture, but there was a large performance increase for only a $50 price increase.

    • @PoisoningShadow671  1 year ago +1

      @@x8jason8x If only your PC could bring you back in time.

    • @x8jason8x  1 year ago

      @@PoisoningShadow671 Right? lol

  • @AllergiesG  1 year ago

    What's up! Can I still use my Corsair HX1000, or do I need to upgrade to a Gen 5 PSU? Or can I use an adapter?

  • @geoffreystraw5268  1 year ago +62

    Great analysis; 22 frames for a 4090 is shocking. I know Cyberpunk is the new Crysis, but still. Also, a new sub :)

    • @ErockOnTech  1 year ago +10

      Thank you! I really appreciate it! Welcome!

    • @0rrchids  1 year ago +12

      I just checked some YouTube video that tested a 3090 Ti with Cyberpunk at 4K native, no DLSS, and it got 20 FPS! So... it's only a 2 fps difference!?

    • @leihejun844  1 year ago +1

      @@0rrchids At 9:28 he said he confirmed that they're both the 4090. I still can't believe it.

    • @Tom--Ace  1 year ago +7

      ​@@0rrchids this was the new RT overdrive mode, so not comparable at all

    • @geoffreystraw5268  1 year ago

      @@0rrchids Holy shit lol wow

  • @Zockerfreak999  1 year ago +5

    22 fps on an RTX 4090, bruh

  • @b34k97  1 year ago +14

    Pretty sure the 40 series is a software update, not really a hardware update. The only changes to the hardware look to be there to enable their new software magic, plus massive increases in power utilization.
    The extra power and clock speeds seem to at least help rasterization for the 4080 and 4090, but not by anything we'd really call a generational increase, especially if you look at it in terms of perf per watt.
    I think the benchmarks next month will be truly telling.

    • @ErockOnTech  1 year ago +7

      Lol yes you are correct! It’s a $1600 software update. Ridiculous.

    • @aoolmay6853  1 year ago +1

      @@ErockOnTech The "purely software 5nm process node update" will net a minimum of 50 TFLOPS on the top silicon die, up from 40 TFLOPS. It also seems you will be able to buy better perf/$ than last gen too. If you do some napkin math, you'll realize you get RTX 20 series perf/$ levels, which is actually great considering the world situation and market conditions.
      As for gamers, you can opt out of real-time ray-traced lighting in any/all games and you'll have plenty of extra rasterization compared to the previous gen, with no need to enable DLSS. Stop bitching.

    • @kingyogesh441  1 year ago

      @@aoolmay6853 No point arguing with these raster-cultist AMD fanboys.

    • @aoolmay6853  1 year ago

      @@kingyogesh441 Why call anyone a cultist fanboy? I use AMD GPUs exclusively for professional work; the price for AMD compute was significantly lower than Nvidia's products. This may or may not change after both companies show new products this year.
      I'm against exaggerating drama around rumors. Just wait for independent reviews and make your decision.

    • @paul1979uk2000  1 year ago +1

      It's not a software update, as I doubt Nvidia is stupid enough to try and pull that over our eyes lol, but something is off about the numbers Nvidia are showing us.
      After all, 22fps in Cyberpunk with RT is really low for a card that's supposed to be a lot faster than this gen. There is no way that card should be getting such a low frame rate unless it's 8K, but if it were, Nvidia would make sure we knew about it, and even then I suspect it should be able to do 8K at 30fps if not a little more.
      Like many are starting to think, it looks more like Nvidia is trying to convince us of the performance leap by using DLSS 3 to bump up the frame rate, which tells us very little about the raw, real performance of the card, which is what most gamers are interested in.

  • @brianjones6465  1 year ago +7

    They took the 4070 and slapped a 4080 sticker and price on it. Took an actual 4080, put 4 extra GB of VRAM on it, and charged the price of the 3080 Ti. They will release a 4080 Ti priced around $1600 next year.

  • @dralord1307  1 year ago +7

    Glad to see someone else noticed the GPU launch was just tacked on and more of a "fund us please" plea

    • @kasimirdenhertog3516  1 year ago

      Indeed. My guess is the price of leather jackets has gone up significantly.

  • @Thomaseron2  1 year ago +16

    This might be their worst launch since the FX5800, but let's wait for reviews.

    • @ErockOnTech  1 year ago +2

      I agree 100%!

    • @x8jason8x  1 year ago

      By far worse, that was a Quadro. lol

    • @gametime4316  1 year ago +8

      @@x8jason8x You are too young :P He is not talking about the Quadro but the GeForce FX 5800 Ultra from 2003, which got spanked by the 9700 Pro from ATI (now part of AMD).

    • @x8jason8x  1 year ago

      @@gametime4316 LOL I'm not too young to remember that at all. There's literally 5 years of difference between the two, but yeah I started using nVidia around 2004 exclusively.
      Thanks though, made my morning.

  • @supralex1  1 year ago +6

    It will be interesting to see the actual performance % uplift in RASTERIZATION at 4K for the RTX 4090 over the 3090 Ti... without DLSS or RT.

    • @kennethhohenthaner2227  1 year ago

      Look up "how fast is rtx 4090 without fake frames"; that video has some solid speculation based on Nvidia's graphs.

    • @Lightsaglowllc  1 year ago

      It’s about 70% per Nvidia’s charts

    • @michaelabbott1203  1 year ago

      @@Lightsaglowllc 61 percent when you combine the 3 games. Definitely not 2-4 times faster lol, but it's decent. The 4080 is around 20%, which isn't good at all for the price they want. The 4070 was 10 percent slower in Resident Evil and slightly behind in the other games, right where you'd expect a 70-class card to be compared to the previous gen's top-tier card. I refuse to call it anything other than a 4070 lol

  • @Kirainne  1 year ago +41

    Thank you for pointing out exactly what DLSS is actually doing. Everyone saw the marketing for higher framerates, but IDK how many people understand what it is actually doing.
    I can't wait to see what independent reviewers will say about pure rasterization with the three 40-series cards.

    • @x8jason8x  1 year ago +3

      I used to be a fan of DLSS when I misunderstood it. I confused their meaning of native with what it means to me as a tech, e.g. 4K native is purely reliant on your monitor specs. When I looked further into it, I finally understood they downscale, then upscale back. Using "native resolution" in this context is something of a misnomer; it's really just AI upscaling, but it isn't true 4K either. It's motion interpolation, which TVs have been doing for years.

    • @cdiff07  1 year ago +1

      The RTX 4090 is 80-90% faster than a 3090 and 60-70% faster than a 3090 Ti! Keep this in mind until a bit after launch, then tell me I was right! To me this is just fine for a 6% price increase over the previous gen! The numbers I just mentioned, if it needs to be said, refer to rasterization ONLY!

    • @x8jason8x  1 year ago +1

      @@cdiff07 LOL! No. If you believe that I have a bridge to sell you.

    • @cdiff07  1 year ago +1

      @@x8jason8x Come back to this comment after launch!

    • @cdiff07  1 year ago

      @@x8jason8x czcams.com/video/vt8Z8iM0aO0/video.html

  • @AJ-po6up  1 year ago

    Wow, great video, with some very interesting points I haven't seen anyone else talk about. You've definitely earned a new sub here; I've got the feeling this channel is going to become one of my main sources of tech news from now on. Keep it up!

  • @Accuracy158  1 year ago +3

    Didn't they kind of say that they want the 30 series and 40 series to co-exist for a little while? It's assumed there is a fair amount of stock left. I wouldn't expect these cards to render the old ones obsolete. Maybe they cost a little more but come with a couple of new features like DLSS 3.0 and better ray-tracing performance.

  • @billb0313  1 year ago +38

    Good take. I'm not interested in the 40 series cards. Looking forward to your video on AMD's presentation of the 7000 series. It'll be interesting to see what AMD offers and how they compare.✌

    • @07wrxtr1  1 year ago +3

      Same; my overpriced 3080 works just fine... I did an R7 1700, 16GB, a 1060, and 1x Samsung 970 Pro in 2017; then in fall of 2021: 3x 2TB Samsung 980 Pros, a 5950X, a 3080, and 128GB of RAM. Thus, I like doing 4-5 year upgrade increments... I find graphics cards hilarious; it reminds me of the car world where, when every new car model comes out, they use this term: "structural rigidity improved 15%" lol... So a car built in 2022 vs one in 2010 is a 5000% difference?! The marketing morons amuse me... but, Bill, the low-IQ morons buy into it because they're lost. They think material goods will make them happy, when there are other problems that are more meaningful and rewarding in life to solve. Computers should be lower on the list.

    • @buggerlugz6753  1 year ago +1

      They'll do the same and increase GPU prices. Corporate greed has no boundaries.

  • @xaranthius1415  1 year ago +2

    Agree with most of this. The only thing that hits me is the 22fps Cyberpunk stuff. It sounded like there is a new, higher-quality ray-tracing mode being added to the game that they are using. I'd like to know how current cards handle that, for comparison.

    • @stevetb7777  1 year ago

      Exactly. There's no apples-to-apples comparison. It doesn't mean anything.

  • @Kingramze  1 year ago +7

    Ray tracing seemed an interesting gimmick. So few games I play actually have it as an option, though. In the few that do, it's barely noticeable in normal gameplay, like a reflection in a puddle being more accurate. But DLSS was practically invented to offset the lower frame rate caused by enabling ray tracing. Since I don't use one, I don't need the other.
    I see DLSS as a "nice to have" for 4K gaming with the 30 series, but with Nvidia releasing fps benchmarks for the 40 series based on DLSS 3 being enabled, without a proper benchmark with all the bells and whistles off, I just can't compare apples to apples.
    It looks like they're just letting me know not to expect 4K 120 fps from the 40 series without the smoke and mirrors, and better luck next gen.

    • @a64738  1 year ago +1

      In Cyberpunk 2077 there is a huge difference in how it looks with RT on, but it is the only game where I see any difference... It's just sad that my RTX 3060 drops to 30fps with RT on, but it looks so much better that I run it at that fps anyway.

    • @Kingramze  1 year ago

      @@a64738 I enjoy watching streamers play Cyberpunk 2077 on Twitch, but it's just not for me.
      I have a 3090, but my favorite games - Subnautica and Subnautica Below Zero don't even have RTX capability, nor are there any plans to include it even though they still update the games, backport tech to the older one, and occasionally update the entire game engine. They claim not enough players even have RTX cards to bother with it. I think the latest Unity engine supports DLSS by default, so when they backport it, I might get that at least, but without RTX, I don't really need it.
      Same goes for a lot of games, so while I'm glad NVidia is innovating, The marketing speak of fps on only certain titles that include their proprietary tech is pretty shady and I look forward to youtubers giving us proper benchmarks once they get cards to test. :)

    • @nihylbaxter7396  1 year ago +1

      Word

  • @Samir72  1 year ago +4

    Good report, Erock! Glad you didn't drink the Ngreedia Kool-Aid. For the first time in years, I'm seriously considering purchasing a Radeon 7xxx series card when available (or, if it becomes possible, a new RTX 3090 for $400-600).

  • @VakmanCA  1 year ago +12

    I concur. That release will be a major mess. Regarding DLSS 3.0, I think Nvidia will add it to the 3000 series down the road.

    • @ErockOnTech  1 year ago +3

      I hope you’re right!!

    • @sephirothlv100  1 year ago +2

      They can't, it's a hardware limitation.

    • @ErockOnTech  1 year ago

      @@sephirothlv100 The VP at Nvidia said it was theoretically possible. However, the performance uplift won’t be the same as RTX 40 series. So they could add it.

  • @rasenganshimada208  1 year ago +1

    Should I go for a 3080 Ti or a 3090 Ti? I've got a 360Hz 1080p and a 144Hz 1440p monitor, and I'm stuck with an RTX 2070 Super.

  • @hardcorehardware361  1 year ago +3

    Totally agree with everything you said, Erock. Nvidia want to sell a service, and that presentation was a shareholder event. This launch gives me Fermi vibes (GTX 480).

  • @Phenylalanin1979  1 year ago +14

    Right on, my man. There is something going on here with the raster numbers. I think they are fine and we will know everything in 3 weeks, but Nvidia wants to focus on the new shiny thing over what 99.9% of games use.
    Here are the facts: the die stack has just been renamed. I've yet to see any tech channel catch up to this fact. It's right there in the numbers.
    102 is still 102 (384-bit)
    104 is now 103 (256-bit)
    106 is now 104 (192-bit)
    The "4080 12 GB" is a 60-class card. Just look at the bus width / number of memory PHYs on each chip. Going down the stack it's 384, 256 and 192 bit, or 12, 8 or 6 memory modules. Also look at the die sizes: the "4080 12 GB" chip is under 300 mm², so clearly a 60-class die.
    900 USD, or a lot more realistically, for a 4060 with a fake name and price. So I just bought a used Asus Strix RTX 2080 Ti, because I'm done waiting. I'm not getting anything at these prices.
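
The napkin math in this comment is easy to reproduce: GDDR6/6X packages each use a 32-bit interface, so bus width divided by 32 gives the number of memory modules. A small Python sketch (the die names are the commonly reported Ada part numbers; the class mapping is the commenter's claim, not an official NVIDIA spec):

```python
# Each GDDR6/6X memory package has a 32-bit interface.
def memory_modules(bus_width_bits: int) -> int:
    return bus_width_bits // 32

lineup = [
    ("AD102 (RTX 4090)",     384),  # 12 modules: the traditional x02, 90-class bus
    ("AD103 (4080 16 GB)",   256),  # 8 modules: the width x04 dies used to carry
    ("AD104 ('4080 12 GB')", 192),  # 6 modules: historically a 60-class bus
]
for die, bus in lineup:
    print(f"{die}: {bus}-bit bus -> {memory_modules(bus)} memory modules")
```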

    • @saintallnights7239  1 year ago +4

      This has been said a lot, and I'm betting it will only get worse over the next month.
      Absolutely everyone calls it anything from a 4060 to a 4070 Ti. I for one cannot wait to see benchmarks from Gamers Nexus and Hardware Unboxed, as it's going to be a bloodbath. Even Jayz is not impressed.
      I saw a video by Graphically Challenged, and the rumour mill has Radeon's raw rasterization going to be right up there, with prices leaked as lower.
      That being said, I think when AMD soak in all this backlash, and the even greater one from the benchmarks, they may drop prices a little, knowing they could wipe the floor with Nvidia for market share.
      If AMD have any sense.
      It doesn't matter how much cheaper your cards are if they don't sell and increase market share. In this current and unprecedented climate it is going to take a fair amount of wisdom.

    • @VGFightSchool  1 year ago +5

      @@saintallnights7239 Exactly! I'm glad someone said AMD will need wisdom. They seem to have opportunities and sometimes can't take them.

    • @saintallnights7239  1 year ago +4

      @@VGFightSchool Been following this too long.
      They have had their moments and they have had their misses. This is a moment they won't want to miss.

    • @user-ki1wq6pt9u  1 year ago +1

      70-class, not 60-class

    • @Phenylalanin1979  1 year ago +7

      @@user-ki1wq6pt9u Nope, 60-class. A 192-bit bus is a 60-class card.

  • @timotervola2734  1 year ago +16

    The GPU architecture is obviously hitting a hard performance limit. They cannot achieve better performance anymore through design or material improvements or any other means, so they just added more CUDA cores and more power by upping the clock frequency. Pretty much just afterburning the engine.

    • @ErockOnTech  1 year ago +7

      And that’s fine. As time goes on, the materials will improve. So why make this launch? They could just keep the 30 series going. Or if they needed to do the 40 series, then cancel the 30 series and make the 40 series a small refresh. Market it as a small refresh. Don’t market it as 2x-4x the performance.

    • @steveleadbeater8662  1 year ago

      @@ErockOnTech OR actually wait until you have the card before foaming at the mouth about performance. Otherwise you come across as unprofessional, and unprofessionalism leads to the dark side: fanboyism. No sane person is going to buy the 4090 until AMD have blown their load. History suggests AMD will balls this up, like last gen, but only a fool would buy until the raw numbers are out. Wild speculation only serves to muddy the waters and fan the idiot fanboy flames.

    • @Konkretertyp  1 year ago +2

      @@ErockOnTech A refresh like the "Super" brand with the 20 and 16 series, I would assume. I can imagine Nvidia doing that kind of thing.

    • @PawelLeszczynskipav  1 year ago

      @@ErockOnTech I think they have AMD on their back, making a claim they'll take the raster crown this generation. They had to respond with something...

    • @animalyze7120  1 year ago +1

      @@ErockOnTech Exactly, Nvidia really didn't need a 40 series right now; the 20 series hasn't even been fully maxed except when done on purpose, not through normal usage. The average gamer still plays on 900 and 10 series cards and has a great time. Competitive gamers are a small 1% of the total gamers out there. They should be selling the 20 and 30 series, raking in the dough, and working on a proper, fully ground-up new-gen 40 series for 2025, maybe a tad sooner.

  • @chiyolate  1 year ago +3

    The bigger your monitor, the blurrier the DLSS.
    I use a 42-inch TV, so anything not native, be it DLSS or any image-scaling technique, will result in softer or blurrier image quality.
    Besides, the fake frames, aka frame interpolation, will introduce artifacts, and at best it just ignores the hard-to-process fast-moving scenes. So in fast-moving games such as FPS or action games, you'll see a lot of unprocessed parts of the image due to the lack of data between frames A and B. The less raw FPS you have, the worse the fake frame generation performs.

  • @anilarose8949  1 year ago +1

    So may I ask: I currently have an RTX 2060 OC 6GB. Should I wait for the RTX 4080 (12GB, actually an RTX 4070), or just get an RTX 3070 Ti for 1080p? Because Nvidia's strategy is not like any human's in the world :) I need to play at 1080p for 5-6 years.

    • @ErockOnTech  1 year ago

      If you’re doing 1080p, then get the 3070ti or even a 3060ti and save the money. Those cards destroy at 1080p. :-)

  • @kasimirdenhertog3516  1 year ago +17

    The 22 fps is because they used a special new ray-tracing setting in Cyberpunk called 'overdrive'. This increases the ray-tracing fidelity to the max.
    The point of showing this, and the reason why DLSS 3.0 was showcased to such an extent, was explained in the presentation, where they stated that NVIDIA can't battle the ever-increasing demands of gaming graphics with extra cores only; smarter solutions are necessary. Therefore: DLSS 3.0.
    The problem with showing off a new generation of video cards is you want to show they can tackle future games. But future games don't exist yet. So guesses are made about where future games are going. NVIDIA thinks this will be ray-tracing galore, and it comes up with a 'possible future game' scenario that fits this.
    That's all very nice, but you've got to believe what they're telling you. First, that the future is indeed going to be ray-tracing galore, and second, that DLSS 3.0 is brilliant and only possible on these new cards. And to be honest, I'm not convinced by any of that. What bothers me the most is they won't enable DLSS 3.0 on older cards, so they essentially deprive us of the opportunity to verify their claims. That in itself makes you doubt everything that's been said. I hope AMD's presentation will be less disappointing.

    • @Haganu  1 year ago

      DLSS 3.0 is NOT a smarter solution, though. This tells me that NVIDIA completely missed the mark, because the reason people want higher framerates is that input feels more responsive. That's the whole reason gamers push for framerates beyond what their monitor's refresh rate supports. DLSS 3.0 increases the framerate while making the input feel *less* responsive. That's a big mistake.
      Besides, this started *because* NVIDIA is pushing a very inefficient rendering method onto gamers. Rendering based on rasterization is fine: it saves everyone a lot of performance but still gives a sufficient level of graphical fidelity. Ray tracing is a parody of its time, in the sense that NVIDIA has to make giant leaps in power consumption for only marginal improvements on the graphics side. And now they have to compensate by giving us *fake frames in which we have no control over what happens in the game*.
      The chief technical engineer at NVIDIA who created the principles that DLSS 3.0 uses stated that older cards simply cannot cope with DLSS 3.0, because of the fake frames being created in between. The input lag would simply be too much. That is, if the cards aren't struggling to create these fake frames to begin with.
      DLSS 3.0 also isn't really benched yet, from what I have seen. I've seen plenty of cases where using it crashed your computer.
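
A back-of-envelope model of the latency argument above, as a sketch only: the one-native-frame hold-back and the 3 ms generation cost are assumptions for illustration, not NVIDIA's published pipeline numbers.

```python
# Interpolation needs the *next* rendered frame before the generated frame can
# be shown, so a rendered frame is held back roughly one native frame time,
# plus whatever the AI frame costs to build.
def frame_gen_model(native_fps: float, gen_cost_ms: float = 3.0) -> tuple[float, float]:
    native_ms = 1000.0 / native_fps
    displayed_fps = 2 * native_fps               # one AI frame per real frame
    added_latency_ms = native_ms + gen_cost_ms   # hold-back + generation cost
    return displayed_fps, added_latency_ms

for fps in (30, 60, 120):
    shown, extra = frame_gen_model(fps)
    print(f"{fps} fps native -> ~{shown:.0f} fps shown, roughly +{extra:.1f} ms latency")
```

The fps counter goes up while responsiveness gets slightly worse, and the penalty is largest exactly where the boost looks most impressive (low native fps), which is the commenter's point.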

  • @ddnguyen278  1 year ago +3

    Suspicious that Nvidia, who makes DLSS, doesn't specify which quality mode they are using in their benchmarks...

  • @YuniorGamboa  1 year ago +13

    I can't wait to see what AMD can bring to the table. This is their chance to dominate if they play it smart

    • @chiyolate  1 year ago

      It would be funny though if they have something similar...

    • @toxicavenger6172  1 year ago +1

      I don't think AMD will get more than 20% market share even if their cards are phenomenal in price-to-performance. There are way more Nvidia fans who will buy this regardless of whether the raster performance sucks.

  • @michaelfalabella6296  1 year ago +1

    Thank you!!! I've been asking this since I saw the 22fps when it was live!

    • @ErockOnTech  1 year ago +1

      Yeah more people should be focused on this. This is very telling. It’s all the proof we need that these cards aren’t as much of an improvement as we thought.

  • @02stampede  1 year ago +3

    People who recently bought 30 series cards at, or below, MSRP are feeling vindicated right about now.

  • @johnwasson8604  1 year ago +14

    I have a feeling the 40 series is really just the 30 series pushed as far as possible, with a software upgrade.

    • @rijjhb9467  1 year ago +3

      They're basically overclocked 30XX cards. They pushed frequency and power limits to the max. That's why the 4090 is so thick; it needs a humongous heatsink in order not to melt.

    • @Nerdywr  1 year ago +2

      It's just a rebranded 30 series on a better process node. TSMC 4N is definitely much more efficient than the previous Samsung 8nm, and I can bet Nvidia didn't do shit and just ported the existing Ampere arch to the new node.

    • @animalyze7120  1 year ago

      Easy to check by removing the cooler and looking at the chip and chipset of the card, though they may have renamed everything to cover their butts, who knows. A good look at the layout would be a dead giveaway, as no two generations look identical; the layout and traces have to change as well.

    • @kathrynck  1 year ago

      Nvidia has a habit of only pushing the performance envelope every other generation, and in the weak generations just tacking on gimmicks (which are sometimes deemed useful).

  • @ItzContrast  1 year ago +1

    Yeah, I think when I finally upgrade my PC I'm going to stick with the 30 series. I just can't really get behind those prices, especially since I still feel like the current hardware we've got is being underutilized.

    • @ErockOnTech  1 year ago

      Thanks bro, for checking out the video! I agree, it does seem like the 30 series cards aren’t even that old yet. I am excited for the new AMD cards being announced later this year, though.

  • @TheRealFloc  1 year ago +12

    Can't wait to hear Erock drop some heat!

    • @ErockOnTech  1 year ago +4

      Haha thanks man! It’s coming!

    • @x8jason8x  1 year ago +3

      @@ErockOnTech Heat you say? Like enough heat to melt ATX 12v pins? 😂

    • @ErockOnTech  1 year ago +2

      @@x8jason8x lol 😂 that’s funny!

  • @ufsteropolstero6014  1 year ago +10

    The thing about DLSS is that it's Nvidia's prerogative to implement it, whereas rasterization simply requires a compatible driver for the API. So if I buy a GPU that is more powerful at rasterization, it'll theoretically be faster for all games, past and future.

    • @ErockOnTech  1 year ago +2

      Exactly! Just another reason why we should focus on raw rasterization more and DLSS less.

  • @x8jason8x  1 year ago +8

    Eyy, finally got a notification on time. You know what the scummiest thing of all in this lineup is? They're counting on people not knowing which 4080 is which. They use different dies, and the 4080 12GB uses a smaller bus than any version of a 3080 or 3080 Ti (the 3080 carries a 320-bit bus, the 4080 12GB a 256-bit bus). I'd say this is at least a purposeful attempt to bamboozle consumers, if not outright bait and switch.

    • @ErockOnTech  1 year ago +5

      Yeah this is definitely unfair. The 4080 12 GB is 100% a 4070. But people would freak out if a 70 series card cost $900. Nvidia is being greedy.

    • @HeavyGun1450  1 year ago +5

      Actually it's a 192-bit bus, which is even worse.

    • @x8jason8x  1 year ago +3

      @@HeavyGun1450 Ooof, hell, yeah you're right. BOTH "4080"s have a smaller bus than the 3080 10GB.
      I just don't have the words to describe how scumbag this is.

    • @renaudnoyelle6309  1 year ago +2

      @@HeavyGun1450 Indeed! This 4080 12 GB is more a 4060 Ti than a 4070. NVIDIA are masters of illusion, because they will sell more of the fake 4K gaming card (4080 12 GB) than the true 4080 (16 GB).

  • @AI23973  1 year ago

    May I ask, what is the performance of the 6950 XT with RT on? It needs FSR to reach 51, and NVIDIA's 3090 Ti with DLSS 2 reaches 59. So 22 frames at RT ultra quality without any DLSS is not so bad.

    • @ErockOnTech  1 year ago

      I see the point you're trying to make here. However, I love AMD cards and I think they are underrated these days. With that being said, they are way behind Nvidia in terms of RT, so it's not really an apples-to-apples comparison. My ultimate point here is this is supposed to be the next generation. This is supposed to be the latest and greatest. Also, the 4090 is $1600 USD. So if a 3090 Ti isn't so far off, that makes the 4090 look even worse. I'm not paying $1600 USD for a software upgrade.

    • @AI23973  1 year ago

      All I am saying is you are talking about rasterized performance while you bring RT-on numbers to justify your point of view.

    • @ErockOnTech  1 year ago

      @@AI23973 I’m saying the only thing that Nvidia wants to talk about is RT and DLSS. Where are the raw rasterization numbers? When they did show RT with no DLSS, the 4090 looked pathetic.

  • @SJkopa  1 year ago +2

    The 4000 cards must be passed on, unconditionally.

  • @tsparc2  1 year ago +4

    If you buy a 4090 and have to use DLSS, something seems wrong...

    • @SSoul0  1 year ago

      I don't personally see anything wrong. I drive a 5120x1440 monitor. When I want pure immersion in a game, I go widescreen with DLSS. It's fine to want pure rasterization, but you're going to get to a point where, if you want more performance, you either crank up the power for it or you start playing smart. IMO DLSS is playing smart instead of brute-forcing.

    • @SSoul0  1 year ago

      @n n But the fact that the 3090 CAN actually do 8K, even if it's at 30 FPS, is a testament to the advancement of GPUs. 4K was the pinnacle and now 8K is actually achievable. Stop pretending that it's impossible.

    • @SSoul0  1 year ago

      @n n Your comment was that 8K isn't doable, which is obviously not true, as it's doable even at 30fps and even at 300 watts. When you find a GPU that can push 8K and NOT be a power draw, please let me know.

    • @SSoul0  1 year ago

      @n n Well, when we're there and you're pulling 900 watts to push 8K 60fps, maybe you'll have another reason to complain then too.

  • @Fullmetal1291985  1 year ago +3

    Nvidia: we got 100 fps in Cyberpunk straight away (although 20 or 30 of those frames are fluff) 🤣

  • @toxicavenger6172  1 year ago +1

    I'd like to know if these fake frames change hitboxes in games with DLSS 3. For example, you shoot at someone using DLSS 3 tech but what you're seeing isn't their actual position due to DLSS sending incorrect data back to the server that refers to their position.

  • @davehill9951  1 year ago

    If you want one of these cards, you will definitely have to buy a new power supply to accommodate the power usage and the wiring harnesses supplied.

  • @heliumfreak5364  1 year ago +4

    Just waiting to see independent reviews of both Amd and Nvidia before I make any decisions 👍

    • @ghost500e  1 year ago

      Bought the EVGA 3090 FTW3 Ultra at $990, which I think was fairly reasonable... and it will surely last for a while.

  • @awsm253  1 year ago +3

    These are a lot of amazingly good points, and I hadn't realized most of this before... Their entire graph setup was just "these games ran with way higher frames when run under cherry-picked settings for the new cards."
    There was no "this GPU will run Ultra in these games at 1080p and 144fps while barely making a sound" or "enjoy a smooth 90fps in 4K Ultra with no fancy cooling needed", just a lot of "they did a lot better with upscaled resolutions and false frames".
    AMD has the floor at this point... And I am wondering if they're waiting for a full reveal just so they can watch Nvidia stumble first.

    • @ctsd623  1 year ago

      It's poor strategy to show your competition the cards you're holding well in advance of having product on the shelves.

    • @awsm253  1 year ago +2

      @@ctsd623 It's good strategy to show consumers a better/cheaper product so they hold off.

  • @Andreadeluxe  1 year ago +1

    When you said "where are...?", I immediately subscribed to your channel.

  • @nicosuave3249  1 year ago

    Fire video!!! 🔥🔥🔥 Won't be buying one of these bad boys soon, but maybe in the near future for my ultimate build! Thanks Erock! A++++!

  • @Kirito14682  1 year ago +5

    DLSS 3 will only work on the 4000 series, just like how LHR was unlockable.

    • @ErockOnTech  1 year ago +3

      Lol 😂 I'm open to a mod/hack for it. I hope the 30 series gets it. But that's detrimental to Nvidia for sure.

  • @fpshooterful  1 year ago +5

    I hope Americans realize that in countries from Canada to the UK, to all of Europe, and of course worse in places like South Africa and Australia, a GPU like the 4090 will be about $2300 at the cheapest.

    • @DjangoMMA  1 year ago

      At that point just fly to America to buy one

    • @SSoul0  1 year ago

      Why should Americans care about the pricing of luxury items OUTSIDE of their home? It's not exactly my or anyone else's fault that the prices of EXPENSIVE LUXURY ITEMS during inflation are high. If the prices are an issue, wouldn't it be best not to spend money you don't have? No point in going into debt over a LUXURY ITEM to play video games.

    • @fpshooterful  1 year ago +2

      @@SSoul0 I am not sure what your point is, and I am not sure where I indicated I was interested in getting the 40 series. I was just giving perspective on what the rest of the world is facing when it comes to 40 series pricing compared to the United States. Yes, I am very clear on how debt works. That's why I am mentioning the high prices.

  • @leihejun844  1 year ago

    9:28 how did you confirm? Some say the 22fps is a 3090/Ti

  • @IncessantWake  1 year ago +1

    Dude, your channel is the best. Thank you for the information. If for whatever reason I choose to upgrade my 3070, it'll be with AMD.

  • @icantlivewithoutwifi  1 year ago +4

    As soon as Jensen mentioned AI can "predict" the frames, I thought this was wrong. No one can predict first-person shooter movement.

    • @x8jason8x  1 year ago

      They can't predict player movement at all. FPS games will suffer worst though; everything is gonna look like you're playing on a VA panel. Smearing everywhere.

    • @leihejun844  1 year ago +1

      If you punch me in the face, I can't predict it. How can AI?

    • @kingyogesh441  1 year ago

      It's not actually "predicting" a future frame; it's inserting a new AI-generated frame using information from the previous and latest rendered frames. At worst it's adding slight latency.

    • @x8jason8x  1 year ago

      @@kingyogesh441 Jesus, the nVidia withdrawals are strong in you. In this kool-aid lies disappointment, but you do you.

    • @toxicavenger6172  1 year ago

      @@kingyogesh441 Yes, it is taking frame data from the previous frame and predicting what the next frame will be using AI. It then shoehorns that predicted frame between the previous frame and the real newest frame to make it appear like a new frame was created. Ghosting/artifacting is going to be a major issue with this tech.
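
The distinction this subthread is circling (interpolation between two already-rendered frames rather than prediction of an unseen future frame) can be shown with a tiny ordering sketch:

```python
# A generated frame G(a, b) can only be built once both real neighbours exist,
# so nothing about the future is "predicted"; the real frames are just delayed.
rendered = ["R0", "R1", "R2", "R3"]

display_queue = []
for prev, nxt in zip(rendered, rendered[1:]):
    display_queue.append(prev)                # real frame
    display_queue.append(f"G({prev},{nxt})")  # AI frame sandwiched between the two
display_queue.append(rendered[-1])

print(" -> ".join(display_queue))
# R0 -> G(R0,R1) -> R1 -> G(R1,R2) -> R2 -> G(R2,R3) -> R3
```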

  • @diegosilang4823  1 year ago +3

    Wondering why DLSS 3.0 is exclusive to the 40 series? Maybe to artificially inflate the 40 series' improvement over the 30 series; the real-life FPS gain is probably about 10-15 FPS for a $1000 MSRP. And you need a new PC build to accommodate the 40 series.

    • @ErockOnTech  1 year ago +3

      They say it's because DLSS 3.0 uses the new type of cores inside the 40 series. The vice president also said that in theory they could get it to work with 20 and 30 series cards, but the performance uplift would not be the same. But I'm also inclined to agree with you and say they are doing this to give some type of value to the 40 series cards. Because if the 30 series cards had DLSS 3.0, there would be virtually no reason to buy these cards.

    • @gamingreflections6476  1 year ago +1

      @@ErockOnTech DLSS 3.0 on the 30 series would help the AIBs clear their stock 🤔

    • @ErockOnTech  1 year ago +1

      @@gamingreflections6476 Good point. 🤔

  • @Enakaji  1 year ago

    Regarding your comment at 5:47, they did specify the preset. If you check the fine print again, Nvidia used DLSS Performance, so they upscaled from 1080p to 4K.
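
For reference, the per-axis render-scale factors commonly documented for the DLSS quality modes make the internal resolution easy to recover; Performance mode is 0.5 per axis, which is exactly the 1080p-to-4K case described above (the helper below is illustrative only):

```python
# Per-axis render scale for each DLSS upscaling mode, as commonly documented.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
```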

  • @awsm253  1 year ago

    To be fair, the Xbox One S had native 4K, and it was beautiful. I took advantage of it in a few titles like RDR2 and Resident Evil 2 Remastered.
    The PS4/PS4 Pro, on the other hand, at most got an upscaled resolution from 2040p or something.

  • @angxlzj  1 year ago +7

    Happy the community is angry and telling Nvidia "this isn't acceptable", so they'll be forced to make a great step forward with the 50 series, so we can justify the prices or have them turned down. Ngl, I still believe the future of gaming will be streaming video games from a virtual machine, because at some point, evolving the technology is gonna cost so much money that having your own hardware will be something luxurious. Remember the first laptops, computers, TVs... At some point we won't be able to evolve easily. Down to hear your thoughts...

    • @ErockOnTech  1 year ago +3

      I agree about the community. I’m happy to see the community isn’t just accepting this as mindless zombies. This is truly great!
      As for the streaming, I see the potential. The issue is the tech is still a ways off. Anytime you want someone to adopt a new way of life, you have to convince them that the new way is superior. Right now if you stream a game, you can easily see the issues with it. So until they can make it feel native, physical equipment will still be around. But what you propose is totally possible and likely probable at some point in the future.

    • @x8jason8x  1 year ago +1

      If hardware evolution trended like TVs, we'd all be playing on graphene-based chips and rocking 8K 144Hz. The key point here being that I can get a 55" LG OLED Evo for less than the 4090, with enough change left over to buy a couple of new games, and dinner.

    • @thefilmdirector1  1 year ago

      PC gamers DO NOT WANT streaming; we like the hardware, the building, the tinkering. Console being replaced by streaming? Of course. PC? No way in fucking hell.

  • @forgot46  1 year ago +3

    Exactly what I was thinking about raw performance. DLSS should be a feature to improve performance on older Nvidia cards, let's say Ampere cards, but it should not be the sole selling point. Maybe Ampere card owners will sacrifice some visual quality for the sake of a higher framerate, but someone who is buying a 4090 for $1600 should get real 4K with a framerate uplift over the previous gen with RT on. This is bad news, and I see no significant value for gamers in upgrading from a 3090 to a 4090, at least at this ridiculous price.

    • @jinx20001  1 year ago

      The thing is, when has raw performance ever been an issue in the last 5 years? All everyone gets fixated on is frame rate, as if we have not had more than good enough frame rates at every resolution for years. You still play at 1080p? Well, there is a stack of cards that can do max settings at 1080p at stupidly high frame rates; same for 1440p, and 4K was basically job done with the 30 series. Heck, a 1080 Ti will do the job at 4K in most games if you don't insist on ultra settings and are happy with high or even medium.
      I for one am glad that Nvidia is at least focusing on advancing the visuals. Games have looked great using rasterization for the last 10 years now, so at what point do we want them to stop focusing on making rasterization faster and faster and start actually pushing the visuals on? RT, as we should all know by now, is that next step, and DLSS makes it playable. Nvidia knows it, AMD knows it and Intel knows it, and all have created their own technique.
      So many people want games to look like Toy Story but run like it's CS:GO, and yet here we are still trying to get close to what movies could do 25 years ago in real time. It's that difficult to do, and we still cry about framerates.

    • @cdiff07  1 year ago

      The RTX 4090 is an 80%-100% uplift compared to the 3090 in pure rasterization! That is huge. Also, it's a 60%-70% uplift compared to a 3090 Ti. The price increase is only 6% from the previous 3090 MSRP! That is really good! Whatever you're watching, listening to or SMOKING, please my friend, stop now, good sir!

    • @forgot46  1 year ago

      @@jinx20001 I never said ray tracing is not worth it; it's actually awesome. But why on earth do you have to rely on DLSS upscaling tech to achieve 60+ fps at 4K? And if RT is not possible at 4K 120 fps, why would anyone want to pay $1600 for a card that can't do 4K with your new RT tech? Or else just do fair pricing. Locking DLSS 3 to only the 4000 cards to make them look good compared to Ampere is also suspicious.

    • @forgot46  1 year ago +2

      @@cdiff07 Those uplifts are without ray tracing, my friend. Once more I am saying, which you didn't take note of earlier: can an RTX 4090 do NATIVE 4K with RT on at 60+ fps? Without ray tracing even AMD can do native 4K with those uplifts you are talking about. 😂😀

    • @cdiff07  1 year ago

      @@forgot46 What I read and replied to from what you said was referring to real 4K with a framerate increase. Not to mention other people saying that the 4090 is going to be barely higher than a 3090 Ti in pure rasterization. All the outrage is OVER rasterization, which is not RT ON. Like I said, the RTX 4090 is the only card to buy; the 4080 16GB and 12GB models are way overpriced for what you get!

  • @AzSureno  1 year ago

    I wanted to be excited for the new cards, but seeing this info about Nvidia's new pricing along with the 4080s kinda discourages me. I want to see what AMD is going to release; hopefully it's something good. I own both AMD and Nvidia cards.

  • @Akab  1 year ago +1

    I remember back in 2014 the Titan X (basically the most expensive GPU you could buy as a consumer, and not many bought it anyway because of its ridiculous price back then) was 1k, and $600 GPUs were considered high end...
    Now we have this BS...

  • @m_sedziwoj  1 year ago +4

    7:40 Look at Microsoft Flight Simulator; it is all you need to know that they use DLSS 3 for these benchmarks. This game is CPU-bound, so they get more performance compared to the 3000 series thanks to fake frames, but between the 4000 series cards there is not much difference, because they are not actually faster; the CPU is still holding them back. So they are comparing apples to oranges.
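
A sketch of why a CPU-bound benchmark flatters frame generation; the fps numbers below are hypothetical, only the structure matters.

```python
# The native render rate is set by whichever side is slower; frame generation
# then roughly doubles whatever that rate is.
def displayed_fps(gpu_fps: float, cpu_fps: float, frame_gen: bool) -> float:
    native = min(gpu_fps, cpu_fps)
    return 2 * native if frame_gen else native

# Hypothetical CPU-bound title, capped around 100 fps by the CPU:
print(displayed_fps(gpu_fps=130, cpu_fps=100, frame_gen=False))  # 100 (previous gen, no DLSS 3)
print(displayed_fps(gpu_fps=200, cpu_fps=100, frame_gen=True))   # 200 (40 series with DLSS 3)
# The chart shows "2x", but the doubling comes from generated frames, and all
# 40 series cards bunch together because the CPU still sets the native rate.
```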

  • @blakedmc1989RaveHD  1 year ago +3

    It'd be funny if Crysis 4 released right now and the RTX 4000 cards couldn't run it 😂

  • @shattia  1 year ago +1

    Good stuff, subbed 👏

  • @VinhNguyen-pf5rs  1 year ago +1

    They hid the pan and an egg. After someone buys it, Jensen comes out and says, "You can fry your egg now."

  • @unstanic
    @unstanic Pƙed rokem +4

    After countless videos, you're the first one to actually touch on these very important topics. Thank you. Waiting on AMD's announcement; otherwise I'll be looking for a 3000 series at a reduced price.

  • @ShadowFox10587O
    @ShadowFox10587O Pƙed rokem +3

    I bought a 3080 a few months ago. I'm glad I didn't wait; I'll be skipping this gen.

  • @ViperVenoM13
    @ViperVenoM13 Pƙed rokem

    It reminds me of that software that boosted performance by using the CPU's integrated graphics in conjunction with the discrete graphics card; it turned out it was only adding blank frames to get higher fps. DLSS 3 is not the same, but it looks like instead of a blank frame it's an interpolated frame. It sounds sketchy anyway... I saw a benchmark of a game with DLSS 3 hitting >400 fps but with 9 ms latency, which doesn't make sense, since at 400 fps a frame should only take 2.5 ms.
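
The latency mismatch in that benchmark can be sketched with simple arithmetic. This is a rough illustrative model (assumed numbers, not measurements): with frame generation, only the rendered frames sample your input, and the interpolator has to hold one rendered frame back so it can blend between two of them.

```python
# Why ">400 fps" and "9 ms" can coexist under frame generation.
displayed_fps = 400
naive_frame_time_ms = 1000 / displayed_fps   # 2.5 ms -- the commenter's math

rendered_fps = displayed_fps / 2             # half the frames are generated
render_interval_ms = 1000 / rendered_fps     # 5.0 ms between real frames

# Holding a rendered frame for interpolation adds roughly one more
# rendered-frame interval on top of the render time itself.
approx_latency_ms = render_interval_ms + render_interval_ms
print(naive_frame_time_ms, approx_latency_ms)  # 2.5 vs ~10 ms
# ~10 ms is in the ballpark of the 9 ms measured: latency tracks the
# rendered rate (200 fps), not the displayed rate (400 fps).
```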

  • @VonSpud
    @VonSpud Pƙed rokem +2

    If DLSS 3 is required to achieve good frame rates with RT turned on, then I'll wait for something better than the RTX 4090.
    I bought a 1600W EVGA Supernova just for the RTX 4090.
    Now I'm hearing about burnt adapter cables on some power supplies... and that the 4090 will require an ATX 3.0 power supply 😳
    And that DLSS 3 has to be turned on if you want those great frame rates...
    I'm currently running an RTX 3070 on 3 x 28" 4K screens (i7-6950X o/c'd to 4.2 GHz, 64GB DDR4-3200, 970 EVO Plus NVMe 500GB plus 1TB for games).
    I'm expecting the RTX 4090 to run at least 60 fps with all settings maxed... without DLSS 3 and with RT turned on... for those games that support RT.

  • @saintallnights7239
    @saintallnights7239 Pƙed rokem +3

    Said all this and more.
    And everyone now needs a full tower case again for this new graphics card... and also a mid tower for everything else, with a long cable. Lol.
    I call DLSS 3 the Psychic Algorithm.
    When GN and HU get hold of these, it's going to be a bloodbath of epic proportions.

    • @ErockOnTech
      @ErockOnTech  Pƙed rokem +4

      Oh I can’t wait for their videos. It’s gonna be amazing!

    • @saintallnights7239
      @saintallnights7239 Pƙed rokem +2

      @@ErockOnTech Every now and then (I've done this a long time) you get those moments.
      This very much feels like several of them rolled into one between now and January.
      I just said to someone on another channel: AMD have had their moments and their misses, and this is a moment they really don't want to miss.
      It stands out like a cherry on a Bakewell tart, or at least it will over the coming weeks, and it should be pretty hard to miss.
      All the tech reviewers are chattering about the backlash and hate; it should stand out as a golden opportunity.
      I for one didn't see this elephant coming.

    • @x8jason8x
      @x8jason8x Pƙed rokem +2

      @@saintallnights7239 The 2022 disappointment tour T-shirts are gonna be great, that's for sure.

    • @saintallnights7239
      @saintallnights7239 Pƙed rokem +2

      @@x8jason8x lol

  • @21Komtur
    @21Komtur Pƙed rokem

    Let us wait for the test results :)

  • @gambaownsus
    @gambaownsus Pƙed rokem +1

    I like your style of making vids brah. subbed

    • @ErockOnTech
      @ErockOnTech  Pƙed rokem

      Thank you very much! I appreciate that!

  • @dr.ticklebum2385
    @dr.ticklebum2385 Pƙed rokem +3

    As someone who is transitioning from console (Series X) to PC and Steam Deck, I can't help but wonder if I made the right choice when I see moves like this, and EVGA no longer dealing with Nvidia. Just a lot of red flags all around.

    • @Ariautoace
      @Ariautoace Pƙed rokem

      You're fine. On PC, get Game Pass. Then get a PS5 and a Switch. The Xbox is basically a PC, games-wise.

  • @Th3Fly1ngCow
    @Th3Fly1ngCow Pƙed rokem +6

    Lisa Su will send Nvidia back to the Shadow Realm

    • @ErockOnTech
      @ErockOnTech  Pƙed rokem +3

      She will eliminate Jensen’s life points with her trap card.

  • @wmnsriteslol
    @wmnsriteslol Pƙed rokem +1

    You nailed it bro. Earned my sub today

    • @ErockOnTech
      @ErockOnTech  Pƙed rokem

      Thank you very much! I appreciate it!

  • @jimmyjiang3413
    @jimmyjiang3413 Pƙed rokem

    Hope FSR 3 adds optical flow acceleration on top of the FSR 2 feature set, but still without ANY machine-learning hardware. Improving shader FP16/INT8 throughput may matter more for raw upscaling and downscaling performance; I'd like to see FP16 on the shader cores reach up to 8x FP32 FLOPS. Maybe DirectML would help FSR even further.
    Also, remember that ray tracing matters most for lighting, reflections, shadows, and metallic surfaces; rasterization is still used for most textures.
    Improving the ray accelerator cores should let much of the BVH work be offloaded to them.

    • @ErockOnTech
      @ErockOnTech  Pƙed rokem

      I haven't heard much about FSR 3 so far.
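
For the FP16 wish in the comment above, the throughput arithmetic is easy to sketch. All the numbers here are hypothetical (a generic shader configuration), and the ops-per-clock figure assumes one fused multiply-add counted as two FLOPs.

```python
# Theoretical shader throughput: cores x clock x ops-per-clock x FP16 rate.
def tflops(cores: int, clock_ghz: float, fp16_rate: float = 1.0) -> float:
    ops_per_clock = 2  # one fused multiply-add = 2 FLOPs
    return cores * clock_ghz * ops_per_clock * fp16_rate / 1000

cores, clock_ghz = 5120, 2.3  # hypothetical RDNA-class configuration
print(f"FP32:     {tflops(cores, clock_ghz):.1f} TFLOPS")     # ~23.6
print(f"FP16 @2x: {tflops(cores, clock_ghz, 2):.1f} TFLOPS")  # packed math today
print(f"FP16 @8x: {tflops(cores, clock_ghz, 8):.1f} TFLOPS")  # the comment's wish
# An 8x FP16 rate would give a shader-based upscaler several times more
# arithmetic headroom per frame without any dedicated ML hardware.
```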

  • @rmoore6471
    @rmoore6471 Pƙed rokem +5

    This is a very good analysis. Nvidia is up to its old tricks again. They went from raw power to fake speed; a marketing ploy to harvest sales from consumers. AMD and Intel need to capitalize.

  • @kenshinhimura9387
    @kenshinhimura9387 Pƙed rokem +3

    DLSS is GARBAGE. It is a crap AI program that makes your games look blurry. I would NEVER use it. For those of us with good vision, DLSS is HORRIBLE. It will make your game blurry and laggy.

  • @Lyndonberg_Gaming
    @Lyndonberg_Gaming Pƙed rokem +1

    Just bought an MSI 3080 12GB Suprim. Looking at the stats of the 4080 12GB, I'm only losing out on the next-gen CUDA core speed; my card clocks 500-600 MHz slower but has about 1,000 more cores.

    • @ErockOnTech
      @ErockOnTech  Pƙed rokem

      I have a 3080 10GB. I’m perfectly happy with it!

  • @blakkheartt12
    @blakkheartt12 Pƙed rokem

    Good job and great info. Has some good points to consider.

  • @JoeWayne84
    @JoeWayne84 Pƙed rokem +3

    RTX 4090 gonna be a beast!!!! Can't wait to get one

  • @redking8585
    @redking8585 Pƙed rokem +1

    It amazes me that people still don't understand the concept of native resolution; this conversation has been around since the Xbox 360/PS3 started upscaling games.

  • @redinthesky1
    @redinthesky1 Pƙed rokem +1

    The amount of Lovelace silicon in the 4080 16GB and 12GB almost makes me think they're holding back for next gen. There's enough of a core gap between the 4080 16GB and the 4090 for three more cards; the 16GB card itself has about 60% of the cores and resources, leaving room for cards with 70%, 80%, and 90% of full Lovelace. It's disingenuous. The base 3080 had 8,704 cores vs 10,496 for the 3090; we are nowhere near that ratio here.
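
The ratios that comment is pointing at, computed from the published CUDA core counts (Ada figures per Nvidia's announcement, Ampere per its launch specs):

```python
# How much of the flagship's core count the x080 card gets, per generation.
ada = {"RTX 4090": 16384, "RTX 4080 16GB": 9728, "RTX 4080 12GB": 7680}
ampere = {"RTX 3090": 10496, "RTX 3080": 8704}

print(f"4080 16GB / 4090: {ada['RTX 4080 16GB'] / ada['RTX 4090']:.0%}")   # 59%
print(f"4080 12GB / 4090: {ada['RTX 4080 12GB'] / ada['RTX 4090']:.0%}")   # 47%
print(f"3080 / 3090:      {ampere['RTX 3080'] / ampere['RTX 3090']:.0%}")  # 83%
# Last generation the x080 card carried ~83% of the flagship's cores; this
# generation the 16GB "4080" carries ~59%, leaving the wide gap the comment
# says could fit several more SKUs.
```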

  • @frankmjr6571
    @frankmjr6571 Pƙed rokem +1

    8:43 8:50 Exactly. I wanted to get a new GPU, but now I think I'll wait for real benchmarks. I don't care about DLSS on or off; I want raw performance specs.

    • @ErockOnTech
      @ErockOnTech  Pƙed rokem

      Yes exactly! I’m excited for AMD RDNA 3!

  • @julesroberts962
    @julesroberts962 Pƙed rokem

    You make an excellent point that I hadn't noticed before about the DLSS off/RT on charts - I was shocked. Hell, I'm playing CP77 right now and it looks incredible - I'm getting more FPS at 4K with RT on and DLSS off on my 2070 SUPER. My old 970 could compete!

    • @Alexandru1996_
      @Alexandru1996_ Pƙed rokem

      Yeah, but maybe they had all settings at ultra and ray tracing on Psycho or something... which is not an excuse, but it could mean that performance in general was left at the same level while they improved perf per watt compared to the 30 series...
      Which would make sense considering they went from an 8nm node to 4nm. Power efficiency :)

    • @kathrynck
      @kathrynck Pƙed rokem

      They did have everything at ultra, with the ray tracing 'Overdrive' setting in Cyberpunk.
      Still, from the benchmarks on the three games which don't have DLSS 3, the 4070 (I'm not gonna call a 104 chip a 4080) was performing about on par with a 3090. And at $900 it's about the same price as a 3090 is right now. That would make the 4080 (the real one, 16GB) a little faster than a 3090 Ti (at a slightly higher asking price), with the 4090 as the fastest Nvidia card at $1,600. There's no generational cost:performance gain, which makes the 4000 series quite a dud. If past generations had charged for every perf gain, cards would cost $100,000 now.
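
That "$100,000 card" line is a compounding argument, and it's easy to make concrete. A toy model with made-up but deliberately conservative inputs:

```python
# If every generational performance gain had been billed at full price:
base_price = 300     # hypothetical mid-range card price, late 90s
gain_per_gen = 0.60  # assumed ~60% performance uplift per generation
generations = 10     # rough count of GPU generations since then

price = base_price * (1 + gain_per_gen) ** generations
print(f"${price:,.0f}")  # ~$33,000
# Even these conservative inputs explode into five figures. Historically,
# generational performance arrived at roughly flat prices instead -- which
# is exactly the bargain the comment says the 4000-series pricing breaks.
```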

  • @DarkionAvey
    @DarkionAvey Pƙed rokem +1

    Is AMD releasing or just announcing RDNA 3 on Nov 3rd? Products usually ship about a month after the announcement.

    • @ErockOnTech
      @ErockOnTech  Pƙed rokem

      To the best of my knowledge, they are releasing on November 3rd.

  • @i3l4ckskillzz79
    @i3l4ckskillzz79 Pƙed rokem +1

    Correct me if I'm wrong, but I think the 22 fps in Cyberpunk was the "RT Overdrive" mode.

    • @ErockOnTech
      @ErockOnTech  Pƙed rokem +1

      You’re correct. But 22 fps looks bad no matter how you slice it, in my opinion. Also, why make a new, more demanding RT mode? We never mastered the last one.

  • @Asterra2
    @Asterra2 Pƙed rokem +1

    After being forced to use DLSS in Cyberpunk, I will never use it again. Yes, it's good at faking the look of the resolution you've designated, but only if you're the sort of person who is blind to the artifacting. I am the opposite of that sort of person. 1: Any motion with more than a single vector, such as riding an elevator (static) while the outside scenery scrolls past (moving), causes a smear of artifacts, like wiping oil across your screen; the same happens when your first-person arms move too fast in front of your face. 2: Billboards and other "pure texture" elements: these are 1440p (in "Quality" mode) or 1080p ("Performance"), *period,* not 4K, because they are flat elements instead of 3D meshes, and DLSS has nothing to "upscale".
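
The internal resolutions behind that 1440p/1080p claim follow directly from the documented DLSS 2 preset scale factors (per-axis render scale before reconstruction):

```python
# Internal render resolution per DLSS preset at a 4K output target.
presets = {"Quality": 2 / 3, "Balanced": 0.58,
           "Performance": 0.5, "Ultra Performance": 1 / 3}
target_w, target_h = 3840, 2160

for name, scale in presets.items():
    print(f"{name:>17}: {round(target_w * scale)}x{round(target_h * scale)}")
# Quality -> 2560x1440, Performance -> 1920x1080: exactly the resolutions
# the comment says flat billboards and UI elements effectively stay at,
# since there is no sub-pixel motion on them for the upscaler to accumulate.
```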

  • @andyvanderbeken3729
    @andyvanderbeken3729 Pƙed rokem

    Interesting and solid-sounding analysis. GJ looking through the smoke and mirrors.

  • @sevantofra
    @sevantofra Pƙed rokem

    I was saving for a 30 series, but I heard AMD is slashing prices on current stock to get ready for the new cards. Now I'm saving for an AMD.

  • @fatihyener7589
    @fatihyener7589 Pƙed rokem +2

    Judging by the second chart they released, the AC Valhalla numbers seem to be the only ones to set our expectations by. From what I see, they just added more cores, clocked them higher, and raised the power limit, and the 4090 gets about 50% more performance than the 3090 Ti when DLSS and RT are out of the equation. The extra core count also doesn't scale all that well: the 4090 has nearly double the CUDA cores of the 3080 12G, and roughly double everything else, yet the performance gain seems to be about 60%.
    That aside, 2600 MHz is about 45% faster than 1800 MHz, and 10,752 CUDA cores is 40% more than 7,680. Yet across the board the 3090 Ti is faster, so the 4000 series' CUDA cores are individually slower than the 3000 series'. So the sweet spot in core count and power scaling is probably the 16GB version of the 4080. And the 4090 is probably doomed to fail at random because of its super high power budget; that card will be notorious for solder balls melting and shorting a power stage from the extreme heat, possibly even burning a hole in the PCB.

    • @ErockOnTech
      @ErockOnTech  Pƙed rokem +2

      Good points. Thanks for sharing. Overall, I’m sticking with my 3080.
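
The clock-versus-core trade in the comment above checks out as arithmetic. Using the published boost clocks and core counts, and treating cores x clock as a crude throughput proxy that ignores IPC, cache, and memory bandwidth:

```python
# Crude shader-throughput proxy: CUDA cores x boost clock.
cards = {
    "RTX 3090 Ti":   {"cores": 10752, "boost_ghz": 1.86},
    "RTX 4080 12GB": {"cores": 7680,  "boost_ghz": 2.61},
}
for name, c in cards.items():
    print(f"{name}: {c['cores'] * c['boost_ghz']:,.0f} core-GHz")
# ~20,000 vs ~20,045 core-GHz -- near-identical on paper. So if the 3090 Ti
# still wins real benchmarks, the deficit sits in per-core throughput or
# memory bandwidth, which is the comment's point about Ada's slower cores.
```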

  • @pilotstiles
    @pilotstiles Pƙed rokem

    I agree with all your points. But I will always be a 4K purist and don't use any upscaling in my games. I too wonder what the true raw specs will be, and from what I've seen there isn't much gain over the 30 series in raw performance. Can't wait for AMD's RDNA 3 announcement.

    • @grigorioschristodoulou5229
      @grigorioschristodoulou5229 Pƙed rokem

      Rasterization and current ray tracing are also full of hacks. If you want "pure" 4K, you'd better not buy a GPU for at least the next 10 years, as nothing will be purely ray-traced to sufficient accuracy before then.

  • @life3600
    @life3600 Pƙed rokem +1

    In Cyberpunk 2077 at native 4K with RT on:
    20 fps using a 3090
    22 fps using a 4090
    This is unacceptable.
    A 10% increase in performance at that price???
    After 2 years. Like what?????

  • @shredenvain7
    @shredenvain7 Pƙed rokem +2

    DLSS is being pushed solely to combat the performance hit that comes with ray tracing. All these 40-series cards, as well as the 30-series cards, have rasterization taken care of. Yes, the Cyberpunk numbers shown at native 4K are abysmal, but that is mainly down to running the insane setting for every ray-tracing effect. Bottom line: until they create a new memory system to accommodate the massive demand that RT puts on the GPU, adding more RT and Tensor cores isn't really accomplishing anything. Maybe instead of adding more CUDA cores and slapping a ridiculous price tag on these "next gen" GPUs, Nvidia should innovate to solve the performance-sucking issue that comes with ray tracing. Then maybe those Tensor cores could be used for something groundbreaking (like enemy AI or world simulation) instead of existing solely as a band-aid for the performance gash caused by ray tracing.

    • @ErockOnTech
      @ErockOnTech  Pƙed rokem +2

      I like this comment. You’re not just complaining about the GPUs or my video lol. You’re actually proposing potential solutions to these problems. You’re right. Nvidia should have focused on solving the RT problem.

  • @Giljrg
    @Giljrg Pƙed rokem +1

    I had a Samsung TV that inserted fake in-between frames. It was really horrible: a fake smooth motion that just looked wrong. That kind of tech should be left to the monitor, not the GPU, to implement.

  • @hasnihossainsami8375
    @hasnihossainsami8375 Pƙed rokem +2

    The 4080 12GB is 100 percent a 4060/4060 Ti-class card, since it is the fully unlocked version of the fourth-largest die of this generation. The idea: Nvidia usually reserves its largest die (possibly an AD100; you can bet they'll introduce it later) for server products, its second largest (AD102) for x080 cards and above, and its third largest (AD103) for x070 and above. Logically, that makes the AD104 the fourth-largest die of the generation, which puts it in the x060 class; the fifth largest goes into lower-end products, the x050 class and below.
    Instead, what they're doing this time is reserving the entire AD102 die for the 4090 class, pushing the 4080 class down to the AD103 (effectively a renamed 104-tier die) and adding the AD104 (effectively a 106-tier die) to the same class while they're at it.
    You could argue these are just names, and Nvidia can call them whatever they want as long as the performance and hardware value is there. Except it isn't.
    Smaller dies also mean smaller memory buses, so we go from 384-bit on the AD102 to 256-bit on the AD103 to 192-bit on the AD104. For context, Nvidia has used the 192-bit bus for x060-class cards since Pascal, all the way up to Ampere; that's why this class was stuck at 6GB for so long, with the occasional 2060 12GB and 3060. Die-size-wise, the AD104 is only 295mm^2, which makes it the smallest x060-class die since the 1060 (200mm^2), against 392mm^2 for the 3060 Ti's GA104 and 445mm^2 for the 2060's TU106. Yes, silicon prices have gone up, but the die doesn't account for 100% of the cost of the whole card, so even a 100% hike in silicon price shouldn't warrant a $100 price increase, let alone $500 (assuming a 4060 Ti would be priced at $400). And TSMC reported a 20% price increase, not 100%.
    In conclusion: AD104 - small die, small bus width, big price. The 4060 Ti class is now the 4080 class, because Nvidia has a massive RTX 3000 inventory to clear.
    Their justification is DLSS 3.0, as if we'd forget that rasterization is a thing. With this generation they're literally selling us a software feature, which explains locking DLSS 3.0 to RTX 4000 only.

    • @ErockOnTech
      @ErockOnTech  Pƙed rokem

      You’re 100% correct! I’ve been saying the 4000 series cards are an expensive software update. People get mad at me about it. But it’s true.
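
The die-cost point in the long comment above can be made concrete with a toy bill-of-materials model. Every number here is a hypothetical illustration, not real BOM data:

```python
# How much a silicon price hike moves total card cost if the die is only
# a fraction of the bill of materials.
def new_card_cost(total_cost: float, die_share: float, die_hike: float) -> float:
    die = total_cost * die_share
    return die * (1 + die_hike) + (total_cost - die)

total_cost = 400  # pretend all-in cost of an x060-class card
die_share = 0.20  # pretend the GPU die is 20% of that

print(new_card_cost(total_cost, die_share, 1.00))  # doubled die price -> 480
print(new_card_cost(total_cost, die_share, 0.20))  # TSMC's ~20% hike  -> 416
# A doubled die price moves this hypothetical card by $80, and the reported
# ~20% wafer hike by $16 -- nowhere near the $500 gap between a $400
# x060-class price and the 4080 12GB's $899.
```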