RTX 4090 1440p Benchmark With DLSS 3 on/off vs RTX 3090 Ti | AMD Competes on Price | CPU Records |

Comments • 633

  • @haodongsun4159 • 1 year ago +34

    I don't really care whether AMD puts out a better GPU than Nvidia; just good performance at a good price will be enough.

    • @slylockfox85 • 1 year ago +17

      Most people just wait for AMD to launch a product that competes with Nvidia, wait for the inevitable price drop response from Nvidia, and they buy Nvidia's product anyway. This cycle has been repeating for years. Unless people are willing to support AMD with their pocketbooks, nothing will change. Nvidia keeps getting bolder because people keep buying their stuff. Even the people complaining about Nvidia.

    • @ogaimon3380 • 1 year ago

      well, most people don't buy a 3080-3090 Ti anyway; even the 3070 is rare at a large scale

    • @yan3066 • 1 year ago +1

      @@slylockfox85 well... I was an Nvidia fan; 6 of the 7 PCs I've owned ran on Nvidia cards. But seriously, I'm so let down by Nvidia's latest actions that I'm really excited about what AMD will bring in November. For the first time ever, I'm cheering for the underdog here. I really hope AMD wins more market share, just to teach Nvidia a lesson.

    • @-Ice_Cold- • 1 year ago

      @@slylockfox85 You're right.

  • @Braindawgs00 • 1 year ago +57

    I want to see the difference between DLSS 2.0 and 3.0 without frame generation, since according to Nvidia on Reddit, DLSS 3.0 without frame generation is what the 3000 and 2000 series cards are getting.

    • @iansrven3023 • 1 year ago +2

      yeah, also: does DLSS 3.0 actually feel smoother, or does it just create more frames?

    • @Arxgxmi • 1 year ago +2

      @@iansrven3023 that question is not really smart. There is physically no way it just copies prior frames; that would have to be fraud on their part, since the feature doesn't say or imply it does that. So it's probably more AI-generated stuff built on the systems already in place for the upscaling, making use of what DLSS has to work with anyway. I feel like the temporal element of DLSS especially is going to play a big role in inflating the fps counter without destroying the picture quality. Now, if all of what I said is true, there would be no reason at least the 30 series cards couldn't use this, which is REALLY depressing and shitty on their part if the feature is actually any good.

    • @dralord1307 • 1 year ago +3

      @@iansrven3023 If you look at the latency numbers in their Cyberpunk video, the latency for DLSS 2 at 65-ish fps was the same as for DLSS 3 at 100 fps. So unless that improves, it will look smoother but not play much smoother than DLSS 2.

    • @dralord1307 • 1 year ago

      @@Arxgxmi Uhm, he said "create more frames", which is what it does. It uses the past frame, the next expected frame, and the movement between the two to create and insert a middle frame. It isn't copying; it is inferring.
      As for why the 30 series won't get frame insertion: Nvidia stated it's because they made a 2x improvement in their Optical Flow Accelerator. The 20 and 30 series both have these on their chips. IMO it's just a copout for why they won't bring it to older cards.
      The real question is whether the new frame insertion tech decreases latency, and by how much. From their Cyberpunk video it didn't seem to bring latency down to where it should be if the engine had drawn the frames: they showed DLSS 2.0 getting 65-ish fps at 60-ish ms latency, then they switch to DLSS 3.0 getting 100-ish fps at the same latency. It then slowly goes down as the scene changes. To me that says if you're used to the smoothness of genuinely high fps in your controls, you're not going to have the same experience with DLSS 3.0 frame insertion. It will play more like the fps it would get with DLSS 2.
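
      To make the "inferring, not copying" point concrete, here is a minimal sketch of motion-compensated frame interpolation in Python. It is only an illustration of the general technique; the function and the half-vector warp are assumptions, not Nvidia's actual Optical Flow Accelerator pipeline.

      import numpy as np

      def interpolate_midframe(prev_frame, next_frame, flow):
          """Build a middle frame from two rendered frames and a motion
          vector field (H x W x 2, in pixels). Purely illustrative."""
          h, w = prev_frame.shape[:2]
          ys, xs = np.mgrid[0:h, 0:w]
          # Walk each pixel halfway along its motion vector...
          half_x = np.clip((xs + flow[..., 0] * 0.5).astype(int), 0, w - 1)
          half_y = np.clip((ys + flow[..., 1] * 0.5).astype(int), 0, h - 1)
          warped_prev = prev_frame[half_y, half_x]
          # ...then blend with the real next frame to hide warping errors.
          mid = (warped_prev.astype(np.float32) + next_frame.astype(np.float32)) / 2
          return mid.astype(prev_frame.dtype)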

    • @AVerySillySausage • 1 year ago

      There is 0 difference. There is going to be no improvement to super resolution other than the gradual updates already being released; they would have mentioned it otherwise. All they talked about was performance, and the only reason the performance is better is this frame generation. There is no actual improvement to the reconstruction or to the performance impact of the reconstruction.

  • @shortshins • 1 year ago +242

    Great video mate. I really hope AMD kill it this gen; Nvidia don't deserve the success. They've proven time and time again they couldn't care less about the general consumer: the 40 series pricing, and the fact DLSS 3 is exclusive to the 40 series, whereas AMD's version of upscaling, FSR, is agnostic and benefits cards across brands and generations. I have a 3080 and that will be my last Nvidia card.

    • @user-yg5vj9dd8y • 1 year ago +3

      It's because they're using more tensor cores

    • @4gbmeans4gb61 • 1 year ago +18

      You need to wake up, mate. You think these companies are not working together to establish price points? AMD's new GPUs are also 450W TDP with massive coolers. AMD's new GPUs will be PRICED according to their performance against other cards, including Nvidia's.

    • @bombayroll4631 • 1 year ago +7

      @@4gbmeans4gb61 yeah, unfortunately, while we want a hero to come in, this is not going to happen. AMD will be priced in line with Nvidia, minus maybe 5% to try and win a bit more market share.
      I mean, if AMD matched the 4080 in the $700 bracket then everyone would buy it, but for some reason they won't do it.

    • @apostleoffate2028 • 1 year ago +1

      Leaks suggest FSR 3.0 might be exclusive to RDNA 3 via hardware, just like Nvidia did with DLSS 3.0, but we will have to wait and see...

    • @4gbmeans4gb61 • 1 year ago +2

      @@bombayroll4631 Exactly, and with how the economy is looking, I doubt they are even making that much money on these cards at first anyway.

  • @DJ.1001 • 1 year ago +122

    Who would've guessed. Using software to inject fake frames instead of actually rendering frames is easier on the GPU. This is going to look bad and feel even worse.

    • @garyb7193 • 1 year ago +4

      random stuttering?

    • @Relrion • 1 year ago +27

      We won't know until Gamers Nexus does its review; better to wait before forming an opinion.

    • @garyb7193 • 1 year ago

      @kaizer5lock Funny, I get the occasional frame stutter in Elden Ring. I don't think it is as restricted to FPS titles as you implied. Time will tell.

    • @JanVerny • 1 year ago +7

      @@garyb7193 Frame stutter can obviously happen for a multitude of reasons. Elden Ring is just poorly coded. Don't get me wrong, great game... but they don't spend enough on technical improvements to the engine.

    • @Warsheep2k6 • 1 year ago +3

      @@garyb7193 Elden Ring has had that since release; it's normal for that game

  • @vaggeliskosiatzis5487 • 1 year ago +118

    If you compare the numbers of these two GPUs, the RTX 3090 Ti and the RTX 4090: the RTX 4090 has 52% more shader units and a 35% higher clock speed than the RTX 3090 Ti. With all those improvements you would expect at least 2x the raster performance compared to the previous flagship, but that's not the case at all; all you get is a really underwhelming 62% more performance in Cyberpunk at 1440p. And still Nvidia has the audacity to ask even more for their GPU architecture, which apparently shows that IPC went really backwards in order to hit marketable clock speeds. Hail Jensen, the master of deception...
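
    As a sanity check on that arithmetic, a naive scaling model (assuming performance scales linearly with shader count times clock speed, which real GPUs never quite achieve) does predict roughly 2x:

    shader_ratio = 1.52   # 4090 has 52% more shader units than the 3090 Ti
    clock_ratio = 1.35    # and a ~35% higher clock speed
    expected = shader_ratio * clock_ratio
    measured = 1.62       # the ~62% uplift seen in the Cyberpunk numbers
    print(f"expected ~{expected:.2f}x, measured {measured:.2f}x")
    # -> expected ~2.05x, measured 1.62x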

    • @GamerSg84 • 1 year ago +10

      NVidia IPC has been going backwards since Pascal. If you compare transistors and clock speeds from Pascal to Turing and from Turing to Ampere, you will see there has been a regression every generation.
      The 1080 Ti had 11.8 billion transistors at a 1.58GHz boost.
      The 3090 Ti has 28.3 billion transistors at a 1.86GHz boost (2.39 x 1.17 = 2.81x expected performance, but the actual is only about 2x, for almost 3x the price).
      And that is only if they had merely reused the Pascal architecture, when the norm is to improve your architecture; NV is actually regressing.
      Obviously this is because transistors are being wasted on RT/Tensor cores and god knows what other bloat NV is including.
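
      The same back-of-the-envelope check, applied to the transistor and clock figures quoted above (again a naive linear model; transistors spent on RT/tensor cores, caches and I/O make it unrealistic by design):

      transistor_ratio = 28.3 / 11.8   # 1080 Ti -> 3090 Ti, ~2.40x
      clock_ratio = 1.86 / 1.58        # boost clocks, ~1.18x
      expected = transistor_ratio * clock_ratio
      print(f"expected ~{expected:.2f}x if every transistor went to raster")
      # -> ~2.82x expected vs ~2x measured; the gap is the RT/tensor silicon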

    • @marsovac • 1 year ago +6

      Push power and get clocks; it costs them nothing, since the GPU die size remains the same. The cost is shifted to the user and his electricity bill, and to the world.

    • @DonC876 • 1 year ago +17

      You're comparing shader units in a workload that relies heavily on the ray tracing hardware, so that comparison doesn't really make sense; you're kind of comparing apples to bananas. Now don't get me wrong, I don't think Nvidia's pricing is consumer friendly, but I really admire the amount of research and innovation they put into advancing realtime rendering in general. As a game dev myself, that is enough to warrant the prices they've been asking, but I can totally see the point that regular/enthusiast gamers are making. But just ask yourself: if Nvidia weren't pushing the technology forward, there would be no notable advancement at all. Take FSR from AMD, it's just a glorified traditional upscaler; nothing new, nothing innovative about it. And if you've seen the Digital Foundry interview with the lead developer of FSR: when DF asked whether they would support an open API that would unify XeSS, DLSS and FSR, his answer (no, we won't support that) was all too telling that they would rather hinder progress than give up their unique selling point that FSR works on any card, because it's not doing anything really new.

    • @manuelmunguia616 • 1 year ago +3

      I think it is fair to compare it to the 3090, as there will surely be a 4090 Ti next year. Also, those improvements are actually good, even if it is not 2x the performance of the top-tier last-gen GPU.

    • @CriminalGameplay • 1 year ago

      Memory speed being the same is why.

  • @FATBOY_. • 1 year ago +22

    Prices here in Australia
    24GB RTX 4090 $2390
    16GB RTX 4080 $1792
    12GB RTX 4080 $1344
    🤣

    • @altaafsheik5725 • 1 year ago +2

      I feel you. I think my country will be similar to that.

    • @yellowflash511 • 1 year ago +1

      Damn, it's cheaper in India, at 1915 for the 4090 I think

    • @JustAnotherAccount8 • 1 year ago +2

      Yeah, it really sucks living here, doesn't it

    • @hollyh888 • 1 year ago +3

      In the Netherlands:
      24GB RTX 4090 1959 euro
      16GB RTX 4080 1469 euro
      12GB RTX 4080 1099 euro

    • @legendp2011 • 1 year ago +2

      Honestly not that bad. I had a friend who paid $1800 AUD for a 3070 Ti 12 months ago, so an RTX 4090 for $2400 is cheap in comparison... however, after 10% GST I expect the RTX 4090 to be $3000.

  • @Shadowed007 • 1 year ago +3

    NVIDIA Marketing 2020
    3090: The 8K gaming monster
    NVIDIA Marketing 2022:
    4090: 1440p running low res with fake frames

  • @JorgeAlarconL • 1 year ago +47

    I think a lot of people, including reviewers and tech YouTubers, have missed this fact when measuring wattage with DLSS in the past. My 3090, and surely every GPU down to the 2060, cuts power draw because the internal resolution is lower, which means the GPU, memory controller, power delivery and memory bandwidth are less stressed than at 4K without DLSS; you can clearly see the GPU reducing clocks in favor of powering other components.
    I tested this because I thought DLSS would activate the tensor cores, so a little more power would be used, but it's not.

    • @gameguy301 • 1 year ago +1

      Nvidia even highlighted DLSS as a way to save battery on gaming laptops; it's no big secret.

    • @jobslolo7387 • 1 year ago +3

      imagine ... now your gpu can consume "only" 500w instead of 600w, great indeed.

    • @najeebshah. • 1 year ago

      @@jobslolo7387 It's 450W for 2x the performance, same as last gen, so you've got no grounds here lol

    • @jobslolo7387 • 1 year ago +9

      @@najeebshah. You're pretty delusional if you think the actual performance is 2x in games.
      It's 2x when the game can have DLSS 3 AND RTX. But you can't really compare apples to apples, because the 30 series cannot have DLSS 3, only DLSS 2; that's purposeful gimping of the previous gen by Nvidia to make the newer gen look better.
      So how many games will actually have DLSS + RTX? Not all that many; fewer than 10, in a sea of games that have neither tech.
      Nvidia's own slides show 3 games that have no DLSS or RTX... and guess what: the 4080 was slower than or on par with a 3090.
      You will see 2x performance in like... 3-4 games... maybe? The more likely case is you get something in the realm of 20% more performance.
      I guess you're the kind of person their marketing actually works on. Two-digit IQ.

    • @divanshu5039 • 1 year ago +3

      @@najeebshah. lol, didn't you watch the video? It's actually 62% faster than the RTX 3090 Ti.
      At least watch the video before smashing your keyboard.

  • @joescalon541 • 1 year ago +4

    So this generation fps isn’t going to mean much. The entire point of going above 60 fps is to decrease latency and make the game feel smoother. Our eyes can’t really tell the difference above 60 fps but the game feels way smoother, but if the latency doesn’t scale then it will feel like playing a console game on an old tv at 30 fps.

    • @ChrisHillASMR • 1 year ago

      stfu old pos. people's reactions vary and at ur age ur lucky to perceive at all

  • @nostrum6410 • 1 year ago +5

    power draw from the wall could be from reduced cpu utilization, or other components.

  • @Playboyer670 • 1 year ago +12

    I'd prefer to see the 3090 Ti vs the 4090 with DLSS off. I'd like to see the raw performance comparison.

    • @doppeltegenkidama • 1 year ago +1

      Me too.
      And these Nvidia benchmarks and tables are complete bullshit.
      Every new gen they talk about 3x more power; in the end it's 25-40%.
      I'm sure it's the same 25-40% increase on the 40xx gen, and all the other performance boosts come from DLSS 3.
      That's the reason Nvidia doesn't want to make DLSS 3 compatible with the old gen.

    • @Real_MisterSir • 1 year ago +1

      Yep, and I'd also like to see it without ray tracing, to see the actual raster hard-pull performance, because that will be the baseline indicator for all games; ray tracing implementation varies greatly from game to game. Cyberpunk leans very heavily into ray tracing while other games don't, which can cause massive performance differences in comparisons where ray tracing is enabled.
      I would also love to see DLSS 2.0 vs DLSS 3.0 without frame generation enabled (you can actively turn off this setting in DLSS 3), so we get the true performance of DLSS 3.0 without latency-incurring fake frame generation skewing the results.

  • @CrayGriffin • 1 year ago +4

    Great video. Something I think you missed is that the AI-generated image feature can be turned off independently of DLSS 3.0; at least that's how I understand it from the way it's worded.

  • @GeoBot_Gaming • 1 year ago +18

    Native resolution 100% for me; at 4K I can see the difference very easily. So… can't wait to see what AMD comes up with, I've had enough of Nvidia…

    • @wasd-moves-me • 1 year ago +5

      Same, I game at 4K and I don't use ray tracing or DLSS

    • @crazybeatrice4555 • 1 year ago

      @@wasd-moves-me Ray tracing is looking a lot more reasonable this generation. There's a lot more ray tracing horsepower in the upcoming generation of cards.

    • @wolvenheart4309 • 1 year ago

      Yeah, I bought a 3060 Ti and it looks trash at 1080p using DLSS 2.0 Quality mode. After that I never used it again

    • @Sims64340 • 1 year ago

      Yes 1080p looks trash I agree. (2013 Xbox One 100$ resolution)

    • @arthurmorgan6353 • 1 year ago

      Yep 4k is becoming more and more important with the abysmal blurry TAA implementations

  • @technologicalelite8076 • 1 year ago +9

    You definitely have a right to be concerned about latency, and it's good to get this information out to help users and potential buyers be informed. But frame times also carry latency. Yes, it would increase latency compared to non-DLSS 3, but it shouldn't be by much. Latency is important in esports and the like, so there it's better just to use lower settings; but for those who don't care much and want better fidelity, this is good.

  • @mohsen0664 • 1 year ago

    Hi, thanks for the info. You are using an LG C1 as a monitor;
    can you make a video or say something about your experience with the LG C1?
    How do you use it, and did you experience any pixel burn-in? Do you suggest using LG OLED TVs as monitors for gaming and static work in a PC build?

  • @x32i77 • 1 year ago +6

    Damn I was hoping for 60 fps at native 4k without blurry dlss

    • @anuzahyder7185 • 1 year ago

      It can, but not in Cyberpunk. That's 59 fps at 1440p; it would be ~30 fps at 4K native.

  • @Robonoodles • 1 year ago +3

    Thanks for the video Daniel, quality video as always man

    • @Rivexd • 1 year ago

      You can't know that, he's just uploaded it.

    • @Robonoodles • 1 year ago

      @@Rivexd Nah, I didn't watch all the way through before commenting, but a few mins in and I already liked it.

  • @henrikeng1585 • 1 year ago

    i love the little screams you make when you move your camera haha

  • @Camilo316 • 1 year ago

    Question, maybe you can answer: 3090 with a 12900K; when HDR is on and I go to use the Xbox Game Bar or its FPS counter, I get stutter on my PC and games are unplayable. Do you know what's going on with that?

  • @stevin47 • 1 year ago +2

    Wow, amazing how much performance they can release through drivers when a new series comes out, and how much they can lower performance later when they want you to upgrade from the same series of cards.

  • @MrTHORN74 • 1 year ago

    I'd like to know which card is outputting the 22fps on the left; is it the same card putting out the 98fps on the right? If so, the raw raster isn't looking good.

  • @Dionyzos • 1 year ago +6

    Btw Daniel, you should try DLDSR at 1.25x (5K downscaled to 4K with deep learning using the tensor cores) with DLSS Performance. It looks significantly better than 4K DLSS Quality. I'm using it in RDR2 on my CX OLED right now and it looks stunning.

    • @Keivz • 1 year ago +1

      Dlss performance of 5K also runs considerably worse than dlss quality fwiw despite being the same internal 1440p resolution. But yeah it does look better and you get to control the sharpening more with dldsr.

    • @Dionyzos • 1 year ago

      @@Keivz I think the cost is worth it if you have a little performance to spare or can reduce some settings

  • @x9x9x9x9 • 1 year ago +3

    This DLSS 3 tech will definitely have drawbacks. Nvidia couldn't raise raster performance more, so they relied heavily on DLSS to claim the "2-4x" BS.

  • @pastafazool5951 • 1 year ago +7

    I have a 12900K (5.2GHz) and an MSI 3090 Ti Suprim X with DDR5-5600 Dominator Platinum. When I ran those settings I got 45-55 fps (not 39 fps like WCCF); with DLSS on etc. I was able to hit 95-110 fps (Performance mode). My monitor is a Samsung Odyssey G9, 5120x1440 240Hz; the card is currently on air, boost clock over 2100MHz the whole time. Not really sold on the 40 series yet.
    PS: Still think these prices will crash hard. Also, all my tests were performed on a test bench (open-air GPU, open-loop CPU).

    • @iikatinggangsengii2471 • 1 year ago +1

      They will, but probably, just like with the 3000 series, you will have to wait over a year.

    • @dhaumya23gango75 • 1 year ago +1

      Wccftech got 37 fps (not 39) on the OC'd 3090 Ti in an incredibly demanding scene. Go and benchmark in the exact same location and you would get similar fps. Bang4BuckGamer's OC'd 3090 Ti was struggling to maintain 60 fps with DLSS set to Quality at 1440p psycho RT in that area; that scene is more demanding than driving around the city.

    • @pastafazool5951 • 1 year ago +1

      @@dhaumya23gango75 I never said I was driving around heh. I did everything that was done, 15x over. If I put a block on and OC this card, I'm sure it'll do even better. All my settings are at Max/Psycho, just to be clear here. At Quality I was getting 45-55. Sorry to disappoint, the 3090 Ti is still very relevant. I'm sure I'll be picking up a 4090/Ti anyway when the timing is right. (Currently waiting for EK to pack and ship my block, and waiting for my cable from EVGA so I can burn this card in.)

    • @dhaumya23gango75 • 1 year ago +1

      @@pastafazool5951 Benchmark in the exact same location, that was my point. I know you were using the same settings and neither did I make any comment about the 3090 ti being irrelevant.

    • @pastafazool5951 • 1 year ago +1

      @@dhaumya23gango75 I did bench the exact location haha xD, and then some! No hate here brother! PS: the news about the fragility of the 12-pin adapters only being rated for 30 cycles is scary as shit btw.

  • @Mikediszel • 1 year ago +1

    Great and informative vid. I appreciate your time and effort.

  • @rickysoora7930 • 1 year ago

    which subject do you teach?

  • @optimalsettings • 1 year ago +1

    When enabling DLSS 2.0 on my 3090 the power consumption is also lower than native. So why shouldn't the 4090 do the same? There is less to render.

  • @user-np1he4mu2u • 1 year ago

    Great analysis. Hope you keep it up!

  • @REKTNA • 1 year ago

    What's up with the PCIe connectors? Can't these just slot into a normal mobo? I ask because I think I saw it said that the recommended CPU was the 5900X, which is X570/PCIe Gen 4.

  • @AlexC259 • 1 year ago +5

    What I am interested in is how the 4000 series will perform with PCVR units, where just getting the unit to run at its native resolution is important.
    There are a lot of next-generation headsets coming out, and the 4000 series cards will be important and needed to drive these new high-resolution sets.

    • @Kai_Mercer • 1 year ago

      Same, I can't wait to see how high I can take the res in VR games with a 4080 16GB

    • @BillCipher1337 • 1 year ago

      Not great, unless your VR games use RT and DLSS 3

    • @Kai_Mercer • 1 year ago +1

      @@BillCipher1337 Only one VR game, I think, uses any type of DLSS

    • @eduardosantiago6948 • 1 year ago

      @@BillCipher1337 Latency in VR is not great; DLSS 3 may be a bad idea there.

    • @BillCipher1337 • 1 year ago

      @@eduardosantiago6948 That's true, and as far as I know most VR headsets already use some kind of interpolation

  • @johnbeeck2540 • 1 year ago

    Almost 60K Subs! The power reduction at the wall could be the net effect of DLSS unloading the CPU?

  • @turtlefeet7722 • 1 year ago +9

    I have no idea why they test their GPU with the most broken game of all time. Even a few months ago I had issues with bugs and glitches.

    • @cybervoid8442 • 1 year ago +1

      Because a lot of people like Cyberpunk. Pretty simple really

    • @dam78 • 1 year ago

      The game has become popular again since Edgerunners release

    • @TekGriffon • 1 year ago +1

      Because it's the Crysis of its generation. A game that looks so good (on high details, obv), you have to check twice to make sure it's not a tech demo.

    • @wasd-moves-me • 1 year ago +1

      Thank you someone else gets it

    • @wasd-moves-me • 1 year ago +1

      @@TekGriffon I'm sorry, it can look OK at times, but it looks poor just as often

  • @Durkhead • 1 year ago

    Can you run Cyberpunk on your 3080 at 4K with ray tracing and no DLSS, to compare the framerate to Nvidia's 4090 presentation?

    • @optimalsettings • 1 year ago +1

      I have done it with my 3090. It's 20 fps.

    • @optimalsettings • 1 year ago

      But Nvidia did it at 1440p, not 4K. At 1440p native the 3090 gets 40 fps.

  • @godzilla2k26 • 1 year ago +7

    Only appears to reduce latency because it massively boosts frame rate with fake frames. These cards are going to be torn apart when review samples release!

    • @daemonreaderx • 1 year ago

      No. It reduces latency because this Frame Generator runs along with Reflex. Something like that. Reflex>>Reflex+FrameG>>Off

    • @godzilla2k26 • 1 year ago

      @@daemonreaderx Latency reduction is based on interactions with fake frames.

  • @Camilo316 • 1 year ago

    All of this is done in Performance mode, right? I usually play in Quality or Balanced mode; I'd prefer they show the numbers for those modes instead of Performance.

  • @tektekken176 • 1 year ago +1

    Cyberpunk has its own benchmark generator, so I imagine they used the game's built-in scene, which is the bar/club scene and then the open-city camera flythrough.

  • @_ICE_VR • 1 year ago +6

    Glad we both run the 3080 12GB as a daily driver lol
    Also, I remember you did some VR vids before; I'd be interested in seeing some differences between Nvidia and AMD for VR frame times

    • @_ICE_VR • 1 year ago +2

      I would for sure like to talk through my current findings with you sometime if you want; I love VR and have accumulated a lot of opinions lol

    • @rickysoora7930 • 1 year ago +1

      I too would love to see some benchmarks for vr gaming.

    • @rickysoora7930 • 1 year ago

      @@SkeleTonHammer Sounds good. I'm just gonna hold off until I see the new AMD cards; I've been needing to upgrade my 1060 3GB for a while.

    • @ruiaz701 • 1 year ago

      @@SkeleTonHammer I have a 3090, and in No Man's Sky DLSS helps keep the game fluid in VR.

  • @willie_style26simmons98 • 1 year ago +2

    I thought they told us it was supposed to be 4x faster? Maybe I was mistaken.

  • @PssupplementreviewsbyPete

    3090 Ti -> 4090: not even double the performance with ray tracing enabled in Cyberpunk at 1440p maxed? WTH nVidia!
    Please class-action sue them for false advertising.

    • @EldenLord. • 1 year ago

      A 62% native performance increase is still the biggest jump we've ever had. Maybe it will actually be double the performance in other games that are better optimized than Cybertrash 2077. Also keep in mind that performance will only get better with future drivers; it has always been this way.

    • @PssupplementreviewsbyPete • 1 year ago

      @@EldenLord. It's just that they are not allowed to mislead their customers. Well, we'll see when the reviews come out, but right now it's not looking good.

    • @Chrontard • 1 year ago

      @@EldenLord. if there is 62%... 62% is probably only your hopium at this point

    • @psylina • 1 year ago

      @@EldenLord. Cyberpunk runs pretty well on a 2060, which is very cheap. The 4090 probably isn't much of an improvement from a raw performance perspective.

  • @MonkeyDSlayer13 • 1 year ago

    Classic teacher: loses time in the introduction and then rushes to the end haha

  • @greendude0420 • 1 year ago

    It's so cool seeing old games remixed with RTX; hopefully it becomes mainstream tech someday.

  • @getermoura • 1 year ago +1

    Hmmm, 25% less power? Makes me think some internal components of the DLSS pipeline are bottlenecking and leaving other components idle in the meantime. Or maybe even the CPU can't keep up, but that's not likely, as half the frames are not computed.

  • @allergictobs9751 • 1 year ago +7

    When the 3090 came out it was very cool too; then later memory temperature readouts were added and people realized what kind of dumpster fire they had running in their PC.

    • @Real_MisterSir • 1 year ago

      The new 40xx coolers are massively improved though; I think that's the last part we should be worried about. For now, the only true drawback is the damn price, which is more heated than the VRMs on the 3090 cards.

  • @mage0killer • 1 year ago +20

    My take on the article: turning on DLSS 3 Quality mode takes the rendered frame down to a lower resolution (no longer native) that is then DLSS-upscaled. This increases the frame rate from 59 to 85. Then the frame generation part doubles the frame rate to 170 by inserting a generated frame between every pair of rendered frames. That's the only way this makes sense to me.
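
    That two-stage reading is easy to write down. A tiny sketch under the commenter's assumption (upscaling raises the rendered rate, frame generation then doubles the displayed rate; the 85/59 speedup factor comes from the numbers above):

    def dlss3_displayed_fps(native_fps, upscale_speedup=85 / 59):
        """Stage 1: DLSS upscaling speeds up rendering.
        Stage 2: frame generation doubles the displayed rate."""
        rendered_fps = native_fps * upscale_speedup   # 59 -> ~85 fps
        displayed_fps = rendered_fps * 2              # interpolation -> ~170 fps
        return rendered_fps, displayed_fps

    print(dlss3_displayed_fps(59))  # -> (85.0, 170.0)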

    • @googleslocik • 1 year ago +5

      Probably. That means you always get a 1-2 frame delay with DLSS 3, which is 16-33 ms of extra delay.

    • @mage0killer • 1 year ago +2

      @@googleslocik At 85 FPS that is 11.76 ms of additional delay. The thing is, they are comparing Reflex off to Reflex on, which by itself lowers latency by like 30 or 40 ms. It's not an apples-to-apples comparison, but still, the increased latency from the generated frames shouldn't be that noticeable.

    • @iikatinggangsengii2471 • 1 year ago +1

      There's also the possibility that they intentionally make native res inefficient in DLSS/FSR titles (overly high graphics quality).

    • @timdal78 • 1 year ago

      @@googleslocik Yep, but when the hardware latency chain is good and the engine is not laggy, that latency is very tolerable.
      I stream PC VR games to my Quest 2 over WiFi, and even with that latency, believe it or not, it's fine if the total motion-to-photon lag is low enough. And if VR doesn't feel weird with that latency, then unless you're playing Counter-Strike you will totally be good. Time will tell.

    • @TheDashACorner • 1 year ago

      @@googleslocik I wish people would watch the nVidia keynote first before posting BS like this.

  • @Dlo_BestLife • 1 year ago +4

    Really would encourage people to just take it all in & relax until Nov 3rd. I want a little more performance, but the BEST choice is to WAIT for all the REAL data & what AMD has to offer before making a decision. I'll probably go Nvidia again, BUT I would love to try AMD if they make a worthy enough product. ✌️

    • @th3orist • 1 year ago

      I never waited tbh. I bought the 1080 Ti, the 2080 Ti and the 3090 right at release, and AMD never had a card where I said afterwards "oh man, why didn't I wait, this card would've been better".
      So I have no reason to believe that AMD's flagship will make me regret getting the 4090 at launch.

  • @kaseyboles30 • 1 year ago +3

    I suspect at least some of the 4090 coolers were designed when the 600-watt rumors were still strong.

    • @goldnoob6191 • 1 year ago +2

      🙄 Or maybe because it actually outputs 600W..

    • @kaseyboles30 • 1 year ago +1

      @@goldnoob6191 They measured less power usage than that.

    • @goldnoob6191 • 1 year ago

      @@kaseyboles30 😆😆 You can also throttle the power down to 250W if you crank down the clocks.
      Since there are rumors about the GPU hitting 3GHz...
      Also, looking at the transient figures, I can already say the average will be around half that value.

  • @StingyGeek • 1 year ago

    Really interested to hear the third-party reviews of DLSS 3.0. It is certainly a great way to manipulate FPS benchmarks; it will also be interesting to know if it is great for gameplay.

    • @alexkarp130 • 1 year ago

      DLSS 3 is just that shitty "motion interpolation", like the "Smooth" mode on TVs: it adds fake frames, making motion look odd (soap opera effect), while also adding a lot of input lag, which they are pathetically trying to cover with the already existing Nvidia Reflex. The best looking and most responsive experience is and will be DLSS 2 + Nvidia Reflex. Also, the new cards are not much faster in real frames, so they use graphs with DLSS 3 fake frames; misleading, like other big hypocrite companies.

  • @Hyperion1722 • 1 year ago +1

    Hmmm, the "within 50C temps" could possibly be the room temperature...

  • @lancedunn310 • 1 year ago +3

    You're telling me that an RTX 4090 barely gets 60fps running Cyberpunk at 2K? I get that it was using the best settings, but that means 4K will be out of the question for true next-gen games. These cards are nowhere near good enough for the prices they are asking.

    • @anuzahyder7185 • 1 year ago

      Yes, because Cyberpunk's RT is crazy. RT Overdrive brings even the 4090 to its knees, didn't you see? 20 fps. You would have to use DLSS or turn down the RT options.

  • @CharlesVanNoland • 1 year ago +1

    I would think a digital TV's motion smoothing is actually just using the MPEG macroblock motion vectors to generate the extra frames, but yeah, a full per-pixel motion vector map for the display will allow for better interpolation. The problem is when there are transparent things, or particles (like smoke). The vector map can only represent the motion of a single surface pixel, not a layer of multiple overlapping things like particles, glass, etc. That means either the motion of the transparent object will fudge around what's behind it, or the motion of what's behind it will fudge the transparent thing's position. This was visible in DOOM, which used a velocity map for global motion blur: if something was moving around behind a window, the window's surface would be smudged around in certain situations.
    CP2077 running at 170FPS with DLSS 3 means it's rendering frames at 85FPS, which is a frametime of about 11 milliseconds. The interpolation requires the next frame in time to render the interframe, so 11 x 1.5 = ~17ms of frametime latency, which is equivalent to the latency of 56FPS. I think I would be more inclined to play CP2077 at 85FPS with 11ms frametime latency instead of 170FPS with 17ms frametime latency.
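
    Worked through in code, that latency arithmetic (assuming, as the comment does, that the interpolator must hold each rendered frame back by an extra half frametime on top of the full frametime, i.e. the 1.5x factor):

    rendered_fps = 170 / 2                # DLSS 3 displays 170, renders 85
    frametime_ms = 1000 / rendered_fps    # ~11.8 ms per rendered frame
    held_back_ms = frametime_ms * 1.5     # wait for the next real frame
    equivalent_fps = 1000 / held_back_ms  # what that delay feels like
    print(f"{frametime_ms:.1f} ms -> {held_back_ms:.1f} ms "
          f"(~{equivalent_fps:.0f} fps worth of input latency)")
    # -> 11.8 ms -> 17.6 ms (~57 fps worth of input latency)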

    • @Real_MisterSir • 1 year ago

      The thing with TVs is there is no input, thus no input lag to worry about. They can afford to delay frames to calculate an intermediate frame of decent quality without the viewer even realizing things are delayed, because once they're put on screen the framerate is smooth, even if every single frame is delayed, they're all delayed equally so it doesn't matter.
      They can get away with very old ai tech and still get incredible results easily, but the moment latency becomes a point of care - like in pc gaming where you have mouse and keyboard input - that's when the response of every generated frame matters, and you can't just withhold frames for that long while supplementary frames are being generated.
      I could do it easily with some rudimentary FLASH morph features nearly 10 years ago with only a few weeks of practice in IT class. It's really not difficult to do.

  • @the1observer • 1 year ago +2

    Imagine drooling over a 4090 when you had to sell your 3090 at a loss to get a 3090 ti. People doing this are a bigger problem than scalpers.

  • @mikee3437 • 1 year ago +1

    The tensor cores are built for the kind of math used in AI, versus the RT cores; so by offloading the calculations from the RT cores, which are running at max, the load shifts to the tensor cores, which are much better at AI calculations. It uses less power because the GPU does not have to think as hard.
    I really wish Nvidia would do tech architecture videos like Intel does, so more people could understand how things work under the hood.

  • @florisbackx1744 • 1 year ago

    Ah, now I understand why you are so good at explaining these complex concepts to viewers: you are a teacher!!

  • @yanmew • 1 year ago +4

    The problem is the price... The performance and new tech sound incredible, and I would love to experience it! But how can I make a 1.6k purchase more justifiable to myself? Maybe I should learn Blender haha.

    • @nomadicsouls3290 • 1 year ago +1

      I'm going to start an OnlyFans to help pay for it. Everybody seems to be at it these days 😅

    • @divanshu5039 • 1 year ago +1

      @@nomadicsouls3290 lol

  • @yellowflash511 • 1 year ago +2

    Jensen will have to personally blow me off for me to pay those prices. Waiting for Nov 3.

  • @nyftn • 1 year ago

    I like the detailed videos. I do indeed only compare what I need and what is useful for sim racing. I'm still on a single screen and the GPU runs at 80% load; I do that on purpose to have fewer frame drops when racing online. I see the 40 series as a good step forward, and the 50 series will be worth upgrading to in my case. Settings are mostly maxed out. Rennsport (UE5) is going to change that, I guess lol

    • @nyftn • 1 year ago

      When the fps is already 100, for example, the latency is OK for me, and in that case I could use DLSS 3 to get it up to the monitor's refresh rate without using G-Sync. I just think I'll see a lot of people on Facebook asking whether a 40 series card can bump their 30-40 fps with DLSS 3

    • @N4CR5 • 1 year ago

      AMD is faster in raw raster and in some racing games; check them out too. Nvidia is being extra asshatty.

  • @popthiccle1158 • 1 year ago +1

    damn ur a high school teacher??? I would have loved to have u as a history teacher in highschool😂

  • @pilotbsinthesky3443 • 1 year ago +1

    Great to hear about less power consumption on 40 cards.

  • @ReidBoS • 1 year ago

    Love your videos, but you need an ad-blocker on your browser 🤣

  • @giedmich • 1 year ago +1

    50C at full load? So why on earth did they make the cards so big? They could have made the cooling half the size and hovered around the usual 70C. Something isn't adding up here.

  • @MrLeovdmeer • 1 year ago

    For that crazy price it had better be extremely good.

  • @pete2097 • 1 year ago +6

    I think DLSS 3 shouldn't be called DLSS 3.
    How about Deep Learning Frame Insertion (DLFI), or Frame Sampling Injection (FSI)?
    Also, I think there should be a method of controlling what the frame rate needs to be for the insertion to kick in: say, if it's below a target, then a difficult scene can remain smooth, but the feature doesn't need to be active all the time.
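
    The target-gated behaviour proposed here could look something like the toy logic below. It is entirely hypothetical; DLSS 3 exposes no such control, and the function name and thresholds are made up for illustration.

    def should_insert_frames(rendered_fps, currently_on,
                             target_fps=90.0, hysteresis=5.0):
        """Enable frame insertion only when rendering falls below a target,
        with a dead band so the feature doesn't flicker on and off."""
        if rendered_fps < target_fps - hysteresis:
            return True      # struggling scene: smooth it out
        if rendered_fps > target_fps + hysteresis:
            return False     # fast enough natively: skip the added latency
        return currently_on  # inside the dead band: keep the previous state

    print(should_insert_frames(60.0, currently_on=False))  # -> True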

    • @zqzj • 1 year ago

      That would just add more latency.

    • @AVerySillySausage • 1 year ago

      There should hopefully be a way to turn off the frame generation and essentially just run DLSS 2. Because not everyone is going to like it and not on every game. It shouldn't be up to the developer either because you know some won't bother.

  • @yukisnoww • 1 year ago

    My local Gigabyte distributor has those Newegg prices and... the 6950 XT at US$670, plenty of stock too. The rest haven't followed though... don't know what's up

  • @joshuaguzman9762 • 1 year ago

    What do you teach in high school boss?

  • @Nite-Lite-Gamers • 1 year ago +9

    Wouldn't you know it: Nvidia clock = 2850MHz, AMD = near 3GHz; Nvidia cuts power draw by 25%, AMD uses less power. Strange how this comes out just before the AMD cards launch. I sense panic from Nvidia.

    • @Angel7black • 1 year ago +5

      If there was panic they wouldn't be selling a 70-class card for $900. It's crystal clear that Nvidia doesn't care about AMD at all; if they did, this would be the stupidest way of handling it. This is just S-tier hubris on Nvidia's part: they genuinely think they don't have to worry, not just about AMD, but about people getting sick of their anti-consumer crap. I think Jensen is going to get a rude awakening this time around, but I don't think he cares at all what AMD is doing.

    • @Nite-Lite-Gamers • 1 year ago +1

      @@Angel7black Yeah, you could be right, but I do think he will get a shock, and that will be on the two versions of the 4080. I'm wondering though: are they now going to put the 4070 on a different die, or still use the 4080 12GB's die?

    • @impc8265 • 1 year ago +3

      @Salt Maker lol, that bus width, I'd say that's a 4060; and the 16GB 4080 is the 4070, and the 4090 is the 4080

    • @christiannozza5469 • 1 year ago

      Keep in mind that AMD and Nvidia use different ways to measure GHz, so you can compare one Nvidia card to another Nvidia card, but not an Nvidia card to an AMD one.

    • @christiannozza5469 • 1 year ago

      @Salt Maker oops

  • @AdiCherryson • 1 year ago

    When it comes to inserting frames, the previous video sounded more reasonable (even if that's not what is going on in the algorithm at all). This time, with some future frame buffer and motion vectors, I don't buy it. Although I'm prepared to be corrected; it just doesn't make much sense. Is the motion even known to the graphics card? And even if it is, it would still be predicting the future. In the same way, you could take the last two (three...) frames and extrapolate what comes next. It won't solve the problem of responsiveness (since we don't know whether the player will generate input or not).
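
    For contrast, the extrapolation mentioned above is trivial to write down; a minimal linear sketch (an assumption for illustration: it predicts instead of waiting, so it adds no delay, but it is wrong the moment motion or input changes):

    import numpy as np

    def extrapolate_next(frame_a, frame_b):
        """Linearly extrapolate frame C from frames A and B: C = B + (B - A).
        No waiting for a real next frame, but new input invalidates it."""
        a = frame_a.astype(np.float32)
        b = frame_b.astype(np.float32)
        return np.clip(2 * b - a, 0, 255).astype(np.uint8)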

  • @laynemccormic9102 • 1 year ago

    does anyone know what classes Daniel teaches? Algebra, trig, calculus?
    Edit: I found it on the internet he teaches geometry.

  • @SeedMesh • 1 year ago

    I hope they will implement DLSS 3.0 on the 30 series. They said the same about RTX, and then after all the hype I was using RTX with my 1070.

  • @garyb7193 • 1 year ago +4

    It sure looks like the DLSS 3.0 fake-frame-generation latency issue could introduce stutter at the very least. Personally, I find random freezing and stutter in games so annoying, almost as bad as low fps itself.

    • @zqzj • 1 year ago +3

      Not only will it increase latency, but it will have a significant negative impact on accuracy.

    • @garyb7193 • 1 year ago +4

      @@zqzj Yeah, that and the fact that DLSS 3.0 leaves behind those who purchased 20 and 30 series cards is a hard pill to swallow. Especially when AMD finds ways to include several generations of their cards, and the competition's cards as well.
      As far as the quality difference between FSR and DLSS goes, I believe we will reach a point of diminishing returns, at which point FSR 2.x or FSR 3.x would be generally accepted as 'good enough'.

  • @fullspeedfordbronco • 1 year ago

    I like this dude…keep up the great work!

  • @Dg-zj6jo • 1 year ago

    Can my 3090 Ti Strix do DLSS 3?

  • @Kizzster • 1 year ago

    Does anyone know the 1440p and 4k benchmarks vs old gen without crappy dlss?

    • @Chrontard • 1 year ago

      4080 12GB below the 3090 Ti; 4090 +40% over the 3090 Ti in cherry-picked games

    • @dhaumya23gango75 • 1 year ago

      @@Chrontard Oh yeah, no. The 4090 gets around 70 percent more performance than the 3090 Ti in Division 2 and RE Village, which are games with no RT/DLSS implementation, going by the graphs on the website. 50 percent more in AC Valhalla, a game which runs badly on Nvidia cards. So definitely well over 60 percent on average over the 3090 Ti. The 4090 seems to be an impressive card, unlike both 4080s.

  • @la7era1u54 • 1 year ago

    His heater kicked on? It's 97f here right now. Are you in the arctic???

  • @gonkbous • 1 year ago +2

    Not being able to hit 4K 60fps natively is really disappointing, since they said 2x raster perf. Still probably worth a buy though.

    • @jeevan1198 • 1 year ago +2

      It's not worth buying at all; it is a full scam, for two major reasons. The first reason is the prices are ridiculously high for what the GPUs are actually capable of. The second reason is they are making false performance claims: Nvidia said the 4090 is 2-4x faster than the 3090 Ti, and we all know that's total bullshit. They shouldn't be allowed to say it's 2-4x faster unless the GPU actually is that much faster than its predecessor, apples to apples.

    • @gonkbous • 1 year ago +1

      @@jeevan1198 I am saying it's worth a buy as someone who currently doesn't have a GPU and is in the market for one. I agree that it's a poor decision to upgrade from something like a 3090.

    • @jeevan1198 • 1 year ago

      @@gonkbous If Nvidia had really tried to push the 40 series to its limits and get the most performance out of it as possible, they could have made the 4090 around 80% faster than the 3090 Ti in rasterised games and up to 3x faster in ray-traced games without DLSS, if they really wanted to. They chose not to, because Nvidia likes to minimise performance and maximise prices. That's what makes them so fucking selfish; it proves they don't give a shit about their customers. Nvidia has relied heavily on DLSS to give the 40 series a massive performance boost, and they can't just rely on DLSS to boost performance. I'm not saying DLSS is bad; it's a cool feature and it can be useful in some cases, and I don't have a problem with Nvidia developing their DLSS technology. One problem I do have with DLSS, though, is when it becomes the main selling point of the GPUs. Even if someone had a 1080 Ti, I would recommend they get either a 30 series or an RX 6000 GPU. The other option is waiting for RDNA 3. If it was me, I would skip the 4000 series.

  • @sgonzo5572 • 1 year ago +1

    There's no way the GPU was at full utilization with 25 percent less wattage.

  • @remisouth5695 • 1 year ago

    One thing people are overlooking is the lack of input information in the fake frames. Also, if you get the frame rate high enough to overcome the input lag, you are not going to notice the increase in smoothness.

  • @eslamarida • 1 year ago

    DLSS 3 will make any game playable at 8K in Quality mode; that's exciting. I already can't tell the difference at 4K in Performance mode; maybe there is a little bit of aliasing on objects, but it's not noticeable.

  • @samsak1239 • 1 year ago

    Hey Daniel not to be rude but will you update your channel picture. It's kind of dark and not so nice. I love your videos though.

  • @MikeDoesRandomThings • 1 year ago +2

    I knew it was marketing BS. If your product relies on a feature that not everyone may like in order to hit the performance numbers you advertise, a feature which ALSO works on the previous-gen products, then you didn't really make that good a product. AMD, come out swinging. Swing hard and let your presence be known.

  • @sacb0y • 1 year ago

    "i'm a highschool teacher" yeah i'm not surprised lmao. You sure sound like it.

  • @PancakeLizard710 • 1 year ago

    It's not good that the 4000 series doesn't have DisplayPort 2.0.

  • @UncannySense • 1 year ago +1

    yeah because nothing says efficiency like a GPU cooler that's over 3 slots thick weighing over 4kg.....

  • @Lackesis • 1 year ago

    5950X + 3090 here: native 1440p + maximum ray tracing (Psycho) settings = 30-33 fps, just outside V's apartment.
    Can't wait for reviews and tests.

  • @gwenray17 • 1 year ago +1

    Can you go one episode without saying Apples to Apples 😂

  • @Alfactors • 1 year ago +1

    I can't wait until it tanks later and I can buy it for under a thousand, like the 3090 Ti.

  • @PapaStel25 • 1 year ago

    Don't be too worried about buying more GPUs. The nice thing about your channel is it could be considered a business, and those count as business expenses. Hopefully you get what I'm hinting at 😉

  • @ExperiencedPanda • 1 year ago

    It would be incredible to use a RTX 4090 with a high end VR headset.

  • @Playboyer670 • 1 year ago

    Wait, it's 59 fps with ultra ray tracing at Psycho settings, DLSS off. At the Quality preset I believe the fps should be higher, and I wonder how much higher.

  • @reviewforthetube6485 • 1 year ago +7

    A 62% native performance increase is massive, bigger than any other generation. The craziest part is this is against the 3090 Ti, so imagine a 4090 Ti! It also means it's over 70% faster than the 3090 in native performance without DLSS.

    • @Rivexd • 1 year ago +5

      The 4090 is impressive, yes; the 4080/4070 not so much, given the price. They don't make much sense to buy when the 4090 is so far ahead. Glad I can skip this generation.

    • @flemminglarsen3746 • 1 year ago +2

      Still a looong way from the 2-4x they were boasting before these benchmarks came out.

    • @PssupplementreviewsbyPete • 1 year ago +5

      This is abysmal if their own advertising says 2-4x perf. They should be sued.
      But as far as the pricing goes, nVidia has no other option, because they can't crash the price of unsold 30-series stock. Can't blame them for that, but for all the rest of the bullshit, oh god.
      At first, when watching the launch live, I was OK with everything, but afterwards, when I found out that the 4080 12GB was really a 4070, I was pissed. That's misleading the customer, big time.

    • @TRX25EX • 1 year ago

      @@Rivexd The 4080 12GB still makes sense, and I will explain why: so far 12GB is enough for 99% of games at 4K, so you are getting the 4080 16GB's performance with less VRAM, which results in a 2-3% performance loss (like the previous gen, where the 3090 was only 10% faster than the 3080). This gen Nvidia said both the 4080 and the 4090 are 2x or more over their previous-gen counterparts...
      Considering both got the same performance bump, at $899 the 4080 12GB is a lot better value than the other offerings... Who needs the others? The 4080 16GB is for VR users, and the 4090 for content creators/VR users.

    • @Richard-tj1yh • 1 year ago +1

      @@TRX25EX have you been asleep?

  • @onedoompunchman7549 • 1 year ago

    14:35 I think, too, that some frames are made more equal than others

  • @iceburglettuce6940 • 1 year ago +1

    It was quite obvious that DLSS 3 would reduce energy use: if "fake" frames are added in (as they are), they do not require much computation.
    I think Nvidia had to come up with something. It may well work; we will have to see.

  • @moriwenne • 1 year ago

    You know what else is really fast? Ferraris. Can't afford them either.

  • @christopherarocha92 • 1 year ago

    Can't wait to get an RTX 4090; this tech will be excellent!

  • @nick-dogg • 1 year ago

    I'm excited. Lower power and cooler temps would definitely be a bonus for me. I'm not ultra competitive, so the latency won't bother me and I probably wouldn't even notice it.

    • @Kizzster • 1 year ago +1

      50+ ms of latency is defo noticeable

    • @impc8265 • 1 year ago

      Wake up bro, your games have to support DLSS in the first place for you to experience that high-ass latency. Competitive games are generally not going to support DLSS.

    • @r3act250 • 1 year ago

      Ignoring the majority of the player base just because it doesn't affect you is simply ignorant at best. Look around you; nobody shares your opinion.

  • @adi_ru21 • 1 year ago

    will it benefit my RTX 2060?

  • @user-mm7ot5zq2c • 1 year ago

    To be more precise: in addition to upscaling, DLSS 3 also uses Optical Multi Frame Generation to generate entire individual frames based on the pre-upscale frames. Optical Multi Frame Generation still operates at the reduced resolution; without first reducing the amount of data that needs to be processed, i.e. reducing the resolution, it is impossible to achieve the speedup. In other words, the upscaling technology of DLSS 3 is better, but the image-quality defects DLSS 2 had in the past will still exist; the degree of the defects in DLSS 3 will just decrease.

    • @user-mm7ot5zq2c • 1 year ago

      The AI technology of DLSS 3 is mainly based on analyzing and predicting the motion of all objects in the game, but as soon as there is an unexpected input, the image defects caused by DLSS 3 will appear. That's why the DLSS 3 footage shown by nVidia is all straight-ahead movement, because the inertia of moving straight ahead is unlikely to be predicted wrong. But after holding a straight line for a while and then making a sudden turn, the AI will make mistakes, even though it is normal for the player to change direction.

  • @ranixcz4087 • 1 year ago +1

    I think strategy games and simulator games will benefit a lot from DLSS 3.

  • @RadialSeeker113 • 1 year ago

    Need to see dlss 2 4080 vs dlss 2 3080

  • @Kallan007 • 1 year ago

    A function that works in like 1% of games. Woot super excited! LOL

  • @PazuzuHanbi • 1 year ago

    I play CP77 at 2K on a 3090 Suprim X and I'm getting 69-100 FPS in the game, all settings maxed out; my CPU is a 9900K OC'd to 5.1GHz.