NVIDIA’s DLSS 3.5 Is Magic

  • Published 20 Sep 2023
  • NVIDIA DLSS 3.5 should be impossible but it's here.
    by ‪@TwoMinutePapers‬ • NVIDIA’s DLSS 3.5: Thi...
    ► Asmongold's Twitch: / zackrawrr
    ► Asmongold's Twitter: / asmongold
    ► Asmongold's 2nd YT Channel: / zackrawrr
    ► Asmongold's Sub-Reddit: / asmongold
    Channel Editors: CatDany & Daily Dose of Asmongold
    If you own the copyright of content showed in this video and would like it to be removed:
    / catdanyru
  • Gaming

Comments • 2.1K

  • @PlayerOne.StartGame
    @PlayerOne.StartGame Před 8 měsíci +3477

    DLSS is basically what those old CSI tech guys used when their boss said "ENHANCE! ENHANCE!" to them. Finally Hollywood CSI tech at our fingertips!

    • @carlos84708
      @carlos84708 Před 8 měsíci +91

      High resolution x100 zoom 😎

    • @BoatMurderedDF
      @BoatMurderedDF Před 8 měsíci +179

      Except that DLSS is "guessing" how stuff should appear and generating it based on the guess. It's all just AI giving you what it thinks you want, not what the true render would have looked like.

    • @Furious321
      @Furious321 Před 8 měsíci +38

      I mean, a blurry license plate where you can't make out the characters is still going to be sloppy guesswork by DLSS. It would have to know exactly how letters/numbers look in various colors, fonts, and against various backgrounds to possibly decipher the limited information.

    • @DruidEnjoyer
      @DruidEnjoyer Před 8 měsíci

      I wonder if AI generated evidence would fly in the court room?
      I doubt it, as the AI is basically making shit up. It might be very plausible shit it is making up, but it still is only something the AI's training data assumed might be there, not an actual objective observation of any kind.

    • @n0de_punk735
      @n0de_punk735 Před 8 měsíci +1

      HUHUHUHUHUHUHUH EPIC REFERENCE BRO

  • @PhynixAbyss
    @PhynixAbyss Před 8 měsíci +1164

    Can't wait for Devs making Games only playable at 60fps if DLSS 3.5 is enabled.

    • @bretttheillustrator
      @bretttheillustrator Před 8 měsíci +294

      we already there

    • @Kongren94
      @Kongren94 Před 8 měsíci +147

      They are already making this. As an example, Remnant 2 devs openly admitted that they made the game with upscaling in mind and that it's not really meant to be played natively.

    • @rewardilicious
      @rewardilicious Před 8 měsíci +27

      I mean, what's the issue? Can you actually try to have a higher level of thinking in that brain of yours? You do realize graphics cards have basically been using tricks this entire time, and then games become better looking and more intensive, something that was otherwise not possible.

    • @rewardilicious
      @rewardilicious Před 8 měsíci +15

      Like, what do you want? For them to not take advantage of the boost and keep making games the same as hardware improves, so you get like 2,000 FPS?

    • @Wobbothe3rd
      @Wobbothe3rd Před 8 měsíci +4

      Ray reconstruction doesn't actually help with performance much at all. So this tired joke you keep repeating over and over doesn't actually apply. Ray Reconstruction is for image quality, not framerate.

  • @bryanbenderdillenburg6935
    @bryanbenderdillenburg6935 Před 8 měsíci +669

    I like when Asmon says he's going to play a game but never plays it and it's cool because the editors make some really good jokes with these lines

  • @backfischritter
    @backfischritter Před 8 měsíci +870

    Asmon reacting to two minute papers. Never have seen this combo coming. What a time to be alive! 😂

    • @diegosolis9681
      @diegosolis9681 Před 8 měsíci +62

      I wasn't ready, I didn't hold on to my papers!

    • @aquinox3526
      @aquinox3526 Před 8 měsíci +24

      Next step is him reacting to AI explained

    • @Eric-uy7ee
      @Eric-uy7ee Před 8 měsíci +21

      @@aquinox3526 AI explained will break his mind lol.

    • @archdemonplay6904
      @archdemonplay6904 Před 8 měsíci

      Or David Shapiro@@aquinox3526

    • @bazoogaming3044
      @bazoogaming3044 Před 8 měsíci +30

      I hate that he says every word as if it's a sentence. It's impossible to listen to.

  • @hubris6567
    @hubris6567 Před 8 měsíci +596

    cool technology, surely devs will not use it as an excuse to release unfinished unoptimized games.

    • @Irrelavant__
      @Irrelavant__ Před 8 měsíci +11

      Clueless Oh for sure it's optional, devs know not everyone has a Nvidia card

    • @Wobbothe3rd
      @Wobbothe3rd Před 8 měsíci +39

      You could make that argument about literally any technology, including 3d acceleration itself.

    • @winxwin4790
      @winxwin4790 Před 8 měsíci +6

      The game is optimized, you just have to upgrade your pc

    • @siliconhawk9293
      @siliconhawk9293 Před 8 měsíci +20

      @@Irrelavant__ Sounds like copium bro. We have seen time and time again that even when hardware specs go up, AAA studios just don't optimize the game. They're like, well, now everyone has better hardware so why optimize. At least at launch every game is a buggy unoptimized hell; most games do not need ridiculously high specs, but it becomes a requirement because of zero care for optimization.

    • @Timeyy
      @Timeyy Před 8 měsíci +4

      @@Wobbothe3rd If you don't program in assembler you're just too lazy to truly optimize your game Kappa

  • @eriuz
    @eriuz Před 8 měsíci +1030

    rip game optimization

    • @wing0zero
      @wing0zero Před 8 měsíci +114

      Not really, you're just another one throwing the word optimization around when you have no real idea what it actually means.

    • @Deezsirrer
      @Deezsirrer Před 8 měsíci +17

      Us gen Z will get 500 fps in these games later on anyway, who cares. Turn off the ultra settings and an Unreal 5 game is still possible.

    • @Donfeur
      @Donfeur Před 8 měsíci +149

      @@wing0zero Okay then, tell me why game companies are putting out shit games that perform like shit on great computers, go on, I'll wait.

    • @lycanthoss
      @lycanthoss Před 8 měsíci +7

      @Donfeur Not the OP, but what does shitty optimization have to do with DLSS?

    • @M_CFV
      @M_CFV Před 8 měsíci +80

      @@wing0zero Lol, is this really the best counter? "You don't know what that means"? It's blatantly obvious that companies are going to crutch on this for FPS and push out unoptimized messes, where they lazily put processes on the CPU that would take time to code properly for the GPU. I don't need to speculate that this will happen when it already is happening. Vex has 2 videos going into explicit detail on the CPU issue.

  • @shoneyt4316
    @shoneyt4316 Před 8 měsíci +482

    DLSS is a double-edged sword. You see it as a cool technology and frankly, it is but we have a fairly high chance of future games relying heavily on DLSS to have good framerate.

    • @wholetyouinhere
      @wholetyouinhere Před 8 měsíci +107

      Already happening. You can't run Remnant 2 at playable framerates on modern hardware without using upscaling tech, and the game doesn't even look that great. And Starfield is just a straight up embarrassment, but no shortage of people defending that optimization dumpster fire.

    • @NemoN0body
      @NemoN0body Před 8 měsíci +4

      exactly what i was thinking.. but then again hopefully competition and the free market will force quality to triumph, also i was thinking this might actually be the way to "super realistic" graphic in games, you train the data on real life images and use it to replace the in game graphics and tada: real life graphics quality in game.

    • @NihongoWakannai
      @NihongoWakannai Před 8 měsíci +8

      This is 100% going to be the case, because even if the game devs know it's the wrong choice, corporate greed will make it inevitable

    • @mick7727
      @mick7727 Před 8 měsíci +2

      a lot of games still have trouble with ghosting in dark areas, and it's so easy to see. There's other odd common dlss glitches but that's the most prevalent.

    • @I_enjoy_some_things
      @I_enjoy_some_things Před 8 měsíci +5

      Starfield was my first real run-in with this issue. Running it without FSR or DLSS is a fool’s errand and I kind of hate it.
      I initially typed this next part elsewhere, but I’ll share it here as well in the hopes that someone who knows more than me can see it and correct me:
      >This is really cool stuff, I’ve been using the older versions of DLSS for awhile now, but is it a stopgap (for gaming)? Is traditional rendering dead? Will games soon become too ‘high fidelity’ for cards to keep up? Idk, it’ll be interesting to watch this space over the next 5-7 years, but I would choose traditional rendering every time if my GPU could handle it.
      >Specifically, DLSS-G (Frame Generation) just kind of seems like a cop-out to me. The cynic in me says it’s cheaper to make a card designed around Frame Gen than it is to make an actual ‘generational leap in power’ GPU, so that’s why Nvidia is doing this.

  • @rucka704
    @rucka704 Před 8 měsíci +123

    The only issue with DLSS being mainstream is developers won't push to make their games optimized they'll just tell you to use DLSS. Games should be optimized and have great DLSS support.

    • @Near2Future
      @Near2Future Před 8 měsíci +2

      Imagine using an AI to optimize the games for you. I'm sure it could be possible!

    • @joonashannila8751
      @joonashannila8751 Před 8 měsíci +3

      Imagine an AI to be you. I'm sure it could be possible!

    • @Near2Future
      @Near2Future Před 8 měsíci

      @@joonashannila8751 Just look at kwebbelkop...

    • @daniel-un4wk
      @daniel-un4wk Před 6 měsíci

      ​@joonashannila8751 imagine an image iam sure it could be possible

    • @hlli999
      @hlli999 Před 6 měsíci

      This! Just look at Starfield and Alan Wake 2 for example.

  • @natalwho2786
    @natalwho2786 Před 8 měsíci +36

    the tech to upscale things like this could also be used for ingame textures, that way no matter how close you get, the textures will be actively upscaled or downscaled depending on the relative camera position, similar to UE5's Nanite.

    • @uncle1sstvan
      @uncle1sstvan Před 8 měsíci

      Could really lower hardware costs for planetary approach on games like NMS, Elite dangerous, or Star Citizen.

    • @_GreenShadow_
      @_GreenShadow_ Před 8 měsíci +6

      holy shit i think you just described reality

    • @m.poppins4843
      @m.poppins4843 Před 8 měsíci

      you mean like dynamic resolution scaling? Yeah that's already a thing. Been for a while.

    • @gamegame6089
      @gamegame6089 Před 6 měsíci

      It's existed for more than 10 years; it's called mip mapping.
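
  A rough sketch of what the mip-mapping reply above describes: textures ship with a chain of pre-downscaled levels, and the renderer picks a level based on how many texels land on one screen pixel. The `mip_level` helper below is made up for illustration only; real GPUs do this per pixel in hardware, with derivatives and trilinear filtering.

      import math

      def mip_level(texture_size, screen_px_covered, num_levels):
          """Pick a mip level from roughly how many texels map onto one screen pixel.

          texture_size: texel width of the full-res texture (assumed square)
          screen_px_covered: how many screen pixels the textured surface spans
          num_levels: number of mip levels available (level 0 = full resolution)
          """
          if screen_px_covered <= 0:
              return num_levels - 1
          texels_per_pixel = texture_size / screen_px_covered
          level = math.log2(max(texels_per_pixel, 1.0))  # 1 texel per pixel -> level 0
          return min(int(level), num_levels - 1)

      # A 4096-texel texture seen up close (covering 4096 px) stays at level 0;
      # the same texture far away (covering 256 px) drops to level 4 (a 256-texel mip).
      print(mip_level(4096, 4096, 13))  # 0
      print(mip_level(4096, 256, 13))   # 4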

  • @thomasburchsted3287
    @thomasburchsted3287 Před 8 měsíci +17

    When they say DLSS 3.5 is for all cards, it's confusing.
    Only 40-series and newer cards will have frame generation, and will thus have vastly different fps, if that makes sense.

    • @Wobbothe3rd
      @Wobbothe3rd Před 8 měsíci +7

      Ray reconstruction is entirely separate from frame generation, it will work on any rtx card.

    • @GryffinTV
      @GryffinTV Před 8 měsíci

      @@Wobbothe3rd Working is one thing, working as intended is another. They're trickling down features to older series just enough to let you crave the real deal (which is more feasibly achievable on 4070 and above), to prompt you to give in and upgrade. Genius marketing with no backbone, typical of late-stage capitalism. =)

  • @fosteredlol
    @fosteredlol Před 8 měsíci +298

    Some of the unmentioned negatives for anyone curious: Input Latency and Motion Clarity. Framegen can't make the input latency better than its native pre-gen latency with Reflex enabled. All temporal techniques (TAA, TSR, FSR, DLSS, DLAA) introduce some amount of temporal 'blur' to an image compared to a non-temporal solution (MSAA, SMAA, SSAA), but the gap is slowly closing. Not quite ready for competitive shooters like CS and Valorant where clarity and latency are king, but it's getting better. RTXDI is another interesting tech if you're curious about "What's next". Basically Nvidia's Path Tracing tech. Some real "Oh, they're doing magic now" tech.

    • @TimHawk.
      @TimHawk. Před 8 měsíci +29

      You don't need DLSS for competitive shooters anyway

    • @Charles_Bro-son
      @Charles_Bro-son Před 8 měsíci +14

      20 FPS boosted by DLSS to 60+ FPS will still feel like 20 FPS, right?! For many genres that's borderline useless then.

    • @jaleshere
      @jaleshere Před 8 měsíci +14

      @@Charles_Bro-son20fps feels nothing like 60 unless you’re brain dead.

    • @Charles_Bro-son
      @Charles_Bro-son Před 8 měsíci +15

      ​@@jaleshere Could work pretty well for turn based gameplay. However realtime gameplay dependent on reflexes still feels sluggish, despite the fluent graphics. I don't like where this is going - looks good in gameplay trailers, feels shitty at home when actually playing...

    • @isakh8565
      @isakh8565 Před 8 měsíci +24

      @@Charles_Bro-son Correct, it's a fake 60 FPS. So it will still feel like 20FPS input-latency wise. Useful for flight simulators and turn based stuff, horrible for first person shooters.
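
  To illustrate the "temporal blur" point raised in the thread above, here is a minimal sketch of temporal accumulation, the core trick behind TAA-style techniques: each displayed pixel is a blend of the new frame and accumulated history. The `taa_accumulate` helper and the 3-pixel "frame" are made up for illustration; real TAA/DLSS also reproject the history with motion vectors and reject stale samples, which is exactly what keeps the ghosting shown here in check.

      import numpy as np

      def taa_accumulate(history, current, alpha=0.1):
          """One step of a (very) simplified temporal accumulation filter.

          Small alpha keeps more history (smoother image, but moving objects leave
          ghost trails); large alpha trusts the new frame (sharper, but shimmery).
          """
          return (1.0 - alpha) * history + alpha * current

      # A bright object sat at pixel 0 last frame and moved to pixel 1 this frame:
      history = np.array([1.0, 0.0, 0.0])
      current = np.array([0.0, 1.0, 0.0])
      print(taa_accumulate(history, current))  # roughly [0.9, 0.1, 0.0] -> a ghost stays at pixel 0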

  • @stve01
    @stve01 Před 8 měsíci +25

    I hope the rumours that Switch 2 will use this tech are true. For me it's the perfect use case, because you can't make hardware super powerful if you also want it to fit in a handheld. And this would compensate for that. Yes, it has some drawbacks, but it would negate most of the problems current Switch faces.

    • @Ay-xq7mj
      @Ay-xq7mj Před 8 měsíci +5

      Doesn't make any sense at all. Nintendo uses no cutting-edge hardware or software, and the only hardware using frame gen and advanced hardware-accelerated upscalers is Nvidia GPUs, which will never be in a small form factor. Mobile = APU/integrated with no dedicated GPU. It's also expensive to produce, which Nintendo never deals with.

    • @joshdingle930
      @joshdingle930 Před 8 měsíci

      it's most likely going to use an Nvidia APU which yes will probably not have frame gen but dlss would still be available @@Ay-xq7mj

    • @senmage
      @senmage Před 8 měsíci +6

      @@Ay-xq7mj iirc the rumored chip Nintendo will be using isn't exactly the latest or cutting edge of Nvidia in terms of hardware. it will be on ampere which at this point is a couple years old and will be replaced by Nvidias chiplet design in the 50 series ~2025. So Nintendo is quite literally gonna release the Switch 2 on hardware that Nvidia has matured at this point. Switch 2 WILL have DLSS and Frame gen available to it.

    • @YammoYammamoto
      @YammoYammamoto Před 8 měsíci

      astroturfers have gone mad... lol.

    • @SpongeBobfromthehoodnig
      @SpongeBobfromthehoodnig Před 3 měsíci

      Sorry Buddy but that ain’t happening😂

  • @simonazzz
    @simonazzz Před 8 měsíci +233

    I remember learning about this in game dev school few years ago and thinking to myself: "Oh I can't wait for it to be used in games!" Good times coming!

    • @deyash4602
      @deyash4602 Před 8 měsíci

      @@ramonandrajo6348 How's it a gimmick? It's become a standard because of how good it is; it improves your fps dramatically without noticeably affecting the look.

    • @ashleybarnes9343
      @ashleybarnes9343 Před 8 měsíci +30

      @@ramonandrajo6348 DLSS, which this video is about, isn't a gimmick... and if you're on about ray tracing, that also isn't a gimmick lmfaoo. It's hard to run, yeah, but it's a major game changer for graphics.

    • @Blurredborderlines
      @Blurredborderlines Před 8 měsíci

      @@ashleybarnes9343 It is a gimmick, as well as being a crutch for extremely Unoptimized resource utilization that is non-functional without it. If a product needs to be de-scaled and then checkerboard the screen up to “full res” to run in the “recommended” specs, your game sucks regardless to FX used.

    • @big.dirty.buckets
      @big.dirty.buckets Před 8 měsíci

      @@ramonandrajo6348 It literally triples your frames and reduces input delay lol wym

    • @ximplex1
      @ximplex1 Před 8 měsíci +26

      It still feels like it's being used as a crutch for poor optimization.

  • @lowbudgetfilms4535
    @lowbudgetfilms4535 Před 8 měsíci +353

    Although I like the idea of upscalers used as a way for older cards to get more performance, since nvidia limits it to new cards it's only a matter of time before they start using this "performance" as a selling point.

    • @caab0516
      @caab0516 Před 8 měsíci +152

      They already are lol

    • @Wobbothe3rd
      @Wobbothe3rd Před 8 měsíci +15

      Ray reconstruction isn't about performance at all, it's for image quality.

    • @RaeneYT
      @RaeneYT Před 8 měsíci +73

      God Bless FSR 3. The fact that it works on all graphics cards, no matter the age or brand is insane. AMD truly playing the side of gamers since NVIDIA got so complacent and uncaring after making AI money.
      Addition: For the NVIDIA dickriders, this is not hate for Nvidia. This is competition loving. I don't hate Nvidia, I have used their graphics cards all my life and have only very recently switched to AMD. They are both very good but the only loss I suffered in the switch was Nvidia Broadcast. Other than that, both of them are functionally the same. However, being able to use FSR tech through the AMD Control Panel is incredible. You can have FSR in every single game even if it's not baked into the game. That's incredible. AMD is surging through accessibility for all gamers and I seriously hope because of this, Nvidia figures out they need to do the same otherwise they'll continue to lose good will. Their GPUs have already caught up, their CPUs have caught up to Intel, and their prices are normally cheaper across the board. They're doing super well and all the hate for Nvidia because they're simply not Nvidia or Intel is depressing. These are companies that don't care if you shill out for them, what they do care about is sales and numbers, and currently, AMD is sacrificing profit for goodwill to get ahead of the competition and it's working. Later on they'll maybe pivot, but right now, take advantage.

    • @M_CFV
      @M_CFV Před 8 měsíci +9

      @@Wobbothe3rd That isn't an upscaler. He is talking about DLSS, not ray tracing.

    • @vincentvega3093
      @vincentvega3093 Před 8 měsíci +1

      Thats why you either suck it up for nvidia or buy the competition

  • @Rubiktron
    @Rubiktron Před 8 měsíci +4

    I have been following "2 Minute Papers" (I just can't remember this doctor's actual name) for years now, he's a genius.
    This "illusion" of creating more frames by having the computer predict where and how pixels are going to move in the next frame in order to place a blend in between has been used for years now in video production with the purpose of creating fake super slow motion videos, as in GoPro Hero promo videos, which feature scenes that were just not possible to film with their cameras back in the day. The name of this feature was known by the commercial name "Twixtor", and it was very popular back then.
    However, it took hours to render just a couple of minutes of video with this effect. What is actually new here is that it works in real time now, which is absolutely mind-blowing.
    "What a time to be alive!"

  • @kennethkk
    @kennethkk Před 8 měsíci +54

    Two min papers guy is so cool. Long time subscriber. He brings great insight into the world of AI GFX stuff first hand. I love it.

    • @astronotics531
      @astronotics531 Před 8 měsíci +27

      the way he talks makes me want to jump out my window, why doesn't he talk normally?

    • @EliteUwUPrime
      @EliteUwUPrime Před 8 měsíci

      Do it. Jump.@@astronotics531

    • @joepopplewell680
      @joepopplewell680 Před 8 měsíci +24

      @@astronotics531 Because english isn't his first language?

    • @jxmpking4384
      @jxmpking4384 Před 8 měsíci +14

      @@astronotics531 did you know there's 7000 spoken languages in the world?

    • @igrschmidt
      @igrschmidt Před 8 měsíci +10

      I couldn’t finish the video. There are millions of non-native English speakers out there but I have never seen someone use so many commas while speaking. He has the vocabulary and the pronunciation but wtf

  • @arashnonexistent9033
    @arashnonexistent9033 Před 8 měsíci +52

    The de-noising is now being done simultaneously by the upscaler itself in one go,
    which results in higher quality and more stability.

    • @igorthelight
      @igorthelight Před 8 měsíci

      Yep!

    • @MGrey-qb5xz
      @MGrey-qb5xz Před 8 měsíci

      Don't know about quality considering how much more blurry the games look

    • @igorthelight
      @igorthelight Před 8 měsíci

      @@MGrey-qb5xz Watch the latest Digital Foundry video about Cyberpunk 2077 v2.0 ;-)

  • @rickybrooks2971
    @rickybrooks2971 Před 8 měsíci +17

    This sounds like how our brains generate images- take very low fidelity and make up the rest

    • @zippac
      @zippac Před 8 měsíci

      What does that make porn?

  • @fish3010
    @fish3010 Před 8 měsíci +4

    The plot twist is that the technology for 3.5 is advertised strongly on the comparison videos with frame generation on while also pushing strongly that DLSS 3.5 is available without frame generation for all RTX cards. A tad of manipulation there. The real difference will be 5-15% at best depending on the game. At least it will look better with more or less the same performance.

  • @Daobeezy
    @Daobeezy Před 8 měsíci +4

    I agree so much about the 6k bitrate. They need to up it on twitch.

  • @deadturret4049
    @deadturret4049 Před 8 měsíci +12

    Man could you imagine if this kinda tech dropped on the next nintendo handheld or really any other portable system. The ability to boost performance using optical flow techniques could be the key to keep those types of devices small and lightweight while competing with home consoles and PC.

    • @arfanik9827
      @arfanik9827 Před 8 měsíci +3

      Some potential leaks about the next Nintendo hardware (Switch 2, or Switch Next-Generation, or wtv it's gonna be called) mentioned the fact it is probably gonna use DLSS/FSR3 along with better hardware to make games such as Final Fantasy 7 Remake or the Matrix Demo be able to run on the handheld console as if it was run on a like PS4 pro

    • @robertcarhiboux3164
      @robertcarhiboux3164 Před 8 měsíci

      when you start using those tech, it is a simple proof that the hardware sucks @@arfanik9827

    • @turbochoochoo
      @turbochoochoo Před 8 měsíci +3

      Supposedly this is already confirmed for the Switch 2

    • @hogman3543
      @hogman3543 Před 6 měsíci

      ​@turbochoochoo Source and Evidence please Mr.Nitentoe

  • @Robulite
    @Robulite Před 8 měsíci +4

    It adds latency if you use frame generation. But normal DLSS with reflex is pretty good. Some weird smoothing occurs with movement but it's getting better just not quite there.

    • @stysner4580
      @stysner4580 Před 7 měsíci +4

      Not just that, the ghosting is really annoying to me.

    • @Robulite
      @Robulite Před 7 měsíci

      Yeah you have to run it on quality or it looks terrible. @@stysner4580

  • @puregarbage2329
    @puregarbage2329 Před 8 měsíci +2

    Dlss 3.5 is a de-noiser that’s supported on all rtx cards for better looking Path Tracing(not ray tracing and doesn’t affect performance). Dlss 3 is frame generation only on 40xx cards. Makes sense right.

  • @thewhyzer
    @thewhyzer Před 8 měsíci +2

    The issue is, if it's using two frames to generate an in-between frame, there'll be input lag because it has to wait for the third frame to be generated naturally before it can create the 2nd frame to display. But if it's using AI to "predict" the next frame based on the already displayed frames, then there will be errors, especially when things on the screen are changing a lot. The AI can't predict that you'll decide to turn around or fire a big gun or whatever at that moment. Hell, it almost certainly won't be able to dig into the game mechanics to accurately predict even what the enemies and NPCs will be doing next.

    • @Christobanistan
      @Christobanistan Před 6 měsíci +1

      This is the comment everyone needs to read. DLSS is not gonna be great for less casual gamers.
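
  A toy model of the scheduling constraint described in the comment above: if the extra frames are interpolated between two real frames, the generated frame between real frames N and N+1 cannot be shown until N+1 actually exists, so every real frame is held back by roughly one frame time. The numbers and the `interpolation_schedule` helper are made up for illustration; this is not NVIDIA's actual pipeline.

      def interpolation_schedule(real_frame_times_ms):
          """List what gets displayed when, assuming each generated frame sits halfway
          between its two source frames and each real frame is buffered until the next
          real frame has been rendered."""
          shown = []
          for i in range(len(real_frame_times_ms) - 1):
              t0, t1 = real_frame_times_ms[i], real_frame_times_ms[i + 1]
              shown.append((t1, f"real frame {i}"))               # can't show it any earlier
              shown.append((t1 + (t1 - t0) / 2, f"generated {i}.5"))
          return shown

      # Real frames finishing every 33 ms (about 30 fps):
      for t, name in interpolation_schedule([0, 33, 66, 99]):
          print(f"{t:6.1f} ms  show {name}")
      # 'real frame 0' only appears ~33 ms after it was rendered -> the added input lag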

  • @Real_MisterSir
    @Real_MisterSir Před 8 měsíci +5

    I think a good way of looking at it is the comparison between brute forcing computation, vs becoming increasingly better at predictive visual generation. Imagine if we used our brain as an example, vs the brain of a whale. It can weigh over 10 kgs, vastly outscaling ours in terms of sheer size and also in neuron connections. But the reason our brain is different, is due to how different parts interact, how some neural links are formed, how our process of learning over time to rewire our brains to constantly become better at things we set out to improve. That is what truly sets us apart and allows us to go from the intellectual level of a wallnut as infants, to gigabrain 5heads (on occasion) as adults. Neural deep learning AI does the same for computational graphics, it entirely reworks where the effort is put, it forms 10 99% positive connections before a brute force method has even calculated a single 100% step. And with time, AI will only get closer and closer to 100% for each calculation, to the point where we can't actively tell apart the differences - which is pretty much where we're at right now. So from this point, the scale is only going to further extend in favor of AI accelerated predictive generation, rather than brute force calculation.
    Just like how we read words in a book, we don't need to read every letter one-by-one to figure out what the entire word means. We can use predictive deduction to skim the possibilities so fast that we are able to make out even complex words practically in an instant, which allows us to read books in real time. It sounds strange, but when you think about how many possibilities we are filtering in milliseconds to arrive at the right conclusions in real time, it's crazy. AI is literally that. And yes once in a while (just like us) it gets a prediction wrong, and there can be brute-force reliant methods to fact check these more high-risk generative predictions to ensure that false positives get reworked, and this process can be done in a millisecond too because the actual hardware isn't working overtime on brute forcing every frame on its own.
    Frame Gen may seem like a gimmick now, but in cohesion with DLSS and ray reconstruction, I think within 2 years a most this will be the standard that takes over from brute force hardware calculation when we look at general performance levels.

    • @GryffinTV
      @GryffinTV Před 8 měsíci +1

      You raise some valid points.
      It's true, the tech we've been given here is next-gen. The fact that Nvidia did it first is a logical consequence of their many advantages over AMD. They have the brand recognition, the market share, and, most importantly, the resources. This is the turbo-capitalist reality (all else be damned), and Nvidia hasn't shied away from banking big on the recent AI hype either.
      AMD has only ever retaliated retroactively; playing catch-up while setting competitive pricing has always been their signature (case in point, they failed to rival Nvidia's flagship; RX-7900XTX is as far as they could go).
      FSR is also an attempt to mimic DLSS, and AMD lagged behind nearly 2 years with it. FSR 3 got released yesterday and it is hardly on par with DLSS 2, let alone 3 (to say nothing of 3.5). In short, a disaster, but a promising one if we're to have some faith in their Fine-Wine technology.
      I appreciate the idea of DLSS itself in a limited set of scenarios, but I will not give credits to Nvidia for implementing it first. This is simply the way the GPU market conditions have aligned for them to end up as the DLSS pioneer. Historical circumstances and the perks of monopoly. These are just farcical levels of RNG at play.
      If AMD released FSR before DLSS (and made it just as good), things would've played out differently. But that's one too many ifs and what's done is done.
      Right now I'm more concerned about a future shortage of GPU manufacturers for the masses, because recent trends show that AI and DataCenter are far more profitable moving forward. I fear the day AMD stops developing GPUs because it will simply no longer be profitable, which would give Nvidia complete chokehold over the GPU market (ceteris paribus).
      And I also dread the idea of using DLSS as a way to cut corners on proper game resource optimization because Nvidia can get away with an upscaling gimmick that tricks our monkey brains. (You can find arguments in favor of it, but I remain forever convinced that DLSS could never hold a candle to true frames rendered at native res with real computing power.) This creates a precedence that could soon become the new standard for gamedevs. Just as Apple did with the notch; it was just a feature for the iPhone, nothing groundbreaking in particular, but it became the new norm because of Apple's power to set trends and the masses' willingness (or obliviousness) to take it for granted. And now here we are: most phones have notches. I hope the same rhetoric won't apply to frame rendering in the future.
      So, as it stands, this endless arms race between red and green only affects the average PC building enthusiast due to practices employed by both sides, with AMD over-promising and under-delivering, and Nvidia adopting bandaid, Apple iPhone tier tactics and planned obsolescence (crippling the performance of older cards with newer drivers to boost sales of latest gen cards).
      I'm a GPU centrist, I've owned both Nvidia and AMD GPUs on and off, but I think neither team deserves to be spared from due criticism during this hell of a period for R&D. My heart and wallet say AMD, but my brain and foresight say Nvidia. And that's a really disheartening prospect.
      I am trying to root for the underdog, but right now it feels like an inadequate time for it. Advancements in AI computing are rattling the world as a whole (let alone the gaming world), so I find it hard to imagine AMD even catching up with Nvidia in the AI division of their R&D for upcoming GPUs.
      At the same time, I cannot endorse Nvidia's greedy marketing practices.
      Part of me wishes that the AI hype would just wane already, but that's no longer an option. We've already unlocked Pandora's box. So, it makes complete sense that whichever GPU manufacturer outshines the other in the AI segment is sure to have a significant headstart going into this territory. And right now that manufacturer is undoubtedly Nvidia. I pray for AMD to get their shit together, and fast.
      Alternatively, we could also hold on for a divine intervention from a third competitor under the name of Intel, but I have no illusions. Their focus seems to be on iGPUs now and for the foreseeable future.

    • @Real_MisterSir
      @Real_MisterSir Před 8 měsíci +1

      @@GryffinTV I pretty much agree on most if not all the points you raise, and have thought similarly about AMD and their path. There has however been recent aspects of their business practices that have helped shed light on some potentially conflicting elements - for example the idea of AMD being the runner-up and therefore not being the turbo capitalist entity to push ahead compared to Nvidia.
      From observing AMD's behavior over the past 3-4 years, it has become evident that AMD in fact actively choses to not bear the torch, even when they have the opportunity to do so. They could take the lead, but don't. In fact, it has led me to believe AMD (at least with their current plans of operation and business) are not actively interested in stepping out of Nvidia's shadow. I think for the time being, they're more comfortable staying as the perceived underdog, the competitor that plays catchup, the one fighting "against the odds" so to speak. The reason I believe this, is because AMD has shown a very clear bias towards mimicry, even when they could easily take the lead in sales or general market share, and I think one of the key reasons why they're sticking to the shadow, is because the risk of being the forerunner is also that you end up tanking the hardest hits if things do not go to plan. AMD has consistently struggled with internal planning, which does make sense because they're a smaller scale than Nvidia and they are more limited in resources, thus limited in options and leeway. If things go wrong for them, they fall harder, with fewer safety nets. So there is lower inherent risk in actively choosing to play the role that catches up and mirrors what Nvidia does, because it lets Nvidia pave the path and pay for that pavement upfront, which in turn lets AMD coast more easily and even if things go wrong, they will be safe in the fact that Nvidia by nature has gone down the same path so at worst they're both wrong, and in a duopoly market with no real alternatives, that is a very safe place to be. But if AMD tried to lead the path and Nvidia decided not to follow, in that case if AMD's path goes wrong then they're fucked.
      Now Intel is starting to throw cogs in the wheel, but it's a slow process and it gives time for AMD to figure out how to position themselves in the next 10 years - and I think that may also be why we're currently seeing small shifts in market segments and what type of GPUs they focus on, as well as how they are redistributing their balance between consumer and enterprise markets. More than likely AMD is actively choosing to stay in Nvidia's shadow for now, while they test different waters to see potentially viable future paths to branch off to post 2030, and see what type of groundwork would have to be laid out to enable those paths. And while doing so, staying in Nvidia's shadow is simply more consistent and less risk. Two things a company in process of assessing change loves to be in.

  • @wlaaaaaaaaaa123
    @wlaaaaaaaaaa123 Před 8 měsíci +3

    Not sure if it’s the same as this, but I’ve already downloaded a few apps that use AI to un-blur some not-so-good photos I’ve taken on a trip. Couldn’t believe how well it works!

  • @nightshan1647
    @nightshan1647 Před 8 měsíci +2

    I first started gaming on an Atari 2600. This is like a whole new universe. It's almost scary to think what it will be like 40 years from now.

  • @hsujonathan5221
    @hsujonathan5221 Před 8 měsíci +2

    I believe the Frame Generation feature is only available on 40x0-series cards.

  • @Jamgcy
    @Jamgcy Před 8 měsíci +11

    What a time to be alive!

  • @waterenjoyer31
    @waterenjoyer31 Před 8 měsíci +4

    Frame generation is a nice to have thing, but its seems to only be worth using if you're getting less than 60 fps..it does increase the visual smoothness etc. but from my understanding it has no effect on input latency so the input would still feel like 25 fps if that's what you're able to run without frame gen. It certainly depends on the type of game you're playing and the input method used.. I can't imagine it would feel very nice to use with keyboard and mouse as that is more sensitive to input delay vs controller. Also, at this time frame generation requires RTX 4000 series cards meaning at least an RTX 4060. I think it will be way more interesting if AMD FSR 3.0 allows frame generation on all generation of cards as it seems more of a low end GPU use case thing.

    • @Qrzychu92
      @Qrzychu92 Před 8 měsíci

      the thing is that you can tweak the resolution and setting so that you start generating from 40-50 fps, and get 80-100. That's way better than playing on low settings in 60 or high in 30

    • @waterenjoyer31
      @waterenjoyer31 Před 8 měsíci

      @@Qrzychu92 this proves my point even better, who is more likely to turn down resolution? High end GPU owners or low end? And even DLSS upscaling from anything below 1080p looks pretty bad..it does retain more quality than FSR but ideally both tech should be used at least 1440p resolution. The higher the base resolution upscaling from the closer FSR matches DLSS in image quality, I think most people put too much stock in DLSS quality vs FSR because they both suck at low resolution lol.

    • @Qrzychu92
      @Qrzychu92 Před 8 měsíci

      ​@@waterenjoyer31 "And even DLSS upscaling from anything below 1080p looks pretty bad" TODAY. Don't forget they are working on it.
      I can easily imagine that in next year or two, upscaling from 720p on high will look better than native 1440p.
      For today, you are right, just don't forget that there is tomorrow. If you told somebody during the RTX20 series premiere that there will be AI powered frame generation RECALCULATING THE RAY TRACING for the new frames, nobody would believe you, or say that it's going to be viable at all. But here we are now :)

    • @waterenjoyer31
      @waterenjoyer31 Před 8 měsíci

      @@Qrzychu92 that's a fair point, we will have to see how it progresses. However it will be a challenge surely because lower resolution is just less data to work with. I certainly wouldn't recommend a product to anyone based on future improvements.

    • @Qrzychu92
      @Qrzychu92 Před 8 měsíci

      @@waterenjoyer31 DLSS can already generate 80% of the frames according to this very video

  • @clairearan505
    @clairearan505 Před 8 měsíci

    To your question about streaming bitrates, since GPUs are gaining the ability to synthesize near ground truth images from sparse samples, it's conceivable that you could even reduce the bitrate at which you stream and have your viewers' GPU's synthesize the additional info needed for clean and crisp playback. It's definitely on the way.

  • @Iamksha
    @Iamksha Před měsícem

    It does in fact affect input delay, but you can get rid of that by using Reflex.

  • @CoLPluto
    @CoLPluto Před 8 měsíci +3

    Seems similar to the technology fighting games are using to combat input delay online.

    • @NameIsDoc
      @NameIsDoc Před 8 měsíci +1

      Its a sister tech

    • @ls200076
      @ls200076 Před 8 měsíci

      :0

    • @MuttleyGames
      @MuttleyGames Před 8 měsíci +1

      It's called rollback netcode and is a bit different. It computes multiple outcomes at the same time and when the game gets confirmation what the correct one is, displays that sequence of events. It's more computationally intensive but better for latency.
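
  For contrast with frame generation, here is a minimal sketch of the rollback idea mentioned above: the game keeps simulating with a guessed remote input so the local player never waits, and when the real input arrives and differs, it rewinds to the last confirmed state and re-simulates. The `simulate` step and the inputs are made-up toy values, not any particular game's netcode.

      def simulate(state, inputs):
          """Toy deterministic step: apply one frame of (local, remote) inputs."""
          return (state[0] + inputs[0], state[1] + inputs[1])

      def run_with_rollback(saved_state, local_inputs, predicted_remote, actual_remote):
          # optimistic pass: keep playing with the predicted remote input
          state = saved_state
          for li in local_inputs:
              state = simulate(state, (li, predicted_remote))
          optimistic = state
          # confirmation arrives and differs -> roll back and replay with real inputs
          state = saved_state
          for li, ri in zip(local_inputs, actual_remote):
              state = simulate(state, (li, ri))
          return optimistic, state

      optimistic, corrected = run_with_rollback((0, 0), [1, 1, 1], 0, [1, 0, 1])
      print(optimistic, corrected)  # (3, 0) is shown first, then snapped to (3, 2)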

  • @heavymetalboi
    @heavymetalboi Před 8 měsíci +3

    Its like listening to Hercule Poirot explain who the murderer is in the end of an episode. I love it😀

  • @bryank427
    @bryank427 Před 8 měsíci +2

    This really is cool tech. I can't wait for Nvidia to bring the RR to regular RT, hopefully that is sooner than later. Combine that with the HD Reworked 2.0 mod coming out next month and the game will be in the perfect state for me to jump back in. I'm sure they will also continue to patch it for a while as the big update likely broke a lot of things.

  • @user-xk1cp5jd2g
    @user-xk1cp5jd2g Před 6 měsíci

    I am happy to see a replacement solution for anti-aliasing, upscaling, and denoising. I always turned them off, since the issue was color-profile friendliness with the rest of things like WRGB on LG panels.

  • @wolfandspice
    @wolfandspice Před 8 měsíci +12

    The problem is that barrier of entry won't last long. For now this technology makes new games accessible to older hardware, but games going forward are going to be developed with this technology in mind, so the requirements will go up accordingly.

    • @wholetyouinhere
      @wholetyouinhere Před 8 měsíci +4

      Correct. It's sort of like giving everyone money to put towards their rent. Eventually the rent just goes up.

    • @Christobanistan
      @Christobanistan Před 6 měsíci

      Older or very cheap hardware can't do DLSS.

  • @Blackrain707
    @Blackrain707 Před 8 měsíci +20

    Keep in mind, though there is support for DLSS 3.5 on Nvidia cards since 2018 (Rtx2080ti), they will not have the full capabilities unless you have a 40 series.

    • @Wobbothe3rd
      @Wobbothe3rd Před 8 měsíci +13

      No, only frame generation is exclusive to 40 series, everything else is on any RTX card.

    • @markpaulpangan7618
      @markpaulpangan7618 Před 8 měsíci +5

      To be clear:
      DLSS 2 (SUPER SAMPLING / UPSCALER) is part of DLSS version 3.5 (nvngx_dlss.dll) and is available on all RTX cards
      DLSS FG (FRAME GENERATION / DLSS 3) is also part of DLSS version 3.5 (nvngx_dlssg.dll) and is only available on RTX 40 cards
      DLSS RR (RAY RECONSTRUCTION), which is the new DLSS 3.5 feature (not sure if it will have a separate file or not), is available on all RTX cards
      Yep, that only makes you more confused.
      So basically:
      RTX 40 Series: supports everything offered by DLSS
      RTX 20/30 Series: the same, except Frame Generation

    • @darvamehleran786
      @darvamehleran786 Před 8 měsíci +6

      @@Wobbothe3rd I like how nvidia did not add FG for old gpus, just to be sure not to extend gpus lifespan

    • @glenmcl
      @glenmcl Před 8 měsíci

      got one 😉 4090

    • @YammoYammamoto
      @YammoYammamoto Před 8 měsíci +1

      @@darvamehleran786
      Oh, and don't forget how NVidia doubled pricing for basically the same performance.
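
  The support breakdown a few replies up boils down to a small matrix; here is a throwaway Python sketch of it, with a hypothetical `supports` helper (not an NVIDIA API, just the facts from the thread as of the DLSS 3.5 launch).

      DLSS_FEATURES = {
          "RTX 20": {"Super Resolution": True, "Ray Reconstruction": True, "Frame Generation": False},
          "RTX 30": {"Super Resolution": True, "Ray Reconstruction": True, "Frame Generation": False},
          "RTX 40": {"Super Resolution": True, "Ray Reconstruction": True, "Frame Generation": True},
      }

      def supports(gpu_series: str, feature: str) -> bool:
          """Hypothetical lookup over the matrix above; unknown cards get False."""
          return DLSS_FEATURES.get(gpu_series, {}).get(feature, False)

      print(supports("RTX 30", "Frame Generation"))  # False
      print(supports("RTX 40", "Frame Generation"))  # True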

  • @joshuaprivett3552
    @joshuaprivett3552 Před 8 měsíci

    When I google the bitrate cap it looks like that’s only on by default and it can be changed- as of 3 years ago at least.

  • @xXQuickscopzXx1
    @xXQuickscopzXx1 Před 8 měsíci +4

    Unfortunately we live in a reality where this will be a double edged sword for consumers. Devs will begin using this tech as a crutch for optimizing their games and we'll be right at square one. The games will be so poorly done that enabling DLSS will feel like just playing the game normally and then god forbid you don't use the feature or else you'll be running the game as a slide show. Now if the game studio has some integrity and puts in 100% effort into polishing their game regardless of DLSS then we'll get god tier games to play but with how big corporate greed is in terms of cutting costs, I have my doubts unfortunately.

  • @xroman_
    @xroman_ Před 8 měsíci +4

    Excited for ray reconstruction. Frame generation, on the other hand, adds input latency and is only available on 4000 series which should be powerful enough to run everything at 80+ fps without it anyway

    • @XOmniverse
      @XOmniverse Před 8 měsíci

      I just ran the benchmark for CP2077 2.0 patch with high settings and path tracing, and without frame gen, it would be ~50 fps.
      You're right for most games though.

    • @wholetyouinhere
      @wholetyouinhere Před 8 měsíci

      Frame gen doesn't add input latency. It just doesn't fix input latency.

    • @NihongoWakannai
      @NihongoWakannai Před 8 měsíci

      ​@@wholetyouinhere when games start being made with DLSS in mind and don't optimize because "lol we have DLSS" then yeah it basically becomes "adding input latency"

  • @Timeyy
    @Timeyy Před 8 měsíci

    Man I remember when I first got a Gamecube in 2001 and played the Star Wars game they had at launch (idk what it was called). The graphics blew my fucking mind, that shit looked like a movie to me back then lmao. There was a period during the PS3/PS4 era where graphics were stagnating for a bit but now that we have raytracing and Unreal 5 there's finally another notable jump in quality coming.

  • @nickomeinegh
    @nickomeinegh Před 8 měsíci

    I just wish fast moving small objects didn't look like they were smearing your screen. Rain usually looks pretty bad with DLSS but I still love the frames so much

  • @TheSlickmicks
    @TheSlickmicks Před 8 měsíci +4

    I love Two Minute Papers videos.

  • @VVc0mpu73r
    @VVc0mpu73r Před 8 měsíci +2

    image looks smoother input lag stays the same based on the original frame rate = action games unplayable

  • @nyusic6699
    @nyusic6699 Před 5 měsíci

    One thing people miss with it is the latency. Of course it's not so crucial in puzzle or turn-based games, but it basically makes the game feel like your inputs are on soap. Imagine you turn your car left and it only starts turning when you've already turned the wheel halfway, and oh, you're in an accident.

  • @fijau
    @fijau Před 6 měsíci

    2:10 "guys, I think that's pretty fast" sums it up perfectly 😊

  • @panderz_gg
    @panderz_gg Před 8 měsíci +3

    2 Minute Papers is one of the most underrated channels on YouTube!

  • @gecgoodpasi1654
    @gecgoodpasi1654 Před 8 měsíci +3

    A lot of people doubt DLSS and think it's bullshit, but it's as real as it gets. You basically sacrifice a small bit of your input latency and it introduces a little bit of blur, but the technique improves so insanely fast that soon it's going to be almost impossible to notice these flaws. I remember when DLSS 1 came out and you could really feel it, yet the improvement to performance was already astonishing. Well, now you basically only get the upside, because they've minimized the downsides.

    • @I_enjoy_some_things
      @I_enjoy_some_things Před 8 měsíci

      There’s DLSS and DLSS-G. DLSS-G is Frame Gen. Right now, only the 4000-series RTX cards can make use of it.
      DLSS lets you run your games at a lower resolution scale and basically maintain the same visual fidelity. DLSS-G is the one that actually generates the frames.
      I am still skeptical of DLSS-G bc it feels like a cop out and a reason for Nvidia not to keep improving the ‘raw horsepower’ of GPUs.

  • @affrokilla
    @affrokilla Před 8 měsíci

    Can't wait for this technology to work on any stream/video, download video in 720p and show it in 4k

  • @tensa-
    @tensa- Před 8 měsíci +1

    wait so im just seeing frame rate going up but then a delay or input lag becomes some sort of problem....theres always a catch

  • @SpykeHellboy
    @SpykeHellboy Před 8 měsíci +41

    This guy's channel is amazing if you are passionate about these kinds of things. I've been following him for a while and his videos are top notch, great explanations.
    EDIT: they just dropped update 2.0 for Cyberpunk, 48GB, so it's a great time to start a fresh new save (as they recommend)

    • @nicknevco215
      @nicknevco215 Před 8 měsíci

      good to know was waiting for the dlc release to see if a new mega patch happens after

    • @asafheller5720
      @asafheller5720 Před 8 měsíci +33

      I love this type of content, but my ears bleed from hearing this ANNOYING narration. I'd prefer the TikTok AI voice generator over this tbh.

    • @512TheWolf512
      @512TheWolf512 Před 8 měsíci

      Huh? It's 22 gigs for me, not 48

    • @Sqied
      @Sqied Před 8 měsíci +14

      Wow very cool. But my ears are bleeding from him stopping every half a word.

    • @hurrdurrmurrgurr
      @hurrdurrmurrgurr Před 8 měsíci +15

      It'd be. Amazing. If he could stop. Adding full stops. Mid sentence.

  • @synitarthrax5618
    @synitarthrax5618 Před 8 měsíci +3

    Imagine what kind of world we could live in if we spent as much time and effort on all tech as we do in video game tech.

    • @Ay-xq7mj
      @Ay-xq7mj Před 8 měsíci +2

      I mean the chinese produced autonomous killer drones that can pathfind through forests and use facial recognition.

    • @ahhshytson
      @ahhshytson Před 8 měsíci

      umm AI, robots doing backflips, personal drone cars, jetpacks etc...

    • @IwinMahWay
      @IwinMahWay Před 8 měsíci

      We do

    • @Oedipus.
      @Oedipus. Před 8 měsíci

      A lot of tech isn't developed primarily for video games; it working for games is merely a by-product.

  • @Tulkeleth
    @Tulkeleth Před 8 měsíci +2

    The only bad thing about this is that devs have been relying on this for games to run instead of optimizing them properly. Either that or graphics card manufacturers actually tell them not to optimize for older cards so we have to make the upgrade.
    My 1070 looked pretty decent when it came out running most games I wanted to play at max settings 1080p and keeping stable 60fps if not higher (I have a 170hz refresh monitor).
    Newer games don't look like they're that much better in terms of graphics, but my 1070 just struggles to get 30fps on low settings at 1080p, and that shouldn't be happening. That's just bad optimization from devs that are using DLSS or FidelityFX as a crutch, as games have not made the graphical fidelity leap these hardware requirements are suggesting.

  • @EMI-ev2rp
    @EMI-ev2rp Před 8 měsíci +9

    I remember the first release of ray tracing was a huge milestone, and now they've advanced so much. Incredible.

  • @rexhepizhitia
    @rexhepizhitia Před 8 měsíci +3

    Omg I can't, the voice is triggering me too hard…

  • @trololololzq
    @trololololzq Před 8 měsíci

    What is the RC car game that is shown in this video? looks like a new version of re-volt

  • @EldenCoont
    @EldenCoont Před 6 dny

    i remember being blown away by the graphics in Donkey Kong and Earthworm Jim

  • @Havoc10K
    @Havoc10K Před 8 měsíci +3

    DLSS 3 is going to be available only to the latest most expensive nvidia cards.
    AMDs FSR3 will be available to ANY video card using DX11/12, with some functionality limitations depending on the card capability, and it will be doing the exact same shit and more.

  • @JoaoGabriel-xm1cq
    @JoaoGabriel-xm1cq Před 8 měsíci +9

    The only problem with DLSS is that Nvidia makes it so you need to buy/re-buy their hardware to use these technologies; in comparison, AMD is launching FSR 3 not only for their own GPUs (older and newer ones) but for the whole GPU market.

    • @yumri4
      @yumri4 Před 8 měsíci +1

      Sadly nVidia seems to only put a little more Tensor cores into consumer cards than needed. The reason why a little more than needed is to get in front of AMD in performance. DLSS DLAA and etc. with their deep learning line of consumer technologies uses tensor cores not CUDA. So nVidia can and probably will do what you are saying and make more tensor compute intensive models that require more tensor cores to use in future cards to get consumers who want to use that feature to get a new card.

    • @ggoddkkiller1342
      @ggoddkkiller1342 Před 8 měsíci +1

      Exactly, im buying AMD card next time unless scammer nvidia supports my 3060 as well for dlss 3.0, not believing ''it wouldn't work well enough'' BS! If they can not support even last generation for new technologies ridiculously, there is no point buying 50 series etc as it might happen again...

    • @yumri4
      @yumri4 Před 8 měsíci

      @@ggoddkkiller1342it "works" yes but running without it runs better which is the sad thing. The value i see in the 50 series will be in the 5090 to upgrade from the 980 in 2025. The reason why the 50 series of nVidia instead of the 8000 series of AMD or the 800 series of Intel is 3D modeling program support. CUDA is supported by games too so that is why i use the same card for both. Gets in the way when the full GPU is needed for GPU compute in some applications but having a integrated GPU fixes that which i learned from not having one this time.

    • @ggoddkkiller1342
      @ggoddkkiller1342 Před 8 měsíci +1

      @@yumri4 Nah still not buying it, 4050 can handle dlss 3 but 3090 ti can not with many times more tensor etc cores?! Even my 3060 outperforms 4050 in every way possible...

    • @JoaoGabriel-xm1cq
      @JoaoGabriel-xm1cq Před 8 měsíci

      @@ggoddkkiller1342 with a quick google i found that you are safe, the DLSS will support the GPUs from the 20 generations, but in my case, my old 1050 will only have the FSR 3

  • @adamus9277
    @adamus9277 Před 6 měsíci

    Ok, but FSR 3 with frame generation will be available from the GTX 1000 series too. Sadly FSR is blurry as fck for me when I'm moving, so I use Intel XeSS.

  • @osyraleth
    @osyraleth Před 8 měsíci +1

    It almost feels like 3dfx back in the old days!

  • @michealcthompsonjr
    @michealcthompsonjr Před 8 měsíci +4

    Heavily increases input lag.
    DLSS (Deep Learning Super Sampling) heavily increases input lag due to the waiting time of holding back both frames to generate the AI frame.

    • @Born_Stellar
      @Born_Stellar Před 8 měsíci

      Well, at least we all know who the noobs are now, wanting frames instead of reduced input lag.

  • @NoName-jz8fe
    @NoName-jz8fe Před 8 měsíci +15

    Problem is this is a way for Nvidia to sell software instead of hardware. Which will mean you will always have to buy the next generation of graphic cards every time to get access to this software, even though it could probably work on any GPU.

    • @kgldjkgur2046
      @kgldjkgur2046 Před 8 měsíci +2

      @@webtiger1974 matter of time lol doesnt matter if its nvidia amd or intel if this shit gonna be in all games every company will implement something like this 100%

    • @Hiiiro
      @Hiiiro Před 8 měsíci

      Nvidia is now primarily a software company.

    • @cirescythe
      @cirescythe Před 8 měsíci +1

      Wrong. This technology does not work on any hardware. Hence the tensor cores. AMD does not use em, slaps cheap extra ram on the card and people lap it up.

    • @qilinger4685
      @qilinger4685 Před 8 měsíci

      @@cirescythe AMD is normally accepted as the best general hardware company (so stuff that's for like work and not gaming). Their stuff doesn't have tensor cores because they don't need it, it's not the audience they cater to.

    • @miriamkapeller6754
      @miriamkapeller6754 Před 8 měsíci

      @@Hiiiro They're as much of a software company as they're a candle company: not at all.

  • @JustARTificial
    @JustARTificial Před 8 měsíci +1

    My only question is how would this work in online games? Couldn't 2 noisy images from 2 different people / angles result in something different, so a bullet travelling across the screen could look slightly different and therefore lead to scenarios where we think we aren't getting shot at but it's the AI not properly detecting the image before it?
    Super impressed though.

    • @Born_Stellar
      @Born_Stellar Před 8 měsíci

      yeah but you're only seeing those frames, the game does not take them into account. they are just for your eyes.

  • @WXSTANG
    @WXSTANG Před 6 měsíci

    Frame generation and upscaling have been around forever in modern TVs; DLSS is a variant of that. In fact, they call it the "soap opera effect" when you turn frame generation on on some TVs, because it makes the video look like it was shot on a home camera.

  • @truedox
    @truedox Před 8 měsíci +3

    In the future developers will be able to make a game out of a literal PowerPoint presentation and technology like this will turn it into a living vibrant game.

  • @husleman
    @husleman Před 8 měsíci +5

    Its true, I downloaded the DLSS 3.5 mod for starfield, and that shit literally doubled my FPS. I went from 50-60 frames in cities to like 110 frames on my 4080. That shit is fucking god tier

    • @MGrey-qb5xz
      @MGrey-qb5xz Před 8 měsíci

      It basically blurs the game just the right amount to keep fps consistent, if you Don't mind taa even bad taa then go for it

    • @husleman
      @husleman Před 8 měsíci

      @@MGrey-qb5xz not sure what youre talking about by that because with the DLSS 3.5 mod, the textures and objects are sharper than the vanilla. It does the opposite of blurring.

    • @MGrey-qb5xz
      @MGrey-qb5xz Před 8 měsíci

      @@husleman probably using a sharpening filter alongside it to compensate

    • @husleman
      @husleman Před 8 měsíci

      @@MGrey-qb5xz I am not. All I have is what ever the DLSS mod files included.

    • @l4kr
      @l4kr Před 8 měsíci

      50-60 in Starfied on a 4080 are you mad?
      Todd Howard is a clown

  • @Unknowava
    @Unknowava Před 8 měsíci +1

    3:50 XDDDDDDDDDDD. Genius

  • @phoenixfire8226
    @phoenixfire8226 Před 8 měsíci +6

    i have a 3070 and there was a lot of hype around DLSS a few years ago so i expected greatness but honestly it doesn't do shit for me except lower visual fidelity and maybe MAYBE give 1-3 fps if i'm lucky. so color me skeptical. maybe 3.5 is good but, doubt

    • @joni7fi
      @joni7fi Před 8 měsíci +1

      Weird.. In diablo 4 with all graphic settings maxed, with dlss off I get around 80 fps and with dlss on I get like 130 fps

    • @wing0zero
      @wing0zero Před 8 měsíci +2

      Yeah either something is wrong with your PC or you decided to match your 3070 with a really old CPU

    • @phoenixfire8226
      @phoenixfire8226 Před 8 měsíci

      @@wing0zero i did skimp a bit on the cpu and went with a 3600 but it doesn't bottleneck a 3070 at all

    • @Wobbothe3rd
      @Wobbothe3rd Před 8 měsíci

      Ray reconstruction is totally independent from Super resolution, you can turn it on without turning on anything else.

    • @gankdalfthedank9855
      @gankdalfthedank9855 Před 8 měsíci +1

      ​@phoenixfire8226 No, it definitely can; it depends on the title, settings, resolution etc., but I've seen testing where that card is clearly limited by that CPU under certain conditions

  • @Kazaadoom86
    @Kazaadoom86 Před 8 měsíci +19

    I love DLSS 3, it's working quite well for some games, most of them single player, but I even use it for Darktide 40K and enjoy 122 fps on average, and I can play without tearing.
    The game looks amazing, and the technology was the deciding factor in going for an overpriced 4070 Ti.
    I had a 6800 XT and problems with VR like crazy. The Nvidia card works well with my Quest 2 and Link cable (much better than the AMD).
    What Nvidia is lacking is the quality, GUI, and functionality of their drivers. AMD's technology is good, but AMD lacks some niche things (like VR, or at least the few games I am playing weren't working well😂).
    So far so good. Hope everybody's card is working well, regardless of AMD or Nvidia or Intel. The best thing would be for all of us to enjoy these marvelous technologies.

    • @somerandomusername2590
      @somerandomusername2590 Před 8 měsíci +4

      Personally I don't like DLSS 3 in games with a ton of movement because it produces ghosting, meaning several semi-translucent trailing copies of an object, like a car following the actual car, when in motion

    • @Wobbothe3rd
      @Wobbothe3rd Před 8 měsíci +1

      Ray reconstruction and frame generation are two different things. You can have one without the other

    • @Thezuule1
      @Thezuule1 Před 8 měsíci +2

      AMD cards have never worked right with VR headsets like the Quest2 that utilize transcoding. They just have bad video encoders and worse driver support for them.

    • @brokenhalo2001
      @brokenhalo2001 Před 8 měsíci

      I ended up getting a 3070 FE for Darktide and it still ran like total trash; it ran better months later, but not by much. It's a gimmick, same with ray tracing.

    • @Thezuule1
      @Thezuule1 Před 8 měsíci +7

      @@ramonandrajo6348 AI-based upscaling is a gimmick? Is that why Intel and AMD are rushing their own versions out the door? It's the future of computer graphics whether you're on board or not.

  • @DruuzilTechGames
    @DruuzilTechGames Před 8 měsíci

    3:38 - Introduce yourself to ESRGAN

  • @EMI-ev2rp
    @EMI-ev2rp Před 8 měsíci +1

    Input delay stays at the 50 fps level even when frame generation shows 100 fps, right?

  • @Desalitor2
    @Desalitor2 Před 8 měsíci +3

    All I know is that I feel tired after listening to the AI voice for 15 minutes.

  • @Vector8kGaming
    @Vector8kGaming Před 8 měsíci +3

    AI-powered PCs will eventually auto-censor shit on your screen when you're browsing the web lmao

    • @igorthelight
      @igorthelight Před 8 měsíci +1

      There will be "jailbroken AI". As always ;-)

    • @Vector8kGaming
      @Vector8kGaming Před 8 měsíci

      @@igorthelight Aw fuck yeah no doubt!

  • @luckygaming6687
    @luckygaming6687 Před 8 měsíci

    What would the response time be like? Every new thing that comes out must sacrifice something.

  • @slyace1301
    @slyace1301 Před 8 měsíci

    Frame generation will be exclusive to rtx40 cards because of the new tech they've added in them but the rest seems interesting to try

  • @Emojavel
    @Emojavel Před 8 měsíci +2

    The feeling of playing on 30 fps, now on 60 fps

    • @Wobbothe3rd
      @Wobbothe3rd Před 8 měsíci

      Plenty of people have enjoyed games at 30fps, and not all rendering is interactive.

    • @bethanybellwarts
      @bethanybellwarts Před 8 měsíci

      and with the input latency of 30 fps, and an image that doesn't quite look right with blurry edges and random artifacts, welcome to the future kids!

  • @kevinsongxin2551
    @kevinsongxin2551 Před 8 měsíci +3

    Funny how they relaunched DLSS 3 with a patch just before FSR3 comes out

  • @shieldphaser
    @shieldphaser Před 2 měsíci

    The problem with the barrier to entry reasoning is that high-spec games tend to scale their requirements to whatever the high-end hardware can do at the time.

  • @YoDaPro
    @YoDaPro Před 8 měsíci

    Those 15-to-85 comparisons are just ideal cases, and only if you use RT.
    E.g. in Cyberpunk you need to activate Path Tracing to use RR. So with my 3080 Ti I get like 45 fps instead of a sub-30 average @1440p. But nowhere near 80...

  • @justwhitely2117
    @justwhitely2117 Před 8 měsíci

    I'm just hoping that DLSS 4 features won't be hardware limited to, like, rtx 50 series. I have an rtx 4070 ti and thankfully I can use frame generation, but I think the idea of limiting AI features to specific hardware sucks

  • @DurvalLacerda
    @DurvalLacerda Před 8 měsíci +8

    The problem is that this threatens the top GPU market because most people won't need to buy them anymore in order to play a game on Ultra with high FPS. I bet Nvidia already has amazing DLSS technology to make all this even better, but they won't release it because of their top GPU sales.

    • @wholetyouinhere
      @wholetyouinhere Před 8 měsíci +2

      Nvidia are already locking their tech behind new GPU releases. They are not enabling these AI solutions on previous generation cards, so you don't have to worry about poor Nvidia losing sales. And developers have already started skipping optimization in favor of using upscaling tech, so you will need high end GPUs in order to achieve playable frame rates in the latest AAA releases.

    • @Real_MisterSir
      @Real_MisterSir Před 8 měsíci +1

      Hardly. What we will instead see is dev studios taking advantage of the higher ceiling to simply create better visuals for "high and ultra" settings. Like Cyberpunk's "Overdrive" mode, for example, which you literally need a 4080 or 4090 to run at playable rates even with DLSS at 4K. And even then, "Overdrive" is still a compromise in many regards compared to full global illumination and caustics. It's not that we lack dials to turn up. We have only turned them up by 5% at this point, and 90% of dev work goes into figuring out how to make graphical compromises so the dials can be turned down without the visuals going to shit.
      The high-end GPU market will always have something catered to it. Trust me, that's gonna be the least of our problems (3D designer by profession; if game devs could add 5% of what this industry utilizes, you wouldn't believe your eyes).

    • @I_enjoy_some_things
      @I_enjoy_some_things Před 8 měsíci +2

      The cynic in me says Nvidia love this - they will design their future cards around Frame Gen.
      - in games where Frame Gen is not used, the FPS increase over the older cards will be less impressive.
      - where Frame Gen is used [needed?], the older cards won’t be able to keep up whatsoever. Less savvy users will just assume their “old GPU” is trash and will buy something in the latest generation’s line-up. Frame Gen will give the illusion of increased power/performance.

    • @l4kr
      @l4kr Před 8 měsíci

      You really think Nvidia is sabotaging their market like this? No, they're gonna lock this tech to the new GPUs. And when the marketing value dies off, they'll figure out a different technology to sell. They're not that dumb.
      Besides, there are still people like me who want REAL frames.

    • @Christobanistan
      @Christobanistan Před 6 měsíci

      They'd rather sell their AI GPUs anyway.

  • @Faust_YT
    @Faust_YT Před 8 měsíci +10

    Can't help but think this gives devs, especially those in the AAA sphere, a reason to disregard performance optimization even more.

    • @RlVlN
      @RlVlN Před 8 měsíci

      I think they’re waiting for the technology that makes it so AI does all the optimization for them which DLSS basically does in a way already

    • @wholetyouinhere
      @wholetyouinhere Před 8 měsíci

      @@RlVlN No it doesn't. DLSS trades various performance metrics - sharpness, clarity, motion stability, resolution - for other metrics. The grift works because we typically don't measure the metrics that are sacrificed. None of this is to say that DLSS is inherently bad, but it's definitely not "free" performance, and it's definitely not legitimate optimization.
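
      One concrete part of that trade is resolution: DLSS upscaling renders internally below the output resolution. The sketch below uses the commonly reported per-axis render scales for each preset; these factors are approximations and can vary by game and DLSS version.

      ```python
      # Rough illustration of the internal render resolution behind each DLSS preset.
      # Scale factors are the commonly reported per-axis values (approximate, may vary).
      PRESETS = {
          "Quality":           2 / 3,
          "Balanced":          0.58,
          "Performance":       0.50,
          "Ultra Performance": 1 / 3,
      }

      def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
          scale = PRESETS[preset]
          return round(out_w * scale), round(out_h * scale)

      for name in PRESETS:
          w, h = internal_resolution(2560, 1440, name)
          print(f"{name:>17}: renders ~{w}x{h}, upscaled to 2560x1440")
      ```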

  • @pofke1
    @pofke1 Před 8 měsíci

    the most interesting thing is that dlss 3.5 frame generation adds input lag, and you can't turn it off :D

  • @HollandOates
    @HollandOates Před 8 měsíci

    This is how the photo scanner in Blade Runner works. We're almost there. Dibs on 80s Darryl Hannah skinjob!

  • @BeTeK11
    @BeTeK11 Před 8 měsíci +9

    Two minute papers is GREAT channel. Highly recommended.

    • @blushingralseiuwu2222
      @blushingralseiuwu2222 Před 8 měsíci +6

      He speaks like a middle schooler from a country where English is not spoken every day, trying to read English.

    • @kubuzeer6510
      @kubuzeer6510 Před 8 měsíci +4

      @@blushingralseiuwu2222 I see no problem as long as he is understandable

    • @panderz_gg
      @panderz_gg Před 8 měsíci

      @@blushingralseiuwu2222 What is the problem? He is not a native English speaker. Did you not understand anything he said? The guy is incredible at light transport research and always shows the newest papers in tech; you don't get that anywhere else on this platform.

    • @merlinwarage
      @merlinwarage Před 8 měsíci +1

      @@blushingralseiuwu2222 He is Hungarian. How is your Hungarian?

    • @Xandros999
      @Xandros999 Před 8 měsíci

      Convince me he's not paid by Nvidia. I'm not convinced.

  • @TimHoekstra
    @TimHoekstra Před 8 měsíci +2

    Must say I love the fight between DLSS and FSR. Let's see which technique will win.

    • @c-tothefourth4879
      @c-tothefourth4879 Před 8 měsíci +1

      FSR isn't close to DLSS lmao

    • @meowmeow2759
      @meowmeow2759 Před 8 měsíci +1

      Um, XeSS is already beating FSR, and DLSS is ahead by an absolute landslide lmaooo

    • @c-tothefourth4879
      @c-tothefourth4879 Před 8 měsíci +2

      @@meowmeow2759 no it isn’t 😂

    • @meowmeow2759
      @meowmeow2759 Před 8 měsíci

      @@c-tothefourth4879 According to the games I have played that have both, XeSS looks far superior to FSR

  • @bronis_lav
    @bronis_lav Před 7 měsíci

    Frame Generation should generate its frame at the same small size as the traditional render and then have DLSS Super Resolution upscale it. So looking at the picture at 2:59, the DLSS FRAME GENERATION frame should be as small as the TRADITIONAL RENDER and start generating while the traditional frame is being upscaled; then the generated frame gets upscaled with DLSS SUPER RESOLUTION while the next traditional frame is rendering. I believe that would make the process even faster than it is now.
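
    A rough sketch of the scheduling idea in that comment: overlap generating and upscaling frame N with rendering frame N+1 instead of doing everything back to back. The millisecond figures below are invented purely for illustration and say nothing about how DLSS actually schedules its work.

    ```python
    # Toy pipeline model for the idea above. All timings are made-up assumptions.
    RENDER_MS   = 10.0   # traditional low-res render of one frame
    GENERATE_MS = 3.0    # frame generation at low resolution
    UPSCALE_MS  = 2.0    # super-resolution upscale of one frame

    def sequential(pairs: int) -> float:
        """Render, upscale, generate, upscale -- all on one timeline."""
        return pairs * (RENDER_MS + UPSCALE_MS + GENERATE_MS + UPSCALE_MS)

    def pipelined(pairs: int) -> float:
        """Overlap generation + upscaling of frame N with the render of frame N+1."""
        steady = max(RENDER_MS, GENERATE_MS + 2 * UPSCALE_MS)  # slowest stage dominates
        return RENDER_MS + pairs * steady                      # fill the pipe, then steady state

    print(f"sequential: {sequential(100):.0f} ms for 100 frame pairs")   # 1700 ms
    print(f"pipelined:  {pipelined(100):.0f} ms for 100 frame pairs")    # 1010 ms
    ```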

  • @user-ky5hv8je6u
    @user-ky5hv8je6u Před 8 měsíci +4

    I actually couldn't watch this video because of the guy's narration voice. Jesus holy christ

  • @dess1619
    @dess1619 Před 8 měsíci +3

    The AI frame generation on Cyberpunk 2077 is absolutely amazing. Need it in more games.

    • @InsectBewty
      @InsectBewty Před 8 měsíci

      I agree. I've been playing around with it all morning, and it actually encourages me to use frame generation now. There's so much more clarity and it gives me a reason to want to use path tracing combined with fg now.

  • @Sn4p_Fps
    @Sn4p_Fps Před 4 měsíci

    It's helpful only in single-player games to push settings to ultra; it will add input delay.

  • @StelznerGaming
    @StelznerGaming Před 8 měsíci +1

    Maybe I haven't tried the newest version but every version I've seen so far of DLSS just makes the game look blurry in places.

  • @ChanceJacksonCJ
    @ChanceJacksonCJ Před 8 měsíci +14

    "Don't worry guys, DLSS will make the game optimized"

  • @teslainvestah5003
    @teslainvestah5003 Před 8 měsíci +32

    Normally, DLSS can reduce latency just by decreasing frame times.
    But frame generation adds a lot of latency. It interpolates between frames, so it needs to wait for the game to render frame 1 and frame 3 before it can generate frame 2, so it only shows you frame 2 after frame 3 is already finished.
    Frame generation is absolutely not to be used in a competitive game, but Nvidia will show you performance numbers that rely on it anyway. Pretty much all the downsides of DLSS are actually just the deceptive behavior of Nvidia. (A rough latency sketch follows at the end of this thread.)

    • @Wobbothe3rd
      @Wobbothe3rd Před 8 měsíci +9

      Ray reconstruction doesn't require frame generation, it's independent. You can turn on RR without FG.

    • @markpaulpangan7618
      @markpaulpangan7618 Před 8 měsíci +6

      And Nvidia Reflex almost alleviates the latency problem, so what's the issue???

    • @markpaulpangan7618
      @markpaulpangan7618 Před 8 měsíci +4

      @@ramonandrajo6348 what's bad about it???

    • @christophervanzetta
      @christophervanzetta Před 8 měsíci +5

      @@markpaulpangan7618 Nothing. People just want to make comments because they have nothing else better to do 😂

    • @markpaulpangan7618
      @markpaulpangan7618 Před 8 měsíci +3

      @@ramonandrajo6348 maybe you can make me change my mind if you actually explained the problem with nvidia reflex instead of calling me a fanboy, you know??
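
    A rough latency model for the interpolation argument at the top of this thread: because frame N can only be displayed once frame N+1 exists, each real frame is held back by roughly one extra real frame time. This is a simplification that ignores render queues, Reflex, and display sync; the numbers are assumptions for illustration only.

    ```python
    # Back-of-the-envelope latency model for frame interpolation (simplified).
    def frame_time_ms(fps: float) -> float:
        return 1000.0 / fps

    base_fps = 50.0                        # what the GPU actually renders (assumed)
    real_ft = frame_time_ms(base_fps)      # 20 ms between real frames

    latency_without_fg = real_ft           # a fresh frame can be shown right away
    latency_with_fg = 2 * real_ft          # frame N waits until frame N+1 is done
    displayed_fps = 2 * base_fps           # what the fps counter shows

    print(f"real frame time:     {real_ft:.0f} ms")
    print(f"~latency without FG: {latency_without_fg:.0f} ms")
    print(f"~latency with FG:    {latency_with_fg:.0f} ms (counter shows {displayed_fps:.0f} fps)")
    ```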

  • @starkistuna
    @starkistuna Před 8 měsíci +1

    The latency hit when going from 15 fps to 88 has to be insane. Also, I prefer last year's AI-generated woman with 7 fingers and weird teeth

  • @shadowarez1337
    @shadowarez1337 Před 8 měsíci

    Will this work on my 3090, or is it still locked to 4000 series cards only like DLSS 3.0?

  • @darksaiyan2006
    @darksaiyan2006 Před 8 měsíci +4

    A problem we've started seeing now that DLSS has become mainstream is that devs design games to only reach playable framerates WITH DLSS or FSR enabled, which is very counterproductive. It will actually end up raising the barrier to entry instead!

    • @wholetyouinhere
      @wholetyouinhere Před 8 měsíci +2

      All part of the plan. When devs were using traditional native rasterization, games scaled very well with older GPUs. This was obviously cutting into Nvidia's sales, so they needed to create a business model that forced customers to upgrade - and Nvidia needed to do this without relying on development of more expensive (for them) hardware with legitimate generational performance uplifts. The solution has been AI software, which has become Nvidia's entire focus because it is way more profitable than delivering better hardware. Now devs are programming with these AI software solutions in mind, and Nvidia is locking that software down to new overpriced GPUs that feature only minimal performance improvements (keeping Nvidia's own costs down). The end result is going to be marginally better looking games at the cost of optimization and an upgrade cadence that will price out the vast majority of consumers. Eventually, people will be forced onto cloud gaming. They will own nothing and be happy.

    • @darksaiyan2006
      @darksaiyan2006 Před 8 měsíci

      @@wholetyouinhere Very pessimistic outlook, but all signs point to you being right.

  • @royalrampage4643
    @royalrampage4643 Před 8 měsíci +2

    > wait can you use this to denoise onlyfans pics? :O -- Asmon
    This is gonna be revolutionary for Japan and their mosaics.

  • @Ma77riK
    @Ma77riK Před 8 měsíci

    Future games will have developers only inputting wireframe figures, giving the AI algorithm a description of the scene, and letting it fully generate the entire scene