Does DLSS Hurt Input Latency?
- Date added: 13. 07. 2024
- Support us on Patreon: / hardwareunboxed
Join us on Floatplane: www.floatplane.com/channel/Ha...
Buy relevant products from Amazon, Newegg and others below:
GeForce RTX 3060 - geni.us/MQT2VG
GeForce RTX 3060 Ti - geni.us/yqtTGn3
GeForce RTX 3070 - geni.us/Kfso1
GeForce RTX 3080 - geni.us/7xgj
GeForce RTX 3090 - geni.us/R8gg
Radeon RX 6600 XT - geni.us/aPMwG
Radeon RX 6700 XT - geni.us/3b7PJub
Radeon RX 6800 - geni.us/Ps1fpex
Radeon RX 6800 XT - geni.us/yxrJUJm
Radeon RX 6900 XT - geni.us/5baeGU
Video Index:
00:00 - Welcome back to Hardware Unboxed
03:28 - DLSS Latency Results
11:21 - Final Thoughts
Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
FOLLOW US IN THESE PLACES FOR UPDATES
Twitter - / hardwareunboxed
Facebook - / hardwareunboxed
Instagram - / hardwareunboxed
Music By: / lakeyinspired - Science & Technology
So, long story short: as long as you GAIN fps, you are golden with DLSS. I assume the same goes for FSR, but you could test that too if you are bored ;)
No. If you are GPU bound and you have headroom on your CPU, DLSS will improve fps and latency.
Bored? wat
I mean, how was this not obvious? Then again, TV upscaling has probably been the cause of this confusion from the start.
@@aflyingmodem If you don't have CPU headroom, you will become CPU bound and not gain fps, and thus lose latency. More fps = faster frame times = lower latency. The general impact of DLSS was negligible by the looks of it.
@@robertstan298 Obviously that was aimed at Tim/Steve, as that testing takes a lot of time, so doing it for FSR as well is a huge ask unless he is bored.
So, 15 minutes summed up -> if DLSS is increasing framerate, it is making input latency better. If your framerate is not boosted by DLSS, input latency stays the same as well.
Thank you
you are the man
Thanks cba watching all of that
It's interesting, though, that it holds all the way through: DLSS increases latency when the CPU is the limiting factor and enabling DLSS actually reduces the framerate.
ayye
Pretty neat you followed up on this so quickly. Well played.
I thought that on the original video he said he was already working on a video like this?
well plaid fellas
Had this exact question the other day and HU delivers yet again! Super easy to understand video with great formatting of the graphs, thank you so much!
Thank you SO much Tim! Some guy was HOUNDING me months ago about this topic when there were no sources talking about DLSS input lag. I even busted out my Reflex setup to test myself, but I'm glad we finally have some good tests.
You've tested by lowering the render resolution to achieve the same fps, but there is a use case for DLSS where you turn up the settings and enable DLSS to achieve the same performance with higher image quality. For instance: Medium settings get 180 fps, High settings get 140, High + DLSS gets 180 fps. Testing this could answer whether it's worth playing at higher settings with DLSS on or just keeping the lower settings.
Is that Steve using his shed-building skills in Fortnite?
That was a really informative video. And it answers a lot of questions. Great job man.
Well played. You followed up on this quickly, and pretty neatly!
This is really interesting. Great work guys
The human eye can't even see more than 8 GBs of latency
BRING IN THE THERMAL PASTE TRUCK
That's dumb, the eye doesn't see in gigabytes.
It sees in giga transfers.
hahaha
@@SlyNine u must be fun at ssd parties
So basically everyone that says DLSS increases input latency is just trying to make excuses for their bad aiming.
Sorry for missing that shot but my fps isn't high enough.
“StUIpID LAg!!!!!!!!!!!!” 🤣
@@SkinUpMonkey Depending on the monitor, that could definitely be a thing beyond the normal added responsiveness of high refresh rates. I just upgraded monitors and realized the old one had a ton of added input lag on lower refresh rates (I could feel it and reviews showed the same). I was also looking at an Asus monitor the other day and its only downside was 30ms+ of input lag at 60hz compared to 9ms with my new monitor. That's huge!
nah, salty amd fanboys doing monkey double backflips to come up with a reason why DLSS isn't good
@@paranoidpanzerpenguin5262 The only reason DLSS sucks is that it's closed source. But I am in the minority that prefers Linux and open source software lmao
Been waiting for this vid, ty!
I never knew i needed this question answered, thanks for the detailed testing
Dammit Tim, great job. This video literally answered *all* of my questions. What a pleasant Friday treat.
Fantastic work, HuB! Very interesting and well presented. This kind of content is why people love you guys.
FINALLY! I've been wondering this for a long time...
Thank you guys!
You guys should make a video on Lossless upscaling, costs 200 INR on Steam and implements FSR on any "window", without the input lag in Magpie.
I've been planning to make a video on it, but I lack the resources to show the image quality differences as for some reason, Shadowplay doesn't capture the differences that my eyes can see.
Your coverage will bring in huge interest!
It is basically Magpie with a different front end.
It does still introduce a noticeable input latency for me at 4k, but I wish they tried it
FSR on any window isn't going to be great as will affect UI elements too. Text doesn't upscale very well and is much better (and no less performant) rendered at native resolution.
@@zig131 Just try it, man; not many are fortunate enough to have high-end hardware :/ It works surprisingly well by our "budget-gamer" standards.
Lossless needs to be updated; it adds way too much input lag and doesn't improve fps on half the stuff, which could be because it's windowed, but still.
At this point the only thing hurt is my upgrade latency.
Not at all the result I expected. Extremely interesting! Thanks for testing this Tim.
OMG, I was wanting to know this for a few weeks now! Perfect! Thanks Timmy boy!
Thank you for this topic!
This is a great overview of the subject. I couldn't ask for a better answer.
Great video, thanks!
Thanks for sharing video.
Great work Tim.
Could you try out DLSS and FSR as a means to improve image quality at 1080p and 1440p by using them as a supersampler vs SSAA?
No other channel makes such useful videos. Thanks for the precious info.
This video is 2 years old, so a bit dated, especially with the new 40-series cards and DLSS 3.0/3.5, but I have to say it's masterfully done. Excellent video. Liked & subbed.
Hey, will you be making a video on SAM for the RX 5000 series now that their drivers support it?
Yeeeees after MLID brought it up in his BF5 experiences, it's been gnawing at me as I wait for graphics cards to actually exist again.
Next up, FSR input latency!
@@toastedoats5074 yeah probably. Nvidia themselves showed data on the GPU overhead added by DLSS so I was also thinking of that. Glad to have that cleared up.
MLID ☠️
And this video precisely proves that he was wrong once again, saying that DLSS adds input lag when it doesn't.
Well, it does add latency, as can be seen in the performance-normalized charts at 10:27. But of course that isn't to the extent that MLID claimed. Then again, Tim didn't test BF5. And Tim also only showed one chart with normalized results, which is imho a must if one wants to compare both modes.
@@SciFiFactory 1ms is literally impossible to appreciate
Was hoping to see a test with the FPS capped to a set target that the game doesn't fall below, for a true input/render-latency-only test, but I guess that can kinda be filled by the CPU-bound games... sorta.
Either way, the ending point is right; There's no reason to use upscaling techniques unless you're trying to improve framerate, which will in turn negate any input latency overhead in the first place.
It would be interesting to see this test with FSR. I predict it would most probably be the same, but it would still make a good video to compare the two.
I really appreciate the Warzone testing. Results are very edifying, great job guys, again!
We're waiting for this too.
Nice. I was hoping some testing on this would be done by a source I trust.
What do you think about doing a FSR vs DLSS comparison in Myst?
Can you review the new AOC CU34G3S?
I think it would be good to test this over the whole RTX stack, including the 2000 series, because lower-end hardware might show a larger impact than the newest top-end cards.
TLDR: More FPS = Lower input latency
Good summation for brainlets, but technically incorrect in a GPU bound scenario.
Read up on pipelining. FPS is a throughput metric. The more dedicated hardware is used and the more complex the DL model is, the more likely it is that some form of pipelining will be used. I'd guess they're already pipelining, but the pipeline is still short enough to be beneficial overall. This can change in future versions, and may be drastically different in Intel's variant as well.
I always like this kind of video. Just want to point out that CPU-limited titles are not "less realistic use cases": Fortnite and Warzone alone greatly outnumber all the other tested games in player counts. The other thing I want to say, outside the results, is that it doesn't matter whether you can feel the input latency or not, because it's still there. Even if you perform well in a game, that doesn't mean you can't improve with lower input lag. I agree with everything else.
I wonder if this is still the same with low-end systems. The 3080 is a beast GPU, and we all know it performs better at higher resolutions than 1080p. Another video maybe? If there is something different to be shown, of course.
Okay, the results surprised me, not gonna lie. Thank you for your work ♥
What was surprising to you?
@@-........ Probably that the processing of DLSS didn't result in increased latency as one would naturally expect. Framerate (i.e. frame time) and latency are two completely separate things.
@@bricaaron3978 But just playing the games, it was very obvious that there was no serious latency penalty; the fps gain is so big that it would easily overcome any extra delay.
@@cheese186 *"the fps gain is so big that would overcome easily any extra delay"*
That's simply false, though. The frame time decrease from 60 fps to 120 fps is a tiny 8.33 ms. 8.33 ms isn't "so big" by any means.
There are a lot of things that can add _way_ more than 8.33 ms of latency to the pipeline. Have you played Dead Space? Out of the box (before taking steps to reduce it) you can get a solid 60 fps (16.67 ms frame time) and yet have some of the most massive input latency I've ever experienced. Probably 300+ ms. Because again, framerate/frame time and latency are separate phenomena.
If DLSS really doesn't add any _significant_ (i.e. humanly detectable) latency, that's great, but definitely surprising. Because any processing that has a temporal component -- frame interpolation on TVs is a great example -- has to buffer at least one frame, which means increased latency, even though the _frame time_ doesn't increase.
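For reference, the frame-time arithmetic in that exchange as a quick Python sketch (the numbers are the ones quoted above, not new measurements):

```python
# Frame time is the reciprocal of framerate, expressed in milliseconds.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# Going from 60 fps to 120 fps only shaves ~8.33 ms off the frame time.
saving = frame_time_ms(60) - frame_time_ms(120)
print(f"60 -> 120 fps saves {saving:.2f} ms per frame")  # 8.33 ms

# Compare that with the ~300 ms of input latency described above at a
# solid 60 fps (16.67 ms frame time): latency and frame time can diverge.
```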
Great investigation and scientific approach to resolve this dlss question
Plz review the AORUS OLED monitor?
unrelated, but does the Huawei MateView GT 34" stand any chance against the odyssey g7? purely gaming wise.
Can you test HAGS on/off and Game Mode on/off for 2021? Or maybe after Windows 11 comes out?
Hi, any update on the Samsung G9 Neo HDR fixes from Samsung?
I don't understand how a higher FPS could ever equate to slower input, or even the argument behind it. Input events are handled every game loop (not the design kind), so the more game loops you have, the more chances there are to process your input. A frame in a game is a single game loop in that game and includes things like rendering, input, sound, AI, post-processing, and physics. Some of these can operate on other threads, like physics or AI, and update via sync points, so your input and rendering aren't waiting for complex calculations to come back.
This logic and functionality is why V-sync increases input latency because it forces the whole loop to wait for the next available refresh sync thereby reducing FPS.
It sounds like a video on "What is a frame in a video game." would be useful for folks.
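To make that concrete, here is a minimal single-threaded game-loop sketch in Python (the stub functions are hypothetical placeholders, not any real engine's API):

```python
import time

def poll_input():                   # stub: a real engine reads OS events here
    return []

def update_simulation(events, dt):  # stub: game state, physics/AI sync points
    pass

def render_frame():                 # stub: issue draw calls and present
    pass

def run(frames=600):
    """One loop iteration == one frame. Input is sampled once per
    iteration, so more frames per second means more chances per second
    to pick up and process the player's input."""
    previous = time.perf_counter()
    for _ in range(frames):
        now = time.perf_counter()
        dt, previous = now - previous, now
        update_simulation(poll_input(), dt)  # input handled every loop
        render_frame()

run()
```

V-sync effectively inserts a wait into this loop while it holds for the next refresh, which is why it lowers fps and raises input latency together.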
What is the impact from, say, the mouse and keyboard? As I understand it, the latency might vary a lot between models.
It can be significant with some keyboards; almost all mice keep it pretty low. Search "Dan Luu latency" for testing.
I'm a big fan of DLSS. It gave me quite a big bump in framerates in COD with my 2070. With mostly maxed settings and set to Performance mode, I'm almost maxing out my 165 Hz Acer Predator.
You guys are amazing. Thank you.
Great research. It's impressive how well DLSS works; even when CPU limited it adds almost no latency.
So, assuming you play at 4K and the target fps is 120 in Warzone, should you use DLSS to be able to lock fps to 120 (or 116 with Reflex), OR just turn DLSS off and lower the render resolution scale to, say, 85%? What would be the best-case scenario for latency while maintaining as close to a locked 120 fps as possible?
In Warzone, if you are competitive, you have to play at 1080p low.
Some people use DLSS + DSR to downscale 1440p to a 1080p screen (or other resolutions, 4k to a 1440p screen etc...), still hitting locked fps and getting a cleaner image.
Can you also test motion clarity more deeply with this?
Does an increase to performance through DLSS beyond the framerate that your monitor actually displays still provide a reduction to latency? eg. enabling DLSS bumps the framerate from 200fps to 300fps but you're using a 144hz monitor.
Why no testing while using a frame limiter? Many of us limit frames for adaptive sync, so my concern would be: does turning DLSS on while limiting frames have a downside? The upside being less power draw.
A frame limiter is basically an artificially CPU-limited scenario with the extra resources allocated to the other processes. So in that case DLSS doesn't decrease performance, as it takes the now-freed CPU and GPU resources.
Thank you!
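As an illustration of that point, a bare-bones frame limiter looks something like this (a sketch only; real limiters use busy-waiting and high-resolution timers, and `render_one_frame` is a placeholder):

```python
import time

TARGET_FPS = 116                 # e.g. capping just under a 120 Hz refresh
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allowed per frame

def limited_loop(render_one_frame, frames=600):
    for _ in range(frames):
        start = time.perf_counter()
        render_one_frame()
        # Sleep away whatever is left of the budget. The CPU and GPU sit
        # idle here, which is why a capped game behaves like an artificially
        # limited scenario with spare resources for DLSS to use.
        remaining = FRAME_BUDGET - (time.perf_counter() - start)
        if remaining > 0:
            time.sleep(remaining)
```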
How does that tool work? Does it hook up to the mouse?
What if you have a 1080p monitor (without DLSS) and it gives 240 fps, and another 1440p monitor (with DLSS) that also gives 240 fps with DLSS ON? Where will the input lag be lower, or will it be the same?
Great news. Definitely turning on DLSS for Nioh 2 now!
Isn't input lag more dependent on the CPU and game code?
What's the intro music name?
Another way to interpret these charts is that 1080p with DLSS off is very close to 1440p with DLSS on. That would show that DLSS by itself isn't adding appreciable latency. For example, in Metro Exodus, 1080p DLSS off is 182 fps with 28.4 ms latency, and 1440p DLSS on is 180 fps with 29.2 ms latency.
Put another way: DLSS doesn't add any lag. Rendering the game at a lower resolution will always give more fps and better latency.
DLSS is something that runs for every frame, so if it increases your framerate, input latency gets better.
This is such a simple thing.
Thanks. I had gone crazy trying to explain to people how it won't hurt latency, because that's how it works, but testing it to prove it was a PITA.
Likewise, explaining the nuance that CPU-limited titles are the only ones where you might find any issues is hard to get across as well.
At launch, Warzone had insane latency from DLSS, but it seems they fixed it. People claimed they fixed it early on, but I had no way of testing.
@@Sn1p1ngGuy117 Never played warzone so can't comment but it's possible, I saw a really shitty implementation of DLSS recently though it's generally easily fixable.
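The budget math behind that explanation, with purely hypothetical per-frame costs for illustration:

```python
# Hypothetical per-frame costs in milliseconds (illustrative, not measured).
native_render_ms = 10.0  # rendering a frame at native resolution
lowres_render_ms = 5.0   # rendering at the lower internal resolution
dlss_cost_ms     = 1.5   # fixed cost of the DLSS upscale pass

with_dlss_ms = lowres_render_ms + dlss_cost_ms
print(f"native: {native_render_ms:.1f} ms/frame -> {1000 / native_render_ms:.0f} fps")
print(f"DLSS:   {with_dlss_ms:.1f} ms/frame -> {1000 / with_dlss_ms:.0f} fps")

# DLSS adds work to every frame, but as long as the low-resolution render
# saves more time than the upscale costs, net frame time (and with it
# input latency) goes down. If the game is CPU bound, the GPU-side saving
# doesn't shorten the frame, so latency stays roughly flat.
```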
DLSS isn't doing anything to the frame buffer or anything else that can delay a rendered frame from appearing on a monitor (the actual thing that causes input lag). It is instead a part of the render pipeline. Its presence increases the total render time, but in exchange you are saving a lot more time by doing the original render pass at a lower resolution. Thus the net frame times are lower and input latency goes down.
The issue is that people aren't understanding that a rendering technique that exists in the render pipeline is something different from various driver or monitor level post-processing techniques that do things to the frames in the frame buffer. By the time a frame has entered the frame buffer, DLSS has already done its thing. Therefore the only thing that matters is if the lower original render resolution saves more rendering time than DLSS adds (and in almost all circumstances, it will). Because the more frames that enter the frame buffer every second, the more up-to-date that those frames will be. The next step in the process, moving those frames from the frame buffer to being displayed on your monitor, has nothing to do with DLSS.
You don't even need to consider what's going on in the renderer, input is a totally separate thing in the loop. If your renderer is returning faster you're getting your input updates faster, end of story.
This argument stems from not understanding what a "frame" is in a game at a fundamental level, not some misunderstanding of DLSS or post-processing.
Adding more steps in the render pipeline could increase input latency, Brian.
@@tybera1114 The CPU receives a mouse movement update -> the game adjusts data -> sends data to the GPU -> things must then go through the rendering pipeline, which takes time. If you make the pipeline deeper, you may introduce more latency.
@@JackMott Yes, but that would also reduce your fps. Under no circumstances would an increase in fps create input lag. For the frame to be completed, all things must return, including the renderer.
@@tybera1114 Imagine an extreme example to make it easier to visualize. Imagine a gpu pipeline with very high throughput but with 1000 steps, each step takes 1 millisecond, The pipeline is always full so you get 1000 fps to the screen at all times. But once you make a movement with your mouse, the game adjust data, and puts it into the first step of the gpu pipeline. 1000 steps to go, 1 millisecond each, before it gets to the screen. That would be a full second of input latency, despite an incredible throughput.
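That extreme example, written out (the 1000-stage pipeline is the commenter's hypothetical, not a real GPU):

```python
def pipeline_stats(stages: int, stage_ms: float):
    # Once the pipeline is full, one finished frame exits every stage tick,
    # so throughput is set by the stage time alone...
    throughput_fps = 1000.0 / stage_ms
    # ...but a fresh input still has to traverse every stage before the
    # frame it affects reaches the screen.
    input_latency_ms = stages * stage_ms
    return throughput_fps, input_latency_ms

fps, latency = pipeline_stats(stages=1000, stage_ms=1.0)
print(f"{fps:.0f} fps on screen, yet {latency:.0f} ms of input latency")
# High throughput (1000 fps) coexisting with a full second of latency.
```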
Interesting question!
hmmmmmmmmmmmmmm
Input latency being connected to fps performance is not really a surprise. But it's always nice when someone makes a video about it to show and explain it.
Wish you guys had tested Call of Duty: Cold War, the best example of DLSS I've seen in action in any game. We're talking a 2 ms latency improvement; it's very impressive on Ultra Performance running the HD textures.
The higher your resolution, the more benefit from DLSS you will get, since obviously the CPU is less important the higher the resolution is. DLSS MIGHT cause some minor input lag itself, but the framerate benefit negates that downside, especially when you gain a substantial number of frames. I've noticed it while playing at 4K: I gain an average of 20-40 fps in most games that support it at 4K.
Could you do the same benchmark for AMD FSR?
What unit is used to measure latency? I am assuming that it is ms, but a proper graph shouldn't leave something like that to assumption.
Latency always refers to time: more specifically, unwanted extra waiting caused by some image processing etc. In the case of monitors and gaming in general, we're practically always talking about milliseconds (ms).
A 1 ms difference here or there isn't realistically detectable by human senses, as far as I know. In any case, 1 ms is a very small latency.
One microsecond (0.001 ms), or even dozens of microseconds, would be way too small for human senses to register. On the other hand, a full second (1,000 ms) of latency means something is seriously wrong with your system.
On a 24” 1080p display I noticed a smoothness similar to TAA when using quality DLSS 2.0
I was finally able to buy an RTX 2060 Super for a reasonable price.
I use a GPU for around 5 years, and having a 1080p monitor helps with that.
Now I think DLSS will let me upgrade to 1440p without worrying about low fps.
In RDR2 and NMS, DLSS Quality looks better than TAA, especially if you replace the DLL with a recent one.
@@MichaOstrowski Just saw a video on that recently, can’t wait to test that!
I play COD Warzone in 4K. Do you have some stats for that?
Thank you.
Can we stop for a minute and address how good those Fortnite skills are? Some of those headshots are out of this world!
Wonder who's playing. I would be surprised if it is Steve's daughter. Pretty good gameplay.
@@raimishakir It's Steve himself, I think. He does seem to play Fortnite really well, judging from his Fortnite footage from his personal channel.
Great video
Using FSR in Marvel's Avengers lately (you know, just for science), and I'm surprised that it gives great IQ with minimal artifacts, despite many saying otherwise, on my GTX 1060 6GB (DLSS probably still wins, but who gives a sh*t if it just can't run on my GPU to compare), at almost steady fps (using dynamic FSR with a 60 fps target). Kudos for the tech...
What card do you use?
I'm not suggesting you put in the work for this niche idea, but it would be neat to see the results with a weak card with DLSS capabilities, like the 2060, rather than a powerful one that hits CPU-bound situations more often. I can guess based off the data here, but nothing is quite like actual testing.
I was hoping so too.
*rolls eyes at MLID*
I'm actually really surprised something that provides a visual service actually DECREASES input latency. Goes against all conventional reasoning from the past but I guess AI is really pushing the boundary. Great video as always. Thank you.
Input latency is so tied to framerate that you would find the answer when you enable a framerate limit... locking 60/144 Hz to 60/144 fps would answer the question...
I'm not sure what you mean, but framerate (i.e. frame time) and latency are two completely separate things.
Higher FPS = Lower latency, who would've known? No seriously, wouldn't it make more sense to test with an FPS limit?
Without an fps limit the whole test makes no sense.
Now test how DLSS affects Stadia's negative latency.
hahaha
Stadia? Is that still a thing?
Wow great video
Please provide tests with lower-end GPUs!
We just gonna ignore that nasty snipe at 13:09??
Does FSR produce the same result?
I feel like, because DLSS kind of fills in frames between frames, it's not that you get input lag so much as that the input just doesn't get better with the higher frame rate DLSS produces. Whatever your input latency is with DLSS off will be the same latency with it on, even with higher fps.
This isn't DLSS 3, bro; that makes latency wayyy worse. This is just the resolution upscaling part, i.e. DLSS 2.
With the same specs except memory (I have a 5800X, a 3080 XC3 Ultra Gaming, and 32 GB of 3200 RAM), in Warzone I get 155 fps in-game, and in the trial I get over 250, on an Odyssey G7 at 1440p and medium settings. How do you manage to get 200+ fps?
RAM CL rating; Warzone is pretty responsive to tighter RAM timings.
I have a 5800X and an EVGA 3080 Ti FTW3 Ultra with 32 GB of Crucial 3600 MHz CL16 RAM. I get anywhere from 130-200 fps, but I'd say the average is about 180 fps, and that's at 1080p as well, on high settings with anti-aliasing maxed.
At 14:38 - "Press F to pay respects"
I have a VA display and dark objects smear a bit in fast-moving scenes, but it's not noticeable. When I turn on DLSS in COD, I notice that the smearing is way worse, to the point that my eyes start to get strained. Why does this happen?
Cheap va display jk
But srsly, I wonder why it does. Even on some OLEDs.
@@CutzMcOnions It wasn't cheap though; it's a Dell S3220DGF, paid $500 CAD on sale.
@@meerkat-trav9489 yea "jk" means joke
@@CutzMcOnions didn't see that my bad
DLSS amplifies TAA's tendency to cause ghosting. It's not your display that's causing this. Tim even showed this quite clearly in his DLSS vs FSR video.
I wish you did a Warzone best-settings video. I play more Fortnite, but in Fortnite it is very simple: Performance mode. In Warzone there is no Performance mode.
Apparently it also makes black smearing worse on displays such as VAs and OLEDs. I'd be interested to find out if the same is the case with FidelityFX.
That's not black-level smearing you're seeing; it's ghosting due to TAA, which DLSS requires. Unfortunately, DLSS also has a tendency to worsen TAA ghosting, the same way FSR has a tendency to worsen shimmering.
@@andersjjensen There's both, on the cheaper panels anyway.
Whoa whoa whoa, how can you measure DIF in Warzone? (They turn this on in MP only, as far as I heard.)
I thought you were going to test DLSS off vs DLSS on, but at the same fps and same GPU usage.
I think there is a parallel between naturally aspirated/forced-induction engines and native resolution/DLSS: adding layers decreases responsiveness.
Even though it was just disproven in most cases in the video you are commenting on?
Streamers are trying to balance winning and entertaining. Despite what they say, they are not truly competitive gamers; they are entertainers. In the early days of competitive gaming in Counter-Strike tournaments, we turned off everything that did not make aiming and seeing the opponent better. Winning was FAR MORE important than looks. I'm old now, but this logic suggests that never changed. Race car drivers do not care if the winning car is pretty, just that they are driving it.
its the zoomers
That's incredible, really, that it decreases latency. Really have to give props to Nvidia DLSS. I wonder if you can also make a video on Xbox One X supersampling and whether it affects input latency. Thank you, Hardware Unboxed.