NVIDIA RTX HDR Tested - RTX HDR vs Native HDR vs SDR - It Is Really Good!
- Published 13. 06. 2024
- 0:00 Intro - What to expect
0:32 Nvidia App is a must but there are issues
3:11 Cyberpunk : RTX HDR vs Native HDR
4:52 First Summary
7:06 SDR vs RTX HDR
8:58 Colors are oversaturated with Factory Settings
10:13 Performance loss with RTX HDR
11:01 Conclusion
Two things... first of all, I forgot to adjust my camera settings, which is why I look a little bit red in this video. Second, if there is some demand I am more than happy to make more videos about Nvidia RTX HDR. Let me know which games and what I should test.
Notice: Some games will break HDR completely if you turn on native HDR AND RTX HDR at the same time, as RTX HDR doesn't seem to turn itself off automatically when you enable native HDR. Currently I can confirm that the Spider-Man games lose most of their color tone mapping if you don't tell the Nvidia app to stop using RTX HDR.
Would be great to see some comparisons between RTX HDR and Windows AutoHDR.
I'm playing Witcher 3 atm, might be a good game to try out RTX HDR.
Does the RTX HDR filter only work on Win 11?
Yeah, but generally Auto HDR looks worse, with minimal performance loss @@Gr4yF0x
2:45 Nvidia is a 2.2 trillion dollar company 😅
Nice... btw, my peak brightness slider maxed out at 850 nits for some reason (S95B on the OG 1,500-nit peak fw-1211). If you set a max brightness in your EDID (I put 157, or 1,499 nits, in the Max Luminance field; previously it was blank), the RTX HDR slider will pick it up (I can now go up to 1,500). Hopefully Nvidia will fix the issue prior to full release.
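For anyone wanting to double-check that 157 → 1,499 nits figure: as far as I know, the HDR static metadata block in CTA-861 encodes Desired Content Max Luminance with a logarithmic code value, roughly luminance = 50 × 2^(CV/32). A quick sketch (my reading of the encoding, not something stated in the video):

```python
import math

# CTA-861 HDR static metadata: max luminance code value CV maps to
# 50 * 2^(CV/32) cd/m^2 (my understanding of the encoding).

def coded_to_nits(cv: int) -> float:
    """Decode a max-luminance code value to nits."""
    return 50.0 * 2.0 ** (cv / 32.0)

def nits_to_coded(nits: float) -> int:
    """Encode a target peak luminance (nits) to the nearest code value."""
    return round(32.0 * math.log2(nits / 50.0))

print(coded_to_nits(157))   # ~1499.5 nits, matching the 157 -> 1,499 figure
print(nits_to_coded(1500))  # -> 157
```

Which lines up with the commenter getting ~1,499 nits from a value of 157.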
Thank you for this video, been looking forward to a comparison of native HDR and RTX HDR. I do hope you can touch on the settings for RTX HDR at some point with regards to its accuracy. On Reddit, if you search RTX HDR - Paper White, Gamma & Reference Settings you will find a post of a person who has meticulously tested RTX HDR compared to reference SDR. And their finding is that RTX HDR is too saturated and bright with default filter settings. To fix this and make RTX HDR look just like SDR (but with bright HDR highlights) you want to set Saturation to -50 and Contrast to either 25 or 50. For Mid Grey nits, you can calculate the specific nits you want with a formula. For 2.4 gamma 100 nits, you just choose 19 Mid Grey. So I rock 1000 nits (on S95C TV), 19 mid gray, 50 Contrast, -50 Saturation for that 2.4 gamma 100 nits look! Now, RTX HDR is perfect imo and really is a dream come true! :D
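The "Mid Grey nits" formula the Reddit post seems to use can be reconstructed: for a pure power-law gamma display, a 50% SDR signal lands at reference_white × 0.5^gamma. This is my assumption of the formula based on the numbers quoted above (19 for 2.4 gamma / 100 nits), not something confirmed by Nvidia:

```python
# Mid grey for a power-law gamma display: ref_white * 0.5^gamma
# (reconstructed from the "19 Mid Grey for 2.4 gamma 100 nits" claim).

def mid_grey_nits(ref_white_nits: float, gamma: float) -> float:
    return ref_white_nits * 0.5 ** gamma

print(round(mid_grey_nits(100, 2.4)))  # -> 19, the value quoted above
print(round(mid_grey_nits(100, 2.2)))  # -> 22, for a 2.2 gamma target
```

So if you prefer a 2.2 gamma look, the same reasoning would suggest a Mid Grey of about 22 instead of 19.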
This is highly dependent on the individual and their equipment. I've found that a lot of "purists" are hanging on to old ideas of what is "correct" and are forgetting the point of HDR. You don't want your colors to be SDR; HDR is about the colors too, not just highlights and black levels. If they're trying to make it look like SDR, then play in SDR. Actual HDR is more colorful because, guess what, there are more colors being represented in a 10-12 bit color space. SDR is 8-bit, thus it will look duller.
"A person has tested" could be anyone. They compared SDR to HDR... of course the colors will be different. It sounds to me like it was someone who doesn't understand HDR and what it means.
Yes, that's what I'm thinking. I think your point on how we should compare RTX HDR to HDR is precisely what I feel... I'm caught up in comparing it to SDR and obviously the guy on Reddit was as well. Maybe we should trust the RTX HDR AI to understand how the HDR colors should look, because usually HDR games do saturate colors quite a bit more, right?
Idk, for me I just don't want all my colors to be more saturated if that makes sense. I want nuance and for some colors to stand out, so I'm a little torn on this and not sure how to feel. I feel as though I need more comparisons of RTX HDR to native HDR and garner an understanding and appreciation of RTX HDR specifically when it comes to colors. RTX HDR with color saturation at 0 might actually be very close to native HDR colors in many games, who knows! :P@@ripleyhrgiger4669
How did you manage to get the HDR analysis tool working with RTX HDR? For me it only works with native HDR; when I switch to RTX HDR, the tool says it's SDR, not HDR. I tried overriding the colorspace to CSP_HDR10 and CSP_SCRGB with no luck. It still says it's SDR and measures the colorspace in % and not in HDR nits.
Think he's using the same method as for the PS5 HDR analysis: an HDMI capture card with ReShade on the input stream window.
Hi Mark! I've followed you since the LG CX days and I've always appreciated your work! Hope you continue to give us the quality HDR settings we need! PLEASE give Dying Light 2 a chance with this RTX HDR! It's one of the games I prefer! PLEASE. A hug, Mark
Thank you for that video.
Did you compare vs Windows auto HDR ? (for SDR games)
Would love to see more content on this. Both in general and the sliders specifically.
I was waiting for this video, thanks!
FYI... RTX HDR was just updated with the latest Nvidia driver release. This is the time to test it. I think it means the default settings now follow 2.2 gamma? Or am I reading the update notes wrong?
still no support for dual monitor setups though i bet.
@@alexandern5296 I think they might not support multiple monitors until the app officially comes out. I think I saw somewhere they said they definitely will have support in a future update.
You can have the second monitor on while using RTX HDR. Disable the second monitor in the Nvidia Control Panel and open the game with the RTX HDR filter; once you are in game you can turn the second monitor back on in the Control Panel. The filter will not show as active in the game filter overlay, but RTX HDR will continue to work even though it is not showing as an active game filter.
Please do more videos on this. Settings discussion, what they mean, best way to adjust it based on picture. Also a video on color information would be great. Also applying RTX HDR for each game would be good too, aka do you recommend it or not for the newest games like Dragons Dogma 2.
What RTX HDR settings do you recommend on an LG OLED monitor with max 603 nits?
How you get that HDR stats overlay real time ? Is it hardware or software ?
I have an AOC Q27G3XMN and I can't find the right settings for it in HDR mode. I need help getting great HDR; my monitor is HDR 1000 display certified.
Great vid..would like to see some more on the sliders
Do you have to turn off auto hdr in windows for it to show up in the filters menu? I can't see it anywhere
Yes, turn off auto hdr and enable hdr in windows.
@@Bdestiny933 sounds like a stupid question I'm new to all this but isn't that the same thing lol. Turn off auto hdr in windows but turn it on in windows, or obvs I'm misunderstanding what you meant?
@@Takeme_DCUO Auto hdr is a setting under HDR. Turn off auto hdr. Keep HDR on, but disable auto hdr.
@@Bdestiny933 brilliant I'll go have a look and sort it out. Thanks for your reply 👍
@@Bdestiny933 Is "Auto HDR" some kind of fake HDR?
Can you test peak brightness after the last software update on the LG CX? Because in my opinion it's now much lower, mostly in Dolby Vision but in HDR also.
Please tell me that you have control over if and when you install updates on your LG display?
I just saw that with the latest GPU driver (551.86) the RTX HDR saturation might be fixed; the fixed-issues list says "RTX HDR uses saturation and tone curve that matches Gamma 2.0 instead of 2.2".
Hi, when is the HDR-on-OLED-TV video for Horizon Forbidden West on PC coming?
Nvidia RTX HDR adds color? That would be great.
it stays washed out like it was an sdr signal for me
Thank you for sharing. I'd like to see you make a video about the comparison between xbox auto hdr and native hdr. I think it'll be fun.
Which TV are you using?
For my TV, set to 2.4 gamma for my cinema room, I use 800 nits, 34 mid-grey nits for paper white, and 50 contrast. That gives you 2.4 gamma. Saturation at -50 for native colors; 0 is more popping.
Hi, I noticed that on the C2 the automatic video color gamut setting loses contrast in HDR. Should I set it to Limited, right?
I have noticed this as well. I prefer some black crush instead of a lifted, hazy picture.
I did guess that RTX HDR was the one on the left. The cutoff point for highlights is weird; it's like the same object can have two different luminance values and it's not a smooth transition. It's very noticeable inside the bar, on the red lanterns hanging off the ceiling.
Otherwise, pretty good I'd say.
I reccomend rebinding Alt+F3... that is a very unfortunate choice of hotkey
Hmm, I tried it in Horizon Forbidden West before the native HDR support was fixed via an unofficial patch, and I was not very amazed. Similar problems as Windows Auto HDR (1000 nits max) and the usual issues of SDR-to-HDR conversion, like way too bright UI elements. I ended up using Special K instead, which also was far from a native HDR experience, but gave me enough control to get a somewhat decent picture, and worked flawlessly with Lilium's HDR Analysis.
I tested RTX HDR in Apex Legends and Fortnite; it made a world of difference and it is amazing.
I don't like the default settings, but you can adjust them in real time in game, so it takes seconds to fix.
I plugged my second monitor into the motherboard (iGPU) and now I have multi-monitor with RTX HDR.
The Nvidia application is beta, so it has bugs and is incomplete, but it is promising.
I think you have to make sure the game is an SDR game and Auto HDR in Windows is off. It never worked for me anyway, since you have to have the Xbox stuff enabled.
And the monitor has to be in HDR mode; my monitor enables HDR when I turn Windows HDR on.
Why not compare to Auto HDR with 2.2 gamma icc?
In another video
The oversaturation is normal, whenever you boost contrast in an image you also boost the color contrast, since they are one and the same - so just like in photoshop, if you boost the contrast you need to tone down vibrance if you don't want to boost the colors at the same time. So this likely won't be changed - but one needs to correct it if one wants to keep a natural looking image.
You sure it isn't just using a bigger gamut?
I know from a previous video that RTX HDR oversaturates. This was apparent in the comparison. The giveaway though was the highlights and brighter areas, this made me fairly confident that the left one was RTX HDR. Apart from the green lights above the bar, the real HDR popped more, and the square of daylight outside was noticeably brighter.
I think real HDR looked better because, despite the black level issue, it doesn't look washed out, and the contrast is still better IMO as bright lights really stand out compared to RTX HDR.
Wow, OK. I have messed with RTX HDR multiple times and it always looked worse than SDR, even ruining the image. After watching this I figured out how to adjust it while playing the game. Now I understand why HDR is usually a broken mess: the sliders need to be in the right positions, and your chances of guessing the sliders and launching the game with it looking good are incredibly low. Now that I can see the changes take effect (with Alt+Z to bring up the Nvidia overlay), I have some SDR games looking as good as or better than native HDR. Thanks buddy!
RTX HDR does a really good job but other than the slightly high performance cost the lack of multi monitor support kinda sucks. Knowing Nvidia though they'll probably fix it and it is in beta after all.
It's still in beta I think...
Jacob Freeman (Nvidia) already confirmed multi-monitor support is due in an upcoming Driver update.
Fine for single monitor 3rd person rpg, at least I think
I think the mod also works with multiple monitors as long as all of them are hdr enabled.
The lack of multi monitor support is a deal breaker for now. Also using the RTX HDR mod is a pain in the ass too. I know they have announced a fix or update to allow for multiple monitor support, but I hope it is soon.
What is a good tool for your average person to check black levels in games?
ReShade HDR analysis shaders.
1:44 weird, on my Samsung Neo G8 it goes up to 1015 nits, not just 1000.
The HDR1000/HDR10 standard says that the black level of a screen/tv cannot be higher than 0.05 nit. If the black level is 0.05 nit and doesn't get higher, the screen/tv is still within the HDR10 standard. I wonder if game developers are simply following the HDR10 standard and set the black level in HDR to 0.05 nit on purpose.
The thing is, most people don't sit in a completely dark room with an OLED TV while playing an HDR game. Non-OLED TVs cannot achieve true black, so most people wouldn't even see details in the 0 nit to 0.05 nit range. In fact, it would look crushed to them. It wouldn't surprise me if developers simply use the 0.05 nit black level to make sure the most people get the best picture possible.
The problem with HDR is that you cannot simply increase the overall brightness of everything like you can in SDR. If the shadow details are too dark in SDR, you can increase the SDR peak brightness to make everything brighter. There is no option like this in HDR. The only thing that comes to mind is the paper white setting that determines how the 0 - 100 nit range of SDR is displayed in HDR. According to the PQ EOTF standard, HDR paper white is 200 nit. This means that everything that's underneath 100 nit in SDR is twice the nit value in HDR. This of course will also make the shadow details appear slightly brighter in HDR compared to SDR (provided you set up SDR mode correctly with a peak brightness of 100 nit).
Twice the nit value might sound like much, but the light sensitivity of the human eye is not linear. Going from 100 to 1000 nit doesn't appear 10 times as bright, but only about twice as bright. The difference between 1000 nit and 2000 nit sounds huge, but it really isn't. I currently own a S95B and I don't see any reason to upgrade to a 1500 nit TV, since the difference wouldn't be that huge (even though it's noticeably).
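The paper-white point above can be sketched in numbers. Assuming a simple linear scaling of the SDR range by paper white (an illustration of the commenter's reasoning, not a tone-mapping spec):

```python
# Sketch of the paper-white idea: with HDR paper white at 200 nits,
# SDR content graded for 100-nit reference white displays at twice
# the luminance. Linear scaling assumed for illustration only.

def sdr_to_hdr_nits(sdr_nits: float, paper_white: float = 200.0,
                    sdr_ref_white: float = 100.0) -> float:
    return sdr_nits * (paper_white / sdr_ref_white)

print(sdr_to_hdr_nits(50))   # -> 100.0: shadows/midtones land twice as high
print(sdr_to_hdr_nits(100))  # -> 200.0: SDR reference white hits paper white
```

Under this assumption, lowering paper white back to 100 would reproduce the SDR brightness for everything below reference white.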
That's maybe a reason. Thanks!
*"The thing is, most people don't sit in a completely dark room with and OLED tv while playing a HDR game."*
In my opinion, anyone who buys an OLED display to view it in a bright room is wasting his money. Pupillary constriction means the black level is no better than an LCD, the contrast ratio is significantly decreased, and no doubt the brightness is cranked, which drastically accelerates image retention.
I went from playing immersive games on a 24" 16:10 CRT in a pitch-dark room (good CRTs can do true black) to playing on a 55" OLED in a pitch-dark room.
@@bricaaron3978 While I agree that especially OLED TVs should be used in dark room, the same can be said for other TVs. HDR content requires a dark room to really shine. Having dark shadow details on the screen is pointless, when you cannot see them.
The thing is that even in a dark room, the difference between 0 nit and 0.05 nit can be hard to see, if there are bright highlights on the screen.
@@rexomnipotentus387 Absolutely. Beside issues of black level/contrast, a theatre-dark (or darker) environment is one of the requisites for maximal immersion.
You're right about highlights and overall contrast affecting shadow detail, especially with large projection screens. As long as the environment is dark and treated to reflect minimal light, this doesn't seem to be a problem to me, as it mirrors how we respond to light in reality. If you are in a dimly-lit environment and a bright light shines into your eyes, you will be unable to see shadow details around the light that you previously could.
Is it only for 40 series?
If you are using the LG C series TV,you should turn the max brightness to about 800 in RTX HDR.
Thanks for the video. In my case I don't find much difference between native HDR and Nvidia's in SDR games like Euro Truck Simulator or SCUM, which are the games I play most; I don't see a difference between RTX HDR and Auto HDR. Monitor: HP q27x, HDR10, 165 Hz.
Waiting for multi monitor fix and performance fixes, not sure if my 4070 super can handle that when already playing at max path tracing
You shouldn't be using max path tracing at all with a 4070. The performance loss is like 5-10 fps, and with a 4080 I don't see more than 5 fps difference in any of the games I've tested. Alan Wake 2 with all of its RTX stuff on in balanced mode is basically the same FPS with it on or off, and it looks much better with it on.
Plug second monitor to your iGPU instead and RTX HDR will still work on the primary monitor.
It even works on ps2 emulator games. Very useful
Just curious as to whether Windows HDR needs to be on or off for RTX HDR to work?
A 2-second google search would answer the question for you
Windows HDR = On
Windows Auto HDR = Off
Turn off native HDR in game if the game has it.
Not being able to use a second monitor is a no go for me. I have 4 connected.
what's the name of the second game at 7:07?
Looks cool I wanna play that game.
The Banishers
One annoying bug is that the peak brightness (nits) slider is capped at the number reported by Windows; for some reason it reports 430 nits for me even though my display can show 900 nits. No fix on my end, but that slider should be uncapped so this never happens. Still looks good though.
No, I don't think so. I'm using an LG OLED G3 and have Windows calibrated to 1500 nits peak brightness, but RTX HDR still only allows me a maximum of 1000 nits. Special K however does correctly detect the 1500.
There are better ways to fix the black level floor in CP77, like Prod80's ReShade shader, which costs no performance and keeps native HDR intact; it just requires a little bit of work is all.
@gamingtech, can you please test RoboCop: Rogue City, which has no native HDR or Auto HDR support?
RoboCop is an Unreal Engine 5 game, so technically it has native HDR, as all UE 4/5 games do. The devs just decided not to switch it on.
You can turn HDR on in config files or you can use Lilium's AutoHDR ReShade.
The results in that game are amazing.
@@Sean-fj9pn Whaaaat... really? They just didn't turn it on... thats so weird. Can you tell me which config files I have to edit to get native HDR in the game?
I believe the increased color saturation is the right thing to do, and the default slider is very good. We can always give it a -5 or -10 to the colors if it looks wild.
Agreed. A lot of comments are focusing on making the RTX HDR look like SDR colors and only focus brightness levels when HDR is about extending the range of both brightness and colors. Meaning colors should pop more or look more saturated (to a limit).
@@alphacompton Not when the human skin tone is turning unnaturally red... It's still important to get the saturation right.
Test Metro 2033 Redux or Last Light Redux! I am playing with RTX HDR and the difference in these games is insane! Sometimes the black depth looks like it's ray traced... this is a game changer!
Just tested this on Elden Ring and I'm very impressed. I could never get HDR working right on that game or really any game on windows, but this actually looks great.
Hmm, strange; native HDR in Elden Ring is great, you just increase the saturation a little bit to your liking.
I just tried it with Last Epoch and oh my god it's so much better than the Windows Auto HDR function!
I just had a thought... Can it fix Diablo 4's poor HDR implementation?
Maybe measure the performance cost in percentage or frametime milliseconds, instead of FPS
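To see why the suggestion above matters: the same "10 FPS loss" is a very different cost at 60 FPS than at 240 FPS, which percent and frametime deltas make obvious. A small sketch (illustrative numbers, not measurements from the video):

```python
# Express an FPS drop as a percentage and as added frametime (ms),
# which are comparable across different baseline framerates.

def cost(fps_off: float, fps_on: float) -> tuple[float, float]:
    """Return (percent FPS loss, added milliseconds per frame)."""
    pct = 100.0 * (fps_off - fps_on) / fps_off
    ms = 1000.0 / fps_on - 1000.0 / fps_off
    return pct, ms

print(cost(60, 50))    # ~16.7 % and ~3.33 ms added per frame
print(cost(240, 230))  # ~4.2 % and ~0.18 ms for the same "10 FPS" drop
```

So a flat FPS number hides how cheap or expensive the feature really is at high refresh rates.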
I'm not sure how to get it working correctly on my PC, to be honest. It just doesn't seem to work properly. I have no issue with native HDR and Windows Auto HDR, but when I activate this in the Nvidia control panel it looks like when you try to force HDR on non-HDR content (without the benefit of an actual HDR algorithm). So it looks much too dark and the colours just look off. Nvidia Super Resolution also works just fine, no problem. It's just RTX HDR that doesn't seem to work properly, not for videos and not for games either.
My peak brightness only goes up to 465 nits on the Aorus FO32U2P, even though I have it in HDR gaming mode with APL set to high for peak 1000 nits, and I have my Windows HDR calibration set to 1070 nits.
Can you do an HDR analysis of horizon forbidden west on PC please?
Agreed, the colors look way too saturated, in SDR games especially. I tested MGS2 from the MGS Master Collection and it didn't look better; that's one scenario where I preferred Auto HDR, but more testing is required.
Banishers is such a good game it really needs HDR. Going to try Nvidia later today.
You can force native UE HDR on that one, it really looks superb and honestly blows RTX HDR out of the water
Hi, Same here, when I downloaded the app and activated the filters option I lost about 5 to 10% in performance. Yes it looks nice and the feature is good but they should fix the performance issue, so I uninstalled it.
My pc Spec:
ryzen 9 7950x3d
RTX 4090
RAM 32G 6000mhz CL30
With a 4090 why not use the fact the card is so powerful to take advantage of the extra features?
resident evil should get a HUGE boost from RTX HDR because in that game (almost every new part) black levels are raised by default, making everything a bit washed out unless you go overboard with the settings, which kills the details. so yeah, looking forward to that video.
I still prefer native HDR and then tweak it with ReShade + Lilium's HDR shader tools. Especially for adjusting the black levels and target peak brightness correctly.
Yeah Gaming Tech also uses those very same shaders so it's pretty odd he doesn't use his reach to advocate for those shaders. It's super simple to setup nowadays, and gives you a way better image
@@AnimeUniverseDE Yeah there's even a way to expand the colorspace so HDR games use the deeper more vivid BT2020 colorspace. Gamechanger
@@AnimeUniverseDE Maybe because he wants to show the regular methods and what's out there in terms of technology because using reshade+lilium HDR tools can for some be tricky if they have never used it before and don't know how to setup for their HDR supported TV/monitor hence RTX HDR is easier to show and talk about.
@@TheEdmaster87 setting up RTX HDR is already tricky enough and comes with a ton of downsides and quirks like no multi monitor support. Then you need to input the accurate settings because Nvidia's defaults are totally busted. For completeness sake he really should be showcasing how to fix native HDR games via Reshade and Lilium. And he also recommended picture settings changes for TVs before (lowering black level)
@@AnimeUniverseDE I don't think you can use those in multiplayer games
On My GP27U, my peak brightness can go higher than 1000, so it is probably related to the monitor spec.
on my s90c it goes from 1000 to 1200 randomly idk it picks and chooses
More games u may take a look at: Diablo 4 and Helldivers 2. Tested both with RTX HDR, looks better than their native HDR solution.
Also, funny thing I found out with RTX HDR, I love Destiny 2 (my favorite game with thousands of hours logged in) and I just had to compare RTX HDR to its amazing native HDR. I discovered that Nvidia is almost on point with its default saturation as far as how the AI EXPECTS the game to look. It really matched HDR colors well! It's a weird thing. 0 Saturation setting via the filters is certaintly not accurate, yet it just about matched native HDR colors. It definitely is mind-boggling, lol. I'm really into accuracy, so I'm just not sure what to make of it. Very interesting and I just love this feature. Nvidia really are the best!
Did they ever fix the broken native HDR implementation for Destiny 2? I'd like to use RTX HDR, but I need my second monitor for Destiny Item Manager.
Idk about second monitors and HDR if that's what you mean, but I have lots of experience with D2's HDR over the years and yes they fixed it. They fixed it I think in 2022 with the launch of Witch Queen iirc. There are proper sliders now, and the only real problem I have that keeps the HDR from being "perfect" is the simple, but dreadful fact that black level raise is a thing. That's why on my S95C, I use Shadow Detail -1 to fix this. On LG OLEDs, I used -3 Fine Tune Dark Areas back when I had my LG CX not long ago. So that's what I'd recommend. Mid Point setting needs to be lowered too. I recommend below 25. It gives the game a very nicely saturated 3D look with lots of depth. I really think it's some of the best HDR I've ever experienced and I've always felt this way! It ain't perfect, but the colors and overall depth cannot be beat! @@gamz
@@gamz Plug your second monitor to the iGPU and RTX HDR will still work (you have to restart your computer the first time doing this).
There is a mod that uses reshade addon system to re write game shaders to fix hdr
I have the lg cx and I can't sacrifice dldsr over hdr
this is the problem
I tried Special K on Banishers and it was a big difference.
Don’t have the peak brightness option.
RTX HDR can not only make older games with no HDR support look amazing, it can even fix current-gen games’ bad native HDR implementations.
It makes RDR2 look so good
it looks more detailed but people look like they have moisturizer on their face
great video thanks! Banishers is a surprisingly nice game isn't it .
If only I had a microscope to see this incredible difference...
Playing Batman Arkham Asylum with 4K textures and TruHDR and it's transformative.
A work around to the second monitor being unsupported is to connect it to the motherboard video output. That will allow you to use RTX HDR on the monitor connected to the GPU. Also, they are working on more proper multi monitor support.
Well you can only do this if the cpu has an internal gpu
@@Superdazzu2 you are right
You should really give Special K and its HDR tool a try. The Special K HDR tool forces real HDR in games; it's not a 3rd-party overlay like RTX HDR. I tried RTX HDR in Witcher 2 and, due to the game's lighting and already oversaturated colors, it looked terrible, but with Special K it looked fantastic. Although you need to use DXVK with DX9 games to make the tool work.
I am able to use multiple monitors if I plug my secondary monitor to my Intel's iGPU while my primary monitor to my RTX 4090. That way the Nvidia driver only detects one monitor plugged into the Nvidia card. Intel's driver will handle the second monitor.
What a bummer. My second monitor is just an HDMI hooked up to my GPU for audio.
For something that can be done just as well or even better with ReShade's Prod80 "Curved Levels" shader, it's just not worth recommending RTX HDR over ReShade, since the performance hit is only ~1% with ReShade compared to ~5% with RTX HDR, and it gives more precise control over black/mid/white points. Unless you're playing a game with an anti-cheat that will block or ban the use of ReShade; only then can I recommend RTX HDR.
Just waiting to hear you say "black level raise"... it's just what's popping into my head when I hear you talk 😄😜
Of course RTX HDR is adding color information.
It is adding P3 colors over an SDR picture.
So it is no surprise that the presentation could look oversaturated if not calibrated properly.
RTX HDR makes the game look way better, like a new game, like you said... once you enable it you won't go back: clearer picture, sharper image, better colors, and not washed out.
It works really good on World of Warcraft Season of Mastery.
Would love to see the difference between Windows Auto-HDR and RTX-HDR.
i prefer the windows one over nvidias
The early news is auto hdr has raised blacks and rtx hdr doesn’t with the right settings. The only negative to rtx hdr compared to auto hdr is the performance cost. That’s the early reviews I read from people online.
@@Ray-dl5mpfrom my experience using the rtx HDR on my FO48U compared to the windows version is that it blew out the colors and saturation an no matter how much i calibrated it it was no good compared to the windows hdr and hdr on my monitor calibrated correctly it look far better and the performance was better as well.. another thing is that when you dont want to use RTX HDR you still receive the performance hit.. and have to uninstall the new app and reinstall geforce experience to fix the performance hit
@@cheeemzy6651 RTX HDR's real neutral settings are:
-50 saturation for unaltered colours.
45 paper white.
+25 contrast for 2.2 gamma, +50 for 2.4 gamma.
Digital Foundry covered this in their RTX HDR video. Check it out.
Highly recommend this for Star Citizen. Disable in-game HDR and enable RTX HDR. Massive difference in black levels and for a space game that is kinda important :)
Are blacks higher or deeper with rtx hdr?
Spreading democracy looks better in HDR
@@enricod.7198Deeper
Judging by their comment, probably lower which is good.@@enricod.7198
Or use lilium shaders for hdr fix
1:03 I can use RTX HDR with two TVs connected. I have a C2 65" and a C2 42" that I use as a monitor, and it's working.
Did you plug the second one to a different GPU (like a iGPU?)
No... why would I do that? I have both of them plugged into a 4090. @@paul2609
Yes ,more rtx HDR Videos
talk about the sliders youll get my view
It’s so much better that I just take the FPS hit so I can use it instead of in-game HDR and Windows Auto HDR.
I believe another requirement is that you must have windows 11
which tv do you use mostly?
It’s fixed here at 400 nits. (OLED g8 ultrawide. It’s edid related said Nvidia)
Turn off HDR10+ and the slider comes back again.
Use the service menu trick to have 1000 nits peak brightness in HGIG.
@@Naito75President I did.
If you do as I suggest you get the slider back. I also have an OLED G8.
Set HDR10+ Gaming to off.
They just need to reduce the performance hit. It is fantastic
The performance impact is caused by the anti-banding filter; it can be removed with a driver mod (by the same guy who leaked RTX HDR), or just wait for Nvidia to expose a setting to turn off the filter.
Let's see Windows 11 Auto HDR vs RTX HDR vs SDR. I would like to see RDR2 native HDR vs RTX HDR vs Auto HDR, since that game doesn't have the best HDR implementation. The Finals would be good since there's no native HDR (Auto HDR vs SDR), plus GTA 5, Forza Horizon 5, Call of Duty MW3.
Greetings, may I ask why on my PC I only see 800 nits on the RTX HDR slider and you have 1000? I have a Samsung S90C QD-OLED.
Can you use DSR on your OLED? If not, the reason RTX HDR isn't working might be DSC.
I wish someone would discuss why RTX HDR doesn't work half the time. It says either the game doesn't support it (which it should) or the sliders are locked and can't be changed. Also, for Helldivers, hitting the Nvidia overlay closes the game and spazzes out.
Test on Prince of Persia please
You should have done the HDR comparison with a game that has excellent, unflawed HDR, in order to expose the flaws of RTX HDR. I'm pretty sure there are plenty (like overshooting some colors, making things brighter than they are supposed to be, etc.), as it works on an "assume" basis, cranking stuff up according to the database it has, but that doesn't necessarily mean that's how the game is supposed to look and feel visually.
What if the creator intended the colors and brightness in a certain scene to be more muted, to instill certain emotions, and RTX HDR just cranks it up to what its database assumes it should be?
I'd like to see a thorough test comparison of how it changes the image versus the original intent. I have a hunch that RTX HDR will make all games look the same HDR- and color-wise. It sees a light and it will always be identically bright across games: bonfire, moon, sun, bright scenes; they'll all look the same across games, which would be very bad if that's the case.
Sure, it looks promising when you just activate it and see it turning things into HDR, but I feel there is a huge impact on visual emotion if we scratch the surface of it.
While I agree to an extent, if the game has bad HDR and RTX HDR fixes the problems without changing the image much in other ways, then it's better. I've used it to play through the Wolfenstein series, which doesn't have HDR, and I found it much better; from what I remember of my previous playthrough, it didn't change the mood of the game in a negative way. If anything it was better because it had more contrast.
@@WArockets It can't fix bad HDR. It doesn't correct HDR; it broadens the values of an SDR feed.
But pay close attention to what I wrote: we don't know how much it alters the mood. We need comparisons and deep investigation of the tech.
There is a risk of this damaging the mood and feel of many games whose creators spent months tuning the visual emotion of a scene.
Since the system is automatic and changes values according to a database, if my hunch is correct it means all games will look the same: every lamp at similar nits, every puff of cloud at exactly the same brightness, every lightbulb, etc.
This kills the soul of the visual mood.
Just because it makes the image HDR, with dynamic and varying levels of brightness, doesn't mean it's automatically good...
It's the artistic intent that may be killed by RTX HDR.
This is indeed just an emulation of HDR based on an SDR source, and it can very well guess wrong and brighten things that are not supposed to be bright. In the worst case, things like white stripes on clothing may end up looking like reflectors. 😁
It does have some adjustments, though. If you feel the image deviates too much from the artistic intent, you can change it: you can adjust color saturation, you can adjust peak brightness, and you can change the gamma curve from super contrasty and punchy to more washed out.
@@MaaZeus Yeah, it's great that they added adjustments for users... But one still has no clue what the creator wanted the visual emotion to be. You can adjust it to how you want it to look, but never get the feel the creator wanted to instill...
That's my biggest gripe.
I wish Digital Foundry did a thorough, in-depth analysis of RTX HDR instead of just going "wow, it turns an SDR image into HDR without actual HDR, that's amazing!".
Which is also where Gaming Tech has a chance to do some in-depth analysis and uncover the actual flaws and how much it deviates from the original creator's intent, be it the loss going from SDR to RTX HDR, or comparing native HDR to RTX HDR and how samey games do or don't look with it...
...as well as some proper flaws, like the error you noticed where white stripes on clothing can turn out as bright as a reflector, lol.
We need some actual dissection of technologies like this that come instead of "wow you flip the switch it does turn on and change stuff". xD
We need data, damn it. Rigorous data, so we can observe and tell which one is a tool and which one is a gimmick made to fool us.
I agree with this. SDR games should be played in SDR. Every form of auto HDR conversion throws accuracy completely out the window. Yeah, you get more color, brightness and contrast, but that is not how the game was supposed to look. At that point, why even calibrate your TV if you are just going to change the original color grading anyway?
Nvidia has never gotten the default settings right in the old Control Panel, so why would they get them right in the new one? In the old panel, under "Change Resolution", you have to change item #3 ("Apply the following settings") from the default to "Use NVIDIA color settings", make sure "Highest (32-bit)" is selected, then change "Output color depth" to 10 bpc; hit Apply, then OK. Then go to "Adjust video color settings", change item #2 ("How do you make color adjustments?") from "With the video player settings" to "With the NVIDIA settings", go to the Advanced tab, and set "Dynamic range" to "Full (0-255)"; hit Apply, then OK. This stuff should be the default, but NVIDIA doesn't set it!
Aren't you supposed to set the peak brightness to match the peak brightness of your screen? Then it makes no sense to set it to max.
Modern LG OLEDs have a peak brightness of over 1000 nits.
@@DasAntiNaziBroetchen Yes, and many others do too... but my point is that many HDR TVs and monitors have a peak brightness below 1000 nits, and he doesn't address that in the video.
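For anyone wondering where the slider cap comes from: my assumption (based on the pinned EDID tip) is that RTX HDR reads the display's HDR static metadata, where CTA-861.3 encodes "Desired Content Max Luminance" as a single byte CV decoded as 50 × 2^(CV/32) cd/m². A minimal sketch of that decode/encode:

```python
import math

def max_luminance_nits(cv: int) -> float:
    """Decode a CTA-861.3 max-luminance code value (0..255) to nits."""
    if not 0 <= cv <= 255:
        raise ValueError("code value must be 0..255")
    return 50.0 * 2.0 ** (cv / 32.0)

def nits_to_code(nits: float) -> int:
    """Inverse: nearest code value for a target peak luminance in nits."""
    return round(32.0 * math.log2(nits / 50.0))

# The pinned comment's EDID value of 157 decodes to roughly 1499 nits:
print(round(max_luminance_nits(157)))
# A ~1000-nit panel (e.g. S90C class) would report a code around:
print(nits_to_code(1000.0))
```

So if the slider on your setup stops at 800, it may simply be echoing whatever max-luminance byte (if any) your display's EDID reports, which is why editing that field changed the cap for the commenter above.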