Also a phone's OLED screen, like the Samsung Galaxy S22 Ultra's. Just to see how good these screens are compared to OLED and QD-OLED on TVs and monitors. These small screens perform very well as far as I know, because they're easier to produce
CRTs work like some kind of ultra-high-rate black frame insertion, so you get perfect motion clarity; there is no point in testing the response time of CRTs
I think CRTs still have the fastest response time regardless of refresh rate; next is OLED. I still miss my old CRT monitor and TV. Sadly my parents threw them away for their first LCD TV. I could have used them for some old-school gaming
as someone who teaches kids for a living, you are an amazing teacher. Everything is super clear, you don't speak *too* quickly, you mark everything, your boardwork is exemplary and nothing is left out or glossed over. Bravo on all accounts!
First Monitor Steve, now Voiceover Tim? The HUB crew is growing exponentially! ;P Jokes aside, this video is _exactly_ the sort of content I want to see on Monitors Unboxed. Keep it up, HUB crew!
Very interesting and informative, as an electronic engineer by trade, I knew some of the terms like overshoot and rise / fall time, but I now know more about how cumulative deviation is calculated and what it means
Really makes you appreciate the work that goes into these reviews. Awesome stuff! I’ll never be buying a monitor without HUB and/or RTINGS going through testing it first.
Would be nice to add icons next to each section like "response time", "Total Response Time", "Inverse ghosting" and "CD". I am thinking of icons like the graphs you nicely drew :) One normal S-curve (response time), one S-curve that overshoots (total response time), etc. You get the idea. Great video, thanks for sharing your amazing methodology
Excellent explanation Tim, I learnt this stuff during my undergraduate degree from the perspective of minimizing system response errors. The only suggestion I'd make is to consider replacing the Excel tables with actual heat maps that interpolate between values. Using something like Matplotlib in Python you could probably do that for free using that Excel data directly. For most people the exact numbers on those heat maps don't really matter, and they just make the maps a bit harder to read. Excellent summary statistics though. The only one you could be missing is a measure of the derivative of the response curve. This would give you a good measure of how "flashy" the response is. For example, a low overshoot and a low response time will yield a mostly low CD value, but if the response is highly unstable and fluctuates as it reaches the steady-state value, it could look like it's strobing or flashing. But measuring that is probably a challenge given your discrete sampling, and if most responses only have a single overshoot peak, it won't matter much.
@@stpirate89 Yeah, that's the metric, but it's infamously unstable to measure on discrete data due to the division by the time step. Another metric you could use is something like the highest frequency above a certain magnitude in the Fourier decomposition of the dataset, but that also has its own drawbacks.
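To illustrate the instability being described here, a short NumPy sketch with a made-up exponential pixel response and a pinch of sensor noise (a toy model, not real measurement data or HUB's tooling):

```python
import numpy as np

# Toy model: an idealised RGB 0 -> 255 transition sampled every 0.01 ms,
# plus a small amount of Gaussian sensor noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 1001)                  # time in ms
clean = 255.0 * (1.0 - np.exp(-t / 2.0))          # noise-free response
noisy = clean + rng.normal(0.0, 0.5, t.size)      # ~0.5 RGB of noise

dt = t[1] - t[0]
d_clean = np.gradient(clean, dt)                  # derivative wrt time
d_noisy = np.gradient(noisy, dt)

noise_in_signal = np.std(noisy - clean)           # roughly 0.5 RGB
noise_in_derivative = np.std(d_noisy - d_clean)   # tens of RGB per ms
print(noise_in_signal, noise_in_derivative)
```

Dividing a noisy difference by a tiny time step multiplies the noise, which is why the raw derivative of sampled response data is so unreliable without smoothing.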
I don't get the heatmap idea; it sounds like work and potential for introducing error to get something that might be easier to interpret. Are there any trends in the data not already visible? Can it extrapolate overshoot for transitions near the diagonal? Would these transitions follow the right rule for the gradient (linear, exponential, hyperbolic, etc.)?
@@zactron1997 OK, I've had a chance to watch the whole video now. I didn't know if you meant the derivative of the "heat maps", hence why I asked. I also agree with Nice Lake: I don't see the need to interpolate the "heat maps". They're not even really heat maps, just tables with colour added, and I think they're fine as they are. Whilst I don't know what the time derivative is going to tell us about the monitors, assuming that the actual response curves don't deviate too much from the examples given, finding the derivative is relatively straightforward, either as an average of the instantaneous derivatives calculated from neighbouring differences, or as a difference from the cutoffs explained in the video. As for the FFT, I'm not sure what this is going to tell you either. For a perfect square wave we'd expect to see peaks at the fundamental frequency and all its harmonics; however, this is only going to be half of one cycle, so we'd basically see nothing (a quick Python test shows this, even if you have a gradual change of the signal levels). It's certainly not telling us the same thing as the time derivative of the response curve; a quick comparison of units shows us that.
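For reference, that "quick Python test" might look something like the following sketch (the signal lengths are arbitrary choices): a repeating square wave produces discrete harmonic peaks, while a single half-cycle transition smears its energy into a smoothly decaying tail.

```python
import numpy as np

N = 1000
# A repeating square wave (10 full cycles of period 100 samples).
square = np.tile(np.concatenate([np.zeros(50), np.ones(50)]), 10)
# A single transition: half of one cycle, like one pixel response capture.
step = np.concatenate([np.zeros(N // 2), np.ones(N // 2)])

square_mag = np.abs(np.fft.rfft(square))
step_mag = np.abs(np.fft.rfft(step))

# The square wave has sharp peaks at the fundamental (bin 10) and the odd
# harmonics (bins 30, 50, ...), with essentially nothing in between; the
# lone step instead shows a smooth monotonic falloff across the odd bins.
print(square_mag[9:12])                   # small, large, small
print(step_mag[1], step_mag[3], step_mag[5])
```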
I disagree that the chart should be switched to a heatmap, for a couple of reasons. Firstly, presenting it this way shows you exactly what is being measured and what is not being measured. Secondly, removing the raw value and just showing the colour means we as viewers are reliant on Tim's interpretation of what is acceptable, whereas currently we don't have to be. And thirdly, the only way to make the heatplots consistent across every monitor is to infer some relationship between RGB difference and response time. This relationship may not be consistent across different monitors at all, and it may not even be consistent on the same monitor across different refresh rates, RGB differences, etc.
I can't stress enough how insanely amazing this video is. I have been watching you for years and never fully understood these numbers accurately; now I have a very good idea of how monitor response time is measured and read on those charts. Keep it up steeeeeve
I know all this already, but it would be useful to have a short video "showing", as best you can, every visual "symptom" like dark level smearing, overshoot, etc. Obviously put a warning at the beginning that you can only show what the camera and the viewer's monitor can display, but I think slow motion, photos and some real-time videos can visually illustrate a lot, and make it easy for a lot of us to share with others when we are explaining
Fantastic info. I was under the impression you were measuring sections of the screen independently, so that, say, the bottom left was slower than the top right. Glad to be corrected XD
Wow. Just wow. How did I miss this? Why did I not watch this when it must've originally popped in my recommended months ago? This is ... just a lot to take in. With the way my brain works, I'm going to have to watch this at least 2-3 times more for it all to sink in. This is an amazing video. Thank you for publishing it.
Soo, the other day I discovered there once existed a Thinkpad with an integrated color calibration tool. We need something like that these days for our laptop monitors
Waiting for this for so long! Excellent explanation. You make it very easy to understand, and now I can finally understand the data in these calculations at a deep level. Incredible :)
I would like to see you guys do more reviews of OLED smart TVs. I use an LG C1 for my main PC monitor, which I bought largely because of your review. In your charts the C1 was green over pretty much the entire chart, and I absolutely love it. Most smart TV reviews don't really focus on its PC and gaming uses, and with smart TVs becoming more and more popular as monitors, I would definitely like to see more reviews that really focus on the PC use and gaming aspects of them.
Any particular reason the overshoot is given in percent and not absolute numbers? I don't have any hands-on experience, but just thinking about it, a 50% overshoot on a small transition will be far less noticeable than a 50% overshoot on a large transition. As seen in the video, using percentage overshoot will make the transitions close to the diagonal seem pretty bad (mostly yellow-orange colors) on paper when the absolute overshoot isn't any bigger than on the transitions further from the diagonal.
Understandable, and I reckon this is something subjective you could debate. I think a percentage makes more sense because in these kinds of system response analytics you expect performance to be inversely proportional to the step size. If that is not the case, then you may have a monitor which "flashes" before all changes, for example. Neither is right or wrong; what matters is to know what is being measured.
What you perceive on the screen is often dictated by the entire context of the screen. For example, in a mostly dark scene you'll find it easier to perceive small changes in values than in a scene with a larger range of values. So while at times it will be easier to see a certain percentage overshoot for larger transitions, that's not always the case; there are all sorts of weird quirks with human vision that can affect that perception. Remember, your eyes are adaptive. Just imagine a scene that's mostly dark and shadowy, with most content in the RGB 0 to 50 range. If you had 0 to 30 transitions overshooting by 100% to RGB 60, that would be very noticeable in that context: the overshoot would be exceeding the brightest parts of the scene. But in a higher contrast scene you may not notice, say, RGB 200 overshooting to RGB 230, the same 30 RGB value difference (but just 15% overshoot overall). I guess when it comes to these measurements it depends on what assumptions you want to make, and each method has its pros and cons. Absolute values work well when a scene has high contrast, but underplay the effect of overshoot in low contrast scenes (in my opinion). Relative values have the opposite effect. We use relative values for all measurements (e.g. tolerances as well), which keeps it consistent across the board
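The two worked examples above can be put into code. A tiny sketch (`overshoot_metrics` is a hypothetical helper for illustration, not the channel's actual tooling):

```python
def overshoot_metrics(start, end, peak):
    """Return (absolute, relative) overshoot for a pixel transition that
    starts at `start`, targets `end`, and briefly peaks at `peak`."""
    absolute = peak - end                           # overshoot in RGB values
    relative = 100.0 * absolute / abs(end - start)  # % of the transition size
    return absolute, relative

# Small dark transition from the example: 0 -> 30, peaking at 60.
print(overshoot_metrics(0, 30, 60))    # (30, 100.0): 30 RGB, 100 %
# Large transition: 0 -> 200, peaking at 230.
print(overshoot_metrics(0, 200, 230))  # (30, 15.0): same 30 RGB, only 15 %
```

Both transitions overshoot by the same 30 RGB values, but the relative metric rates them very differently, which is exactly the trade-off being debated here.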
@@monitorsunboxed That's why you use gamma corrected response measurements, isn't it? So the same RGB value differential would ideally represent the same perceived light response from a human observer regardless of which part (dark or light) of the light output range the transition is taking place in. Again, ideally. But using percentages in this case is definitely deceptive (from the color scheme perspective as well: green to red).
@@tarsius Gamma correction is mostly about solving the problem of where you set the start and end points for your response time measurement. The total response time is what it is; gamma correction doesn't change that. However, when incorporating tolerances, using linear light output instead of gamma corrected light output creates a mismatch where the measurement start point can be way off the actual start point, but the measurement end point can be very close to the actual end point (perceptually). With gamma correction we get a much more even balance between the start and end points and how they compare to the actual start/end. However, even with gamma correction we're still using tolerances that are relative to the size of the transition, for similar reasons to what was described above, and also because with fixed tolerances, given a small enough transition, you'll eventually measure a response time of 0 (or even negative). Gamma correction doesn't solve the issue where dark transitions are more noticeable in the context of a dark scene. For example, you could argue that the transition time of small transitions is less relevant than large transitions because they are transitioning over a smaller absolute RGB value range and are therefore less noticeable. That's similar to saying that the overshoot percentage of a small transition is less noticeable because the actual absolute value of the overshoot is lower. Gamma correction doesn't correct for either of these perception issues, but it does make what you are measuring more consistent and fair in other ways
@@monitorsunboxed Your response contains some quite contradictory (for me) statements. I'm not an expert in this domain, so you may be right and I may be entirely wrong here. But I'll try to provide the reasoning behind my previous comment and be as brief as possible. So bear with me if you can, and ask if you lose my discourse at some point.
Gamma correction which is used to encode and decode luminance is based on a power law between input and output values. This law was used here based on the assumption that human perception of luminance follows an approximate power function. This way, gamma correction of input values (voltage which is almost linear to the light output in LCD, AFAIK) produces output ‘RGB’ values. And equal steps (differential) in these values roughly correspond to subjectively equal steps in human perceived luminance.
That's why 'absolute numbers' (mentioned by @Alvin853) or 'RGB' values (in other words) as differentials (from-to) should represent the same subjectively perceived changes in luminance in different parts of the luminance range (90 to 105 and 195 to 210, for instance). That's why measuring over/undershoot and tolerances (allowances) in percentages is deceptive (and therefore wrong from this perspective).
P.S. There is a ton of questions even in the fundamental reasoning behind the usage of gamma correction based on a power law here. For example, the difference in perceived brightness depends on the color of the objects observed, the ambient lighting and other conditions (even the mood of the observer, actually). So different models of perception (psychophysics) based on power laws are all rough approximations at best (with lots of factors and their relationships lost in such approximation). But that is another topic for discussion.
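To make the start/end-point mismatch from this thread concrete, here is a sketch using a plain gamma 2.2 power law (an assumed approximation; real panels and measurement pipelines can use other transfer functions):

```python
GAMMA = 2.2  # assumed display gamma; real displays vary

def rgb_to_light(v):
    """Approximate linear light output (0..1) for an 8-bit RGB value."""
    return (v / 255.0) ** GAMMA

def light_to_rgb(light):
    """Inverse: the RGB value that produces a given linear light level."""
    return 255.0 * light ** (1.0 / GAMMA)

# For a full 0 -> 255 transition, a 10% threshold in *linear light*
# corresponds to roughly RGB 90, i.e. the measured "start" point sits
# perceptually far from black:
print(light_to_rgb(0.10))
# A 10% threshold applied to gamma-corrected (RGB) values is RGB 25.5,
# much closer to the perceptual start of the transition.
```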
Thanks, I think it was comprehensive, clear, and certainly very useful for those who didn't know it. Good job! I certainly had forgotten that cumulative deviation was the area difference, calculus style. I'll wait for the color chart explanation, that I know very little about and want to know more.
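That "area difference, calculus style" can be sketched numerically: sample a toy response curve, take the absolute difference from the ideal instant step, and sum it over time (the overshoot wiggle here is invented purely for illustration):

```python
import numpy as np

t = np.linspace(0.0, 20.0, 2001)                 # time in ms
dt = t[1] - t[0]
ideal = np.full_like(t, 255.0)                   # target after an instant 0 -> 255 step
# Hypothetical measured response: exponential rise plus a decaying wiggle.
measured = (255.0 * (1.0 - np.exp(-t / 2.0))
            + 40.0 * np.exp(-t / 3.0) * np.sin(t))

# Cumulative deviation: the area between the measured and ideal curves.
# Units are arbitrary (RGB value * ms), as noted in the video, so the number
# only means something compared against other panels measured the same way.
cd = np.sum(np.abs(measured - ideal)) * dt
print(cd)
```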
Love this. Would love another video that expands on this when it comes to a monitor that has proper multi zone HDR and a monitor that has proper backlight strobing. I remember watching a few of your videos on such monitors exhibiting some weird characteristics and it would be nice to understand why that is the case and what that means for the end user.
Thank you for this explainer. I kind of understood what it depicted but now I have a better grasp. Really appreciate the work you put in to keep consumers informed.
From that explanation, "regular" response time feels pretty pointless and "total response time" feels much more useful, so it would be nice to also have stats on the refresh compliance from total response time (which we can't calculate ourselves because we don't have the heatmap, only the averages)
Not actually pointless: even though the total time for the transition is bigger, the remaining change in values isn't much, so the regular response time represents most of the difference between the RGB values. For example, going from 0 to 255 may take 10 ms, but you can actually go from 10 to 240 in 5 ms, and that extra time is not as perceptible because it doesn't actually represent a big change in the colors, even though it does take its time to happen. Kind of like thinking that transitioning from 0 to 10 is much more perceptible than from 245 to 255. It has to do with how much of the change in value actually represents a change in the color that we are seeing. That's what I think makes sense at least, but I'm not a professional on the topic by any means
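The difference between the two metrics is easy to see with a toy exponential response (time constants invented; real panel responses are messier):

```python
import numpy as np

t = np.linspace(0.0, 30.0, 30001)                # time in ms
response = 255.0 * (1.0 - np.exp(-t / 3.0))      # toy 0 -> 255 rise

def first_time_at(level):
    """Time at which the toy response first reaches `level`."""
    return t[np.argmax(response >= level)]

# "Regular" response time: covering 10% to 90% of the transition.
regular = first_time_at(0.9 * 255.0) - first_time_at(0.1 * 255.0)
# "Total" response time: until the pixel is within 3 RGB of its target.
total = first_time_at(255.0 - 3.0)
print(regular, total)
```

For this exponential the 10-90% figure is around 6.6 ms while the total time is over 13 ms: the last sliver of the transition takes about as long as the bulk of it, which is the nuance the comment above is pointing at.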
I would love to see a more in-depth video about HDR in PC games: how almost all of the screens are 400 nits instead of the usual 1000, and what HDR on Windows is like besides gaming. Also, how does Auto HDR on Windows 10 and 11 actually work, and is it a good thing to turn on? And which is better on a monitor with 400 nits: HDR on, or just good SDR?
Seriously thanks for making this video!! I could make sense of good and bad from charts before but now I know much more about what I'm looking at. Now to go rewatch your reviews for the monitors I've been looking at!
Great job explaining these Tim, it was easy to understand 👍 It would be interesting to see what the monitor testing looks like, to fully understand how much effort you put into these reviews
It's about time, Tim! So many people have no idea how these charts work, but they are used by you and other sites such as RTINGS, and they can be a great help for people wanting to make a purchase, especially on high end monitors. This channel is a godsend for the mainstream YouTube monitor community; I hope more people will be able to watch it and learn new things.
Lol. I would always just wait for him to say words like “impressive” or “great” along with lots more green on the graph. Other than that, it was all Greek to me.
Thank you a lot, Tim, for that video. I've decided to buy my next monitor based on your recommendation, since I started watching the HU channel, despite the fact I didn't understand some of the things you were talking about. This helps me better understand and enjoy this content.
Nice video. I wish the Hardware Unboxed monitor review videos would show the response time and input lag metrics without any type of adaptive sync. When gaming at a competitive level, especially in first-person shooters, most if not all competitive gamers play without adaptive sync enabled, and it would be nice to see those metrics. We look to your videos to see which monitor is best for us to buy, so this would very much help. Just a suggestion, love your content.
Blur Busters did some testing a few years ago and concluded that adaptive sync with an FPS cap 3 FPS below the monitor's refresh rate delivers the lowest average input lag in real-world gaming.
@@groundzero_-lm4md Yeah, I don't think that's the case anymore. Any recent YouTube video you see where input lag is tested shows the lowest input lag when disabling adaptive sync. Optimum Tech made a video on it, True Game Data, and others.
@@jggg31 It's true that running high framerates, even exceeding your monitor's refresh rate, will provide the lowest latency. You just have to make sure that while you do that you don't hit a GPU limitation, as that will cause input latency on the order of tens of milliseconds. Generally, having adaptive sync engaged will add 1-2 ms of processing lag. You be the judge of whether that matters to you.
@@jggg31 There's also a difference between the best experience and the lowest numbers. I personally can't stand screen tearing. Having a frame rate higher than your monitor's refresh rate might lead to lower numbers, but if the frame rate dips below the monitor's refresh rate there will be a sudden change in latency. Adaptive sync allows for more consistent input latency.
Really looking forward to the color performance explainer. If these graphs were flying over my head before this video, then the color charts were going over the entire solar system for me.
I finally (!) fully understand these charts & can make purchasing decisi... ...wait, nope, lost it. I'll still have to listen to your summaries & look at your comparison charts, lol, but maybe I'm a step closer to being less lost next time?
I love this video. It is nice to know as much as possible about how you test the monitors. I would have liked to see how the measurements are taken, just a short B-roll of a monitor being tested. Is it at the same position on the screen for all 10 tests, and things like that?

I would like to see a way of showing the deviation between tests; I guess a good screen would have a small deviation in both response time and overshoot. I would also like to know how much noise there is in the system: how accurate are the measurements in ms and RGB values? Would it make sense to test R, G and B one at a time? You might have a screen where R and G are super fast with no overshoot while B is slow with a lot of overshoot. (It might be more important that G is at the right value fast than, e.g., B.)

I think you should have included the gamma correction talk, just 3-5 minutes in the video. To me this is the only approach that makes sense. But I think maybe it is a bit misleading to use steps of 25 in the RGB values.

I love the cumulative deviation plot; it makes it really hard to "game the system". The value makes no sense on its own, and if you have not integrated the values over time, the numbers cannot be compared when you get new test equipment that measures at a higher frequency.

I would also have liked you to talk about Avg Total Response (time) vs the Refresh Window, because if the Avg Total Response is higher than the Refresh Window, I do not think it is fair to call the screen fast enough for the claimed refresh rate. If the times are equal I think it is OK, but I would argue that the Avg Total Response (time) should be at most half of the Refresh Window, because then the pixel is showing the right value for at least half of the time. I feel like this is important for monitors in an FPS game, where you would like to have a sharp image while moving the screen.

Again, I love this video and I think you should link it in the descriptions of all monitor tests.
Great content! A question on overshoot: how are you measuring overshoot on the 255 values? For example here, the 204 to 255 and 230 to 255 transitions give you a non-zero value for overshoot. How would you measure an overshoot on a pure white value, if 255 is your maximum?
Not Tim, but I can make a guess at what's going on. The sensor used is a light sensor, and it doesn't care about RGB values, signal bit depths, and all that other monitor input stuff. It only cares about light levels. Photons. Trillions (I think) of them. The monitor produces a certain amount of light for a particular RGB input signal, and the amount of light produced is measured by the sensor. Increase the brightness on the monitor and you will see an increase in the light produced for the same RGB values, even for 255. The purpose of monitor calibration is to tune the "machinery" in the monitor to produce the correct amount of output light for a particular RGB input value. So in short, the 0-255 values only exist on the input side of the monitor; the output can really be in whatever nit/lumen/candela (not sure of the unit here) range.
Absolutely awesome. Can you also please make a similar video for the color-related things? I don't really understand what oversaturation, gamma, etc. mean.
Are the response times and other values generally consistent across the display, or is there in some cases a difference depending on physical location, e.g. the middle versus the sides/corners of the display? I suppose if there was, it would be averaged out in the charts, but I'm just curious.
That's great stuff Tim. I'm sure you considered using % over target (absolute value maybe?) for CD values. Just curious why you went with the sum of the values instead of measured/ideal expressed as a percentage. That could make the two right-hand tables more consistent, as the values themselves (as you mentioned) don't have any real meaning in and of themselves, but the deviation as a percentage could be slightly more intuitive/meaningful.
Ideal CD would be zero and you can't divide by zero. I'd actually rather see the opposite approach and change the percentages in the overshoot table into the same "area under the curve" values used in the CD table. As Tim explained, even though the CD values are a bit arbitrary based on the measurement software used, using them for the overshoot table would make the two charts directly comparable and avoid the pitfalls of using percentages.
Please test a CRT. And some early LCDs. Would be pretty interesting to see how far we've come (or haven't)
@@newbietricki239 OLED might actually be faster end to end since it removes the need for a digital to analog converter.
@@groundzero_-lm4md CRT is direct analog
Tim, your content is a rare gem in the tech YouTube scene. Excellent science, done professionally.
Voiceover Tim, Monitor Steve, Moustache Tim... glad to see the team growing!
Was waiting for this video for a while. Thanks, Voiceover Tim!
The derivative wrt time?
Excellent! Now I can't wait for the video about black frame insertion and how incredibly important it is for most games (and how to use it correctly)
Very in-depth and informative video yet easy to understand. Great job once again!
Nice. An explanation of the “I’ll take your word for it chart” 🤣
You can just buy one that works on any screen
Fantastic video. I'm looking forward to future deep dives on sync technologies, I/O standards, display calibration, color spaces, overdrive settings, panel tech, etc. Thanks voiceover Tim!
I've always wanted to go into more detail about this
That was both fascinating and extremely useful information. Love it.
Waiting for this so long ! excellent explanation, You make it very easy to understand and now i can finally understand at such a deep level the data in these calculations.
incredible :)
I would like to see you guys do more reviews on OLED smart tvs. I use LG C1 for my main PC monitor which I bought largely because of your review. In your charts the C1 was green over pretty much the entire chart, and I absolutely love it. Most smart tv reviews don't really focus on it's PC and gaming uses, and with smart TVs become more and more popular as monitors, I would definitely like to see more reviews that really focus on the PC use and gaming aspects of them.
Any particular reason the overshoot is given in percent and not absolute numbers? I don't have any hands-on experience, but just thinking about it, a 50% overshoot on a small transition will be far less noticeable than a 50% overshoot on a large transition.
As seen in the video, using percentage overshoot will make the transitions close to the diagonal seem pretty bad (mostly yellow-orange colors) on paper when the absolute overshoot isn't any bigger than on the transitions further from the diagonal.
Understandable, and I reckon this is something subjective you could debate. I think a percentage makes more sense because in these kinds of system response analytics, you expect performance to be inversely proportional to the step size. If that is not the case, then you may have a monitor which "flashes" before small changes, for example.
Neither is right or wrong; what matters is knowing what is being measured.
What you perceive on the screen is often dictated by the entire context of the screen, so for example in a mostly dark scene you'll find it easier to perceive small changes in values compared to if the scene had a larger range of values. So while at times it will be easier to see a certain percentage overshoot for larger transitions, that's not always the case, there are all sorts of weird quirks with human vision that can affect that perception - remember, your eyes are adaptive.
Just imagine a scene that's mostly dark and shadowy, most content in the RGB 0 to 50 range. If you had 0 to 30 transitions overshooting by 100% to RGB 60, that would be very noticeable in that context. The overshoot would be exceeding the brightest parts of the scene. But in a higher contrast scene you may not notice say RGB 200 overshooting to RGB 230, the same 30 RGB value difference (but just 15% overshoot overall)
I guess when it comes to these measurements it depends on what assumptions you want to make and each method has its pros and cons. Absolute values works well when a scene has high contrast, but underplays the effect of overshoot in low contrast scenes (in my opinion). Relative values has the opposite effect. We use relative values for all measurements (eg tolerances as well) which keeps it consistent across the board
@@monitorsunboxed That's why you use gamma corrected response measurements, isn't it? So the same RGB value differential would ideally represent the same light perception response from a human observer regardless of which part (dark or light) of the light output range the transition is taking place in. Again, ideally. But using percentages in this case is deceptive, and therefore wrong from this perspective (the green-to-red color scheme included).
@@tarsius Gamma correction is mostly about solving the problem of where you set the start and end points for your response time measurement. The total response time is what it is, gamma correction doesn't change that. However when incorporating tolerances, using linear light output instead of gamma corrected light output creates a mismatch where the measurement start point can be way off the actual start point, but the measurement end point can be very close to the actual end point (perceptually). With gamma correction we get a much more even balance between the start and end points and how they compare to the actual start/end
However even with gamma correction we're still using tolerances that are relative to the size of the transition, for similar reasons to what was described above - and also that with fixed tolerances eventually given a small enough transition you'll measure a response time of 0 (or even negative).
Gamma correction doesn't solve the issue where dark transitions are more noticeable in the context of a dark scene. For example you could argue that the transition time of small transitions is less relevant than large transitions because they are transitioning over a smaller absolute RGB value range and therefore less noticeable. That's similar to saying that the overshoot percentage of a small transition is less noticeable because the actual absolute value of the overshoot is lower. Gamma correction doesn't correct for either of these perception issues, but it does make what you are measuring more consistent and fair in other ways
@@monitorsunboxed Your response contains some statements that seem quite contradictory to me.
I’m not an expert in this domain, so you may be right and I may be entirely wrong here.
But I’ll try to provide the reasoning behind my previous comment and be as brief as possible. So bear with me if you can, and ask if you lose my train of thought at some point.
Gamma correction which is used to encode and decode luminance is based on a power law between input and output values. This law was used here based on the assumption that human perception of luminance follows an approximate power function. This way, gamma correction of input values (voltage which is almost linear to the light output in LCD, AFAIK) produces output ‘RGB’ values. And equal steps (differential) in these values roughly correspond to subjectively equal steps in human perceived luminance.
That’s why ‘absolute numbers’ (mentioned by @Alvin853), or ‘RGB’ values (in other words), as differentials (from-to) should represent the same subjectively perceived changes in luminance in different parts of the luminance range (90 to 105 and 195 to 210, for instance). That’s why measuring over/undershoot and tolerances (allowances) in percentages is deceptive (and therefore wrong from this perspective).
P.S. There are a ton of questions even about the fundamental reasons behind the usage of gamma correction based on a power law here. For example, the difference in perceived brightness depends on the color of the objects observed, the ambient lighting and other conditions (even the mood of the observer, actually). So different models of perception (psychophysics) based on power laws are all rough approximations at best (with lots of factors and their relationships lost in such approximation). But that is another topic for discussion.
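The absolute-vs-relative distinction debated in this thread can be sketched in a few lines of Python. This is purely my own illustration (not the channel's actual pipeline), using the example numbers from the comments above and assuming a simple 2.2 power-law gamma:

```python
GAMMA = 2.2

def to_light(rgb):
    """Approximate linear light output (0..1) for an 8-bit RGB value,
    assuming a simple 2.2 power-law display gamma."""
    return (rgb / 255) ** GAMMA

def overshoot(start, end, peak):
    """Return (absolute overshoot in RGB steps, overshoot as a
    percentage of the transition size)."""
    absolute = peak - end
    relative = 100 * absolute / (end - start)
    return absolute, relative

# Dark-scene example from the thread: 0 -> 30, peaking at RGB 60
print(overshoot(0, 30, 60))    # (30, 100.0)
# Bright example: 0 -> 200, peaking at RGB 230
print(overshoot(0, 200, 230))  # (30, 15.0)
# Same 30-step absolute overshoot, very different relative numbers.
# In linear light, the dark peak is a far smaller absolute jump:
print(to_light(60) - to_light(30), to_light(230) - to_light(200))
```

Which of these columns you color a heatmap by is exactly the judgment call the thread is debating: both are computed from the same measurement, they just weight small and large transitions differently.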
Impressive video. Hopefully you will make more, because you are one of the best teachers I have ever seen.
Thank you for this, I am finally ready to fully understand your monitor testing videos!
Now I understand your huge efforts for a single slide presentation. Thank you so much.
02:41 argh!!!! Tim, you had me rubbing that spot on my monitor until a few seconds later when you started moving the mouse doh
Now back to watching all the reviews😂😂
This is exactly what I expected from this channel, appreciate the hard work
Thanks for making this kind of content even if it doesn't rake in views. I found this explanation really helpful.
Thanks, I think it was comprehensive, clear, and certainly very useful for those who didn't know it. Good job! I certainly had forgotten that cumulative deviation was the area difference, calculus style.
I'll wait for the color chart explanation, that I know very little about and want to know more.
Thanks Tim, this was a long time needed one
I just realized how much of a great math teacher you'd be if you were not on the monitor scene Tim!
I mean, he does have an engineering degree (not that that's the be-all and end-all of teaching, but it certainly doesn't hurt)
Thank you... I specifically asked for it in the last video...❤
Love this. Would love another video that expands on this when it comes to a monitor that has proper multi zone HDR and a monitor that has proper backlight strobing. I remember watching a few of your videos on such monitors exhibiting some weird characteristics and it would be nice to understand why that is the case and what that means for the end user.
Excellent video Tim and HUB, thanks for making this public!
That is fantastic Tim! Genuinely impressive. Thank you for going through it in such detail.
This is the level of detail you love to see. Thank you!
Thank you for this explainer. I kind of understood what it depicted but now I have a better grasp. Really appreciate the work you put in to keep consumers informed.
From that explanation, "regular" response time feels pretty pointless and "total response time" feels much more useful, so it would be nice to also have stats on the refresh compliance from total response time (which we can't calculate ourselves because we don't have the heatmap, only the averages)
It's not actually pointless, because even though the actual time of the transition is longer, the change in values isn't as large, so the regular response time represents most of the difference between the RGB values.
For example, going from 0 to 255 may take 10 ms, but the panel can actually get from 10 to 240 in 5 ms, and that extra time isn't as perceptible because it doesn't represent a big change in the colors, even though it does take time to happen. Kinda like how transitioning from 0 to 10 is much more perceptible than from 245 to 255. It has to do with how much of the change in value actually represents a change in the color that we are seeing.
That's what I think makes sense at least, but I'm not a professional on the topic by any means
I have always wondered how you did all this. Thanks for the great walkthrough, it made everything easy to understand.
Best video ever on monitors. Well explained
very informative, thanks! :)
Now, I also see the amount of work you put on testing a new display; again, thanks for that!
This was excellent, thanks so much Tim!
I would love to see a more in-depth video about HDR in PC games, how almost all of the screens are 400 nits instead of the usual 1000, and what HDR is like on Windows besides gaming.
Also, how does Auto HDR on Windows 10 and 11 actually work? And is it a good thing to turn it on?
And also, which is better on a 400 nit monitor: HDR on, or just sticking with SDR?
Thank you for this one, it was very clear and understandable 👍
Great video as always Tim. Next up similar video on how input lag is measured? I feel like we need more information on this.
Wow, this exceeded my expectations. Very interesting methods, cool charts and overall a great video.
Now I am able to understand the reviews, hehe :P
Great video Tim! It definitely has filled in the blanks when watching your monitor reviews =)
yes tim i am interested! thank you for your videos
Monitor Stats 101, thanks Prof. Tim!
Thank you so much for creating this video! I never fully understood and couldn’t find an understandable explanation.
I wish to see what these theoretical numbers look like in the real world, especially the overshoot
Seriously thanks for making this video!! I could make sense of good and bad from charts before but now I know much more about what I'm looking at. Now to go rewatch your reviews for the monitors I've been looking at!
This is a fantastic video, and I'm sure it has helped many people understand your reviews more clearly now.
Appreciate the insanely detailed video Tim! Keep up the awesome work.
Excellent teaching, Tim! I'm feeling very educated after that one.
great job explaining these Tim, was easy to understand 👍
would be interesting to see what the monitor testing looks like, to fully understand how much effort you put into these reviews
it's about time Tim, so many people have no idea how these charts work, but they are used by you and other websites such as Rtings, and they can be a great help for people wanting to make a purchase, especially on high end monitors.
this channel is a godsend for the mainstream YouTube monitor community, hope more people will be able to watch it and learn new things.
Great video!
I would love to see what the monitor looks like in use compared to the charts. The overshoot, slow response times etc.
Very nice reference video for your reviews.
man, imagine doing all that integration by hand, technology is a blessing.
Thank you for uploading this.
I don't know what to say, but you are really making gems of content
Great methodology. I wondered why you test only grayscale transitions and not color ones, but you explained it.
Nice work Tim! Well done. I learned a lot. Thanks for explaining everything.
Thank you for breaking this down it's nice to see what the numbers actually are :3
Yes! Now I can watch your reviews without feeling like an idiot! More please!
Lol. I would always just wait for him to say words like “impressive” or “great” along with lots more green on the graph. Other than that, it was all Greek to me.
You and your team are amazing. You put in so much time and effort.
Thank you a lot, Tim, for that video. Ever since I started watching the HU channel I've decided to buy my next monitor based on your recommendation, despite the fact I didn't understand some of the things you were talking about. This helps me better understand and enjoy this content.
Nice video. I wish the Hardware Unboxed monitor review videos would show the response time and input lag metrics without any type of adaptive sync. When gaming at a competitive level, especially in first person shooters, most if not all competitive gamers play without adaptive sync enabled, and it would be nice to see those metrics. We look to your videos to see which monitor is best for us to buy, so this would help a lot. Just a suggestion, love your content.
Blur Busters did some testing a few years ago and concluded that adaptive sync with a maximum FPS capped 3 FPS below the monitor's refresh rate delivers the lowest average input lag in real world gaming.
@@groundzero_-lm4md Yeah, I don't think that's the case anymore. In any recent YouTube video where input lag is tested, you see the lowest input lag when disabling adaptive sync. Optimum Tech made a video on it, True Game Data, and others.
@@jggg31 It's true that running high framerates, even exceeding your monitor's refresh rate, will provide the lowest latency. You just have to make sure that while you do that, you won't hit a GPU limitation, as that will cause input latency in the order of tens of milliseconds. Generally, having adaptive sync engaged will add 1-2 ms of processing lag. You be the judge of whether that matters to you.
@@jggg31 There's also a difference between the best experience and lowest numbers. I personally can't stand screen tearing.
Having a game framerate higher than your monitor's refresh rate might lead to lower numbers, but if the framerate dips below the monitor's refresh rate there will be a sudden change in latency. Adaptive sync allows for more consistent input latency.
@@groundzero_-lm4md Yeah there's no issue in screen tearing at high refresh rates.
Finally! I've just been guessing green good red bad.
Cheers Tim, very informative and you make it very easy to understand.
Awesome, I was wondering what these actually mean when watching the monitor reviews, thanks for the education
Great, informative video. You could add a heatmap color scale over the bottom tables so viewers can easily see how good or bad a value is.
Very well explained. I enjoyed all of it. I learned a lot!
Really looking forward to the color performance explainer. If these graphs were flying over my head before this video, then the color charts were going over the entire solar system for me.
solid work, brilliant stuff 🙂
As a math major I should be able to translate these charts effortlessly, but the explanation was still very useful. Thank you!! =)
Loving this content and channel. Much love from Indiana, USA.
Thanks for this. It was very informative.
Thank you very much. You explain very clearly.
I finally (!) fully understand these charts & can make purchasing decisi...
...wait,
nope,
lost it.
I'll still have to listen to your summaries & look at your comparison charts, lol, but maybe I'm a step closer to being less lost next time?
I love this video. It is nice to know as much as possible about how you test the monitors.
I would have liked to see how the measurements are taken. Just a short B-roll of a monitor being tested. Is it at the same position on the screen for all 10 tests, and things like that?
I would like to see a way of showing the deviation between tests, I guess a good screen would have a small deviation in both response time and overshoot.
I would also like to know how much noise there is in the system, and how accurate the measurements are in ms and RGB values.
Would it make sense to test the RGB channels one at a time? You might have a screen where R and G are super fast with no overshoot while B is slow with a lot of overshoot. (It might be more important that G reaches the right value fast than, e.g., B)
I think you should have included the gamma correction talk, just 3-5 min in the video. To me this is the only thing that makes sense. But I think maybe it is a bit misleading to use steps of 25 in the RGB values.
I love the cumulative deviation plot, it makes it really hard to "game the system". The value on its own makes no sense, and if you have not integrated the values over time, the numbers cannot be compared when you get new test equipment that measures at a higher frequency.
I would also have liked you to talk about Avg Total Response (time) vs Refresh Window. If Avg Total Response is higher than the Refresh Window, I do not think it is fair to call the screen fast enough for the claimed refresh rate. If the times are equal I think it is OK, but I would argue that the Avg Total Response (time) should be at most half of the Refresh Window, because then the pixel is showing the right value for at least half of the time. I feel like this is important for monitors in FPS games, where you want a sharp image while the camera is moving.
Again, I love this video and I think you should link it in the descriptions of all monitor tests.
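The "area difference, calculus style" idea that several comments mention can be written down in a couple of lines. This is my own sketch with arbitrary units, not the actual tooling, and as noted above the raw number depends on the sampling rate unless each sample is weighted by the sample interval:

```python
def cumulative_deviation(samples, dt, target):
    """Approximate the area between the measured response curve and the
    ideal response (an instant jump to `target`) by summing
    |measured - ideal| * dt over the measurement window."""
    return sum(abs(v - target) * dt for v in samples)

# Slow-but-clean vs fast-but-overshooting responses to a 0 -> 255 step,
# both sampled every 1 ms (made-up curves):
slow  = [0, 128, 200, 240, 255, 255]
spiky = [0, 255, 300, 280, 255, 255]   # shoots past the target
print(cumulative_deviation(slow, 1, 255))   # 452
print(cumulative_deviation(spiky, 1, 255))  # 325
```

Overshoot adds to the area just like a slow rise does, which is why this one number is so hard to game: any departure from the ideal instant transition, in either direction, makes it bigger.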
Great content!
A question on overshoot: how are you measuring overshoot on the 255 values? For example here, the transitions 204 to 255 and 230 to 255 give you a non-zero value for overshoot. How would you measure an overshoot on a pure white value, if 255 is your maximum?
Not Tim, but I can make a guess at what's going on. The sensor used is a light sensor, and it doesn't care about RGB values, signal bit depths, and all that other monitor input stuff. It only cares about light levels. Photons. Trillions (I think) of them. The monitor produces a certain amount of light for a particular RGB input signal, and the amount of light produced is measured by the sensor. Increase the brightness on the monitor and you will see an increase in the light produced for the same RGB values, even for 255. The purpose of monitor calibration is to tune the "machinery" in the monitor to produce the correct amount of output light for a particular RGB input value. So in short, the 0-255 values only exist on the input side of the monitor; the output can really be in whatever nit/lumen/candela (not sure on the unit here) range.
@@ibbles basically saying the sensor has a bigger detection range than the monitor's output range
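A guess at how that could work in practice, continuing the replies above. This is a simplified illustration of the idea (ignoring gamma, and not the channel's actual calibration code):

```python
def light_to_rgb_scale(light, light_at_0, light_at_255):
    """Map a raw photosensor reading onto the 0-255 scale defined by the
    steady-state light levels measured for RGB 0 and RGB 255.
    A transient peak brighter than the steady-state white level simply
    maps to a value above 255, so overshoot past "pure white" is still
    measurable."""
    return 255 * (light - light_at_0) / (light_at_255 - light_at_0)

# Steady-state white reads 1.0 (arbitrary units); a brief overdrive
# spike to 1.1 lands above the top of the RGB scale:
print(light_to_rgb_scale(1.1, 0.0, 1.0))  # 280.5
```

So the 255 ceiling only exists on the signal side; on the light side the sensor just reads whatever the panel briefly emits, which is how a 230-to-255 transition can still register overshoot.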
I'm loving these videos, way more info. I'm so glad you made this channel.
Yoda: "Overshoot and cumulative deviation learn you must."
Me: "It's ok, I'm not confused."
Yoda: "...you will be. You WILL BE."
Hahaha. :D
Yes! Thanks for explaining this!
Brilliant stuff. Thanks, Tim.
Thanks a lot for explanation and videos
Yes, thank you!
Very good video, thank you very much!
Absolutely awesome. Can you also please make a similar video for the color related things? I don't really understand what oversaturation, gamma, etc. mean.
Are the response times and other values generally consistent across the displays, or is there in some cases a difference between their physical location, e.g. the middle or sides/corners of the display? I suppose if there was it would be averaged out in the charts, but I'm just curious.
Man this is so good. Thank you so much.
Next one about color performance :D
That's great stuff, Tim. I'm sure you considered using % over target (absolute value maybe?) for CD values. Just curious why you went with the sum of the values instead of measured/ideal expressed as a percentage. That could make the two right-hand tables more consistent, as the values themselves (as you mentioned) don't have any real meaning in and of themselves, but the deviation as a percentage could be slightly more intuitive/meaningful.
Ideal CD would be zero and you can't divide by zero. I'd actually rather see the opposite approach and change the percentages in the overshoot table into the same "area under the curve" values used in the CD table.
As Tim explained, even though the CD values are a bit arbitrary based on the measurement software used, using them for the overshoot table would make the two charts directly comparable and avoid the pitfalls of using percentages.
Excellent video, thanks for that content!
I have two monitors - a 55" LCD for movies and web surfing, and a 19" CRT with 0ms response time for games.
30K subs vs 900K on HUB, you'll probably kill this channel soon.
Amazing job guys, thanks for the detailed explanation. Any chance of reviewing the Huawei GT 34? Thank you in advance. ♥