Nvidia G-Sync vs AMD FreeSync vs Adaptive Sync in 2024

  • Published 17 May 2024
  • Support us on Patreon: / hardwareunboxed
    Join us on Floatplane: www.floatplane.com/channel/Ha...
    Buy relevant products from Amazon, Newegg and others below:
    AOC 24G2SP - geni.us/wRQh
    Gigabyte G27Q - geni.us/lwecAaK
    LG 27GN800 - geni.us/Tb1e
    MSI G274QPF-QD - geni.us/EyRk8Q
    LG 27GP850 - geni.us/9sTgfY
    MSI G274QPX - geni.us/Kj1mgh
    Gigabyte M27U - geni.us/Lm8H9Al
    Asus ROG Strix XG27AQMR - geni.us/iE8G4t
    Gigabyte M32U - geni.us/dg92bo
    LG 32GR93U - geni.us/mf5EWK
    Dell Alienware AW3423DWF - geni.us/6Pecf
    Samsung Odyssey Neo G7 - geni.us/ztC1Q
    Asus ROG Swift OLED PG27AQDM - geni.us/XPKTTI
    How we test response times: • What Are Response Time...
    Testing performed using Portrait CalMAN Ultimate: www.portrait.com/
    00:00 - Welcome Back to Monitors Unboxed
    00:24 - What is Adaptive Sync?
    03:04 - FreeSync and G-Sync Monitor Compatibility
    05:29 - Nvidia G-Sync Branding Explained
    09:15 - How to Enable G-Sync on Non-G-Sync Branded Monitor
    10:18 - AMD FreeSync Branding Explained
    13:59 - How to Ensure FreeSync is Enabled
    14:28 - Final Thoughts
    Nvidia G-Sync vs AMD FreeSync vs Adaptive Sync in 2023
    Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
    Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
    FOLLOW US IN THESE PLACES FOR UPDATES
    Twitter - / hardwareunboxed
    Facebook - / hardwareunboxed
    Instagram - / hardwareunboxed
    Outro music by David Vonk/DaJaVo

Comments • 1.1K

  • @wyntje83
    @wyntje83 5 months ago +776

    This has to be one of THE most clear and concise explanations of fps vs. Hz I've ever seen, incl. coverage of the downsides of v-sync and why adaptive sync is relevant today. The first three minutes of this video alone should be required watching for anyone trying to understand gaming monitor refresh rate specs.

    • @Tarets
      @Tarets 5 months ago +11

      Because he skipped the whole context of frame buffering, without which it's barely scratching the surface of the topic.
      I'd suggest watching something like "Refresh Rates, V-Sync Settings and Frame Buffers Explained" by SixteenThirtyTwo, which actually explains why each issue occurs, instead of just stating the outcome.

    • @wyntje83
      @wyntje83 5 months ago +27

      thanks, might give it a look, but not everybody needs that level of detail. If you're a consumer trying to inform a purchase decision, you're more likely to care about outcomes than the detailed technical explanation of how they occur. There is a balance to be struck based on your goals and audience.

    • @Tarets
      @Tarets 5 months ago +3

      @@wyntje83 Absolutely. But that's not just theoretical knowledge - I think you do need to understand what's going on to know how to set your frame limits and buffering options to your needs if you're not using variable refresh rate.

    • @najeebshah.
      @najeebshah. 5 months ago +1

      oh please lol, far from the most concise

    • @johnorkutcopero8656
      @johnorkutcopero8656 4 months ago

      @@Tarets may u help me?
      I have an adaptive sync monitor, 165hz.
      What would be the best settings in the nvidia panel and in game for the best experience? My game runs fine 90% of the time, but in some scenarios, even with high fps, the game will lag and stutter all over the place. Maybe some help?

  • @papabepreachin8664
    @papabepreachin8664 5 months ago +931

    Love how Nvidia removed the 1000 nit brightness requirement for "G-Sync Ultimate" qualifications lol.

    • @floorgang420
      @floorgang420 5 months ago +178

      Basically because most OLEDs are rated for True Black 400.

    • @maegnificant
      @maegnificant 5 months ago +132

      Yeah, it's because OLEDs can't make the rating, but they're still a better HDR experience. They do put the module in 600 nit IPS displays too, though.

    • @oktc68
      @oktc68 5 months ago +29

      My QD OLED is rated for both, (and tested by Tim on this channel) 1000 nits is too bright for my gaming environment though. HDR on anything but OLED doesn't really work well across the board, especially with fine bright details like a starry sky. Can't beat light emitting pixels.

    • @maegnificant
      @maegnificant 5 months ago +54

      @@oktc68 it's not "rated for both". It has a 1000 nits mode, but only in a 2% window. That is not enough for an actual VESA rating.

    • @WSS_the_OG
      @WSS_the_OG 5 months ago +67

      HDR is still a shit show overall. Needs a few more years in the oven. Said from experience.

  • @jorismak
    @jorismak 5 months ago +42

    Maybe something to clarify for everyone: in the Nvidia control panel, there is an option to force v-sync on or off, and it has had an 'adaptive' and 'adaptive half refresh' option for _YEARS_ now. That has NOTHING to do with VRR :). It's just an old-school option with a duplicate name; it basically means 'soft v-sync lock'. By setting it to adaptive, you force a frame-paced fps limit at the refresh rate of your monitor, giving perfect frame pacing without turning v-sync on in games (and without the double buffering). But the moment you drop below your refresh rate, v-sync is dropped and you get tearing instead of a big latency hit. It's like consoles used to do, and it is the perfect way to get smooth gameplay at a locked 60 or locked 30 if you have a 60hz non-VRR screen. But if you have VRR, it's useless, and the name 'adaptive' has nothing to do with 'adaptive sync' in monitors :). (It's 'adaptive on or off', so to speak.)
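The old-school "adaptive v-sync" behavior described in the comment above boils down to a simple decision per frame. A minimal Python sketch of the general idea, not Nvidia's actual driver logic; the function and variable names are illustrative:

```python
# Sketch of old-school "adaptive v-sync" (NOT VRR/adaptive sync):
# v-sync is enforced while the game keeps up with the monitor's fixed
# refresh rate, and dropped (allowing tearing) when it cannot.
def present_frame(frame_time_ms: float, refresh_hz: float) -> str:
    refresh_interval_ms = 1000.0 / refresh_hz
    if frame_time_ms <= refresh_interval_ms:
        # GPU keeps up: wait for vblank -> perfect pacing, no tearing
        return "vsync"
    # GPU too slow: present immediately -> tearing, but no big latency hit
    return "immediate"

# At 60 Hz the frame budget is ~16.67 ms:
print(present_frame(12.0, 60))   # fast frame -> "vsync"
print(present_frame(25.0, 60))   # slow frame -> "immediate"
```

This is why the option helps on fixed-refresh screens but is redundant once true VRR is available.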

  • @fernando858CA
    @fernando858CA 5 months ago +40

    Thank you so much for this extensive, deep and yet simple explanation of the topic. If there are other resources like this online I have not stumbled upon them.
    Great work!

  • @marshallb5210
    @marshallb5210 5 months ago +331

    One thing to note is AMD FreeSync Premium Pro monitors have the display's color gamut / luminance values embedded in the EDID so HDR games can calibrate to the display's capabilities a lot better than some stupid brightness sliders ever could, AFAIK this is something NVIDIA could interoperate with instead of letting Dolby reinvent the same thing and charge royalties for every device in the chain....

    • @RAM_845
      @RAM_845 5 months ago +52

      BEST OF ALL? It's cheaper than Ngredia's monitor offerings haha the G-Sync tax

    • @Jas7520
      @Jas7520 5 months ago +48

      Freesync Premium Pro is a completely proprietary extension and isn't a part of the VESA standard, Nvidia couldn't support it even if they wanted to.
      It's not unique in embedding static metadata, that's just part of the base HDR10 spec, and it doesn't embed dynamic metadata like Dolby Vision/HDR10+. It was just a way of specific AMD-partnered games to tonemap directly to the display to improve latency. It's also been functionally abandoned by AMD, with no games added to the initial list of 8 that it launched with.
      Freesync Premium Pro is most notable for breaking HDR on supported displays with AMD GPUs, which is why Tim didn't recommend the AW3423DWF for over a year if you owned an AMD GPU as it would completely break tonemapping. Dell has since fixed this in firmware updates by just using the default HDR pipeline with AMD GPUs instead of the broken and abandoned Freesync Premium HDR pipeline, but it's still an issue on a number of Samsung monitors.
      Your entire comment is made up nonsense.

    • @MrDutch1e
      @MrDutch1e 5 months ago +7

      @@RAM_845 AMD can't exceed 600 nits brightness with freesync enabled on monitors that hit 1000 nits with gsync enabled on nvidia. It's been talked about on this channel. You have to disable freesync to exceed 600 nits.

    • @Xilent1
      @Xilent1 5 months ago +5

      @@MrDutch1e Do you have the name of the video that talks about that? I would like to learn more on that, as I have an AMD GPU alongside FPP, but I'm planning to get a Ti Super early next year to go with my DWF

    • @oldschool_bamagatrader9960
      @oldschool_bamagatrader9960 5 months ago

      can anyone help here... I have an RX 7900 XTX and I play 4K 120Hz. I'm about to buy an LG monitor with FreeSync Premium Pro. Shall I buy it or not? Some advice needed here @@MrDutch1e

  • @knuffle9872
    @knuffle9872 4 months ago +20

    Thanks so much for the clear explanation! Things have changed so much that I had myself in knots trying to understand the current differences in G-sync and freesync. Learned some new things too. Great vid!

  • @mmaakk1978
    @mmaakk1978 5 months ago +6

    Thank you. Very good video. All my doubts are now cleared. I feel better. Cheers Australia ❤

  • @nellybod8742
    @nellybod8742 5 months ago +14

    Nice & informative video Tim. As usual my well rounded PC knowledge tree has never spent much time in monitors. So this sort of content is a godsend!

    • @youtubesuresuckscock
      @youtubesuresuckscock 4 months ago

      It's not informative. Displays with hardware G-Sync modules are objectively better.

  • @Thephysiquemechanic
    @Thephysiquemechanic 5 months ago +4

    Thanks so much for making this video and clarifying this for me. I was up in the air about the Asus ROG Strix 27in LED panel that just recently came out, which you reviewed in your top five, and it didn't say g-sync, so I was worried it wasn't going to work well

  • @greatbudda
    @greatbudda 5 months ago +4

    Thanks for the GSync Compatibility tips. I had no idea I needed to go into control panel to enable the settings for my new monitor 😅

  • @Bioniclema90
    @Bioniclema90 5 months ago +13

    Thank you for this, I wondered for ages wtf adaptive sync is compared to g-sync and freesync and I couldn't find a decent explanation anywhere. I thought adaptive sync was a frame sync tech for hdmi or something lol

  • @davidbanner9001
    @davidbanner9001 5 months ago +3

    Excellent video. Thank you very much. It was very clear regarding the various standards etc.

  • @SomePotato
    @SomePotato 5 months ago +6

    Thank you! Adaptive sync is one of those things that got unnecessarily complicated thanks to marketing.

  • @susegadgaming3123
    @susegadgaming3123 5 months ago +10

    Thank you for making this very educational video.

  • @NerfHerdsman
    @NerfHerdsman 5 months ago +1

    Going to have to edit the title in a dozen days!
    This was great to hear as someone who has paid very little attention since getting my XB270HU, the first all-boxes-ticked 1440p display.

  • @1_2_die2
    @1_2_die2 5 months ago +4

    Thanks for the easy clarification.

  • @MegaFdjk
    @MegaFdjk 5 months ago +3

    Thank you for this. I've been holding off on buying a new monitor for ages because I wasn't sure how any of this worked

  • @MorningThief_
    @MorningThief_ 5 months ago +3

    this is a fantastic video -- thank you.
    i am looking for PC parts for my next build & i was thinking of leaving the monitors last just because i was worried whether i was going with AMD or NVidia...

  • @djpep94
    @djpep94 5 months ago +2

    Thanks for this! Really helpful and easy to understand.

  • @martincoppa6417
    @martincoppa6417 5 months ago +2

    This was extraordinarily informative and succinct. Thanks!

  • @joaomartins4240
    @joaomartins4240 5 months ago +7

    Here in Portugal, sometimes retailers don't even put basic specs in the spec sheet like Refresh Rate or Panel Type let alone Adaptive Sync or VRR. Fortunately we have your channel to save us from ignorance.

    • @maxirunpl
      @maxirunpl 3 days ago

      Those things are crucial to know about when buying a monitor

  • @robertr.1879
    @robertr.1879 5 months ago +32

    In my opinion, Adaptive Sync technology is the greatest addition to gaming in the last ~10 years.

    • @YoDiamonds
      @YoDiamonds 5 months ago

      Agreed brother

    • @OutlawedPoet
      @OutlawedPoet 5 months ago

      It really does work like magic. I can still comfortably play games at 60 FPS even after being so used to higher refresh rates because of adaptive sync. The image just looks smoother and the lack of input lag and tearing even at higher frames is so nice to have.

  • @Madmeerkat55
    @Madmeerkat55 4 months ago +1

    Actually a phenomenal video. Thank you so much for putting this all together!

    • @youtubesuresuckscock
      @youtubesuresuckscock 4 months ago

      It's not phenomenal. It's misinformation. Displays with hardware G-Sync modules are objectively better.

  • @davidbwa
    @davidbwa 5 months ago +1

    Good to know, thanks. I am changing my setup and may be purchasing a monitor in the near future. Also, I finally understand Vsync and Adaptive sync a bit better now.

  • @sheikhtashdeedahmed2740
    @sheikhtashdeedahmed2740 3 months ago +6

    One problem remains though. It isn't a major problem, but it is a problem nonetheless for some of us. While AMD Radeon GPUs have FreeSync functionality over HDMI on some of the 60hz, 75hz and 100hz HDMI-only FreeSync monitors, on such monitors FreeSync can't be enabled over HDMI with Nvidia.

  • @3w3Ch00B
    @3w3Ch00B 5 months ago +6

    I wish I could like this video twice. The first 3 minutes especially were so concise and helpful.

  • @edplat2367
    @edplat2367 5 months ago +2

    Great video Tim. Very easy to understand.

  • @santyagobustamante7369
    @santyagobustamante7369 5 months ago +2

    This video is so important, ty for making it

  • @AnotherAnonymousMan
    @AnotherAnonymousMan 5 months ago +4

    This is a really excellent explanation. Clear and concise. Thanks!

  • @TerraWare
    @TerraWare 5 months ago +46

    Thanks for making this video, because having all this info in one well-made video is hard to find. I have the G-Sync Ultimate OLED Alienware; I mainly got it because of a great deal at the time, and I was concerned whether it would work on my AMD GPUs, but it works just fine. You just need to enable Adaptive Sync in the AMD software and you're good to go.

    • @maxirunpl
      @maxirunpl 3 days ago

      Nice. I have it the other way around: I will buy a monitor with freesync and have an nvidia gpu, because g-sync is too expensive for me and 99% (not really, but a lot) of the monitors use freesync, so I couldn't find any that would fit me.

  • @xhenriquefps
    @xhenriquefps 5 months ago +1

    Wtf how is this video so complete?
    Congrats guys! Insanely useful video 👏👏👏👏

  • @StuShoots
    @StuShoots 5 months ago +1

    So glad you're here to answer my questions without me even asking!!👍

  • @johng4357
    @johng4357 5 months ago +7

    I would appreciate a clear explanation on v-sync options "fast" and "adaptive" and what you should use. Also the effect of low latency mode.
    (Nvidia control panel)

    • @Genny_Woo
      @Genny_Woo 4 months ago +1

      Nvidia Profile Inspector is better; force resizable BAR, V-Sync to fast for less latency

  • @boredomkid8271
    @boredomkid8271 4 months ago +11

    I am lucky to have come across this video. I was having a very hard time buying a monitor because I was afraid I needed a G-sync monitor, since I didn't want to lose out on a feature, but this video helped me realize that what I thought was wrong. In a good way, because now it's much easier for me to pick a monitor, since I know I won't miss out on anything if I don't buy one with G-sync.

    • @sneakyguy4444
      @sneakyguy4444 4 months ago +2

      Same. This should be everywhere

    • @youtubesuresuckscock
      @youtubesuresuckscock 4 months ago

      This video is misinformation. Displays with hardware G-Sync modules are objectively better. FreeSync doesn't handle weird refresh rates with as much stability as a display with a hardware G-Sync module.

    • @maxirunpl
      @maxirunpl 4 days ago

      I would like to buy a monitor with g-sync instead of freesync, but the ratio of g-sync to freesync monitors is like 1 to 80.

    • @boredomkid8271
      @boredomkid8271 3 days ago

      @@maxirunpl I got a 1080p 24in TUF gaming monitor that does have G-sync but I think what I regret is not getting a 2k monitor instead because 1080p is just ok. I'm not sure if it's because of the specific monitor I have or if it's because going from my laptop's 2k display to 1080p made me realize that downgrading the resolution was more of a compromise than I thought it would be.

    • @maxirunpl
      @maxirunpl 3 days ago

      @@boredomkid8271 1080p resolution is great on a 24" monitor, but yeah, once you've tasted 2k on a laptop screen, which is probably smaller than 24", you can't go back.

  • @minimalistedc1193
    @minimalistedc1193 2 months ago +1

    Thank you this answered all the questions I had on the subject

  • @jonpysanczyn1978
    @jonpysanczyn1978 4 months ago +1

    That was great Tim, thanks a lot!

  • @40Sec
    @40Sec 5 months ago +35

    I wish there was more attention given to the variable pixel overdrive of G-sync modules, as that can make a massive difference in ghosting and overshoot reduction for LCD panels, while also making it not a pain in the ass to change overdrive modes every time you switch to something running at a significantly different framerate. It was a shame that, as big of a difference as it made, G-sync modules rarely showed up in monitors and added way too much to the cost for most people to justify.
    I'm honestly glad that we're seeing more OLED gaming monitors hit the market, as there is no need for overdrive settings when each pixel is lit individually.

    • @d.ryan96
      @d.ryan96 5 months ago +1

      G-sync / freesync implementation in OLED panels is a joke. You get strobing and flickering with minimal framerate variations, and it can't be resolved due to OLED technology itself. That's why there are no OLED panels with a true G-Sync module, as it wouldn't help either.

    • @40Sec
      @40Sec 5 months ago +1

      @@d.ryan96 - That's hyperbole and you know it. If you're talking about VRR gamma changes during large framerate changes, that's not just an OLED issue.

    • @d.ryan96
      @d.ryan96 5 months ago +3

      ​​@@40Sec Sorry pal, my old XB272BMIPRZX doesn't have these issues at all, tested it extensively with emulators and other means to push some non standard variable framerate variations. It doesn't strobe even on loading screens.
      After some experience with LG Oled CX tv, Samsung C27RG54FQR and MSI OPTIX G27C4 I can confidently say that non G-Sync module vrr just doesn't work well with my expectations.
      I could even argue that vrr is broken outside of TN panels due to ghosting and input lag of IPS panels when they stray too far away from maximum hz range.

    • @d.ryan96
      @d.ryan96 5 months ago +1

      I might add that I'm an early adopter of G-Sync and mostly follow the rule of capping the framerate slightly below the max Hz of the screen for consistent input lag and perceived motion.
      In more demanding games I tend to cap the framerate to something that's achievable 99% of the time. That's something that totally breaks e.g. a 160hz IPS panel, when you want to cap fps at something closer to 100 fps

    • @40Sec
      @40Sec 5 months ago +1

      @@d.ryan96 - Really not interested in anecdotes when speaking about inherent properties of technologies. My statement about VRR gamma shifts being an issue on both LCD and OLED wasn't an opinion, regardless of whether you've gotten lucky/unlucky with the panels you bought.
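The capping rule discussed in this thread (limit fps slightly below the panel's maximum refresh so frametimes stay inside the VRR window) is often approximated with a small fixed margin. A hedged sketch of that common rule of thumb, not an official Nvidia or AMD recommendation; the 3 fps default margin is an assumption:

```python
def vrr_fps_cap(max_refresh_hz: int, margin: int = 3) -> int:
    # Cap a few fps below max refresh so frametimes stay inside the
    # VRR window and v-sync behavior never kicks in at the top end.
    return max_refresh_hz - margin

print(vrr_fps_cap(165))  # 165 Hz panel -> cap at 162 fps
print(vrr_fps_cap(144))  # 144 Hz panel -> cap at 141 fps
```

For demanding games, the cap can instead be set to a rate the GPU sustains nearly all the time, as described above.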

  • @PLMassTahh
    @PLMassTahh 5 months ago +24

    So, thanks to your recommendation I got myself the 34" aw3423dw. OMG - I am like 2 weeks into this and even looking at my desktop makes me smile. FreeSync all the way (6900XT): no tearing, no input lag, no ghosting, colors from another world, perfect black and jawdropping HDR content. (driver brightness -15, saturation 135)
    ❤ Thank you ❤
    P.S. The only drawback is the pixel refresh every 4h (a 6-8 minute break) - which is supposed to keep the panel healthy.

    • @oktc68
      @oktc68 5 months ago

      You can delay the pixel refresh until standby mode is engaged (at least you can on the DWF model). 3 year warranty; unless you're spending an awful lot of time with static white images I wouldn't worry.

    • @PLMassTahh
      @PLMassTahh 5 months ago

      @@oktc68 no warranty for me - bought myself a cheap returned piece. I am at risk so I'd rather not gamble with endurance just yet ;)
      Once you get used to "recommended" breaks it ain't that bad. I actually turned auto prompt off and just occasionally check if panel health went 🟡 - then I move to side screen for a bit and turn refresh on.

    • @pxwxontop
      @pxwxontop 3 months ago

      You know there's a shortcut for switching between SDR and HDR on Windows? Windows Key + Alt + B. I don't think you need to change any saturation and brightness settings to see HDR.

    • @PLMassTahh
      @PLMassTahh 3 months ago

      @@pxwxontop it's not about HDR, more like personal pref. for oversaturated picture :)
      + AW released a new firmware and color profile for this screen as they were both somewhat flawed. much better but still running mine bit oversaturated by choice ^^

  • @matthewmeldrum2890
    @matthewmeldrum2890 5 months ago +1

    Thank you for a video on this subject. Truly.

  • @thorgen_ironside5279
    @thorgen_ironside5279 5 months ago +1

    Informative as always!

  • @WrexBF
    @WrexBF 5 months ago +18

    Besides Freesync and G-sync, there is another form of VRR called HDMI Forum VRR. It is part of the HDMI 2.1 standard, but it doesn't need HDMI 2.1 to work; there are plenty of HDMI 2.0 monitors that support it. HDMI Forum VRR requires at least an AMD 6000 card or an Nvidia RTX 20/GTX 16 series card. Regarding game consoles, the Xbox Series S/X supports both Freesync and HDMI Forum VRR; the PS5 only supports HDMI Forum VRR. Intel graphics cards only support adaptive sync through DisplayPort.
    Edit: at 9:52, Tim forgot to mention that you need a DisplayPort connection in order to use G-sync on Freesync monitors. "But how come I'm using G-sync through HDMI?" Your monitor supports HDMI Forum VRR and that's what's being used (just like Freesync, HDMI Forum VRR is also labeled as G-SYNC in the Nvidia control panel).

    • @ABaumstumpf
      @ABaumstumpf 5 months ago +1

      But it is HDMI - it would be best if that crap was completely abandoned. Their proprietary bullshit with all the requirements, like having active chips in cables, needing specific hardware for every single part, disallowing other technologies being used at the same time, or their DRM making everything look like crap... nobody should support such scumbags.

    • @mokurai8233
      @mokurai8233 5 months ago +4

      The PS5 also does not feature LFC support, which is a major disappointment for me, as my 4k 120hz LG TV with HDMI 2.1 flickers in games that often drop below the refresh window and back, like FF XVI. My G-Sync Compatible monitor with HDMI 2.1 does not, which implies more thorough VRR testing for its certification.

    • @cc-fz5ne
      @cc-fz5ne 5 months ago +1

      Wait so if I use freesync with a monitor that uses hdmi forum vrr am I even experiencing vrr? Should I turn off freesync? I play on a ps5 with the Dell g2724d. This monitor uses hdmi 2.0 but it supports hdmi 2.1 features such as vrr.

    • @ABaumstumpf
      @ABaumstumpf 5 months ago +1

      ​@@cc-fz5ne Try it. maybe it works better, maybe worse. You never know until you try it. It really depends on how Sony, Dell and the HDMI cable maker implemented their parts (and yes - due to how stupidly anti-consumer HDMI is - cables can block such features).

    • @mokurai8233
      @mokurai8233 5 months ago +1

      @@cc-fz5ne The PS5 does not support freesync (with LFC) but supports adaptive sync through HDMI VRR, which means that if the fps gets lower than the refresh window of the monitor (which would most often be 48hz-120hz) it will revert back to normal v-sync. This can result in stutter or flicker.
      You don't need to turn off Freesync, but you still won't get its benefits on the PS5, just the regular HDMI adaptive sync without the software magic.

  • @ihateevilbill
    @ihateevilbill 5 months ago +6

    I had a 4K Acer Predator gsync monitor that I gave to my son. It allowed a refresh rate between 34 and 60 with no tearing. It was quite nice until you went below 34fps. Now I have an MSI 1440p active sync monitor @ 165Hz and it's so smooth. A great wee monitor. I've not seen a torn frame since installing it, which is nice, as my old 4K gsync would tear as soon as you hit 33fps (which is fair, that's its rating after all). The biggest downside to the new monitor is (as you mentioned in the video) it uses a set of LED strips at the bottom and top of the screen (ie edge lit). They're only visible in the dark, using HDR when the film goes basically dark, but it can be annoying.

    • @tikyreol978
      @tikyreol978 3 months ago

      Everything is "nice [only] until you get below those 34 Hertz", regardless of whether we're talking about the time before these variable refresh rate syncs or ever since...

    • @tikyreol978
      @tikyreol978 3 months ago

      I think I heard about a dual mode that somehow mixes traditional/classical vertical sync and variable sync, which would compensate for framerates outside the supported range by using v-sync there instead. My memory is awful though and I might be remembering wrong just as well. But like our guy says in the video, now there are screens that require no minimum and do the double frame trick to support out-of-range rates.

    • @maxirunpl
      @maxirunpl 3 days ago

      Active sync? What is that? MSI's answer to gsync and freesync?

    • @ihateevilbill
      @ihateevilbill 3 days ago

      @@maxirunpl Nah just another name for adaptive sync my friend :)

  • @e2ki399
    @e2ki399 1 month ago +1

    Wow, very interesting video!
    Super clear explanations of a rather complicated topic!

  • @TheProjectOverload
    @TheProjectOverload 5 months ago +1

    Such a helpful video - Thank you.

  • @Zyanide09
    @Zyanide09 5 months ago +9

    I have yet to see a G-Sync monitor with sharpness settings, which is why I go with Freesync instead. It's nice to have in those games where FXAA or such is forced, or when used with a console where textures might often be blurry because of forced FXAA or other blurry settings.

    • @MaaZeus
      @MaaZeus 5 months ago +1

      Dude, you are better off using your GPU's sharpening features. Much better sharpening algorithms than what is built into most monitors and TVs.

    • @Zyanide09
      @Zyanide09 5 months ago +1

      @@MaaZeus Sure, and that's a semi-new feature, but it won't work for consoles on the same monitor. I'm not going to buy 2-3 different screens for the same gaming area/space.

    • @windowsseven8377
      @windowsseven8377 5 months ago

      @Zyanide09 I'm using one right now. I have an HP Omen 27qt with 8 levels of sharpness control.

    • @Zyanide09
      @Zyanide09 5 months ago +1

      @@windowsseven8377 I cannot find any info on a 27qt model, just the 27q, and it says it's a freesync monitor, so no g-sync module.

    • @hueman1574
      @hueman1574 5 months ago +1

      lg gn800

  • @ABaumstumpf
    @ABaumstumpf 5 months ago +21

    LFC - G-sync simply had that, as it was part of Nvidia's module, so every G-sync monitor supported at least 1-60Hz. FreeSync had no such requirement, and there were monitors with a VRR range as narrow as 48-60Hz (the absolute worst of the worst) - such extreme crap was rare, but LFC itself was also rarely a feature of early FreeSync monitors.

    • @cube2fox
      @cube2fox 5 months ago +1

      That's why you should use FreeSync Premium, as it requires LFC. If I understand correctly, G-Sync Compatible may also not support LFC.
      Moreover, it is unclear whether TVs with HDMI VRR support do support LFC.
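The LFC (low framerate compensation) behavior discussed in this thread can be sketched: when fps drops below the panel's minimum VRR rate, each frame is repeated enough times that the effective refresh lands back inside the VRR window. A simplified illustration of the technique; real driver heuristics are more involved:

```python
import math

def lfc_refresh(fps: float, vrr_min: float, vrr_max: float) -> float:
    # Inside the VRR window: the panel's refresh simply tracks fps.
    if fps >= vrr_min:
        return min(fps, vrr_max)
    # Below the window: show each frame k times so k*fps re-enters the window.
    k = math.ceil(vrr_min / fps)
    return min(k * fps, vrr_max)

# 48-144 Hz panel: 30 fps content is frame-doubled to a 60 Hz refresh,
# and 20 fps is tripled to 60 Hz, so tearing/stutter at the bottom is avoided.
print(lfc_refresh(30, 48, 144))
print(lfc_refresh(20, 48, 144))
```

A panel whose maximum refresh is at least twice its minimum (like 48-144 Hz) always has a valid multiple available, which is why narrow ranges such as 48-60Hz cannot do this trick.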

  • @carywatson1146
    @carywatson1146 5 months ago +1

    Outstanding work! Thank you!

  • @simoncodrington
    @simoncodrington 5 months ago +1

    Great breakdown mate, cheers for the info

  • @Xbooster35
    @Xbooster35 4 months ago +7

    It's a good thing that things like this are finally getting better. A G-sync monitor was unaffordable back then; all of NVIDIA's peripherals were ridiculously expensive. I didn't know that Nvidia also offered G-sync technology to other brands; I knew this about FreeSync, but not G-sync.

    • @maxirunpl
      @maxirunpl 3 days ago

      G-sync monitors are still unaffordable and 2x the price they should be.

  • @Rainquack
    @Rainquack 5 months ago +5

    I'm still a bit scared that the issues with my HP Omen 25 will repeat - when switching it to FreeSync (using a 3060Ti) it absolutely messes up the colour saturation, especially greens, and lowers the brightness to 90.
    When manually upping it again, it sometimes switches back to a non VRR 'custom' mode - so I usually don't use that feature.
    I mainly just noticed that some applications cause it to fizzle out, with the screen going black, which is super annoying, and I'm not sure if that's just cause they're technically 'incompatible'.
    That's what the nVidia Control Panel says, but still lets me activate G-Sync, and the Pendulum Demo looks fine, too.
    (why does that peak to 100% GPU usage tho, revving up the fans?)

    • @user-pg5sz2vn1w
      @user-pg5sz2vn1w 1 month ago +1

      you should have rma'd that monitor.

    • @Rainquack
      @Rainquack 1 month ago

      @@user-pg5sz2vn1w I just assumed it was standard behavior (definitely heard of random flickering/black screens in some apps before - disabling HW acceleration fixes them - and the colour issue is just dumb implementation by HP), and I got it on a good discount.
      It wasn't really new back when I got it, and couldn't check VRR back then cause I was still on my good old 960Ti for some time afterwards.
      It's on 11000+ hours SOT now, and I'm still quite okay with it until an upgrade that truly feels worthy and worth it, and I actually don't really use/miss VRR on a day to day basis.

  • @KRawatXP2003
    @KRawatXP2003 5 months ago +1

    Thanks for making this!

  • @TechTyrial
    @TechTyrial 5 months ago

    Hey man, totally different question but what camera and lens do you use?❤

  • @AndreGreeff
    @AndreGreeff 5 months ago +4

    I've just recently built a new gaming PC, ended up going full Team Red with AMD for both CPU (R5 7600X) and GPU (RX 7800 XT).. my monitor choice was based more on price than it was on features, and I ended up with the LG 34GP950G, which is branded with "G-Sync Ultimate"...
    here's the clincher: I have serious stuttering issues with virtually all full-screen videos when I enable "Adaptive Sync" in my AMD driver software... I can't say I've noticed issues with local media files (not many to test with), but in YouTube, Twitch, and Netflix, severe stuttering starts up mere seconds after entering full screen, sometimes getting noticeably worse in stages every few seconds thereafter. this stutter goes away instantly when I leave full screen. I must point out that this issue simply does not happen when I have this setting turned off.
    furthermore, enabling the "refresh rate overclock" on my monitor (allowing up to 180Hz) or leaving that off ("only" up to 144Hz) does not seem to make any difference, it still happens with Adaptive Sync enabled. FWIW, I have heard the current beta drivers (23.30.something..? I think?) do apparently fix this, though I haven't tried them myself as yet.. I'm currently still on 23.20.01.10.
    so with regards to the theory behind the implementation of VRR technology, this was indeed a very interesting and informative video, but it definitely doesn't paint the full picture.... at least not smoothly (sorry, couldn't resist that one).
    regardless, I just jumped from FHD at 60Hz to UWQHD at 180Hz, and it's blowing my mind... I don't think I could ever go back after this. (:

    • @09RHYS
      @09RHYS 5 months ago +2

      Just a theory, but in my case I have an LG CX OLED, and one of the first things I noticed when watching content via the built-in apps for Netflix, YouTube etc. is that I have terrible stutter on panning shots. After going down the rabbit hole, I found out that because the response time is so instantaneous on OLED panels, anything that doesn't reach its native 120Hz has a really noticeable slideshow effect, with 24Hz, 30Hz and 60Hz content showing a direct presentation of the smoothness of the source. My only saving grace is the on-board motion interpolation, but the artifacts can be pretty bad at times, so it has become a pet peeve for me.
      When reading your comment I had the idea that when you used adaptive sync, it must have lowered the monitor's refresh to match the fps of the video, which in turn produced the stuttering: your panel's instant response time doesn't mask those frame changes like a traditional LCD does. In my research I learned that because an LCD's pixels take quite a bit longer to turn on or off, motion and panning look a lot smoother, as low refresh rates are basically less perceivable than on OLED. Possibly the fact that your monitor displays a fixed refresh rate when adaptive sync is turned off helps it display smoother too?
      I may be wrong though, as I don't use dedicated PC monitors. G-Sync works with the LG CX and my 4090, and I have tested it with it on and off in content, but it still seems pretty unsmooth to me in both scenarios. Joys of OLED!

    • @The_Noticer.
      @The_Noticer. 5 months ago

      Turn off hardware acceleration. I know it's stupid but it works.

    • @AndreGreeff
      @AndreGreeff 5 months ago +1

      @@09RHYS I've never used any super high refresh rate OLEDs before, so that is very interesting indeed.. the weird thing is that I would expect my monitor refresh rate to drop to either 60Hz or 30Hz in the case of YouTube vids in full screen, but not only does it feel slower than that, it also feels inconsistent, like it's constantly "hunting" for the right refresh rate to use. Really annoying..

    • @AndreGreeff
      @AndreGreeff 5 months ago +1

      @@The_Noticer. I'm familiar with that issue, but unfortunately that is very different. I've tried that, and tested with numerous browsers (Chrome, Edge, Brave, and Firefox). this is specifically VRR causing havoc.

    • @09RHYS
      @09RHYS 5 months ago

      ​@@AndreGreeff If I'm not mistaken, I believe non-1080p60 streams from YT are at 24Hz? Maybe with adaptive sync turned on it could be having a weird mismatch between the G-Sync Ultimate module and your AMD card, possibly overcompensating the sync and not enabling the 1-180Hz VRR range properly?
      It does sound pretty annoying after paying a good chunk of money for it all and then having to manually switch when you game or play media.
      It could also well be the browser, or heck, even the stream causing it: maybe it isn't a constant fps but a variable one, making the stream feel unsmooth when adaptive sync is turned on, seeming faster or slower than it is meant to be. I know I have issues with any variable-fps media I play using the built-in media player on the CX via DLNA casting from my phone; because it isn't a fixed fps, I get really bad motion handling with the aforementioned interpolation, as well as with any 29Hz/25Hz content, which presents as a tiny stutter every 5 seconds and is jarring when things get very fast in motion or panning shots.
      Honestly, monitors and TVs will never cease to baffle me when it comes to source material fps rates and the handling of them! haha
      Also, for some reason I read your first paragraph thinking you were referring to owning the new Alienware OLED! Sorry, it was a little late and I think the person above your comment at the time said they had one 😅 But I guess it's still relevant to the previous theory, as your panel still has super low response times with a Nano IPS. One last thing: have you tried disabling any of the monitor's own features when using Adaptive Sync? That may be worth a try to see if one of them is causing the issue.
      Anyway, I hope you have some luck figuring it out; maybe the beta drivers will be the best solution until an official driver comes along, if all else fails! 👍🏻

  • @VAULT-TEC_INC.
    @VAULT-TEC_INC. 5 months ago +1

    This was extremely helpful!

  • @sidrolf
    @sidrolf 5 months ago +3

    God bless you! Very comprehensive. These companies are shooting themselves in the foot by making this so confusing!

  • @charno_zhyem
    @charno_zhyem 5 months ago +4

    I just want to know when will the VRR flicker be fixed. My Samsung Q80T has intense flickering below 80fps and after spending countless hours online searching for a fix the conclusion is this is still unsolved. It's what is keeping me from switching to OLED, I read the issue is more present there due to deeper blacks and higher contrast.

    • @alexg9155
      @alexg9155 5 months ago

      afaik flickering is only a problem for VA panels. OLEDs and IPS exhibit no flickering, correct me if I'm wrong though.

    • @kobusdowney5291
      @kobusdowney5291 5 months ago

      Of the two adaptive sync monitors I've owned, neither has flickering. The AOC 24G2 is a FreeSync Premium product, no flickering. The MSI G274QPX I bought recently is G-Sync Compatible, AKA generic adaptive sync. No issues there.

    • @charno_zhyem
      @charno_zhyem 5 months ago +1

      Maybe it's more of an issue with VRR TVs?

    • @alexg9155
      @alexg9155 5 months ago

      @@charno_zhyem Given most of them are VA (my Q70R is) I would say yes, although mine does not flicker at all so who knows.

    • @rossmarbow-combatflightsim5922
      @rossmarbow-combatflightsim5922 4 months ago

      Never, it's just mostly useless tech

  • @ShaoZapomnit
    @ShaoZapomnit 5 months ago +13

    To clear up common misunderstandings of V-Sync:
    * It's possible to play without input lag; it doesn't add any until the framerate exceeds the refresh rate or pre-rendered frames are not properly set. Input lag is highly influenced by frame time, which means whatever sync technology you're using, you'll be bottlenecked by your monitor's refresh if you want no tearing.
    * Duplicated frames have no effect on input; they're only a visual artifact, and the higher the refresh, the less noticeable. This is completely alleviated with Free/G-Sync.
    * V-Sync only works in fullscreen.
    Finally, V-Sync can actually help adaptive sync technologies by providing full frames; Nvidia users just need to know that Low Latency Mode needs to be set to Ultra for that. It used to be called Max Pre-Rendered Frames, and the default of 3 is pretty bad; setting it to Ultra makes it 1, which it should always be tbh.
    Say for example you have 144Hz: capping with RTSS to 120 fps in fullscreen with V-Sync plus Free/G-Sync will yield extremely good visuals with no compromises, as long as your hardware can keep up.
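The capping advice in the comment above can be sanity-checked with a little frame-time arithmetic; this is an illustrative sketch only (the helper names are made up, not from the video or any tool):

```python
def frame_time_ms(fps: float) -> float:
    """Per-frame time budget in milliseconds at a given rate."""
    return 1000.0 / fps

def vrr_fps_cap(refresh_hz: float, margin_hz: float = 3.0) -> float:
    """Suggest an fps cap a few Hz under the panel's maximum refresh,
    so frames always arrive slower than the panel's fastest pace and
    VRR stays engaged instead of hitting the V-Sync ceiling."""
    return refresh_hz - margin_hz

# A 144 Hz panel refreshes every ~6.94 ms; a 120 fps cap (~8.33 ms
# per frame) comfortably keeps frame delivery inside the VRR window.
print(round(frame_time_ms(144), 2))  # 6.94
print(round(frame_time_ms(120), 2))  # 8.33
```

The exact margin is a matter of taste; limiters like RTSS just need the cap to sit below the display's top refresh rate.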

  • @brianmiller1077
    @brianmiller1077 5 months ago +1

    Thanks for this video!!

  • @blender_wiki
    @blender_wiki 4 months ago

    You are one of the few tech YouTubers who doesn't make misleading videos and talk about subjects they don't know.
    Thanks for that.

  • @RocketHedgehog
    @RocketHedgehog 5 months ago +3

    In my experience, even with a new FreeSync-labeled monitor you can have problems with G-Sync enabled. One example here is the Dell G3223Q (real garbage TBH). I'd recommend not ignoring the G-Sync Compatible label.

    • @LockedOutBy2FA
      @LockedOutBy2FA a month ago

      So what is stated in the video (that all current FreeSync monitors are compatible with NVIDIA G-SYNC) isn't 100% true.
      Good to know!

  • @Hadgerz
    @Hadgerz 5 months ago +8

    Was loving G-Sync until I upgraded to a 45" LG OLED. Now it's just constant brightness flicker until I turn adaptive sync off. Gonna have to make do with Fast Sync and Low Latency Mode.

    • @j3m638
      @j3m638 5 months ago +1

      Look out for firmware updates for that monitor which may fix it. Sometimes it's either that, a less than premium cable, using HDMI instead of DP or turning the feature on then fully restarting the monitor. Because it shouldn't do it.

    • @DJHEADPHONENINJA
      @DJHEADPHONENINJA 5 months ago +6

      @@j3m638 No, all OLEDs do this with VRR, as the gamma shifts through different refresh rates. I got it on all my LG C-series and the Alienware QD-OLED. Nothing you can fix other than turning off VRR/G-Sync.

    • @Hadgerz
      @Hadgerz 5 months ago +1

      @@j3m638 Latest 45GR95QE software 3.09 installed, using the DP1.4 cable it came with. I even tried the one my previous 34GN850 came with. I could restart, turn the monitor off, do this every day and it won't change.
      Love the monitor, but while everyone is touting OLED as 'the way forward', n o b o d y is mentioning that having VRR (and especially a costly, power-hungry G-Sync module) in an OLED panel is completely pointless when, the second you launch any game, half of the frames show up, entirely at random, at a different gamma.

    • @opbush5272
      @opbush5272 5 months ago +2

      Same here, might drop VRR as well.

    • @rossmarbow-combatflightsim5922
      @rossmarbow-combatflightsim5922 4 months ago

      It's not just an OLED issue, VA and IPS have the same issues; it's just more obvious with OLEDs @@DJHEADPHONENINJA

  • @SingleRacerSVR
    @SingleRacerSVR 5 months ago +1

    Thank you for this very helpful, easy-to-understand video. But I have a question for someone with the knowledge to answer, please. At exactly 3:59 there is a shot of an LG monitor showing "Adaptive Sync" as simply either ON or OFF. But my newly purchased (older model) LG monitor has a third option: BASIC (ON) and EXTENDED. I'm reluctant to select Extended after watching nearly all of Monitors Unboxed's videos explaining that the Fast response time option can be the best for input lag etc., but Fastest can often introduce unwanted motion blur. So I was worried that the EXTENDED Adaptive Sync option was a similar thing, where you might introduce some negatives along with any positives it might give you. Does anyone with an LG monitor know if EXTENDED Adaptive Sync should be an automatic selection?

  • @russbetts1467
    @russbetts1467 4 months ago +2

    Thanks for this video. I'm in the process of looking for a wide-screen monitor to replace my current 5:4 ratio LG Flatron 19 inch monitor, as it doesn't perform well with 4K videos. I'm limited by space to 24 inch screens and have several in mind, but didn't understand the terminology of the types and whether they would be compatible with my AMD Ryzen 5 5600G based PC. I'm not a Gamer, so high frame-rate is not a problem to me, as long as it's better than 60 Hz. I'm looking at 75 Hz, or better, with 4K capability. Your video has now clarified the situation and I shall buy whichever monitor my local Computer Warehouse can supply at a reasonable price.

  • @solomonshv
    @solomonshv 5 months ago +5

    I went through THREE different 7900 XTXs, at least 7 Windows reinstalls, and 4 FreeSync Premium monitors (including a $1000 Alienware monitor), and I couldn't get rid of the issue of my monitor randomly flashing a black screen. No matter what I tried, it kept happening. In the end I bought an MSI 4090 Suprim and it solved everything right away. I didn't even have to do a clean Windows install or remove the AMD drivers from my previous GPU. I'm currently using the same Alienware QD-OLED FreeSync monitor that I was having black-screen flickering on with 2 different AMD cards, but I'm not having the issue with my Nvidia card.
    The amount of time and money I wasted troubleshooting, packing/shipping/returning/driving/RMAing to get rid of this issue was not worth the "savings" of buying an AMD card over Nvidia. I don't really care what the prices are; next time I upgrade I'm buying only Nvidia. The time and stress I won't have to put in is worth the extra $600 that the RTX 4090 cost me.

    • @edzymods
      @edzymods 5 months ago

      Bullshit.

    • @fatidicusaeternus6498
      @fatidicusaeternus6498 5 months ago

      That's crazy. Never had a problem on my m32q on a 3080 or my current 7900xtx.

  • @The_Noticer.
    @The_Noticer. 5 months ago +7

    I do think that VRR struggles when the frame-to-frame variance becomes too high, though. Might be worth looking into.
    Additionally, you still need some tricks (a RivaTuner frame limit, Radeon Chill, or Nvidia Reflex) to keep the framerate below the top of the VRR range in high-fps scenarios.

    • @666Azmodan666
      @666Azmodan666 5 months ago +2

      Playing below the VRR range makes no sense (the floor is probably 48 fps most often), and you won't generate more frames; practically every game now has frame limiters in its settings...

    • @rossmarbow-combatflightsim5922
      @rossmarbow-combatflightsim5922 4 months ago

      Flickering is such a big issue and no one talks about it enough

    • @The_Noticer.
      @The_Noticer. 4 months ago

      @@rossmarbow-combatflightsim5922 Yeah I know; having a VA panel myself is why I use Radeon Chill, because it gives the best frame pacing.

    • @rossmarbow-combatflightsim5922
      @rossmarbow-combatflightsim5922 4 months ago

      I have OLED so I just turn VRR off.
      The only thing I turn on is the in-game frame cap.
      VRR relies on games having level performance, which most don't.
      And if a game has level performance then it's usually good performance @@The_Noticer.

  • @Awakened2001
    @Awakened2001 4 months ago

    Thank you. I haven't looked into this tech in some time and still thought they were mostly exclusive.

  • @bernl178
    @bernl178 18 days ago

    Very well done mate. Thank you for the insight.

  • @andreycamper5863
    @andreycamper5863 5 months ago +3

    Is VRR bad for competitive games? I mean, is it better to see a half frame as soon as possible to spot an enemy? Yes, you have the split between frames, but it's the fastest method.

    • @hamzashaikh9310
      @hamzashaikh9310 5 months ago

      Subjective. Try both and see what you prefer.

    • @riba2233
      @riba2233 5 months ago +1

      No, it's better to have a smooth and consistent experience

  • @TechGamesAU
    @TechGamesAU 5 months ago +10

    The problem with VRR is OLED flicker. Would love to see you make a video about this. It straight up ruined my Alan Wake experience on my LG C1.

    • @TheMaximusPride
      @TheMaximusPride 5 months ago +3

      My VA LG monitor with FreeSync flickers as well, so it's not only OLEDs)

    • @TheMaximusPride
      @TheMaximusPride 5 months ago +1

      Sad that the problem is very rarely discussed and you have no clue which panels are affected. (In my case it happened when fps dropped below the VRR range; maybe the "low framerate compensation" feature fixes it?)

    • @enricod.7198
      @enricod.7198 5 months ago

      In some cases, flickering can be caused by any type of performance overlay (like rivatuner overlay), so you should try disabling it completely and retry. Might do nothing, but it's worth a shot.

    • @Kryptic1046
      @Kryptic1046 5 months ago

      @@TheMaximusPride - You're right. I have flicker with GSync on my TCL in some games (but not others) with my Nvidia GPU, while the Freesync Ultimate in my Samsung panel is literally perfect in almost every game with no flicker at all on my AMD GPU. Both of these displays are VA. At least with my two particular panels, Freesync works considerably better than GSync, but it really seems to be on a panel-by-panel basis how well VRR works for a given person. In some displays it's flawless (my Samsung), in some it has varied results (my TCL) and in some it's just plain awful and doesn't work right.
      I wish the industry would get a better handle on VRR because it seems like right now there's too wide of a variance of VRR performance depending on the panel type, manufacturer and processing in the display itself, and when you do get flickering, it ruins the game.

    • @cbz21
      @cbz21 5 months ago

      I don't get flicker on my C2 in Alan Wake with my rx 6800 but I do get flicker sometimes on my Dell S2721DGF in select games at certain refresh rates.

  • @user-pe3rx8ww6b
    @user-pe3rx8ww6b 4 months ago

    This guide was so helpful. I hadn't even enabled FreeSync and I thought it was working...

  • @RFC3514
    @RFC3514 5 months ago

    I work in film and video post-production and often need to mix different formats, so I'm aware of these issues (plus interlacing, don't get me started on interlacing). I often _avoid_ watching videos on this subject, because even when YouTubers actually have a clue about the technical details (and let's be honest, a lot of them don't), they tend to be sloppy with language in a way that misleads viewers that don't _already_ understand these issues, and my forehead gets sore from all the facepalming.
    Anyway, long story short: congratulations (and thank you) for wording everything absolutely perfectly. 🤓👍
    I would have added a little graphic showing a strip of "frames being rendered" and "frames being shown", with arrows pointing from one to the other (to illustrate the discontinuity caused by mismatched frame and refresh rates), but I think the description and slow motion were clear enough to get the point across.
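The "frames being rendered" vs "frames being shown" mapping suggested above can also be sketched in a few lines of code; this is a hypothetical simulation for illustration, not anything from the video:

```python
def frames_shown(render_fps: int, refresh_hz: int, seconds: float = 0.1):
    """For each fixed display refresh, show the most recent fully
    rendered frame. With mismatched rates some frames repeat (judder)
    and, at higher render rates, some are skipped entirely."""
    shown = []
    for tick in range(int(refresh_hz * seconds)):
        t = tick / refresh_hz              # wall-clock time of this refresh
        shown.append(int(t * render_fps))  # index of latest finished frame
    return shown

# 50 fps content on a 60 Hz panel: frame 0 is displayed twice, so one
# refresh in every six repeats a frame - the discontinuity VRR removes
# by refreshing only when a new frame is actually ready.
print(frames_shown(50, 60))   # [0, 0, 1, 2, 3, 4]
print(frames_shown(60, 60))   # [0, 1, 2, 3, 4, 5]
```

With render_fps above refresh_hz the same function shows frames being skipped instead, which is the other half of the mismatch problem.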

  • @TheZoenGaming
    @TheZoenGaming 5 months ago +13

    My monitor is from the early days of G-Sync Ultimate back in 2018, before they quietly lowered the standards for the qualification in 2021. As such, it has an older G-Sync module in it, and I've tested my monitor and found it doesn't work well with AMD cards. I bought it for the incredible HDR performance and color accuracy, and I honestly don't regret buying it, even if it only hits those values at or below 98Hz. I usually frame cap all my games to 90Hz, anyways.

    • @johnboylan3832
      @johnboylan3832 5 months ago

      Which monitor is that?

    • @TheZoenGaming
      @TheZoenGaming 5 months ago

      @@johnboylan3832 The ASUS ROG Swift PG27UQ.

    • @Quast
      @Quast 5 months ago

      Yay, nothing is more fun than when tech companies undermine their own standards >.<

    • @TheZoenGaming
      @TheZoenGaming 5 months ago

      @@Quast LOL Right? According to Nvidia, the qualification would have been impossible for OLED monitors to meet since it required 1000 nits peak brightness as a minimum, so they changed it to "lifelike HDR". I think they could have just added a separate qualifier for OLEDs.

    • @mikkelkirketerp4884
      @mikkelkirketerp4884 5 months ago +1

      ​@MasterZoentrobe Or just state that for OLEDs it's only 1000 nits at a max 2% window or something, and let's say 300 nits at 100% window size.
      In my experience, an OLED with a max of 300 nits but peaks of 900 at up to 5 or 10% window size far beats any other panel type.
      The ability to have 900 nits right next to a 100% black pixel is just something else.

  • @detmer87
    @detmer87 5 months ago +3

    Nvidia G-Sync = dirty marketing

  • @Vlican
    @Vlican 5 months ago

    Fantastic explanation of the standards in place nowadays. I'm glad I don't really need to worry about this anymore when shopping for a new monitor

  • @UNSINKABLEII
    @UNSINKABLEII 5 days ago

    Thank you! That clears this up immensely.

  • @evalangley3985
    @evalangley3985 5 months ago +15

    No mention that AMD introduced FreeSync as an open standard and forced Nvidia to abandon their strategy? Ah yeah, good old Tim not telling the whole history and writing this off as marketing.

  • @TheFawz
    @TheFawz 5 months ago +4

    One of the Nvidia G-Sync module's biggest downsides is the lack of support for HDMI 2.1 and DisplayPort 2.0+.
    Didn't early modules also have a heat issue, requiring fans that generate noise in some cases? Recent implementations don't seem to have that issue.
    Also, the G-Sync Ultimate refresh rate range used to be wider, but that's likely part of a requirement that changed.

    • @SOG989
      @SOG989 5 months ago

      Wait, is this Asus monitor (VG28UQL1A) a lie? I remember seeing it has HDMI 2.1 and both G-Sync and FreeSync.

    • @TheFawz
      @TheFawz 5 months ago +1

      @@SOG989 It'll be G-Sync Compatible, not regular or Ultimate, as it won't have a G-Sync module

  • @SirRFI
    @SirRFI 5 months ago

    Looking forward to more videos like this.

  • @GamingisinmyDNA
    @GamingisinmyDNA 5 months ago +1

    Thanks for this video, and love from India; I was looking for a video like this since 2019

  • @Maras666
    @Maras666 5 months ago +36

    It's important to mention that adaptive sync is so prevalent right now because AMD decided to make their VRR implementation part of the VESA display standard; that is why it's called FREEsync, as opposed to Nvidia's implementation.

    • @ABaumstumpf
      @ABaumstumpf 5 months ago +4

      " right now because AMD decided to make their VRR implementation a part of VESA standard for displays"
      No it is the other way around. Vesa VRR was based upon nvidias contributions and then AMD marketed their proprietary implementation as freesync.

    • @valhallasashes4354
      @valhallasashes4354 5 months ago +24

      @@ABaumstumpf No, you're the one who has that backwards. It was AMD that submitted Freesync to VESA, which later became what we now know as VESA VRR on TVs. It was even part of AMD's original presentation. Nvidia announced G-Sync first. But it required a compatible card, licensing and a dedicated expensive module in the monitor. "Freesync" was literally coined as a direct result of how expensive and proprietary G-Sync was. AMD then announced and demonstrated Freesync shortly after and then specifically announced during that same presentation that they had submitted Freesync to VESA for inclusion as an open standard so everyone, including TVs could use VRR.
      Look, I have an Nvidia card too, but don't be a fanboy. Nvidia doesn't "contribute".... ever. They develop and lock down everything as proprietary for as long as they can get away with it. The only things they contribute to are the things they don't really have much choice on. It's mostly AMD that contributes to the industry with open standards anyone can adopt. Tessellation, Mantle (now known as DX12 and Vulkan), Freesync/VRR, etc. I could make a huge list but I don't think I need to, because as soon as you think "what features did Nvidia contribute 'openly'?", you basically stop. PhysX? Nope, locked, never opened and dead now. G-Sync? Nope, it was dedicated hardware, proprietary, and it took years of losing to AMD's Freesync before they had no choice but to open it (even after they were already using Freesync on laptops but still calling it G-Sync, while continuing to block their desktop GPUs because "money"). You remember when AMD developed Tessellation and invited Nvidia to use it? Nvidia refused, basically saying it had little value; then Microsoft included it in DirectX 11, and it turned out Nvidia's hardware was better at Tessellation than AMD's. You remember when AMD developed Mantle and literally invited Nvidia to join the endeavor? Now look at RT Cores and Tensor cores for ray tracing and DLSS.
      Nvidia does NOT "contribute"... Ever. They exploit. Unless they're forced to contribute. Nvidia does NOT play nice nor share... Ever. Nvidia's greed will never allow them to share nor contribute. And Nvidia's pride will never allow them to work with the competition even when invited and even when said endeavor would directly benefit Nvidia with an advancement that would benefit the industry as a whole (ie Mantle).
      Nvidia does make good products, but don't ever delude yourself into thinking Nvidia is a good company.

    • @edgaras1103
      @edgaras1103 5 months ago

      @@valhallasashes4354 Nvidia is not a good company; it's a for-profit multi-billion corp. So is AMD. Let's not kid ourselves. AMD does what it does because they are not the market leader, because open-source technologies are good for PR and have a bigger chance of getting adopted. You can't demand anything when you have less than 20% market share.

    • @Jas7520
      @Jas7520 5 months ago +9

      Totally incorrect, VESA Adaptive Sync predates GSync and was not developed by AMD at all.
      It was used in embedded displayport as a power saving feature, gsync was introduced as a way of getting VRR on external displays specifically for gaming. Until GSync was released, no-one considered using VRR for its gaming benefits, only for its power benefits. A year after the release of GSync, VESA Adaptive Sync was ported to the external displayport standard and was relabeled as Displayport Adaptive Sync. Literally straight out of the original press release:
      "Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA’s embedded DisplayPort (eDP™) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync."
      Adaptive sync is not "AMD's standard", as much as they would like to claim it.

    • @valhallasashes4354
      @valhallasashes4354 5 months ago

      @@Jas7520 Pulled from AMD's own FAQ page about FreeSync:
      What is DisplayPort Adaptive-Sync and how does it differ from AMD FreeSync™ technology?
      DisplayPort Adaptive-Sync is a new addition to the DisplayPort 1.2a specification, ported from the embedded DisplayPort v1.0 specification through a proposal to the VESA group by AMD. DisplayPort Adaptive-Sync is an ingredient feature of a DisplayPort link and an industry standard that enables real-time adjustment of display refresh rates required by technologies like AMD FreeSync™ technology. AMD FreeSync™ technology is a unique AMD hardware/software solution that utilizes DisplayPort Adaptive-Sync protocols to enable user-facing benefits: smooth, tear-free, low-latency gameplay and video. ​
      Bottom line. Everybody is claiming credit and I am not about to get into a he said, she said argument. And I sure as hell am not going to get into a debate about derivatives of technologies in their earliest "forms" regardless of what their capabilities may or may not have been because I have no way of knowing what the full scope of those capabilities were. And just because it "existed" in "some" form, doesn't mean it was "used" in any "meaningful" form. That's like a child screaming "I knew the answer the whole time" even though they never used that answer in the way that mattered to the actual event in question. All I can tell you is what I saw when I was there. The events I listed in my post are what I saw. Both as things were announced, when they were announced and how events developed and transpired over the years as they happened.
      It doesn't matter who made the propeller first. What matters is who took the propeller off the mobile and put it on the plane first.

  • @VerlithTwo
    @VerlithTwo 5 months ago +7

    I bought an Alienware AW3423DWF and with G-Sync enabled it flickers in everything, but it's gone if I disable G-Sync. I decided to return the monitor due to the flickering.

    • @MrAlex26069114
      @MrAlex26069114 5 months ago +1

      You can buy the AW3423DW and enable g-sync ultimate and it will still flicker. It’s an issue with OLED

    • @rossmarbow-combatflightsim5922
      @rossmarbow-combatflightsim5922 4 months ago

      Nah, it's an issue with all monitors;
      VRR is just useless most of the time @@MrAlex26069114

  • @HDEFMAN1
    @HDEFMAN1 4 months ago +2

    Some very solid info here on the differences between G-Sync, FreeSync and Adaptive Sync. I am in the market for a 4K monitor in the new year, so watching this video was very helpful.

    • @BillaBong1
      @BillaBong1 4 months ago

      Same here, did any catch your eye? I'm just getting into PC gaming and builds. I'm trying to invest in one that can handle all the top-end and newest games.

    • @HDEFMAN1
      @HDEFMAN1 4 months ago +1

      @@BillaBong1 Yes, one did. Unless I see something online that changes my mind in the next few weeks, my money is on the Gigabyte M32U. This is a 32-inch 4K 144 Hz IPS monitor. It looks to be a good all-rounder which should provide a significant upgrade to my present setup. I will also be buying one of the new Nvidia RTX 4070 Ti Super graphics cards when they come out to power this new monitor. I am eagerly looking forward to some buttery-smooth gaming on this new rig.

    • @BillaBong1
      @BillaBong1 4 months ago

      @@HDEFMAN1 Heck yeah bro! 👍 You hit the right checkboxes. I'm also looking for a 32-inch 4K screen to fit in a spot I plan on turning into my little gaming area. I got the RTX 4070 but I'm gonna save for the Supers when they come out. Got the i9-14900K paired with an MSI Z790 motherboard; still debating between 48GB (2x24GB) or 64GB (2x32GB) Dominator Titaniums, or the G.Skill Trident Z5s. What do you think I should go with? Any other recommendations?

    • @HDEFMAN1
      @HDEFMAN1 4 months ago

      @@BillaBong1 If you plan on doing any video editing, content creation or any task that requires heavy lifting, then you could find a use for 48GB or even 64GB; if not, 32GB may well suffice. As for memory brands, any well-known brand should work as long as it is compatible with your motherboard. One thing that often gets overlooked when putting together a new build is sourcing a good PSU. I went with Seasonic for my last build. You don't want to spend all that money on expensive components and then use a cheap PSU to power everything. Sounds like you are going to have a sweet system when you put it all together.

  • @rayder3543
    @rayder3543 5 months ago +1

    I would have liked it if you had shown judder issues with camera-pan demos; adaptive sync is a very important feature, way more so than many currently marketed ones.

  • @elotkarloketh166
    @elotkarloketh166 3 months ago +3

    It's not a comparison, but there's "vs" in the title 🤦

  • @rogermartinsen
    @rogermartinsen 3 months ago +3

    Can I get an RTX 4090?

  • @babonet
    @babonet 4 months ago

    Thanks for the great explanation. Subscribed 😊

  • @Alex-bl8uh
    @Alex-bl8uh 5 months ago

    Thanks for the vid! Now I have G-Sync despite having a FreeSync monitor. Was not expecting that!

  • @mck8292
    @mck8292 5 months ago +3

    Fun fact: I just recently bought an Asus FreeSync monitor that was actually marked as "G-Sync Compatible"; it even had a G-Sync sticker on the front of it... but it only had one HDMI port, no DP. I never felt so scammed in my life x)

    • @dominikweyand7497
      @dominikweyand7497 5 months ago

      Nah, I call BS... every monitor from after the FreeSync/G-Sync release era came with native DP support... DP has been standard for like a decade :D

    • @mck8292
      @mck8292 3 months ago

      @@dominikweyand7497 I swear my "gaming monitor" only has 1 HDMI port and that's it. This is the first time I've seen a monitor with only 1 port.
      Even my second old TN office monitor has an HDMI AND a DVI port... but not my main Asus one.

  • @mightematt6266
    @mightematt6266 5 months ago +14

    It felt glossed over, but I can't overstate the benefits of G-Sync's adaptive overdrive, at least on IPS panels. I tried the switch from G-Sync to FreeSync, and found myself immediately disappointed with how much overshoot I experienced in the lower refresh range. I ended up selling it and buying a G-Sync panel, and all my woes went away.
    I don't mean to say FreeSync is bad per se; the price difference and better interface support are no small benefit and may tip the decision for the right people. However, as with most things AMD vs Nvidia in my experience, the AMD solution always feels like the "budget" version of Nvidia's implementation. Until monitors all have perfect response times down the entire refresh range, I can't bring myself to switch back to FreeSync regardless of price; the experience simply isn't up to par.

    • @Flaimbot
      @Flaimbot 5 months ago +1

      I totally agree and went with (hardware) G-Sync for the same reason, even before misbuying.
      But to clarify, nobody is stopping manufacturers from implementing variable overdrive in their scalers, like hardware G-Sync has. There are even a few non-G-Sync monitors that exist with variable overdrive. Manufacturers just like to omit this feature because it adds cost to the BOM/engineering.

    • @PyroBBQSauce
      @PyroBBQSauce 5 months ago +1

      G-Sync's adaptive overdrive sounds really neat, but I feel like these licenses tend to have less and less meaning as time goes on :^(
      I have a display that switched from the G-Sync HW module (AW2721D) to a FreeSync Premium Pro license (AW2723DF), and under VRR there is no (significant,

    • @andersjjensen
      @andersjjensen 5 months ago +4

      That is totally on that particular model of monitor. As Tim said. Several times in fact. And he even spelled it out that a hardware G-Sync module isn't even a guarantee that the overdrive modes are set up properly, though it is less common.
      So basically you didn't listen to a word he said.

    • @CaptainKenway
      @CaptainKenway 5 months ago

      @@andersjjensen I don't want or need to listen to facts when I'm justifying my purchasing decisions, buddy.

    • @edzymods
      @edzymods 5 months ago +1

      @@andersjjensen Typical Nvidia shill lmao 😂
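The variable overdrive discussed at the top of this thread boils down to the scaler picking an overdrive strength per refresh rate instead of applying one fixed setting across the whole VRR range (which is what causes overshoot at low refresh on fixed-overdrive monitors). A toy sketch of that lookup, with an entirely made-up table; real scalers tune these values per panel:

```python
# Hypothetical per-refresh overdrive table: (min_hz_inclusive, level_name)
OVERDRIVE_TABLE = [(48, "off"), (85, "normal"), (120, "fast"), (165, "extreme")]

def overdrive_level(refresh_hz):
    """Pick the strongest overdrive level whose refresh threshold is met.

    A fixed-overdrive monitor skips this step: it applies one level
    everywhere, which overshoots once VRR drops the refresh rate.
    """
    level = "off"
    for min_hz, name in OVERDRIVE_TABLE:
        if refresh_hz >= min_hz:
            level = name          # keep climbing while thresholds are met
    return level

# Under VRR the refresh rate follows the frame rate, so the chosen
# level tracks the fps as it moves up and down:
low = overdrive_level(60)     # -> "off"
mid = overdrive_level(144)    # -> "fast"
```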

  • @Jules-se4cx
    @Jules-se4cx 5 months ago

    What a banger of a video, thank you very much for clearing this stuff up. I was going crazy when looking for monitors; now I know -> it's all about marketing (mostly)

  • @lukamanojlovic7405
    @lukamanojlovic7405 5 months ago

    Hello, I started watching your videos lately and they've been very good. Can you review the Dell G2724D monitor?

  • @ruychii
    @ruychii 5 months ago +3

    Where is N-Sync?

  • @karolkilian2402
    @karolkilian2402 5 months ago +5

    Nahh, just get a GPU so powerful that you get more fps than your refresh rate :D :D

    • @lawgamingtv9627
      @lawgamingtv9627 5 months ago +5

      Honestly, I have a 4090 and I use G-Sync in every single kind of game + locking my fps. Idk, maybe I'm weird, but it also saves power and reduces heat for both components.

    • @olifantpeer4700
      @olifantpeer4700 5 months ago +3

      @@lawgamingtv9627 Exactly, I even limit my 4070 as well. It's already really efficient, but I can almost halve the power usage by limiting the fps to e.g. 90. For a lot of games I don't need to hit 120+, and locking it to 90 just makes it more consistent overall. Less power draw, less heat, less noise, barely less smoothness

    • @stealthhunter6998
      @stealthhunter6998 5 months ago +1

      @@olifantpeer4700 I do that with RTSS too. Capping the framerate makes the 0.1% and 1% lows a lot better and the frametime more stable. I don't like fluctuation of frames unless it's only a bit and not 20+, so I still use VRR but capped around my average so it doesn't go up/down by tons of frames.
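The capping approach described in this thread can be sketched as a simple frame limiter: pick a target a few fps below the panel's maximum refresh so the frame rate stays inside the VRR window and frametimes stay even. RTSS and driver-level limiters do this with far higher precision; this is only a minimal illustration, and all names in it are made up:

```python
import time

def run_capped(render_frame, target_fps, num_frames):
    """Render num_frames frames, sleeping as needed so the overall
    frame rate never exceeds target_fps (keeping fps inside the VRR range)."""
    frame_time = 1.0 / target_fps
    start = time.perf_counter()
    deadline = start
    for _ in range(num_frames):
        render_frame()                        # the actual per-frame work
        deadline += frame_time                # absolute pacing target for this frame
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)             # wait out the rest of the frame budget
    return num_frames / (time.perf_counter() - start)

# Example: cap a trivial "renderer" at 141 fps for a 144 Hz panel,
# leaving a little headroom so VRR stays engaged
avg_fps = run_capped(lambda: None, target_fps=141, num_frames=50)
```

Pacing against absolute deadlines (rather than sleeping a fixed amount each frame) lets a late frame borrow from the next frame's budget, which keeps the average rate at the cap instead of drifting below it.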

  • @terrymorant5895
    @terrymorant5895 5 months ago

    Awesome content! Keep it up!

  • @VictorSheyanov
    @VictorSheyanov 5 months ago

    Great explanation, thank you.

  • @Wybrem
    @Wybrem 5 months ago +5

    I have AMD FreeSync Premium Pro, and it's awful. It's incredibly hard to get it to work at all, and when it does your monitor turns into a stroboscope during loading screens or whenever the fps drops below the supported 48 fps. G-Sync good, AMD bad. Such a poor experience compared to Apple.

    • @Drubonk
      @Drubonk 5 months ago +2

      Had these issues too; it's most likely because you use a VA or OLED panel, which can vary in brightness at different Hz (since fps & Hz stay synchronized when VRR is active).
      Noticed it a lot on my old Samsung VA, especially in badly optimized games, as the fps would drop quickly & randomly. After upgrading my CPU & GPU it was not noticeable in any modern games, because a more stable framerate = stable Hz :) But I switched to an Odyssey G7 and haven't seen the issue anymore

    • @markjacobs1086
      @markjacobs1086 5 months ago

      What you're seeing is PWM flicker, which happens on EVERY non-flicker-free display.
      Look up reviews before you buy a display (RTINGS, for example, tests for PWM flicker).

    • @Wybrem
      @Wybrem 5 months ago

      @@Drubonk OLED indeed; why is that an OLED issue? Philips Evnia 34" ultrawide OLED. It's 175 Hz. I play at a locked 120 Hz as I prefer a stable frame rate, and at that rate everything above 48 is fine. Which in games is fine, but menus or loading screens often drop below 48 for whatever reason.

    • @Wybrem
      @Wybrem 5 months ago

      @@markjacobs1086 No, I know what that is; I avoid screens with that due to migraines. I have an OLED monitor which does have the standard OLED flicker, and I can handle that. With FreeSync on and below 48 fps the monitor flickers very badly, almost flashes. Very unpleasant, and in badly developed games the menus can be awful.

    • @riba2233
      @riba2233 5 months ago

      Pure bs
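The below-48-fps flicker described in this thread is what Low Framerate Compensation (LFC) is meant to prevent: when the frame rate drops under the panel's minimum VRR refresh, the GPU resends each frame a whole number of times so the effective refresh lands back inside the supported range. This only works when the maximum refresh is at least roughly double the minimum, which is why narrow VRR ranges flicker instead. A rough sketch of that multiplier logic, using this thread's hypothetical 48-175 Hz panel:

```python
def lfc_refresh(fps, vrr_min, vrr_max):
    """Return (multiplier, effective_refresh_hz): how many times each
    rendered frame is scanned out so the panel refresh stays inside
    the VRR window [vrr_min, vrr_max]."""
    if fps >= vrr_min:
        return 1, min(fps, vrr_max)           # already in range: no repetition
    mult = 1
    # Repeat each frame until fps * mult re-enters the window,
    # without overshooting the panel's maximum refresh.
    while fps * mult < vrr_min and fps * (mult + 1) <= vrr_max:
        mult += 1
    return mult, fps * mult

# 30 fps on a 48-175 Hz panel: each frame is shown twice -> 60 Hz refresh
m, hz = lfc_refresh(30, 48, 175)   # -> (2, 60)
```

Note that on a hypothetical 48-60 Hz panel, 47 fps cannot be doubled without exceeding 60 Hz, so no valid multiplier exists and the refresh falls out of range; that is the flicker scenario.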

  • @thrallj
    @thrallj 5 months ago

    Terrific video! Thank you!

  • @cammyjiyoga8753
    @cammyjiyoga8753 a month ago

    Appreciate the info!

  • @markusjohansson6245
    @markusjohansson6245 5 months ago

    This explanation was great !

  •  3 months ago

    Thanks for the understanding.