NVIDIA Driver Overhead: Are AMD GPUs REALLY FASTER with slower CPUs?

  • Published 27 Jun 2024
  • Get 30% discount with my "SKAG" code!
    Windows 11 Pro ($22): www.gvgmall.com/gvg/SKAG1
    Windows 10 Pro ($15): www.gvgmall.com/gvg/SKAG
    Windows 10 Home ($14): www.gvgmall.com/gvg/SKAGH
    Windows 11 Home ($20): www.gvgmall.com/gvg/SKAG1H
    Office 2019 ($46): www.gvgmall.com/gvg/SKAG9
    Hello Dear Viewers :D
    Today we're testing the CPU overhead in GPU drivers, AMD vs NVIDIA, to see how they fare with 3 different CPUs (Ryzen 7 7800X3D, Ryzen 5 7500F & Ryzen 5 5600X).
    00:00 - Intro & What is CPU Overhead
    02:45 - Assetto Corsa: Competizione
    03:32 - The Witcher 3
    04:22 - Counter Strike 2
    05:06 - Banishers: Ghosts of New Eden
    05:49 - Fortnite
    06:29 - PUBG: Battlegrounds
    07:39 - Ratchet & Clank: Rift Apart
    08:36 - Microsoft Flight Simulator 2020
    09:11 - Hogwarts Legacy
    10:05 - Dragon's Dogma 2
    10:54 - Starfield
    11:19 - The Last of Us: Part 1
    11:55 - Alan Wake 2
    12:48 - Spider-Man (RT Included)
    14:02 - 15 Games Average
    14:47 - Conclusion (Drivers CPU overhead, CPU performance)
    18:52 - Channel Members
    19:12 - More Videos
    Hope you enjoy the video. Comment with your thoughts about it. Don't forget to hit like, subscribe and share the video if you can.
    If you want, support me on:
    Patreon: / ancientgameplays
    Paypal: www.paypal.me/AncientGameplays
    Social Networks:
    Discord: / discord
    Instagram: / agpisco
    Facebook: / ancientgameplays
    AMD AM4 PC USED for CPU testing:
    CPU Cooler: Mars Gaming ML360
    MB: Asus X570 Strix-F
    RAM: 2 x 8GB - Patriot Viper 4400MHz CL19 (3600MHz CL16)
    SSD (NVMe): 500GB + 1TB
    PSU: BitFenix Whisper 650W+ Gold
    CASE: Bitfenix Nova Mesh SE TG
    INTEL 10/11th Gen PC USED for CPU testing:
    CPU Cooler: Arctic Freezer Penta 33
    MB: MSI Z490 Tomahawk
    RAM: 2 x 8GB - Patriot Viper 4400MHz CL19 (3200MHz CL14)
    SSD: Kingston V300 120GB
    HDD: Seagate Barracuda 2TB 7200 rpm
    PSU: Seasonic SIII 650W Bronze
    CASE: Aerocool One Frost Edition
    INTEL 12th Gen PC USED for CPU testing:
    CPU Cooler: Mars Gaming ML360
    MB: Gigabyte Z690 Aorus ELITE (DDR4)
    RAM: 2 x 8GB - Patriot Viper 4400MHz CL19 (3600MHz CL16)
    SSD: 240GB
    SSD (NVMe): 500GB
    HDD: Seagate Barracuda 2TB 7200 rpm
    PSU: Aerocool X-Strike 800W 80+ Silver Semi Modular
    CASE: Phanteks P400A White
    AMD AM5 PC USED for CPU testing:
    CPU Cooler: Enermax Aquafusion 360
    MB: Asrock B650E Taichi
    RAM: 2 x 16GB - Flare X5 6000MHz CL32 (6000MHz CL30)
    SSD (NVMe): 1TB
    PSU: Enermax Revolution D.F.X 1050W (80+ Gold)
    CASE: Enermax StarryKnight SK30
    My Main PC (used for GPU testing):
    CPU: AMD Ryzen 7 7700X (5.4GHz Static Overclock)
    CPU Cooler: Noctua NH-D15S
    MB: MSI X670E Carbon Wifi
    RAM: 2 x 16GB - Flare X5 6000MHz CL32 (6000MHz CL30)
    SSD (NVMe): 500GB + 2TB + 2TB + 4TB
    PSU: Corsair RM750X (White) 750W 80+ Gold
    CASE: Thermaltake View 51 TG
    #amddrivers #nvidiadrivers #ancientgameplays
  • Science & Technology

Comments • 406

  • @AncientGameplays
    @AncientGameplays  2 days ago +35

    Hope you like this video and give it a BIG thumbs up (and share if possible); it took A LOT of time to test everything properly :D

    • @InternetListener
      @InternetListener 2 days ago +2

      Very interesting topic. The overhead thing, as you said (and as years of benchmark results show), is much less important than other factors. You need to be very sure the hardware is the culprit for a loss in performance; there is so much on the software side to be profiled with the proper tools. It's a pity no 4K was tested. 4K isn't important, per se, for producing the desired CPU-bottleneck scenario, but with RT, upscaling and frame generation it's a use case that also stresses the CPU more than expected, usually with multithreaded loads; outside of DirectX 11 (or older) and OpenGL, the differences are large enough to be measured, and surprising. Still, the 5600X (or 5600), 7500F and 7800X3D are very good picks, representing the relevant performance tiers in 2024 and almost affordable for the majority of the gaming community. Adding 4K would make it possible to infer which current CPUs could keep up with the 4K FPS of a future RTX 5090 (probably only the 7800X3D/7950X3D and maybe the 12700K to 14900KS), since the 1080p-to-4K FPS ratio with a 4090 is usually below 170% when it should be around 400%. Future AMD CPUs with a wider memory bus could be surprisingly good at exploiting DDR5 in >128-bit setups. Productivity tests could also have added something, as they are CPU-bound in different ways. Thank you for trying new and interesting topics!
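The 1080p-versus-4K scaling reasoning in the comment above can be sanity-checked with quick arithmetic. This is just a sketch: standard 1920×1080 and 3840×2160 resolutions are assumed, and the ~170% figure is the commenter's observation, not a measured result here.

```python
# 4K (2160p) has exactly 4x the pixels of 1080p, so a purely GPU-bound
# game should run roughly 4x as fast (~400%) at 1080p as at 4K.
px_1080p = 1920 * 1080   # 2,073,600 pixels
px_4k = 3840 * 2160      # 8,294,400 pixels
ratio = px_4k / px_1080p
print(f"pixel ratio: {ratio:.0f}x")

# The comment reports the observed 1080p/4K FPS ratio on a 4090 is
# usually below ~1.7x (170%); the gap to 4x points at a CPU limit.
observed_fps_ratio = 1.7
headroom = ratio / observed_fps_ratio
print(f"GPU headroom left unused at 1080p: ~{headroom:.1f}x")
```

Pixel count is only a rough proxy for GPU load (shading cost does not scale perfectly linearly), but the size of the gap is the point.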

    • @mleise8292
      @mleise8292 2 days ago

      What happened at 7:40? I wonder if setting "Power management mode" to "Prefer maximum performance" changes that. Just curious.

    • @ssj3mohan
      @ssj3mohan 2 days ago +1

      You should test with a really slow CPU like mine. I have an i7 4770K, and man, it's impossible to play BF2042. A Ryzen 5600X is way faster than mine, so your test won't show it. If you really want to prove that Nvidia suffers with slower CPUs, you must get slower CPUs.

    • @KOT-ANGRY
      @KOT-ANGRY 2 days ago +1

      Speaker centres, pulled out with a toilet paper tube and your mouth... 2 years repeating 🤣🤣🤣

    • @lupuradu841
      @lupuradu841 2 days ago

      What about with an Intel CPU and an AMD GPU?

  • @100500daniel
    @100500daniel 2 days ago +117

    Radeon isn't so bad when you ain't got a Redditor in your ear telling you it sucks

    • @AncientGameplays
      @AncientGameplays  2 days ago +16

      True hahah

    • @ajsrf
      @ajsrf 2 days ago +10

      I usually find the opposite is true; people on Reddit love to fanboy all over Radeon GPUs.

    • @JohnDoe-ip3oq
      @JohnDoe-ip3oq 2 days ago +9

      @@ajsrf In the AMD subreddit, yes. I don't think PCMR does.

    • @GrainGrown
      @GrainGrown 2 days ago

      You suck.

    • @supertrexandroidx
      @supertrexandroidx 2 days ago +6

      @@JohnDoe-ip3oq Right, it's not difficult to find fanboys for both. Personally, I have as much loyalty to either company as they do to me as a consumer - which is to say none. I'll buy whatever I believe is best for me at the time, and I don't care who makes it.

  • @humayoonxpro3220
    @humayoonxpro3220 2 days ago +53

    0:28 "There's a fly here" - Fabio, 2024
    Also, that clip is hilarious 😂🤣

    • @AncientGameplays
      @AncientGameplays  2 days ago +13

      hahaha, good to make people laugh

    • @_TrueDesire_
      @_TrueDesire_ 2 days ago

      Flies and mosquitoes are pure terror when I'm trying to sleep 😢😂

  • @tomthomas3499
    @tomthomas3499 2 days ago +39

    This confirms what my friend experienced in MSFS. He had a 2070 with a 3700X; I warned him that MSFS benefits more from a CPU upgrade since it's a sim, but he wouldn't listen and upgraded to a 4070, and all he got was a stuttery experience.

    • @AncientGameplays
      @AncientGameplays  2 days ago +9

      Yup, AMD works much better there

    • @Iceman-hb3uk
      @Iceman-hb3uk 2 days ago +2

      You could tell him to lower the LOD settings (there are 2, but I don't remember the exact names); put them both at 50 and he'll have butter-smooth gameplay.
      Obviously the detail on the ground will be drastically lower, but it's the only way to get a smooth experience.
      Even people with a 12900K (the fastest CPU at the time of the post on the MSFS forum about graphics optimization) would struggle to maintain anything over 100 in highly detailed cities; I'm still getting bad 1% and 0.1% lows with a 5700X3D, so I use 80 for both.

  • @NowThatsPCGaming
    @NowThatsPCGaming 2 days ago +17

    I actually discovered this on my own. My friend recently bought a 4080; I have an XTX. We wanted to swap GPUs to see the differences. I have a 5800X3D, and he has a 7800X3D. Comparing the data, I was getting around 10% less performance on the 4080 with my 5800X3D than he was with his newer 7800X3D, while he was getting the same performance as me with the XTX but far better performance with his 4080. I didn't quite know what to call it, but now I do lol. Very interesting how that works. I guess it's another reason why I love AMD. I'm sure if AMD does release a high-end 8000-series GPU, the 5800X3D would still handle it really well.

  • @XiaOmegaX
    @XiaOmegaX 2 days ago +15

    Here's a fun CPU test that might be worth adding: make a reactor explode in Hardspace: Shipbreaker. It turns larger ships into thousands of physics items/debris.

  • @chadfang2267
    @chadfang2267 2 days ago +5

    Always love seeing my CPU (7500F) in your videos.

  • @adammartin4906
    @adammartin4906 2 days ago +9

    You just cannot get important gaming information like this anywhere else!!! Thanks, Fabio, for working so hard and bringing us this important information that keeps us all well informed, so we know exactly which CPUs will work best with our hardware 👍

  • @amz212
    @amz212 2 days ago +27

    I've seen someone using a 12400F with a 7900 XT at 1440p, and the GPU was never below 97-99%, but on a 4070 the GPU was at 86% to 96% usage. So weird.

    • @AncientGameplays
      @AncientGameplays  2 days ago +22

      Exactly, the overhead was there

    • @reviewforthetube6485
      @reviewforthetube6485 2 days ago +3

      @amz212 Yep, I've been watching the same on another channel with the 4080 Super and a 12400F: same thing, 97%+ at all times from 1440p to 4K ultra. It's just that the games being played there are now about 80% reliant on the GPU, and budget CPUs now have enough cores. That's really what it is; has nothing to do with anything else lol

    • @doctorvengeance1
      @doctorvengeance1 2 days ago +2

      I use an i5 10600KF with an RX 7800 XT: 4K for story games and 1080p for FPS gaming.

    • @amz212
      @amz212 1 day ago

      @@doctorvengeance1 solid build bro

  • @gruuli
    @gruuli 2 days ago +4

    Let's hop into it, pretty excited for the results!

  • @debnadaebna9981
    @debnadaebna9981 2 days ago +5

    Good job, Fabio, that was helpful!

  • @antondovydaitis2261
    @antondovydaitis2261 2 days ago +7

    Thank you for your discipline, focus, and presentation.
    Useful.

  • @overclockingorobsolete
    @overclockingorobsolete 2 days ago +14

    Been waiting for this video; late-night uploads never miss :D
    I've only tested CS2 on my "channel", great to see a full set of games being tested by you

  • @pju28
    @pju28 2 days ago +4

    I appreciate your work and effort! Greetings from Austria 🇦🇹

  • @silvio351
    @silvio351 2 days ago +8

    Fabio, astounding work as always. I've been watching benchmarks of the RTX 4070, and some people say driver overhead is visible in some scenarios, but not by much. Also, I'll let you know that thanks to you I went for a 5700X3D as my final upgrade on the AM4 platform; so happy with the performance per watt it provides, and it runs really cool.

  • @Oceanborn712
    @Oceanborn712 2 days ago +1

    Thank you for putting in this work. I have been primarily using AMD cards for quite a while now but I had no idea about this part. Guess it's not very "first hand" valuable knowledge to me as I'm running high end CPUs in most of my systems but it's still really good to know!

  • @pshymon60
    @pshymon60 2 days ago +4

    Thank you for including PUBG in the CPU benchmark

  • @beirch
    @beirch 2 days ago +6

    Very nice concept for a video, and this is exactly what I've been telling people when recommending the 7900 GRE over the 4070 Super.
    *However:* I think you should have considered doing this test with GPUs more towards the midrange, like the 7800 XT/7900 GRE and 4070/4070 Super (or Ti/Ti Super), because people who currently have (or buy) an R5 5600X, 7500F, 7600 etc. are not likely to buy a 7900 XTX or a 4080 Super. It's much more likely that people who already have (or buy) midrange CPUs are also buying midrange GPUs.
    I understand if you chose a high-end GPU to eliminate GPU bottlenecks, but the real-world data would have been nice to see with a more midrange setup, if only for consumer agency.

  • @hyprjay
    @hyprjay 17 hours ago +2

    Eagerly waiting to see a video on the new 24.6.1 update. I can't wait to hear you read "some games are not designed to launch" and "some games may experience slowdowns" lol

    • @ultrase7en
      @ultrase7en 9 hours ago +1

      Waiting for a new driver too. Could AMD have stopped releasing this month's driver because they haven't found time to finalize the planned fixes? :(

  • @W3RN3R318
    @W3RN3R318 2 days ago +5

    Thank you.

  • @Silent-1983
    @Silent-1983 2 days ago +3

    Thank you!

  • @jkeez959
    @jkeez959 2 days ago +7

    Nvidia driver overhead is a very old problem, and seeing it's still not fixed, I'm starting to think Nvidia is just relying on CPUs becoming faster to make it irrelevant.

  • @_RedsTech_
    @_RedsTech_ 2 days ago +21

    So if you value 1% lows... which most do... and you are still on AM4, go AMD for your GPU.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 2 days ago +5

      Literally doesn't apply to any scenario except when you're CPU-bound. In that case, instead of getting an inferior GPU, just upgrade to a better CPU, which is so easy on AM4.

    • @RuslanDoman
      @RuslanDoman 2 days ago +2

      @@DragonOfTheMortalKombat Really?
      And what would I upgrade a 5600X to?
      An overpriced 5800X3D?

    • @xjax8519
      @xjax8519 2 days ago +2

      @@RuslanDoman Wait for the new Ryzen 5800 XT )))

    • @RuslanDoman
      @RuslanDoman 2 days ago +2

      @@xjax8519 Yes, I've heard about it.
      I think I'll switch to it if the price is reasonable; whichever way you look at it, 8 cores are 8 cores, and they should raise the clocks too.

    • @insertnamehere4419
      @insertnamehere4419 2 days ago

      Nonsense.

  • @fdcendm
    @fdcendm 2 days ago +2

    I love you, thanks for adding CS2

  • @Whatever1337-e8l
    @Whatever1337-e8l 2 days ago +4

    Would've been interesting to see Helldivers 2 results in this comparison.

  • @robertmyers6488
    @robertmyers6488 2 days ago +3

    Thanks Fabio. I do wonder if the memory bus plays a role in the 1% lows. Might be interesting to compare the Super to the 7900 GRE, where the bus is the same.

    • @AncientGameplays
      @AncientGameplays  2 days ago +2

      Not really; as the resolution goes higher, the 1% lows get higher as well.

  • @infinity2z3r07
    @infinity2z3r07 2 days ago +34

    My guess about the driver overhead is that it has something to do with Nvidia using less VRAM, leaving the CPU more decompression work. But I'm just a gamer, not a software engineer.

    • @JohnDoe-ip3oq
      @JohnDoe-ip3oq 2 days ago +26

      No, this started with Kepler, when Nvidia removed the hardware scheduler to save power. That work didn't really get passed on to the CPU at first, since games were more single-threaded back then, and Nvidia saved die space. The bigger issue was that performance depended entirely on driver optimization, and with Maxwell Nvidia stopped optimizing for Kepler. Instant dead card; I immediately sold my 780 for an R9 290 around The Witcher 3.
      Nvidia ultimately optimized DX11 the way they optimized the FX series: by cheating. It's not running native code; they recompile with their own tech to match the expected output. DXVK is similar. DX11 is like Java, and Nvidia basically has its own runtime.
      The RTX cards were the first to match AMD on hardware support, but the DX11 path has never lost the hacks. That can improve performance when you're not CPU-limited, but in general it's now a bad idea.
      I don't know how much is still CPU-reliant, but the GTX cards didn't support async compute and did it in software; at best there was no performance gain.
      Nvidia sold users subpar hardware for years and hid the deficiency with driver hacks. That has now backfired even on new hardware, because they're running on top of legacy cruft that needs a rewrite. AMD has rewritten its drivers since Windows 10 and again for RDNA 2 and 3, and drops legacy support like GCN. Combine that with hardware designed around the new APIs rather than legacy support, and AMD is simply more efficient. People complain when AMD changes things, because it breaks other things, but overall you get better efficiency without cheating and with minimal effort. AMD doesn't do GameWorks; they do general-purpose and open source.
      The VRAM issue is also worse on Nvidia because driver features like AI effects and frame gen run on top of the game. They're not part of the game's memory management, so they cause memory thrashing: the game allocates 8GB, which is fine on an 8GB card, but the AI effects stack on top, and you can't do both without more VRAM. Games have to compensate, and devs are effectively telling Nvidia to shove it at 8GB: no memory for AI features because that's the minimum, though a 10GB game can use 12.

    • @pf100andahalf
      @pf100andahalf 2 days ago

      @@JohnDoe-ip3oq ^ this

    • @spacechannelfiver
      @spacechannelfiver 2 days ago +8

      You guess wrong. It has to do with architectural changes AMD made for the PS4: you program much closer to the metal on Radeon than you do on Nvidia. Nvidia pulled a lot of the scheduling out of silicon into userspace while DX11 was dominant on PC, whereas AMD put a lot of focus on that (via Mantle, then Vulkan, and to a lesser extent DX12) so that consoles could punch above their weight.

    • @JohnDoe-ip3oq
      @JohnDoe-ip3oq 2 days ago +3

      @spacechannelfiver AMD didn't do it all for consoles; they were also selling those awful FX CPUs on PC, and they didn't have the budget or talent to write custom scheduling for every game. GCN was one-size-fits-all: PC, laptop, console, workstation. Microsoft also changed the Windows 10 driver model to better utilize AMD. GCN did end up faster than Pascal, according to HUB. Nvidia knew games were single-threaded and removed scheduling to save power and die size and increase shader count. This also ties into planned obsolescence via driver support: Maxwell instantly killed Kepler, with the 128-bit 2GB 960 beating a 780 in The Witcher 3.
      It ultimately backfired when modern games needed the CPU and the CPU couldn't keep up with the scheduling work. Nvidia's DX11 support is too crusty and needs a rewrite at this point; it's no longer a hardware limitation afaik.

    • @elderman64
      @elderman64 2 days ago +10

      @@JohnDoe-ip3oq Optimization isn't cheating, get that delusional shi out of here. Making games and apps need less hardware to output good performance is what everyone *should* be doing. None of that "oh, my AMD has better hardware, they just don't cheat like Nvidia with optimization".
      This is beyond delusional and empty-headed.

  • @winstongraham944
    @winstongraham944 2 days ago +1

    What hardware monitoring software are you using in your benchmark videos?
    I love how it looks.

  • @R3YVAH
    @R3YVAH 2 days ago +1

    You're right on time with this video. I changed my video card about a week ago to a 4070 Ti Super OC, and I don't understand what's going on with the computer; I had to increase the cache because some games just started crashing without an error.

    • @R3YVAH
      @R3YVAH 2 days ago +1

      Of course, I understand the 5600X is not super suitable here, but it still shouldn't be that critical.

    • @UniverseGd
      @UniverseGd 2 days ago

      ?

    • @R3YVAH
      @R3YVAH 2 days ago

      @@UniverseGd I had a 7800 XT before, and everything was fine; the 5600X did great with that video card. But it stopped working due to a defect, I got a refund, and I paid extra for the 4070 Ti Super. And now, since I got this video card, strange things are happening with my computer.

  • @knubivandubi5797
    @knubivandubi5797 2 days ago +1

    Nice test; would love to see this test with Intel CPUs too :)

  • @seamon9732
    @seamon9732 2 days ago +14

    So Nvidia still has trouble when CPU limited in DX12 titles.
    They really need to work on their optimization and/or future hardware.

    • @Sam-cq9bj
      @Sam-cq9bj 2 days ago +6

      It's not gonna be fixed soon. Linus Tech Tips also noted this issue long ago; he basically said Nvidia has a driver overhead issue because Nvidia's GPU drivers are "smart" and do lots of tasks by themselves in the background, like automatically lowering VRAM usage or lowering game visuals for better performance. AMD and Nvidia GPUs have a quality difference in game rendering: AMD GPUs load everything on screen at the same time, with no such instructions in the driver.

    • @AdrianMuslim
      @AdrianMuslim 2 days ago

      ​@@Sam-cq9bj Nvidia is gay 👎

    • @sysbofh
      @sysbofh 2 days ago +1

      This isn't new. Years ago (well before COVID), when crypto really took off, miners already knew this: if you mined on an Nvidia card, one of your CPU cores would be pegged at 100%, no matter what CPU it was. It didn't matter if it was fast or slow; it was a timing issue, and the core got pegged at 100% either way.
      If you mined on an AMD GPU, you would see almost zero CPU usage.
      By the same token, I used a CUDA workload years later (work related) and saw exactly the same problem: one core pegged at 100%. This time I knew what it was (a known "feature" at the time): the drivers kept polling the GPU ("are you ready? are you ready? are you ready?" and so on) to see if it was free to take new work. Apparently the idea of the GPU telling the CPU when work is finished is too advanced...
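The polling behaviour described above can be illustrated with a toy sketch (not actual driver code; the `work_done` event is a stand-in for a "GPU finished" notification). Busy-polling keeps a core spinning at ~100% the whole time it waits, whereas a blocking wait lets the OS put the thread to sleep until it is signalled:

```python
import threading
import time

work_done = threading.Event()  # stand-in for a "GPU finished" signal

def busy_poll():
    # The pattern miners saw: a tight loop asking "are you ready?"
    # over and over, burning CPU until the flag flips.
    spins = 0
    while not work_done.is_set():
        spins += 1
    return spins

def blocking_wait():
    # Interrupt-style alternative: sleep until the producer signals,
    # consuming essentially no CPU while waiting.
    work_done.wait()

def gpu_stub():
    time.sleep(0.05)  # pretend the "GPU" takes 50 ms to finish
    work_done.set()

threading.Thread(target=gpu_stub).start()
spins = busy_poll()   # iterates many thousands of times in those 50 ms
blocking_wait()       # returns immediately, the event is already set
print(f"busy loop iterated {spins} times while 'waiting'")
```

The real driver-level mechanisms (interrupts, fences) are more involved, but the CPU-cost difference between the two waiting styles is the same idea.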

    • @insertnamehere4419
      @insertnamehere4419 2 days ago +2

      Sensationalism bordering on nonsense. My 7800X3D and 4080 run like a dream. Literally any AMD GPU would be a downgrade, in any scenario.

    • @seamon9732
      @seamon9732 15 hours ago

      @@insertnamehere4419 #Denial is quite the drug. And that's coming from someone with the same CPU and a 3070...

  • @_overchaos_
    @_overchaos_ 1 day ago

    Great test. What do you think: how much slower (compared to the two tested here) would a GPU have to be for the GPU bottleneck to essentially render driver overhead irrelevant? Would we see any driver-overhead issues with an R5 5600X if the two tested GPUs were an RX 7900 GRE and an RTX 4070 Super, for example?

  • @0M3G486
    @0M3G486 2 days ago +1

    I would have liked to see a target FPS like 60, 120 or 144 (close to the max the GPUs can do, but limited) and a check of CPU usage (across cores) to better measure the overhead each manufacturer has.

  • @jerryknudsen7898
    @jerryknudsen7898 2 days ago +1

    One thing I noticed about GPU optimization over time: watching Intel Arc improve gave me a glimpse of how "fine wine" works. Intel explained in one of their driver updates that Arc started with incredible driver overhead because Intel's iGPU drivers are biased to put as much work as possible on the CPU; in almost all iGPU gaming scenarios the iGPU is the bottleneck and the CPU sits almost idle, so the more work you can move to the CPU, the better overall iGPU gaming performance. Since Arc doesn't have that problem, it's better to recode the drivers to keep more work on the GPU, which is strong enough to take it.
    I think "AMD fine wine" and the driver-overhead streamlining is a similar balance AMD shifts over the life of a product. They aren't pulling performance from thin air: they start a GPU generation with an extreme bias towards lower CPU utilization and higher GPU utilization, and over the life cycle of the architecture and its drivers they slowly increase the CPU load, because newer CPUs are expected to be faster than the ones around at launch. Most reviewers don't go back to the CPU they used in their review years ago; the emphasis is on the GPU. Often they just run early drivers and current drivers on the best CPU in their arsenal. So the initial drivers compensate for the limitations of, say, a Ryzen 9 5900X, while the new drivers can shift some of that work back because a Ryzen 9 7950X3D is the most likely CPU on the bench.
    Nvidia, by contrast, tends to optimize the GPU to be as performant as possible even if the CPU must carry the largest burden, which is why it may be years before we see the limits of the 4090; the bottleneck is almost always the CPU. They don't shift the drivers much during the life cycle.
    Does this sound close, or am I reading too much into information Arc engineers disclosed specifically about their own GPUs?

  • @alanpk0
    @alanpk0 2 days ago +4

    Dude
    A few days ago I commented on another video of yours that I wanted a video on this topic. 😅
    Thank you so much!

  • @ahmettaneri
    @ahmettaneri 2 days ago +1

    MSFS 2020 is the most CPU-demanding sim I enjoy, along with others like ATS/ETS2, AC, PC2, etc. Last year, when I upgraded my old PC from a 4670K/1660 Super to an R5 5600/6700 XT, I considered exactly that driver overhead issue and chose the 6700 XT instead of a 3060 Ti. When I replaced my 1080p monitor with a big 4K one and upgraded my sim PC again to a 7800X3D/7900 GRE, I went all AMD for the same reason. Now simming smoothly :) Nice video, by the way.

  • @VanJ78
    @VanJ78 2 days ago +1

    What game is in the intro section? It looks interesting. Good video.

  • @TheWayOfTheHeart
    @TheWayOfTheHeart 2 days ago +2

    Thank you for this excellent comparison, Fabio! I felt something was happening when I tried RTX3060 12G with a 4c/8t CPU. Frame time was very shaky and stuttering. On the Radeon Vega iGPU on my laptop with R7 5850U, the frame time was much smoother, even with a significantly lower FPS. This video confirms my thoughts... When you have time, please do a video about the frame time analysis of Nvidia vs. AMD (RTX 3000/4000 series and RDNA 2/3 Series). From my experience, the frame time on Nvidia is shaky and stuttering, but AMD is smooth as butter. Thank you for including The Witcher 3 in this case study, and please include it in the frame time analysis case study too 🙂

  • @jttheacgnerd
    @jttheacgnerd 2 days ago +3

    Would really like to know if ReBAR/SAM was on or off for all the results, since technically they can lower 1% lows in favor of higher avg FPS.

    • @AncientGameplays
      @AncientGameplays  2 days ago +2

      Both are ON, as they're ON by default nowadays. Also, not really: SAM and ReBAR usually increase both averages and 1% lows; I don't know where you heard that.

  • @i_scopes_i3914
    @i_scopes_i3914 2 days ago

    The new AMD MI300 with 8 stacks of 3D-stacked tiles gives me hope for 3D stacking on all CCD tiles; looking forward to what they come up with the generation after the next GPU.

  • @miguelmatias859
    @miguelmatias859 12 hours ago

    Hi Fábio, in your opinion, for a user who will play FPS titles (The Finals, Battlefield, Warzone, Fortnite, XDefiant, Apex, etc.) at low settings 1440p, between the 7900 GRE and the 4070 Super, which will bring the better experience (regardless of price) in terms of higher FPS and better 1% lows now, and which will be more future-proof for this type of usage? Thanks for all the good videos.

  • @vlad_8011
    @vlad_8011 2 days ago

    Fantastic video, as always. But I have a strange question: what happens if you use an FPS limiter? Let's say you limit your FPS to 60. And to make it MORE complicated, let's say in one game I use the limiter from the driver, in another RivaTuner, and in others the built-in in-game FPS limiter.
    You may ask why I would do that. In short: the past. The longer version: for as long as I can remember, on an i5 4690, i7 6700K, Ryzen 2700X and Ryzen 5800X, none of the GTX and RTX GPUs delivered as smooth an image as Radeons did. I used 2 different monitors at the time: a 60Hz TN without FreeSync, and a 144Hz VA LG 32GK650F with FreeSync; the second monitor was used with a GTX 1080 Ti, an RX 6800 XT and an RTX 3080 10GB. I have always capped FPS to 60, for as long as I can remember. And with all the GPUs I've had (R9 270 OC, GTX 1060, GTX 1080, GTX 1080 Ti, RX 6800 XT, RTX 3080 10GB), I always had, maybe not microstutter, but a lack of fluidity on the Nvidia GPUs. Is that possible even at 60 FPS? The reason I used so many FPS limiters was how they worked in the Pascal and early Turing days: the FPS limiter in the driver only worked sometimes (after alt-tab it stopped working, and sometimes the cap wasn't active at all from launch), RivaTuner worked but gave me the biggest lag, and in-game limiters gave me the best results, though with the GTX 1060 I had to tinker with V-Sync to avoid tearing, and I think it was adaptive sync, if I remember correctly, that I used the most. I thought it was placebo when I installed the 6800 XT for the first time, but hell, it's not placebo.
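For context on the limiters mentioned above: driver, RivaTuner and in-game caps all reduce to the same core idea, padding every frame out to a fixed frame-time budget; they mainly differ in where in the pipeline the wait happens, which is what affects latency and pacing. A minimal sleep-based sketch of the idea, illustrative only and not how any specific limiter is implemented:

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.67 ms per frame

def render_frame():
    time.sleep(0.004)  # pretend rendering takes 4 ms

frame_times = []
for _ in range(10):
    start = time.perf_counter()
    render_frame()
    # Pad the frame out to the budget; a real limiter would use a
    # higher-precision wait than time.sleep for smoother pacing.
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
    frame_times.append(time.perf_counter() - start)

avg = sum(frame_times) / len(frame_times)
print(f"average frame time: {avg * 1000:.2f} ms (target {FRAME_BUDGET * 1000:.2f} ms)")
```

Because `time.sleep` only guarantees a minimum wait, frames land slightly over budget; real limiters use finer-grained timers or sync points, which is one reason different limiters can feel different in practice.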

  • @phillybb
    @phillybb 2 days ago +1

    Looking at this, the CPU makes a huge difference.

  • @spoots1234
    @spoots1234 2 days ago +18

    I always recommend AMD to people with like 4th-gen CPUs etc. An extra 10-15% is the difference between meh 33fps and a nice 38fps for people who don't want to spend to upgrade the platform. Kinda impressive how the 10 year old CPU still competes with a ps5 though.

    • @RobBCactive
      @RobBCactive 2 days ago +5

      Exactly. The Hardware Unboxed testing was with much older CPUs than those tried here. Even a Ryzen 5 3600 has console-like performance rather than weaker.

    • @xfreejack001x
      @xfreejack001x 2 days ago +1

      Nah... I still prefer the graphical fidelity of DLSS/DLAA over the garbage TAA anti-aliasing on AMD.

    • @RobBCactive
      @RobBCactive 1 day ago

      @@xfreejack001x lol, bad CPU and you think DLSS will help. What utter ignorant fanboy bilge.

    • @Ted_Kenzoku
      @Ted_Kenzoku 1 day ago

      @@xfreejack001x AMD has FSR too, and DLSS really doesn't matter on budget setups; you don't want to enable DLSS at 1080p under 60 FPS anyway, it will look like shit. Frame gen is even worse at low FPS.

    • @xfreejack001x
      @xfreejack001x 1 day ago

      @@Ted_Kenzoku You can set the game resolution to 1440p and then use DLSS on a 1080p monitor; there are also mods that enable native 1080p DLAA. I've only had problems with a few older DLSS games that don't support DSR. Also, AMD's FSR is worse than native TAA, and it will still be garbage compared to DLSS even after the 3.1 update.

  • @timbelcher8992
    @timbelcher8992 2 days ago +7

    My brother has a 4080 Super and a 7800X3D. I have a 7900 XTX and a 5800X3D. I've had to tell him multiple times that the reason he's getting worse performance is Nvidia's CPU driver overhead.

    • @arenzricodexd4409
      @arenzricodexd4409 2 days ago +5

      The 7900 XTX in general is still faster than the 4080 Super. There might be some driver overhead at low resolutions like 1080p or even 1440p, but at higher res the 7900 XTX will generally always be faster.

    • @mihailcirlig8187
      @mihailcirlig8187 2 days ago

      @arenzricodexd4409 I'm thinking of building a desktop, and I think the best CPU to get will be the new 9800X3D, which seems to release next month, but for the GPU I'm not sure. DLSS is pretty damn good, yes, but I don't play many newer games and I really like the AMD 7000 series. Nvidia is also better at ray tracing, but even that is not a feature I care much about; I've tried it many times, and it did look really, really good in Metro Exodus Enhanced Edition. Maybe I'll go full AMD build.

  • @rtyzxc
    @rtyzxc 2 days ago +1

    Is this with hardware accelerated gpu scheduling on or off? Would turning it on help a low-end cpu on Nvidia?

    • @AncientGameplays
      @AncientGameplays  2 days ago +1

      On, as it is by default

    • @martinjennings299
      @martinjennings299 1 day ago

      My 5800X3D with a 4070 would stutter in some games; once I switched HAGS off, all the stuttering went away!

  • @Cole_Zulkowski
    @Cole_Zulkowski Před dnem

Hey Fabio, the 24.6.1 driver came out, along with a new driver from Nvidia as well.

  • @tapioorankiaalto2457
    @tapioorankiaalto2457 Před 2 dny +1

I have an FX-8350 in my older desktop, so a really weak CPU. I've been looking to swap the GTX 1080 I have in it now for a Vega 64 or something similar, but old enough to have a DVI port for my old LCD display.

    • @AncientGameplays
      @AncientGameplays  Před 2 dny +2

That's not really weak, that's bottom of the barrel... even an R5 1500 would be faster.

    • @Ted_Kenzoku
      @Ted_Kenzoku Před dnem

Just buy a DisplayPort-to-DVI adapter; it costs nothing and you will have way more freedom in choosing a new GPU.
And you really need a new CPU; your GTX 1080 is not the problem in this setup.

  • @ZKWardian
    @ZKWardian Před 2 dny

And how about a comparison where in both cases the drivers are installed without telemetry? Either with the minimal AMD configuration, or with NVCleanstall.

  • @ShinyHelmet
    @ShinyHelmet Před 2 dny +6

    If you catch that fly using chop sticks, then I'll be impressed. 😆

  • @Lemoncit0
    @Lemoncit0 Před 2 dny +1

Going to guess that the games less affected by this phenomenon are single-threaded ones like CS:GO, meaning the other cores can take care of background apps without hurting performance, while games like The Witcher 3 try to use as much of the CPU as possible, resulting in a notable performance difference.
I am still rocking an R5 3600, so this made a huge difference for me when upgrading from a 3060 to a 6700 XT. Cyberpunk went from sub-60 in some areas to above 60 all the time.

  • @ricardofernandes8983
    @ricardofernandes8983 Před 2 dny +2

Over here, with a first-gen R5, I normally use an RX 570. I had the chance to use an RTX 3060 for a week, and the difference is crystal clear: many games became unplayable with stuttering. I tried capping the fps in RivaTuner, and even then I didn't have the stability I get on the much slower RX 570, even on a clean system install.

    • @GoMArkkrAM
      @GoMArkkrAM Před dnem

There are bad cards, but I don't even know what to call the 3060! HORRIBLE

  • @Nintenboy01
    @Nintenboy01 Před 2 dny +1

    I experience severe performance loss in RoboCop in the level where you fight the waves of droids. It doesn't appear to be fully utilizing either the GPU or CPU

  • @tek1645
    @tek1645 Před 2 dny +2

    Seems like it's not a big enough factor to matter in swaying GPU purchase. Good to know 👍

  • @CrashPilot1000
    @CrashPilot1000 Před 2 dny +1

Thanks man! It seems Nvidia could actually learn something from AMD concerning the 1% lows, especially on slower CPUs. BTW, if you still have the setup on the bench: take one game with the biggest difference, use Afterburner (or the in-game engine if possible) to lock the FPS to, let's say, 80 fps on both AMD and Nvidia, and then read out the CPU usage to see the difference. Then lower the MHz on the CPU until the FPS falls below 80, and you can even say how many more MHz are needed in that game on that platform (cores etc.) to make Nvidia run as fast as AMD. Just an idea.
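The capped-FPS comparison suggested above boils down to averaging CPU usage over a fixed window, once per driver stack, at the same frame cap. A minimal sketch of that sampling loop (hypothetical, not from the video; `read_cpu_percent` stands in for whatever usage source you poll, e.g. a hardware-monitor API):

```python
import time

def sample_utilization(read_cpu_percent, duration_s=10.0, interval_s=0.5):
    """Average CPU usage over a window while the game runs frame-capped.

    Run once per driver stack at the same FPS cap; the stack that needs
    more CPU to hold the cap has the higher driver overhead.
    """
    samples = []
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        samples.append(read_cpu_percent())
        time.sleep(interval_s)
    return sum(samples) / len(samples)
```

The second step of the idea (lowering the CPU clock until the FPS drops below the cap) would just repeat this measurement per clock step.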

  • @briananeuraysem3321
    @briananeuraysem3321 Před 2 dny +1

    Wonder how this changes with hardware accelerated GPU scheduling turned on and off

  • @Jomenaa
    @Jomenaa Před dnem +1

Me again with my GTX 1070 Ti, R5 3600 and 2x8GB RAM at a measly 3200MHz CL16 OC. Some games do show random stuff when looking at GPU and CPU usage, but sometimes that might just be my quick-and-easy RAM OC, I should study it more... BUT I think in some cases this might be behind some sudden GPU usage drops in games which use my GPU and CPU combo quite evenly (newer Battlefields, CS2, some other games I can't remember now). It's like I'm locked at 144 and get sudden drops to 120, 100 or maybe even lower, and my usages are going wild or just dropping while waiting for... something? Not in all cases, but sometimes my CPU sees higher usage when the GPU drops. And in some cases games don't fully utilize my CPU, so some threads are pegged as GPU usage drops.
I'm due for a Windows format though, might finally upgrade to W11 too as I've gotten more used to it on my laptop/at work :) This might clear out some issues, as my Windows install is many years old and I like to tinker and change stuff around, so something might have gotten borked along the way xD
End of ramble, 10 pts to anyone who read through xD

  • @siege4768
    @siege4768 Před 2 dny +1

    Looking good Fabio! You been working out?

  • @WaspMedia3D
    @WaspMedia3D Před 2 dny +1

Results: 7900 XTX > 4080 Super. But more seriously, it's good to see that Nvidia has somewhat addressed this issue; the earlier comparisons on this topic (by Hardware Unboxed, I think) showed the CPU bottleneck much more dramatically, but that was a couple of years ago.

  • @hocine9664
    @hocine9664 Před 2 dny +2

Hi my friend, happy to have your video again, always good and nice to watch!! You're like Batman with the late video, hahaha!! Take care, and I'm always glad to be part of your channel.

    • @AncientGameplays
      @AncientGameplays  Před 2 dny +1

      Big Hug Hocine 💪💪💪

    • @hocine9664
      @hocine9664 Před 2 dny +1

@@AncientGameplays Thanks dear friend, always great to see your videos! Good night and hugs.

  • @pasha715
    @pasha715 Před dnem

It looks like Sony got FSR 3.1 into some of their games. Do you plan to test it vs 3.0 soon?

  • @ajsrf
    @ajsrf Před 2 dny +1

Brother, can you please share a guide to fix AMD issues with the Fallout games? I wanted to play them after watching the new Fallout TV show, but Fallout 3 and New Vegas just crash.

    • @AncientGameplays
      @AncientGameplays  Před 2 dny +1

The next driver fixes them. Coming out tomorrow, I believe 💪

  • @magnusnilsson9792
    @magnusnilsson9792 Před 2 dny +1

    Nice, how about Intel GPU and/or CPU?

  • @laserak9887
    @laserak9887 Před 2 dny +1

    Maybe worth checking games that are nearly completely CPU reliant? (Oxygen Not Included, Stellaris etc)

  • @gstormcz
    @gstormcz Před 6 hodinami +1

The 4080S has higher avg fps than the 7900 XTX in some of those games where driver overhead seemingly hits the 1% lows. I would not call it driver overhead, but rather weak CPU performance in some CPU-taxing game settings, or a choked GPU; running overclocked and fully utilized, it does things like big dips in the 1% lows.
There is one likely cause of the 1% lows favoring AMD at 4K: having 24GB of VRAM. But by now also mature optimizations for the L3 cache; it was already doing great on RDNA2.
I am not really sure what part a weak CPU plays in AMD's advantage, but the 7900 XTX really seems to perform better.
Great benchmark

    • @AncientGameplays
      @AncientGameplays  Před 6 hodinami

It can't be called weak CPU performance if the same CPU runs better with another GPU.

  • @g2amarketplace506
    @g2amarketplace506 Před 2 dny +2

7:55 The 1% lows are too low with the Nvidia GPU.
According to old reviews, Nvidia GPUs were 50% faster than AMD GPUs in old DX11 games when the CPU was the bottleneck.
Thanks for making this video. I love driver performance videos.

  • @jeremiaskorndorfersouza4922

The character's bulging eyes watching the mosquito reminded me of many nights with an orchestra of mosquitoes, every time I turned off the light.

  • @RobertJianu
    @RobertJianu Před 2 dny +2

    Is it true that AMD has poor GPU drivers? Whenever I mention that I want to purchase a 7700 XT or a 7800 XT, almost everyone says that they have poor drivers, you experience stutters for no reason, and you occasionally get a black screen. I've never owned an AMD graphics card, but the prices are very competitive compared to Nvidia. Also, between a standard 4070 (not the super or Ti versions) and a 7800 XT, which is the better choice? I only play games at 1080p, and the prices are the same in Romania. Thanks!

    • @AncientGameplays
      @AncientGameplays  Před 2 dny +4

Most people don't even know the basics of PC stuff and then complain. Just stay away from cheap models, and believe me, 99% are user-related issues.

    • @Audiosan79
      @Audiosan79 Před 2 dny +2

That driver thing is a decade-old problem AMD had, and Nvidia fanboys are still obsessing over it. There are no more driver issues on AMD today than Nvidia has. Might even be fewer.
I have never experienced any form of stuttering, black screen, blue screen or any other funny screens on any of my AMD GPUs. I've got a 6800XT Red Dragon, a 6950XT Red Devil, a 7900XT Hellhound and a 7900XTX Red Devil. Just make sure to DDU your old Nvidia drivers before installing the AMD drivers.

    • @kognak6640
      @kognak6640 Před 2 dny

@@Audiosan79 Does Windows Update still force-install AMD graphics drivers automatically if the system detects AMD graphics without drivers? If I were DDUing graphics drivers, I'd either disconnect the internet or disable the auto-install feature for the first boot after DDU until the proper drivers are installed (pre-downloaded, of course). The issue is that if Windows installs (different) drivers at the same time, you very likely create a mess.
Obviously this is not a driver issue whatsoever, and automatic driver installs are useful for the vast majority of people. Just something to consider if DDU is used.

    • @warnacokelat
      @warnacokelat Před 2 dny

      @@kognak6640 that stopped happening since long time ago. With win10 I think?

    • @Ted_Kenzoku
      @Ted_Kenzoku Před dnem

If anything, they are better than Nvidia's, especially in the long run. I've had AMD GPUs for more than a decade now and never had a problem. Each time I want to upgrade, AMD GPUs happen to be the better value for money at the time, which is why I go with them.

  • @timinamegames
    @timinamegames Před 2 dny +1

So is it possible that pairing a budget CPU (R5 5600X) with a high-end AMD card gives a massive performance boost in BR games??? I only play PUBG, that's why I'm asking!!!

  • @lanelesic
    @lanelesic Před 2 dny +3

This is way more obvious with older quad-core CPUs/Xeons (and other CPUs) with limited single-core performance and lower/mid-range GPUs.
Pairings like an i7 4770 with an RX 6600 XT / 3060 12GB, or an R5 1600 or E5 1650 v3 with an RX 6700 XT / 3060 Ti.

  • @laserad
    @laserad Před 2 dny +1

Hey Fabio, AMD suffers from a similar affliction in DX11 games. I used to run a 9700K and it was bottlenecking my RX 6800 in older titles, specifically Ubisoft games when CPU-limited, while giving 30% more fps when using Vulkan.

    • @AncientGameplays
      @AncientGameplays  Před 2 dny

That's not because of the drivers; Vulkan helps mostly because the CPU is slow and the games are poorly coded.

    • @laserad
      @laserad Před 2 dny

@@AncientGameplays Probably right. Do you reckon Vulkan would help with frame rates when using Nvidia and a slow CPU?

    • @NathanBinks
      @NathanBinks Před 2 dny

I think it's because Vulkan can spread draw-call recording across many cores, like DX12, while DX11 typically uses only 1 or 2 cores. The reason GTA 5 gets good FPS is that it uses the deferred command lists built into DX11, which improves FPS by a lot; 90% of DX11 games don't use them. I can't remember how to check whether a game uses them, but look it up.
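The multi-core draw-call idea above can be shown with a toy model (my illustration, not from the comment; the function names are made up): in DX12/Vulkan-style APIs, several threads record draw commands into separate buffers that are submitted once at the end, whereas a DX11 immediate context records everything on one thread.

```python
from concurrent.futures import ThreadPoolExecutor

def record_commands(draw_calls):
    # Each worker records its slice into its own command buffer,
    # similar in spirit to a DX12 command list or Vulkan secondary buffer.
    return [f"draw({d})" for d in draw_calls]

def build_frame(draw_calls, workers=4):
    # Split the frame's draws across workers (round-robin).
    chunks = [draw_calls[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        buffers = list(pool.map(record_commands, chunks))
    # One submission point, like ExecuteCommandLists / vkQueueSubmit.
    # A DX11-style immediate context would do all of this on one thread.
    return [cmd for buffer in buffers for cmd in buffer]
```

The point of the pattern is that recording (the CPU-heavy part) parallelizes, while submission stays a single cheap step.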

  • @daLiraX
    @daLiraX Před 2 dny +1

Nvidia is also a major latency anomaly right now, with huge spikes added solely in the kernel-mode driver.

  • @halistinejenkins5289
    @halistinejenkins5289 Před dnem +2

    any chance of a similar video with intel cpus?

    • @AncientGameplays
      @AncientGameplays  Před dnem

      Don't have recent Intel ones

    • @halistinejenkins5289
      @halistinejenkins5289 Před dnem

@@AncientGameplays Oh well... I'm a sucker for CPU scaling. You got yourself a new sub; I appreciate your enthusiasm.

  • @mahmudjami8074
    @mahmudjami8074 Před 2 dny +1

    So this basically means if I pair my R5 3600 with a 7900XT, it'll perform better than the 4070Ti Super? Am I right in coming to this conclusion?

    • @AncientGameplays
      @AncientGameplays  Před dnem +1

      that card would perform on par with a 4070Ti Super (sometimes better), so yes, it will be better with a 3600. I would just upgrade to a 5700X3D asap though

    • @mahmudjami8074
      @mahmudjami8074 Před dnem +1

      @@AncientGameplays got it, thanks a lot. I'll try to sneak in a 5700X3D too though, it's just I'm focused on getting the GPU first. Cheers!

  • @AshtonCoolman
    @AshtonCoolman Před 2 dny +1

AMD has had a hardware command processor since the ATI HD 2900 XT. Nvidia never implemented that in hardware.

  • @thseed7
    @thseed7 Před 2 dny +2

    👍

  • @sapphyrus
    @sapphyrus Před 2 dny +1

    There is indeed an overhead difference, though in real life unless one pairs a 5 year old 200$ CPU with a brand new 1000$ GPU and then somehow use that on a 10 year old monitor, it won’t affect anything. Since extremely heavily CPU-bound games are a rarity, this shouldn’t affect purchasing decisions when it comes to brand but rather shows one shouldn’t keep a very old platform and tack on new expensive GPUs.

    • @AncientGameplays
      @AncientGameplays  Před 2 dny +1

Exactly. When you're using 4K DLSS Performance, though, you're technically rendering at 1080p internally, and people forget that.

  • @jeffreyrodriguez9009
    @jeffreyrodriguez9009 Před dnem +1

Interesting results, given the folk tale that AMD has poorer 1% lows. Now that can be put to rest.

  • @robertvizler749
    @robertvizler749 Před 2 dny +1

    00:50 Which game is this?

  • @gruuli
    @gruuli Před dnem +1

    DING DONG mr. Fabio, there is a new driver incoming (Adrenalin 24.6.1) please read the instructions for
    use or consult your doctor or pharmacist, we await your statement (the video of course) Greetings! :D

  • @nn-bw8nf
    @nn-bw8nf Před 2 dny +2

Heyoo, do you know why AMD doesn't release drivers as often as they used to?

    • @AncientGameplays
      @AncientGameplays  Před 2 dny +2

      they're working on other things I guess. New driver pretty soon though

    • @subhajitrouth5379
      @subhajitrouth5379 Před 2 dny +3

At least now they're not screwing up the drivers... From my experience, I am really happy they are not pushing frequent driver updates. 😂

    • @raresmacovei8382
      @raresmacovei8382 Před 2 dny +3

They're preparing the software side for RDNA4
They're preparing WDDM 3.2 for the stable channel when used on Windows 11 24H2
They're working on FSR 3.1
Tons of updates for a lot of stuff on GPUOpen
Getting Anti-Lag 2 integrated into more games
Working (hopefully) on reintegrating Anti-Lag+ back into the drivers
Further improving AFMF, maybe
Further improving AMF video quality and speed (AVC/HEVC/AV1)

    • @aaronsomler9423
      @aaronsomler9423 Před 2 dny

@@raresmacovei8382 Don't forget about Zen 5, which comes out next month.

    • @arx117
      @arx117 Před 2 dny

Because AMD has a small workforce, they have a lot of work on the list.
Meanwhile Nvidia has plenty of devs and can reallocate people to each team to do software stuff.

  • @galinhadopai
    @galinhadopai Před 2 dny +1

You should have measured the CPU usage, man, preferably with the fps locked at 60 or so, to see only the CPU usage and which one was higher.

    • @AncientGameplays
      @AncientGameplays  Před 2 dny +1

That's discussed and shown a bit, but not side by side. I could have done a small test, yes.

  • @adnelortiz
    @adnelortiz Před 2 dny +3

The 5600X is still too strong a CPU for the situation to fully manifest; it is really just scaling. HUB did this, and some results were mind-blowing.

    • @AncientGameplays
      @AncientGameplays  Před 2 dny +5

      I know, I explain all in the intro though

    • @ur_pore
      @ur_pore Před 2 dny

You can still see improvements at a micro level in total system latency, which some of us care about greatly. So regardless of being GPU-bottlenecked, I'll always upgrade.

    • @puffyips
      @puffyips Před 2 dny

I disagree with your first statement, though; the 5600X holds back even mid-range GPUs in certain scenarios, so high-end is definitely bottlenecked by it.

    • @Ted_Kenzoku
      @Ted_Kenzoku Před dnem

@@puffyips It barely bottlenecks today's very high-end GPUs, and only in some games, and only in specific use cases: low-res, high-fps gaming.

  • @larsenmats
    @larsenmats Před 2 dny +1

What's also interesting, looking through these results, is that the 5600X is actually not a huge bottleneck vs the AM5 7500F with high-end cards like the RTX 4080 Super and RX 7900 XTX. The 5600X is still a very capable CPU, and very cheap. I wouldn't pair the 5600X with any of these GPUs personally, but I still think it could have done much worse.

    • @AncientGameplays
      @AncientGameplays  Před 2 dny

      True, but as soon as you OC the 7500F it gets much faster. Look here: czcams.com/video/_MK0wO7MXuc/video.html

  • @young-j731
    @young-j731 Před 2 dny

Is Smart Access Memory dead? Will they improve it one day to make the AMD CPU/GPU combo better?

  • @HoodHussler
    @HoodHussler Před 2 dny

I guess I won't be upgrading my 7600X CPU anytime soon, unless the 9600X delivers way better performance; then I will.

  • @saminder_singh
    @saminder_singh Před 2 dny

Can anybody tell me which controller to buy, the Xbox controller or the PS5 controller?

    • @nombredeusuarioinnecesaria3688
      @nombredeusuarioinnecesaria3688 Před 2 dny

Game developers usually use the Xbox controller as the standard, which is why it has better support on PC in general. In any case, most of the features of the PS5 controller will be lost (if the motion sensor is important to you, for example for emulators like Yuzu, then go for the PS5 controller). Personally, I prefer third-party controllers with things like extra buttons; they help a lot to customize your PC experience or even to use macros.

    • @brunogm
      @brunogm Před 2 dny

There are Hall-sensor-based controllers for PC that emulate the Xbox controller.

  • @txkingg
    @txkingg Před 2 dny +1

    Bro should test escape from Tarkov lol

  • @evacody1249
    @evacody1249 Před 2 dny

So, should I get a 5600X to go with my 6700 XT?

    • @UKKNGaming
      @UKKNGaming Před 2 dny

Yeah, that's fine. I had a 6800 XT with my 5600X.

    • @chronexx69
      @chronexx69 Před 2 dny +1

      5700x and 6800 here 😊

  • @Definedd
    @Definedd Před 2 dny +1

In some future CPU benchmarks (not fully relevant to this video), would you be able to test games like Stellaris from Paradox, comparing how long a year or some other length takes to simulate on each CPU? There's a severe lack of that online, and it's a really good way to test CPU speed (for example, in Crusader Kings 3 my 5800X3D could complete three months 3 seconds quicker than my 5600 could).

  • @PhoticSneezeOne
    @PhoticSneezeOne Před 2 dny +3

    Not a real CPU bottleneck test until you use a FX8350 CPU ... just joking Fabio ;)

  • @SnakeJuice
    @SnakeJuice Před 2 dny +4

    When 24.6.1?

  • @SeeWildlife
    @SeeWildlife Před 2 dny +1

In short, GeForce is Buckfast and Radeon is Châteauneuf-du-Pape.

  • @lubossoltes321
    @lubossoltes321 Před 2 dny

My preferred test would have been to use the 7800X3D in all tests, but changing the power limits to make it clock slower...

    • @AncientGameplays
      @AncientGameplays  Před 2 dny +2

That would make no sense, though, as the IPC and cache would still help a lot.

    • @lubossoltes321
      @lubossoltes321 Před 2 dny

@@AncientGameplays It would be a much more consistent benchmark. IPC means nothing if the clock speed is cut...

  • @DonaldHendleyBklynvexRecords

Oddly, I have an 11th-gen i3 with my 7900 XTX, and at 4K it seems like my CPU never maxes out. I'm sure I'd get more frames with a newer CPU, but it almost seems unneeded with AMD.

    • @AncientGameplays
      @AncientGameplays  Před 2 dny +2

Your CPU never being at 100% doesn't mean it isn't bottlenecking, but at 4K most games will be fine.

    • @DonaldHendleyBklynvexRecords
      @DonaldHendleyBklynvexRecords Před 2 dny

      @@AncientGameplays ahhh I see ok, i was always waiting for 100% usage haha 😄

  • @Pavlobot5
    @Pavlobot5 Před dnem

    Would be interesting to separate the results by DX version

  • @Pixrl3491
    @Pixrl3491 Před 2 dny +1

Why does no one mention that on AMD the overall image quality is much sharper and more colorful?

    • @arx117
      @arx117 Před 2 dny +5

Because AMD's default color format is 4:4:4, while Nvidia's default is still the old 4:2:2 with reduced chroma resolution; you need to change it to 4:4:4 in the settings to enable full chroma.
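For context on the 4:4:4 vs 4:2:2 numbers (my addition, not from the comment): the second and third digits describe how much chroma resolution is kept per luma sample. A rough sketch of the resulting raw frame sizes:

```python
def frame_bytes(width, height, subsampling, bits=8):
    """Approximate raw frame size for common Y'CbCr subsampling modes."""
    # Samples per pixel: 1 luma sample plus the average chroma samples.
    # 4:4:4 keeps both chroma planes at full resolution, 4:2:2 halves
    # them horizontally, 4:2:0 halves them in both dimensions.
    chroma = {"4:4:4": 2.0, "4:2:2": 1.0, "4:2:0": 0.5}[subsampling]
    return int(width * height * (1 + chroma) * bits / 8)

# 1080p at 8 bits per sample:
full = frame_bytes(1920, 1080, "4:4:4")  # 6,220,800 bytes
sub = frame_bytes(1920, 1080, "4:2:2")   # 4,147,200 bytes
```

So 4:2:2 carries a third less data per frame than 4:4:4, which is where the softer chroma on fine colored detail (e.g. red text) comes from.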

    • @AncientGameplays
      @AncientGameplays  Před 2 dny +1

      That's a thing for another video

    • @AncientGameplays
      @AncientGameplays  Před 2 dny +2

      He means that the display engine is a bit better and it generally is

  • @1967KID
    @1967KID Před 2 dny +1

Fabio, nice job, thanks from the heart, brother. 💯🫡

  • @PeterPauls
    @PeterPauls Před 2 dny

To be fair, it wasn't as bad as I thought. My opinion is that ReBAR on Nvidia GPUs just doesn't work a lot of the time, and the interesting results came when the game was ReBAR-whitelisted. On the other hand, AMD's SAM works almost every time and eliminates the CPU overhead; or maybe the Nvidia overhead comes from the lack of ReBAR support. (Just speculation.)

    • @Vinterloft
      @Vinterloft Před 2 dny +1

      It's also because Nvidia offloads more texture decompression to the CPU. Like, way more

  • @mohmedkhaled9645
    @mohmedkhaled9645 Před 2 dny

    I was hoping to see 1440p performance as well, just as a reference. Other than that, amazing video as usual!

    • @W3RN3R318
      @W3RN3R318 Před 2 dny +3

Did you even watch the video? He tested 1080p and 1440p.

    • @AncientGameplays
      @AncientGameplays  Před 2 dny +2

      Dude, you didn't even watch and it shows. 1440P was there