DDR4 vs. DDR5, The Best Memory For Gamers? Core i9-12900K, 41 Game Benchmark

  • Published 26 May 2024
  • Support us on Patreon: / hardwareunboxed
    Join us on Floatplane: www.floatplane.com/channel/Ha...
    Buy relevant products from Amazon, Newegg and others below:
    Intel Core i9-12900K: geni.us/xEZFptS
    Intel Core i9-12900KF: geni.us/hCMbu
    Intel Core i7-12700K: geni.us/8KvA
    Intel Core i7-12700KF: geni.us/ES3BK
    Intel Core i5-12600K: geni.us/sDNeS
    Intel Core i5-12600KF: geni.us/bbH0ALI
    Z690 Motherboards: geni.us/YKTbBaf
    DDR5 Memory: geni.us/LDZZER
    Read this article on TechSpot: www.techspot.com/review/2387-...
    Video Index:
    00:00 - Welcome to Hardware Unboxed
    02:31 - Test System
    03:14 - Assassin’s Creed Valhalla
    03:49 - Battlefield 2042
    04:08 - Call of Duty Vanguard
    04:20 - Counter Strike Global Offensive
    04:39 - Far Cry 6
    05:16 - Fortnite
    05:34 - Halo Infinite
    06:02 - Hitman 3
    06:13 - Microsoft Flight Simulator
    06:25 - PlayerUnknown’s Battlegrounds
    06:44 - The Riftbreaker
    07:16 - StarCraft II
    07:33 - War Thunder
    07:49 - Average
    08:18 - 41 Game Breakdown
    09:36 - Final Thoughts
    DDR4 vs. DDR5, The Best Memory For Gamers? Core i9-12900K, 41 Game Benchmark
    Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
    Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
    FOLLOW US IN THESE PLACES FOR UPDATES
    Twitter - / hardwareunboxed
    Facebook - / hardwareunboxed
    Instagram - / hardwareunboxed
    Music By: / lakeyinspired
  • Science & Technology

Comments • 1.1K

  • @Bruh-gx1ey 2 years ago +633

    41 games at 3 resolutions! You guys put in so much effort into your content, much appreciated! Thanks Steve!

    • @_draxin_0514 2 years ago +12

      @Klip One To be fair, not that many people concurrently play Valorant. At least not in comparison to other games out there that have the same kind of hardcore sweatlord community members and gameplay.

    • @Neuroszima 2 years ago +12

      As if anyone cared about valorant. More fancy CSGO at most

    • @yodaaki3652 1 year ago +3

      @Klip One Valorant runs on a toaster, it's not like we need DDR5 for that...
      Any current configuration is enough for valorant :)

    • @dazeen9591 1 year ago

      should've run the benchmark at 480p and 720p

    • @oliversmith2129 1 year ago +1

      @@dazeen9591 240p

  • @bb5307 2 years ago +903

    Most big gains are on Ubisoft games, it seems. Lots of high-bandwidth RAM must be beneficial when you layer the game with 3 layers of DRM and spaghetti code.

    • @jadhsyjhduyhjy 2 years ago +23

      haha you are right

    • @erikhendrickson59 2 years ago +34

      LMAO. I'd love to see some comparisons here between pirated (re: cracked) and "fully-DRMed." Honestly though, a lot of it is just the effect of real-time asset streaming benefiting IMMENSELY from sheer memory bandwidth over lower latency -- especially now with Above 4G Decoding-capable graphics cards to pull from said memory. You can effectively move entire 4GB chunks in one second, which is massively beneficial for asset streaming in open-world games (and for preventing stutter when doing so)

    • @Hadgerz 2 years ago +19

      Being an avid war thunder player, and seeing the OVER 20% PERF INCREASE in this test, that spaghetti code comment seems very much legit.

    • @saricubra2867 2 years ago +9

      "Ubisoft"
      You mean Buggysoft.

    • @Manakuski 2 years ago +1

      I would've imagined Call of duty: Warzone to be a title where we should see a difference

  • @mroutcast8515 2 years ago +237

    41 game benchmark - still sane, Steve??

    • @LucariosAxe 2 years ago +6

      PoE reference?

    • @mroutcast8515 2 years ago +2

      @@LucariosAxe I guess, been playing it for 8 years - phrase must have stuck with me.

    • @Boodoo4You 2 years ago +6

      41 games, at 3 resolutions = 123. Running each benchmark 3 times means he would’ve done at least 369 benchmark runs!

    • @marekk.3322 2 years ago

      Spot the PoE player in the comments

    • @xlinnaeus 2 years ago

      11:38 why don’t you ask him yourself…?

  • @IMarsiii 2 years ago +231

    Again, props to you for that amazing work and testing. No bulls***t, just crispy clean numbers, polished and presented for us in nice diagrams.

  • @JarrodsTech 2 years ago +315

    Nice new set area!

  • @CESAR_H_ARIAS 2 years ago +196

    When the RAM alone costs as much as a processor and motherboard combo bundle 😂😂💩 2021 has been amazing.

    • @xmaverickhunterkx 2 years ago +17

      It's new tech. It's always especially expensive.

    • @FeintMotion 2 years ago +5

      That isn't a supply thing, it's a new-product yields thing

    • @mza2001 2 years ago +8

      1) It's the new-tech tax, just like DDR4 had,
      2) The power architecture changed, and it's in the RAM itself, not the board
      3) It's still maturing, and it's a generational leap for enthusiasts, since DDR4 is at its peak and can't handle some applications
      4) Production is ramping up with lower latency + higher speeds
      5) Prices will mature once AM5 & Z790 come out with better-binned sticks for cheaper.
      6) The potential is only going to get crazier
      I don't get you guys complaining 😑, it's new tech.

    • @takehirolol5962 2 years ago

      It was obvious since the start of time...

    • @sagerdood 2 years ago

      Buuuuuut ddr5 is cheap. Certainly not as much as a mobo

  • @striker241186 2 years ago +18

    Thanks for the tests Steve. I myself got a 12700k and kept my DDR4 3200 kit from my previous motherboard

    • @greenumbrellacorp5744 2 years ago +1

      3200? OC that to 3600, I'm sure they can go with a little voltage (if they crashed on your old system, I bet that was the old IMC, not the memory itself). I myself had 32GB of DDR4-3200 that refused to run past 3200 at ANY voltage, and it required a ton of IMC overvolting on my 7700K. Now 3600 is easy, and it's pushed to 3800 and working fine (I expected 3600 to be a bit annoying, but nope: 1.4V and done, 3800 at 1.45V). RAM is fine at 1.5V anyway; that's what the heatsink is for (if it's an actual heatsink and not some useless plastic RGB crap over the RAM, that being said)

    • @sebastianwallace268 2 years ago

      My 3200 CL16 4x8GB Crucial Ballistix kit is running at 3600 MHz with just 1.37 volts vs. the stock 1.35. Definitely try a memory overclock. It was as simple as upping the frequency and rebooting the PC to see if it worked. Didn't touch any timings.

    • @greenumbrellacorp5744 2 years ago +3

      @@sebastianwallace268 You sure you kept the timings? Remember to ALWAYS set them to manual when doing that; if they're on auto and outside the XMP profile specs, the board can yolo it and set CL22 timings or worse

    • @striker241186 2 years ago

      @@sebastianwallace268 thanks for the tip. I overclock my cpu and gpu, but never thought of overclocking my ram

    • @sebastianwallace268 2 years ago

      @@greenumbrellacorp5744 The timings are manually set to the XMP values, except for tREFI, which is set to the max of 65535, which is supposed to help. It may even run at a lower voltage, but I know 1.37 is safe and bumped it just to ensure stability.

  • @Yellowswift3 2 years ago +1

    Love the lighting here, looks really good. I also noticed Steve giving us the eye when talking about the ram (trident-z), I think it was. :)

  • @Tepnox 2 years ago +18

    Nice to see Riftbreaker in your test setup. This game has a pretty good implementation of the newest techniques (ray tracing, VRS, FSR) and runs pretty well, despite being demanding on recent hardware!

  • @michaelbaldwin5953 2 years ago +68

    Like the set, including the overclocked OLED TV with air-conditioning cooling above it 🙂

    • @humer101 2 years ago +2

      He is in the lower level of the house, here we call it the basement, which is nice and silent for those videos.

  • @halfwayvinny9786 2 years ago

    Thank you for summing up the results in the beginning. Got the info I wanted, dropped the like. Great video

  • @NewmanOnGaming 2 years ago +46

    CL36 is rough to justify at the speed premium when you can still get solid performance out of CL14 and CL16 DDR4 kits.

    • @Dr.WhetFarts 2 years ago +8

      well, ddr5 destroys ddr4 in several of these games and ddr5 is brand new and far from mature, we will see 8000-10000 speeds in the next few years. ddr4 sucked balls on release compared to high-end ddr3. ddr5 seems far more promising than ddr4.

    • @NewmanOnGaming 2 years ago +1

      @@Dr.WhetFarts These speeds are only as good as the CCX cache of the CPU. For Intel, in certain types of workloads, this would be great. For AMD, it may not correlate the same. Overall, in gaming, it's nowhere near as impressive. It's all about what you want to accomplish with said speeds and modules.

    • @Ryan-gp7jd 2 years ago +10

      @@NewmanOnGaming "Nowhere near as impressive"? LOL dude, in several games it was over 20% faster. That's what the chart says, and that's now, when DDR5 is immature and DDR4 is highly matured... as time goes on, developers will only add support and optimize for DDR5 more. It's GG DDR4, bruh.

    • @weecious 2 years ago +3

      @@Ryan-gp7jd It's funny that DDR5 is up to 20% faster "mostly" in Ubisoft titles... lol.

    • @kowismo 2 years ago

      @@weecious Didn't know Fortnite is from Ubisoft, but okay.

  • @dignes3446 2 years ago +30

    Hi, Hardware Unboxed. I really appreciate that you guys are including a few RTS games in your benchmarks (SC2, AoE IV). Please keep doing this for us RTS gamers. Our genre is not dead... it's Undead!

    • @noelchristie7669 1 year ago

      Just started playing AoE 2 and I'm having a great time, excited to try the newer games once I'm done with it!

  • @catsspat 2 years ago +78

    1:51 470 USD for 32GB kit. Wow, just wow.
    Granted, I never buy "gamer" memory sticks, but even the most expensive pair of memory sticks I ever bought (2 sticks of KSM32ED8/32ME = 64GB) was less than that, at 376 USD.

    • @puciohenzap891 2 years ago +12

      My 6000 c36 32GB kit was €700 in Europe and it doesn't even work at 6000C36. Well played.

    • @MatarM0 2 years ago +1

      Got 32GB of Kingston Fury 6000 C40 for $320 ;)
      It's way below the market pricing now

    • @zakelwe 2 years ago +3

      Before Alder Lake came out, Crucial actually had 4800 reduced from £160 down to £155 for 2x16GB, with 90 sticks of it. Then it was not reduced, then it was £175. Now it is £250 and not in stock.
      Although not quick, that original price was, in hindsight, a bargain, even though still more expensive than DDR4.

    • @chrisvig123 2 years ago +12

      Complete waste of money at this point…I’m not getting bent over for diminishing returns 😯

    • @LiveType 2 years ago +3

      Eh, I'd rather get 128GB DDR4 for about the same price. Sure it's not "fast" memory (3200cl16), but you can open up anything you want and keep it running. Especially on Windows which has bloated up significantly with how aggressively it preloads and caches stuff. Right now I'm at 9.8GB active memory with 17 GB utilized. I literally only have a password manager, cloud file sync, and chrome with ~15 tabs open. How? Like actually how? Why does it need that much memory? That should take no more than about 6GB (that 9.8GB is about right), yet windows managed to fill up 17GB.
      If anything, I'd get 64GB DDR4 of decent Samsung B-die and tune those sticks. That's the best performance you'll get without going overboard. Memory performance is important and makes almost everything your computer does faster. Now add in an optane cache and you're off to the races.
      I'd give DDR5 another 18 months before it becomes a "better buy" than DDR4 right now.

  • @shiishani3302 2 years ago +2

    As always, excellent work. Thank you for keeping us informed & entertained.

  • @1danbon 2 years ago +12

    Thank you guys so much for including 4K. Even with RTX 30 Series and RDNA 2 it seems that 4K is still ignored in most benchmarks. I know it’s heavily GPU bound but still nice to see the data.

  • @vMaxHeadroom 2 years ago +40

    Agreed, won't be looking to upgrade from my 12700K for at least 4 to 5 years, and by then it will be 15th gen with ultra-cheap DDR5 or even DDR6!

    • @mikerzisu9508 2 years ago +1

      Would be nice to have the platform now to upgrade to faster ddr5 down the road versus having to buy a new mobo

    • @cicciopasticcio6939 2 years ago

      i've still got one 10700k @5.1 and i don't see one reason for upgrade...for almost 3 or 4 years

    • @bobbrock4221 2 years ago

      I'm waiting for DDR12 myself.

    • @weecious 2 years ago +1

      @@cicciopasticcio6939 You've had 10700K for 3 - 4 years..? Damn... you must be in the future then since 10th gen hasn't even been out for 2 years as of right now.

    • @silentlygaming3396 2 years ago

      @@weecious I think they meant they don't see a reason to upgrade for 3 to 4 more years

  • @miceforever9956 2 years ago +3

    Thank you so much for all your hard work Hardware Unboxed :)
    Love from Denmark

  • @gregnezz 1 year ago

    Thank you very much for this and all the tech vids. This is my go-to place. Love from Ireland!

  • @johntotten4872 2 years ago +3

    @Hardware Unboxed Great video, Steve, and spot on, man. Would love to see you test DDR4-4000 CL16 or CL17 memory in the upcoming memory test if you can. Some of us bought some nice memory for our last build and migrated it over to a new Alder Lake build.

  • @warlordwossman5722 2 years ago +126

    41 games benchmark? That's impressive.
    I was wondering if you could add Escape from Tarkov to the games you benchmark, it seems to be very CPU heavy and generally is a popular game, could be interesting data for many people!

    • @userise222 2 years ago +3

      Might be hard to reproduce the same scenarios each time, so not a fair comparison

    • @tyre1337 2 years ago +11

      if it's not on steam it doesn't exist

    • @whiteblack6865 2 years ago +2

      Factorio too! That is the perfect game to benchmark, especially since it's quite memory heavy. I don't get why they didn't include that game.

    • @warlordwossman5722 2 years ago +9

      @@userise222 By that logic, why would Steve have spent hours in Battlefield 2042 recreating the same situations? In Tarkov you can not only choose maps but even play offline with AI turned on to make very controlled tests lol

    • @Ideost 2 years ago

      I totally agree. Would be great to see popular games like Tarkov.

  • @raptor6600gt 2 years ago

    Cheers for adding The Riftbreaker to your benchmark suite. Please include it if you do a CPU benchmark comparison in the future.

  • @ImaMac-PC 2 years ago +2

    @14:00 anyone else notice the "death defying" near vertical descent in the upper left hand corner of the G.Skill box? Looks so cool against the tinted backdrop...like he was descending a mountain near sunset...I wonder if he made it down 😄 Steve, your camera is so good, that was a great shot! Love the new studio and different areas set up.

    • @iffy_too4289 2 years ago +1

      Until I went back a few seconds from 14:00, I was thinking, "This guy's a bit mental. WTF is he on about?"
      It's an ant on the box. OP is "a bit imaginative".

  • @sinephase 2 years ago +20

    I'm curious if there's ever been a time where a new memory type wasn't matured at least somewhat in the server market before being brought to desktop?

  • @globnomulous 1 year ago +3

    You saved me $800. I'm returning my Asus Z690 Hero and my GSkill Z5 C32 6400 and opting for the MSI Gaming Edge with 32 GB of the 3200 C16 DDR4 I already have. Just signed up on Patreon at $10/mo.

  • @testerzz 2 years ago

    Thanks! Perfect! Exactly what I was looking for these last few weeks.

  • @TheModeRed 2 years ago

    That pc setup in your background is super terrific! Have you done a tour?

  • @diegoberan7883 2 years ago +13

    I managed to get playable framerates with an RX 580 in Halo Infinite by overclocking the RAM. I'm playing at 720p low on DDR3-2400; I got almost a 50% fps increase just by taking the RAM from 1600 MHz to 2400 MHz.
    Setup:
    Core i7 4770K, 4.1 GHz OC on all cores
    RX 580 8GB
    4x4GB DDR3-2400 HyperX Predator
    Asus Z97-A motherboard
    240GB SATA SSD

    • @Leo.Kalash 1 year ago

      You'll get a massive boost at a reasonable price if you just move to an AM4 mobo and get that DDR4 combined with some Ryzen 5

  • @ccjh0806 2 years ago +29

    The under-recognition of CPU limitations in games is overwhelming LOL. Most of the time my 3080 isn't running above 97% utilisation

    • @vme23 2 years ago +9

      Pair that 3080 with an Alder Lake CPU and you'll see 99% GPU utilization in most games.
      I have an i9 12900k and my 3080Ti is maxed out in 1440p in all games that I play, even in the ones which are CPU limited like BF2042, New World, Anno 1800 etc.

    • @josephhodges718 2 years ago +6

      NVidia has a lot of driver overhead, they did a video on this a few months back I believe.

    • @Vegemeister1 2 years ago +5

      The only way to get 100% or near-100% GPU utilization is for the CPU to prepare frame N+1 while the GPU is rendering frame N. That adds latency.

    • @TrueThanny 2 years ago +2

      That's full utilization of the GPU.

    • @andersjjensen 2 years ago +6

      97% is to be considered "a strong GPU bottleneck". The PCIe bus has a command latency. The BCLK fluctuates (as it should if you have spread spectrum enabled (and you should)), so the response time from the GPU can vary a bit from command batch to command batch, which makes it impossible for the CPU to time everything perfectly. The GPU itself sometimes stalls on memory reads because of the dynamic-refresh nature of all DDR memory types, etc, etc, etc. Throwing "more CPU" at all of the above won't solve a single thing....
      It's basically only realistic to reach 100% GPU utilization with something like Furmark because it was specifically designed to do so. Furmark causes practically no PCIe traffic once the assets and shader programs are in VRAM. And the code that gets run on the GPU causes very little memory I/O but is insanely compute intensive.

  • @captante9889 2 years ago

    Wow .... nice image quality for this video!
    And I'm currently running OC'ed DDR4-3000 @ 3333/1667 CL16 on a Ryzen 5800 and am not bothering to upgrade it anytime soon.

  • @memberman 2 years ago +1

    Just the vid I've been wanting thanks 😊

  • @GodKitty677 2 years ago +3

    I was reading about this on another website. DDR4-4400 CL16 and DDR5-4800 CL40 have the same performance in games. In AIDA64 memory tests it's a 63,283 MB/s read for DDR4 and 70,195 MB/s for DDR5. The FPS is the same. Going from 4800 to 6000 is two FPS, and to 6400 is 1 fps.
    Another website states DDR4-3600 16-20-20-20-34 1T is just 0.9% slower than DDR5 36-36-36-76 1T Gear 2. This is at 1080p. This drops to a 0.4% difference at 4K.
    Now with DDR4-4000 CL15 the latency is 44.7 ns, and with DDR5-5600 C36 the latency is 57 ns. With overclocking DDR5 you can get 6500 CL28-37-37-26 CR1: latency is 47.6 ns and bandwidth is over 100 GB/s in AIDA64. This matters only if you are not GPU limited at 4K, which you will be.
    With DDR5-5200 CL36, many websites show no real performance increase, with DDR4-3600 C14 ahead by 1 fps. This is Far Cry 6, shown as massively better on DDR5 in this video.
    I think the whole DDR5 6000 vs DDR4 3600 comparison is not a valid way of doing this test. DDR4 4000 and DDR4 4400 should also be there. So should the slower DDR5 speeds. With overclocking, if you get the right performance kits, you can change the conclusion to your liking. The DDR5 6500 CL28 overclock will have better performance but will be GPU limited at 4K. Even websites that show DDR5 massively ahead at 1080p also show no performance increase at 4K because you become GPU limited. Average FPS remains the same, and this is with DDR5-4800 CL40 vs DDR4-3200 CL22, the stock memory for the Core i9-12900K. Max memory bandwidth is 76.8 GB/s, which is Intel's spec.
    What the video shows is that the higher-end overclocks for DDR5 can increase performance, but this is also much more expensive RAM than the DDR4 3600 kit. It then ignores everything in between, as well as stock speed. Really, the fastest DDR4 4400 kit should be used if money is no object and you are getting DDR5 6000 or higher. Remember, stock is DDR5-4800, and DDR5-4800 trades blows with DDR4-4000 quite well in many reviews.
    Why would you be getting DDR4 3600 RAM if performance is the goal and the money is there for a DDR5 6000 kit? The justification for the DDR4 3600 kit is very weak, to the point that you would seriously question why this comparison would happen. Thus the video's conclusion is that their DDR4 3600 CL14 overclock is not as good as DDR5 6000 CL36 kits. I did not need a video to tell me this fact. Why not just overclock to DDR5 6500 with a latency of 48 ns and be done with it, if the goal is the conclusion you want? Then compare to DDR4 3600 and state 20%-plus more performance in many more games.
    Here is a different conclusion: "The DDR5 topped the field by a smidge at 1,920 by 1,080 and the Medium setting. Better memory latency still had some impact, however, as evidenced by the standard DDR5-4800 falling slightly behind the DDR4-4000 XMP at other tested resolutions and settings. But it was super-close."
    And another: "In popular titles like Rainbow Six Siege and Star Wars Jedi: Fallen Order, DDR4-3700 memory is up to 7 FPS faster than the newer DDR5-5400 standard. Although the deltas in most cases are within the margin of error, it can't be denied that DDR4 (even overclocked) is a fair bit cheaper than standard DDR5-5400 kits, which makes it hard to recommend the latter. And this is with the higher-end Z690 chipset."
    Why ignore the slower DDR5 speeds and the faster DDR4 speeds? It makes no sense, unless the goal is a very fast DDR5 kit vs the average DDR4 kit, ignoring faster DDR4 kits that lower the price difference. It looks like cherry-picking to get the conclusion one wants, ignoring "no real performance gain" as a possibility under other configurations. It seems the real conclusion could be that the DDR5 kits' faster bandwidth balances out the slower latency; latency and bandwidth both matter, but at some point faster DDR5 kits could pull away from DDR4 in performance. Just a guess.

  • @nathanddrews 2 years ago +9

    While it's not exactly comparable to standard ECC RAM, the extra on-chip ECC bit-flip protection of DDR5 is worth some of the premium IMO. *SOME*

  • @AlexanderKalish 2 years ago

    Thank you for such a comprehensive analysis. Cool new set.

  • @hhectorlector 2 years ago

    Some great b-roll footage here! Also- kickass gaming room

  • @Dragon211 2 years ago +4

    Solid work as usual. I'll wait until the chip shortage is over before I make any major upgrades to my system. My general rule is "if you're getting at least 40 fps, don't upgrade." That flashy new hardware is probably tempting to buy, but you rarely need to buy it.

    • @Izzydawizzy 6 months ago

      This is the comment I needed to read😂😂😂

  • @moos5221 1 year ago +4

    This is a great test, I just wish you would have used the widely available CL16 instead of niche CL14 DDR4 RAM.

  • @tonyc1956 2 years ago

    Agreed across the board. As for the G.Skill Ripjaws 3600 CL16 kit: I've got 2x32GB kits in a 64GB config running 3800 CL16 with just a mild bump in voltage to 1.39. Might go higher, but the IMC on my R5 3600 is probably holding me back and I don't want to decouple the Fclk. Once the new Ryzen 3D CPUs are out, we'll see if that improves. Nicely done, Steve. Merry Christmas, you Aussies.

  • @keyserxx 2 years ago +2

    Nice roundup, thanks. I've never really been a RAM hype-train guy; I just buy whatever my motherboard manufacturer lists on the QVL, test it and forget it. I can't remember what I'm on right now, except it's paired with the greatest CPU ever made (3600) ;) Love the new digs.

  • @HuyV 2 years ago +8

    The problem is that while the frequency is a lot faster, the latency is actually worse.
    That 3600 CL14 has a latency of 7.78 ns while the 6000 CL36 has 12 ns of latency.

    • @johnboy2436 2 years ago

      To make this simple for you... latency numbers don't matter except insofar as they convert to FPS. This is what people like you don't understand. If DDR5 offers better FPS, the latency is lower. SIMPLE, done and dusted... so why you're converting latency numbers beats me. DDR5 is different, and the latency numbers say nothing right now. Benchmarks DO

    • @broklond 2 years ago +4

      @@johnboy2436 bruh, he's talking about memory latency, not input latency xD

    • @johnboy2436 2 years ago

      @@broklond Again, my comment stands, and I know exactly what he means... I can type the MHz and timings into a calculator and get the same ns he is referring to... AGAIN, FPS is the end result and tells all. There is no need to quote RAM timings, as 12 ns RAM on paper could get better FPS and therefore have less latency in-game...
      Not sure if that's confusing you? Higher FPS = less latency... The real-world application of memory latency doesn't mean faster RAM or more FPS... So, bruh, chill with the not-so-smart comments. People don't sit converting memory timings for no reason. He thinks DDR5 is slower because of memory latency... IT'S NOT. All these benchmarks and comparison videos involve gaming... THAT'S IT. I have fully addressed his misinformation.

    • @n00blamer 2 years ago +2

      @@johnboy2436 Depends on the workload. Some games were slower with DDR5, because the code was more sensitive to latency. Generally, higher bandwidth is better because there are latency mitigation techniques in both hardware and software. In hardware we of course have the cache and this can be augmented with cache-friendly data layout, which stems from software author's decisions. At 4 GHz 12 ns is 48 clock cycles, 12900k with 12-wide execution ports can miss up to 576 instructions in the worst case while waiting for memory. Of course the stall is likely to be less than the maximum because we can assume that the read request was issued way before the data dependency. For writes the latency is less relevant as they go through write commit buffer, and the data is live in the cache.
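
As a quick aside, the latency figures traded in this thread can be checked with a back-of-the-envelope sketch in Python (the function name below is ours, not from the video): CAS latency is quoted in I/O-clock ticks, and DDR transfers twice per clock, so one tick lasts 2000 / (MT/s) nanoseconds.

```python
# First-word latency of a DDR kit in nanoseconds.
# CL is counted in I/O-clock ticks; DDR transfers twice per clock,
# so each tick lasts 2000 / (MT/s) ns.
def first_word_latency_ns(transfer_rate_mts: int, cl: int) -> float:
    return cl * 2000.0 / transfer_rate_mts

# The two kits compared in the parent comment:
print(round(first_word_latency_ns(3600, 14), 2))  # DDR4-3600 CL14 -> 7.78 ns
print(round(first_word_latency_ns(6000, 36), 2))  # DDR5-6000 CL36 -> 12.0 ns

# The reply's stall estimate: a full miss to memory at 4 GHz costs
# roughly latency_ns * core_GHz core cycles, i.e. 12 ns * 4 = 48 cycles.
print(first_word_latency_ns(6000, 36) * 4.0)      # -> 48.0
```

Both commenters' numbers check out under this formula; the open question the thread argues over is how much of that extra latency DDR5's bandwidth hides in practice.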

  • @ashtonw9931 2 years ago +72

    Now I am curious how much latency affected this. CL14 is really good, but even normal DDR4 isn't much higher (16-20) at reasonable prices.

    • @Krenisphia 2 years ago +5

      I wonder if the Cas Latency on DDR5 will ever reach CL30 or even lower in the future...

    • @pr0xZen 2 years ago +14

      Having been through 3 of these generational transitions before this, I'd suggest expecting good primary out-of-the-box XMP timings for DDR5 to be around CL30-ish well before mid-life cycle. Timings usually double vs the former gen's JEDEC, but each gen has had its new little bits, bobs, tricks and means of optimization (including in the fab department). With DDR5 having more space for XMP timings, and the CPU memory controller no longer needing to do power management (the memory manufacturer can in theory even tune that per stick, because power management is on board the RAM module with DDR5), odds are good that we're going to get some blistering-fast, tight-timing DDR5 in the next 1-3 years. And you likely won't need those golden-unicorn bins of CPUs anymore (or more specifically, of the CPU memory controller) to actually get good use of that high-speed DDR5. High-end VRMs for memory will help in that department, though. Not for tons of amps, of course, so you don't need a lot of them.

    • @andersjjensen 2 years ago +24

      Always remember that latency is counted in ticks. So DDR5 6000CL40 is the same latency as DDR4 3000CL20 (but double the bandwidth).

    • @kyzyl4915 2 years ago +10

      @@andersjjensen I am glad you clarified this. Lower CAS does not mean the absolute latency is lower automatically :)

    • @bingbing3464 2 years ago +1

      @@Krenisphia It already has. Tinkerers and overclockers are running 6600 C28 daily.
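
The tick-counting rule in this thread can be verified numerically; a small sketch (the function name is ours, for illustration only):

```python
# Absolute CAS latency in ns: CL ticks, each tick lasting 2000/(MT/s) ns.
def cas_ns(mts: int, cl: int) -> float:
    return cl * 2000.0 / mts

# DDR5-6000 CL40 and DDR4-3000 CL20 have identical first-word latency
# (about 13.33 ns); the DDR5 kit simply doubles the transfer rate.
print(round(cas_ns(6000, 40), 2), round(cas_ns(3000, 20), 2))
```

This is why comparing raw CL numbers across generations is misleading: the tick itself shrinks as the transfer rate rises.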

  • @marcchapleau8343 1 year ago

    Now that is an impeccable benchmark and review. Very informative. You helped me better understand RAM. I am now equipped to avoid the marketing bullshit. Thanks!

  • @Lazarosaliths 2 years ago

    Wow, the new setup is CRAZY!!!
    Please make a video of the set so we can put it as a background on our PC monitors

  • @jshico3417 2 years ago +6

    The future-proofing argument is so dumb. You're better off building a PC now with a plan to upgrade to a new PC for the same price in 5 years than building a PC for the price of both and holding onto it for 10 years.

    • @tyre1337 2 years ago +2

      my friend wants to buy a 3090 so he could "play games for the next 10 years without worries"

    • @jshico3417 2 years ago

      @@tyre1337 The Maxwell Titan is barely alive at 1080p, and that's just 7 years old; and that's without talking about mechanical failure. My 970 required a fan swap after 5 years of use, and it was a pain in the ass to get the original fans.

  • @maxwilliam8470 2 years ago +20

    Do you want that 2%-10% framerate gain: YES
    Do you wanna pay 2x the price of the previous GEN: NO

  • @Adromelk 2 years ago

    The new set looks great! Keep up the great work buddy. Cheers!

  • @youngporky5788 10 months ago

    I see lots of effort went into bringing us that data. Much appreciated !!

  • @boyanmilev653 2 years ago

    I watched some of your videos before buying components for the new build, which helped tremendously with the choice, thank you very much! In the EU, I managed to buy 64GB of DDR4 3600MT/s memory for less than the cost of 32GB of DDR5 6000MT/s memory, and that's not even accounting for the extra cost of the motherboard. Even though I had no limitations on the build, it just made no sense going DDR5 right now. Glad to see the more extensive testing only proves this further.

  • @saumyashhah
    @saumyashhah Před 2 lety +14

    Would like you to include budget DDR5 kits as 6000 is very expensive.

    • @Hardwareunboxed
      @Hardwareunboxed  Před 2 lety +27

      It's a safe assumption that budget DDR5 will be even worse, but it should scale similarly relative to budget DDR4.

    • @SGIMartin
      @SGIMartin Před 2 lety +16

      DDR5 is not going to be a "good" buy for at least a year, as with every new iteration of memory. DDR4 was very expensive and not much faster than DDR3 when it was new.

    • @TommyVercettisGamingNews
      @TommyVercettisGamingNews Před 2 lety

      They'll be scalping PC cases next

    • @avatarion
      @avatarion Před 2 lety

      Corsair just announced DDR5 6400. I'm hoping that a year from now I will be able to buy 7000 for my Zen 4.

    • @imadecoy.
      @imadecoy. Před 2 lety

      I would hope that if you are buying DDR5 you've already committed to not cheaping out. It's like buying the cheapest Porsche to say you have a Porsche. :P

  • @krauzerheisenberg7960
    @krauzerheisenberg7960 Před rokem +7

    Hi Steve! After 6 months of DDR5 on the market, could you guys please do another short review if anything changed for these games considering windows/drivers updates, market prices, and new kits released? I would also love to see the difference between streaming/non streaming while playing those games, where it might show the real difference ddr5 vs ddr4. Thanks!

  • @13Dread
    @13Dread Před 2 lety

    Will you publish the results in a written article with details for each of the 41 games? Nice work, really nice to see these kinds of comparisons :)

  • @jokesinyourpants
    @jokesinyourpants Před rokem

    you helped me tremendously! thank you so very much.

  • @EyesOfByes
    @EyesOfByes Před 2 lety +103

    Regarding memory and gaming: according to Mojang's own devs, Minecraft is very memory dependent. Whether they meant latency or bandwidth, I'm not sure. So it would be interesting to see some benchmarks for Minecraft, perhaps with some of the most popular mods/launchers. Or are there so many variables in testing Minecraft that it would either make no sense, or take a 10-hour video on the topic? 😏🤓 Why Minecraft? Because my nephew's Christmas wish is a gaming PC, duh 😉

    • @machompko9905
      @machompko9905 Před 2 lety +19

      You are gifting your nephew a gaming PC with DDR5???? Holy shet.

    • @sabnox9869
      @sabnox9869 Před 2 lety +15

      The reason there's no point in testing Minecraft is because it can be played on a Haswell CPU with DDR3 at max settings with more frames than your monitor can display. If you want to get your nephew a PC, maybe look into an old Dell office PC with a 4th-gen or newer i7 or Xeon equivalent; then you just need a card at the 1060-1070 performance level and a 550-600W PSU and you're golden.

    • @zeinnassar2770
      @zeinnassar2770 Před 2 lety +16

      @@sabnox9869 he asked about mods, those things get really demanding

    • @0xCAFEF00D
      @0xCAFEF00D Před 2 lety +11

      @@zeinnassar2770 If you start benchmarking with mods you've got a huge number of variables to cover. Every mod will behave differently.
      You could make a benchmark for common libraries but if that doesn't already exist there's no sense in trying.

    • @criscrix3
      @criscrix3 Před 2 lety +2

      Mojang's statement is probably only about the amount of memory, though it would be interesting to see some benchmarks in heavily modded Minecraft.

  • @yedrellow
    @yedrellow Před 2 lety +5

    The biggest problem with the future-proofing argument is that within ~18 months, people will want to be using 8000 MT/s RAM, which won't run on current motherboards anyway.

    • @chillnspace777
      @chillnspace777 Před 2 lety +1

      This

    • @chillnspace777
      @chillnspace777 Před 2 lety

      For me, future-proofing is just going mid-range and upgrading every few years. No point in the top end when the mid-range usually matches or beats it within a few years.

    • @mrbobgamingmemes9558
      @mrbobgamingmemes9558 Před 2 lety

      Why bother with high RAM frequency if it's nearly impossible to see the FPS difference in games without an FPS counter?

  • @jadamsnz
    @jadamsnz Před 2 lety

    Many thanks for including FS 2020 - I wish more review sites did...

  • @brettwainscott1165
    @brettwainscott1165 Před 2 lety +1

    Thanks, great video and advice.

  • @Karazure
    @Karazure Před 2 lety +13

    Glad to see Riftbreaker making into the comparison. That game is fun!

  • @francis147741
    @francis147741 Před 2 lety +7

    Loving the new studio!!!

  • @Terrados1337
    @Terrados1337 Před 2 lety

    My initial thought would be that timings and latency "should" play a big role, assuming you are running optimized code.
    Of course there are situations, when moving lots of data to RAM, where the higher throughput of DDR5 comes into play. When it comes to gaming I can only really think of simulation or strategy games where this would be the case.
    Thank you for being so thorough with your benchmarks :) great video!

  • @iseeu-fp9po
    @iseeu-fp9po Před 2 lety

    Excellent as usual, Steve. You look happy in your new place.

  • @mikfhan
    @mikfhan Před 2 lety +18

    Hmm, yeah, DDR5 prices are ridiculous, so the future-proofing is probably more like having a board that can later use DDR5 once speeds grow further and prices drop to normal. Sounds like latency won't improve much over DDR4. Shame new boards can't handle old generations too; you're probably best off waiting for DDR5 price reductions before doing the board upgrade.

    • @Vegemeister1
      @Vegemeister1 Před 2 lety +1

      Latency hasn't really budged since, like, 2004. Nobody should be holding out for lower latency DRAM.

    • @mikerzisu9508
      @mikerzisu9508 Před 2 lety

      @Klip One at this particular time. Give it a year and DDR5 will be smoking DDR4, and then you're stuck with a motherboard that doesn't support it. I'm going DDR5; I'll upgrade to higher-frequency RAM later and sell what I had for probably not much less than what I bought it for. Things will improve with BIOS updates as well. The problem is getting it at the moment.

    • @fightnight14
      @fightnight14 Před 2 lety

      DDR4 ain't going away anytime soon. I still use my DDR3 and 4th-gen Intel without problems, and I still enjoy lots of modern games. When DDR5 speeds go up, current motherboards won't even support them, so you'll still end up replacing the board.

  • @mehrdadgholami5738
    @mehrdadgholami5738 Před 10 měsíci +3

    Did anybody notice that spider on the G.Skill box at 13:56 - 14:08? 😂😂😂

    • @heikg
      @heikg Před 9 měsíci +1

      Nice eye

    • @porojandaniel5738
      @porojandaniel5738 Před 7 měsíci

      Would have been nice to see a huntsman spider or a black funnel web one:))

  • @Windupmykilt
    @Windupmykilt Před 2 lety

    Absolutely love the new set, your shed is amazing

  • @DJRaffa1000
    @DJRaffa1000 Před 2 lety

    Thanks for mangling your brain on this one. The detailed data is much appreciated ^^

  • @mthorne074
    @mthorne074 Před 2 lety +20

    Considering the impact RAM speed has on the AMD platforms, I'm very interested to see how things perform there in the future

    • @greenumbrellacorp5744
      @greenumbrellacorp5744 Před 2 lety +3

      RAM LATENCY kills Ryzen... and CL40... we'll see. That being said, that can change vastly with 3D cache and the new IMC, so... yeah

    • @popcorny007
      @popcorny007 Před 2 lety +7

      @@greenumbrellacorp5744 CL40 means nothing if you don't mention the speed.

    • @greenumbrellacorp5744
      @greenumbrellacorp5744 Před 2 lety +1

      @@popcorny007 Well, obviously I'm talking about the DDR5 modules shown in the video: 4800 CL40.

  • @fasic46
    @fasic46 Před 2 lety +4

    We're waiting for the AMD APU DDR5 use case

  • @kohiek
    @kohiek Před 2 lety

    Nice background/film set!

  • @tonybeckett66
    @tonybeckett66 Před 2 lety

    Thanks for your hard work Steve, truly appreciated

  • @johnnywoolen1453
    @johnnywoolen1453 Před 2 lety +17

    DDR5 does have one advantage. All of the board makers seem to have kneecapped the DDR4 boards when it comes to premium features. For example, there is only one DDR4 motherboard that has a POST code readout, and it isn't available (in the US) either. So basically you don't get that feature unless you use DDR5. You also won't get 2 PCIe Gen 5 slots unless you move up to a DDR5 board.

    • @moldyshishkabob
      @moldyshishkabob Před 2 lety +6

      Interesting to note, though a bit of an artificial advantage.

    • @MaZEEZaM
      @MaZEEZaM Před 2 lety +1

      @@moldyshishkabob I bought the Gigabyte Aorus Pro DDR4 version; it has a POST code readout, but it's a very limited board. I wanted the Master, but it's only available for DDR5, which cannot be bought in Australia yet.

    • @SnifferSock
      @SnifferSock Před 2 lety

      Who is using gen 5 slots already?

    • @johnboy2436
      @johnboy2436 Před 2 lety +2

      Where do you find this junk info? Even the MSI Edge, a cheap LGA 1700 DDR4 board, has Gen 5 PCIe... In fact, most boards I'm looking at have PCIe Gen 5 slots... Mate, get out of here with your rubbish info.

    • @MicaelAzevedo
      @MicaelAzevedo Před rokem

      @@johnboy2436 Not PCIe 5.0 x16

  • @Ryan-ml7ws
    @Ryan-ml7ws Před 2 lety +6

    It would be cool to see how heavily tuned DDR4 compares to DDR5. 4133 15-15-15-35 dual-rank with tuned secondary and tertiary timings is possible on the 12900K.

    • @dainluke
      @dainluke Před 2 lety

      A friend of mine and I got lucky with MSRP Z5 at a local retailer here. My system was very borked so I haven't had a chance to properly use it or test it, but he used the latest beta BIOS to hit 1.25V 6200 36-36-36 with relative ease.
      This makes me think 7000 is very doable with the Samsung ICs at 1.5V. I only own a Strix-F so I'll have to see where it caps out at 6400, but I'm hoping for something with relatively tight timings. By that point DDR5 does significantly outpace even very good DDR4.
      My previous setup was a 5.1/4.8 10850K with 4000 15-15-15-30 DR and super tight subs. I'll probably not notice much difference in Vanguard, but I'm expecting it to make more of a difference in games that had obvious CPU/memory limitations like Far Cry 6 (stutter for days on Comet Lake).

  • @finnba_h
    @finnba_h Před 2 lety +1

    What camera and lens are used for the b-roll? Outstanding looks!

  • @GamexJoy
    @GamexJoy Před 2 lety

    Great review, thank you!

  • @Constantin314
    @Constantin314 Před 2 lety +5

    I think AMD was very smart not to release next-gen Zen at this moment, seeing what is happening in the DDR5 market. AMD still has more sales than Intel because of this DDR5 situation

    • @benjaminoechsli1941
      @benjaminoechsli1941 Před 2 lety +1

      "Intel's got absurd amounts of money. Let's let _them_ be the beta testers for this stuff."

  • @solidreactor
    @solidreactor Před 2 lety +11

    [Feedback] When presenting DDR memory speed and CAS latency, it would be nice if you could complement them with the calculated latency in nanoseconds.

    • @JackMott
      @JackMott Před 2 lety +2

      That would be good feedback for the RAM industry!

    • @Lead_Foot
      @Lead_Foot Před 2 lety +1

      First-word latency is just the best-case scenario and doesn't tell you what latency is in real-world situations. DDR5's independent channels per DIMM and same-bank refresh improve latency in certain situations. Intel's latency checker shows more comprehensive latency results for different workloads.

    • @nathangamble125
      @nathangamble125 Před 2 lety

      @@Lead_Foot Of course, but it's still useful to know it, as it's correlated with the other latency timings. That's why CL is seen as so important in the first place.

    • @solidreactor
      @solidreactor Před 2 lety

      @@Lead_Foot Well, do you think comparing CAS (tick latency) is more appropriate than time latency in nanoseconds?
      Intel's latency checker is not a thing you can use to add value to a benchmark like this one; it is a benchmark tool in its own right and needs its own slides or a fully dedicated benchmark video (which would be interesting in itself).
      I'm talking about adding extra info, one more variable, to an existing benchmark like this one: the time latency in addition to the tick latency (e.g. CAS).

    • @solidreactor
      @solidreactor Před 2 lety

      ​@@jayden974 They are not irrelevant at all if you compare TIME latency in nanoseconds between the types of RAM. You are thinking of comparing "clock tick" latency (e.g. CAS) which is NOT what I'm referring to.
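
[Editor's note] Translating CAS ticks into nanoseconds, as requested above, is a one-line calculation: a DDR kit transfers twice per clock, so a CAS tick lasts 2000 / (MT/s) ns and first-word latency ≈ 2000 × CL / (MT/s). A minimal sketch; the kit values are illustrative, not necessarily HUB's exact test configurations:

```python
# First-word latency in nanoseconds from data rate (MT/s) and CAS latency.
# DDR transfers twice per clock, so one CAS tick lasts 2000 / (MT/s) ns,
# and the first-word latency is 2000 * CL / (MT/s).
def cas_latency_ns(mts: int, cl: int) -> float:
    return 2000 * cl / mts

# Illustrative kits (assumed values, not the video's exact configs):
print(round(cas_latency_ns(3600, 14), 2))  # DDR4-3600 CL14 -> 7.78 ns
print(round(cas_latency_ns(6000, 36), 2))  # DDR5-6000 CL36 -> 12.0 ns
print(round(cas_latency_ns(4800, 40), 2))  # DDR5-4800 CL40 -> 16.67 ns
```

This is why a CL40 DDR5 kit at 4800 MT/s is slower to first word than a CL14 DDR4 kit at 3600 MT/s despite the bigger headline number.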

  • @rwk1013
    @rwk1013 Před 2 lety

    Wait, Steve has legs? I didn't know that. I'm digging your big screen tv cooler on the wall. Awesome video. Glad I went with the DDR4 on my 12900k and the same Gigabyte motherboard.

  • @conza1989
    @conza1989 Před 2 lety +1

    Steve! Love the set. Question: I'm building a new house and my air-con outlet is right above where my TV will be, just like your set. Any issues with that, or anything I should be concerned about?

  • @marceldiezasch6192
    @marceldiezasch6192 Před 2 lety +37

    Someone testing Fortnite, nice to see. How exactly did you test? I'd assume replay mode? Did you test a heavy late game scenario with lots of players around or something lighter? I'm asking because I found that Fortnite results can change heavily between those workloads. Ryzen 5000 was significantly (as in 20+%) faster than the 10900k in light scenarios, but in heavy late game scenarios the 10900k was within margin of error.

    • @Hardwareunboxed
      @Hardwareunboxed  Před 2 lety +25

      It's the replay and you can see the footage. It's mid game but doesn't really matter too much, as long as both memory types were tested in the same section, scaling will be very similar throughout the game. I'm also very familiar with Fortnite performance ;)

    • @marceldiezasch6192
      @marceldiezasch6192 Před 2 lety +6

      @@Hardwareunboxed Thanks. I'd love to see the game become part of your standard titles. It's still a very popular game with both a large casual and esports audience and it's the only UE5 game out right now. Sadly the constant updates complicate testing as sometimes they can have an impact on replay performance or flat out break the replay file.

    • @casperes0912
      @casperes0912 Před 2 lety +1

      @@marceldiezasch6192 What? Fortnite uses UE5? When did that happen?

    • @Hardwareunboxed
      @Hardwareunboxed  Před 2 lety +15

      @@marceldiezasch6192 the problem is the benchmark (replay) is broken with every update which is why no one bothers testing the game. You have to throw out your data each week.

    • @marceldiezasch6192
      @marceldiezasch6192 Před 2 lety +1

      @@casperes0912 They made the switch earlier this month with the update to chapter 3

  • @dainluke
    @dainluke Před 2 lety +5

    I appreciate your effort Steve, but I've been saying for a long time that the solution is simple: find two equally priced high-performance 32GB kits.
    The 6000 MT/s Z5s at $470 MSRP should at the very least have been pitted against the 32GB 3800C14 kit. It just makes the most sense. This is like comparing a 3080 vs an RX 6800.
    The 3600C14 kit isn't slow by any means, but why not just go like for like in terms of price range? I dunno for certain, but the 4000C14 32GB might actually be right at the MSRP of the Z5 6000s.

    • @ThunderingRoar
      @ThunderingRoar Před 2 lety

      Wouldn't 4000 CL14 dual-rank be difficult to run in Gear 1 on Z690, since most DDR4 boards are entry-level ones?

    • @dainluke
      @dainluke Před 2 lety

      @@ThunderingRoar Not necessarily. On the Asus Z690 and MSI Z690s, it shouldn’t really be an issue at all. A lot of dudes got 4000 Gear 1 working. Gigabyte is a bit of an issue.
      Also I saw your previous iteration, I mean I don’t want to sound like a dickhead, but it’s Steve from HUB. If he wanted/needed a $500 kit of DR 4000C14s, he’d get one for sure.

    • @dainluke
      @dainluke Před 2 lety

      @H27W Fair point. I feel like when these comparisons are made this should be explicitly mentioned if that’s the purpose of the video.

  • @xgunnas32
    @xgunnas32 Před 2 lety +1

    I believe this is the first extensive tight-timings DDR4 vs DDR5 test. Thank you, HUB

  • @techsweclome2567
    @techsweclome2567 Před 2 lety

    Thank you for showing 1440p and 4k charts! Much appreciated!

  • @HeirofCarthage
    @HeirofCarthage Před 2 lety +56

    Wow... why the hard push by Intel and other companies for DDR5 if it is so useless in games? Maybe games will use it one day, but yeah, I see no reason to get DDR5 boards for the foreseeable future. Great video! Thanks.

    • @Hybris51129
      @Hybris51129 Před 2 lety +10

      It goes to show that memory speed should take a back seat to the sheer amount of memory. Trying to compensate for only having 8 or 16 GB of memory by getting super-fast memory has long been known to be a waste of money, one that, if we are being honest, has been either ignored or even pushed by the popular tech tubers.

    • @amirpourghoureiyan1637
      @amirpourghoureiyan1637 Před 2 lety +19

      I imagine we'll see the difference once they get the timings down; latency matters more these days than megatransfers

    • @4RT1LL3RY
      @4RT1LL3RY Před 2 lety +30

      Mostly it's for servers. DDR5 is going to offer higher capacities. This is exactly how it is at the start of each memory generation: the top end of the last gen beats the brand-new high end until they find the right combination of bandwidth, latency, etc. DDR4 took 2-3 years to start being superior at reasonable prices.

    • @alherch
      @alherch Před 2 lety +1

      Money!

    • @skywalker1991
      @skywalker1991 Před 2 lety +11

      It's a tactic to make fanboys believe a higher number is better. Intel simply wants you to believe Alder Lake CPUs must be better because they support DDR5. Some dumb fanboys fall for it; some just throw money at the newest technology even if it doesn't offer any benefit.

  • @asldfkhjaslk
    @asldfkhjaslk Před 2 lety +5

    Therapist: Standing Steve doesn't exist, he can't hurt you.
    Standing Steve:

  • @eitansharabi9488
    @eitansharabi9488 Před 2 lety

    I was looking around for a clear comparison, and God damn did this hit the spot compared to other sources I've seen

  • @MCPabs
    @MCPabs Před 2 lety

    Great video!! A suggestion: could you add Icarus to this kind of testing? It is quite memory intensive

  • @SB-qo2fo
    @SB-qo2fo Před 2 lety +4

    I believe that activating SAM/Resizable BAR would reduce the difference between DDR4 and DDR5 in the games that showed the highest difference, like AC Valhalla. Can you confirm that, Steve?

    • @MrMeanh
      @MrMeanh Před 2 lety +3

      It would actually be very interesting to see if this is the case.

  • @derptyderp5287
    @derptyderp5287 Před 2 lety +5

    I wonder if you can get over 60 fps in Star Citizen's Orison area with DDR5, definitely a game that accesses a lot of system memory.

    • @alexandroutsos5990
      @alexandroutsos5990 Před 2 lety +1

      If you don't mind me asking, what type of Star Citizen player are you? I ask because I still love the concept and scope of the game. However, with the size of the team and the "update map" always significantly delayed, I wonder if restarting development in Unreal 5 would be smarter at this point. 5 years delayed and 20% of content in a functional alpha phase.
      I guess my question is: at what point will you give up on it as a pipe dream? Or will you ever give up on it?
      3 years of waiting for them to have a persistent universe, and it's still buggy as fuck.

    • @derptyderp5287
      @derptyderp5287 Před 2 lety

      @@alexandroutsos5990 I tried the free fly earlier this year, then bought the base package after that.
      I'm sceptical, to say the least, but it's fun and I'd like it to get somewhere.
      Considering how long it took them to get to this point, I doubt starting over would be a good call, especially with them having grabbed a load of the Crytek team to work on the engine.

  • @CrackaSlapYa
    @CrackaSlapYa Před 2 lety

    Thank You, great job man.

  • @tyraelhermosa
    @tyraelhermosa Před 2 lety

    Thank you for your hard work. Great video 👍

  • @bibeltours
    @bibeltours Před 2 lety +3

    So, you tested 41 games. That is impressive! But if you have so much time, why not leave 1 or 2 games out and do 1 or 2 PRODUCTIVITY BENCHMARKS instead?
    I know your audience is most likely gamers, but I'm pretty sure that many of them do stuff besides games, like video or image editing / rendering / streaming / compiling / etc.

    • @TrueThanny
      @TrueThanny Před 2 lety +1

      He already did that in the day-1 review. As he stated at the beginning of the video. This video is explicitly about getting more data on DDR5 in games.

    • @swft_
      @swft_ Před 2 lety +1

      DDR5 smokes DDR4 in productivity workloads thanks to bandwidth, capacity and on-die ECC. In games, especially the FPS type, DDR4 benefits from having lower latency; other CPU-heavy game types, however, DDR5 handles as well or better.

  • @vibonacci
    @vibonacci Před 2 lety +8

    You should run the AIDA64 memory bandwidth test. You'll instantly contact a scalper for DDR5.

    • @phoenixzappa7366
      @phoenixzappa7366 Před 2 lety +4

      DDR6 will be faster 😁

    • @frieza1016
      @frieza1016 Před 2 lety +3

      Which is pretty much irrelevant for gaming...

    • @scaryonline
      @scaryonline Před 2 lety +6

      Imagine buying DDR5 just for benchmarks and not for real-life performance

    • @ThunderingRoar
      @ThunderingRoar Před 2 lety +7

      You should run the AIDA64 memory latency test. You'll instantly forget about that overpriced DDR5 kit.

    • @vibonacci
      @vibonacci Před 2 lety +1

      @@ThunderingRoar Was waiting for this all too obvious rebuttal 😛

  • @DeanCalaway
    @DeanCalaway Před 2 lety +1

    Yes, thank you! Was waiting for this, hail the king of benchmarks

  • @807800
    @807800 Před 2 lety

    Thank you for your service, Steve!

  • @Muldeeer
    @Muldeeer Před 2 lety +5

    It would be good to test this with Escape from Tarkov..

    • @GTRWelsh
      @GTRWelsh Před 2 lety

      A lot of the CPU load comes from scavs, and it's not easy to replicate sessions exactly like for like, but it's such a CPU-demanding game, man. I totally agree it would be great if it were doable

    • @devilboner
      @devilboner Před 2 lety

      Is Tarkov particularly memory sensitive compared to other games?

  • @3nimac
    @3nimac Před 2 lety +3

    If we're doing tests for science, I'd love to see them compared at about equal latencies in terms of nanoseconds, because presumably DDR5 will get better in the future, but at that point we won't care to bench DDR4 any more.

    • @nathangamble125
      @nathangamble125 Před 2 lety

      But at what frequency? Presumably you'd need to have the DDR5 running at much higher frequency, or you'd need to use very high-end DDR4 with extremely loose timings in order to run both at the same frequency. I guess maybe both at 4800 could work?

  • @redringer314
    @redringer314 Před 2 lety

    I'm enjoying all the new games you've added to the tested roster!

  • @brothatwasepic
    @brothatwasepic Před 2 lety +1

    Thank you so much for this

  • @Wongseifu548
    @Wongseifu548 Před 2 lety +8

    Feel sorry for those who paid the early adopter tax for DDR5

    • @isakh8565
      @isakh8565 Před 2 lety +2

      Don't, they can sell their kit for $2000 on ebay 😂

    • @noodlefoo
      @noodlefoo Před 2 lety +3

      Don't be, they obviously have more money than sense.

    • @heickelrrx
      @heickelrrx Před 2 lety +1

      Probably they use it for work, not gaming
      If they only use it for gaming... Hnggggggg

    • @promc2890
      @promc2890 Před 2 lety +2

      @@heickelrrx Even for work you need capacity more than speed, so they could just buy a 64-128GB DDR4 kit for the price of 16GB of DDR5

  • @monikaw1179
    @monikaw1179 Před 2 lety +3

    I'm on Ryzen so it doesn't matter to me, but the good thing with Intel using DDR5 first is that by the time AM5 launches the DDR5 market should hopefully have taken off in a bigger way (faster kits, better supply, slightly lower prices).

  • @SynthetickRS
    @SynthetickRS Před 2 lety

    I like what you've done with the set; it's looking really good.

  • @matthewmcsparren7534
    @matthewmcsparren7534 Před 2 lety

    Good video, nice breakdown.