DDR4 vs. DDR5, New Game Benchmark + Has 5800X3D Aged Worse Than 12900K?

  • Published 22 Jul 2024
  • Gigabyte Aorus X16: www.aorus.com/en-au/laptops/A...
    These videos all explain why CPU-limited CPU benchmarking is the only way:
    • Why Reviewers Benchmar...
    • CPUs Matter for 4K Gam...
    • Why AMD’s Bad Benchmar...
    Support us on Patreon: / hardwareunboxed
    Join us on Floatplane: www.floatplane.com/channel/Ha...
    Buy relevant products from Amazon, Newegg and others below:
    Intel Core i3-13100: geni.us/DfDsan
    Intel Core i5-13400: geni.us/lQZFpt
    Intel Core i5-13500: geni.us/goRuq2
    Intel Core i5-13600K: geni.us/lJgCa
    Intel Core i7-13700K: geni.us/Gpuqh3
    Intel Core i9-13900K: geni.us/AieGiL
    AMD Ryzen 7 8700G: geni.us/rTptD
    AMD Ryzen 7 7800X3D: geni.us/FxBWD
    AMD Ryzen 9 7950X3D: geni.us/BV9I9nH
    AMD Ryzen 5 7600: geni.us/qGJVbZ7
    AMD Ryzen 5 7600X: geni.us/Hn7m9
    AMD Ryzen 7 7700: geni.us/6VP8wD
    AMD Ryzen 7 7700X: geni.us/fAfH
    AMD Ryzen 9 7900: geni.us/RRL3Fa
    AMD Ryzen 9 7900X: geni.us/Rjlq
    AMD Ryzen 9 7950X: geni.us/lTmCsO
    AMD Ryzen 7 5800X3D: geni.us/wZd1
    AMD Ryzen 9 5950X: amzn.to/32gHUAN
    AMD Ryzen 9 5900X: amzn.to/3kUa1gq
    AMD Ryzen 7 5800X: amzn.to/32cX9uB
    AMD Ryzen 5 5600X: amzn.to/3evOFn9
    AMD Ryzen 5 5600: geni.us/dLiI
    Video Index
    00:00 - Welcome to Hardware Unboxed
    00:42 - Ad Spot
    01:30 - Has the 5800X3D aged poorly?
    05:22 - Test System Specs
    05:40 - Baldur’s Gate 3
    06:14 - The Last of Us Part I
    06:41 - Cyberpunk 2077 Phantom Liberty
    07:09 - Hogwarts Legacy
    07:35 - Assetto Corsa Competizione
    07:56 - Spider-Man Remastered
    08:40 - Homeworld 3
    09:03 - A Plague Tale Requiem
    09:26 - Counter Strike 2
    09:46 - Starfield
    10:18 - Horizon Forbidden West
    10:44 - Hitman 3
    10:57 - Watch Dogs: Legion
    11:11 - 13 Game Average [1080p]
    12:19 - Final Thoughts
    Read the article version on TechSpot: www.techspot.com/review/2862-...
    DDR4 vs. DDR5, New Game Benchmark + Has 5800X3D Aged Worse Than 12900K?
    Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
    Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
    FOLLOW US IN THESE PLACES FOR UPDATES
    Twitter - / hardwareunboxed
    Facebook - / hardwareunboxed
    Instagram - / hardwareunboxed
    Outro music by David Vonk/DaJaVo
  • Science & Technology

Comments • 1.2K

  • @ayuchanayuko
    @ayuchanayuko Před 13 dny +851

    Context: DDR5 prices were sky high at the release of the 12900K, and so were the prices of DDR5 boards. Compared to DDR4 prices and the price of the 5800X3D at the time, it was truly good value for its performance.

    • @jonivilander5758
      @jonivilander5758 Před 13 dny +64

      I'm still thinking about buying a 5700X3D to replace my 5600X; I play at 1080p with an RX 6700...

    • @mircomputers
      @mircomputers Před 13 dny +2

      Boards were cheaper after the release wave, as no one was purchasing them.

    • @Gazer75
      @Gazer75 Před 13 dny +27

      @@jonivilander5758 Not worth it I think... I upgraded to the 5800X3D and got some big improvements in some of my builder and management games, but little to none in others.

    • @YordanHristooov
      @YordanHristooov Před 13 dny +50

      @@Gazer75 It's important to know which games he's playing, because in MMOs, sims and a few other online games the cache makes a big difference, especially in the lows. I've experienced it myself! Also, my friend replaced his 5950X with a 5800X3D because we were doing 50v50 wars, and he couldn't believe how much better the 5800X3D was in that.

    • @High_On_Hope
      @High_On_Hope Před 13 dny

      @@jonivilander5758 Save for a better GPU.
      Go for the X3D if you have the budget for a 4070 Ti / 7900 XT level of GPU power.

  • @FatetalityXI
    @FatetalityXI Před 13 dny +891

    These "experts" on X 😂

    • @Hardwareunboxed
      @Hardwareunboxed  Před 13 dny +245

      Gotta love'em :)

    • @insideline158
      @insideline158 Před 13 dny

      @@Hardwareunboxed "Xperts", that platform really is full of them.

    • @Chasm9
      @Chasm9 Před 13 dny +104

      Ah, Twitter... the Digital mental asylum. 🙂

    • @christophermullins7163
      @christophermullins7163 Před 13 dny +76

      The fact that no one has said "X-perts" 😥

    • @Owen-np3wf
      @Owen-np3wf Před 13 dny +7

      @@Chasm9 Thought it would have been Reddit.

  • @ivofixzone6410
    @ivofixzone6410 Před 11 dny +21

    The video is about the 5800X3D aging. Why are the game benchmarks using the 7800X3D? I'm confused.

    • @MrDeathknight1
      @MrDeathknight1 Před 2 dny

      I thought the same, and that it would at least be on the chart, but they don't have the 3800MHz setup anymore.

    • @UKKNGaming
      @UKKNGaming Před 3 hodinami +2

      You skipped the video lol

  • @marinipersonal
    @marinipersonal Před 13 dny +465

    The 7800X3D at £314 in the UK, which can be cooled by a £30 air cooler, IS the best bet for gaming. No arguing.

    • @Toufan1-gr4br
      @Toufan1-gr4br Před 13 dny +35

      phantom spirit 120se my beloved

    • @AlexHusTech
      @AlexHusTech Před 13 dny +90

      Also doesn't need an additional PSU just to power it like the 14900 does lol

    • @TechnoFreak-IN
      @TechnoFreak-IN Před 13 dny +52

      7800x3d temps are ridiculously amazing for its gaming performance!

    • @ShadowsOfThePast
      @ShadowsOfThePast Před 13 dny +6

      ​@@TechnoFreak-IN what kind of temps are you getting in gaming and what sort of cooler..? I'm upgrading and gonna go for 7800x3d most likely.

    • @Boofski
      @Boofski Před 13 dny +31

      £314?
      Bro, the second-hand market is being flooded with 7800X3Ds because Zen 5 is around the corner. I can pick one up for €260 here in the Netherlands, that's like £220.

  • @DragonOfTheMortalKombat
    @DragonOfTheMortalKombat Před 13 dny +648

    It's not the 5800X3D aging worse than the 12900K, it's DDR5 aging like a fine wine. More games are taking advantage of the higher bandwidth.

    • @fVNzO
      @fVNzO Před 13 dny +110

      Availability of lower CL 6000 and 6400 memory also went from being premium to the standard cheap option. So most people building today will have decent lower latency memory that'll just work through XMP with zero tweaking necessary. This will give a couple percent more performance in favor of a DDR5 config.

    • @FatetalityXI
      @FatetalityXI Před 13 dny +67

      But but the "experts" on X will tell you it's the 5800x3D aging like milk. How dare you make sense!

    • @stennan
      @stennan Před 13 dny +21

      @@FatetalityXI Joke's on them, I like cheese 😋🧀

    • @SR-fi8ef
      @SR-fi8ef Před 13 dny +3

      @@stennan It's made from milk!

    • @insideline158
      @insideline158 Před 13 dny +8

      I wonder if equally large gains can be had if/when AMD can make CPUs that can fully use DDR5 speeds.

  • @MuhlisErtugrul
    @MuhlisErtugrul Před 13 dny +177

    By the time I upgrade to DDR5, DDR6 will be the new trend, so I'm gonna stay on DDR4 for now.

    • @davewhitefield1030
      @davewhitefield1030 Před 13 dny +15

      Same. I bought and built my PC with the latest DDR4 and AM4 availability while DDR5 and AM5 were barely emerging on the market. Couldn't be happier with the purchases.

    • @brockysthoughts1662
      @brockysthoughts1662 Před 13 dny +4

      Nah, it's just going to change form factor first, into this new LPCAMM2 standard, so we'll all need a new motherboard again.

    • @eliadbu
      @eliadbu Před 13 dny +23

      New revisions of DDR release every 6-7 years, and they're adopted a year or two after the official release. If you're the type who waits for new tech to come down in price, I'd say you'll be waiting until 2030 at the very least, so upgrading to DDR5 somewhere before then isn't unreasonable.

    • @endlessgrinding3764
      @endlessgrinding3764 Před 13 dny +2

      ​@@eliadbu a good assumption. I have heard same from some of my friends as well.

    • @christianmino3753
      @christianmino3753 Před 13 dny

      @@brockysthoughts1662 Oh, I'm sorry. I didn't know you could see the future.
      You understand the 2 at the end of CAMM2 implies that CAMM came first, right? CAMM has been around for a while now, and we have to wait and see what happens to know if it will replace DIMM. We've had plenty of different RAM form factors after DIMM became a thing, and plenty of form factors within DIMM itself too: SO-DIMM, UDIMM, etc.

  • @avi4796
    @avi4796 Před 13 dny +9

    Steve, the "lines" separating the graphs per CPU tier make them far easier to read than without. Same for those % numbers. Much, much easier. Good idea 🍻

  • @JarrodsTech
    @JarrodsTech Před 13 dny +386

    Steve staring into my soul 👀

    • @fleurdewin7958
      @fleurdewin7958 Před 13 dny +21

      Your soul has been leaking data just like Zotac . Steve picked that up.

    • @MuhlisErtugrul
      @MuhlisErtugrul Před 13 dny +1

      DDRy stare

    • @ibslice
      @ibslice Před 13 dny +2

      yup.. he keeps it in a nexba bottle.

    • @EbonySaints
      @EbonySaints Před 13 dny +2

      It's like an old 80s slasher flick poster where the killer is staring you down before he goes in for the kill. Steve channeled that Freddie Krueger energy hard for that thumbnail.

    • @Shadow_banned_by_YouTube
      @Shadow_banned_by_YouTube Před 13 dny +2

      👁️👄👁️

  • @auritro3903
    @auritro3903 Před 13 dny +286

    5800X3D may not be as powerful now, but it will always remain in our hearts...

    • @oogwaythesussyturtle
      @oogwaythesussyturtle Před 13 dny +69

      TBH it's still one of the best-value gaming CPUs out there. You can grab a 5800X3D for around 300 bucks these days and it's simply unmatched in gaming at that price point, especially since older mobos and DDR4 RAM are dirt cheap.

    • @Arejen03
      @Arejen03 Před 13 dny +14

      Me who recently swapped 2600 to 5600x 😅😭

    • @DevouringKing
      @DevouringKing Před 13 dny +7

      In Star Citizen it's still extremely good because the X3D cache advantage is huge there. With 40 watts of CPU draw I got over 400 fps in space.

    • @mikem2253
      @mikem2253 Před 13 dny +14

      Its still a very powerful gaming cpu.

    • @angeltzepesh1
      @angeltzepesh1 Před 13 dny +33

      Ehh, I don't know why everyone has this mentality of "it's not the biggest number, so it's not good anymore". Most people are still gaming at 1080p/1440p with 120Hz/144Hz panels, and a 5800X3D will still suffice for such a scenario.

  • @twiggsherman3641
    @twiggsherman3641 Před 13 dny +53

    My 5800X3D was on a flash sale, so it cost me 290 CAD. I replaced a 3600X. The 12900K was never going to be able to compete on price with a drop-in solution that good.

    • @dieseIboy
      @dieseIboy Před 12 dny +1

      That’s a great price! I got mine for $315 CAD. Where did you get yours at with that great price? Cheers 🇨🇦

    • @bufalong
      @bufalong Před 5 dny

      Does the AMD CPU work well? Any crashes in gaming?

  • @TheZoenGaming
    @TheZoenGaming Před 13 dny +27

    LOL My 5800X3D is still going strong. I've got it working with 32GB of DDR4 @3600 MHz CL14-16-16-36 and a 3090. There's no "degradation" to be seen.

    • @Resolute.j
      @Resolute.j Před 11 dny

      This is my rig except on a 5900X. CL14 @3600Mhz is so good, I don't think I'll move up until there is an equivalent in the current or upcoming platforms.

    • @metradawn8071
      @metradawn8071 Před 11 dny

      What's the point of running B-die on a 5800X3D? Any shitty DJR could have done the job. Also, if you're running your 3600MT/s CL14 B-die at stock, you might as well run it at JEDEC spec, then sell your CPU and replace it with an X3D chip.

    • @exoticspeedefy7916
      @exoticspeedefy7916 Před 10 dny

      @@metradawn8071 Latency makes a difference even on 5800x3d. I got down to 55ns with 1900fclk

    • @dixie_rekd9601
      @dixie_rekd9601 Před 14 hodinami

      @@metradawn8071 Sell his 5800X3D and replace it with an X3D variant? ... Hmm 🤔

  • @WarCrab-009SS
    @WarCrab-009SS Před 13 dny +137

    No clue about benchmarks, but my 5800X3D with 32GB of 3733MHz RAM and a 7900 XT kick every game I throw at them, in Linux and Windows.

    • @Hardwareunboxed
      @Hardwareunboxed  Před 13 dny +90

      It should, it's still a great CPU!

    • @WarCrab-009SS
      @WarCrab-009SS Před 13 dny +39

      @@Hardwareunboxed They say we should ignore those that antagonize us, but you did it with style with those "experts" on X. Well done.

    • @TheBoothy666
      @TheBoothy666 Před 13 dny +5

      I've got a similar set-up, just a 6900XT at the mo. Also on Linux (with dual boot into Win 10, which hasn't been used in months now!). Didn't see the point in going to AM5 at the time (no AM5 X3D CPUs back then, plus expensive RAM etc), so I just got the 5800X3D instead. Next buy will be a new GPU, but I'm in no rush; I mostly play strategy, base/city building, and RPGs. Will likely see what comes out later in the year from AMD, if anything.

    • @DeeDee.Ranged
      @DeeDee.Ranged Před 13 dny +2

      I've got a 5800X (2020), no 3D, paired with an RX 6900 XT (2022, got it for €500), running Debian testing with wine-staging, using the Heroic (as an Epic and GOG replacement) and Steam launchers, and some games directly on wine-staging, no Lutris or whatever. It will still be some years before I'm building a new rig. This will last me for the next 2 to 5 years depending on how gaming/the future evolves. No need for the latest and greatest yet. By that time I'll look around for the best value for the money, like most sensible people will do.

    • @christophermullins7163
      @christophermullins7163 Před 13 dny +3

      @@WarCrab-009SS X-perts
      It was right... there!!

  • @cracklingice
    @cracklingice Před 13 dny +163

    Would have been nice to see the 5800X3D in the final charts. I know you guys don't recommend AM4, but for anyone who wants a board with two 8-lane PCIe slots, the price of an AM5 board is literally double, so the 220 dollar increase in platform cost does make AM4 a consideration, particularly with the 5700X3D at just 200 dollars. I mean, that's the CPU and board for the cost of just the AM5 board.

    • @paulrmurrayful
      @paulrmurrayful Před 13 dny +18

      Yeah and the saved cash could help afford a much better GPU to pair it with.

    • @cracklingice
      @cracklingice Před 13 dny +1

      @@paulrmurrayful I get that, but it is not my concern as I already have a GPU. I just want to be able to upgrade from my X99 setup and still be able to have a GPU a network card and my SSDs.

    • @n3xus236
      @n3xus236 Před 13 dny +6

      Wrong and outdated, a decent AM5 motherboard can be found for as low as $150, not $220. You 5800X3D fanboys really love inflating numbers for the new gen, reminds me of 10-series card owners. We get it was a good investment, but times change.

    • @Shadowninja1200
      @Shadowninja1200 Před 13 dny +1

      The issue is that you're buying into a dead platform. I tell anyone looking for a new system that if they want the best gaming performance for 2-3 years, go for AM5/7800X3D. You get the benefit of potentially doing a CPU swap down the line (if AMD offers similar support to the AM4 line), and it's easier to swap a GPU out than a whole motherboard/CPU. A solid foundation makes for an easier time keeping a system updated for years, IMO.
      That, and you're looking at PCIe Gen 5, which can be split easily because of the insane bandwidth it runs at. I highly doubt anyone will actually use up a Gen 4 lane, much less a Gen 5 lane. I mean, hell, we're just now getting to the point where Gen 3 is a bottleneck.

    • @markh4750
      @markh4750 Před 13 dny +6

      @@n3xus236 Yeah, I recently updated my son's system from a R5 3600 to a 5700x3D, which made a lot of sense since he was already on AM4. When I built new for my stepson, it was 7800x3D all the way, because it didn't make sense to go back a gen.

  • @PookaBot
    @PookaBot Před 13 dny +13

    I've seen people suggest that the 3D vcache makes memory bandwidth less important to those parts. It might be interesting to test something like the 7800x3d with slower memory to see if it's less affected than expected.

    • @rallyscoot
      @rallyscoot Před 9 dny

      Yes, I want to see that review too. It might give us some interesting data.

  • @TheHoff6300
    @TheHoff6300 Před 13 dny +32

    The real value of the 5800X3D was (and sometimes still is) the upgrade from older Ryzen CPUs. It really pushes your PC to a whole different level when you drop a 5800X3D into your 2700X system. For a relatively small amount of money you are able to breathe enough CPU power into your system to last a fair bit longer.

    • @francoiscoupal7057
      @francoiscoupal7057 Před 13 dny +2

      I'm planning to do just that very soon to give my current desktop a few more years of use. Updating from a 3600.
      And since most of my games are CPU intensive sims (Stellaris, Oxygen Not Included, Rimworld, MOBAs,...) I don't even need a new GPU to max it.

    • @alexanderramos9121
      @alexanderramos9121 Před 13 dny +4

      Agreed 100%. 5700x3d has even better value. I find it comical AMD actually made a part for AM4 users right before their AM5 launch that literally has 78% of a 7800x3d's performance according to the 13 game average at 1:54 for less than half of the cost it would be to get a new motherboard, new ram, and the new cpu.

    • @NuttGreez
      @NuttGreez Před 13 dny

      The upcoming 5900XT is gonna be an epic drop-in productivity upgrade for anybody still on AM4... When it drops into $200 territory, my old 2800X productivity rig can be productive again lol.

    • @tonicipriani
      @tonicipriani Před 13 dny +2

      I just did a full renewal of my PC from late-2019, from 3700X and 2060 to 5700X3D and 4070 Super, doubled the RAM to 32GB, all on the same motherboard. I should be set for the next while, maybe even till AM6.

    • @nateo200
      @nateo200 Před 11 dny

      @@tonicipriani Yeah, I've got a 3900X with a 3060 Ti. 5800X3D or 5900X, not sure which. I've got 32GB of RAM but it's 2400MHz, so maybe I should upgrade that as well.

  • @AdamWebb1982
    @AdamWebb1982 Před 13 dny +21

    Still rocking a 5800X3D on my day-one X370 board… madness, that longevity.

    • @ferox63
      @ferox63 Před 13 dny +3

      Me too. I'm still running a Preorder X370 C6H. The fact that we were able to run so many generations of CPU's on one board is wild.

    • @AdamWebb1982
      @AdamWebb1982 Před 12 dny +1

      @@ferox63 yeh C6H here as well 💪

    • @W1ldSm1le
      @W1ldSm1le Před 5 dny +2

      X370 taichi with a 5700x3d here. 3rd cpu in the same board. Probably end up using the same board for a decade, as someone who's been building computers since 2004 it's pretty mind blowing to me. Used to upgrade virtually everything every 2.5 years.
      Even case design has come so far that it'd be hard to justify upgrading a good case made in the last 8 years.
      The barrier to entry for a new builder has become more expensive over time but 10 years ago nothing you bought was ever worth keeping. The recurring costs have actually gone down. A 300$ hard drive in 2005 was functionally worthless by 2008 for instance. But if you bought a 300$ nvme drive in 2021 it wouldn't be a paper weight today.

    • @benton7254
      @benton7254 Před 4 dny +3

      5800X3D on a B350 lmaoooo

  • @RedundancyDept
    @RedundancyDept Před 13 dny +92

    I'm glad you tested with DDR4-3600 instead of 4000. The former is more representative of what people actually use. Locked Alder Lake CPUs don't even have the option to use memory speeds much higher than 3200 in Gear 1 mode.

    • @CaptainScorpio24
      @CaptainScorpio24 Před 13 dny +2

      Really?? Damn, mine is stuck at 3000MHz on an i7-12700 non-K.

    • @MuhammadRidwan-pe7ny
      @MuhammadRidwan-pe7ny Před 13 dny +2

      @@CaptainScorpio24 Really? Even my i3-12100 could manage 3200 on XMP. Maybe try changing the RAM itself.

    • @CaptainScorpio24
      @CaptainScorpio24 Před 13 dny

      @@MuhammadRidwan-pe7ny I have an old 32GB set of RAM that I used to run on my i7-7700 non-K...

    • @Kumoiwa
      @Kumoiwa Před 13 dny +15

      Yes, there's no point in testing DDR4-4000 when it's usually even more expensive than regular DDR5-6400. Even 3600 CL14 is unrealistic, as most people will buy 3600 CL18 or 3200 CL16.

    • @Solrac-Siul
      @Solrac-Siul Před 13 dny +2

      I understand your point, but he is testing K versions, so the results presented even with DDR4 would surpass what one would get with a locked version. And quite bluntly, if there are people running a 13600K/14600K or 13900K/14900K with DDR4-3600, I don't know what to say, except that they shouldn't have purchased those CPUs and used DDR4. The point of a K version is to tweak it, and the 13600/14600, 13700/14700 and 13900/14900 can run 4000+ in a 1:1 ratio. I run my 14700K at 4533 16-16-16-36, and that allows it to match DDR5 7600+ in many games. In fact, in some titles, like Arma, the DDR4 setup beats my other DDR5 system, a 14900K running DDR5 at 8400. Let me repeat: 4533 1:1 beats DDR5 8200 in some games, and probably will beat 8600 or 8800 as well, since the increase in performance above 8200 is very limited, to the point where I prefer to run 8200 instead of 8600 in order to limit RAM temperatures.

  • @amusetech
    @amusetech Před 13 dny +25

    At the same time this video once again shows what a beast of gaming CPU the 7800X3D really is...

    • @aberkae
      @aberkae Před 13 dny +4

      Yeah, it's aging well across multiple generations vs the i9-13900K/KS and now the i9-14900K/KS. Intel decided a double smackdown was needed, renaming aside 😂.

    • @jimdob6528
      @jimdob6528 Před 13 dny +7

      I mean, it also shows how well the 5800X3D and 5700X3D are aging. The fact that I could put a 5700X3D on my B450 motherboard for 200 USD and get a 60% increase in CPU performance is crazy. Basically a new PC, and in 4-6 years I can platform-jump to AM6 when that comes out, and it's fantastic all around. Love the AMD/Intel competition.

    • @amusetech
      @amusetech Před 13 dny

      @@jimdob6528 Sure! If someone is on the AM4 platform and just wants to get the last boost out of it, it's an amazing deal and value.
      For people buying new gaming rigs, I can't see a valid reason to go with anything other than the 7800X3D.

    • @donaldnemesis393
      @donaldnemesis393 Před 13 dny

      I'm curious about how it performs in video rendering, to see if it's also the budget option for video editing as well.

  • @hasnihossainsami8375
    @hasnihossainsami8375 Před 13 dny +149

    X3D's biggest selling point isn't that it's the fastest - honestly, the average % difference between all of the CPUs mentioned here mean nothing at the proper resolution, because 1080p on a 4090 is silly.
    No, the real selling point of X3D is the 1% lows. Smoother gameplay > highest fps any day. And that is where even the 5800X3D sometimes comes out ahead of the 14900k, while the 7800X3D is on another planet.

    • @magnusnilsson9792
      @magnusnilsson9792 Před 13 dny +34

      Indeed, cache is king, for the 1% low.

    • @hivemind8817
      @hivemind8817 Před 13 dny +7

      4:16 Maybe look at the charts. Its 1% low is worse.

    • @-T--T-
      @-T--T- Před 13 dny +2

      I play most of my games through my 4090 at 1080p!

    • @copperypuddle3858
      @copperypuddle3858 Před 13 dny +24

      @@hivemind8817 When compared to a faster CPU using faster RAM, yes. Compare it to the DDR4 results and it's better. Now compare the 7800X3D to Intel's equivalent; the 7800X3D is always superior in 1% lows.
      Cope'n'seethe

    • @copperypuddle3858
      @copperypuddle3858 Před 13 dny +9

      @@hasnihossainsami8375 How is it possible that people still don’t understand why games are tested at 1080p. How.

  • @SoranPryde
    @SoranPryde Před 13 dny +85

    Yes the 12900K + DDR5 combo today is faster in modern titles than the 5800x3D + DDR4 combo
    But you also have to remember that back when the 12900K was new, to get its full potential you needed a decent Z690 DDR5 MB as well as DDR5 DIMMs to go along with it. DDR5 back then was expensive
    The 5800X3D, on the other hand, ran perfectly fine on cheap potato B550 boards, and 3600MHz XMP DDR4 was much lower in price than even cheap potato 4800MHz JEDEC DDR5. And for the 12900K + DDR5 to beat the 5800X3D + DDR4 by a decent margin, it needed 6000MHz-and-above DDR5, which was EXPENSIVE AF.

    • @cameronbosch1213
      @cameronbosch1213 Před 13 dny +3

      Keep in mind that DDR5 prices have freefallen in the 12 months since Ryzen 7000 was released. It's now a no brainer for new system builds.

    • @SoranPryde
      @SoranPryde Před 13 dny +13

      @cameronbosch1213
      I'm comparing in the context of how market conditions were when both the 5800x3d and 12900k were brand new. Of course we know DDR5 price nosedived 1 year after Ryzen 7000 launch but by then it was already half a year since 13900k was launched

    • @chomper720
      @chomper720 Před 13 dny +7

      You can even go lower than B550... B450!!!

    • @adlibconstitution1609
      @adlibconstitution1609 Před 13 dny +1

      Past is the past. Bottom line is I will choose 12900k + ddr5 over the 5800x3d when building a new PC.

    • @SoranPryde
      @SoranPryde Před 13 dny

      @@adlibconstitution1609
      The video's title is "Has 5800X3D Aged Worse Than 12900K".
      Therefore we're comparing how they stacked up when both were brand new, and how they've aged to today in terms of performance versus the money spent back then.

  • @buckiesmalls
    @buckiesmalls Před 13 dny +21

    I am glad your clocks are upside-down. I get your vids around 7am (US), right after I get off work. They're nice to wind down to.

    • @Osprey850
      @Osprey850 Před 11 dny +1

      Funny that you say that *their* clocks are upside down when you're the one getting off work when most of us in the US are just waking up. 😆

    • @buckiesmalls
      @buckiesmalls Před 11 dny +1

      @@Osprey850 Yeah. It works because MY life is upside-down with my working hours, as an Electrical Lineman. LOL

  • @MN12BIRD
    @MN12BIRD Před 13 dny +10

    Damn, some of these differences are mind-blowing, tbh. I remember when DDR5 launched 4 years ago (yes, it was 4 years ago now!) and in gaming benchmarks it was only like 2-5% faster than DDR4 at best if you were lucky (in some games it was basically the same!), and now seeing 20% and even 30% gains is crazy. I know DDR5 RAM has got a lot faster since it launched. I mean, what was the best at launch, 5600 at CL38 or something like that? I wonder how much of this improvement is just the DDR5 RAM being faster vs games actually using more memory bandwidth?

    • @Violet-ui
      @Violet-ui Před 13 dny +1

      Now you can get 7000 CL32 for 110 dollars

    • @danitch6848
      @danitch6848 Před 12 dny

      I think it's a mistake.
      They used the DDR4 in Gear 2.

  • @umusachi
    @umusachi Před 13 dny +1

    Great information! As a long-term viewer, I like this multi-pronged format!! What is great about keeping the older CPUs/hardware in the testing is the buying advice, in the sense that it helps you compare your current/older system, even a CPU or two from those series. I'm on 10th gen and would love to know how it compares, although I'm sure if I spent enough time digging, all the information is there in your archive of videos. Excellent content. I really appreciate all the hard work you both do!!!

  • @AtifSheikh
    @AtifSheikh Před 13 dny +6

    This puts a spotlight on another thing... how minuscule the 14th gen uplift actually is compared to 12th gen.
    Assuming we put the same DDR5 speeds on both the 14900K and the 12900K, the difference shown here would only shrink further.

    • @andersjjensen
      @andersjjensen Před 13 dny

      The architectural difference between 12th and 13/14th is just an increase in L2 and L3. So it has, thus far, been Skylake all over again.

  • @EnigmasityGG
    @EnigmasityGG Před 13 dny +16

    I‘m really happy I went with the 7800X3D about a year ago. It’s an absolute monster, especially paired with a nice kit of 6000 MHz CL30 memory. The craziest thing is, that it runs at 40-70 W while gaming and at a max of 90 W in an all-core workload. Also gives you a nice upgrade path to AMD‘s new X3D chips down the line.

    • @andersjjensen
      @andersjjensen Před 13 dny

      Unless you're running a stupid high end GPU at stupid low resolution you'll be well and truly set until Zen 6 X3D is being pushed down in price by Zen 7.

  • @tarkz7763
    @tarkz7763 Před 13 dny +4

    Another great video as always. As soon as I saw the title for this video it made me wonder if you would consider adding "Escape from Tarkov" to the battery of tests. There seems to be either a lot of conflicting opinions or misinformation, but for a few years now lots of people say it doesn't really matter too much about your GPU or CPU, but rather your RAM. A large portion of the player base can't/wont play one map in particular, despite many having decent, if not high end PC's. Between this video and your other recent video covering capacity, it's made me realise that there is very little coverage on RAM and I am personally really appreciative of this content

  • @TrackmaniaKaiser
    @TrackmaniaKaiser Před 13 dny +14

    Isn't the idea behind the 3D V-Cache that you get all the performance uplift of faster memory, and that's why the 3D V-Cache CPUs show poor DRAM frequency scaling, because they get all the performance even with slower memory?

    • @FTWGame0N
      @FTWGame0N Před 13 dny +15

      That's exactly it. With the CPU's increased cache, faster memory won't give much better results.
      It's also what makes 3D V-Cache so good from a value standpoint. Get the CPU, get cheap/midrange RAM, and you're set. You also don't need an expensive motherboard, and the power draw is low on those X3D CPUs.

    • @vx9330
      @vx9330 Před 13 dny +5

      Yes and no. X3D processors still scale with memory performance, they are just less sensitive to it, but they are bandwidth-capped due to the design of Ryzen's memory controller system. This means the additional bandwidth of higher-frequency DDR5 (6400+) can't be effectively utilized, whereas 13th/14th gen can scale with 7000+ DDR5.

    • @26Guenter
      @26Guenter Před 13 dny +4

      No, the V-Cache allows more data to be stored closer to the CPU cores for quicker access, instead of having to access system memory, which has higher latency. You would have to use very fast memory to notice a performance uplift, because the CPU doesn't have to access it as much.

    • @FTWGame0N
      @FTWGame0N Před 13 dny +1

      @@26Guenter why start with a No and then proceed to say the exact thing you are saying no to.

    • @26Guenter
      @26Guenter Před 13 dny

      @@FTWGame0N Because that's not what the OP is saying. He's saying the cache gives you a performance uplift because of a bandwidth increase similar to using faster memory, which isn't correct.

  • @techwandoUS
    @techwandoUS Před 13 dny +16

    Great thumbnail.

  • @Jasontyo
    @Jasontyo Před 13 dny +51

    I just purchased a 5700X3D, upgrading from a 3900X, and I feel it was worthwhile. Keeping my old system running for a reasonable price is more important than having the absolute fastest thing possible. I only have a 3070, after all. Honestly, I think the 1440p display was a mistake; staying at 1080p would have yielded much better performance.

    • @obeliskt1024
      @obeliskt1024 Před 13 dny +4

      what kept you from getting the 5700x3d instead? could've been the last upgrade you needed.

    • @imo098765
      @imo098765 Před 13 dny +3

      DLSS and FSR/XESS. This is the way

    • @jimdob6528
      @jimdob6528 Před 13 dny +7

      @@obeliskt1024 He/she did get the 5700X3D. Unless you are asking why he/she didn't get the 5800X3D? If so, it's 100-120 dollars more expensive for less than a 5% performance increase over the 5700X3D.

    • @obeliskt1024
      @obeliskt1024 Před 13 dny +11

      @@jimdob6528 nah, they edited the post. They initially mentioned the 5700x non 3d. So I guess all is good

    • @Jasontyo
      @Jasontyo Před 13 dny +6

      @@obeliskt1024 Typo. It was the X3D. for $192

  • @NEZLeader
    @NEZLeader Před 13 dny +3

    These new style of thumbnails as of late have been hilarious & awesome

  • @ianwiinamaki4770
    @ianwiinamaki4770 Před 10 dny

    This is the best thumbnail I have ever seen (just in case it changes): the one with the eye reflection on the RAM.

  • @jeroenvdw
    @jeroenvdw Před 13 dny +2

    Which timings were used on the RAM? Can make a big difference in determining what to buy

  • @nipa5961
    @nipa5961 Před 13 dny +48

    The 7800X3D is untouchable.

    • @poocruncha3266
      @poocruncha3266 Před 13 dny +4

      until the 9000x3d chips come out 😎....

    • @jjglaser
      @jjglaser Před 13 dny +2

      @@poocruncha3266 for double the price of 7000 series.

    • @nipa5961
      @nipa5961 Před 13 dny +2

      ​@@poocruncha3266 I fear Intel will come short again. Maybe barely competing with the 7800X3D but 2 generations behind 9000X3D, again.

    • @poocruncha3266
      @poocruncha3266 Před 13 dny

      @@nipa5961 Intel knows that their fanboys will buy anything they put out

  • @rockingfps5104
    @rockingfps5104 Před 13 dny +9

    Is there any possibility of including memory latency in the next video? Lower latency is very important, maybe more important than MHz/throughput.

    • @andersjjensen
      @andersjjensen Před 13 dny

      In scientific compute it is, but in game benchmarks it doesn't account for all that much as most of it is masked by cache hits.

    • @rockingfps5104
      @rockingfps5104 Před 13 dny

      In gaming I see quite a good difference with lower latency, in most cases where the CPU is the bottleneck.

  • @winterwolf9376
    @winterwolf9376 Před 13 dny +2

    We need merch with that image Steve!

  • @bront_catherder
    @bront_catherder Před 13 dny +1

    A great follow-up to this would be how much memory timings, such as latency settings (e.g. CL) and MT/s on DDR5, affect the chips. I know the answer is "not a lot" and "it depends" on the CPU, but given the premium some of the extremely high-end kits cost, and the flood of kits in the 6000-6400 MT space, I'd love to see the uplift from a CL36 6000MT kit to a CL30 kit to a CL32 6400MT kit, for example (particularly since the CL30 and CL32 kits mentioned work out to the same first-word latency of 10 ns, as the quick sketch below shows).
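    For reference, a minimal sketch of that first-word latency arithmetic (an editor's illustration using the standard CL × 2000 / MT/s rule of thumb, not figures from the video; the kit speeds are just the ones named in the comment above):

```python
# First-word (CAS) latency in nanoseconds: CL cycles divided by the real clock.
# DDR transfers twice per clock, so clock (MHz) = MT/s / 2 and latency = CL * 2000 / MT/s.
def cas_latency_ns(mtps: int, cl: int) -> float:
    return cl * 2000 / mtps

for name, mtps, cl in [
    ("DDR5-6000 CL36", 6000, 36),
    ("DDR5-6000 CL30", 6000, 30),
    ("DDR5-6400 CL32", 6400, 32),
]:
    print(f"{name}: {cas_latency_ns(mtps, cl):.1f} ns")
# DDR5-6000 CL30 and DDR5-6400 CL32 both work out to 10.0 ns, which is why those
# two kits share the same first-word latency; the CL36 kit lands at 12.0 ns.
```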

  • @ishshah8695
    @ishshah8695 Před 13 dny +14

    Thanks Steve!

  • @kohiek
    @kohiek Před 13 dny +4

    It's good to know that DDR5 is really getting used. DDR gens at launch have been underwhelming.

  • @ilijazafirov4192
    @ilijazafirov4192 Před 12 dny

    Liked this video, it is much much better to understand when comparing to older review and see the differences than just saying it how purely it is right now with top of the line GPU etc, really like these comparisons, gives us great information for newer builds, keep it up please, greatly appreciated!

  • @josephhoalt6824
    @josephhoalt6824 Před 13 dny

    Great video. Not sure if I missed it or not, but what was the spec on the DDR4-3600: was it Gear 1 or Gear 2? Also, if I've read correctly, isn't DDR4-3200 in Gear 1 better on 12th gen, as it keeps the ring bus in sync, and isn't that better?

  • @jayfangRSA
    @jayfangRSA Před 13 dny +7

    I do love the "for science" videos 😎

  • @dangingerich2559
    @dangingerich2559 Před 13 dny +3

    To me, this is telling me "more cache > more clocks" in most cases. As a 7800X3D owner, I'm fine with that.

    • @seibstershots9527
      @seibstershots9527 Před 13 dny

      I don't think it was ever about clocks, as two CPUs with the same core count and speed could perform completely differently. Instructions per clock is a better way to look at it, as it helps you understand how much can go through the CPU per given clock. It's like two different-sized rivers flowing at the same speed, if that helps.

  • @MinishMan
    @MinishMan Před 13 dny

    Best thumbnail yet! Showcasing Steve's slow descent into madness through benchmarks

  • @xSiJuki
    @xSiJuki Před 11 dny

    Hello, got a question. Is the premium price of the Trident Z Neo worth it over the TForce Dark ZA, at DDR4 3600MHz? Thanks!

  • @rangersmith4652
    @rangersmith4652 Před 13 dny +3

    LGA1700 is unique among current platforms in that we can use it to make direct DDR4 / DDR5 comparisons. Makes one wonder what we could achieve IF it were possible to modify a Zen3 CPU and AM4 board to run DDR5 memory.

    • @olnnn
      @olnnn Před 13 dny +1

      They did make the mobile 6000-series Zen 3+ CPUs that support DDR5 and LPDDR5, so it does technically exist (and also some Zen 2 mobile ones that support LPDDR5) - not sure if there is an easy way to compare them clock-for-clock to the Zen 3 DDR4 5000-series counterparts though.
      I guess if they ever end up with a lot of leftover dies, maybe they could make an ultra-budget AM5 APU based on them or something.

    • @ismaelsoto9507
      @ismaelsoto9507 Před 13 dny

      @@olnnn Sadly not as easy to compare, considering that Zen 3+ is exclusive to APUs, and it was quite clear that 16 MB of L3 cache reduced Zen 3 performance significantly compared to regular Zen 3's 32 MB. But it was enough to make them quite competitive with mobile Alder Lake CPUs.

    • @olnnn
      @olnnn Před 12 dny

      @@ismaelsoto9507 Yeah the cut in half cache really hurts the AMD APUs in gaming so at least the older ones basically ended up being about roughly equivalent to the previous gen non-APU cpu with the same core count once paired with a dGPU in performance.

  • @AlexHusTech
    @AlexHusTech Před 13 dny +8

    *Wow, didn't think DDR4 > 5 would make that much of a difference in performance*

    • @kesamek8537
      @kesamek8537 Před 13 dny

      On appallingly 'optimized' console ports it certainly seems to.

    • @andersjjensen
      @andersjjensen Před 13 dny +1

      @@kesamek8537 People have been saying what you just said for the three decades I've been in the PC space. Every time games feature more realistic physics, NPC logic and environment interactivity people expect it to come at no computational cost, and hence it must be "badly optimized".

    • @THU31
      @THU31 Před 11 dny

      @@andersjjensen Starfield features none of those things and it runs like garbage. 😂

    • @andersjjensen
      @andersjjensen Před 10 dny

      @@THU31 I didn't say that people were wrongly claiming that games that have what we already had, but runs slower, are badly optimized. Because those obviously are. If game A has all the newest stuff in abundance, and is still faster than game B, which has yester-years stuff, then clearly there is something unusually wrong with game B. And yes, Starfield is one of those.
      That said: Starfield has really shitty textures wrapped over pretty high polygon objects, which makes it look meh compared to what it actually is. And that's just plain stupid. My main sour grape with it, though, is that it is utterly brutal on the CPU for no apparent reason.

  • @BadBoy11m
    @BadBoy11m Před 13 dny

    Do you use the Intel Baseline profile in the BIOS for the K versions, or default settings?

  • @mikoldeon
    @mikoldeon Před 13 dny

    Another hilarious thumbnail with Steve in 'car RAM rear mirror', as if he played a part in old Steven Spielberg's 'Duel' movie. :)

  • @pseudonim1
    @pseudonim1 Před 12 dny +3

    Put the 5800X3D in the tests, please.

  • @manoftherainshorts9075
    @manoftherainshorts9075 Před 13 dny +25

    I've lost count of the people telling me that DRAM speed doesn't matter for CPU performance and that overclocking memory is stupid.

    • @MaxIronsThird
      @MaxIronsThird Před 13 dny +7

      For Intel sure, but Ryzen sees almost no improvement and the X3D parts see virtually no improvement at all.
      PS: this is also DOUBLING the frequency and getting 10% to 20% for 13th/14th gen and 15% to 25% for 12th Gen.

    • @alrizo1115
      @alrizo1115 Před 13 dny +3

      @@MaxIronsThird I remember the time when AMD needed high-frequency RAM compared to Intel to perform at its best. It was about 5 years ago.

    • @gizmo104drives7
      @gizmo104drives7 Před 13 dny +2

      The thing is.. in the right context that advice is actually good.. however it doesn't apply for every application... and THIS should be considered for all tech advice... there are very few universal truths

    • @M3N04
      @M3N04 Před 13 dny +2

      There is a difference, but not that much. Here the difference is big because they are comparing DDR5 with DDR4, which is a huge jump, but if you overclock by like 200MHz you won't see a big difference.

    • @yarost12
      @yarost12 Před 13 dny +1

      @@MaxIronsThird That's just false. 5200 to 6000 gives you around 15-20% more performance, and tuning timings gives you another ~10%.

  • @mrcnorth7149
    @mrcnorth7149 Před 13 dny +1

    Thx for this video. Lots of great info from this channel

  • @brutalysk6115
    @brutalysk6115 Před 13 dny +1

    amazing comparison!
    I'd really like to see some comparisons with ddr4 2133 at some point. I'm running 4 sticks of ddr4 on my 13900k and I'm wondering how much performance is lost.
    A lot of my colleagues have the same issue, running at ddr5 4800 and ddr4 2133 due to high memory requirements.
    I could upgrade to 2*48gb ddr5 to run ddr5 6400 or higher, but I'm not sure if it's worth it.
    Thanks and have a good day!

  • @eliadbu
    @eliadbu Před 13 dny +5

    The big reason I spent quite a lot and went with a DDR5 kit (6000MHz CL36) back in early 2022 for my 12900K: it makes a very large impact on gaming performance,
    especially after I upgraded to an RTX 4090.

    • @BenState
      @BenState Před 13 dny

      did you watch the vid? 15-20% improvement max

    • @eliadbu
      @eliadbu Před 13 dny +1

      @@BenState Maybe you didn't: on average across 13 games it's a 22% improvement, and some games went as high as 33%. In my opinion, as I had already allocated a high budget for the PC, it was worth it - 22% is what I expect as a gen-on-gen improvement for CPUs. Was it critical? Of course not, but as I was building a high-end PC it was worth the high price back then.

    • @BenState
      @BenState Před 13 dny

      @@eliadbu The $/frame for your extra 25 frames was?

    • @eliadbu
      @eliadbu Před 13 dny

      @@BenState If you compare just the delta of the RAM, it was around 200-250$ more than fast DDR4, which would make it around 10$/frame (sketched below).
      If you compare how much it added to the total system cost, it would probably be less than 10% back then (as GPU prices were high; add case + fans, PSU cables, CPU + cooler, SSD etc). As I said, for the high-end system I was building it made sense; for a budget system it wouldn't.
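      A minimal sketch of that cost-per-frame arithmetic, taking the figures quoted in this thread at face value (an editor's illustration, not data from the video):

```python
# Cost per extra frame implied by the figures above: a ~$250 DDR5 premium
# credited with roughly 25 extra average FPS.
extra_ram_cost_usd = 250
extra_frames = 25
print(extra_ram_cost_usd / extra_frames)  # -> 10.0, i.e. about $10 per extra frame
```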

  • @draxrdax7321
    @draxrdax7321 Před 13 dny +33

    DDR FOUR EVER! Not gonna download different RAM even when DDR FOURTY comes out!

  • @upforellie
    @upforellie Před 13 dny

    These thumbnails have been ART as of late. I almost look forward to them more than the reviews themselves

  • @chrisbeauchamp5563
    @chrisbeauchamp5563 Před 13 dny

    Great video Steve. I'm on 12th gen with DDR4 and have really been wondering how much that has been affecting my system; I was estimating 10%. It's a lot more, apparently.

  • @j3m638
    @j3m638 Před 13 dny +5

    My poor fast aging 9900K feels so abandoned already :( Looks like I had better finally upgrade to the 7800X3D at the end of this month then!

    • @Winnetou17
      @Winnetou17 Před 13 dny +1

      Weeps in 6700HQ. With 64 GB of 2133 CL 17!! RAM

    • @aberkae
      @aberkae Před 13 dny +1

      I upgraded from a 9900KS @ 5.1GHz all-core to a 7700X at 5.65GHz all-core, and now to a 7800X3D at stock. In terms of power/throttling, long game sessions, minimal ambient temperature delta changes and a small-form-factor build, the 7800X3D is awesome, especially with 50 watts of power use during average gaming sessions.

    • @BeatmasterAC
      @BeatmasterAC Před 13 dny

      my Ryzen 3700X: "come again?"

    • @ShadowsOfThePast
      @ShadowsOfThePast Před 13 dny +2

      Well..rocking an i5-6600k and a 6gb 1060... 😢

    • @DiogoSantos-qn8ev
      @DiogoSantos-qn8ev Před 13 dny +2

      I5 4690K GTX 970 G1 Gaming 😢

  • @Anomnomnomaly
    @Anomnomnomaly Před 13 dny +14

    I see no reason to upgrade my 5800X3D system for another 1-2 years. Paired with a 6900XT and 32GB of 3600MHz DDR4, it can play anything I throw at it @ 1440p with a more than acceptable framerate... throw in FSR and RSR and it's a more than capable system... Once I'm seeing new games that I play at a constant sub-100fps average, I'll start looking at a rebuild... I could always throw in a GPU upgrade to push that out until late 2026 if I see a bargain second-hand card for sale.
    But when I build a new system these days, it's so I can get at least a 100% performance improvement over my previous one.
    Value for money has always been my primary concern over bragging rights... I'll buy top tier stuff... just not at top tier prices. My 6900XT was 60% of the retail price in 2022, my 5800X3D was actually less than the 3800X it replaced.
    I'll stick with a platform that has proven longevity... intel don't know the meaning of the word. AM4 had 6+yrs, as did AM3+ before it.

    • @jimdob6528
      @jimdob6528 Před 13 dny +2

      This is exactly what I am doing. I got a used 7900XT for 580 on FBMP, but otherwise I don't plan on building a new PC till AM6 comes out in 4-5 years. I rock a 5700X3D, but my brother's 5800X3D only seems to show a 4% increase in performance, so I am happy with my build.

    • @Spentalei
      @Spentalei Před 13 dny +3

      As long as you stick with a 6900XT upgrading the 5800X3D isn't really necessary unless you suddenly fancy 1080p 480fps gaming or something. The 6900XT isn't that high-end anymore nowadays. Mid-end is a bit insulting, so somewhere in between. And that's where the 5800X3D is still adequate.

    • @t1e6x12
      @t1e6x12 Před 13 dny +2

      Quite content with my 5700X and 7900XT :)

    • @ferox63
      @ferox63 Před 13 dny +1

      I'm running a similar 5800X3D/6800XT setup on X370. I may opt for one more GPU upgrade but lacking pci-e 4/5 is starting to limit my storage options. I may be tempted to upgrade to AM5 depending on what X870 brings to the table.

    • @Spentalei
      @Spentalei Před 13 dny +1

      @@ferox63 you can use gen 4 and 5 on a gen 3 slot/bus, seriously doubt youre going to notice a major difference in practice unless you bench it. Being stuck on a gen 3 storage is one of the worst reasons to upgrade imo.

  • @troyyey4353
    @troyyey4353 Před 8 dny

    thanks for your hard work !
    any chance you can add 1440p for future benchmarks ?

  • @evankirschenmann4653
    @evankirschenmann4653 Před 11 dny

    I just noticed and LOVE the kuai kuai in the background of your video

  • @tarif7541
    @tarif7541 Před 13 dny +4

    what about productivity?

    • @Hardwareunboxed
      @Hardwareunboxed  Před 13 dny +22

      No time for work, we're gaming!

    • @tiay6269
      @tiay6269 Před 13 dny +7

      x3d is a gaming cpu not made for productivity

  • @RealLifeTech187
    @RealLifeTech187 Před 13 dny +24

    13th gen parts are missing in this comparison 🥁 LOL
    how are we supposed to know how they'd perform? 😂😂

    • @stennan
      @stennan Před 13 dny +3

      Assume that they are 1-2% slower than 14th Gen? Doesn't change the conclusion of the video which was mainly about memory scaling on intel boards and 5800X3D

    • @Winnetou17
      @Winnetou17 Před 13 dny +4

      @@stennan He was joking :))

  • @Noobsaucer
    @Noobsaucer Před 13 dny

    Great video, as always! It's very interesting that in some cases ddr4 is enough of a bottleneck to effectively reduce the 12900k to a 12600k. In spider-man remastered the 2 chips are almost identical with ddr4, while there is a massive delta between them with ddr5. I wonder if we would have seen similar results for ddr4 vs ddr3 if we had the platform compatibility at the time to compare them head to head like this. Fascinating stuff.

  • @deviousnate7238
    @deviousnate7238 Před 13 dny +1

    Whatever 5800x3d content you want to produce, I want to see. Even if it is just five minutes at the start and you move to a different topic, I'll be here for the whole thing!

  • @toddsimone7182
    @toddsimone7182 Před 13 dny +6

    A moment of silence for everyone who bought a DDR4 Intel board to save a few dollars.

    • @mattBLACKpunk
      @mattBLACKpunk Před 13 dny

      Meh, considering they would've had to buy a second kit of DDR5 a few years in to get any of those performance gains, I'd say it was still the right decision.

    • @andersjjensen
      @andersjjensen Před 13 dny

      @@mattBLACKpunk And a new board... and the DDR5 boards were also stupid expensive at the time.

  • @TetraBF
    @TetraBF Před 13 dny +5

    Useless test, we already know that stock DDR5 is faster than DDR4.
    Show us OC'd DDR4 vs OC'd DDR5.

  • @shlooky
    @shlooky Před 13 dny +2

    I bought a 12900K with a DDR4 motherboard and RAM, because DDR5 and DDR5 motherboards were expensive. It's good to know that there is still room for more performance if I buy a Z790 board + DDR5 memory.
    Thanks Steve for the insight.

  • @niyamimbi1179
    @niyamimbi1179 Před 13 dny

    Do you guys also have a cache of multiple premium DDR3 kits lying around, as you do with DDR4?

  • @IslamGhunym
    @IslamGhunym Před 12 dny

    I gotta be honest, this is the first time I've made it to the end of one of Steve's videos 😂 while that is always the case in Tim's videos 😅 So good job on this one. Thx Steve.

  • @AdnanKhan-be9dr
    @AdnanKhan-be9dr Před 12 dny +1

    Highly appreciate the thumbnail. You guys are on top of your game.

  • @joshman5217
    @joshman5217 Před 13 dny

    Fantastic video and content, thanks! In my mind I knew DDR5 was going to be faster, but this video helped better visualize just how much faster it is depending on the scenario.

  • @ffblegend
    @ffblegend Před 6 dny

    What RAM do you suggest pairing with the 7800X3D?

  • @petrihadtosignupforthis8158

    Well, it was usually an in-place upgrade for 2000/3000-series owners.

  • @tdub77
    @tdub77 Před 12 dny

    Will there be a significant performance advantage to having 16 Gen 5 PCIe lanes for next-gen GPUs, plus a full 4 more lanes for the main M.2 drive direct to the processor, on the next-gen Intel platform? Worth the upgrade from a Z690 platform? Thoughts?

  • @SAFFY7411
    @SAFFY7411 Před 12 dny

    I have another video idea and this mainly originates from previous reviews of the 5800X3D, where when the settings are lowered, the X3D performs much better in v-cache sensitive titles than non X3D CPUs. This was first seen with ACC where the 5800X3D was stonkingly fast at medium settings but when the quality is maxed out, the X3D benefits diminish a little. What would be interesting to see is if the same applies to the 7800X3D at settings other than epic. Worth investigating as gamers, especially sim racers won't be using max settings when racing to keep FPS as high as possible, but I'd be very curious if other games respond the same.

  • @zestnah
    @zestnah Před 12 dny

    Is 4 sticks of 8GB DDR4 RAM fine to get to 32GB? I currently have 2 sticks of 8GB and am planning to add two more.

  • @The_Dreadbug
    @The_Dreadbug Před 13 dny

    I just had one of my 2x32GB CL30 G.Skill Trident Z5 Neo RAM sticks go out for no reason. Anything that would cause that? I've tested it in every slot, and nothing. I did have EXPO enabled, but I didn't think that would cause it to go out. The sticks were only 6 months old. Nothing else in the system changed. I'm running a 7950X3D, an ROG Strix 4090 with 2 30mm rads, a custom water loop, 10 Lian Li TL fans, and an MSI X670E Ace mobo.

  • @gravidar
    @gravidar Před 13 dny +1

    Cheers. I'm about due to update my 5900X, so this is really helpful. Will check out the 9800X3D when that's out, but won't rule out Intel just yet.

  • @Neopulse00
    @Neopulse00 Před 13 dny +1

    8:45 - Surprised seeing Homeworld 3 on there used as a benchmark. It is currently still brutal to run well.

    • @motmontheinternet
      @motmontheinternet Před 13 dny

      Right? My 7600X can only get 44 fps on lowest settings including turning upscaling on while at minimum resolution.

  • @root330i
    @root330i Před 11 dny

    Great video. Just for the record, on the CP2077 graph the 12600K delta is 22%, not 29% (the uplift arithmetic is sketched below).
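    A minimal sketch of how such a DDR4-to-DDR5 uplift percentage is computed (the FPS values are placeholders for illustration, not the actual Cyberpunk 2077 chart numbers):

```python
# Percentage uplift of the DDR5 result over the DDR4 result on these charts.
# Hypothetical numbers, purely to show the arithmetic being double-checked.
ddr4_avg_fps = 100
ddr5_avg_fps = 122

uplift_pct = (ddr5_avg_fps - ddr4_avg_fps) / ddr4_avg_fps * 100
print(f"DDR5 uplift: {uplift_pct:.0f}%")  # -> 22% with these placeholder values
```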

  • @Novdid
    @Novdid Před 13 dny

    HUB is the hardware channel that does the more interesting stuff (the science-type tests, as Steve calls them) when hardware news is at a low. Well done!

  • @imJGott
    @imJGott Před 13 dny

    Thank you for testing ACC, which is a CPU-heavy sim. It's appreciated by the sim community.

  • @mudzera
    @mudzera Před 13 dny +1

    Someone please help me with this: which Intel processor will extract maximum performance from an RTX 4060 Ti?

  • @Demonoid1990
    @Demonoid1990 Před 13 dny

    I'm certainly anticipating a couple of things: will Arrow Lake be Intel's turnaround point, and how is this new CAMM2 going to perform? I'm certainly looking into trying both out on my upcoming build later this year. My laptop is going on 5+ years old now, and I've decided to go back to a desktop build.

  • @damasterpiece08
    @damasterpiece08 Před 13 dny

    What are the RAM options for 12400 and 13500 users on non-Z motherboards?

  • @TrivialTax
    @TrivialTax Před 12 dny

    Nice preview shot!

  • @carolinagamer2874
    @carolinagamer2874 Před 13 dny

    This is what confuses me about what RAM speed to pair with a CPU. When I look up what the max speeds are for DDR5 memory on the i5-12600K, it's less than what you're running.
    So when I upgrade to a DDR5 board, should I just go ahead and future-proof my memory and go over what my i5-12600K is able to use? It'll just run at the speeds it maxes out at, right?

  • @kenshirogenjuro873
    @kenshirogenjuro873 Před 13 dny

    Wow the gains with DDR5 are quite a bit larger than I would have guessed. Excellent testing.

  • @Azureskies01
    @Azureskies01 Před 13 dny

    I like how reviews (not specifically HUB) at the time were saying that DDR5 didn't increase fps when anyone that has been gaming for any amount of time knew that it would but the question is by how much. Now we have data to show just how much.

  • @blarzgh
    @blarzgh Před 13 dny

    Big fan of those % increase figures in the graphs, they can be a little bit difficult to unpack sometimes lol

  • @dianaalyssa8726
    @dianaalyssa8726 Před 13 dny +1

    I do wish I had the 3D V-Cache. The 5800X3D totally makes sense if it's affordable and you're on that platform. The 12900KF has been great with DDR5. I can't speak to DDR4, but it's an option for budget builds at least. It depended on when you bought into the platform: 399 for the CPU, 170 for the Z board, 90 for DDR5, later replaced with a 48GB kit for 115. I do wish I had done a 3700X build and then upgraded to a 5800X3D, but oh well.

  • @2Burgers_1Pizza
    @2Burgers_1Pizza Před 13 dny

    Do you have rank/channel interleaving enabled with DDR4 in BIOS? That big of a difference with DDR5 looks wild.

  • @projectc1rca048
    @projectc1rca048 Před 13 dny

    Great Work as always guys!

  • @Rachit0904
    @Rachit0904 Před 13 dny

    Does it still make sense for people with good DDR4 to keep it when upgrading to 13th gen?

  • @trackgg586
    @trackgg586 Před 13 dny +2

    The i9 cost vs the R7 X3D cost, plus an AMD A- or B-chipset board vs an Intel DDR5-enabled board, plus DDR4 vs DDR5 cost, for just a 7% gain? The 5800X3D has aged tremendously if this is the comparison we are to go by.

  • @WengerEj
    @WengerEj Před 13 dny

    Would this difference be applicable at all for 1440P?

  • @e21big
    @e21big Před 13 dny

    Hey Steve, seeing your data, I think it's kind of interesting that if you own a 12th gen Intel CPU you now have two paths to upgrade your system: either go all the way to 14th gen (if you own a 12600K or 12700K) on the same DDR4 and 12th gen mobo, or just upgrade your motherboard and memory to DDR5 and keep your CPU. I wonder if you would be interested in making a video about that.
    I think most 12700K or 12600K owners would be on the DDR4 platform (DDR5 was unreasonably expensive by the time these parts were out, and your channel also heavily recommended it yourself), and if upgrading your system can net you a 10-30 percent performance increase, it's definitely a worthwhile investment.

  • @Roo0ooD
    @Roo0ooD Před 13 dny

    good to see a normal thumbnail for a change lads

  • @lucaschenJC
    @lucaschenJC Před 13 dny

    Can we have the time stamps as well? Thanks!

  • @slothnium
    @slothnium Před dnem

    I bought a 5800X3D when it was on a massive sale. I couldn't be happier with it, even though I have a separate 13700K gaming PC.
    And now that the 13/14gen Intel i9/i7/i5 seem to have issues...