300W Intel Core i9-13900K CPU Review & Benchmarks: Power, Gaming, Production

  • Date added: 15. 06. 2024
  • Grab a GN Coaster Pack & support all our testing efforts! store.gamersnexus.net/product...
    Our review & benchmarks of the Intel Core i9-13900K CPU include testing vs. the AMD Ryzen 9 7950X, R9 7900X, Intel i9-12900K, and more. The 13900K CPU is brand new from Intel, but sockets into both Z690 (backwards compatible) and new Z790 boards -- though there shouldn't be much difference in features between the two. The 13900K can run with DDR4 or DDR5 and will likely be the last true consumer platform on DDR4; AMD AM5 is already fully committed to DDR5.
    Watch our Intel i5-13600K CPU review here: • Intel Takes the Throne...
    One correction in this review: We just noticed the power efficiency chart contains a charting error (I accidentally plotted the power efficiency from a different test for the 13900K, but the chart is for Blender - so the numbers are correct, but they're for the wrong software for the 13900K). This error doesn't change any of our conclusions, nor does it change that the CPU does draw 300+ watts under full load, to be clear, so power consumption is right. This is the only error we're aware of and, again, the numbers are right, but the 13900K result for the Blender chart is different. That particular chart should be between 32.5-37.0, depending on which set of results you take for the math and whether you calculate it warm or cold. This means it is still one of the least efficient entries on the chart; however, the 12900K is actually slightly less efficient than the 13900K here, not more. Sorry for that mistake! No excuses. Just too much data too fast, and I'll try to get a few hours ahead of the review cycle next time so I'm not operating so tired. My fault and thanks for your understanding! - Steve
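For anyone who wants to sanity-check an efficiency number in that range, here is a minimal, illustrative sketch, assuming a score of the work-per-energy kind; the function name and every input below are hypothetical and are not GN's actual methodology:

```python
# Hypothetical perf-per-energy score: higher is better. Re-testing "warm" vs.
# "cold" shifts both render time and average power, which is why the corrected
# 13900K value spans a range (~32.5-37.0) rather than a single number.
def efficiency_score(work_units: float, avg_power_w: float, runtime_s: float) -> float:
    energy_wh = avg_power_w * runtime_s / 3600.0  # energy consumed, watt-hours
    return work_units / energy_wh

# Invented numbers, for illustration only:
print(efficiency_score(work_units=1000.0, avg_power_w=320.0, runtime_s=3200.0))  # ~3.5
```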
    Benchmarking includes power consumption tests of the i9-13900K, efficiency charts, and single-core consumption. We’ll also go through gaming benchmarks at 1080p and 1440p, some RTX 4090 benchmarks on the 13900K vs. 7950X, and benchmarks in Blender Cycles, code compile, and the Adobe suite (Premiere & Photoshop).
    RELATED CONTENT
    Watch our AMD Ryzen 7 7700X CPU review, which contains the latest data for the 7950X, 7600X, and 7900X: • AMD Ryzen 7 7700X 8-Co...
    RELATED PRODUCTS [Affiliate Links]
    Intel i9-13900K on Amazon: geni.us/LVBD
    Intel i5-13600K on Amazon: geni.us/sGMSK
    AMD Ryzen 9 7950X on Amazon: geni.us/6X1RO
    Intel Core i9-12900K on Amazon: geni.us/YAlrh9
    AMD Ryzen 9 7900X on Amazon: geni.us/FIyuhcS
    AMD Ryzen 7 7700X on Amazon: geni.us/udIJ
    AMD Ryzen 5 7600X on Amazon: geni.us/6AdNkw
    Intel Core i7-12700K on Amazon: geni.us/MP2tSl
    Like our content? Please consider becoming our Patron to support us: / gamersnexus
    TIMESTAMPS
    00:00 - Intel i9-13900K CPU Review & Benchmarks vs. AMD Ryzen
    01:30 - Basics
    03:05 - Boost is Infinite
    04:10 - Compatibility with Z690, Z790, & Memory
    05:07 - Power Consumption on 13900K in All-Core Workload
    07:23 - Single-Core Power Consumption on 13900K vs. 7950X
    07:48 - (Correction in Comments & Description) - Efficiency
    08:39 - Brute-Forcing Gaming Results
    09:42 - CSGO CPU Benchmarks - 13900K vs. 7900X (1080p, 1440p)
    10:48 - FFXIV Best CPUs (13900K CPU Review)
    11:22 - Rainbow Six Siege Top CPUs 2022
    11:49 - Tomb Raider CPU Benchmarks (RTX 4090 & 3090 Ti)
    13:08 - Far Cry 6 CPU Benchmarks (RTX 4090 & 3090 Ti)
    14:03 - Frametime Chart - 7950X vs. 13900K
    15:11 - Cyberpunk 2077 CPU Tests with RTX 4090
    15:30 - Production Benchmarking
    16:00 - Blender Cycles CPU Rendering Benchmarks 2022
    17:25 - Code Compile CPU Benchmarks for Programming
    18:49 - Compression & Decompression CPU Benchmarks
    20:42 - Adobe Premiere CPU Benchmarks 13900K vs. AMD Ryzen
    21:36 - Adobe Photoshop CPU Benchmarks
    22:33 - Conclusion: Intel i9-13900K
    25:45 - The i5-13600K Will Be Really Interesting
    ** Please like, comment, and subscribe for more! **
    Links to Amazon and Newegg are typically monetized on our channel (affiliate links) and may return a commission of sales to us from the retailer. This is unrelated to the product manufacturer. Any advertisements or sponsorships are disclosed within the video ("this video is brought to you by") and above the fold in the description. We do not ever produce paid content or "sponsored content" (meaning that the content is our idea and is not funded externally aside from whatever ad placement is in the beginning) and we do not ever charge manufacturers for coverage.
    Follow us in these locations for more gaming and hardware updates:
    t: / gamersnexus
    f: / gamersnexus
    w: www.gamersnexus.net/
    Host, Writing, Testing, & Lead: Steve Burke
    Testing & Research: Patrick Lathan
    Video Editing: Andrew Coleman
    Video Editing: Mike Gaglione
    Writing: Jeremy Clayton
  • Gaming

Comments • 3.9K

  • @GamersNexus  1 year ago  +499

    Watch our Intel i5-13600K CPU review here: czcams.com/video/todoXi1Y-PI/video.html
    One correction in this review: We just noticed the power efficiency chart contains a charting error (I accidentally plotted the power efficiency from a different test for the 13900K, but the chart is for Blender -- so the numbers are correct, but they're for the wrong software for the 13900K). This error doesn't change any of our conclusions, nor does it change that the CPU *does* draw 300+ watts under full load, to be clear, so power consumption is right. This is the only error we're aware of and, again, the numbers are right, but the 13900K result for the Blender chart is different. That particular chart should be between 32.5-37.0, depending on which set of results you take for the math and whether you calculate it warm or cold. This means it is still one of the least efficient entries on the chart; however, the 12900K is actually slightly less efficient than the 13900K here, not more. Sorry for that mistake! No excuses. Just too much data too fast, and I'll try to get a few hours ahead of the review cycle next time so I'm not operating so tired. My fault and thanks for your understanding! - Steve

  • @alistairblaire6001  1 year ago  +1018

    I love that you guys have the FX-9590 on your power charts for context lmao

    • @RobBCactive  1 year ago  +60

      Right, the Intel fanbois need some perspective. This is NO surprise; Intel marketing excluded, there are limits to physics.

    • @ramseychong8787  1 year ago  +74

      Bulldozer is too much of a disaster to forget. It's like the Chernobyl of PC hardware. 😝😝

    • @NextEevolution  1 year ago  +25

      Doesn't Steve still run a Bulldozer CPU in his personal computer?

    • @gustavobalmaceda2346  1 year ago  +2

      Yeah, that was once a watt-hungry beast.

    • @ShogoKawada123  1 year ago  +27

      Big difference was the FX-9590 performed *horribly* in all regards, though. They weren't gaining anything at all from all that power usage.

  • @oppenz3723  1 year ago  +2227

    Now we have truly entered space heater territory with the 13900K and 4090 combo

    • @WiiNV  1 year ago  +22

      L🤭L Deathstar Fails Edition 🌑💥😂

    • @thetriplea9928  1 year ago  +196

      honestly such usage could genuinely be an option for heating a relatively small room.

    • @tardwrangler1019  1 year ago  +31

      forgot about ryzen 7 series?

    • @jaws3050  1 year ago  +81

      Right, I mean at a combined 900w of heat output it would be hotter than the low setting of my space heater....

    • @praetorxyn  1 year ago  +155

      @@tardwrangler1019 175W is a lot less than 300W.

  • @BAJF93  1 year ago  +309

    4090 and 13900K seems to be the perfect combo for the upcoming harsh winter. Can't imagine though how much power 4090Ti and 13900KS would pull.

    • @trapical  1 year ago  +42

      The two of them are going to pull over a kilowatt for a personal rig. That’s insanity.
      Meanwhile I’m over here with my Steam Deck gaming @ 10 watts…

    • @TokyoCinnamon  1 year ago  +8

      I need that heater combo 😂 pls!

    • @qwer55555555  1 year ago  +8

      yeah, and you'll have an even harsher electricity bill after winter with those two XD

    • @olliveraira6122  1 year ago  +7

      The positive side is that you won't need any room heater during the winter.
      To be serious though, I have an i9-12900K, & I honestly wouldn't dare to buy anything that runs hotter. Even with a triple-fan AIO cooler I see temps up to around 80C at max load with this CPU

    • @brokenstereotype  1 year ago  +3

      I can hear the circuit breaker tripping already.

  • @TomatePasFraiche  1 year ago  +2645

    The timing is crazy: at the worst and most expensive time for energy, they are producing the most power-hungry computing gear 😂

    • @djnorth2020  1 year ago  +161

      Hey we already had the absolute highest GPU prices, like MSRP x 3, and people couldn't wait to pay for them. 😁

    • @Ravix0fFourHorn  1 year ago  +215

      That's only for the high end; for example, an i5-12400 with a 6600 can run games at high/ultra 1080p 60+ fps (and maybe 1440p) for a fraction of the price while consuming a lot less. It's just people suffering from consumerism who need to buy the latest, shiniest thing because the number gets bigger every year.

    • @Dervraka  1 year ago  +118

      True, but I'm guessing someone that can drop ~$3k+ on a brand new i9-13900K with DDR5 and an RTX 4090 isn't going to care if they have to pay an extra $25 a month in power (rough math at the end of this thread).

    • @ReasonableRam  1 year ago  +14

      I'm just glad I live in an area where power is cheaper. About 6 cents per kWh in my area thankfully. (Upstate NY)

    • @legoNerd01245  1 year ago  +10

      Most expensive time for power?
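The rough arithmetic behind estimates like the "$25 a month" and "6 cents per kWh" figures above, as a hedged sketch where every input is an assumption rather than a measurement:

```python
# Monthly electricity cost for a PC under load; all inputs are assumptions.
def monthly_cost_usd(watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    kwh_per_month = watts / 1000.0 * hours_per_day * 30.0
    return kwh_per_month * usd_per_kwh

print(monthly_cost_usd(600, 4, 0.14))  # ~$10/mo at a typical US rate
print(monthly_cost_usd(600, 4, 0.06))  # ~$4/mo at the 6 c/kWh rate mentioned above
print(monthly_cost_usd(900, 6, 0.40))  # ~$65/mo at 2022 European crisis rates
```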

  • @sorappoli5137  1 year ago  +769

    If you live in mainly cold areas a 13900k and RTX 4090 combo will be an excellent alternative to the usual heating solutions. Very cool!

    • @gogudelagaze1585  1 year ago  +109

      *Very warm

    • @ethanwiebe6483  1 year ago  +66

      i live in canada and my 5600x and 6900xt are already way too fucking much. there is no climate that justifies these heaters

    • @Crossfirev  1 year ago  +15

      @@ethanwiebe6483 the vacuum of space

    • @hellfire92837  1 year ago  +28

      @@ethanwiebe6483 Those are very efficient parts.

    • @noneB974  1 year ago  +5

      Bollocks, UK ambient is 10-15C and it does nothing for my i9-12900

  • @nnnnoob  1 year ago  +68

    Small detail, not sure if anyone has mentioned this already, but the warmer lighting on the set really is a nice look. Looks more pleasing than the cooler white tones I'm used to.

  • @Chuzo1946  1 year ago  +1

    I love the new style of changing background settings during your videos. Excellent video as always.

  • @weeblewonder  1 year ago  +251

    The whole moving desk setup is an inspired production change. Makes these so much more dynamic and interesting to watch.

    • @zakelwe  1 year ago  +6

      I was too busy doing the "13600k" drinking game ... is that Ian Gillan presenting this?

    • @Lishtenbird  1 year ago  +1

      My only gripe is with monitors in the background - I find looping benchmark videos distracting, especially game ones. I understand those make it look more "busy" and "professional" - like blinking lights in a spaceship - but there's still a fair amount of spoken data, and it can be harder to concentrate on it, even more so if English is not your native language.

  • @Zordonzig  1 year ago  +665

    We've reached the realm of completely impractical. When a 360mm AiO can't control temps on a CPU you know things have gone beyond anything reasonable.

    • @whitekong316  1 year ago  +13

      Don’t think people buying i9s usually throw AIOs on them though.

    • @ProXcaliber  1 year ago  +73

      @@whitekong316 I personally just recently bought a 7900x and I am using a Corsair 360mm AIO. A custom loop is still just too expensive IMO, or at least for a good setup.

    • @spectraleggings  1 year ago  +103

      @@whitekong316 don't think people buying i9s have brains tbh...

    • @Kuri0  1 year ago  +23

      Pentium 4 all over again

    • @whitekong316  1 year ago  +2

      @@spectraleggings 🤷‍♂️

  • @solocamo3654  1 year ago  +6

    Thank you so much for including the FX-9590 here in the power consumption test. After owning one and struggling to cool it for near enough a decade, it's amazing to see a stock Intel CPU pulling near enough 50% more power.

  • @rube9169  1 year ago  +43

    Some feedback:
    2:24 This chart stood out immediately as too small to me. Based on the CPU in the background, the chart sizing and placement seem intentional, but on my 15" laptop the text was significantly harder to read than normal because of the size and/or font thickness.
    Great review! Thank you guys for the continued hard work - it is so widely appreciated!

    • @pistachiodisguisey911  1 year ago  +5

      Agreed, it’s so small it’s nearly impossible to read on a phone bc of aliasing

  • @zimtage1744  1 year ago  +227

    I never thought I'd have to install three phase power to run my gaming rig

    • @davidfeddanigga54  1 year ago  +9

      if you don't want it then just don't buy it

    • @DerpUSA2025  1 year ago  +80

      @@davidfeddanigga54 ok jensen huang

    • @emenesu  1 year ago  +34

      @@DerpUSA2025 you murdered the poor man

    • @tringuyen7519  1 year ago  +16

      HUB even used a 420mm liquid cooler on the 13900K to try and stop thermal throttling. It couldn’t be done!

    • @zimtage1744  1 year ago  +10

      @@davidfeddanigga54 I love it when foolish people make up things and put words in my mouth. What else do I not want?

  • @DissertatingMedieval  1 year ago  +373

    It strikes me that, with this power consumption level alongside the 4090's, EVGA's decision to focus on its power supply business looks like a really solid move.

    • @ColdRunnerGWN  1 year ago  +29

      It's almost like they knew this was coming.

    • @EkiToji  1 year ago  +38

      Yeah if this continues they'll have to start dealing with the power limits of a single 15A circuit.

    • @luanphan2706  1 year ago  +19

      @@EkiToji We're gonna start seeing 1000W as the new normal for PSU shoppers soon.

    • @4gbmeans4gb61  1 year ago  +3

      THIS IS NOT STOCK. If it's using 300+ watts then he has unlimited power on; this is not how the CPU is supposed to work.

    • @aflury  1 year ago

      It was still a dumb business move for EVGA to dump their GPU business, but this makes it look less dumb.

  • @nicolasescarpentier854  1 year ago  +1

    Thanks for the amazing review, as always! I'd really love to hear your take on the power dissipation and thermal throttling of this CPU... So much power and limited performance doesn't sound too good, but it is still performing.

  • @myairspace121  1 year ago  +28

    I know you guys hear this all the time, but thank you all so much for all of the effort and time put into each and every video!

    • @canadianbakn  1 year ago  +4

      Oh God the scammers are still here.
      Edit: context for future people reading this in case it's removed, somebody replied to this with a scam message

    • @KensleyInnocent  1 year ago  +1

      @@canadianbakn traveler from 2069 here....
      These bots are still here... They've even gotten worse with AI implementation.

  • @Cat_Stevens  1 year ago  +165

    great lighting in this video, nice to see the set and shooting angles develop like this. the intro hair light, the decorative incandescent above the PSU tester, just great colouring

    • @alexandertrego879  1 year ago  +14

      Steve's hair looks NICE

    • @Candidplayer  1 year ago  +2

      @@alexandertrego879 🤣

    • @GamersNexus  1 year ago  +31

      Thank you so much! Andrew has really been studying lighting lately and he's been killing it!

    • @maxwellgriffith  1 year ago  +4

      I’m glad the harshness of those high-kelvin background lights is gone, the improvement is very noticeable.

    • @foobar-9k  1 year ago

      @@GamersNexus Not that my opinion is worth much... but this video looks weird to me, compared to earlier ones. Feels "fake" somehow, as if the framerate was too high, or something; hard to describe 😞
      Edit: it feels almost as if it was sped up... or skipping frames at the same time somehow. Really weird. I can notice the lamp shaking at the border of the screen, for example. And I'm watching at basic 480p. It doesn't feel right, sorry.

  • @BenHighley  1 year ago  +75

    Something to consider for the next testing methodology update: most AAA games seem to respond very similarly when it comes to CPU reviews. I'd love to see a more CPU-bound simulation-type game getting tested. Factorio or Kerbal Space Program or something like that would actually reveal a meaningful difference between CPUs.

  • @thevoid6756  1 year ago  +1

    I would love to see a video hosted by Andrew and Mike where they talk about what they look for when new hardware is being announced and when you guys do any testing, seeing how they are on the (post-)production side of this whole operation and probably have some interesting insights that aren't necessarily about gaming. On that note, I do want to highlight how much I appreciate you incorporating benchmarks like Blender and Adobe products :)

  • @jakestocker4854  1 year ago  +376

    The power everything is starting to pull is just absurd. 1000 W PSUs are about to be the bare minimum for an upper-mid-range build lol.

    • @chillnspace777  1 year ago  +2

      Real talk

    • @bigbob3772  1 year ago  +36

      Pretty soon, you will need 2 separate circuits in your house to plug in two power supplies, one for the GPU and one for the rest

    • @___DRIP___  1 year ago  +51

      @@bigbob3772 Unless you live in Europe or Australia where our outlets are 220V or 240V respectively.

    • @Richard-tj1yh  1 year ago  +10

      Nonsense, 1000 W still has plenty of room to spare even for the highest end

    • @digantamajumder5900  1 year ago  +6

      Seems like corsair 1600i will finally be a common psu on pcs 😂

  • @jonathancarranza6046  1 year ago  +46

    With the new gen of AMD and Intel CPUs coming out, I'm just glad I upgraded to a 5800X3D. It's casually hanging with the top CPUs with a 105W TDP. Best value CPU right now and it's not even close.

    • @reappermen  1 year ago  +6

      Or to be more precise, hanging at 108 W all-core power draw, per the power draw graph in this video.
      As a recent owner of a 5800X3D I totally agree with you :D

    • @MrWebster191  1 year ago  +7

      Wonderful product; the 7800X3D would be great too, I think.

    • @Sgt_Glory  1 year ago

      Built my system last year with one of the last 11th gen i9's (one of those fictitious CPUs that don't exist heh). Doing just fine with it, both power and temps wise. Probably not upgrading for a good number of years.

    • @MarkLikesCoffee860  1 year ago

      A used 5800X is the best value. People are selling them cheap because of the 5800X3D.

    • @Tugela60  1 year ago

      We are glad too, since it is one person fewer snapping up stock for the rest of us to worry about.

  • @andrekaiser8902  1 year ago

    thank you steve for saying that extra sentence for the ones who wanna build projects and have the... that was really nice of you :)

  • @kotekzot  1 year ago  +112

    I would love to see power-normalized performance testing. I understand that this is a crazy ask, given how hard it would be to do in a way that lives up to GN's commitment to perfecting methodology, but it would be massively useful even if the data is a little less perfect than usual.

    • @Smallman647  1 year ago  +26

      der8auer did it and the 13900k is really efficient.

    • @purplegill10  1 year ago  +8

      Hardware Unboxed did it in their 13900k review if you're interested

    • @primozsuhadolnik5468  1 year ago  +10

      Like Smallman647 said, der8auer did it, so maybe check out his channel/video. It's a great review. Among other things, he put AMD's 7950X CPU into ECO mode (90 W), and then did the same for Intel's i9-13900K. It's amazing how you lose almost no performance and yet the power consumption goes wayyyyyy down. I'm a bit surprised that GN didn't try it - probably because they have lots of work, especially with all the new tech coming out.

    • @ThirdLife86  1 year ago  +5

      @@primozsuhadolnik5468 Well, it's not almost the same performance. If I limit my 7950X to 100 W, it's slightly faster than my 5950X @ 230 W, with about 30500 points in CBR23. With PBO and CO on, at 5.5 GHz it clocks in at over 41k points. That's a 34-35% increase in performance, although it's burning through 300 W of power then.
      I like the new CPUs, as they can give you ultimate performance if needed and still be friggin' efficient if need be. So if I wanted to encode a video overnight, I could just set it to 65-100 W and it'll barely even heat up (about 50C maybe).

  • @soupysoup931  1 year ago  +96

    I love how the team is using workstations meant for other members to have a variety of backgrounds going on, instead of the studio set that's been used previously. Nice change and addition to video quality.

  • @ThunderGoatz  1 year ago  +185

    At this point I think the industry is conspiring towards getting 1000-1500 W power supplies off the shelves again. I remember when the leading advice was that 650-850 W is all you need. Also a fantastic time for EVGA to focus on the PSU market 😂

    • @DenverStarkey  1 year ago  +2

      i just got an EVGA 850 W 80 Plus Gold PSU last year. about to finally have the money for a 3080. people keep saying wait for the Radeon 7000 series... but naaah, not waiting, i'll probably be building a whole new computer next year anyway. and by then they'll hopefully have all-new hardware launches anyway.

    • @lilkidsuave  1 year ago  +7

      EVGA must have had a godly 8 ball kekw

    • @ChristopherGreerCDN  1 year ago  +2

      It's getting more and more to the point that, in North America at least, you need to have a dedicated 15A circuit just for your PC and peripherals.

    • @beH3uH  1 year ago

      LMAO just bought EVGA PSU for my upgrade

    • @Crossfirev  1 year ago

      @@ChristopherGreerCDN I unironically am rebalancing my room's load because of my computer upgrade for this very reason. :(

  • @MistyKathrine  1 year ago

    Wow, you upgraded your lighting. It makes the video look a lot better.

  • @MBimmer  1 year ago  +3

    Loving the new production change👌🏼

  • @unknowncomic4107  1 year ago  +405

    Remember when a computer would be the size of a desk or even take up an entire room? We are heading back in that direction.

    • @anonony9081  1 year ago  +28

      The components can only get so small so at some point this was inevitable without a major shift in architecture.

    • @aMMMMungus  1 year ago  +1

      Yeah that's not going to happen anytime soon

    • @rododendron85  1 year ago  +2

      but for additional cooling

    • @LondonCrusader  1 year ago  +4

      All of this has happened before... and will happen again.

    • @jackielinde7568  1 year ago  +10

      Oddly enough, "Yes, I'm that old". My employer used to have an Amdahl system that was 26 cabinets, each with the dimensions of a standard server rack enclosure. The room was fairly large and (at the time) used four chillers to force cold air into the subfloor to be blown up into those cabinets. The room was exceptionally chilly, until you entered the maze of boxes that made up the Amdahl system. It was replaced with a single IBM box of the same dimensions.

  • @smakfu1375  1 year ago  +169

    300W is insane, but one also has to be impressed with the material science involved. When I worked in the semiconductor space, a quarter century ago (man, I'm getting old), a chip running at those clocks, with a multi-billion transistor count, at that node size, pulling that kind of power to make it all work, would have been thought to be totally impossible (a billion+ transistors - sure; multi-GHz clock rate - sure; both - no way). Setting aside our lithography limitations of the time, the material limits and Black's Equation (sketched after this thread) would have dictated nearly insta-failure. Modern semiconductors are pretty freakin' amazing.

    • @LeOneToyota  1 year ago  +1

      I'm about to graduate as an engineer and I wanna work in this industry. How does one go about it?

    • @Lizardwarrior1  1 year ago  +7

      @@LeOneToyota PhD, go into process engineering, become a tool owner at an Intel fab

    • @christopherjames9843  1 year ago  +3

      @@LeOneToyota If you aren't an electrical engineer forget about it.

    • @EkiToji  1 year ago

      Yeah just being able to deal with hot carrier injection and electron tunneling currents at this scale is impressive.

    • @vladchenkov9215  1 year ago

      Piece of lazy trash, telling me it has to pull 50 watts more to be competitive. Intel is now trash 🗑
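For the curious, Black's Equation mentioned above relates electromigration lifetime to current density and temperature. A minimal sketch with textbook-typical constants, not values for any real Intel process:

```python
import math

K_EV = 8.617e-5  # Boltzmann constant in eV/K

def mttf_relative(current_density: float, temp_k: float,
                  n: float = 2.0, ea_ev: float = 0.8) -> float:
    """Black's equation, MTTF = A * J**-n * exp(Ea / (k*T)), with the
    process constant A omitted, so only ratios between two operating
    points are meaningful."""
    return current_density ** -n * math.exp(ea_ev / (K_EV * temp_k))

# Doubling current density at a constant 85 C (358 K) junction temperature:
print(mttf_relative(2.0, 358.0) / mttf_relative(1.0, 358.0))  # 0.25 -> 4x shorter life
```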

  • @zahirkhan778  1 year ago  +1

    I really like the way you showed the CPU specs. Looks very modern and non-blocky.

  • @mfortyturbo2  1 year ago

    Great vid as always. Hoping there might be some 6xx/7xx boards with DDR4 vs. DDR5 testing in the future, to see how much of a difference there could be.
    People should look at der8auer's video on the 13900K also. He does some eco mode testing and undervolting/downclocking E-core testing. That might give more info about whether it's more or less valuable with some tweaking.

  • @BrySA20000  1 year ago  +174

    Intel and AMD are providing just the right CPUs for the winter

    • @railshot888  1 year ago  +21

      Intel has AMD beat easily too.

    • @gatonetwork1788  1 year ago  +13

      I'm a fan of AMD CPUs; they don't consume as much power and run cooler.

    • @nothorren  1 year ago  +19

      @@gatonetwork1788 go back in time and say the same thing, people will tell you you're crazy lol

    • @whitekong316  1 year ago  +4

      @@gatonetwork1788 doesn’t the new amd cpu run at 95 degrees?

    • @RyanFennec  1 year ago  +2

      @@whitekong316 yupe

  • @EnigmaticGentleman  1 year ago  +96

    I really hate the trend of increased power consumption. Newsflash hardware makers, some people want their PCs to not sound like a jet engine, to actually last a good while, and to not feel like they're in a sauna when they boot up Minecraft.

    • @sowa_9543  1 year ago  +8

      Don't buy 4090 or 13900k problem solved

    • @Sam_Saraguy  1 year ago  +6

      At some point, state and federal regulators will step in and set power limits for consumer CPUs and GPUs. It's getting out of hand.

    • @EnigmaticGentleman  1 year ago  +7

      @WT95 I mean, this kind of crazy wattage usually affects the whole stack, not just the top end. Otherwise i would be completely fine with this

    • @liammcleod159  1 year ago  +2

      Buy an i3 then

    • @88flack  1 year ago

      @@Sam_Saraguy The high-end ones might require you to apply to receive one.

  • @joseh.brumrbeirojr.4118

    He seems so content in these latest reviews. Very satisfying for me to see. Thanks Steve 🖖🏻

  • @wile-e-coyote7257  1 year ago

    Thanks for this review and your informative benchmarks, Steve!

  • @cerebelul  1 year ago  +199

    It's as expected: we've entered the territory of "if you want to upgrade your system with current-gen hardware, you are better off buying a new PC altogether, as you will have to change absolutely everything". And prepare for the electricity bill...

    • @2ndcitysaint52  1 year ago  +13

      I get the power arguments for the EU where the bills are stupid high, but in NA I promise anyone dropping 4-5k on a top-of-the-line system doesn't care about the power bill. It's just the reality.

    • @crylune  1 year ago  +13

      I just bought a 6800 XT. I'm happy with this and my 5900X. Not going to fund the next generation of power grid destroyers.

    • @miragept  1 year ago  +6

      der8auer is also benchmarking hardware without the factory overclock; the RTX 4090 is basically the same performance at 350 W, and this one works quite well at 90-130 W. Would be nice if Gamers Nexus started doing that testing too.
      It's like my RX 6600: at 50 W (1/2 stock) it has 90% of the performance.

    • @danieltatar7575  1 year ago  +3

      @@2ndcitysaint52 Unless someone is really pushing the system a lot of the time, people shouldn't care about the energy consumption in Europe either. Especially compared to the price of the actual components.

    • @Liberty46  1 year ago

      that's all they could manage

  • @Arakki  1 year ago  +54

    This new LUT is quite nice and complements the graphical design changes well.

    • @chickenpasta7359  1 year ago  +3

      yeah i really dig the aesthetic change and better production as a whole. its nice to actually see some saturation and warmth from a video of theirs

  • @FabiVoltair  1 year ago  +5

    The new lighting finally looks modern and great, Steve!

  • @madamerosario  1 year ago  +10

    You've become my favorite PC YouTuber. I appreciate your very high standard of journalistic integrity and always mentioning your experience with measurements. You're wicked smart on top of it; I don't think I've heard another YouTuber speak so fluently about deep hardware knowledge--you could be an engineer. Thanks for your content and your mission. One beautiful person you are. And your coasters are sick.

  • @junkmail6567  1 year ago  +64

    Thanks for the great review GN! I’d really like to hear commentary against the 5800X3D as you get to the lower-cost units. I think it’s seen as a VERY popular alternative to going next gen, so the direct comparison is gonna matter for a lot of people.

    • @tbrowniscool  1 year ago

      It's amazing that the X3D is STILL beating these huge motherfucking retarded power-hungry chips.

    • @trailblazinn  1 year ago  +2

      If you plan to use a top-end GPU, a 13th-gen CPU crushes the 5800X3D. Check out LTT's review.

    • @Polaren  1 year ago  +17

      @@trailblazinn Not in gaming it doesn't. Especially in efficiency. Intel shit the bed once again.

    • @glamdring0007  1 year ago  +20

      Not to mention that the 5800X3D is providing 95+% of the same gaming results at literally 1/3 of the power use... the 13900K is a sad joke when talking thermals and power draw.

    • @fredericmontpetit5709  1 year ago  +4

      ​@@trailblazinn No it does not.

  • @TheTomster94  1 year ago  +56

    Interesting review. For a gaming focus I also recommend the der8auer review; he goes into detail on how well the 13900K still performs when cutting the power usage massively.

    • @thegamerboy1000  1 year ago  +6

      Yup watched that before this as well, I love that he adds power consumption to everything

    • @bricaaron3978  1 year ago  +8

      That wouldn't be conducive to irrationality and fear-mongering, though.

    • @zedexish  1 year ago  +5

      @@bricaaron3978 that's irrelevant for stock vs stock.

    • @skilletpan5674  1 year ago  +15

      @@bricaaron3978 It's relevant to most people. The R9 7950X uses much less power for near the same or better performance; the 13900K generally uses 30% to 75% more power than the R9 7950X. The extra cost of the R9 7950X will pay for itself in a few months in power alone (rough break-even math at the end of this thread).

    • @noname-gp6hk  1 year ago  +3

      @@skilletpan5674 consumers don't think of TCO when they buy things. Things get interesting once gaps in power consumption reach the point where there is a total-cost-of-ownership break-even, where the cheaper part becomes more expensive within a few months due to increased power bills, lol.
      Wonder if any computer maker is going to take a nod from the car industry and start publishing estimated electricity cost figures over a 12-month period, like cars do with estimated fuel cost on the window sticker.
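A back-of-envelope break-even sketch for that TCO point; every input here is an assumption, and the real gap depends on workload, duty cycle, and local rates:

```python
# Months until a pricier-but-thriftier CPU pays for itself in electricity.
def breakeven_months(price_gap_usd: float, delta_watts: float,
                     hours_per_day: float, usd_per_kwh: float) -> float:
    saved_kwh_per_month = delta_watts / 1000.0 * hours_per_day * 30.0
    return price_gap_usd / (saved_kwh_per_month * usd_per_kwh)

# $100 price gap, 150 W lower draw, 8 h/day of all-core load:
print(breakeven_months(100, 150, 8, 0.40))  # ~6.9 months at crisis-era EU rates
print(breakeven_months(100, 150, 8, 0.12))  # ~23 months at a cheap US rate
```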

  • @BRC_Del  1 year ago  +2

    Small error, at 13:49 even though you've just mentioned GTA, the chart is the same as the Far Cry 6 one from earlier.
    Besides that, excellent piece once again! Loving the new "station to station" shots recently, awesome increase in production quality.

  • @detlaslegacy  1 year ago

    2:30 that overlay is such a good idea, props to the editors!

  • @Real_MisterSir  1 year ago  +30

    Something that I think will be increasingly relevant is to perform tests where CPUs are power-normalized - for example, limiting them to draw only 150-200 watts and redoing all the tests, just to see exactly how the individual efficiency curves play out. Imagine if you could pull back half the 13900K's power consumption but retain 80% of its performance - and what would the AMD counterparts look like at that point, too?
    I think in general, the more the manufacturers throw away all power limits, it becomes increasingly tempting for end users to do their own downclocking, undervolting, etc because realistically even in professional workloads, you don't want to obtain the last 10% performance if that performance comes at a vastly higher power consumption. Power draw is also consistent money out of your pocket, and realistically all of these cpus perform better than last generations so nobody is going to complain that their workflow is worse, even at less than peak performance.
    Speaking of, I run my own home-based company with design work, lots of Photoshop and 3D rendering, and I am working off of a now relatively old 9700K. It's still pulling its weight, but my workflow would be vastly improved with one of these new cpus. Realistically though, since I am not working in a major enterprise company, every second doesn't really matter to me. I have tons of things I can be productive with in the meantime while the pc renders etc, saving a couple seconds here and there helps workflow fluidity but it doesn't directly put more cash in my pockets, I simply don't have enough work to fill out every single hour like that - nor do I want to do it since it's the thought process that actually is worth the pay, not just the renders.
    If the job only consisted of people sending me files and asking me to render them in good quality, sure, then the cpu would directly pay for itself with its higher performance, but that's hardly the case for most creatives I know.
    So what we're really looking at are CPUs that are now more expensive than last-gen flagship GPUs, also using the same amounts of power at stock settings; plus they require a massive investment in cooling, and in the motherboard too, in order to handle that high power load consistently. All in all, performance just isn't worth the cost, and if my company were larger I'd go with a Threadripper anyway if unlimited performance were truly what mattered for the business. These consumer-grade CPUs that are pushed way beyond their natural power limits to cosplay as enterprise hardware, now also at close to enterprise cost, are honestly a bit of an insult to the common creative worker.
    I really don't see who these chips are marketed for anymore. May as well go with the second-in-line CPUs that are half the cost and half the power draw, with performance that won't make or break anyone's workflow.

    • @WirrWicht  1 year ago  +6

      Look at Hardware Unboxed, they did exactly that, a power scaling comparison...

    • @Real_MisterSir  1 year ago  +3

      @@WirrWicht Thanks for the suggestion! I actually went to check der8auer instead; really interesting findings, and they made the 13900K look a lot more promising and competitive, especially at its much lower price point. Would be great if this were the new focus for all tech tubers, instead of just taking the frankly stupid stock clock settings that only exist for bragging rights and don't benefit end users at all.
      The 70w power level for gaming really made the 13900K shine, that's legitimately really impressive how much performance you get for that low power consumption.
      You could game on mediocre air cooling and still outperform the 12900K at 70 watts.

    • @WirrWicht  1 year ago  +4

      @@Real_MisterSir Interesting, now here is a problem. Hardware Unboxed uses CB R23, and their results show a 13900K needs 170 watts to achieve the same results as a 7950X at 75 watts. Which of the results is now the better one?

    • @Real_MisterSir  1 year ago  +1

      @@WirrWicht As someone who uses Cinema 4D professionally (which Cinebench is based on), I question the HU testing and would like to see it verified by other independent testers. It may have to do with an underutilized hybrid-core architecture; not sure why exactly, as it seems that wouldn't be difficult to fix if it were the case, but it's the only immediate thing I can think of. It definitely sounds like some part of the CPU isn't utilized and other parts are forced to compensate as a result, which would lead to that comparably much higher power draw.
      Don't take my words for truth though, I wouldn't want to claim anything without having the cpu in my own rig doing independent testing.

  • @akev2794  1 year ago  +125

    You should definitely look at some of Intel's claims re: power, such as the 13900K matching the 12900K at very low wattage, and all that stuff. I'm sure there will also be a sweet spot where you can get 95% of the performance for a lot less power as well (see the sketch at the end of this thread).

    • @djnorth2020  1 year ago  +4

      The same seemed to be possible with the 4090, cut back power draw 100 W and still win

    • @akev2794  1 year ago  +5

      @@djnorth2020 Yeah - I don't own a 4090 (because I have a 3080 12gb (for £490!) and that's more than enough for a while) but for sure there's efficiency gains to be had everywhere, same with the 7950x as mentioned

    • @ripleyhrgiger4669  1 year ago  +12

      So why buy this CPU or the 4090 if you're just going to gimp their performance? Waste of money.

    • @nateTrh  1 year ago  +10

      This is happening because Intel and Nvidia are finally afraid of AMD. It kind of sucks that once competition arises, they'll go all out to try to beat one another.

    • @owenaser6673  1 year ago  +7

      @@nateTrh but on that same note, that's exactly what competition is good for!!
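A sketch of how a "95% of the performance for a lot less power" point could be pulled out of data like der8auer's (power cap, normalized score) pairs; the curve below is invented for illustration, not measured:

```python
# Find the lowest power cap that keeps a target fraction of stock performance.
def smallest_cap_keeping(points: list[tuple[int, float]], keep: float) -> int:
    stock = max(score for _, score in points)  # stock = best score in the sweep
    return min(cap for cap, score in points if score >= keep * stock)

points = [(90, 0.82), (125, 0.90), (180, 0.95), (253, 0.99), (320, 1.00)]
print(smallest_cap_keeping(points, 0.95))  # -> 180 (W), on this made-up curve
```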

  • @limemason  1 year ago  +7

    It's wild that we're living in a time where these new parts individually cost as much as an entire budget build did not that long ago.

  • @funkengruven7773  1 year ago  +2

    Don't know whether it's the lighting or maybe a different process, tools, etc., but the quality of the video in this review is excellent! It stands out as superior to others, although I can't explain exactly why it strikes me as such. It's super clear, and the colors & contrast are warm and professional.

  • @TheRealPotoroo  1 year ago  +54

    The insane power consumption on both platforms makes me really want to see some AMD ECO mode benchmarks. Given the way Zen 4 is thermally throttled and not power throttled, running in ECO mode supposedly makes stuff-all difference to actual performance. I'd desperately love to see some proper benchmarking on this - maybe a proof-of-concept test on the 7700X, perhaps. It could be the genuine difference between the two platforms.

    • @flintfrommother3gaming  1 year ago

      Except that voids warranty if I'm not wrong, correct me if I am.

    • @funkysmokee8764  1 year ago  +16

      @@flintfrommother3gaming ??? how do you void warranty? by changing some settings?

    • @Donnerwamp  1 year ago  +6

      @@funkysmokee8764 You're running the chip out of spec; in theory this voids the warranty, the same as activating XMP voids the warranty in some prebuilds. But then, unless it's some damage that's obviously the user's fault, and they don't log stuff on the chip, it's close to impossible to prove (and, from my experience, even to cause) damage by undervolting/limiting the power draw. You're theoretically not entitled to a warranty claim when running out of spec, but it's pretty much impossible to prove.

    • @skilletpan5674  1 year ago  +1

      I think @OC3DTV did something like that. TTL also mentioned the difference between the bios eco mode stuff and ryzen tuner etc.

    • @flintfrommother3gaming  1 year ago

      @@funkysmokee8764 Literally the Ryzen Master has warnings EVERY SINGLE TIME you boot it up. Also there is no way to prove it so they just slap it there even if they probably don't care. Unless your CPU dies when in that state.

  • @alrecks619  1 year ago  +7

    love how you guys still put the "venerable" FX-9590 on the EPS power consumption slide to compare between possible space heaters.

  • @rubbersoul420  1 year ago

    Man the lighting looks so much better on this video than it usually does.

  • @easyadmin3429  1 year ago

    Can't wait to have those coasters 🙂ordered the 4 pack yesterday!

  • @G0dspeeeed  1 year ago  +65

    der8auer showed that the 13900K only loses about 10% performance when you reduce the power draw by 65% (90 W); maybe worth looking at efficiency with custom settings, not with the absurd voltage it's pulling right now... it's still getting 5 GHz+ all-P-core at less than 1.1 V (see the sketch at the end of this thread)

    • @luistigerfox  1 year ago  +36

      that seems to be a theme right now, considering you have a similar story in limiting the 7950X to around that area. You can also see a mere 5% drop in perf on the 4090 when dropping the power limit to 70%. Everything seems to be pushed well beyond the peak of its efficiency curve.

    • @walkir2662  1 year ago  +3

      So there *is* an eco mode? Great!

    • @RyanFennec  1 year ago

      Sounds fun

    • @bowi1332  1 year ago  +11

      Such thing should be available out-of-the-box, IMHO. Like an "eco mode".

    • @JayMaverick  1 year ago

      That's insane. Should be tuned like that out of the box.
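Why "5 GHz+ at under 1.1 V" saves so much power: to first order, dynamic CPU power scales as P ≈ C·V²·f. A hedged sketch only; real chips add static leakage, and voltage and frequency are not independent:

```python
# Relative dynamic power vs. an assumed stock operating point (all values
# here are illustrative assumptions, not measurements of the 13900K).
def relative_dynamic_power(v: float, f_ghz: float,
                           v_stock: float = 1.35, f_stock_ghz: float = 5.8) -> float:
    return (v / v_stock) ** 2 * (f_ghz / f_stock_ghz)

# ~1.1 V @ 5.0 GHz vs. ~1.35 V @ 5.8 GHz:
print(relative_dynamic_power(1.1, 5.0))  # ~0.57 -> roughly 43% less dynamic power
```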

  • @peanuts.13  1 year ago  +35

    thank you for the review! for the gaming benchmarks with the 4090, i think it'd be helpful to add the 5800X3D, one of the better gaming chips out currently.

    • @HappyBeezerStudios  1 year ago

      Yup, a well performing chip that doesn't double your energy bill and is affordable. Sure, it's "last gen", but it can still hold up and is in stores.

    • @WirrWicht  1 year ago

      The 5800X3D benchmarks with the 4090 would have been way more interesting than the 7900X...

    • @randommcranderson5155  1 year ago

      LTT has them. Almost no gain.

  • @TimothyStovall108  1 year ago  +2

    I'm on a 5800X and I just tested 7-Zip today for HWBOT at an even 109,000 compression rating, beating out a 3900X. That gives me some perspective on my tuning, and watching reviews I believe I'll be happy with my 5800X for years to come. Can't believe a 13900K uses over twice the power of my OC'd 5800X. At least it performs almost twice as well?

  • @UngoKast  8 months ago  +4

    I just installed a 13900k today after upgrading from the 9900k. Going from one inefficient generation to another feels great! Not going to lie, the performance uplift is phenomenal. Also "Raptorlake" is officially the coolest generation name ever.

  • @commonraccoon2105  1 year ago  +14

    In the future, we will look back with fond memories how the i9-13900K and the 4090 kept us warm as we huddled around the PC, holding hands out to be warmed by its computational power in those trying times.

    • @VicRaptor2589  1 year ago

      Intel is looking into the future, u know… “nuclear winter is coming” 😂

  • @MisterOthello  1 year ago  +14

    "No were not rounding that one up because we won't give you the satisfaction."
    Pure Gold.

  • @bartz7073  1 year ago  +2

    I love that you have the old FX-9590 on the power consumption chart :)

  • @Cyberjjc  1 year ago  +1

    I love how you guys tactically add a sense of humor to your videos like this one. LOL.

  • @meRyanP  1 year ago  +84

    This really just made me feel better about getting a 5800X3D and skipping the new stuff from AMD and Intel. I'm betting AMD will unlock a little bit of performance on the 7000 series with AGESA updates and will have a 7000 X3D SKU at some point, but the 5800X3D will be good for a while.

    • @shootinputin6332  1 year ago  +8

      A while being the next decade. Ran my 2500k from 2011-2017. 5800X3D will similarly stand the test of time.

    • @HappyBeezerStudios  1 year ago  +4

      @@shootinputin6332 I still scrape by with my 3570K, for the simple reason that I wanted a proper step up in terms of performance. This is basically the right time.

    • @InterAliaa  1 year ago  +1

      @Lui that 3800x will still kick for a while

    • @christopheroconnor4665  1 year ago

      Yup as I do a lot of racing sim stuff in vr I'll be going from a 5800x to a 5800x3d in the next few months too

    • @recession_guy6613  1 year ago

      Core i5-13600K beating everything except the other Raptor Lake.
      - Core i9-13900K still beating 12th Gen and all Ryzen when it's locked to 88 Watts.

  • @pets4489  1 year ago  +36

    You should make a chart in a couple of benchmarks for TDP limits: 13900K @ 15 W, 30 W, 65 W, 90 W, 120 W, etc. This is very valuable information for underclocking for small-form-factor PCs (see the sketch at the end of this thread).

    • @MrFredscrap  1 year ago  +14

      Check out the Hardware Unboxed review; they made that chart. Basically you can forget about the 13900K on efficiency and thermals... power scaling is poor.

    • @gunturbayu6779  1 year ago  +3

      AMD will easily win mobile CPUs this gen with these kinds of power figures

    • @eniff2925  1 year ago  +3

      der8auer made these charts. Efficiency is very good: at 70 W it beats AMD, at 90 W they are equal

    • @agenericaccount3935  1 year ago

      Someone will. SFF isn't really GN's niche, they have enough on their plate.

    • @ChrispyNut  1 year ago  +1

      @@eniff2925 Intel wins at games (if paired with a suitable GPU/resolution where it matters), but AMD still beat Intel at "content creation".
      13100 - 13500 (which aren't yet released) would be the best gaming CPUs.
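A tiny sketch of the requested chart, tabulating score and perf-per-watt at each cap; the (cap, score) pairs below are invented placeholders, not measurements:

```python
caps_and_scores = [(15, 6200), (30, 11500), (65, 19800), (90, 24100), (120, 27400)]

print(f"{'Cap (W)':>8} {'Score':>8} {'Score/W':>9}")
for cap, score in caps_and_scores:
    print(f"{cap:>8} {score:>8} {score / cap:>9.1f}")
# Perf-per-watt typically peaks at the low caps SFF builders care about.
```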

  • @rubuk2996  1 year ago

    Color on the video looks great, good job GN team!

  • @DennyAtkin  1 year ago

    Thank you for including comparisons to the 10900K. Everyone always compares to the previous-generation chip, when in fact a lot of us only upgrade CPUs every few years.

  • @rakeau  1 year ago  +55

    As high as they both are, a $100 price premium for the 7950X sounds like it would pay itself off on the electricity bills. Seriously though, I was worried Ryzen was going to lose its efficiency edge, but they've still got it, despite being pretty close in all aspects. Would be very interesting to see how these CPUs compare head-to-head in an "eco" mode / undervolting situation.

    • @TheEchelon  1 year ago  +5

      It's about a 60 euro difference here in Europe, and AMD might lower their prices now that the 13900K has launched. 13900K is 770 euros and the 7950X is 830 euros.

    • @eniff2925  1 year ago  +9

      Derbauer did exactly that test. Intel beats AMD at 70W

    • @user-nf5qc8qp4n  1 year ago  +4

      Well, Zen 4 isn't a $100 price premium when you factor in required DDR5 and new motherboards that are outrageously priced

    • @luxemier  1 year ago  +7

      stop coping. the 13th gen is about $70 cheaper in most countries, with on-par performance in CPU-intensive workloads and better performance in games. and undervolting the 13th gen will likely reduce temps by 10-15C and fix your power consumption issue. apart from that, for Zen 4 you need a new motherboard and new RAM. again, stop coping amd fanboy

    • @HuntedByAFreak72  1 year ago  +3

      @@luxemier What if you're building a new system? Buying Raptor Lake doesn't make any sense. Yeah, it's fast, but it's power hungry by default, and Raptor Lake CPUs are pretty much the last CPUs you can get for whatever the current Intel platform naming is. Don't forget the performance loss in some workloads if you opt for a DDR4 system.
      Also, you can do the same thing on an AMD system to deal with the power consumption issue.

  • @romanb5431  1 year ago  +12

    I'm feeling pretty good about getting a 5800X3D two weeks ago, no regrets

    • @anonymousone6075  1 year ago

      buahahah conned by the marketing FOOL GULLLIBLE FOOL

    • @maanmahmoud4537  1 year ago

      is the i9-9900K old?!

    • @recession_guy6613  1 year ago  +1

      Core i5-13600K beating everything except the other Raptor Lake.
      - Core i9-13900K still beating 12th Gen and all Ryzen when it's locked to 88 Watts. Go look at reviews.

  • @justinv3080  1 year ago  +2

    Steve, would you guys consider doing another look at the 12900K vs. 13900K with a custom water loop vs. the AIO you used, to see if it's possible to get a lot of extra performance out of the chip if you can keep it from throttling?

  • @Kev-O33  9 months ago

    just bought one yesterday. can't wait to try it out

  • @frizzlefry1921  1 year ago  +20

    Your power bill is up considerably; I don't even want to think about my power bill. Can't imagine how bad yours is, wow!!! Thanks for keeping up and improving the incredible job you all do there to keep us informed.

  • @DavidWalker-ko6po  1 year ago  +3

    Great review of a very interesting product, with interesting ramifications for future hardware development! Just one quick thing - the chart for GTA V is mislabelled as "Far Cry 6" at 13:45, just in case you missed it.

  • @aflury  1 year ago  +1

    I appreciate your emphasis on (in)efficiency. It's easy to ignore, but it's important.

  • @annihilator247x  1 year ago  +4

    With the winter months coming, I must say it is very considerate of Intel, Nvidia, and AMD to provide their users with space heaters that also compute to help them keep warm! They truly have your best interests at heart.

    • @NovaDoll  1 year ago

      Man, the 13900K draws more power than high-end GPUs, and at least AMD's 7950X doesn't thermal throttle.

    • @blkspade23  1 year ago

      So much irony for me in that post. I mined BTC like 10 years ago in my bedroom that had a broken radiator. If only I had the foresight to hang on to those 6 BTC.

  • @singular9  1 year ago  +8

    The ryzen+radeon combo for efficiency and not having to buy new power supplies and breaker panels seems like the way to go.

    • @TheBURBAN111  1 year ago  +1

      Ryzen 7000 is also power hungry and hot asf...

    • @singular9  1 year ago  +8

      @@TheBURBAN111 It's hot for a reason: it's not throttling like the 13900K. Did you not watch the videos? The 7950X AT WORST pulls 150 W less than the 13900K... and any Radeon GPU pulls 200 W less than RTX 4000.

    • @Celatra  1 year ago  +1

      @@singular9 ...no?
      the 7950x pulls around 50 watts less

    • @HappyBeezerStudios  1 year ago

      ​@@singular9 For the electricity of a 13900K/4090 setup you could set up two AMD builds.

    • @recession_guy6613  1 year ago

      Core i5-13600K beating everything except the other Raptor Lake.
      - Core i9-13900K still beating 12th Gen and all Ryzen when it's locked to 88 Watts. Go look at reviews.

  • @hendrikheim5665  1 year ago  +59

    I must say the 5800X3D is just too good; I can't believe AMD released it.

    • @halrichard1969  1 year ago  +22

      Wait until the 7800X3D drops into Retail. 😆😆

    • @Celatra  1 year ago  +1

      @@halrichard1969 it will draw double the power tho

    • @shepardpolska  1 year ago  +20

      @@Celatra and if you limit it to half the power it will still be much faster than the 5800X3D. Zen 4 is just past the best efficiency point at stock. Eco mode on the 7950X is what, 145 watts, with like 5-10% less speed?

    • @Actiatam  1 year ago  +7

      This is what I'm seeing. We're still putting way too much emphasis on the averages in gaming. If you look at it that way, it's just additional synthetic benchmarking that nobody cares about. We're at a point with hardware where lows and stability are the most important metric. I'll take a 150avg/90low over 200avg/50low every. single. time.

    • @LucidStrike  1 year ago  +1

      @@Celatra Not in gaming.

  • @fabobg  1 year ago  +2

    You're doing an amazing job, guys! I'm convinced now you do the best hardware reviews ever. I watched a bunch of others and they don't even talk about the power consumption and throttling behavior. They focus on "but Intel is now 5% faster for gaming and cheaper". Yeah, right, definitely not cheaper if you start accounting for electricity bills for a year.

  • @sultanofsick  1 year ago

    nice touch also pausing the time bar on the crash :D

  • @brianlovelace  1 year ago  +63

    Great comparisons. In the future, as a photographer I'd LOVE to see the Puget benchmarks for Lightroom included. This is a CPU intensive load and is a big deal when I'm considering an upgrade 🙂 looks like I made the right decision going with 7950X but we'll see 🙂

    • @TRD_Kyle
      @TRD_Kyle 1 year ago +2

      Same here. On an old 8c/16t Xeon and looking for an upgrade. Personally I think the 5950X is looking good for the power consumption alone, and it'll hopefully be a bit cheaper.

    • @R055LE.1
      @R055LE.1 1 year ago

      I'm curious what demand photography can put on modern CPUs

    • @TRD_Kyle
      @TRD_Kyle 1 year ago +3

      @@R055LE.1 In my workflow most of the load comes in bursts, say going from 10-20% CPU to 100% many times per minute; what edits you're doing and the resolution of the photo determine how long those 100% periods last. Since Lightroom is pretty poorly optimized, those 100% spikes lock things up and you just have to wait, which is pretty rough when you're playing around with a bunch of different sliders to see what looks best and your changes aren't happening live. There are some sustained loads like panoramas, but those are usually only 10-30 seconds, so not terrible. Importing and exporting is where you can have pretty long sustained loads, similar to video editing. Imports are probably the worst for me, since you're importing all of the photos from a shoot, whereas exports are only the select good ones. I've had imports last for an hour or so when there are thousands of images to build previews for while importing, so that they're easier to work with once editing.
      So it is quite taxing on the CPU and also RAM, where I'll sit at 100% on both (8c/16t Xeon + 32GB) for quite a while at times, depending on what I'm doing; some editing functions are heavier than others.

    • @FlorinArjocu
      @FlorinArjocu 1 year ago

      In case any option is on the table, I could see the M* Macs ruling the Lightroom test by a large margin. We'll have to see the tests again with these new CPUs, though.

    • @luanphan2706
      @luanphan2706 1 year ago +1

      @@R055LE.1 I do a lot of batch editing with Affinity. It is a very demanding task for the CPU and it is also very RAM-hungry. Currently my 3700X can hold out for a batch of 100-250 photos, but it will take like 10-20 minutes to finish applying lighting and effects to the entire batch. From what I can see, the 7900X would be a really good upgrade for me, since in the same amount of time I might be able to load 250-500 photos into a batch, making things much more productive. But at current prices it sucks to go AM5, so maybe I'll hit the 5950X :D

  • @vertigo1055
    @vertigo1055 1 year ago +16

    One of the reasons I personally love this channel's reviews is that I do work (it's really a hobby) in texture processing for 3D. This includes processing the textures in 1K, 2K, 4K, and even 8K resolutions, and then each one gets compressed. To test the textures I load up whatever mod engine or game engine I'm currently working in, which "cold loads" the shaders that will be compiled later by the engine; these are then decompressed on the fly as needed. So for me, four of these charts are extremely relevant to my workflow. That doesn't even include any of the 3D work, where high poly counts destroy the framerate in my DCC applications. For a multi-disciplined workflow, these charts are essential in making my decision to finally jump ship from my OC'd, water-cooled 9900K + 2080 Ti system, which I push to its limits daily. Not counting the 11th-gen puke that came from Intel, I can still extrapolate from the AMD CPUs that competed with my 9900K: since those posted almost identical numbers to my chip, the performance difference against these modern CPUs is staggering, even though the 9900K itself isn't represented (no reason for it to be at this point). Cheers! Stay Healthy and Stay Sane!

    • @assetaden6662
      @assetaden6662 1 year ago

      If you've decided, what's going to be your choice? Just curious, nothing more.

  • @Phil_Goodman
    @Phil_Goodman 1 year ago

    Thank you so much! We love you! And your hair! I think I finally decided which one I want after your review!

  • @yvesmillette1721
    @yvesmillette1721 1 year ago +1

    Got this today as an upgrade from an i7-8700K and it's a world of difference. Reduced my C++ project compile times by ~70%.

  • @CBadger
    @CBadger 1 year ago +7

    I like the idea of, in a few years, having a 6090 Ti pulling 850W and an Intel 15900K pulling 600W, needing a 2000W power supply to power it all, but overall saving money on energy by hooking the water-cooling loop up to my household radiators to heat the house during winter. It's going to be great. Nvidia and Intel really looking out for us during this energy crisis.

  • @jakestocker4854
    @jakestocker4854 1 year ago +11

    CPU now pulls what your GPU used to pull at the absolute MAX l o l

  • @HD-Gaming-Zero-Them-Down

    Doesn't matter, this computer is also my personal heater during the winter. I love it!!!!

    • @kaptenhiu5623
      @kaptenhiu5623 1 year ago

      Ha! And everyone laughed so hard when AMD released their Bulldozer FX-9590 with its 220W TDP back in 2013, calling it a heater 😂, and now that's become the industry standard.

  • @Blafard666
    @Blafard666 1 year ago

    I just wanted to write a nice comment about your new studio setup, so: nice setup, guys!

  • @chrisness
    @chrisness 1 year ago +14

    I wish you did a gaming power consumption test. That would be more relevant to more people compared to the all-core full-load test.

  • @Mateus01234
    @Mateus01234 1 year ago +3

    Still rocking a 1700 here at 4GHz. Going strong!

  • @KarimTemple
    @KarimTemple 1 year ago +1

    I'd really love to see what it looks like when you compare Zen 4 vs. DDR4-based 13th gen (and vs. AM4).

  • @rtyzxc
    @rtyzxc 1 year ago +3

    Looking forward to seeing the gaming performance, and especially the 1% lows, with E-cores disabled.
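
    Short of disabling E-cores in BIOS, one software-side approximation is pinning the game to P-cores via CPU affinity. A minimal Python sketch using psutil, assuming the typical 13900K enumeration where logical CPUs 0-15 are the 8 hyperthreaded P-cores and 16-31 are E-cores (an assumption; verify the layout on your own system):

    ```python
    import psutil

    # Assumption: P-core threads are enumerated first (logical CPUs 0-15).
    P_CORE_LOGICAL_CPUS = list(range(16))

    def pin_to_p_cores(pid: int) -> None:
        """Restrict a process to the assumed P-core logical CPUs only."""
        proc = psutil.Process(pid)
        proc.cpu_affinity(P_CORE_LOGICAL_CPUS)  # set the affinity mask
        print(f"{proc.name()} now limited to CPUs {proc.cpu_affinity()}")

    # Hypothetical usage with a game's PID:
    # pin_to_p_cores(12345)
    ```

    Note this only keeps the pinned process off the E-cores; it isn't identical to disabling them in firmware, since the OS scheduler and other processes still see them.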

  • @awesomecalin1498
    @awesomecalin1498 1 year ago +3

    Could you give more info for the CSGO benchmark? What settings, what map? I'm familiar with CSGO benchmarking and GN's results are always low, with way less variance between processors than expected.

  • @jantomas4884
    @jantomas4884 1 year ago +3

    Hi GN, I would like more commentary on the temps and cooler choices for this CPU, and how the cooler affects the pricing.
    I mean, with 12th gen an NH-D15 would probably still be enough to avoid thermal throttling if the case has many holes, but at 300 watts I would like to know what the "minimal cooling requirements" are.
    Also, a comparison with the R9 7950X could be fun, because Intel might actually be running cooler (with the same cooler) because of AMD's thick IHS.

  • @gnlgrim2
    @gnlgrim2 1 year ago +9

    I must say, your production quality in the recent videos has shot up. Well done to you and the team. Keep up the great work.

    • @mattholwood
      @mattholwood 1 year ago

      And they invested in testing equipment well before the superficial production "values". Such a high-quality channel.

  • @jordanmartinetti8224
    @jordanmartinetti8224 1 year ago

    The staging is amazing in this video. The lighting is awesome, I love it!

  • @TrTai
    @TrTai 1 year ago +6

    I really want to do a platform upgrade and move my 1600 over to a home server, but man, just dropping in a 5800X3D is tempting. I'm hitting bumps regularly with my existing setup, so it could use the upgrade, but I think I'll hold out just a bit longer for RDNA 3 and the 7000-series 3D parts.

    • @zkilla4611
      @zkilla4611 1 year ago +1

      That's what I am waiting for. Sounds promising. AMD still wins on performance per watt, but it's getting close. Interested to see what RDNA 3 brings. I'm not too impressed with the 4090, mainly the DisplayPort 1.4a. What's the point in hitting 200 FPS when the port only supports 120Hz at 4K?

  • @petkozhivkov4271
    @petkozhivkov4271 1 year ago +28

    If you limit the 13900K by current (amps), it becomes much more energy efficient than if it is limited by power limit (PL).
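
    A worked P = V × I sketch of why an amperage cap behaves differently from a fixed wattage cap: under a current limit, the effective power ceiling scales with core voltage, so the chip is pushed toward lower-voltage, more efficient operating points. The 280 A cap and the voltage values below are illustrative assumptions, not measured 13900K figures:

    ```python
    # P = V * I: with a fixed current (IccMax) cap, the effective power
    # ceiling moves with core voltage, unlike a fixed PL in watts.
    icc_max_a = 280.0  # assumed current limit in amps

    for vcore_v in (0.9, 1.0, 1.1, 1.2):
        power_ceiling_w = vcore_v * icc_max_a  # P = V * I
        print(f"Vcore {vcore_v:.1f} V -> effective cap ~{power_ceiling_w:.0f} W")
    ```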

    • @elgreco502
      @elgreco502 11 months ago

      Any suggestions on how exactly to do it? I used Bauer's advice to make it run at 90W and set a custom 1.050V. Anything else? Thanks.

  • @andysPARK
    @andysPARK 1 year ago

    Superb comparison, interesting and balanced presentation.

  • @strongforce8466
    @strongforce8466 1 year ago +8

    How about doing some benchmarks with the E-cores disabled to see if it helps those 1% lows? Also, I feel the 4090 scores needed a 5800X3D, seeing how well it performs overall!!

    • @benjaminoechsli1941
      @benjaminoechsli1941 1 year ago

      Agreed! I'd love to know how the next-gen scheduler, if it is new, works better. Also, it would be very cool to see how badly the X3D chip was GPU-bottlenecked when it came out.

  • @XyonusTV
    @XyonusTV 1 year ago +3

    Perfect, now our PC parts can serve as a heater!

  • @dkman123
    @dkman123 1 year ago

    I really feel like it would be easier to put a ">" rather than having "lower/higher is better". It would be easy to spot, to quickly discover whether small or large bars are better. You could have it at the top and at the bottom, since it's space-friendly.

  • @phoenix7289
    @phoenix7289 1 year ago

    Did you guys use new cameras for this video? Looks really nice!!

  • @soulshadow9534
    @soulshadow9534 1 year ago +5

    I would love to see you test their claim that the 13900K at 65W performs on par with the 12900K at 241W. (There was a chart in their presentation showing "Scalable Performance per Watt" with multi-threaded performance.) I sincerely doubt this claim from their presentation holds up.
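
    For what that claim implies if taken at face value, a quick Python check; the equal score is a placeholder assumption, and only the 65 W and 241 W figures come from the claim quoted above:

    ```python
    # If a 65 W 13900K really matched a 241 W 12900K in multi-threaded work,
    # the implied efficiency gap is just the power ratio.
    score = 100.0  # hypothetical equal MT score for both configurations

    perf_per_watt_ratio = (score / 65.0) / (score / 241.0)
    print(f"Implied perf/W advantage: {perf_per_watt_ratio:.1f}x")  # ~3.7x
    ```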

  • @peace_maybenot
    @peace_maybenot 1 year ago +4

    As higher power draw becomes more and more prevalent, I keep saying the same thing: I would love some discussion, with accompanying graphs, showing total system power draw for the highest-end (or near-highest-end) hardware available, and approximations of how much it would cost to run the system at various electricity prices.
    Something like three computer setups - an average gamer computer, an energy-efficient setup, and a maximum-power-draw setup - with a range of power costs per hour, just as an example.
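
    A minimal Python sketch of that comparison; every wattage, usage figure, and $/kWh rate below is a made-up placeholder to show the arithmetic, not measured data:

    ```python
    # Yearly electricity cost for three hypothetical setups. Substitute real
    # wall-meter readings and local rates for meaningful numbers.
    setups_w = {
        "average gaming PC": 450,
        "energy-efficient build": 300,
        "maximum-draw build": 900,
    }
    rates_usd_per_kwh = (0.15, 0.30, 0.45)  # assumed price brackets
    hours_per_day = 4.0                     # assumed daily gaming time

    for name, watts in setups_w.items():
        kwh_per_year = watts / 1000.0 * hours_per_day * 365
        line = ", ".join(
            f"${kwh_per_year * r:.0f}/yr @ ${r:.2f}/kWh" for r in rates_usd_per_kwh
        )
        print(f"{name} ({watts} W): {line}")
    ```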