Ryzen 7 7800X3D Upgrade! The Fastest Gaming CPU vs

  • Published 19 Nov 2023
  • ► Watch the FULL video here: • DF Direct Weekly #137:...
    ► Support us on Patreon! bit.ly/3jEGjvx
    ► Digital Foundry YouTube: / digitalfoundry
    ► Digital Foundry at Eurogamer: eurogamer.net/digitalfoundry
    ► Follow on Twitter: / digitalfoundry

Comments • 631

  • @MarikHavair
    @MarikHavair 8 months ago +69

    Yep, got a 1700 on an X370 back in 2017, running a 5800X3D off of that now, what an upgrade.

    • @timothyandrewnielsen
      @timothyandrewnielsen 8 months ago +15

      Best upgrade ever, the 5800X3D

    • @MarikHavair
      @MarikHavair 8 months ago +12

      @@timothyandrewnielsen It certainly is the best in-socket upgrade I've seen personally.
      AM4 is probably going to be a bit of a legend in generations to come. We'll all be like 'back in my day'.

    • @ekhyddan
      @ekhyddan 8 months ago +6

      @@MarikHavair I hope that AM5 will have the same upgradability.

    • @dutchdykefinger
      @dutchdykefinger 8 months ago +2

      i started on an r5-1600 on a b350 in may 2017 when it was pretty fresh
      i slotted a 3700x into the b350 last year or something :D
      not quite as big of a jump as you got, but 28% better single threading still ain't nothing to sneeze at. doubling up that l3 cache and the separated uncore makes it far less memory clock sensitive too, so i'm riding this AM4 bitch until the wheels fall off

    • @timothyandrewnielsen
      @timothyandrewnielsen 8 months ago +2

      @@dutchdykefinger started on x370 with 1800x. Same. 5800X3D until I see CPUs come out with comparable power efficiency and double the speed.

  • @i11usiveman97
    @i11usiveman97 8 months ago +146

    I have the 7800x3d with a 7900xtx and Lords of the fallen still stutters when moving between areas even though it's had many patches to date. Nothing a CPU can do about a poorly optimised game engine.

    • @solo6965
      @solo6965 8 months ago +4

      Nothing is perfect

    • @BornJordan
      @BornJordan 8 months ago +28

      Yeah this game is just poorly optimized, it has nothing to do with CPU

    • @DeepStone-6
      @DeepStone-6 8 months ago +11

      At that point blame the devs, that's pretty much an apex PC configuration.

    • @S4_QUATTROPOWER
      @S4_QUATTROPOWER 8 months ago +1

      Getting the 7950X3d instead

    • @100500daniel
      @100500daniel 8 months ago +14

      Dev response: Obviously it's because you have a plebian 7900 XTX instead of an RTX 4090. Quit being Cheap 🫰

  • @SogenOkami
    @SogenOkami 8 months ago +49

    Been using this CPU for the past month on my flight sims and good lord it destroys everything I throw at it!

    • @Teramos
      @Teramos 8 months ago +4

      so does my 14700k@350watt :D damn, the performance/wattage is through the roof on the 7800x3d

    • @KiSs0fd3aTh
      @KiSs0fd3aTh 8 months ago +1

      @@Teramos When your 14700k pulls 350 watts it's literally twice as fast as the 7800x3d. In fact even at 200w it's twice as fast as the 7800x3d.

    • @coffee7180
      @coffee7180 4 months ago +4

      @@KiSs0fd3aTh wtf are you talking about? i don't get it.

    • @H786...
      @H786... 3 months ago +3

      @@Teramos that is CAP, the 7800x3d is easy to run on an air cooler lol.

    • @a-qy4cq
      @a-qy4cq 2 months ago +1

      @@Teramos Your brain performance is defo not through the roof

  • @roki977
    @roki977 8 months ago +134

    what that cpu does with like 50w is the most impressive thing for me.. 5800x3d/7800xt here, great for my 240hz screen..

    • @abdirahmanmusa2747
      @abdirahmanmusa2747 8 months ago +1

      What are your frames looking like?

    • @Vinterloft
      @Vinterloft 8 months ago +6

      I waited and waited with a 3570K for the first Ryzen mATX X-chipset motherboard. That's the one thing I will never compromise on. It took until X570 to arrive (thanks ASRock!) and by that time I already could go for 5800X3D as my first upgrade since Ivy Bridge. 11 years, how's that for an update cycle!?

    • @roki977
      @roki977 8 months ago

      @@abdirahmanmusa2747 i can get it over 200 in warzone, close to 300 in the Finals beta, i am waiting for that one to come out... nothing fancy, just cheap 3600 cl16 sticks and a b550 mb...

    • @roki977
      @roki977 8 months ago +1

      @@Vinterloft i am not far from that, a 3770k and gtx 580 Lightning Extreme was my last serious rig until i bought am4 about a year and a half ago, but i had a 5600 and 5700x in there before the x3d

    • @mangatom192
      @mangatom192 8 months ago

      @@Vinterloft I got an i3 2120 before I upgraded to an am5 platform (7600). I can DEFINITELY feel the difference.

  • @mattx4253
    @mattx4253 8 months ago +20

    i got a 7800x3d and 4090 and it's solid as a rock so far. Crazy cool as well, my old intel and 3080 setup was a room heater....

    • @KiSs0fd3aTh
      @KiSs0fd3aTh 8 months ago +13

      That's the kind of nonsense that is spread around. Your 3080 pulls around 320w, probably three to four times more power than your Intel did (assuming you played 1440p or 4k), your current build probably pulls even MORE power than your old one, but somehow your old one was a space heater. Right, sure buddy. What fucking ever

    • @mattx4253
      @mattx4253 8 months ago +1

      @@KiSs0fd3aTh I can't explain it other than the 10-core intel always needed max power. The 3080 Aorus Extreme was factory OC'd to the max, and I'm now using 2 dimms of ram, not 4, but my room is cooler. The system doesn't even try hard in games most of the time. It's just more efficient at hitting 144hz, so your thinking is flawed as you are only concerned with max tdp

    • @KiSs0fd3aTh
      @KiSs0fd3aTh 8 months ago

      @@mattx4253 I have a 4090 myself dude, both my 12900k and my 14900k are below 80w (in fact the 12900k hovers around 50-60) playing at 4k. Does your 3d pull 20w or what?

    • @mattx4253
      @mattx4253 8 months ago

      @@KiSs0fd3aTh I had a 10850k clocked at 5ghz all core. Pulled mega wattage. My 3080 was the extreme model. I'm also not at 4k, i'm at 1600p widescreen 38". This 7800X3d and 4090 Trio X combined just max out my 144hz in the sweet spot of the power curve, which is efficient. My 10850k and 3080 were always maxed out and couldn't break 100fps in most games even turned down.

    • @KiSs0fd3aTh
      @KiSs0fd3aTh 8 months ago

      @@mattx4253 So you are comparing a stock with a clocked CPU, right. Are you autistic by any chance? And of course your old build could break 100 fps, you had a 3080, lol.

  • @pottingsoil723
    @pottingsoil723 8 months ago +11

    Would be really great to hear you guys talk about cache-coherent interconnects like CXL providing high-bandwidth PCIe at extremely low latency, and if/when we might see this tech migrate from servers to mainstream and the effect it could have on gaming and computing.
    A few years ago we heard all about the PS5's I/O accelerator and how the Oodle Kraken & texture decompression was a game-changer, but we've yet to see anything like it on PC (Windows' crappy implementation of DirectStorage does not count, in my opinion).
    What are your thoughts on this?

  • @Typhon888
    @Typhon888 7 months ago +4

    Every real hardware enthusiast already knows X3D is just hype. It's good at 10 games with a $2,500 GPU… then look at the multithreaded performance, it's worse than my 11900k from 2019.

    • @mitchmurray2260
      @mitchmurray2260 7 months ago +1

      exactly... it's strictly a 1080p chip to run up the fps

    • @evilleader1991
      @evilleader1991 7 months ago

      lol what?? x3d really improves the 1% and 0.1% lows lmao @@mitchmurray2260

  • @larrytaylor2692
    @larrytaylor2692 8 months ago +13

    Doesn’t matter how good your cpu is when these devs refuse to optimize

    • @stealthhunter6998
      @stealthhunter6998 8 months ago +7

      Or in the case of unreal engine, the game engine being the problem and devs choosing it.

    • @Shieftain
      @Shieftain 8 months ago +3

      @@stealthhunter6998 Allegedly UE5.4 will bring some pretty big multi-threading optimizations. But given how game development has become this generation, we probably won't see these benefits until 2025 sadly.

    • @portman8909
      @portman8909 8 months ago

      Unreal Engine 5 games have all had stupid system requirements.@@Shieftain

    • @Shieftain
      @Shieftain 8 months ago

      @@portman8909 Indeed, even when software Lumen is in use it's pretty steep. Disabling it boosts FPS a decent amount, but still the requirements are high. IMO games that have a large amount of static lighting are better off developed in UE4 as the older engine is in a good place now (mostly). Take Ghostrunner 2 and Lies of P for example. Both look visually solid and run extremely well.

    • @divertiti
      @divertiti 3 months ago

      Nothing to do with optimization, modern games with modern visuals need more than a 6 yr old potato to run on

  • @stanisawkowalski7440
    @stanisawkowalski7440 8 months ago +19

    Guys, you just don't know how much I miss your CPU reviews! Direct comparisons with framerate and frametime graphs show much more than the typical AVG+LOWs graphs we see everywhere.

  • @dotxyn
    @dotxyn 8 months ago +69

    Enabling EXPO at 6400 may have halved the unified clock reducing performance. Check ZenTimings and make sure the UCLK matches the MCLK. If not, you need to adjust the UCLK setting to UCLK=MCLK (not /2). Hopefully your X3D will be able to run 6400 UCLK 1:1, otherwise you may need to drop down to 6200 or 6000.
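The 1:1 vs 1:2 check this comment describes can be sketched in a few lines (an illustrative Python toy, not a tuning tool; `uclk_mode` is a made-up helper name, and real values would come from something like ZenTimings):

```python
def uclk_mode(mclk_mhz: float, uclk_mhz: float) -> str:
    """Classify the memory-controller ratio: 1:1 (UCLK=MCLK) vs 1:2 (UCLK=MCLK/2)."""
    ratio = uclk_mhz / mclk_mhz
    if abs(ratio - 1.0) < 0.02:
        return "1:1"
    if abs(ratio - 0.5) < 0.02:
        return "1:2 (halved UCLK, performance loss)"
    return "unknown"

# DDR5-6400 transfers data twice per clock, so MCLK is 3200 MHz.
print(uclk_mode(3200, 3200))  # -> 1:1
print(uclk_mode(3200, 1600))  # -> 1:2 (halved UCLK, performance loss)
```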

    • @the.wanginator
      @the.wanginator 8 months ago +17

      You beat me to it. Even AMD states that 6000 CL30 is the sweet spot. Also, make sure the chipset driver is installed.

    • @xouri8009
      @xouri8009 8 months ago

      Isn’t he using DDR4? Or did I misunderstand?

    • @erickelly4107
      @erickelly4107 8 months ago +2

      @@the.wanginator
      This doesn't mean that 6200+ MT/s is somehow "worse"; higher will be better as long as you can run it.

    • @mugabugaYT
      @mugabugaYT 8 months ago +7

      @@xouri8009 7800X3D only supports DDR5

    • @arc00ta
      @arc00ta 8 months ago

      Also, you want to make sure your board didn't ramp up the VDDP with the profile. Asus and MSI are pretty bad with this; that's the DDR5 I/O, and anything over 1.000V (the default AMD setting is 800mV) can make it super unstable without crashing.

  • @ralphmiranda2077
    @ralphmiranda2077 8 months ago +3

    I've been rocking this chip since earlier this year… I'm lovin' it!

    • @Makavelli2127
      @Makavelli2127 8 months ago

      I just bought this cpu for my first pc build, can't wait!

  • @dutchdykefinger
    @dutchdykefinger 8 months ago

    even with async shader compiling, if there are a lot of shaders to compile at once, the API (read: overhead for every compile job sent) just kind of gets in the way i guess

  • @BLUEKNlFE
    @BLUEKNlFE 7 months ago

    what are RPCS3 and windows scheduler? how do you enable it?

  • @45eno
    @45eno 5 months ago +2

    I went from a 5900x to a 5800x3D to a 5800x to a 7600x to a 7800x3D in the last 10 weeks. The vanilla 5000 series definitely didn't let my 6900xt breathe as well compared to both the 3D and the 7600x.
    Snagged smoking prices with very very little out of pocket. The 7800X3D was a local buy but was sealed brand new for $340. Sold my 5800x3D for $280.

  • @gibbles3216
    @gibbles3216 8 months ago +26

    The parallelization is the hard part. You almost have to make a script that will look for CPUs and create profiles based on the core and thread count. This will take time. This is where AI would be very helpful. It could be in the background optimizing threads for 4, 6, 8, 12, and 16 core CPUs. It would run a check to see the core count and execute the script to start the threading in the Windows scheduler or a pre-baked in-engine scheduler. This way tasks can be offloaded to various threads based on criticality. For instance, a thread may just be used to compile shaders in the background constantly. Future engines will need to focus on multi-core threading in earnest. You can have the fastest IPC on the planet; it will mean nothing if the engine cannot properly use most, if not all, of the cores and threads at its disposal. I have a 5950x. In newer games it is only being used at 7-20% based on what is enabled. I know it is not the fastest CPU on the market in terms of IPC, but if utilized properly it is still very fast. TLDR: the CPUs are not really the problem unless they are budget-level CPUs. The utilization of them is, and will be till engines are centered around getting the most out of them.
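The core-count profiling idea in this comment can be sketched roughly like this (a toy Python illustration, not an actual engine scheduler; `build_profile` and the critical/background split are invented for the example):

```python
import os
from concurrent.futures import ThreadPoolExecutor

def build_profile(logical_cores: int) -> dict:
    """Split available threads by task criticality, scaling with core count."""
    # Reserve roughly half the threads for latency-critical work
    # (render/simulation) and leave the rest for background jobs,
    # such as continuous shader compilation.
    critical = max(1, logical_cores // 2)
    background = max(1, logical_cores - critical)
    return {"critical": critical, "background": background}

profile = build_profile(os.cpu_count() or 4)

# A background pool an engine could keep busy compiling shaders.
shader_pool = ThreadPoolExecutor(max_workers=profile["background"])
```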

  • @ROAMZ101
    @ROAMZ101 2 months ago

    Any solid b650 motherboards that don't have weird issues like usb disconnects or bad driver issues? It's the only thing holding me back from switching to team red.

  • @icky_thump
    @icky_thump 8 months ago +42

    Just upgraded to this when MicroCenter put it on an early Black Friday sale for only $299! It quickly went back up to $369 like a day later 😅. So far so good. My games' 1% lows seem a lot stronger, the GPU stays near 99-100% more often, and strangely it runs cooler than my 7700X.

    • @phantom498
      @phantom498 8 months ago

      What’s your gpu?

    • @DarthSabbath11
      @DarthSabbath11 8 months ago

      Just bought mine for 398, ffs 😔

    • @ionnicoara6657
      @ionnicoara6657 8 months ago

      You can ask for a price match. I asked newegg. They didn't give 299 but went to 349

    • @andrewmorris3479
      @andrewmorris3479 8 months ago +7

      Firstly you scored on that ridiculous price! Secondly, the 7800X3D pulls about 30-40% less wattage in gaming and almost 50% less in multi core workloads.

    • @BiscuitMan2005
      @BiscuitMan2005 8 months ago +1

      Wow, and I thought 369 was a deal from them when I got it there last week. I have the worst luck

  • @clutch7366
    @clutch7366 7 months ago +1

    Disabling smart access memory helps sometimes, like it did for me in Warhammer Darktide.

  • @jakub_zawadzki
    @jakub_zawadzki 7 months ago

    If I do some productive tasks, mainly video editing, but also play a lot of games, would it be better to get a 7900 or a 7800x3d?

  • @mrjiggs8
    @mrjiggs8 8 months ago +18

    I went with the 7800x3d awhile ago.... what's impressed me most.... the reduction in my electric bill.... amazing the performance per watt with it and my 7900xtx.

    • @TheJonazas
      @TheJonazas 8 months ago +3

      If you wanted the ultimate reduction in power while maintaining high fps, then the 7800x3d with a 4090 would be much better than your 7900xtx. The 4090 is so efficient when undervolted.

    • @Madhawk1995
      @Madhawk1995 8 months ago +3

      @@TheJonazas as typical of amdTards, they complain about intel cpu power draw while their rdna3 gpus pull that much in difference simply idling. 7800x3d + 4090 is the most power efficient setup.

    • @Jasontvnd9
      @Jasontvnd9 8 months ago +12

      @@TheJonazas Yeah takes a while to make up for the extra $1000 spent on the card in the first place though doesn't it.

    • @BlacKi-nd4uy
      @BlacKi-nd4uy 8 months ago +1

      i changed from an undervolted 13600k with a 4800/3000mhz all-core setting and 3600 cl14 ram to a 7800x3d CO -30 5800 cl28 setup. around +50% performance at -50% power consumption. the power consumption was the biggest hit for me. to be fair, under an fps lock and gpu limit both have enough power and the power consumption is only a little lower on the 7800x3d. only in cpu-limited scenarios is the power consumption difference big.

    • @pakjai5532
      @pakjai5532 8 months ago +2

      ​@Madhawk1995, going the 4090 route is like paying for your electricity bills in advance.😂

  • @chrisbullock6477
    @chrisbullock6477 8 months ago

    Would like to hear an update on this Alex. How's things going?

  • @t3chn0m0
    @t3chn0m0 8 months ago +10

    If you run the memory at 6400MT, please make sure that it's running FCLK/MEM Div 1:1, otherwise it's 1:2 and you lose quite some performance.

    • @JFinns
      @JFinns 7 months ago +2

      Great tip, this is why even AMD recommends 6000 MT/s CL30 ram for these CPUs.

  • @SanderSander88
    @SanderSander88 8 months ago +1

    Just bought my pc upgrade, it's coming today. got it for 350€, so that's a good deal for where i live. So excited to get building.

  • @RhythmicJAY
    @RhythmicJAY 8 months ago

    no fresh OS install?

  • @SvDKILLSWITCH
    @SvDKILLSWITCH 8 months ago +3

    Definitely agree with Richard on the 5800X3D. I think it's worth keeping in mind that AMD tried several times to break forwards compatibility on AM4, but the whole platform is better off for their eventual concession and extension of AGESA support to 300-series boards. There are other limitations to consider when using such an old board (limited PCIe speeds and expansion being the main one), but I was able to drop a 5800X3D into the same board I purchased back in March 2017 with a Ryzen 7 1700 - nothing short of a gigantic increase in CPU and especially gaming performance. Newer CPUs and BIOS updates also dramatically increased RAM compatibility to the point where I've gone from having to downclock a two-stick RAM setup from 3200 to 2666 to get stability all the way to four sticks at a lazy 3600 CL16 with headroom for lower timings.

    • @PeterPauls
      @PeterPauls 8 months ago +1

      I did a similar upgrade. I have an MSI B450 Tomahawk Max motherboard with great VRMs; it had a Ryzen 7 2700X in it and I upgraded to a Ryzen 7 5800X3D, which was a huge performance leap. My only problem is that there is only a single M.2 PCI-E 4x slot, so I use a 2TB SSD in it. I put an RTX 4080 in the CPU-only PCI-E x16 3.0 slot, but so far there's no PCI-E bandwidth bottleneck and the CPU doesn't bottleneck the GPU either.

    • @Solrac-Siul
      @Solrac-Siul 8 months ago +2

      it is a mixed bag in all honesty. While we do get - because i kind of did the same - a new sparky cpu, we miss out on other amenities that a 2017 board like mine didn't have compared with a 2020 or 2021 board, which means that we actually do not take full advantage of the cpu... we get close, maybe 90 to 95% of the real potential. Eventually for professional reasons in early 2022 (February) I actually needed more m.2 4x slots, and because of that, in that specific timeframe, I had to go the intel way, since the z690s had been out and had 4x m.2 4x slots - and AMD's upcoming platform was by then still multiple months away. So sometimes we value cpu socket upgradeability too much: while it is nice some 4 years down the road to have an upgrade, in 4 years multiple things have moved on and we end up lagging on multiple aspects, and that, depending on one's personal needs, can vary from a minor inconvenience to actual time loss, which is bad when time=money.

  • @nickvirgili2969
    @nickvirgili2969 7 months ago

    Yes, display port is better that way for some reason? I haven't fully mapped it yet.

  • @rluker5344
    @rluker5344 8 months ago

    So in heavy CPU use areas like crowded areas in Baldur's Gate 3 or CP 2077 a 7800X3D falls to about even with a stock 12900k with XMP ram.

  • @prototype8137
    @prototype8137 8 months ago +3

    Jeez. The star citizen footage is like night and day. 😅

  • @kravenfoxbodies2479
    @kravenfoxbodies2479 8 months ago

    I bought a combo deal back in July of this year at Best Buy for a Ryzen 5 7600X / Gigabyte B650 Aorus Elite AX / Starfield for $289.99 USD. I have yet to get the board to update the bios; QFlash always says bad image.
    All my MSI boards update from QFlash without issues from the same USB 2.0 flash drive, this Gigabyte board wants to be so different.

    • @nivea878
      @nivea878 7 months ago

      Take bios F2; if you use a usb stick, change the file name to gigabyte.bin (doesn't matter if big or small letters)

  • @mrb3991
    @mrb3991 5 days ago

    Hi, I just got the same chip, coming from a ryzen 7700 non-X chip ??

  • @Train115
    @Train115 7 months ago

    I've been using one for a few months now, it's strong. :)

  • @bmw523igr
    @bmw523igr 8 months ago +3

    Great timing. Yesterday I changed my platform from an AM4 5600 to an AM5 7800X3D with a 7800XT. The setup is not complete (missing one more M.2 drive), but I can't wait to use it.

    • @ZackSNetwork
      @ZackSNetwork 7 months ago

      You are GPU bound, however; nice CPU.

  • @JoePolaris
    @JoePolaris 8 months ago

    Have you guys looked at ram tuning, matching Infinity Fabric speed?

  • @rob4222
    @rob4222 8 months ago +1

    I want more CPU benchmarks. CPUs do matter too

  • @dante19890
    @dante19890 8 months ago

    1:45. John's delayed laugh cracked me up XD

  • @jergernice1
    @jergernice1 8 months ago +2

    just built my 7700x, seems to work great. have no idea what scheduling he's talking about adjusting?

    • @gellideggti
      @gellideggti 8 months ago

      It will not be a problem for the chip you have mate, it's more for the 16-core dual-CCD parts, one CCD with 3D cache, one without. They use the Microsoft game stuff to sort out which half of the chip to use: the 3D half or the standard half.

  • @bradcole2015
    @bradcole2015 5 months ago

    Why would you use HDMI over DP? Also, use DLAA instead of DLSS

  • @HallyVee
    @HallyVee 7 months ago

    Just did this upgrade myself, but can't get the am5 ram timings right, can't get the nvme format right, even the gpu had a clearance issue. Think I'm done with building, it was fun way back when, when the most complicated part was mb compatibility etc. Now I gotta browse 14 different forums...

    • @nivea878
      @nivea878 7 months ago

      it's the AM5 platform, hit or miss. Intel is superior; i have a 7800x3d and 14700K, and with AMD every time something meh happens

  • @razzorj9899
    @razzorj9899 8 months ago +3

    I just upgraded from a ryzen 5600 to the 7800x3d and wow, what an upgrade. More than doubled my fps in warzone 2 and mw3, and I play at 3440x1440, so yes, the cpu matters at higher resolutions as well

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 7 months ago

      CPU matters at 4k especially with a 4090, my 4090 is pretty much never at 99% usage on any game at 4k Ultra, and RT makes matters worse because a 5900X just can't do it. But seeing a 7800X3D and 14900K on benchmarks, they only do like 30% better which isn't worth spending another $2000AUD just yet.

    • @xbox360J
      @xbox360J 6 months ago

      Same here !!! That cpu upgrade has been awesome along with my 3080.

  • @AshtonCoolman
    @AshtonCoolman 8 months ago +8

    I have a 7800X3D, RTX 4090, and CL30-36-36-76 DDR5 6000 all on an ASUS X670E Crosshair Hero and my system is flying.

    • @baghlani92
      @baghlani92 8 months ago

      I got cl30 38 38 96, is it a good one?

    • @Strengthoftenmen
      @Strengthoftenmen 8 months ago

      I have the exact same setup: 7800x3D, 4090, x670e hero. Buuut initially I was running Corsair Vengeance 5600 and the system kept crashing. Just purchased some G.Skill 6000 cl30, hoping it will improve things.

    • @AshtonCoolman
      @AshtonCoolman 8 months ago

      @@Strengthoftenmen Have you updated your BIOS to the latest version? It's a necessity on these boards. The BIOS this board shipped with has unstable DDR5 support and also will kill your 7800X3D due to too much SoC voltage.

    • @Strengthoftenmen
      @Strengthoftenmen 8 months ago

      @@AshtonCoolman thanks for the reply. Yes I have updated the bios. I think the mb is just very sensitive to timings?

    • @FearlessP4P1
      @FearlessP4P1 7 months ago

      @Strengthoftenmen how has the new ram been working? I've heard of a few issues with Corsair and another ram brand, but I've yet to hear of any issues with G.Skill

  • @that_colin_guy
    @that_colin_guy 8 months ago +2

    3:15
    Displayport Superiority 🤘🏻

  • @HardwareAccent
    @HardwareAccent 8 months ago +4

    Is it possible that some traversal stutters can be affected by RAM and/or NVME speed?

    • @Chilledoutredhead
      @Chilledoutredhead 8 months ago +2

      It's a fair bet he has the games on a good nvme drive and his RAM is a really good high-speed kit. So my educated guess would be no. Just games that are not designed for pc, because companies just want to optimise for console.

  • @andrewg2536
    @andrewg2536 7 months ago

    anyone know if I would see much improvement going from an RTX 4080 with a 5900X to a 7800X3D for 4K gaming on an LG CX?

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 7 months ago +1

      Most likely a noticeable difference, but the best way to know is by monitoring your GPU usage with the framerate uncapped. I know a 5900X isn't enough for a 4090 at 4k, the 4080 is going to be far less bottlenecked but I'd imagine to some extent it still is and the 4070 is probably the fastest card I'd pair with a 5900X

  • @dylanbeazley6739
    @dylanbeazley6739 7 months ago

    Hey guys! Was wondering if you think running a stream, vtubing and discord while gaming is possible on the 7800x3d without dropping 1% lows in valorant. Trying to do all that and have 360hz feel like 360hz while streaming haha

    • @arp1
      @arp1 7 months ago

      Yes.

    • @dylanbeazley6739
      @dylanbeazley6739 7 months ago

      Oh nice you running the same chip? If so what are your 1% lows looking like while streaming?@@arp1

  • @guntherultraboltnovacrunch5248
    @guntherultraboltnovacrunch5248 7 months ago +1

    As a guy who lives in the Arizona desert, I'm just happy with how it sips power and runs cool.

    • @M41NFR4M3Gaming
      @M41NFR4M3Gaming 5 months ago

      I live in Arizona too. I’ve had both. My 14900k runs cooler than the 7800x3d. Returned the 7800.

  • @marcotomiri3440
    @marcotomiri3440 8 months ago +1

    i need the 7950x3d for cities skylines 2, the 7800 has too few cores for that game in the late game, with more than 500k people

  • @12me91
    @12me91 8 months ago +17

    While I would have liked to have seen the 5800x3d used in its day, I'm loving the fact that you'll finally be using a 3d chip

    • @selohcin
      @selohcin 8 months ago +1

      Took the McDonald's commercials a little too seriously, did ya? 😂

    • @SpecialEllio
      @SpecialEllio 8 months ago +1

      @@selohcin loving something isn't a commercial exclusive and he didn't even say the mcdonalds quote

    • @puregarbage2329
      @puregarbage2329 8 months ago +4

      Hardware Unboxed used it in almost all of their benchmarks when it was still relatively new. It's conclusively better than the 10900k, and pretty close to the 12900k, sometimes slightly better. The 7800x3d is easily the best gaming cpu when you consider the price.

  • @nemz7505
    @nemz7505 6 months ago

    The 3DNow patch for the original Freespace was pretty impressive to be fair ;)

  • @fracturedlife1393
    @fracturedlife1393 8 months ago

    ACC massively prefers the 3d vcache too. Many games have crazy uplift, and almost as many others would prefer non X3D.

  • @f-35x-ii
    @f-35x-ii 8 months ago +5

    I'm still going strong with my 5800x3d, and i even undervolted to -30 per core and wow, temps are good and performance is still top notch! I'm not gonna upgrade the cpu for maybe 8 years, and will get whatever is top price-to-performance then. (Edit: from all-core to per-core -30 undervolt)

    • @devilmikey00
      @devilmikey00 8 months ago +1

      AMD CPUs are great in that undervolting usually increases performance. I had a 5700x for a bit, and using PBO and an undervolt got me 250mhz more boost while running cooler. I know the X3D parts can't raise their boost, but I bet it's staying at its max boost longer and more consistently now that you've done that.

    • @f-35x-ii
      @f-35x-ii 8 months ago

      @@devilmikey00 correct, mine stays at max boost all the time, it doesn't even go down at all while gaming. it was locked in earlier BIOS versions, but with the newest BIOS it is easily changed!

    • @teddyholiday8038
      @teddyholiday8038 8 months ago +2

      I'm running all-core -30 on my 5800x3D too
      It's a beast CPU but I don't think it's gonna last me 8 years. There are plenty of recent games just hammering CPUs
      I'm skipping Zen4 but I'll consider whatever the Zen5 equivalent is (9800x3D?)

    • @exoticspeedefy7916
      @exoticspeedefy7916 8 months ago +2

      Per-core is better, otherwise you could actually lose performance if you have too much of a negative offset

    • @f-35x-ii
      @f-35x-ii 8 months ago +1

      @@exoticspeedefy7916 oh sorry, that is what I meant, per core. just double-checked my BIOS

  • @ryanocallaghan8833
    @ryanocallaghan8833 7 months ago

    Does Star Citizen still have issues with E-cores? Could explain the large performance delta and stuttering on the 12900K.

    • @daweitao2668
      @daweitao2668 7 months ago

      The E-cores issue is fixed under W11 (but not W10). SC is just not optimised yet, and the extra cache really helps in titles where memory usage isn't tuned so well.

    • @ryanocallaghan8833
      @ryanocallaghan8833 7 months ago

      @@daweitao2668 Thanks. My biggest concern with the 3D V-Cache CPUs is consistency - there is a somewhat larger gap between the highs and lows compared to other CPUs. Factorio is a good example, with map sizes that "fit?" in the cache seeing large frame rate boosts, while increasing the map size to make it no longer fit in the cache and rely more on memory access makes Intel CPUs take the lead again. I've got a 13900K in my gaming machine and I find it to be an extremely reliable performer in all CPU intensive gaming scenarios.

  • @elbowsout6301
    @elbowsout6301 8 months ago +1

    I just built a new rig 6 weeks ago with a 7800X3D and an RX 6950 XT and I couldn't be happier. It's been a few years since I have built a PC from scratch and the build process was seamless. Go treat yourself to some 6000 MT/s RAM.

    • @godblessusa1111
      @godblessusa1111 7 months ago

      weak card for this processor... you need at minimum an rtx 4080

  • @doc7000
    @doc7000 8 months ago

    In BG3, likely when the frame rates are lower you are hitting a GPU bottleneck, which is why both are performing the same, seeing as they both have RTX 4090 cards; when the GPU load is reduced, frame rates climb and you start to run into a CPU bottleneck.

  • @miikasuominen3845
    @miikasuominen3845 8 months ago

    I'll most probably get one now, from the BF sales... Most probably with an ASRock X670E Steel Legend and Corsair 6000/cl30 RGB memory.

  • @mick7727
    @mick7727 7 months ago

    Games need to use direct storage. It's there... why not preload shaders and textures?

  • @elvertmack5039
    @elvertmack5039 7 months ago +3

    I went AMD...but only half way. Got a 12900k combo deal with cpu, motherboard, and ddr5 6000 ram for like 400 bucks for the whole combo and paired it with a 7900xtx, and I tell you...this combo is butter smooth. I also have a 13700k/4080 and I must say that the 7900xtx system feels smoother than my 4080 system. Both are on a 34-inch Alienware oled monitor at 1440p and getting beautiful results.

    • @VYM
      @VYM 7 months ago

      Worst combo ever. Extremely power hungry CPU and same GPU together is a big fail. You should've bought 7800x3d and RTX4070 with 550 watts PSU.

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 7 months ago

      @@VYM Most people don't care about that. I'd personally rather have AMD CPUs in my system with a 4090 so I don't risk saturating my 1000w PSU, but if I did go Intel, I wouldn't care that much about the extra 100w. Most people just wanna run games and get good performance with manageable temps. Despite my 5900X only drawing a max of 145w under full load, it's still ridiculously hot anyway

    • @VYM
      @VYM 7 months ago

      @@mttrashcan-bg1ro I cut down my 14700k turbo boost power from 253/135 to 110/110 and the all-core boost ratios from 55/56 to 42. Lost 14% performance, but temperatures are now not over 65-70 degrees, with Cinebench dropping from 30500 to 26000.

  • @supraken123
    @supraken123 Před 8 měsíci

    Go on lads.

  • @fuckyoutubrforchangingthis
    @fuckyoutubrforchangingthis Před 7 měsíci

    Is my 7800X3D + 7900 XTX Speedster Black + 990 Pro SSD + Gigabyte B650 Aorus Elite AX + G.Skill Flare 5 DDR5 6000MHz RAM a viable setup? I'm in my return period... should I get a 4080? Or switch my CPU or something? It games very well at the moment, buttt....... yeah, plz respond

  • @benknapp3787
    @benknapp3787 Před 5 měsíci +1

    I have this CPU coming later this week along with a 7900XTX for my very first build. Should be fun!

    • @keynoteFlyer
      @keynoteFlyer Před měsícem +1

      I’ve just ordered the same combo , how’s it been?

    • @benknapp3787
      @benknapp3787 Před měsícem

      @@keynoteFlyer It's been absolutely amazing. I run a 65" LG C2 and two 24" Dell side monitors. I run every game I play at 4K Max settings and it crushes them.

    • @keynoteFlyer
      @keynoteFlyer Před měsícem +1

      @@benknapp3787 thanks for the quick reply Ben. That sounds amazing. My setup is mainly for flight sim and a few Rockstar games. I’m looking forward to this massive upgrade from my old i7 with 6600XT. 😀👍🏻

    • @benknapp3787
      @benknapp3787 Před měsícem

      ​@keynoteFlyer You won't be disappointed. What version of the 7900XTX did you order? I'm running the Merc 310. It hits about 450w with the power limit raised but it's worth the extra performance in demanding games.

    • @keynoteFlyer
      @keynoteFlyer Před měsícem +1

      @@benknapp3787 I’ve ordered the ASRock 7900 XTX Taichi edition. Both of my previous cards have been Sapphire Nitro + versions but I can’t find any of those in the 7900XTX. The Taichi sounds like a decent equivalent. What Power Supply did you go for? The AMD website recommended 1000W but the builder wanted to give me 850W (based on the lower power draw of the 7800X3D I think)

  • @Fefeallday.
    @Fefeallday. Před 8 měsíci +2

    I have a question. I also have the 7800X3D, but my concern is the temperatures... what should the temperatures be like when playing? I have a 240mm NZXT Kraken and I find them high when playing, at 70c to 75c. Help!

    • @f-35x-ii
      @f-35x-ii Před 8 měsíci

      What will help is the undervolt, I did -30 on all cores on my 5800x3d, and wow it keeps temps down and I even see better performance

    • @mugabugaYT
      @mugabugaYT Před 8 měsíci +7

      70-75°C is perfectly fine. As long as you're below 95° you're good.

    • @kevinfromsales6842
      @kevinfromsales6842 Před 8 měsíci +1

      Mine gets kinda toasty too. Shoots right up to 82 when stress testing but stays around 65 to 70 when gaming. I don't think 75c will kill a 7800x3d but I'm not sure.

    • @morbid6944
      @morbid6944 Před 8 měsíci +4

      That's quite fine temperatures while gaming. Don't worry!

    • @randomguydoes2901
      @randomguydoes2901 Před 8 měsíci

      70-75 high? haha. curve optimize like everyone else or buy a console and stfu

  • @georgeindestructible
    @georgeindestructible Před 8 měsíci

    11:57 - 12:57 yet that's literally a CPU killer right there, my 5800x was struggling so hard there.

  • @Cernunn0s90
    @Cernunn0s90 Před 8 měsíci +11

    I can't stand stutters, and recent years with shader compilation stuttering in so many games have honestly been driving me nuts. Something has to be done.

    • @stealthhunter6998
      @stealthhunter6998 Před 8 měsíci +7

      It’s not just shader stutter, it’s traversal stutter too, which is even worse. Straight up stutters all the time, not just one time. Like Dead Space remake. 100% the worst part of PC gaming rn.

    • @liftedcj7on44s
      @liftedcj7on44s Před 8 měsíci +1

      It has made PC gaming no fun for myself, the way these developers are coding these games is just stutterfest everywhere.

    • @michaelzomsuv3631
      @michaelzomsuv3631 Před 8 měsíci +1

      ​@@stealthhunter6998 Dead Space original can still be played. I played it recently, it was the smoothest experience with constant 240fps not a single frame drop on a 240hz monitor. Way more fun than any shitty lagfest that are 'modern' games and remakes.

    • @stealthhunter6998
      @stealthhunter6998 Před 8 měsíci +1

      @@michaelzomsuv3631 Ik… the quality of older games is just the best. Was playing Halo CE in OG graphics again and was amazed by the fact they had RTX like effects in the glass. U can see chief in the reflection of the floor in the control room crystal clear. Sure it’s dated but for back then that’s a huge achievement real time lighting effect. Just so much care and effort put into them compared to today’s stuff.

  • @vulcan4d
    @vulcan4d Před 8 měsíci +3

    I hope you guys do memory tuning and compare. It is amazing how close the 7700X is to the 7800X3D when you have tight memory timings. The secret sauce is that the 7700X has better latency up to 16MB of cache; above that the 7800X3D wins, but with tight memory timings it isn't that much of a difference any more.

    • @JsGarage
      @JsGarage Před 8 měsíci +2

      But what’s the point? Pay a little more get an overall faster CPU for gaming and don’t have to mess with memory tuning or risking corruption when you don’t get your timings just right. Guess if you love tweaking get a 7700X and feel special that you got there differently. For most people it’s a waste of time for what you’d save tweaking memory on 7700X instead of 3D route.
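The cache-vs-timings tradeoff discussed in this thread can be sketched as an average-memory-access-time calculation. All latencies and hit rates below are made up for illustration; they are not measured figures for either CPU.

```python
# Average memory access time (AMAT) sketch with invented numbers:
# a bigger L3 (V-Cache) raises the hit rate, while tighter DRAM timings
# shrink the penalty paid on a miss. Both attack the same average.

def amat(l3_hit_rate: float, l3_ns: float, dram_ns: float) -> float:
    """Average latency per access: hits served by L3, misses go to DRAM."""
    return l3_hit_rate * l3_ns + (1.0 - l3_hit_rate) * dram_ns

# Hypothetical: extra V-Cache catches more of the working set...
x3d   = amat(l3_hit_rate=0.95, l3_ns=10.0, dram_ns=75.0)  # 13.25 ns
# ...while tuned DRAM lowers the miss cost on a smaller cache...
tuned = amat(l3_hit_rate=0.85, l3_ns=10.0, dram_ns=60.0)  # 17.5 ns
# ...and stock timings on the smaller cache trail both.
stock = amat(l3_hit_rate=0.85, l3_ns=10.0, dram_ns=75.0)  # 19.75 ns
print(x3d, tuned, stock)
```

With these invented numbers the tuned non-X3D part closes part of the gap to the X3D part, which matches the shape of the claim above, even if the exact margins depend entirely on the workload's working-set size.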

  • @FooMaster15
    @FooMaster15 Před 8 měsíci +2

    Some problems you just can't brute force with sheer CPU power... I hope UE5 gets more CPU optimized soon.

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro Před 7 měsíci

      UE5 doesn't seem to have CPU problems as much, it scales pretty well with core/thread count and speed/IPC, the UE5 games are just really GPU heavy. A lot of these newer games with CPU problems such as Jedi Survivor are still using UE4, the ones on UE5 that are CPU dependent probably just want faster chips

  • @Lionheart1188
    @Lionheart1188 Před 8 měsíci

    Nice, just bought myself one from Black Friday sales. Already notice the mins & max fps increase.

  • @spiritualantiseptic
    @spiritualantiseptic Před 8 měsíci +43

    I jumped from R5 7600 to 7800X3D solely because DF made me conscious of stutters.

    • @BigGreyDonut
      @BigGreyDonut Před 8 měsíci +5

      I made that exact same jump bc I had a “It’s time” moment once it went on sale again

    • @spiritualantiseptic
      @spiritualantiseptic Před 8 měsíci +10

      @@BigGreyDonut To be fair though, the performance jump isn't actually worth the price because the R5 7600 is great value, but man, DF is helping hardware producers 😂

    • @BigGreyDonut
      @BigGreyDonut Před 8 měsíci

      @@spiritualantiseptic hey now…don’t actually use reason🥲

    • @DeepStone-6
      @DeepStone-6 Před 8 měsíci +10

      I got the 5800X3D because it ran Star Citizen the best out of everything at the time, but then I realized every other game I play ran much smoother too, so now if it doesn't have 3D at the end I don't want it.

    • @nightspiderx1916
      @nightspiderx1916 Před 8 měsíci

      @@DeepStone-6 I had the 5800X3D and also the 7800X3D, but I might stay with Zen 5 without V-Cache because the Zen architecture likes tuned DRAM.
      In Star Citizen a tuned Zen 4 is almost faster than a Zen 4 with V-Cache, because some games want the +600 MHz, and for some games the additional 64MB of L3 is not enough.
      There are testers out there on YouTube who compared the latest Intel and AMD CPUs with tuned DRAM, and with tuned DRAM Intel is often ahead, like in Star Citizen.
      So I'm thinking about buying the "vanilla" Zen 5 and tuning it as well as I can, with DRAM at 6400 and beyond (subtimings).
      czcams.com/video/HzPrGv22R7g/video.html

  • @kylecorona8176
    @kylecorona8176 Před 3 měsíci

    How is a 4090 getting 80 frames in Jedi Survivor at 1080p with DLSS? That doesn't make sense at all.

  • @Raptor999_
    @Raptor999_ Před 8 měsíci

    I quit playing Apex Legends because, even after optimizing everything, it just stopped working without constant lag and stutters. My old 1080 Ti is struggling out here.

  • @dex4sure361
    @dex4sure361 Před 7 měsíci

    6:08 I also use this scene to test CPU single thread performance in games. Crysis engine is just very single thread heavy and that scene just always maxes out single thread and creates huge GPU bottleneck.

  • @lgolem09l
    @lgolem09l Před 8 měsíci +1

    Well, the reason people might not notice shader stutter may be that they start looking for it after most of the shaders have been compiled. It's not gonna appear more than once in a game until you reinstall the GPU driver.

  • @jadedrivers6794
    @jadedrivers6794 Před 8 měsíci +2

    Being able to run a 5800X3D on a motherboard from 2017 is amazing! Getting a 35% increase in fps going from a R5 3600 to a 5800X3D on 1440p (source: HUB) is mind boggling

    • @markotto1767
      @markotto1767 Před 8 měsíci

      With the same gpu ? What gpu do you have ?

    • @jadedrivers6794
      @jadedrivers6794 Před 8 měsíci +1

      @@markotto1767 I have a 7900 XT. HUB tested with a 6950 XT. There is around a 15% performance gap between the two cards but the results should be very similar.

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk Před 8 měsíci

    I watched some benchmark videos from a YouTube channel called "Terra Nigma", and the difference in fps at 4K max settings using a 4090 between a Ryzen 5900 and a 7800X3D is within the margin of error. Most AAA games today are still GPU bound at 4K, even with a 4090. I was surprised.

    • @NulJern
      @NulJern Před 8 měsíci

      it all depends on what resolution you play at. if you lower your resolution you will become CPU bound at some point and not many play at 4k

    • @lgolem09l
      @lgolem09l Před 8 měsíci +1

      4k is a ridiculous resolution that was never viable for actually top of the line games without dlss.

    • @NulJern
      @NulJern Před 8 měsíci

      @@lgolem09l do you play at 4k?

    • @lgolem09l
      @lgolem09l Před 8 měsíci +1

      @@NulJern yes, previously with a 3080 and then changed to a 4090. I mean you can see in the benchmarks that modern games are often even cpu bound if you want to reach 120fps.

    • @NulJern
      @NulJern Před 8 měsíci +1

      @@lgolem09l well what kind of cpu did your pair with your gpu if you are cpu bound at 4k? You should be gpu bound at 4k, not cpu bound. I get 144+ in most games in 4k, and i'm always gpu bound, even tho i only have 8 core cpu with stock cooler.

  • @o0Silverwolf0o
    @o0Silverwolf0o Před 8 měsíci +1

    I have that board for my 7800X3D/4090 build; it's been great. I did a lot of research before choosing a board and heard that the PCIe architecture on this board is a bit different than most, allowing better performance, and checked out the diagrams that showed why.

  • @Kapono5150
    @Kapono5150 Před 7 měsíci +2

    Just picked up the 7800X3D. Shout out to Intel's “Core Truths”

  • @yousuff1
    @yousuff1 Před 8 měsíci +1

    With curve optimizer you can make the 5800X3D/7800X3D incredibly efficient.
    -30 all cores for me, rarely breaks 50c in games.

    • @exoticspeedefy7916
      @exoticspeedefy7916 Před 8 měsíci +1

      Better to test per core. You can lose performance if you have too much negative offset.

    • @yousuff1
      @yousuff1 Před 8 měsíci

      No loss in performance or stability; I ran the necessary tests for longer than suggested. The 5800X3D has been binned quite well, loads of people manage -30 all-core. @@exoticspeedefy7916

  • @roymarr6181
    @roymarr6181 Před 7 měsíci

    They should be comparing the Intel i9 13900K, seeing as it came out close to the same time as the AMD Ryzen 7 7800X3D, and I bet you the test would be closer. And lastly, to benchmark CPUs you don't bench at 4K, you bench at lower resolutions like 1080p etc.

  • @J_Brian
    @J_Brian Před 7 měsíci

    What helped me is upgrading my RGBs to OLEDs

  • @nickvirgili2969
    @nickvirgili2969 Před 7 měsíci

    It's awesome. I just gotta upgrade, kinda, from my 6950 XT to the 7000 gen, though I'm hoping we get some love from the company because it's still a beast at 1440p and 4K, so..... Wish Nvidia weren't greedy m fers.

  • @kybercrow
    @kybercrow Před 6 měsíci

    Yeaaaaah, the point they made in the first 5 minutes was very apt. People are very quick to blame others' hardware, and for some reason hardly ever want to attribute major issues to game optimization. Yes, a better CPU will give better performance, as this video shows, but people are very hyperbolic on this point; big issues are likely to be the game.
    I have the 7700X. Avatar runs like butter on ultra settings with the occasional hitch here & there, but I get frequent stutter in CP 2077 after patch 2.1.

  • @lotrcod4life
    @lotrcod4life Před 7 měsíci

    I'm planning on upgrading from my 12700k with 32 gb ddr4 3600 to a 7800x3d ddr5 6000 cl30. Not sure what to expect but this makes me excited for the boost that I should get. Anyone have any idea how big a boost I could see?

    • @mitchmurray2260
      @mitchmurray2260 Před 7 měsíci

      Boost for games? If you're at 1080p, you might see a 10 to 20% jump in some games. But if you also do production work and not just games, a 14700K would be a better upgrade, as the 7800X3D is really just a CPU that runs the fps up at 1080p but performs awfully compared to the newer Intels in production work.

    • @lotrcod4life
      @lotrcod4life Před 7 měsíci

      Yea, just gaming at 3440x1440 @@mitchmurray2260

    • @daweitao2668
      @daweitao2668 Před 7 měsíci

      No need to upgrade your current CPU tbh, it's still strong.

  • @Hi-levels
    @Hi-levels Před 8 měsíci

    I have a mini-ITX 3090 system with a 5600X, and a PS5. I will upgrade to an 8800X3D maybe 12 months from now. I believe even from the 5600X it is not a good jump to the 7000 series. The 3090 isn't that strong anyways.

  • @CruitNOR
    @CruitNOR Před 8 měsíci

    8:26 Mate the difference is literally night and day!

  • @johnhughes9766
    @johnhughes9766 Před 8 měsíci

    The limiting thing you speak of is RAM latency.

  • @user-cf4us2lz5s
    @user-cf4us2lz5s Před 8 měsíci

    It is really a generational difference and you've shown that.

  • @ivantsipr
    @ivantsipr Před 8 měsíci

    Oh no, not the Aorus X670E Master 😮
    😅ok I reach the point where you said you updated the bios, ok 👍

  • @ymi_yugy3133
    @ymi_yugy3133 Před 8 měsíci

    It kind of depends what's causing the stutter, but it's not impossible that even a slightly lower-class CPU has massive stutters where a faster one doesn't.
    Imagine something like memory management. As you play you keep accumulating garbage that needs to be freed. If the CPU is idle, it runs the garbage collection; if it isn't, it delays the task.
    A fast CPU might have enough capacity to run the garbage collection before having to render the next frame, but a slower one delays it until it goes over some memory limit and then suddenly has a much more expensive garbage collection pass. This is of course a contrived example, and no sane programmer would write it like this. But when you have many, many tasks, weird things happen.
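The contrived example above can be sketched in a few lines (purely illustrative, not how any real engine schedules its cleanup): cleanup deferred past a memory limit gets paid all at once, producing one long frame.

```python
# Illustrative only: a frame loop that defers cleanup until a memory limit
# is exceeded, then pays for all of it in a single frame (a stutter spike).

def simulate(frames: int, garbage_per_frame: int, limit: int,
             cost_per_item: float = 0.01) -> list[float]:
    """Per-frame GC cost (ms); collection runs only once 'limit' is exceeded."""
    times, pending = [], 0
    for _ in range(frames):
        pending += garbage_per_frame
        if pending > limit:            # deferred collection: one big pause
            times.append(pending * cost_per_item)
            pending = 0
        else:                          # nothing collected this frame
            times.append(0.0)
    return times

spikes = simulate(frames=10, garbage_per_frame=30, limit=100)
print(spikes)  # mostly 0.0, with a ~1.2 ms spike every 4th frame
```

A CPU fast enough to collect a little every frame would instead spread that cost evenly and never show the spike, which is the point of the comment.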

  • @westyk52sparky
    @westyk52sparky Před 7 měsíci

    I just did this jump today, but I had a 3700X. I needed to wait for the 8000 series, but it's Xmas.

  • @Fletcher959
    @Fletcher959 Před 8 měsíci

    It's a setting in the BIOS.

  • @Nintenboy01
    @Nintenboy01 Před 8 měsíci +2

    Based on leaks/rumors Zen 5 may not be a massive leap in terms of IPC - Zen 2 to 3 was apparently a bigger jump. If you're already on Zen 4 it seems more prudent to wait for Zen 6 or even 7 or see how Intel's 15th or 16th Gen do

    • @_N4VE_
      @_N4VE_ Před 8 měsíci

      I have a 79xtx and been debating on upgrading to the 7800x3d from my 5800x3d. After many videos it seems it’ll get me roughly 15-20fps gain in general, some games don’t show much improvement at all. After pricing everything I need, i’ll be spending $725.00. Is that even worth it? Or should i just stick to my 58x3d until next gen?

    • @Nintenboy01
      @Nintenboy01 Před 8 měsíci +1

      @@_N4VE_ probably best to skip at least 1 Gen. Maybe wait for the 8800X3D

    • @_N4VE_
      @_N4VE_ Před 8 měsíci +1

      @@Nintenboy01 Thanks, makes sense. I'd rather spend that kind of money and get a good jump in performance rather than a minor one. Guess I'll just have to live with it for now; the 58X3D is still a great chip and is doing well with the 79XTX.

    • @Nintenboy01
      @Nintenboy01 Před 8 měsíci +1

      @@_N4VE_ yeah, in many games it's also almost as good or even slightly better than the 12900K

    • @_N4VE_
      @_N4VE_ Před 8 měsíci +1

      @@Nintenboy01 Glad I looked into it more and got second opinions. I almost impulse bought an upgrade that I don't really need lol. The extra frames would be nice but I don't see it justifying dropping $700+ on it.

  • @8aliens
    @8aliens Před 8 měsíci

    I'm running an almost 10-year-old CPU (4790K)... could I have your 2-gen-old (the horror!) CPU plz.
    The "it's a compute shader, it's just maths, so the same" view is a bit limited, for a couple of reasons.
    Different GPUs are more performant in different stages of the render pipeline depending on their architecture. Take tessellation for example, which in all uses the Hull-Shader Stage, Tessellator Stage & Domain-Shader Stage. Some GPUs are just better at tessellation than others, especially in the early days of DX11. It's all down to how the chip's architecture handles the staging and instructions it's given.
    The second point: you can do some very funky optimization in HLSL, especially with its multithreading. (You can do 3D arrays of threads, which can even share data, which FSR does use.) Thus, if you know exactly which one GPU your application is going to run on, you can do some optimization matching the threads to the GPU architecture. But that's not really possible in the PC space; you're mostly relying on how that particular GPU architecture handles a mismatched thread-to-GPU-core count.
    As to whether these factors are an advantage/disadvantage for AMD vs Nvidia vs Intel, I can't say. It would be hard to measure exactly, and results could vary across GPU architecture generations, and even by which GPU (4070 vs 4070 Ti) in said generation is being used.
    I was just pointing it out because not all 1s & 0s / "just maths" are equal.
    (A cache miss can be caused by "just maths", and misses can be a real hit to performance. Which is the very reason AMD X3D CPUs get a performance boost in some situations vs other CPUs.)

  • @puregarbage2329
    @puregarbage2329 Před 8 měsíci +4

    Amd is the 🐐 for cpus, I don’t care what anyone says. 5800x3d going strong almost 2 years later. And only at 100watts 👀

    • @Hi-levels
      @Hi-levels Před 8 měsíci

      Am4 is going strong as well.

    • @dvxAznxvb
      @dvxAznxvb Před 8 měsíci

      Yeah 3d cache is barely bottlenecked at this point in time

  • @Machistmo
    @Machistmo Před 7 měsíci

    @8:38 You'd think you would see higher numbers on the left (night) side.

  • @AdadG
    @AdadG Před 8 měsíci

    "[...] but it's not like, it's not a generational little difference," Alex said. But... what generation are we talking about, from the 10th to the 11th or from the 13th to the 14th generation of Intel? If so, that's like 5 of those generations put together.

  • @ryzendave9303
    @ryzendave9303 Před 8 měsíci

    Do you believe the 5800X3D to be a better choice than the 5900X with a 4070 Ti?

    • @JFinns
      @JFinns Před 7 měsíci

      For gaming yes of course, for mixed productivity/gaming no.

  • @Stevecad_
    @Stevecad_ Před 8 měsíci +1

    7800x3d worth it for 4k over the 7600x?

    • @stevesonger1
      @stevesonger1 Před 8 měsíci +2

      Not so much for 4K, but for the longevity of your system; the extra cores and the 3D V-Cache are going to last much longer.

  • @Marco-pf3te
    @Marco-pf3te Před 8 měsíci

    Hey guys question! Is it worth upgrading from a ryzen 3700x to the 5800x3D? My gpu is 6800xt and I plan on buying an C2 oled 42'' 120hz

    • @scottanderson9414
      @scottanderson9414 Před 8 měsíci

      Yes

    • @Marco-pf3te
      @Marco-pf3te Před 8 měsíci

      @@scottanderson9414 will I not be gpu limited on 4k anyway?

    • @scottanderson9414
      @scottanderson9414 Před 8 měsíci

      @Marco-pf3te If you are GPU bound, you will inherently want to turn on FSR, which will increase demand on the CPU. Higher fps, higher CPU usage. Imo it'd be worth it. Look at 3700X vs 5800X3D at 1440p to see what kind of difference you'd be looking at.

    • @scottanderson9414
      @scottanderson9414 Před 8 měsíci

      @Marco-pf3te Also depends on what your plan is. If it were me, I'd probably wait for a GPU upgrade and then do the whole switch to AM5.

    • @Marco-pf3te
      @Marco-pf3te Před 8 měsíci +1

      @@scottanderson9414 I see what you mean. Thanks for your response!
      The thing is I will probably wait at least a few years before making the AM5 + GPU upgrade.
      It's just that I want to make most of my new OLED now that Ive finally got my hands on the tech. And this is a relatively cheap upgrade when selling my old cpu

  • @FNXDCT
    @FNXDCT Před měsícem

    I am very surprised. I thought my 5800X3D was stuttering with shaders, but I notice even the latest CPUs do! :o
    Games should find ways to preload the shaders beforehand.
    The Last of Us Part 1 preloaded shaders, and I can tell you that with a 5800X3D and a 4060 Ti I've had no stutters at all; it was buttery smooth.
    Unreal Engine 5 games are the worst about stutters for now; they stutter very hard on Nvidia cards, especially when you start the game and for the next 30 minutes of gameplay.
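The preloading idea in this comment boils down to a cache warm-up problem. Here is a toy sketch (plain Python standing in for a real shader/PSO cache; the names and the 80 ms compile cost are invented): compiling on first use puts the cost in the middle of a frame, while precompiling moves the same cost to the loading screen.

```python
# Toy shader cache: first use of a variant pays a compile cost mid-frame
# (a visible stutter); precompiling at load time pays it up front instead.
COMPILE_MS = 80.0  # made-up compile cost per shader variant

class ShaderCache:
    def __init__(self):
        self.compiled = set()

    def use(self, variant: str) -> float:
        """Return the hitch (ms) this use adds to the current frame."""
        if variant in self.compiled:
            return 0.0
        self.compiled.add(variant)     # compile on first use -> stutter
        return COMPILE_MS

    def precompile(self, variants):
        for v in variants:             # done on the loading screen instead
            self.compiled.add(v)

cold = ShaderCache()
print(cold.use("smoke_lit"))           # 80.0 -> hitch during gameplay
print(cold.use("smoke_lit"))           # 0.0  -> smooth afterwards

warm = ShaderCache()
warm.precompile(["smoke_lit", "water_refract"])
print(warm.use("smoke_lit"))           # 0.0  -> no first-use hitch
```

No CPU is fast enough to hide an 80 ms compile inside a 16 ms frame, which is why even the fastest chips stutter when games skip the precompile step.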

  • @kapono2916
    @kapono2916 Před 8 měsíci +1

    Digital Foundry switches from Intel to AMD, wow that’s BIG news

  • @TechButler
    @TechButler Před 7 měsíci

    I still have a 12-year-old CPU in one computer that's playing games like the latest Sims at above 30 fps on a 1660 Super and 16GB of RAM. It can be done.

  • @dschNgz
    @dschNgz Před 6 měsíci

    I had the 3700X and now the 7800X3D paired with my GTX 1080, and when playing Battlefield I have stutters... running on an MSI B650 Tomahawk. Before it was smooth, and now you're buying the best CPU and have stutters? Wtf... I think it's something about maybe Win11 or AGESA/BIOS. It's sad.