Star Wars Jedi Survivor: CPU Bottlenecked on 7800X3D | RTX 4090

  • Published 26 Apr 2023
  • Jedi Survivor is out with day 1 patches and drivers, and its performance still leaves a lot to be desired. The main culprit seems to be the Ray Tracing hit to the CPU, but the CPU limitation can be pretty bad even without RT if you don't have one of the fastest new CPUs. In this video I test how it performs on an extremely high end system with an RTX 4090 and Ryzen 7800X3D.
    Test system specs:
    GPU: RTX 4090 amzn.to/3LC6Olm
    CPU: Ryzen 7800X3D amzn.to/3Hkf7Qi
    Cooler: Corsair H150i Elite amzn.to/3VaYqeZ
    Mobo: ROG Strix X670E-a amzn.to/3F9DjEx
    RAM: 32GB DDR5 6000 CL30: amzn.to/41XRtkM
    SSD: Samsung 980 Pro amzn.to/3BfkKds
    Case: Corsair iCUE 5000T RGB amzn.to/3OIaUsn
    PSU: Thermaltake 1650W Toughpower GF3 amzn.to/3UaC8cc
    Monitor: LG C1 48 inch OLED amzn.to/3nhgEMr
    Keyboard: Logitech G915 TKL (tactile) amzn.to/3U7FzA9
    Mouse: Logitech G305 amzn.to/3gDyfPh
    What equipment do I use to make my videos?
    Camera: Sony a6100 amzn.to/3wmDtR9
    Camera Lens: Sigma 16mm f/1.4 amzn.to/36i0t9t
    Camera Capture Card: Elgato CamLink 4K amzn.to/3AEAPcH
    PC Capture Card: amzn.to/3jwBjxF
    Mic: My actual mic (AT 3035) is out of production but this is a similar mic (AT 2020) amzn.to/3jS6LEB
    Portable Mic attached to camera: Rode Video Micro amzn.to/3yrT0R4
    Audio Interface: Focusrite Scarlett 2i2 3rd Gen: amzn.to/3wjhlad
    Greenscreen: Emart Collapsable amzn.to/3AGjQXx
    Lights: Neewar Dimmable USB LED amzn.to/3yw4frD
    RGB Strip Backlight on desk: amzn.to/2ZceAwC
    Sponsor my channel monthly by clicking the "Join" button:
    / @danielowentech
    Donate directly to the channel via PayPal:
    www.paypal.com/donate?hosted_...
    Disclaimer: I may earn money on qualifying purchases through affiliate links above.
  • Science & Technology

Comments • 1.8K

  • @danielowentech • a year ago +190

    I hope to have time to test some lower end GPUs soon, but this is all I had time for tonight.
    Again, make sure your RT actually applied before comparing performance!
    And I know people will want to debate whether this is "CPU limited" since the CPU shows such low utilization, but the GPU is clearly waiting around for the CPU to finish whatever it is doing. The game is just not coded well to fully utilize the CPU cores. I didn't have time to A/B test different CPUs but this is the same behavior that I have seen in other Unreal Engine 4 games (especially with RT enabled) like Hogwarts Legacy and The Callisto Protocol, where I have done A/B testing of different CPUs and seen it improve performance, so I am confident in my diagnosis.
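    The "CPU limited despite low utilization" diagnosis above can be sketched as a simple heuristic: a poorly threaded game saturates one or two cores while average CPU utilization stays deceptively low, and the GPU sits idle waiting. A minimal illustration in Python (the function, its thresholds, and the sample numbers are hypothetical, not from any real monitoring tool):

    ```python
    def classify_bottleneck(gpu_busy_pct, per_core_cpu_pct, busy_threshold=90.0):
        """Rough heuristic for GPU-bound vs CPU-bound frames.

        A game can be CPU-limited even when *average* CPU utilization is
        low: if the render thread saturates a single core, every other
        core (and the GPU) ends up waiting on it.
        """
        avg_cpu = sum(per_core_cpu_pct) / len(per_core_cpu_pct)
        max_core = max(per_core_cpu_pct)
        if gpu_busy_pct >= busy_threshold:
            return "GPU-bound"
        if max_core >= busy_threshold:
            # One saturated core is enough to stall the frame pipeline,
            # even though avg_cpu looks deceptively low.
            return f"CPU-bound (one core pegged, avg only {avg_cpu:.0f}%)"
        return "CPU-bound (likely sync/stalls: GPU idle, no core saturated)"

    # A 4090 "waiting around": GPU at 60%, all 8 cores near 30%
    print(classify_bottleneck(60.0, [35, 30, 28, 31, 29, 30, 27, 32]))
    ```

    The third branch matches the behavior described in the video: no single resource looks saturated, yet the GPU is starved.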

    • @deluxo2901 • a year ago +12

      Daniel look at you now, I was with you at 30k subs and told you that you would get big. I'm happy for you Daniel, cheers to your success sir. Edit: I love how you keep things simple for us to understand and are like a doctor for PC unoptimization issues.

    • @gavinsanchez7028 • a year ago +3

      Would be interesting to see how this game runs on a 4070(ti) at 1440p

    • @ManBearPing • a year ago

      I don't understand why even use the term "CPU limited" if no theoretical CPU can fix this

    • @AdrianMuslim • a year ago +1

      Gay game from gay devs.

    • @cr7vaibhav • a year ago +2

      test this with a 9600k

  • @TooBokoo • a year ago +1506

    This actually isn't really a bad thing. It just saves me $70 on buying a broken POS game.

    • @GregoryCunningham • a year ago +435

      @@SirCharlzTV So, it's broken. Thanks for the heads up. Will wait until it's $5.

    • @kkrolik2106 • a year ago +25

      Nvidia drivers overhead in full glory

    • @GoldenEagle0007 • a year ago +5

      Genius

    • @fiecylick3981 • a year ago +46

      You pay money for games? That's weird :/

    • @Krenisphia • a year ago +79

      @@fiecylick3981 Yes, some people like to support the makers of games they're interested in instead of stealing or receiving handouts all the time.

  • @Peter_739 • a year ago +556

    Just think of how amazing games like RDR2 or Metro look while running normally on cards like 3060ti, this is indefensible

    • @blacknemesy • a year ago +65

      Nah, just throw FSR and DLSS and 16gb of vram at 1080p because that's how terribly inefficient games are with using the system resources

    • @ysnyldrm73 • a year ago +30

      It is weird to think about that; those games are also not optimized properly. Sure, they still look so good and I love both, but an RTX 3060 Ti is nowhere near a normal card for them; one of them came out in 2018 and the other in 2019. You have the performance of the best card available when they released. You are simply brute-forcing.
      Check Gears 5, Doom Eternal, RE2 Remake, Crysis 3, BF1, ROTTR, the Wolfensteins... for what good optimization on PC can do.

    • @MitternachtAngel • a year ago +28

      You can play RDR2 with a i5 7400 and a 1060 6gb flawlessly

    • @DragonOfTheMortalKombat • a year ago

      @I crediits Exactly my thoughts.

    • @ThunderingRoar • a year ago +15

      How short of a memory do you people have? RDR2 PC port was awful on launch

  • @zalankhan5743 • a year ago +213

    The current state of PC ports reminds me of the X360 and PS3 era of PC ports, which were pretty bad.

    • @dnakatomiuk • a year ago +31

      It's coming back again, but with FSR and DLSS as the devs' answer to a fix

    • @aaz1992 • a year ago +28

      GTA4 flashbacks 😵

    • @GamesTribeNow • a year ago +5

      @@aaz1992 Even today, it runs pretty poorly on my 3080.

    • @alderwield9636 • a year ago +2

      Some dude says he got 50 fps with RT on, in idle position, with 4090 💀

    • @damara2268 • a year ago +2

      @@dnakatomiuk but in this game fsr won't help you at all because the game is bottlenecked by the CPU even with the most powerful gaming CPU in the world ....

  • @Horstelin • a year ago +72

    This age of completely broken games not utilizing the hardware properly is just maddening. First Hogwarts Legacy with its RAM/VRAM problems, now this.

    • @KryssN1 • a year ago +3

      It comes down to the structural difference between consoles and PC: a shared memory pool and a dedicated chip that handles asset decompression.
      Then the Xbox and PS5 have AMD chips in them, while the PC market is dominated by Nvidia cards; there is limited time, so optimization for one platform has to be prioritized.
      As you noticed, the last couple of titles are AMD sponsored and the consoles are all AMD; publishers just choose their target with their limited time.
      Especially when games are getting more and more massive, with unreasonable targets to beat the last game/release.
      It's hard to hit the targets set by the publisher, never mind optimization, while the devs are still probably underpaid.

    • @saricubra2867 • a year ago +25

      @@KryssN1 No, no excuse. It's just bad optimization.
      By the way, I still remember that the PS3 had dedicated RAM and VRAM.

    • @brettlawrence9015 • a year ago +6

      @@saricubra2867 I agree the devs have no excuse, but they won't bother optimising for launch; most games are rushed. PC players need to not buy.

    • @alexanderkhan9097 • a year ago +2

      @@KryssN1 As a PC gamer I would be 100% happy waiting potentially a year or more for ports of console games if it meant they actually got the treatment they deserve. And I suspect I'm not in the minority either, so it boggles me to no end why publishers think shovelling out half-finished crap and charging $70+ (in the US, more in other regions) for it is a sustainable strategy. With all the money spent on marketing, buying positive reviews for metacritic, and the cost of the dev team who have to run around like chickens trying to finish titles up to 6 months or a year after launch, they could... I don't know... use that money on finishing their products before trying to sell them to consumers. But what do I know, I don't work for a multi-billion dollar publishing giant like EA :/

    • @RRRRRRRRR33 • a year ago +1

      And the $70 price tag... I think the consumer base just have to stand their ground and not buy these awful products anymore. Even if you are addicted and just want to keep playing videogames, hell, there's a huge backlog anyway, you can emulate other consoles, etc.. there is plenty of options, no reason to stay "upgraded", "up to date" or whatever if the industry is offering really bad products. But people are still buying this crap, the Harry Potter game sold a lot of copies just with the power of the brand alone, extremely generic and repetitive, technical issues, etc.. doesn't matter, the sheep embrace it

  • @finalshare1 • a year ago +435

    I think we've hit the point where UE4 just can't keep up with the size of new games any more. How the hell can the fastest available CPU not even hit 100 fps?

    • @SolarianStrike • a year ago +55

      Especially when the RT plugin is used.
      It basically has to render an extra game world in low res and then merge the RT lighting into the final image.
      So this game, like Hogwarts Legacy, basically doubles its resource usage when RT is on.

    • @FriendlyPCGamers • a year ago +78

      Cause developers have inexperienced interns developing games who don't know anything about optimization.

    • @FriendlyPCGamers • a year ago +25

      2+2+2+2+2+2+2 ...... till infinity will break the optimization.
      But, 2+2 = 4 + 2 = 8 + 2 = 10 will give result then optimized the memory size together.

    • @100500daniel • a year ago +11

      The problem is devs push graphics regardless of system requirements and frame rate. I wish game devs prioritized performance. Call of Duty, for example, looks great but also performs very well, and scales a lot with different settings. Obviously at 4K ultra + RT it's quite demanding, but on the basic preset the game looks decent enough, and most midrange CPUs can handle the 128-player mode while pushing more than 100 frames.

    • @GamesTribeNow • a year ago +38

      @@100500daniel I mean, it's clear that the game is just broken. Nothing to do with pushing graphics settings too far.

  • @shrtrnd6706 • a year ago +184

    Unreal, I literally just upgraded my system to a 7800x3d and a 4080 in order to enjoy this game on my 4k TV. Well, guess I'll wait 6 months until they fix it.

    • @insector2093 • a year ago +24

      It is Unreal afterall, what do you expect at this point

    • @gamesmaster1060 • a year ago +43

      @@insector2093 It's not the engine, it's how the engine is used. There are loads of UE4 games that run well

    • @insector2093 • a year ago +9

      @@gamesmaster1060 You mean Fortnite and Atomic Heart? 🤣 You know that's not a lot

    • @WinterSnowism • a year ago +11

      @Insector, Dead Island 2 is a recent example, and it runs decently. The trick is to not use RT at all. Every UE4 game with RT will turn out to be like crap, such as Hogwarts Legacy and this one. Even Atomic Heart decided to remove RT despite heavily promoting it before.

    • @joseolvera9287 • a year ago +1

      Nah man, I bet there will be a fix for it in the meantime. Developers have been putting out unoptimized games for a while now

  • @whatashame4995 • a year ago +9

    Bruh wtf is this cpu demand. Is the game trying to calculate the physics of Cal's balls

    • @mkuhnactual • a year ago +1

      could be the drm slowing things down, it's apparently running denuvo

    • @whatashame4995 • a year ago +1

      @@mkuhnactual DRM shouldn't slow down a 7800X3D

  • @KubeSquared • a year ago +325

    The game might be bouncing like crazy between the various CPU cores, since it seems to be on 30%ish on all the cores. The 7800X3D is so much faster than the console CPU this game runs on, it's not even funny. So something is definitely bugged.

    • @arenzricodexd4409 • a year ago +35

      Maybe new denuvo

    • @VitisCZ • a year ago +6

      Well, the game runs poorly even on console from what I've seen

    • @StruggleBoxing • a year ago +37

      Like a lot of PC games, it's simply not optimized.

    • @KubeSquared • a year ago +11

      @@arenzricodexd4409 Doesn't seem to be a Denuvo game, but it does have some background EA DRM running, that might be causing this.

    • @rhinestonepeter • a year ago +19

      @@StruggleBoxing Yup, 2020s PC gaming in a nutshell. Every game seems so unoptimized for months after release

  • @edcorps2876 • a year ago +29

    First 3 months of any PC release is beta testing now. Thanks for doing your part. The devs will get to this right away!

    • @loki76 • a year ago

      Not just that, but they never get fixed because they are fundamentally broken. There is no way they can all of a sudden get this game running much better.

    • @ljanitza • a year ago

      *The devs will get to this, "eventually"

    • @Aphelear • a year ago

      They won't.

    • @Megistius • a year ago

      Just get an XBOX/ Playstation and stop paying for getting fucked in the a**

    • @PHOBIAx57x • a year ago

      No they won’t, they never fixed Fallen Order lmao

  • @GewelReal • a year ago +137

    And this game is 70$ lmao

  • @victorn1291 • a year ago +4

    Man Daniel... you've got a full-time job and you're still able to come out with PC game and PC hardware reviews and comparisons of such high quality. I am very impressed with you, sir. As much as I love your vids, please don't lose any sleep or family time for us!! Let me know if you ever need a helping hand ;)

  • @bahubali- • a year ago +18

    Optimization has left the chat..😢

    • @KryssN1 • a year ago

      Well consoles run it as good as a 4090.
      I guess PS5 stock value has just risen once more.

    • @SmogSmok • a year ago

      @@KryssN1 PS5 in performance mode upscales from 960P lol

  • @thecatdaddy1981 • a year ago +107

    Let's hope it won't become the norm for developers to rely on frame generation to "fix" such CPU bound scenarios.
    Oh wait...

    • @itsaUSBline • a year ago +3

      That wouldn't even help.

    • @sade1212 • a year ago +7

      If only the game had frame gen, but it's AMD sponsored so FSR is all you get

    • @stefankoopmans2200 • a year ago +2

      @@itsaUSBline Well that will definitely help, but this game does not support it :P But it indeed does not really fix the fundamental problem of it being a non-optimized port.

    • @HeLithium • a year ago +2

      @@OnlyBlix If frame gen could push fps beyond 100+, the corresponding latency wouldn't be a problem for this type of game

    • @teapouter6109 • a year ago

      @@HeLithium You would be getting ~55 FPS latency (native FPS latency) regardless of what your FPS counter shows with DLSS 3 turned on.
      This game would feel a lot better with 120 or 240 FPS latency because of its high skill ceiling.
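      The latency point above can be put in numbers: frame generation inserts interpolated frames, so the displayed FPS roughly doubles while input latency still tracks the native frame time (plus a little interpolation overhead). A back-of-envelope sketch in Python (the function and numbers are illustrative, not measurements):

      ```python
      def frame_gen_estimate(native_fps, overhead_ms=0.0):
          """Displayed FPS roughly doubles with frame generation, but
          input latency still tracks the *native* frame time."""
          native_frame_ms = 1000.0 / native_fps
          displayed_fps = native_fps * 2
          latency_ms = native_frame_ms + overhead_ms
          return displayed_fps, latency_ms

      # ~55 native fps: the counter shows ~110, but it still *feels* like 55
      fps, lat = frame_gen_estimate(55)
      print(f"{fps} fps displayed, ~{lat:.1f} ms latency")  # 110 fps, ~18.2 ms
      ```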

  • @TheDanielGuy23 • a year ago +90

    This is ridiculous. I just splurged hard to build my first desktop a few months ago with a 7700X and a 4080 and it's already not even good enough for 60 fps native at 1440p...this lack of optimization is ridiculous. How do they expect anyone to buy a game when most potential customers can't even run it?

    • @reiudfgq3vrh34ur • a year ago +6

      It's mainly the CPU bottleneck, I think; that's fixable quickly. They should've delayed another month

    • @mkuhnactual • a year ago +8

      @@reiudfgq3vrh34ur Unless it's being caused by Denuvo, in which case the performance issues will be around till they patch out denuvo in about a year.

    • @greebj • a year ago +15

      @@mkuhnactual and you can grab the game in like a 50%-70% off steam sale, and this level of GPU powah costs less, and the bugs are patched out. It makes no sense at all for a single player experience to buy full price on release.

    • @shreyasdalal • a year ago +3

      Play on 200$ XBox series S, much more fun with no stutters

    • @albanier8426 • a year ago +1

      ​@@mkuhnactual Dead Island 2 also has Denuvo and that game had absolutely no Performance issues

  • @EGH666 • a year ago

    Thanks for that video, this is EXACTLY what I was looking for yesterday and couldn't find

  • @JH-uh3cc • a year ago +13

    God dammit. There goes another....

  • @JohnThunder • a year ago +18

    So if you want to get a good playable experience you need to have even better than a 4090 and 7800x3D. Saved me 70€. Thanks Daniel.

    • @Drunkgamer904 • a year ago

      Nah, the game runs fine for me with a 4070 Ti and 5900X. I can't ever take these PC YouTube channels seriously because my experience is always way different (way better) than what they show.

    • @Drunkgamer904 • a year ago

      @@Soapy_Papoose I listed what I have and I play it at 1440p at max settings with an average of 60fps….so yeah kid, it’s fine for me. Sorry you can’t afford it lol

    • @JohnThunder • a year ago

      @@Soapy_Papoose That's a good 'Cinematic' Experience

    • @oxaile4021 • a year ago +5

      @@Drunkgamer904 I definitely wouldn't be fine with just 60fps at that resolution with those specs. I really don't know why you're defending a shitty port so hard.

    • @raul1642 • a year ago

      @@Drunkgamer904 dum dum nvidiot

  • @BearSchroomMoon • a year ago +1

    Hey Daniel I was having the same issue with the first world. When I got to the next world I was hitting 100+ fps, even during combat. This is on my 6800 xt 12700k pc. I am running the game on 1440p, nearly maxed settings (shadows medium settings).
    Love your videos, keep up the good work.

  • @cloudtro • a year ago +2

    Dan thank you for the tests. Keep it up. Are you going to make a video of the game running on different gpus and resolutions?

  • @jefmes • a year ago +25

    Ryzen 5900X, RTX 4080, 32GB DDR4 3600 RAM - I just refunded the game. I play at 3440x1440 ultrawide, and seemingly regardless of the video settings I chose, I could barely get the intro areas to hold between 40-50 FPS with RT off and Medium to Low settings. I think my hardware is still pretty high-end compared to what MANY gamers will be using, and it's shocking to me that Lucasfilm Games released this AAA Star Wars game in this state. There's clearly something horribly wrong with it right now. FWIW, Hogwarts Legacy and most other UE4 games, and even Cyberpunk 2077's Overdrive RT mode, all play perfectly well on my system. As much as I appreciate AMD's release philosophy with FSR2, I'm also pretty amazed this didn't ship with DLSS2 or even DLSS3.

    • @jankratochvil9779 • a year ago +1

      XeSS 1.1 made an insane jump, so now we have 2 or 3 upscaling options plus native; that's pretty good.

    • @jankratochvil9779 • a year ago

      It just needs more support

    • @zaidlacksalastname4905 • a year ago +10

      Many? That's the second most powerful GPU out there. Easily above 99% of players lol.

    • @NostalgicMem0ries • a year ago +1

      Skip this game till it's fixed in a few years and also cracked. Denuvo versions always have 10-15% lower fps. Enjoy other great games, not broken ones

    • @iequalsnoob • a year ago

      your cpu is not fast enough. buy a 13900k and you can get better numbers than the guy who made this video.

  • @jaxzor • a year ago +5

    Signs of another unfinished release...
    Be smart, boys: buy your games 2 years and 24 patches after release

    • @cunt5413 • a year ago +2

      That's exactly what I do.

    • @RedrickURL • a year ago +1

      I’m on that train too, wootwoot.

  • @Falita • a year ago

    What if the RT cores can't handle all the load, so the rest of the GPU and the CPU have to wait for them to keep pushing fps?

  • @moorhen6156 • a year ago +1

    this may sound odd but have you changed your field of view settings? had a few new releases that would tank performance if you adjusted FOV from default

  • @tapsofosiris3110 • a year ago +87

    All we are reminded of in these cases is that devs/publishers nowadays are really prioritizing Consoles and leaving PC performance in the lurch. I was enticed to buy hogwarts legacy on ps5 just because it didn't stutter as much as it did on my 5800x3d/3080 10gb system.

    • @darbstre2900 • a year ago +30

      That's not the game's fault in your case. You have less VRAM than a console; your 3080 is bottlenecked. It's not stuttering on your PS5 because the PS5 is quite literally better for this than a 3080 10GB. Games have always been made for consoles, but Nvidia decided to screw everyone with the amount of VRAM they gave the 3000 series and low-end 4000 series.

    • @imo098765 • a year ago +1

      Hogwarts Legacy was passable on a good system; you still got some stutters, but with VRR you barely notice it. Just don't turn on RT.
      Jedi looks like a pile of garbage

    • @MrMeanh • a year ago +12

      @@darbstre2900 It stuttered for me with a 5900x+3080 and a 5800X3D+4090 system. In Hogwarts, while VRAM can cause performance issues if you use higher settings than your GPU can handle, many, if not most, stutters come from CPU/system memory bottlenecks in that game.

    • @pf333 • a year ago +18

      @@darbstre2900 The consoles have to share the available memory between VRAM and RAM, so no, a system with a 3080 10GB plus 16 GB of system RAM has more memory available.

    • @vigilantelul4713 • a year ago +1

      My game is choppy/hitching and blurry. Can't get above 32 fps standing still. It does not matter what the settings are set to. (Low-Epic) I am running RTX 2070 Super, i7-9700F CPU @ 3.00GHz (4.7 GHz TURBO), 32GB RAM, 144Hz monitor. Hogwarts Legacy ran fine for me after the usual "fixes". Hopefully they can sort this out?

  • @zirize • a year ago +17

    It reminds me of the UE Matrix demo and how CPU bottleneck was a significant issue. It appears that using certain features of the engine itself can create a tremendous CPU bottleneck.

    • @Wobbothe3rd • a year ago +9

      UE5 is badly threaded; it literally uses less than HALF the potential of a 16-core system, and EVEN UE5 is VASTLY more efficient than this garbage. UE5 is particularly efficient in VRAM usage, easily running well at 4K with 8GB.

    • @AdrianMuslim • a year ago

      @@Wobbothe3rd It's not all about cores. Not all about using 16 cores. The CPU itself needs to be faster.

    • @existentialselkath1264 • a year ago +5

      UE5 features like lumen and Nanite have a fairly fixed CPU cost. It's a significant upfront performance cost but once you can run that, you can run almost anything in UE5.
      Also worth remembering that any performance metrics taken from the editor are deceptive as a built game will always run significantly better

    • @nuruddinpeters9491 • a year ago +1

      This isn't the same situation.
      The UE5 Matrix demo was demanding nearly all-core usage. (Optimizing that would lessen CPU load, perhaps by easing up on loaded assets / culling unseen assets.)
      This game on UE4 is under-utilizing cores, just like Callisto, and well, that story hasn't changed much and seemingly won't.
      It's a design flaw from targeting the consoles' weak AMD hybrid APUs... This game, just like Callisto, just can't scale up to utilize proper threads/cores.
      A damn shame.
      Perhaps a few patches would gain maybe 5 or 15 percent threaded usage at best, like Callisto, but because it's UE4, it's probably a wrap. Performance is trash on PC.
      Garbage in, garbage out.

    • @saricubra2867 • a year ago

      @@Wobbothe3rd Man, Unreal Engine sucks so much. Meanwhile Crysis Remastered with CryEngine 4 has those physics, that rendering distance, those graphics, and can run pretty well on a potato.
      I still remember the Unreal series with the Unreal Tournament games, and the Unreal Engine delivering better performance and better graphics with each version.

  • @Ay-xq7mj • a year ago

    Is ray tracing glitched off, or does enabling it back on simply break performance? This is a stupid question, but I have had the opposite happen in Doom Eternal (as in, toggling off then on increased performance)

  • @TeaTimeMr • a year ago +1

    I would love to see an optimization walkthrough from Daniel for these kinds of games: how much performance can you get out of the system without major image quality losses? That's much more interesting to me than just the all-maxed-out settings.

  • @killerkuerbis6845 • a year ago +4

    This is why Rockstar releases their PC ports much later on. Embarrassing

  • @FixCongress • a year ago +39

    I'm using epic settings with a wider viewpoint. And yeah, I kept RT off. I just got done playing this at 1440p for over 2 hours. This game has some crazy stuff going on. It just kept using up more and more VRAM too. I stopped the game and then relaunched it. The VRAM went from 20GB used to 8GB used in the same area, and once again it slowly started climbing. At one point it climbs fast to hit 13GB and then slowly climbs again. Maybe I missed a patch or I need to update my driver. But I noticed that the VRAM just keeps steadily climbing while I play. Could it be a memory leak? I saw this on my RX 7900 XTX btw. I have another system with a 3090 but I'll have to test that out more to see if I notice the same thing. Anyway, I was curious if you've seen that on your 4090 at all. Thanks for the vid.

    • @Sonickid1011 • a year ago +13

      Sounds like a classic memory leak.

    • @arenzricodexd4409 • a year ago +2

      Memory leak

    • @levlevinski596 • a year ago +1

      If you have too-tight areas with resources loading, then you get stutters like in Dead Space and Gotham Knights 🤣
      VRAM is like lung capacity: with low VRAM the card can't breathe normally and gets stutters (asthma). AMD cards use more VRAM because they have more VRAM onboard than Nvidia, and VRAM usage is also a driver thing.

    • @teahousereloaded • a year ago +2

      It's expected. Unless VRAM is maxed out you leave performance on the floor. The best case would be installing the whole game into VRAM. Unless the RAM is paging, it's not a leak.

    • @stefankoopmans2200 • a year ago +3

      Did it cause slowdowns? There is nothing wrong with filling VRAM to the max; it's the same with PC memory. You actually want it to be filled with as much data as possible so you don't have to reload data when that piece of information is needed again. Say you play your game for 2 hours and then reload a savegame from the beginning: it should have all the resources already in place. This can be beneficial, for the same reason we want our PCs to completely max out 64GB of RAM with browser content, so nothing has to reload. That does not mean your Chrome browser uses a LOT of RAM for no apparent reason; when we need more space, unneeded stuff gets flushed in no time. I think it's just by design. In your case it keeps filling because it can; on my 10GB card it won't, because it can't, but that does not mean it's bad behavior. In other words, the game does not need all that VRAM; it's not an indication of actual required usage, it's just there because there is enough room to leave it there.
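    Whether a steadily climbing VRAM counter is a leak or just aggressive caching, as debated in this thread, comes down to whether usage keeps growing in the same scene or plateaus once the working set fits. A toy sketch of that check in Python (the function, thresholds, and sample numbers are hypothetical; real samples would come from a tool like nvidia-smi):

    ```python
    def looks_like_leak(samples_mb, window=3, min_recent_growth_mb=200):
        """Flag a possible VRAM leak: usage is still climbing in the most
        recent samples. A cache also fills VRAM, but plateaus once the
        working set fits; a leak keeps growing in the same scene."""
        if len(samples_mb) < window:
            return False
        recent = samples_mb[-window:]
        return recent[-1] - recent[0] >= min_recent_growth_mb

    # Same area, usage still climbing from 8 GB toward 20 GB -> suspicious
    print(looks_like_leak([8000, 9500, 11000, 13000, 16000, 20000]))   # True
    # Climbs, then plateaus -> probably just caching
    print(looks_like_leak([8000, 9500, 10000, 10050, 10050, 10060, 10060]))  # False
    ```

    The distinction matters because, as the reply above notes, high VRAM usage by itself is not a problem; unbounded growth in an unchanging scene is.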

  • @double0cinco795 • a year ago

    Are there any settings besides ray tracing that impact the CPU utilization? 80fps doesn't hurt my head at epic settings, but it is unfortunate if it stays there at all settings.

  • @amovingtarget • a year ago

    This is a pretty keen observation. It's been a bit since this video was posted, and some patches have made some difference, but a really good observation nonetheless. Cheers.

  • @Sowk1000 • a year ago +6

    There is an issue with turning RT on and off during gameplay.
    If you turn it on and reload the save, RT is still on but with the higher FPS.
    Tried it in the same location you did.
    Find a water puddle with RT reflections in it and turn RT off; they will look very different. Then turn it back on. Your fps will tank, but if you reload the save point the fps will be fine with RT still on.

    • @BlackJesus8463 • a year ago +2

      Imagine looking into puddles to justify your $1600 gpu.

    • @xXDeiviDXx • a year ago

      @@BlackJesus8463 Consoles also do the same thing

  • @Zowek • a year ago +17

    Anyone else notice how good Daniel's videos are nowadays? 👏

  • @CAVALO-QUENTE • a year ago

    your channel is too good, man ♥

  • @soiltek2015 • a year ago

    You guys are up way too early! 3 hours in at 3:33 am and already over 16k views & 700+ comments. Well done!!

  • @stizmac88 • a year ago +3

    Got this game for free with my 7600x, and I still want a refund.

  • @ZeroZingo • a year ago +7

    You are the best at testing games Daniel! I know you are very pressed for time, but please see if it's possible to get a reasonable midrange CPU to test on as well. 5800X or so perhaps. A lot of new games starve the CPU.

  • @RafaelBirkner • a year ago +1

    Is it possible that the non-3Dcache 7700X would be less of a bottleneck because of the higher clock speeds?

    • @xdamnedtobefreex • a year ago

      I have a 7700X and I'm peaking at about 3% CPU utilization at 1440 ultrawide with max settings; I also have a 4080, 32 GB of DDR5 600 CL36, and a really fast Gen 4 NVMe. It struggles to even hit 60 fps consistently and has frequent stutters.

  • @kwakes212 • a year ago +1

    I'm using a 12900K and a 7900 XTX, and it was pretty much the same performance, let's just say 60 fps, and changing settings didn't help. But when I changed the preset from Epic to High, and then manually turned everything back up to Epic except AA and foliage, which I kept at High, now it's in the 80s. So idk if it's a bug when the Epic preset is selected, or maybe my PC is just weird

  • @GemayelDaniel
    @GemayelDaniel Před rokem +2

    you are right about the ray tracing toggle. I was getting 55 fps at 1440p Epic settings with FSR Quality and ray tracing on a 13700K + 2080 Ti. I turned ray tracing off and then back on again, and it dropped to 40 fps.
    But the weirdest part is that the performance from FSR disabled to Quality to Ultra Performance is within a 2-3 fps margin. And I don't think it's because FSR wasn't working at all, because I could see the image degradation between FSR disabled/Quality and FSR Ultra Performance, but the fps was not changing

    • @jaju123456
      @jaju123456 Před rokem +3

      That's just CPU bottleneck, even with your CPU.

  • @Parwitch69
    @Parwitch69 Před rokem +4

    Now I wonder what kind of system the developers used when testing. I think developers should work at max settings in order to know how their code performs and to see the true artistic representation of the game.

    • @Real_MisterSir
      @Real_MisterSir Před rokem +1

      they work with consoles first and foremost, especially the AAA studios. Most likely it was optimized for good utilization on current-gen consoles, then ported to PC with very few changes, and now it can't do shit on a top-end system. Could also just be driver issues. At the end of the day, for companies this size, this type of performance is unacceptable no matter the cause. It's the whole reason they get to call themselves AAA in the first place. I think it's time to strip some of them of a couple of A's.

    • @monotoneone
      @monotoneone Před rokem

      They are console developers. They probably tested the game at 30 FPS cap and a controller.

    • @talhagrgen
      @talhagrgen Před rokem

      @@monotoneone No, they are not console developers. They have been using a Source-forked engine for a long time for their Titanfall series and Apex, and all of those games run quite well on PC; they are probably just not experienced in UE.

  • @TerminatorVAL
    @TerminatorVAL Před rokem

    This might be a dumb question, but could this be a result of the game not being able to make use of that 3D V-Cache? Maybe it could work better with an Intel CPU like a 13700K?

  • @Trainboy1EJR
    @Trainboy1EJR Před rokem

    So, is it possible to disable hyperthreading and see if having only physical cores fixes anything?

  • @uzimafioso
    @uzimafioso Před rokem +7

    Hey, just so you know, GPU utilization percentage doesn't really apply if you don't max out the resolution buffer. You should compare wattages at different resolutions, because the GPU pipeline might not even saturate at a lower resolution. For example, a 3080 Ti draws around 380 W at 4K, but at 1440p it might not even reach 300 W, even on a balls-to-the-wall 13900K. It's the way the GPU pipeline works. That's why AMD always wins at lower resolutions but suffers disproportionately at 4K.

    • @lynackhilou4865
      @lynackhilou4865 Před rokem +1

      True, but the issue isn't necessarily the resolution itself; it's that Nvidia has more CPU overhead when CPU-limited, resulting in worse performance, and that's more likely to happen the lower you go.
      And you can get a 3080 to run at 100% usage and wattage at 1440p quite easily, but yeah, it will probably consume more at 4K, since it's most likely to be fully utilized at that resolution

    • @uzimafioso
      @uzimafioso Před rokem

      @@lynackhilou4865 Recently I had to choose between a Zotac 3080, an EVGA 3080 Ti and a Sapphire Nitro+ 6800 XT, coming from a 1080 Ti FTW3.
      After testing them all on my 1440p monitor with different resolution scalings I settled for a 6800 XT to pair with my 5700X. Particularly because I was saving some cash and getting more VRAM which I need for architectural renders.
      I've noticed that the 30 series really stretches its legs at 4K because at 1440p and 1080p there was barely any difference between the 3080 and the Ti while AMD stays competitive till 1440p and 30 series just keeps gapping higher at 4K.

  • @Mike80528
    @Mike80528 Před rokem +2

    There have been videos on the differences between AMD's and Nvidia's driver architecture. Nvidia favors faster clocks versus more threads dating back to Intel's 4-core architecture. AMD went down a parallel thread path years ago with GPU's when they took that path with their FX processors. I am not aware of either fundamentally changing and AMD's choices would finally be paying off here (and would have sooner if their architecture hadn't flopped and innovation stalled).

  • @abaddon_the_annoyer3084
    @abaddon_the_annoyer3084 Před rokem +1

    Seems like I'm asking this for every new title these days, but how the hell are they calculating their requirement specs then? No recent game seems to be playable anywhere from minimum spec through to well above the recommended requirements.

  • @Labiumminus
    @Labiumminus Před rokem

    This game has no Frame Generation (DLSS 3) for Nvidia?

  • @issumora1
    @issumora1 Před rokem +10

    Considering how bad the CPU bottleneck is, I think a lot of the game's code is still running in Blueprint. When a game ships, all or most of the code should be converted to C++ for the best CPU performance. That's what happened with Gotham Knights: they converted a lot more code into C++ and the game became a lot less CPU-limited, so decent fps is possible on a wide range of hardware.
    It's sad that a game that was delayed still didn't have time to get the code into a state where they could start optimizing it. How are you meant to optimize a game when 80 percent of the time your GPU is running at 30-50 percent usage with RT? Games keep coming out 2-3 months too early. At this point I might as well just ignore games for 2-3 months before looking into them on PC. I wouldn't be surprised if Redfall is in the same state next week. I will most likely have a better time emulating Tears of the Kingdom than playing a native AAA PC port.
    Can a developer just have the balls to tell their publisher that the game is not ready, and actually have the ability to make a last-minute delay of 2-3 months? Please do not buy until these issues are fixed. If most PC users don't buy it, the publisher will say to themselves: maybe if we want a game to sell well on PC, we need to release a complete and optimized game. The reason games keep coming out in this state is because a high number of PC users are still buying them. If you want change, you have to use the only thing the publisher cares about, and that is sales in the first 3-6 months of a game's release.

    • @MTSP-pe2fn
      @MTSP-pe2fn Před rokem

      ^This!

    • @randybobandy27834678
      @randybobandy27834678 Před rokem +2

      This is a misconception. The difference in performance between Blueprints and C++ is only significant for very heavy computational operations, and even then it's not that significant. Not to mention, UE can nativize Blueprints for packaging, so that's not even an issue most of the time. A bad programmer will mess up C++ code just as easily, if not worse. I would bet if we could access the source we'd see all kinds of unoptimized code, including the C++ parts and especially the shaders.

    • @ninjafrozr8809
      @ninjafrozr8809 Před rokem +1

      If you don't buy the game and sales tank on PC, the publishers will only see that as more reason not to waste resources on the PC port, and sloppy ports will only get more common (if that's even possible)...

  • @MrStranger1944
    @MrStranger1944 Před rokem +4

    It reminds me of some of the issues that The Callisto Protocol had with ray tracing which is also a UE4 game.

    • @Trophykage
      @Trophykage Před rokem +1

      Same as Hogwarts. My 3090 was only at around 65% utilization, with a couple of cores really pushed on my 12900K in certain areas like Hogsmeade, but most of the open world ran butter smooth.

  • @berkertaskiran
    @berkertaskiran Před rokem

    How is a CPU bottleneck possible when the max usage on any core is 70%? And what exactly is using it? Could it be some other cause, like a lack of DirectStorage or something?

  • @XPuntar
    @XPuntar Před rokem

    What if there is a bug related to RT running on APU on PS5 while in PC it is directly offloaded onto CPU completely disregarding GPU even if GPU does not support it?

  • @RolandsSh
    @RolandsSh Před rokem +6

    7700 and 7900xt. Played for about half an hour before work with auto settings (everything Epic, RT off, FSR quality at 4k).
    The "respawn entertainment" and the like title screens when starting up the game were total slideshows, and cutscenes had noticeable frameskips every now and then. Gameplay was locked 60 FPS though.
    I was expecting the performance to be worse to be honest.

    • @mihai2174
      @mihai2174 Před rokem +6

      to be honest a 7900 XT is not a low-end GPU by any means... the fact that you had to use FSR to keep 4K 60 fps at all times is not a good sign

    • @JoeL-xk6bo
      @JoeL-xk6bo Před rokem

      it's Nvidia... that's why I was waiting to see AMD performance. I have a 7800X3D with a 7900 XTX: 3440x1440, max settings, no FSR, just over 100 fps average.

    • @JoeL-xk6bo
      @JoeL-xk6bo Před rokem +1

      @@mihai2174 you don't know his frame rate because he has vsync on ("locked at 60 fps"). Clearly he's getting over that; with vsync off, if he weren't, it would dip below 60 at times, as that's how averages work...

    • @mihai2174
      @mihai2174 Před rokem

      @@JoeL-xk6bo for some reason my previous comment was removed
      I was saying that I have a 7900 XTX and most of the time I can't get 4K 60 fps at max settings (no FSR and raytrace); worse, it goes lower than 50 fps when I am in combat, so I am pretty sure he won't be able to get 4K 60 fps with a 7900 XT at native

    • @RolandsSh
      @RolandsSh Před rokem

      That is correct, it was with vsync, which was enabled by default. My GPU utilization generally hovers around 80%, turning vsync off got me to ~75 FPS on average, but I had some tearing, so turned it back on. I tried enabling raytracing, that dropped my FPS to around 40-45, but the frametimes felt super inconsistent. It did look better with raytracing, but it was absolutely not worth it for me. I didn't try turning FSR off, might test it later
      I'm about 6 hours into the game, and my experience is very, very positive so far. I didn't encounter any performance issues or bugs. The game is very much Fallen Order, but bigger and better. I love the visuals and setting (Coruscant looked fantastic!), I love that I got to keep all my abilities from Fallen Order, vibe is truly Star Wars.
      I just wish the game ran well on all hardware, not just the latest&greatest AMD gear.

  • @duckpwnd
    @duckpwnd Před rokem +3

    You made a small error in your video. You said that the 7800X3D is the fastest gaming cpu. That's not true. The i5-2500k is the fastest gaming cpu.

  • @robertorossi4361
    @robertorossi4361 Před rokem +1

    Was there actually any difference in picture quality (lighting/shadows) with RT on?

    • @Sakosaga
      @Sakosaga Před rokem +1

      That's what I was wondering, because even when he turned the settings down I could barely tell the difference. I'm not sure if it's just unoptimized, or an issue with the ini files not actually applying when you change the settings. Some people said a memory leak, but I feel like you'd have more stuttering then

  • @CharlieLiuPhotography

    Any plans to get the latest gen 5 NVME SSD to test with this game? I hear it does quite a bit better.

  • @Azalis1701
    @Azalis1701 Před rokem +7

    It's not any better on console. On PS5 it constantly drops out of the VRR range and is usually sitting in the 50s for FPS (at least on Coruscant at the start) and has terrible screen tearing with VRR on or off. I have resorted to the 30 FPS mode just to make the screen tearing stop. I went PS5 and not PC for this one after Hogwarts was actually really good on PS5, but this is just a disaster.

    • @KryssN1
      @KryssN1 Před rokem

      It comes down to structural differences between consoles and PC: a shared memory pool and a dedicated chip that handles asset decompression.
      Then Xbox and PS5 have AMD chips in them, while the PC market is dominated by Nvidia cards, and with limited time, optimization for one platform has to be prioritized.
      As you noticed, the last couple of titles are AMD-sponsored and consoles are all AMD; publishers just choose their target with their limited time.
      Especially when games are getting more and more massive, with unreasonable targets to beat the last game/release.
      It's hard to hit the targets set by the publisher, let alone optimize, while the devs are still probably underpaid.
      Consoles have the equivalent of a 3060 Ti / RTX 4060 at best, yet the PS5 runs it as well as an RTX 4090 combined with a CPU twice as fast as the console's.
      Clearly this was optimized for AMD architecture, but it's still poo.
      In comparison, other AMD-sponsored titles like TLOU1 had no issues running on AMD GPUs on PC, only on Nvidia.

  • @wyattparnell2412
    @wyattparnell2412 Před rokem +23

    I had the same issues with my 7900 XT and 5800X3D. I updated to the new drivers from today and it fixed it completely

    • @Ladioz
      @Ladioz Před rokem +3

      which drivers? Be more specific please

    • @nothingisawesome
      @nothingisawesome Před rokem +1

      really? whats your performance? i have a 7700X w/ 7900XT and get like 60-70fps. Bad areas are down to like 40fps tho

    • @tasosdesypris5194
      @tasosdesypris5194 Před rokem +1

      @@Ladioz Adrenalin Edition 23.4.3 with support for STAR WARS Jedi: Survivor™.

    • @Ladioz
      @Ladioz Před rokem

      However, games tend to run perfectly on AMD systems. Another blow for Nvidia cards... damn. Next time I'm 100% buying an AMD card

    • @darkfire3691
      @darkfire3691 Před rokem

      @@Ladioz You stupid AMD shill, it's an AMD-sponsored title, what do you expect? Of course it will be better after a few patches and new drivers.

  • @xyz2theb
    @xyz2theb Před rokem

    Im wondering do you think denuvo has something to do with this?

  • @samtime2711
    @samtime2711 Před rokem

    I'm going to put it out there: do you think the issue with the CPU could be that the game needs more than 8 cores? Or have people tested it on higher-core-count CPUs and still run into the bottleneck?

  • @night3094
    @night3094 Před rokem +3

    No DLSS in a 2023 game, went as expected

  • @omarcomming722
    @omarcomming722 Před rokem +24

    As soon as the 4090 dropped, we started getting some of the most unoptimized pieces of shit ever known to man, even when looking strictly at GPU usage.

    • @RicochetForce
      @RicochetForce Před rokem +6

      That doesn't even make sense, that card was released back in September/October 2022 while these games were in development for years before that. Your comment makes no sense. Bad PC ports predate the 40 series.

    • @omarcomming722
      @omarcomming722 Před rokem +3

    @@RicochetForce It's not a conspiracy, they just don't give a fuck about optimizing because there's now a GPU that can run the game to some degree. If ALL the available GPUs sucked, they'd have no choice but to delay the game, but now that a $2000 card can run OK in games where all the other ones run like shit, they're putting these games out because they technically work. We didn't have games using 14-18 gigs of VRAM before the 4090; there are now 4-5 of them from the last couple of months.

    • @existentialselkath1264
      @existentialselkath1264 Před rokem +4

      Bad PC ports have been around forever. This isn't a new thing. The 4090 also lined up with the COVID pandemic, which marked a significant rise in unfinished games. Coincidence? I think not.
      It's not some conspiracy. Game devs won't make any money making games for the 1% of gamers with a top-spec card. It's often the publishers rushing games out to hit arbitrary deadlines.
      The one potential impact of the 4090 is raising the requirements of ultra settings to fully utilize the top-spec cards, but that's not what's happening here, as the whole game is broken no matter the settings

    • @RicochetForce
      @RicochetForce Před rokem

      @@omarcomming722 This is conspiracy theory nonsense. Developers begin work on games years ago, on hardware that is now very much dated. The hardware target AAA or even indie developers use is the console hardware of the time.
      These games were not intentionally created to run poorly, or somehow strongarm people into throwing out their old gear and buying new stuff.
      They're simply bad PC ports, and rushed ones at that. TLOU Part 1, for example, actually has problems that would be fixed by amateur devs.
      It's no conspiracy, these are just rushed products with cut corners.

    • @omarcomming722
      @omarcomming722 Před rokem +2

      @@RicochetForce I literally said it's not a conspiracy but incompetence. What I'm saying is that without ultra powerful GPUs that can provide somewhat acceptable results, devs would have more pressure to delay the game because the game running at 60 fps on the highest end consumer GPU looks better PR wise than it running at 35 fps. The existence of such powerful hardware gives them more leeway to put out unoptimized garbage

  • @s.omarnoor1380
    @s.omarnoor1380 Před rokem

    Is this in a similar vein to Crysis where they made it for future hardware, or is it just bad optimisation?

  • @ChrisKadaver
    @ChrisKadaver Před rokem +1

    How does a fully tweaked i9 13900K system compare then?

  • @zirize
    @zirize Před rokem +7

    I have a theory. Consoles have shared memory between the CPU and GPU, which allows for close collaboration between the two and enables enhanced optimization. However, in current PCs, the GPU's VRAM and system memory are separate. Therefore, if an engine designed with shared memory runs on a PC and the CPU needs to access the VRAM, it can lead to memory bottlenecks. I believe this may be a problem with many current games.

    • @KryssN1
      @KryssN1 Před rokem +8

      This!
      Bad optimization of the memory pool for PC, just copy-pasted from console, like TLOU1.
      Nevertheless, this game also has performance problems on consoles, but nothing akin to a 4090 running at barely 55 fps avg in a static environment with a CPU twice as fast.

    • @brettlawrence9015
      @brettlawrence9015 Před rokem +1

      The problem is the PS5 is the system most games are built for, and everything else is an afterthought. Sad, considering PC players pay the same price.

    • @apiestink
      @apiestink Před rokem +3

      but consoles have always had shared memory, since at least the PS2 and probably before. This is nothing new.

    • @hydzior
      @hydzior Před rokem

      The CPU is not accessing the VRAM. That's not how this works.

    • @zirize
      @zirize Před rokem

      @@apiestink I'm not claiming that my theory is necessarily correct. It could be completely wrong. However, personally, I don't think there were many cases in the past where sharing memory between the CPU and GPU provided significant benefits. Nevertheless, with the rise of computationally expensive features such as path tracing that require huge amounts of memory, techniques that actively utilize CPU-GPU memory sharing have been developed. I believe this phenomenon started not with the new console generation, but rather when path tracing became a new standard for games, which led to the emergence of these memory-related issues.

  • @Bmorris-mf6qs
    @Bmorris-mf6qs Před rokem +8

    I have the same setup. For the first hours of playing I got a solid 95 fps, but after I took a break and came back the game started bottlenecking. The game definitely has problems

    • @Bmorris-mf6qs
      @Bmorris-mf6qs Před rokem +2

      Crap, you're right, the RT bugs out. Wow, performance in this game is just bugged out as hell.

    • @metamon2704
      @metamon2704 Před rokem

      Could be because RT wasn't actually enabled the first time!

  • @marcuszettergren8885
    @marcuszettergren8885 Před rokem

    So not as much VRAM used today compared to yesterday? Did it magically fix itself?

  • @cipherbenchmarks
    @cipherbenchmarks Před rokem +2

    Really well done video - particularly about the cpu utilization. Thanks.

  • @OC.TINYYY
    @OC.TINYYY Před rokem +12

    I think wording it as a CPU bottleneck isn't exactly right, as it's clear the CPU in general is not the problem, but the game itself.
    This unfortunately happens with 95% of AMD-sponsored titles at launch. It gets fixed eventually, but it's a common occurrence when using Nvidia GPUs with AMD-sponsored titles that have RT.
    The sad part is the RT in this game is basic as hell. Not surprising since, again, it's optimized for AMD, which is still two gens behind in RT.

    • @bronsondixon4747
      @bronsondixon4747 Před rokem +5

      It's not the CPU that's the issue, yes; it is the fastest gaming CPU around.
      However, the game was released with a CPU bottleneck because it's an unfinished port. To say it's not a CPU bottleneck would be inaccurate.

    • @OC.TINYYY
      @OC.TINYYY Před rokem +3

      @Bronson Dixon it's not a CPU bottleneck regardless. The cores aren't even being pushed. This isn't a Spider-Man Remastered situation.
      It's a coding bottleneck in the game engine that needs to be patched up, which doesn't equate to a CPU bottleneck, since the cores themselves are barely pushing 30% utilization. It's much broader than simply calling it a "CPU bottleneck".

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem +5

      ​@Kevin P. I get what you're saying and you are correct, but technically poor multithreading is a kind of CPU bottleneck.

    • @danivinci
      @danivinci Před rokem +1

      @@OC.TINYYY If a CPU slower than the 7800X3D gets lower fps, then it's a CPU bottleneck

    • @brettlawrence9015
      @brettlawrence9015 Před rokem

      You have to call it a cpu bottleneck.

  • @davidepannone6021
    @davidepannone6021 Před rokem +3

    It feels like with any PS5 port, the devs pretty much try to make the PC emulate the PS5 instead of actually porting the game to a different platform. That is why the last few PS5 ports have been terrible, especially on the CPU side of things. Devs are basically """porting""" games as if they were being emulated rather than working on them to fully utilize the GPUs PCs have. They're treating PCs like console APUs.

  • @UndyingGhost
    @UndyingGhost Před rokem +1

    What about Nvidia CPU overhead? Can we see an AMD GPU test? The 7900 XTX is supposed to do quite well here.

  • @Shawner666
    @Shawner666 Před rokem

    I'm curious: how come you don't have Intel rigs? Why did you decide to go with the 7800X3D? Is there a big difference between the 7800X3D and the 7700X?

    • @cosmindinaa
      @cosmindinaa Před rokem

      Well, the current Intel platform is at the end of the line, while AM5 just started. Also, AMD offers PCIe 5.0 x16 and PCIe 5.0 NVMe both at the same time, again future-proofing your build. On this system, you just change the CPU in 1-2 years and you're done.

  • @BVNTR_
    @BVNTR_ Před rokem +12

    I love it when the lightsaber goes through the wall about 50 times without any effects of damage.

    •  Před rokem +10

      In games like Jedi Outcast from 2002, running on an already slightly dated id Tech 3 engine, you could "draw" with the lightsaber. But at least now we have reflections and crappy performance.

    • @BVNTR_
      @BVNTR_ Před rokem +1

      @ The special effects are probably behind a pay wall and will be in a dlc.. 🤣🤣

    • @MLWJ1993
      @MLWJ1993 Před rokem +4

      ​@ It's kind of funny when you look at some older games that had more interactive environments than most games today have on offer. 😂

    •  Před rokem +1

      ​@@BVNTR_
      Wouldn't be surprised! Hahaha!
      Yeah, back then, they cared. I know game development is now much more complex and needs more people, better tools, better hardware, etc... But there is something still very wrong if they can't reach the level of detail they did 20 years ago. Sure, graphics are much nicer and sounds are better too. Animations have improved a lot since then, but... that's it.
      Now we get games that don't look bad, but don't look good enough to justify the hardware requirements.
      When Crysis came out, I think that was justified. That game looked glorious in its time, and it's still a really nice-looking game.

  • @nossy232323
    @nossy232323 Před rokem +3

    Seems like with almost all AAA games that come out these days, you need the fastest system possible to run at max settings, and even then there are performance issues.

    • @mkuhnactual
      @mkuhnactual Před rokem

      Studios don't optimize, and a lot of games are running Denuvo now for DRM, which gives a hit to performance. Not sure if that's the problem with this game, but it is running Denuvo.

    • @nossy232323
      @nossy232323 Před rokem +1

      @@mkuhnactual It's like they don't want anyone to be able to play their games.

    • @mkuhnactual
      @mkuhnactual Před rokem

      @@nossy232323 The vast majority of gamers will continue to pre-order and buy games on release, even after a studio releases multiple games in an unfinished or unfun state. If people will hand over money no matter what, there's no incentive for a studio to ensure it releases in an optimal state.

  • @truthseeker6532
    @truthseeker6532 Před rokem +2

    Whatever happened to the "omg DirectX 12 Ultimate is so awesome" outlook?

  • @shadeis9276
    @shadeis9276 Před rokem

    Would turning off Hyper-Threading (Intel) / SMT (AMD) help? I've seen people say some games work better with it turned off because the game can't use the extra threads that well.

  • @drizzlebizzle
    @drizzlebizzle Před rokem +10

    I'm getting basically the exact same performance with a 5900X and 4090 at 4K. I really hate that I've had to brute-force 60 fps for this year's AAA games (Hogwarts, TLOU, SWJS) with a system that I spent a lot of money on and that should do a lot more. I can't imagine having "lesser" cards that are supposed to also do 60 fps at 4K.

    • @lucodeath
      @lucodeath Před rokem

      I have this same system, MSFS in vr runs better than starwars 😂

    • @fvallo
      @fvallo Před rokem +7

      A 5900X + 3090 feels like a 7-year-old system lol. PC gaming is a disaster; piracy is the only ethical thing a PC gamer should do, these companies deserve nothing.

    • @Dionyzos
      @Dionyzos Před rokem

      I was gonna buy a 4090 but this makes me think twice.
      I want to see how UE5 titles run before paying that much money on something that might not even be able to run it well enough.
      This game looks like it could run on PS4 when you compare it to Forbidden West.

    • @fvallo
      @fvallo Před rokem

      @@Dionyzos wait a year and get a 5090 for the same price. Why? That will be the first gen of cards with decent RT performance; the 4090 is just barely there, and frame generation lag is horrible if you render below 60 fps

    • @PneumaticTire
      @PneumaticTire Před rokem +1

      Similar, bro: 5800X3D and a 4080. Playing TLOU right now, and it's frustrating seeing my GPU only running in the 60% range at 1440p. 120 fps with dips into the 70s. Frametimes are so bad it feels like I'm getting 20 fps!

  • @labileslama9848
    @labileslama9848 Před rokem +4

    the state of the game rn makes me so mad. I got a 4070 to play this and can't really get over 40 fps at 1440p without turning the settings way down. How can they release something like this

  • @GR33Nday2
    @GR33Nday2 Před rokem +1

    How difficult was it to just make a sequel to the first game? It's all we wanted :_(

  • @ToneRetroGaming
    @ToneRetroGaming Před rokem

    9:36 Strange to see screen tearing at such low framerates. Are you using a G-Sync monitor, Daniel?

  • @ytv6770
    @ytv6770 Před rokem +8

    Say what you want about RT Overdrive in Cyberpunk 2077, but you get every shader unit's worth of visual majesty from path tracing for what it costs in performance. And it's even efficient enough that you can run it fine with heavy DLSS, even on a 3070

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem +1

      And it doesn't even max out most card's vram either.

    • @KryssN1
      @KryssN1 Před rokem

      @@Wobbothe3rd kudos to CD Projekt Red, they built a custom memory pool algorithm for the game on PC so that it doesn't bottleneck due to the structural differences between consoles and PC.
      Basically, because consoles have a shared memory pool and a dedicated chip that handles compression and decompression, games need optimization in that department for PCs.

    • @Radwanze
      @Radwanze Před rokem +1

      I'm on an RTX 3070 + Ryzen 5800X (non-3D). Yeah, you could play with Overdrive ray tracing, but it doesn't make sense at all on a 3070. I ran some benchmarks, and with RT Overdrive, DLSS Balanced, 1080p, everything medium, you'll get ~50 fps average. But you'll get the same framerate or better with RT Psycho, DLSS Quality, 1440p, everything maxed out, which looks much, much better. So in my opinion RT Overdrive is only enjoyable on a 40-series.

  • @urchinsify
    @urchinsify Před rokem +16

    DON'T BUY THE GAME until they fix this garbage...
    You're literally wasting money on something you can't even play.

    • @dondoken1353
      @dondoken1353 Před rokem +4

      This. Unless people stop preordering and buying bugged crap on release nothing's going to change

    • @GregoryShtevensh
      @GregoryShtevensh Před rokem

      yet you can play it? is 60fps unplayable?

    • @dog_player
      @dog_player Před rokem +3

      @@GregoryShtevensh yes, 60fps is unplayable.

    • @cunt5413
      @cunt5413 Před rokem

      I will buy it only to refund it so they get the message.

    • @cunt5413
      @cunt5413 Před rokem +7

      @@GregoryShtevensh IT CAN'T EVEN HIT CONSISTENT 60 WITH A 4090 YOU FOOL

  • @wanderjahren
    @wanderjahren Před rokem

    I can see that the game isn't fully utilizing the GPU, but it looks like it isn't fully utilizing the CPU, as well. It makes me wonder if those percentages would be higher on a system with a lesser CPU and GPU. Is the code itself the limitation, and can a lesser system handle the game just as well?

  • @Nerex7
    @Nerex7 Před rokem +1

    Good explanation of the CPU bottleneck. Many people assume something has to run at 100% for it to be the bottleneck, but that's not the case

    • @loki76
      @loki76 Před rokem +1

      It's because the reported CPU load is averaged across all the cores, in this case 8 cores / 16 threads, but the game engine only uses 4 cores. So a CPU utilization of 45% can really mean 90% on the cores that matter; the CPU tries to load-balance it, but the game keeps asking for just 4 cores and 8 threads. This likely creates latency.
      In a sense, if one could disable all the unwanted cores/threads and push the clock speed to 7 GHz, I bet it would run much better.
      But it's idiotic that any game engine is still designed around fewer cores and faster clocks when we have known for YEARS that the only way forward is more parallelism and more cores.
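      The arithmetic behind that point can be sketched in a few lines (the 4-busy-threads split is this thread's hypothesis, not a measured property of the game): an overlay's "CPU utilization" number averages load over every hardware thread, so a few saturated threads read as a low overall figure.

```python
# Rough illustration: aggregate CPU utilization is an average over every
# hardware thread, so a game that saturates only a few threads can show a
# deceptively low overall number while still being fully CPU-bound.
# The 4-busy / 16-total split below is hypothetical, for illustration only.

def aggregate_utilization(busy_threads: int, total_threads: int,
                          per_thread_load: float = 1.0) -> float:
    """Average utilization across all hardware threads (0.0 to 1.0)."""
    return busy_threads * per_thread_load / total_threads

# 4 game threads pinned at 100% on an 8-core / 16-thread CPU:
print(f"{aggregate_utilization(4, 16):.0%}")  # prints 25%
# Yet those 4 threads are maxed out, so the frame rate is CPU-limited.
```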

  • @monggolean
    @monggolean Před rokem +4

    It's an Unreal Engine game. What do you expect? Surprised they're not using Frostbite or Respawn's own fork of the Source engine...

    • @gavinderulo12
      @gavinderulo12 Před rokem +7

      UE4 is a great engine. Look how well dead island 2 and atomic heart look and run.

    • @user-ye7lp9lg1c
      @user-ye7lp9lg1c Před rokem +4

      ​@@gavinderulo12 they look outdated

    • @gavinderulo12
      @gavinderulo12 Před rokem +4

      @@user-ye7lp9lg1c they look good. But my main point is that their performance is excellent. So it's not the engine's fault if a game runs poorly.

    • @Wobbothe3rd
      @Wobbothe3rd a year ago +1

      @@user-ye7lp9lg1c You are out of your mind if you think Atomic Heart looks outdated. Some of the best TAA I've ever seen in any engine, ever.

  • @LuisOrtiz-zu3dq
    @LuisOrtiz-zu3dq a year ago +5

    The game is using UE4, so it's probably hitting a wall because of the engine. It's also one of the last games using that engine, I think.

    • @InfamazDre
      @InfamazDre a year ago +1

      Get ready for the Redfall shit show in a couple of days.

    • @f0rsakenPT
      @f0rsakenPT a year ago +3

      A lot of games use UE4 and run great on PC. This game doesn't look better than Cyberpunk, yet it runs worse. The developers are at fault here.

    • @RicochetForce
      @RicochetForce a year ago

      Incorrect.
      The overwhelming majority of UE4 games run dramatically better than this.
      Cyberpunk with RT Ultra (one of the heaviest rendering loads) runs better than this while using less VRAM.
      There is something very wrong with this PC port.

  • @glottis8
    @glottis8 a year ago

    Have you noticed any memory leaks in the game? I've noticed that my performance consistently tanks and deteriorates over time, to the point where stutters are super common and I have to restart to get OK performance again.

  • @AveratisArmada
    @AveratisArmada a year ago +11

    The CPU limitations are likely caused by the Denuvo DRM they're using for the game.

    • @OC.TINYYY
      @OC.TINYYY a year ago +4

      No, unless you want to use that excuse to pirate the game. Denuvo has zero performance hit on higher-end systems.
      It only hurts people with shit hardware, which is why they pirate in the first place: they can't afford it.
      This game also doesn't use Denuvo, just basic EA DRM.

    • @floorgang420
      @floorgang420 a year ago +11

      @@OC.TINYYY Nothing is free. As a developer myself, I know even a single line of code has a performance cost, and this is not a single line.
      BTW, you'd be surprised how many ways there are online to improve even a single line.

    • @jacopo8827
      @jacopo8827 a year ago +3

      @@OC.TINYYY Nope, the game uses Denuvo DRM.
      It's on the Steam page.

    • @jaju123456
      @jaju123456 a year ago +4

      @@OC.TINYYY Digital Foundry measured the CPU impact of Denuvo, and it was around 10% if I remember correctly. So it doesn't fully explain this game's performance, but it contributes slightly. Regardless, we know many games with Denuvo run fine, so you can't just blame it on that.

    • @Wobbothe3rd
      @Wobbothe3rd a year ago +1

      Incorrect. No matter WHAT Denuvo does, it doesn't limit the number of threads a game can use; that's on the developer.

  • @plyzogyz
    @plyzogyz a year ago +4

    Awesome optimization in this game: the GPU is using 40-50% of its power. GJ, developers! The bottleneck should be almost nothing with that CPU at 4K, rofl.

    • @NostalgicMem0ries
      @NostalgicMem0ries a year ago

      It's an AMD-sponsored game, what do you expect? They made it so their cards don't choke on RT and Nvidia can't fully utilize their GPUs. Nasty tactics.

  • @No1x3N
    @No1x3N a year ago

    Did you try pinning the game's process to 4 fixed cores with Task Manager? I've found some UE4 games benefit a lot from manually assigning them certain cores. The way UE4 is coded seems to make some games constantly jump between cores.
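The pinning the comment describes can also be done programmatically. A minimal sketch using the Linux-only `os.sched_setaffinity` (on Windows you would use Task Manager's "Set affinity" dialog, as the comment says, or a third-party tool):

```python
import os

# CPUs this process is currently allowed to run on
allowed = sorted(os.sched_getaffinity(0))

# Pin to at most the first four allowed CPUs, mirroring what
# "Set affinity" in Task Manager does for a game process.
pinned = set(allowed[:4])
os.sched_setaffinity(0, pinned)

print(os.sched_getaffinity(0) == pinned)  # True
```

Whether this actually helps depends on the scheduler and the game; as the replies note, it matters less on a single-CCX part like the 7800X3D.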

    • @alsiniz
      @alsiniz a year ago

      Wouldn't help much, because the 7800X3D is a single CCX, not two CCXs joined by Infinity Fabric.

    • @No1x3N
      @No1x3N a year ago

      @@alsiniz I know. Still, you can see that no cores are 100% utilized in the video. I don't personally have a CPU with multiple CCDs, and I experienced the same behaviour with UE4-based games: if I pin the process to specific CPU cores, the game runs better and those cores go to 80-100% utilization. It definitely helps.

  • @gavinderulo12
    @gavinderulo12 a year ago +2

    What does RT even do in this game? There's no description. It just says Raytracing. Lol

  • @GoldenEagle0007
    @GoldenEagle0007 a year ago +3

    Yeah, I won't be playing this game till I know it's all clear. I don't want my 7800X3D or 4090 to get nuked.

  • @SCARAB_CODE_X
    @SCARAB_CODE_X a year ago +3

    Nah, on my 14900X2d with a 9090ti it runs pretty good. Locked 70 fps; of course, FSR 17.4 is enabled on performance mode. Something is clearly wrong with your potato.🤷‍♂

  • @landoishisname
    @landoishisname a year ago

    You did do A/B testing with lower-end CPUs, right?

  • @leoncloud6683
    @leoncloud6683 11 months ago

    Can you list which settings affect the CPU? It would help a lot.

  • @surfx4804
    @surfx4804 a year ago +7

    I'm loving these "New Game Unplayable" videos. Honestly, game developers are missing out on customers by making games run poorly for most people, let alone unoptimized even for the few with top-end PCs.
    I wouldn't accept that the 7800X3D is the best, at least not for this game. A 13900K might give better results if the game wants raw speed over cache. It would also be interesting to see how the 7900 XTX does, as the game may be AMD-optimized.
    I suppose the future is non-native DLSS/FSR 3 with Frame Generation to compensate, since DLSS/FSR 2 just put more load on the CPU due to the lower render resolution.

    • @tayloroxelgren264
      @tayloroxelgren264 a year ago +2

      AMD GPUs will probably do better because their drivers use fewer CPU resources.

    • @Cha1nn-
      @Cha1nn- a year ago +1

      AMD drivers are better in this regard; it will probably not be about the game being optimized for AMD hardware...

    • @apiestink
      @apiestink a year ago +1

      13700K here, and I have the exact same problem as in the video.

    • @AbrahamZX1
      @AbrahamZX1 a year ago +1

      My 13700K performed better than this last night.

  • @DeadmanNasser
    @DeadmanNasser a year ago

    Daniel, according to DSOGaming you have to relaunch the game every time you change a setting, including FSR; otherwise your GPU will be underutilized. Not sure if it actually works.
    They have an article up.

  • @jakobe_bryantgaming5580

    I'm about to try this game out on a 10700l/3080/32GB 4000 C15 system. I wonder if there's anything in the config files that can help :/

  • @thellorch1038
    @thellorch1038 a year ago +3

    I have a 3070 Ti, and for me it runs at 30-40 fps on High with AMD FidelityFX on Performance :(

    • @Padawan333
      @Padawan333 a year ago

      Similar for me on a 3060 Ti. I actually thought RT was enabled; I was shocked to see it wasn't. High settings with a few things down to Medium. Very disappointing.

  • @jhowgx
    @jhowgx a year ago +11

    I love PC gaming! Spending $5k+ and I can't even run the game at max settings, nice!

    • @mkuhnactual
      @mkuhnactual a year ago +1

      Console players are saying the game has performance issues for them too.

    • @shreyasdalal
      @shreyasdalal a year ago

      @@mkuhnactual Not really. I'm getting stable fps with little to no hiccups on Xbox Series S, and it supports keyboard and mouse. Really having fun; I've played 3 to 4 hours so far.

    • @WizG.
      @WizG. a year ago

      @@shreyasdalal Sub-60 fps, though.

    • @shreyasdalal
      @shreyasdalal a year ago

      @@WizG. I'm happy with the 30-40ish fps since it's locked and stable with no hiccups. Yeah, I have a 144Hz monitor, but it's a story game; your eyes get used to 40 fps in about 45 minutes.