Star Wars Jedi Survivor: CPU Bottlenecked on 7800X3D | RTX 4090
- Published 26 Apr 2023
- Jedi Survivor is out with day 1 patches and drivers, and its performance still leaves a lot to be desired. The main culprit seems to be the Ray Tracing hit to the CPU, but the CPU limitation can be pretty bad even without RT if you don't have one of the fastest new CPUs. In this video I test how it performs on an extremely high end system with an RTX 4090 and Ryzen 7800X3D.
Test system specs:
GPU: RTX 4090 amzn.to/3LC6Olm
CPU: Ryzen 7800X3D amzn.to/3Hkf7Qi
Cooler: Corsair H150i Elite amzn.to/3VaYqeZ
Mobo: ROG Strix X670E-a amzn.to/3F9DjEx
RAM: 32GB DDR5 6000 CL30: amzn.to/41XRtkM
SSD: Samsung 980 Pro amzn.to/3BfkKds
Case: Corsair iCUE 5000T RGB amzn.to/3OIaUsn
PSU: Thermaltake 1650W Toughpower GF3 amzn.to/3UaC8cc
Monitor: LG C1 48 inch OLED amzn.to/3nhgEMr
Keyboard: Logitech G915 TKL (tactile) amzn.to/3U7FzA9
Mouse: Logitech G305 amzn.to/3gDyfPh
What equipment do I use to make my videos?
Camera: Sony a6100 amzn.to/3wmDtR9
Camera Lens: Sigma 16mm f/1.4 amzn.to/36i0t9t
Camera Capture Card: Elgato CamLink 4K amzn.to/3AEAPcH
PC Capture Card: amzn.to/3jwBjxF
Mic: My actual mic (AT 3035) is out of production but this is a similar mic (AT 2020) amzn.to/3jS6LEB
Portable Mic attached to camera: Rode Video Micro amzn.to/3yrT0R4
Audio Interface: Focusrite Scarlett 2i2 3rd Gen: amzn.to/3wjhlad
Greenscreen: Emart Collapsable amzn.to/3AGjQXx
Lights: Neewar Dimmable USB LED amzn.to/3yw4frD
RGB Strip Backlight on desk: amzn.to/2ZceAwC
Sponsor my channel monthly by clicking the "Join" button:
/ @danielowentech
Donate directly to the channel via PayPal:
www.paypal.com/donate?hosted_...
Disclaimer: I may earn money on qualifying purchases through affiliate links above.
I hope to have time to test some lower end GPUs soon, but this is all I had time for tonight.
Again, make sure your RT setting actually applied before comparing performance!
And I know people will want to debate whether this is "CPU limited" since the CPU shows such low utilization, but the GPU is clearly waiting around for the CPU to finish whatever it is doing. The game is just not coded well to fully utilize the CPU cores. I didn't have time to A/B test different CPUs but this is the same behavior that I have seen in other Unreal Engine 4 games (especially with RT enabled) like Hogwarts Legacy and The Callisto Protocol, where I have done A/B testing of different CPUs and seen it improve performance, so I am confident in my diagnosis.
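For anyone who wants to sanity-check this on their own machine, here is a minimal sketch (not Daniel's tooling, just an illustration using the third-party psutil package) of why a pegged game thread can hide behind a low aggregate CPU percentage:

```python
# Minimal sketch: aggregate CPU % averages across ALL hardware threads,
# so one saturated game thread on a 16-thread CPU reads as ~6% "total".
# Requires the third-party psutil package (pip install psutil).
import psutil

for _ in range(10):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # one sample/sec
    avg = sum(per_core) / len(per_core)
    print(f"average {avg:5.1f}% | busiest thread {max(per_core):5.1f}%")
```

If the busiest thread sits near 100% while the GPU is underutilized, the game is CPU (main-thread) limited even though Task Manager's total looks idle.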
Daniel look at you now, I was with you at 30k subs and told you that you would get big. I'm happy for you Daniel, cheers to your success sir. Edit: I love how you keep things simple for us to understand and are like a doctor for PC unoptimization issues.
Would be interesting to see how this game runs on a 4070(ti) at 1440p
I don't understand why even use the term "CPU limited" if no theoretical CPU can fix this
Gay game from gay devs.
test this with a 9600k
This actually isn't really a bad thing. It just saves me $70 on buying a broken POS game.
@@SirCharlzTV So, it's broken. Thanks for the heads up. Will wait until it's $5.
Nvidia drivers overhead in full glory
Genius
You pay money for games? That's weird :/
@@fiecylick3981 Yes, some people like to support the makers of games they're interested in instead of stealing or receiving handouts all the time.
Just think of how amazing games like RDR2 or Metro look while running normally on cards like 3060ti, this is indefensible
Nah, just throw FSR and DLSS and 16gb of vram at 1080p because that's how terribly inefficient games are with using the system resources
It is weird to think about: those games are also not optimized properly. Sure, they still look so good and I love both, but an rtx 3060ti is nowhere near a normal card for them; one of them came out in 2018 and the other in 2019. You have the performance of the best card available for them when they released. Simply, you are brute-forcing.
Check Gears 5, Doom Eternal, RE2 Remake, Crysis 3, BF1, Rise of the Tomb Raider, the Wolfenstein games... for what good optimization on PC can do.
You can play RDR2 with an i5 7400 and a 1060 6gb flawlessly
@I crediits Exactly my thoughts.
How short of a memory do you people have? RDR2 PC port was awful on launch
The current state of PC ports reminds me of the X360 and PS3 era of PC ports, which were pretty bad
It's coming back again, but with FSR and DLSS as the devs' answer to a fix
GTA4 flashbacks 😵
@@aaz1992 Even today, it runs pretty poorly on my 3080.
Some dude says he got 50 fps with RT on, in idle position, with 4090 💀
@@dnakatomiuk but in this game fsr won't help you at all because the game is bottlenecked by the CPU even with the most powerful gaming CPU in the world ....
This age of completely broken games not utilizing the hardware properly is just maddening. First Hogwarts Legacy with its RAM/VRAM problems, now this.
It comes down to structural differences between consoles and PC: a shared memory pool and a dedicated chip that handles asset decompression.
Also, Xbox and PS5 have AMD chips in them, while the PC market is dominated by Nvidia cards; development time is limited, so optimization for one platform has to be prioritized.
As you've noticed, the last couple of titles are AMD sponsored and consoles are all AMD; publishers just choose their target with their limited time.
Especially when games are getting more and more massive, with unreasonable targets to beat the last game/release.
It's hard to hit the targets set by the publisher, not to mention optimization, while the devs are still probably underpaid.
@@KryssN1 no, no excuse. It's just bad optimization.
By the way, I still remember that the PS3 has dedicated RAM and VRAM.
@@saricubra2867 I agree the devs have no excuse, but they won't bother optimising for launch; most games are rushed. PC players need to not buy.
@@KryssN1 As a PC gamer I would be 100% happy waiting potentially a year or more for ports of console games if it meant they actually got the treatment they deserve. And I suspect I'm not in the minority either, so it boggles me to no end why publishers think shovelling out half-finished crap and charging $70+ (in the US, more in other regions) for it is a sustainable strategy. With all the money spent on marketing, buying positive reviews for metacritic, and the cost of the dev team who have to run around like chickens trying to finish titles up to 6 months or a year after launch, they could... I don't know... use that money on finishing their products before trying to sell them to consumers. But what do I know, I don't work for a multi-billion dollar publishing giant like EA :/
And the $70 price tag... I think the consumer base just has to stand their ground and not buy these awful products anymore. Even if you are addicted and just want to keep playing videogames, hell, there's a huge backlog anyway, you can emulate other consoles, etc.; there are plenty of options, no reason to stay "upgraded", "up to date" or whatever if the industry is offering really bad products. But people are still buying this crap. The Harry Potter game sold a lot of copies just with the power of the brand alone; extremely generic and repetitive, technical issues, etc., doesn't matter, the sheep embrace it
I think we hit the point where UE4 just can't keep up with the size of new games any more. How the hell can the fastest available CPU not even hit 100fps?
Especially when the RT plugin is used.
It basically has to render an extra game world in low res and then merge the RT lighting into the final image.
So this game, like Hogwarts Legacy, basically doubles its resource usage when RT is on.
Cause developers have inexperienced interns developing games who don't know anything about optimization.
2+2+2+2+2+2+2... recomputed term by term forever will break the optimization.
But accumulating as you go (2+2=4, 4+2=6, 6+2=8, 8+2=10) gives the result while keeping the memory footprint optimized.
The problem is devs push graphics regardless of system requirements and frame rate. I wish game devs prioritized performance. Call of Duty, for example, looks great but also performs very well, and it scales a lot with different settings. Obviously at 4K ultra + RT it's quite demanding, but on the base preset the game looks decent enough and most midrange CPUs can handle the 128 player mode while pushing more than 100 frames.
@@100500daniel I mean, it's clear that the game is just broken. Nothing to do with pushing graphics settings too far.
Unreal, I literally just upgraded my system to a 7800x3d and a 4080 in order to enjoy this game on my 4k TV. Well, guess I'll wait 6 months until they fix it.
It is Unreal afterall, what do you expect at this point
@@insector2093 It's not the engine, it's how the engine is used. There are loads of UE4 games that run well
@@gamesmaster1060 You mean Fortnite, and atomic heart? 🤣 you know that's not a lot
@Insector, Dead Island 2 is a recent example, and it runs decently. The trick is to not use RT at all. Every UE4 game with RT will turn out to be like crap, such as Hogwarts Legacy and this one. Even Atomic Heart decided to remove RT despite heavily promoting it before.
nah man, I bet there'll be a fix for it in the meantime. Developers have been putting out unoptimized games for a while now
Bruh wtf is this cpu demand. Is the game trying to calculate the physics of Cal's balls
could be the drm slowing things down, it's apparently running denuvo
@@mkuhnactual Drm shouldn't slow down a 7800x3d
The game might be bouncing like crazy between the various CPU cores, since it seems to be on 30%ish on all the cores. The 7800X3D is so much faster than the console CPU this game runs on, it's not even funny. So something is definitely bugged.
Maybe new denuvo
Well the game runs poorly even on console from what i've seen
Like a lot of PC games, it's simply not optimized.
@@arenzricodexd4409 Doesn't seem to be a Denuvo game, but it does have some background EA DRM running, that might be causing this.
@@StruggleBoxing Yup, 2020s PC gaming in a nutshell. Every game seems so unoptimised for months after release
First 3 months of any PC release is beta testing now. Thanks for doing your part. The devs will get to this right away!
Not just that, but they never get fixed because they are fundamentally broken. There is no way that they can all of a sudden get this game running much better.
*The devs will get to this, "eventually"
They won't.
Just get an XBOX/ Playstation and stop paying for getting fucked in the a**
No they won’t, they never fixed Fallen Order lmao
And this game is 70$ lmao
Good representation of market lmfao
☠️
The game is good but the performance is not 😅
$69. Nice
$100 in Australia on Steam 💀
Man Daniel... you got a full time job and youre still able to come out with PC game and PC hardware reviews and comparisons of such high quality. I am very impressed with you sir. As much as I love your vids please dont lose any sleep or family time for us!! Let me know if you ever need an helping hand ;)
Optimization has left the chat..😢
Well consoles run it as good as a 4090.
I guess PS5 stock value has just risen once more.
@@KryssN1 PS5 in performance mode upscales from 960P lol
Let's hope it won't become the norm for developers to rely on frame generation to "fix" such CPU bound scenarios.
Oh wait...
That wouldn't even help.
If only the game had frame gen, but it's AMD sponsored so FSR is all you get
@@itsaUSBline Well that will definitely help, but this game does not support it :P But it indeed does not really fix the fundamental problem of it being a non-optimized port.
@@OnlyBlix if frame gen could push fps beyong 100+, the corresponding latency wouldn't be a problem for this type of game
@@HeLithium You would be getting 55~ FPS latency (native FPS latency) regardless of what your FPS counter shows with DLSS 3 turned on.
This game would feel a lot better with 120 or 240FPS latency because of its high skill ceiling.
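Rough numbers behind that point, as a sketch (assuming interpolation-style frame generation, which leaves input sampling tied to the native rate; real overhead is ignored here):

```python
# Frame generation doubles DISPLAYED frames, but input is still sampled
# at the native rate, so "felt" latency tracks the native frame time.
native_fps = 55
print(f"native frame time: {1000 / native_fps:.1f} ms")     # ~18.2 ms
print(f"displayed fps with frame gen: {native_fps * 2}")    # 110 shown
print(f"frame time at a real 120 fps: {1000 / 120:.1f} ms") # ~8.3 ms
```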
This is ridiculous. I just splurged hard to build my first desktop a few months ago with a 7700X and a 4080 and it's already not even good enough for 60 fps native at 1440p...this lack of optimization is ridiculous. How do they expect anyone to buy a game when most potential customers can't even run it?
It's mainly the CPU bottleneck I think; that's fixable quickly. They should've delayed another month
@@reiudfgq3vrh34ur Unless it's being caused by Denuvo, in which case the performance issues will be around till they patch out denuvo in about a year.
@@mkuhnactual and you can grab the game in like a 50%-70% off steam sale, and this level of GPU powah costs less, and the bugs are patched out. It makes no sense at all for a single player experience to buy full price on release.
Play on 200$ XBox series S, much more fun with no stutters
@@mkuhnactual Dead Island 2 also has Denuvo and that game had absolutely no Performance issues
Thanks for that video, this is EXACTLY what I was looking for yesterday and couldn't find
God dammit. There goes another....
So if you want to get a good playable experience you need to have even better than a 4090 and 7800x3D. Saved me 70€. Thanks Daniel.
Nah, the game runs fine for me with a 4070ti and 5900x. I can't ever take these PC YouTube channels seriously because my experience is always way different (way better) than what they show.
@@Soapy_Papoose I listed what I have and I play it at 1440p at max settings with an average of 60fps….so yeah kid, it’s fine for me. Sorry you can’t afford it lol
@@Soapy_Papoose That's a good 'Cinematic' Experience
@@Drunkgamer904 I definitely wouldn't be fine with just 60fps at that resolution with those specs. I really don't know why you're defending a shitty port so hard.
@@Drunkgamer904 dum dum nvidiot
Hey Daniel I was having the same issue with the first world. When I got to the next world I was hitting 100+ fps, even during combat. This is on my 6800 xt 12700k pc. I am running the game on 1440p, nearly maxed settings (shadows medium settings).
Love your videos, keep up the good work.
Dan thank you for the tests. Keep it up. Are you going to make a video of the game running on different gpus and resolutions?
Ryzen 5900X, RTX 4080, 32GB DDR4 3600 RAM - I just refunded the game. I play at 3440x1440 ultrawide, and seemingly regardless of the video settings I chose, I could barely get the intro areas to hold between 40-50 FPS with RT off and Medium to Low settings. I think my hardware is still pretty high-end compared to what MANY gamers will be using, and it's shocking to me that Lucasfilm Games released this AAA Star Wars game in this state. There's clearly something horribly wrong with it right now. FWIW, Hogwarts Legacy and most other UE4 games, and even Cyberpunk 2077 Overdrive RT mode, all play perfectly well on my system. As much as I also appreciate AMD's release philosophy with FSR2, I'm also pretty amazed this didn't ship with DLSS2 or even DLSS3.
XeSS 1.1 made an insane jump, so now we have 2 or 3 upscaling options besides native; that's pretty good.
It just needs more support
Many? That's the second most powerful GPU out there. Easily above 99% of players lol.
skip this game till it's fixed in a few years and also cracked. Denuvo versions always have 10-15% lower fps. Enjoy other great games, not broken ones
your cpu is not fast enough. buy a 13900k and you can get better numbers than the guy who made this video.
signs of another unfinished release...
be smart boys buy your games 2 years and 24 patches after release
That's exactly what I do.
I’m on that train too, wootwoot.
What if the RT cores can't handle all the load, so the rest of the GPU and the CPU have to wait for them to keep pushing fps?
this may sound odd but have you changed your field of view settings? had a few new releases that would tank performance if you adjusted FOV from default
All we are reminded of in these cases is that devs/publishers nowadays are really prioritizing Consoles and leaving PC performance in the lurch. I was enticed to buy hogwarts legacy on ps5 just because it didn't stutter as much as it did on my 5800x3d/3080 10gb system.
That's not the game's fault in your case. You have less vram than a console; your 3080 is bottlenecked. It's not stuttering because your PS5 is quite literally better for this than a 3080 10gb. Games have always been made for console, but Nvidia decided to screw everyone with the amount of vram they gave the 3000 series and low-end 4000 series.
hogwarts legacy was passable on a good system; you still got some stutters, but with VRR you barely notice it. Just don't turn on RT
Jedi looks like a pile of garbage
@@darbstre2900 It stuttered for me with a 5900x+3080 and a 5800X3D+4090 system. In Hogwarts, while VRAM can cause performance issues if you use higher settings than your GPU can handle, many, if not most, stutters come from CPU/system memory bottlenecks in that game.
@@darbstre2900 The consoles have to share the available memory between VRAM and RAM, so no, a system with a 3080 10GB plus 16 GB of system RAM has more memory available.
My game is choppy/hitching and blurry. Can't get above 32 fps standing still. It does not matter what the settings are set to. (Low-Epic) I am running RTX 2070 Super, i7-9700F CPU @ 3.00GHz (4.7 GHz TURBO), 32GB RAM, 144Hz monitor. Hogwarts Legacy ran fine for me after the usual "fixes". Hopefully they can sort this out?
It reminds me of the UE Matrix demo and how CPU bottleneck was a significant issue. It appears that using certain features of the engine itself can create a tremendous CPU bottleneck.
UE5 is badly threaded, it literally uses less than HALF the potential of a 16 core system, and EVEN UE5 is VASTLY more efficient than this garbage. Ue5 is particularly efficient in VRAM usage, easily running well in 4K with 8GB.
@@Wobbothe3rd It's not all about cores. Not all about using 16 cores. The CPU itself needs to be faster.
UE5 features like lumen and Nanite have a fairly fixed CPU cost. It's a significant upfront performance cost but once you can run that, you can run almost anything in UE5.
Also worth remembering that any performance metrics taken from the editor are deceptive as a built game will always run significantly better
This isn't the same situation.
The UE5 Matrix demo was demanding nearly all-core usage. (Optimizing this would lessen CPU load, perhaps by easing up on loaded assets/culling unseen assets, whatever.)
This game on UE4 is under-utilizing cores, just like Callisto, and well, that story hasn't changed much and seemingly won't.
Design flaw for AMD consoles targeting a weak hybrid APU... This game, just like Callisto, just can't scale up to utilize proper threads/cores.
A damn shame.
Perhaps a few patches and we would gain maybe 5 or 15 percent threaded usage at best, like Callisto, but because it's UE4, it's probably a wrap. Performance is trash on PC.
Garbage in, garbage out.
@@Wobbothe3rd Man, Unreal Engine sucks so much. Meanwhile Crysis Remastered with CryEngine 4 has those physics, that rendering distance, those graphics, and can run pretty well on a potato.
I still remember the Unreal series with the Unreal Tournament games, when each version of Unreal Engine brought better performance and better graphics.
Is ray tracing glitched off, or does enabling it back on simply break performance? This is a stupid question, but I have had the opposite happen in Doom Eternal (as in, toggling off then on increased performance).
I would love to see an optimization walkthrough from Daniel for these kinds of games. How much performance can you get out of the system without major image quality losses? That's much more interesting to me than just the all-maxed-out settings.
This is why Rockstar releases their PC ports much later on. Embarrassing
I'm using epic settings with a wider viewpoint. And yeah, I kept RT off. I just got done playing this at 1440p for over 2 hours. This game has some crazy stuff going on. It just kept using up more and more VRAM too. I stopped the game and then relaunched it. The VRAM went from 20GB used to 8GB used in the same area, and once again it slowly started climbing. At one point it climbs fast to hit 13GB and then slowly climbs again. Maybe I missed a patch or I need to update my driver. But I noticed that the VRAM just keeps steadily climbing while I play. Could it be a memory leak? I saw this on my RX 7900 XTX btw. I have another system with a 3090 but I'll have to test that out more to see if I notice the same thing. Anyway, was curious if you've seen that on your 4090 at all. Thanks for the vid.
Sounds like a classic memory leak.
Memory leak
If you have too tight areas with resources loading then you get stutters like in dead space and gotham knights 🤣
vram is like lung capacity: with low vram the card can't breathe normally and gets stutters (asthma). AMD cards use more vram because they have more vram onboard than Nvidia, and vram usage is also a driver thing.
It's expected. Unless vram is maxed out you leave performance on the floor. Best case would be installing a game into the vram. Unless the ram is paging, it's not a leak.
Did it cause slowdowns?? There is nothing wrong with filling out vram to the max; it's the same with pc memory. You actually want it to be filled with as much data as possible so you don't have to reload data when that piece of information is needed again. Say you play your game for 2 hours and now you reload a savegame from the beginning: it should have all the resources already in place. This can be beneficial, for the same reason we want our pc's to completely max out 64GB of ram with browser content, so nothing has to reload. This does not mean your Chrome browser uses a LOT of ram for no apparent reason; when we need more space, unneeded stuff gets flushed in no time. I think it's just by design. In your case it keeps filling because it can; on my 10GB card it won't because it can't, but that does not mean it's bad behavior. In other words, the game does not need all that vram. It's not an indication of actual required usage, it's just there because there is enough room to leave it there.
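One way to tell "leak" from "cache", for anyone curious: log VRAM while idling in one spot. Growth that never plateaus in an unchanging scene points to a leak; growth that plateaus and gets reclaimed under memory pressure is normal caching. A sketch (assumes an NVIDIA card with nvidia-smi on PATH; AMD would need a different query):

```python
# Log VRAM usage over time via nvidia-smi (assumed to be on PATH).
import subprocess, time

for _ in range(120):  # ~1 hour at 30 s intervals
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = (int(v) for v in out.strip().split(", "))
    print(f"{time.strftime('%H:%M:%S')}  {used}/{total} MiB")
    time.sleep(30)
```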
Are there any settings besides ray tracing that impact the CPU utilization? 80fps doesn't hurt my head at epic settings, but it is unfortunate if it stays there at all settings.
This is a pretty keen observation. it's been a bit since this video was posted, and some patches have made some difference, but really good observation nonetheless. Cheers.
There is an issue turning on and off the RT during gameplay.
If you do turn it on, and reload the save the RT is still on but with the higher FPS.
Tried in the same location you did.
Find a water puddle with the RT reflections in it. And turn off. They will look very different. Then turn back on. Your fps will tank, but if you reload the save point the fps will be fine with RT still on.
Imagine looking into puddles to justify your $1600 gpu.
@@BlackJesus8463 Consoles also do the same thing
Anyone else notice how good Daniel's videos are nowadays? 👏
your channel is too good, man ♥
You guys are up way too early! 3 hours in at 3:33 am and already over 16k views & 700+ comments. Well done!!
Got this game for free with my 7600x, and I still want a refund.
You are the best at testing games Daniel! I know you are very pressed for time, but please see if it's possible to get a reasonable midrange CPU to test on as well. 5800X or so perhaps. A lot of new games starve the CPU.
Is it possible that the non-3Dcache 7700X would be less of a bottleneck because of the higher clock speeds?
I have a 7700x and I'm peaking at about 3% CPU utilization at 1440 ultrawide with max settings; I also have a 4080, 32 gb of DDR5 6000 CL36, and a really fast gen 4 NVMe. It struggles to even hit 60 fps consistently and has frequent stutters.
I'm using a 12900k and a 7900xtx, and it was pretty much the same performance, let's say 60fps, and changing settings didn't help. But when I changed the preset from epic to high and then manually turned everything back up to epic (except AA and foliage, which I kept at high), now it's in the 80's. So idk if it's a bug when the epic preset is selected, or maybe my pc is just weird.
you are right about the ray tracing toggle. I was getting 55 fps at 1440p Epic settings with FSR quality and ray tracing on a 13700k and 2080ti. I turned ray tracing off and then back on again, and it dropped to 40 fps.
But the weirdest part is that performance from FSR disabled to quality to ultra performance is within a 2-3 fps margin. And I don't think it's because FSR was not working, because I could see the image degradation between FSR disabled/quality and FSR ultra performance, but the fps was not changing
That's just CPU bottleneck, even with your CPU.
Now I wonder what kind of system the developers used when testing. I think developers should work with max settings in order to know how their code performs and to be able to see the true artistic representation of the game.
they work with consoles first and foremost, especially the AAA studios. Most likely it was optimized for good utilization on current gen consoles, and then ported to PC with very few changes, and now it can't do shit on a top end system. Could also just be driver issues. At the end of the day, for companies this size, this type of performance is unacceptable no matter the cause. It's the whole reason they get to call themselves AAA in the first place. I think it's time to strip some of them of a couple A's.
They are console developers. They probably tested the game at 30 FPS cap and a controller.
@@monotoneone No, they are not console developers; they have been using a Source-forked engine for a long time for their Titanfall series and Apex, and all of those games run quite well on PC. They are probably just not experienced with UE.
This might be a dumb question, but could this be a result of the game not being able to make use of that 3D V-Cache? Maybe it could work better with an Intel CPU like a 13700k?
So, is it possible to disable hyperthreading and see if having only physical cores fixes anything?
Hey, just so you know GPU utilization percentage doesn't really apply if you don't max out the resolution buffer. You should compare wattages at different resolutions, because the GPU pipeline might not even saturate at a lower resolution. Example, a 3080 Ti goes around 380W on 4K but at 1440p it might not even go 300W even on a balls to the walls 13900K. It's the way the GPU pipeline works. That's why AMD always wins at lower resolutions but suffers disproportionately at 4K.
True, but the issue isn't necessarily the resolution itself; it's that Nvidia has more CPU overhead when CPU limited, resulting in worse performance, and that's more likely to happen the lower you go
And you can get a 3080 to run at 100% usage and wattage at 1440p quite easily, but yeah, it will probably consume more at 4k since it's most likely to be fully utilized at that resolution
@@lynackhilou4865 Recently I had to choose between a Zotac 3080, an EVGA 3080 Ti and a Sapphire Nitro+ 6800 XT, coming from a 1080 Ti FTW3.
After testing them all on my 1440p monitor with different resolution scalings I settled for a 6800 XT to pair with my 5700X. Particularly because I was saving some cash and getting more VRAM which I need for architectural renders.
I've noticed that the 30 series really stretches its legs at 4K because at 1440p and 1080p there was barely any difference between the 3080 and the Ti while AMD stays competitive till 1440p and 30 series just keeps gapping higher at 4K.
There have been videos on the differences between AMD's and Nvidia's driver architecture. Nvidia favors faster clocks versus more threads dating back to Intel's 4-core architecture. AMD went down a parallel thread path years ago with GPU's when they took that path with their FX processors. I am not aware of either fundamentally changing and AMD's choices would finally be paying off here (and would have sooner if their architecture hadn't flopped and innovation stalled).
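For the wattage comparison suggested a few comments up, NVML (the library behind nvidia-smi, installable as the nvidia-ml-py package) can report power draw and utilization together. A sketch, NVIDIA-only:

```python
# Read GPU power draw and utilization via NVML (pip install nvidia-ml-py).
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetPowerUsage, nvmlDeviceGetUtilizationRates)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)            # first GPU
watts = nvmlDeviceGetPowerUsage(handle) / 1000.0  # NVML reports milliwatts
util = nvmlDeviceGetUtilizationRates(handle).gpu  # percent
print(f"{watts:.0f} W at {util}% utilization")
nvmlShutdown()
```

A card showing 99% "utilization" at far below its usual power draw is another sign the GPU is starved rather than saturated.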
Seems like I'm asking this for every new title these days, but how the hell are they calculating their requirement specs then? No recent game seems to be playable from minimum spec through to above-and-beyond spec.
This game has no DLSS 3 Frame Generation for Nvidia?
Considering how bad the CPU bottleneck is, I think a lot of the game's code is still running in Blueprint, when by ship time all or most code should be converted to C++ for the best CPU performance. That is what happened with Gotham Knights: they converted a lot more code into C++ and the game became a lot less CPU limited, to where decent fps is possible on a wide range of hardware. It's sad that a game that was delayed still did not have its code in a state where they could start optimizing. How are you meant to optimize a game when, 80 percent of the time, your GPU is running at 30-50 percent usage with RT?
Games keep coming out 2-3 months too early. At this point, I might as well just ignore games for 2-3 months before looking into them on PC. I would not be surprised if Redfall is in the same state next week. I will most likely have a better time emulating Tears of the Kingdom than playing a native AAA PC port. Can a developer just have the balls to tell their publisher that the game is not ready, and actually have the ability to make a last-minute delay of 2-3 months?
Please do not buy until these issues are fixed. If most PC users don't buy it, the publisher will say to themselves: maybe if we want a game to sell well on PC, we need to release a complete and optimized game. The reason games keep coming out in this state is that a high number of PC users are still buying them. If you want change, you have to use the only thing the publisher cares about, and that is sales in the first 3-6 months of a game's release.
^This!
This is a misconception. The difference in performance between Blueprints and C++ is only significant with very heavy computational operations, and even then it's not that significant. Not to mention, UE can nativize Blueprints for packaging, so that's not even an issue most of the time. A bad programmer will mess up C++ code just as easily, if not worse. I would bet if we could access the source we'd see all kinds of unoptimized code, including the C++ parts and especially the shaders.
If you don't buy the game and sales tank on PC, the publishers will only see that as more reason not to waste too many resources on the PC port, and sloppy ports will only get more common (if that's even possible)...
It reminds me of some of the issues that The Callisto Protocol had with ray tracing which is also a UE4 game.
Same as Hogwarts. My 3090 was only at around 65% utilization, with a couple of cores really pushed on my 12900k, in certain areas like Hogsmeade, but most of the open world ran butter smooth.
How is a CPU bottleneck possible when max usage for any core is 70%? And what exactly is using it? Could it be that there's some other reason like lack of direct storage or something?
What if there is a bug related to RT running on the APU on PS5, while on PC it is offloaded directly onto the CPU, completely disregarding the GPU and whether it supports RT?
7700 and 7900xt. Played for about half an hour before work with auto settings (everything Epic, RT off, FSR quality at 4k).
The "respawn entertainment" and the like title screens when starting up the game were total slideshows, and cutscenes had noticeable frameskips every now and then. Gameplay was locked 60 FPS though.
I was expecting the performance to be worse to be honest.
to be honest a 7900 xt is not a low end gpu by any means... the fact that you had to use fsr to keep 4k 60 fps at all times is not a good sign
it's nvidia... that's why I was waiting to see AMD performance. I have a 7800X3D with a 7900XTX: 3440x1440 max settings, no FSR, just over 100FPS average.
@@mihai2174 you don't know his frame rate, because he has vsync on. "Locked at 60 fps" means he's clearly getting over that; if vsync were off it would dip below 60 at times, as that's how averages work...
@@JoeL-xk6bo for some reason my previous comment was removed
i was saying that i have a 7900 xtx and most of the time i can't get 4K 60 fps at max settings (no fsr and raytrace); worse, it goes lower than 50 fps when i am in combat, so i am pretty sure he won't be able to get 60 fps 4K with a 7900 xt at native
That is correct, it was with vsync, which was enabled by default. My GPU utilization generally hovers around 80%, turning vsync off got me to ~75 FPS on average, but I had some tearing, so turned it back on. I tried enabling raytracing, that dropped my FPS to around 40-45, but the frametimes felt super inconsistent. It did look better with raytracing, but it was absolutely not worth it for me. I didn't try turning FSR off, might test it later
I'm about 6 hours into the game, and my experience is very, very positive so far. I didn't encounter any performance issues or bugs. The game is very much Fallen Order, but bigger and better. I love the visuals and setting (Coruscant looked fantastic!), I love that I got to keep all my abilities from Fallen Order, vibe is truly Star Wars.
I just wish the game ran well on all hardware, not just the latest&greatest AMD gear.
You made a small error in your video. You said that the 7800X3D is the fastest gaming cpu. That's not true. The i5-2500k is the fastest gaming cpu.
Was there actually any difference in picture quality (lighting/shadows) with RT on?
That's what I was wondering, because even when he turned the settings down I could barely tell tbh. I'm not sure if it's unoptimized; it could be an issue with the ini files not actually applying when you change the settings, or, as some people said, a memory leak, but I feel like you'd have more stuttering then
Any plans to get the latest gen 5 NVME SSD to test with this game? I hear it does quite a bit better.
It's not any better on console. On PS5 it constantly drops out of the VRR range and is usually sitting in the 50s for FPS (at least on Coruscant at the start) and has terrible screen tearing with VRR on or off. I have resorted to the 30 FPS mode just to make the screen tearing stop. I went PS5 and not PC for this one after Hogwarts was actually really good on PS5, but this is just a disaster.
Consoles have the equivalent of a 3060 Ti / RTX 4060 at best, and the PS5 runs it equally as well as an RTX 4090 combined with a CPU twice as fast as the console's.
Clearly this was optimized for AMD architecture, but it's still poo.
In comparison, other AMD sponsored titles like TLOU1 had no issues running on AMD GPUs on PC, only Nvidia.
I had the same issues with my 7900xt and 5800x3d. I updated to the new drivers from today and it fixed it completely
Which drivers? Be more specific please?
really? whats your performance? i have a 7700X w/ 7900XT and get like 60-70fps. Bad areas are down to like 40fps tho
@@Ladioz Adrenalin Edition 23.4.3 with support for STAR WARS Jedi: Survivor™.
However, games tend to run perfectly on AMD systems. Another blow for Nvidia cards... damn. Next time I'm 100% buying an AMD card
@@Ladioz You stupid AMD shill, it's an AMD sponsored title, what do you expect? Of course it will be better after a few patches and new drivers.
I'm wondering, do you think Denuvo has something to do with this?
I'm going to put it out there: do you think the CPU issue could be that the game needs more than 8 cores? Or have people tested it on higher core count CPUs and still run into the bottleneck?
No DLSS in a 2023 game, went as expected
As soon as the 4090 dropped, we started getting some of the most unoptimized pieces of shit ever known to man, even when strictly looking to GPU usage.
That doesn't even make sense, that card was released back in September/October 2022 while these games were in development for years before that. Your comment makes no sense. Bad PC ports predate the 40 series.
@@RicochetForce It's not a conspiracy, they just don't give a fuck about optimizing because there's now a GPU that can run the game to some degree. If ALL the available GPUs suck, they have no choice but to delay the game, but now that a $2000 card can run OK in games where all the other ones run like shit, they are putting these games out because they technically work. We didn't have games using 14-18 gigs of VRAM before the 4090; there are now 4-5 of them from the last couple of months.
Bad pc ports have been around forever. This isn't a new thing. The 4090 also lined up with the covid pandemic which marked a significant rise in unfinished games. Coincidence? I think not.
Its not some conspiracy. Game devs won't make any money making games for the 1% of gamers with a top spec card. It's often the publishers rushing games out to hit arbitrary deadlines.
The one potential impact of the 4090 is raising the requirements of ultra settings to fully utilize the top spec cards, but that's not what's happening here, as the whole game is broken no matter the settings
@@omarcomming722 This is conspiracy theory nonsense. Developers began work on these games years ago, on hardware that is now very much dated. The hardware target AAA or even indie developers use is the console hardware of the time.
These games were not intentionally created to run poorly, or somehow strongarm people into throwing out their old gear and buying new stuff.
They're simply bad PC ports, and rushed ones at that. TLOU Part 1, for example, actually has problems that would be fixed by amateur devs.
It's no conspiracy, these are just rushed products with cut corners.
@@RicochetForce I literally said it's not a conspiracy but incompetence. What I'm saying is that without ultra powerful GPUs that can provide somewhat acceptable results, devs would have more pressure to delay the game because the game running at 60 fps on the highest end consumer GPU looks better PR wise than it running at 35 fps. The existence of such powerful hardware gives them more leeway to put out unoptimized garbage
Is this in a similar vein to Crysis where they made it for future hardware, or is it just bad optimisation?
How does a fully tweaked i9 13900K system compare then?
I have a theory. Consoles have shared memory between the CPU and GPU, which allows for close collaboration between the two and enables enhanced optimization. However, in current PCs, the GPU's VRAM and system memory are separate. Therefore, if an engine designed with shared memory runs on a PC and the CPU needs to access the VRAM, it can lead to memory bottlenecks. I believe this may be a problem with many current games.
This!
Bad optimization of the memory pool for PC, just copy-pasted from console, like TLOU1.
Nevertheless, this game also has performance problems on consoles, but nothing akin to a 4090 running at barely 55 fps avg. in a static environment with a CPU twice as fast.
The problem is ps5 is the system most games are built for and everything else is an afterthought. Sad considering pc pays the same price.
but consoles have always had shared memory, since at least the PS2 and probably before. this is nothing new.
The CPU is not accessing the vram. That's not how this works.
@@apiestink I'm not claiming that my theory is necessarily correct. It could be completely wrong. However, personally, I don't think there were many cases in the past where sharing memory between the CPU and GPU provided significant benefits. Nevertheless, with the rise of computationally expensive features such as path tracing that require huge amounts of memory, techniques that actively utilize CPU-GPU memory sharing have been developed. I believe this phenomenon started not with the new console generation, but rather when path tracing became a new standard for games, which led to the emergence of these memory-related issues.
I have the same setup. For the first hours of me playing I got a solid 95 fps but after I took a break and came back the game starting bottlenecking. Game Def has problems
Crap, you're right, the RT bugs out. Wow, performance in this game is just bugged out as hell.
Could be because RT wasn't actually enabled the first time!
So not so much Vram used today compared to yesterday? Did it magically fix itself?
Really well done video - particularly about the cpu utilization. Thanks.
I think wording it as a CPU bottleneck isn't exactly right. As it's clear the CPU in general is not the problem, but the game itself.
This unfortunately happens w/ 95% of AMD sponsored titles when launched. It gets fixed eventually, but it's a common occurrence when using Nvidia GPU's w/ AMD sponsored titles which have RT.
The sad part is the RT in this game is basic as hell. Not surprising since again, it's optimized for AMD who is still 2 gens behind in RT.
It’s not the CPU that’s the issue, yes. It is the fastest gaming CPU around.
However, it was released with a cpu bottleneck because it’s an unfinished port. To say it’s not a cpu bottleneck would be inaccurate.
@Bronson Dixon it's not a CPU bottleneck regardless. The cores aren't even being pushed. This isn't a Spider-Man remastered situation.
It's a coding bottleneck in the game engine that needs to be patched, which does not equate to a CPU bottleneck, since the cores themselves are barely pushing 30% utilization. It's much broader than simply referring to it as a "cpu bottleneck".
@Kevin P. I get what you're saying and you are correct, but technically poor multithreading is a kind of CPU bottleneck.
@@OC.TINYYY If a CPU slower than the 7800x3d gets lower fps, then it's a CPU bottleneck
You have to call it a cpu bottleneck.
It feels like with any PS5 port, the devs pretty much try to make the PC emulate the PS5 instead of actually doing the work to port the game to a different platform. That is why the last few PS5 ports have been terrible, especially on the CPU side of things. Devs are basically "porting" games as if they were being emulated rather than working on them to fully utilize the GPUs PCs have. They're treating PCs as console APUs.
What about Nvidia CPU overhead? Can we see AMD GPU tests? The 7900xtx is supposed to do quite well here.
I'm curious, how come you don't have Intel rigs? Why did you decide to go with the 7800x3d? Is there a big difference between the 7800x3d and the 7700x?
Well, current Intel platform is at the end of the line, while AM5 just started. Also, AMD offers pci-e 5.0 x16 and pcie 5.0 nvme both at the same time, again future proofing your build. On this system, you just change the CPU in 1-2 years and you're done.
I love it when the lightsaber goes through the wall about 50 times without any effects of damage.
In games like Jedi Outcast from 2002, running on an already slightly dated id Tech 3 engine, you could "draw" with the lightsaber. But at least now we have reflections and crappy performance.
@ The special effects are probably behind a pay wall and will be in a dlc.. 🤣🤣
@ It's kind of funny when you look at some older games that had more interactive environments than most games today have on offer. 😂
@@BVNTR_ Wouldn't be surprised! Hahaha!
Yeah, back then they cared. I know game development is now much more complex, needs more people, better tools, better hardware, etc... But there is something still very wrong if they can't reach the level of detail they did 20 years ago. Sure, graphics are much nicer and sounds are better too. Animations improved a lot since then, but... that's it.
Now we get games that don't look bad, but also don't look good enough to justify the hardware requirements.
When Crysis came out, I think that was justified. That game looked glorious at its time, still a really nice looking game.
Seems like almost all AAA games that come out these days, you need the fastest system possible to run at max settings, and even then there are performance issues.
Studios don't optimize, and a lot of games are running denuvo now for drm which gives a hit to performance. Not sure if that's the problem with this game, but it is running denuvo.
@@mkuhnactual It's like they don't want anyone to be able to play their games.
@@nossy232323 A vast majority of gamers will continue to pre-order and buy games on release, even after a studio releases multiple games in an unfinished or unfun state. If people will give money no matter what, there's no incentive for a studio to ensure it releases in an optimal state.
Whatever happened to DirectX 12 Ultimate and the whole "omg it is so awesome" outlook?
Would turning off Hyperthreading (Intel) / Multi-threading (AMD) work? I've seen people say that games work well with them turned off because the game can't use the threads that well.
I'm getting basically the exact same performance with a 5900x and 4090 at 4k. I really hate that I have had to brute force 60FPS for this year's AAA games (Hogwarts, TLOU, SWJS) with a system that I spent a lot of money on and that should do a lot more. I couldn't imagine having "lesser" cards that are supposed to also do 60fps at 4K.
I have this same system, MSFS in vr runs better than starwars 😂
A 5900x + 3090 feels like a 7 year old system lol. PC gaming is a disaster; piracy is the only ethical thing a PC gamer should do, these companies deserve nothing.
I was gonna buy a 4090 but this makes me think twice.
I want to see how UE5 titles run before paying that much money on something that might not even be able to run it well enough.
This game looks like it could run on PS4 when you compare it to Forbidden West.
@@Dionyzos wait a year and get 5090 for the same price, why? That will be the first gen of cards with decent rt performance, 4090 is just barely there, frame generation lag is horrible if you render below 60fps
Similar bro, 5800X3D and a 4080. Playing TLOU right now and it's frustrating seeing my GPU only running in the 60% range at 1440. 120FPS with dips into the 70's. Frametimes are so bad it feels like I'm getting 20 FPS!
the state of the game rn makes me so mad, i got a 4070 to play this and can't really get over 40fps without turning the settings way down at 1440p, how can they release something like this
How difficult was it to just make a second part from the first game? It's all we wanted :_(
9:36 strange to see screen tearing at such low framerates. Are you using a Gsync monitor Daniel?
Say what you want about RTX Overdrive in Cyberpunk 2077 but you get every shader unit’s worth on the visual majesty of path tracing for what it costs in performance. And it’s even efficient where you could run it fine with heavy DLSS even on a 3070
And it doesn't even max out most card's vram either.
@@Wobbothe3rd Kudos to CD Projekt Red, they built a custom memory pool algorithm for the game, for PC, so that it does not bottleneck due to the differences in console vs PC structure.
Basically, because consoles have a shared memory pool and a dedicated chip that handles compression and decompression, games need optimization in that department for PCs.
I am on an rtx 3070 + ryzen 5800x (non-3D). Yeah, you could play with Overdrive ray tracing, but it doesn't make sense at all on a 3070. I ran some benchmarks, and with RT Overdrive, DLSS balanced, 1080p, everything medium, you'll get ~50 fps average. But you'll get the same framerate or better on RT Psycho, DLSS quality, 1440p, everything maxed out, which looks much, much better. So in my opinion RT Overdrive is only enjoyable on a 40 series.
DON'T BUY THE GAME until they fix this garbage...
You're literally wasting money on something you can't even play.
This. Unless people stop preordering and buying bugged crap on release nothing's going to change
yet you can play it? is 60fps unplayable?
@@GregoryShtevensh yes, 60fps is unplayable.
I will buy it only to refund it so they get the message.
@@GregoryShtevensh IT CAN'T EVEN HIT CONSISTENT 60 WITH A 4090 YOU FOOL
I can see that the game isn't fully utilizing the GPU, but it looks like it isn't fully utilizing the CPU, as well. It makes me wonder if those percentages would be higher on a system with a lesser CPU and GPU. Is the code itself the limitation, and can a lesser system handle the game just as well?
Good explanation about the cpu bottleneck. Many people assume something has to run at up to 100% for it to be the bottleneck but that's not the case
It's because the calculation of CPU load is based on all the cores, in this case 8 cores and 16 threads, but the game engine uses 4 cores. So a CPU utilization of 45% really means ~90% on the cores doing the work; the CPU tries to load balance it, but the game keeps asking for just 4 cores and 8 threads. This likely creates latency.
In a sense if one could disable all the unwanted cores/threads and up the clock speeds to 7Ghz I bet it would run much better.
but it's idiotic how any game engine is still designed around fewer cores and faster clocks when we have known for YEARS that the only way forward is more parallelism and more cores.
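The arithmetic behind that 45%-really-means-90% reading, as a sketch (the thread counts and per-thread load are assumptions for illustration):

```python
# Windows averages utilization across ALL hardware threads, so an engine
# that only keeps some threads busy reads deceptively low.
total_threads = 16   # e.g. 7800X3D: 8 cores / 16 threads
busy_threads = 8     # threads the engine actually keeps busy (assumed)
busy_load = 0.90     # how hard those threads are pushed (assumed)
print(f"aggregate: {100 * busy_threads * busy_load / total_threads:.0f}%")
# -> 45%, even though the threads doing the real work sit at 90%.
```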
It's an Unreal Engine game. What do you expect? Surprised they're not using Frostbite or Respawn's own fork of the Source engine...
UE4 is a great engine. Look how good Dead Island 2 and Atomic Heart look and how well they run.
@@gavinderulo12 they look outdated
@@user-ye7lp9lg1c they look good. But my main point is that their performance is excellent. So it's not the engine's fault if a game runs poorly.
@@user-ye7lp9lg1c you are out of your mind if you think Atomic Heart looks outdated. Some of the best TAA I've ever seen in any engine ever.
The game is using UE4, so it's probably hitting a wall because of the engine. Also one of the last games using the engine, I think
Get ready for the redfall shit show in a couple of days
A lot of games use UE4 and run great on pc. This game doesn't look better than cyberpunk yet runs worse. The developers are at fault here.
Incorrect.
The overwhelming majority of UE4 games run dramatically better than this.
Cyberpunk with RT Ultra (one of the heaviest rendering loads) runs better than this while using less VRAM.
There is something very wrong with this PC port.
Have you noticed any leaks in the game? I have noticed that consistently over time my performance will tank and deteriorate, to where stutters are super common and I have to restart to get OK performance again.
The CPU limitations are likely caused by the Denuvo DRM they're using for the game.
No. Unless you want to use that excuse to pirate a game. Denuvo has 0 performance hit on higher end systems.
It only hurts people with shit hardware, which is why they pirate in the first place. Can't afford it.
This game also doesn't use Denuvo, just basic EA DRM.
@@OC.TINYYY Nothing is free. As a developer myself: a single line of code has a performance cost, and this is not a single line.
BTW, you would be surprised; there are tons of ways out there to improve even a single line.
@@OC.TINYYY Nope, the game uses Denuvo DRM.
It's on the steam page
@@OC.TINYYY Digital foundry measured the cpu impact of denuvo and it was around 10% if I remember correctly. So doesn't explain the performance of this game fully but does slightly. Regardless, we know many games with denuvo run fine so can't just blame it on that.
Incorrect. No matter WHAT Denuvo does it doesn't limit the amount of threads a game can use - that's on the developer.
Awesome optimization in this game. GPU using 40-50% of its power, GJ developers! The bottleneck should be almost nothing with that CPU at 4K, rofl
AMD sponsored game, what do you expect? They made it so they don't choke on RT and so Nvidia can't utilize their GPUs fully. Nasty tactics
Did you try pinning the process of the game to 4 fixed cores with the task manager? I've found some UE4 games benefit a lot by manually assigning them certain cores. It seems the way UE4 is coded makes some games to constantly jump around cores
wouldn't help much because the 7800x3d is a single CCX not two CCXs merged with 'infinity fabric'
@@alsiniz I know, still, you can see no cores being 100% utilized in the video, and I personally don't have a cpu with multiple ccd's, and I experienced the same behaviour with UE4 based games, if I pin the process to specific cpu cores the game runs better and those cores go to 80-100% utilization. It definitely helps.
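A sketch of that pinning trick in script form (assumes the third-party psutil package; "JediSurvivor.exe" is a placeholder for whatever the actual process is called, and changing another process's affinity may require admin rights):

```python
# Pin a running game process to a fixed set of cores, same effect as
# Task Manager > Details > "Set affinity". Requires psutil.
import psutil

TARGET = "JediSurvivor.exe"  # placeholder process name
for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        proc.cpu_affinity([0, 1, 2, 3])  # restrict to the first four cores
        print(f"pinned PID {proc.pid} to cores {proc.cpu_affinity()}")
```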
What does RT even do in this game? There is no description. It just says Raytracing. Lol
yeah I won't be playing this game till I know it's all clear, don't want my 7800x3d or 4090 to get nuked
Nah, on my 14900X2d with 9090ti it runs pretty good. Locked 70 fps of course fsr 17.4 is enabled on performance mode. Something ist clearly wrong with your potato.🤷♂
You did do AB testing with lower end CPUs right?
Can you list which settings affect the CPU, would help a lot
I'm loving these "New Game Unplayable" videos. Honestly, game developers are missing out on customers by making games run poorly for most people, let alone below what was promised for the few with top end PCs.
I'd not accept that the 7800x3d is the best, well not for this game. A 13900k might give better results if the game wants speed over cache. It may be interesting to see how the 7900XTX does as it may be AMD optimised.
I suppose the future is non-native DLSS/FSR 3 with Frame Generation to compensate, as DLSS/FSR 2 just put more load on the CPU due to the lower internal resolution.
Amd gpus will probably do better because their drivers use less cpu resources
AMD drivers are better in this regard; it will probably not be about the game being optimized for AMD hardware...
13700k here, and I have the exact same problem as in the video.
My 13700k performed better than this for me last night.
Daniel, according to DSOGaming you have to relaunch the game every time you change a setting, including FSR, otherwise your GPU will be underutilized. Not sure if it actually works
they have an article up
I'm about to try this game out on a 10700K/3080/32GB 4000 C15 system. I wonder if there's anything in the config files that can help :/
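For UE4 titles generally, the usual place to experiment is Engine.ini overrides. A hedged sketch using generic UE4 cvars (the folder name is a placeholder, and whether this particular game honors these overrides is an assumption, so back up the file first):

```ini
; Added to %LOCALAPPDATA%\<GameName>\Saved\Config\WindowsNoEditor\Engine.ini
; (<GameName> is a placeholder; the exact folder varies per title.)
[SystemSettings]
r.VolumetricFog=0       ; skip the volumetric fog pass
r.ShadowQuality=3       ; 0-5, lower is cheaper
r.ScreenPercentage=75   ; internal render resolution scale
```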
I have a 3070 ti and for me it runs at 30-40 fps on high and AMD FidelityFX on performance :(
Similar for me on a 3060ti. I actually thought RT was enabled; I was shocked to see it wasn't. High settings with a few things down to medium. Very disappointing
i love pc gaming! spending 5k+ so i can't even run the game at max settings, nice!
console players are saying the game has performance issues for them too
@@mkuhnactual Not really i am getting stable fps with little to no hiccups on xbox series s and it supports keyboard and mouse, really having fun just played 3 to 4 hours of that
@@shreyasdalal Sub 60 fps tho
@@WizG. I am happy with the 30 to 40ish fps since it is locked and stable with no hiccups. Yeah, I have a 144hz monitor, but it's a story game; your eyes get used to the 40 fps in about 45 minutes