Alan Wake 2 PC Performance Tested! Low, Med, High, FSR, DLSS 3.5, Path Tracing and More!
- Added 19 Jun 2024
- Alan Wake 2 release date is tomorrow and many PC gamers will want to know if the extremely demanding system requirements are for real. In this video I benchmark Alan Wake 2 PC performance on a variety of GPUs and settings from the system requirements list to see how they stack up. I also test the performance of a GTX 10 series GPU to see if the mesh shaders requirement makes the game unplayable. I also take a brief look at DLSS 3.5 Ray Reconstruction and the game's "fully ray traced" path tracing mode.
Test system specs (ResizeBAR/SAM ON):
CPU: Ryzen 7800X3D amzn.to/3Hkf7Qi
Cooler: Corsair H150i Elite amzn.to/3VaYqeZ
Mobo: ROG Strix X670E-a amzn.to/3F9DjEx
RAM: 32GB DDR5 6000 CL30: amzn.to/41XRtkM
SSD: Samsung 980 Pro amzn.to/3BfkKds
Case: Corsair iCUE 5000T RGB amzn.to/3OIaUsn
PSU: Thermaltake 1650W Toughpower GF3 amzn.to/3UaC8cc
Monitor: LG C1 48 inch OLED amzn.to/3nhgEMr
Keyboard: Logitech G915 TKL (tactile) amzn.to/3U7FzA9
Mouse: Logitech G305 amzn.to/3gDyfPh
What equipment do I use to make my videos?
Camera: Sony a6100 amzn.to/3wmDtR9
Camera Lens: Sigma 16mm f/1.4 amzn.to/36i0t9t
Camera Capture Card: Elgato CamLink 4K amzn.to/3AEAPcH
PC Capture Card: amzn.to/3jwBjxF
Mic: My actual mic (AT 3035) is out of production but this is a similar mic (AT 2020) amzn.to/3jS6LEB
Portable Mic attached to camera: Rode Video Micro amzn.to/3yrT0R4
Audio Interface: Focusrite Scarlett 2i2 3rd Gen: amzn.to/3wjhlad
Greenscreen: Emart Collapsable amzn.to/3AGjQXx
Lights: Neewar Dimmable USB LED amzn.to/3yw4frD
RGB Strip Backlight on desk: amzn.to/2ZceAwC
Sponsor my channel monthly by clicking the "Join" button:
/ @danielowentech
Donate directly to the channel via PayPal:
www.paypal.com/donate?hosted_...
Disclaimer: I may earn money on qualifying purchases through affiliate links above.
Chapters:
0:00 Alan Wake 2 System Requirements made headlines- let's test them out!
1:41 GTX 1070 vs Alan Wake 2
8:20 RTX 2060 6GB vs Alan Wake 2 (min system requirement)
12:04 RTX 3060 12GB vs Alan Wake 2 (Medium Recommended GPU sys req)
16:44 Looking at perf scaling from low to high on RTX 3060
18:00 RTX 3070 (Is it really only 1080p Med 60 with DLSS P like sys req?)
21:49 RTX 4070 vs Alan Wake 2
25:02 4K DLSS P (1080p internal res) vs Native 1080p performance
26:45 RTX 4070 vs Medium Path Tracing (Forest Scene)
28:41 RTX 4070 vs Ray Tracing/Path Tracing (Town Scene)
33:11 RTX 4090 vs Alan Wake 2
38:20 Final Thoughts- were the system requirements and headlines for real?
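Several chapters compare output resolution against internal render resolution (e.g. 4K DLSS Performance rendering internally at 1080p). A quick sketch of how those internal resolutions fall out of the per-axis scale factors that DLSS commonly uses (the factors below are the publicly documented defaults, not values read from the game; FSR 2's Balanced mode uses a slightly different 1/1.7 ≈ 0.588 ratio):

```python
# Per-axis render scale for common upscaler quality modes (DLSS defaults).
SCALE = {
    "quality": 2 / 3,            # e.g. 4K output -> 1440p internal
    "balanced": 0.58,            # e.g. 4K output -> ~1253p internal
    "performance": 0.5,          # e.g. 4K output -> 1080p internal
    "ultra_performance": 1 / 3,  # e.g. 4K output -> 720p internal
}

def internal_height(output_height: int, mode: str) -> int:
    """Internal render height, rounded to the nearest pixel row."""
    return round(output_height * SCALE[mode])

if __name__ == "__main__":
    for mode in SCALE:
        print(f"4K {mode}: {internal_height(2160, mode)}p internal")
```

This is why "4K DLSS Performance" and "native 1080p" in chapter 25:02 are an apples-to-apples comparison of internal resolution: both render 1080p, and the difference is the upscaling pass itself.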
Edit: At a certain point testing the 4070 with path tracing, I selected only the "path tracing medium" individual setting when I meant to select the medium RT preset.
Original comment: I was hoping to have time to test more, but it is the middle of a work week for me teaching, so time was pretty limited. If there is enough interest I may be able to do a follow up video with more GPUs, maybe side-by-side looks at the graphics (path tracing on/off, ray reconstruction, etc). Not sure.
Time well used! Good job
Great Job as always Daniel! If you could test some AMD GPUs as well it would be nice. Either way, thank you for your videos
Can you try an Arc GPU?
Test the heck out of this game. Forget Crysis, can your PC run Alan Wake 2 at native resolution?
Very interesting video. If you want and have time, test some more GPUs, but only if you want. What could be the reason a 4090 can't max out this game?
I really appreciate how it shows clearly what actual render resolution the different upscaling modes are.
HL did that too
@@XZ-III upscaling wasn't even a thing when Half Life released
hogwarts lol@@harrasika
I first saw that in control
The biggest problem is DirectX 12 ultimate has other upscale features that the reviewers are not talking about, so what's that about? 🤬
The mouse panning stutter is most likely a Northlight engine problem. Both Quantum Break and Control have this problem and it's infuriating that it still hasn't been fixed. There is absolutely no stutter when using a controller.
I wonder if it's like Monster Hunter's mouselook implementation where it took the live mouse input from the player and emulated controller analog stick inputs as a kludge at launch
I don't get the mouse panning stutter on Control, and let me tell you I HATE stutter with a burning passion.
I'm playing Control with a CONTROLler right now, high spec PC, and there's plenty of stutter in the game. Seems mostly like traversal stutter.
You are not getting it, we are not talking about game stutters. @@dapope3673
@@kotztotz3530 same! There is stutter when loading new areas though.
How do you recognize a badly optimized game? -> Settings have basically minimal to no impact on performance.
The 1070 didn't even flinch between 1080p/low, 1080p/high, and FSR2/540p/low... like a 2-5 fps difference.
Same with the 2060... no difference between DLSS Quality and Performance? What's going wrong there?
Thank you for taking the time and patience to swap out all these cards for individual testing! That takes a lot of time and we all appreciate it
2023 is the best time to get into the 24 FPS cinematic experience.
This is not a movie, bro. The game has a combat system, which requires at least 50 fps to enjoy.
I would, if it weren't for the choppy frametimes. This framerate is doable on a 120 Hz display, though.
For hundreds of bucks, mind you
If you're not gaming at 4K with high-end hardware, why would you even play this? Stick to Overwatch.
Game runs and looks great on consoles. Even on Series S. Seethe and cope.
I'm proud to see you getting review copies Daniel, you've come a long way!
Although I do recommend adding the RX 5700 XT to your arsenal; it's one of the most popular DIY AMD GPUs. It already sold well, but when the shortage first happened it was the most powerful GPU at a reasonable price for a while. When I'm on Reddit I see people with this card in their bio more than any other AMD card, and it also would've been interesting to see since it doesn't support mesh shaders.
Hardware Unboxed just did a video on it.
AMD has a beta driver that at least stops it crashing. Runs like a$$ but somewhat playable. Unlike the gameplay.
I'm glad to see you've left the high preset enabled, unlike a number of benchmarkers that bump everything to 11. You can actually bump both shadow resolution and detail to high while keeping the high preset. The volumetric lighting is what takes a large toll without any real benefit. I've tried both the high preset with shadows bumped and then played with the point and volumetric settings. I can't tell the difference between the volumetric and point lights at medium and high, save for the framerate hit that adds to the already massive hit you take when using full path-traced RT.
My Strix 4090 is running at 3.1GHz, so I'm getting better framerates. Still, if you're on the high end, use the high preset with the shadow detail and resolution pushed to max, then enable RT and enjoy at 4K with balanced DLSS. Base framerate should be consistently above 60FPS (mine was 69 at the lowest), making frame gen perfectly viable. You can also customize the path tracing to a single bounce instead of 3, which offers a nice lift for those looking for both eye candy and performance.
I'd highly recommend restarting the game after changing DLSS modes, as sometimes you'll see near zero performance differences when toggling between performance and quality modes because of some sort of memory leak (Presumably what's happening on the 2060 benchmark). This affects a ton of games, most recently Forza Motorsport, which was discovered by Digital Foundry. Their 2070's 8 GB VRAM was being maxed out just by toggling between DLSS modes multiple times. A restart fixed that up completely and showed proper framerate improvements (As expected of upscaling).
i also noticed that changing from quality to performance gave me 0 increase in performance in lords of the fallen. tbf i tried it like 3 days after release maybe they fixed it
This! I'm not sure if this is the reason I had so much VRAM problem when testing Cyberpunk on long period of time. If I had 8-10 GB of constant VRAM usage, it never went down when using the same settings. When I changed and tested a lot of settings and modes, I was later hitting the VRAM 12GB limit on RTX 3080 Ti. Even tho I changed setting back to previous settings. Restarting the game always removed the issue, and VRAM usage stayed about the same. In the future, if I do change a lot of settings and continue playing the game. I'll 100% restart the game.
Yeah, I've seen this in a few other games when farking around with DLSS on my 3070. Not sure if it's a game-related memory leak or a DLSS memory leak. Seems it could be a DLSS thing, since it occurs in multiple games when changing DLSS settings.
@@plasmahvh FSR genuinely looks so bad in that game for honestly minimal performance gain. I recommend just tweaking the resolution scale slider built into the game.
It worked since day 1? I have 16 gigs of VRAM so maybe I just don't notice, but there are like 20 more frames. @@plasmahvh
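One way to sanity-check the leak pattern this thread describes is to log VRAM usage while toggling DLSS modes and see whether it only ever climbs. A minimal sketch: the parser assumes lines shaped like the output of `nvidia-smi --query-gpu=memory.used --format=csv,noheader`, and the sample numbers are invented for illustration:

```python
def parse_mib(lines):
    """Turn lines like '7421 MiB' into integer MiB values."""
    return [int(line.split()[0]) for line in lines]

def looks_like_leak(samples_mib, tolerance_mib=64):
    """True if usage never drops by more than `tolerance_mib` between
    consecutive samples -- memory that climbs on every settings change
    but is never released back, as described above."""
    return all(b >= a - tolerance_mib
               for a, b in zip(samples_mib, samples_mib[1:]))

# Hypothetical log captured while toggling DLSS Quality <-> Performance:
log = ["6100 MiB", "6900 MiB", "7400 MiB", "7900 MiB"]
print(looks_like_leak(parse_mib(log)))  # climbs only -> True
```

If the curve only ever grows across toggles and drops back after a restart, that matches the behavior Digital Foundry reported in Forza Motorsport.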
At first, I thought upscaling was great, but now it seems to be making developers lazy.
Not devs. It's publishers using it to rush the devs.
I don't think lazy is the right word. It just seems like they're using it as an excuse to push graphics to the limits that running at native isn't really feasible on current gen hardware. Whether that should be the priority is up for debate of course. (some games with average graphics and terrible performance excluded)
Remedy has been using it for almost a decade. I think QB was one of the first to ship with it on PC and was default for certain presets.
@@theaveragecactus Sure, keep blaming everything on the "big bad publishers".
It wasn't the publishers rushing to Twitter to tell people that "Baldur's Gate 3 shouldn't raise expectations for other developers".
@@recordatron Not really, the console version is very close, yet remember they are using mid-range "old" hardware, so yeah, they were lazy.
Really nice video, Daniel! Well done! I'm looking forward to doing some performance scaling, myself.
"You guys have 4090's right?" Is the new "Do you not have phones?"
pc gaming in a nutshell, couldn't have said it better myself
Not really, because everyone has a phone. It's just that people didn't want a mobile game.
@@daytheone8459 Which makes this one even worse, because 40 series cards aren't even selling that well lmao
Well if you are still on a 2xxx series or older you have no business gaming on pc
@@SPG8989 So you're saying whoever doesn't have a Ryzen 9 and a 4090 is a peasant? It's funny, even a 3070 can't run this game on ultra without DLSS. Are you gonna call 30-series owners peasants too? Immortals of Aveum did the same thing with system requirements; they failed miserably and laid off half their staff.
I'd love to see AMD and Intel GPUs tested. Intel doesn't appear on the spec chart. There is also no recommendation beyond the 6800 XT for AMD RT, when the 7800 XT and above, or even the 6900 XT/6950 XT, should be better at RT.
There should be no hardware issues and Intel issued an updated driver that supports the game, I couldn't tell you if everything works though.
I think Daniel did Nvidia GPUs because they were bundling Alan Wake 2 with Nvidia GPUs. Bad deal if you ask me; Starfield with AMD GPUs was a better one.
@@Noodles1922 Hope Intel stays in the GPU business because we seriously lack competition
I'm very thankful for you showing how the game runs on a 10-series card, Daniel. Since I own a 1080 Ti, this shows that I won't be able to play this game on PC with my current setup, and for the time being I may have to consider getting this game on console (even though I'm disappointed that the game is digital only with no physical release as of right now).
Since I noticed Intel put out a driver update recently, I wonder how this game would run on the ARC GPUs.
Feel like you hit the nail on the head with this current gen of console now being the target for min requirements almost. Even on low settings the game looks good visually, just thinking back to even 5 years ago what a game on low might have looked like lol.
Seems jarring, but it was bound to happen eventually once a lot of games stopped being cross-gen, and the last 4 years have been hectic IRL.
Really hope frame gen or at least the dependency on frame gen doesn't catch on. It's probably okay if it's optional, but as a requirement it really doesn't feel like we're making a progress.
It's already here and will stay, same as DLSS.
@@lordzed83 Yeah, we get that. But hopefully it doesn't become mandatory for good FPS.
developers are forgetting that even framegen requires a good baseline fps to feel responsive
@@lordzed83 unlike dlss frame gen is only fit for high framerate gameplay. The base fps has to be high for it to work with sufficiently low latency. So it's not going to be adopted as widely as upscaling
It's a degradation in image quality
I feel like there should have been at least one AMD card in this test, just to verify it's getting expected results.
expected results are really bad based on what we saw here.
This was likely a quick video to get something out there, and he didn't have time to install different drivers and such. He will likely do more videos with different setups soon.
I think the driver wasn't out yet
Nvidia was bundling Alan Wake 2 with GPU purchases, so I think that's why Daniel did only Nvidia for the first video, but I think the AMD bundle with Starfield was better, personally.
@@LprogressivesANDliberals Except Starfield is a boring piece of shit lol
What I can say is that the low and medium settings actually look pretty good compared to what you normally expect from low and medium, but I definitely imagine they could've done better on performance for many cards.
The issue with low looking good is that they're not pushing the graphics down enough.
Low is supposed to be low, but in these types of games low is just pseudo-high.
Performs about the same, looks about the same..Not entirely sure what the point of calling it low is.
Low should look bad and perform well, the whole reason you'd put the graphics down to low.
If low looks good and performs well, it's most likely because you have a good enough card not because it's actually doing its job.
I believe these were the devs that in other words admitted that they abandoned performance in favour of DLSS and FSR.
A game can look really good and perform well, DOOM is an example of that..But you have to bother with the process. As they admitted they had DLSS and FSR in mind when developing, they clearly did not bother with the process.
Someone below in a thread said it's because of artistic integrity that low doesn't do almost anything different..That might be the case. Which is in other words a middle finger to anyone with lower-performing parts like me.
It's crazy that this game looks worse than RDR2 and CB77, and I basically cannot run it, whereas I can play the mentioned games with a stable FPS and decent graphical settings. Remedy really said "fuck optimizing this game, slap an upscaler on it as a mandatory setting and call it a day". @@Herkan97
Well, DOOM also "looks" great, but it doesn't actually have graphics that are that impressive or demanding.
Yeah. I'd rather they make the low settings look worse so that it performs better. That's kinda the entire point. If they're not doing that then I don't know why they even included the low settings to begin with.
I'd love to see what 4k ultra settings is like at native when all of the settings are maxed (some were left on medium for some reason, including some that I think would be quite taxing). Same with raytracing and then path.
Looks like Remedy weren't lying about the system requirements, but to me, the FPS recommendations they've listed are more like minimums instead of averages and there's still a good amount of detail in their "low" quality texture settings.
The low settings are crazy lol
Maybe they'll make a lower settings model in the future
Low settings in this game exceed High settings in most games.
@@badpuppy3 yep, I was really angry about the medium settings recommendation on my 3070 but seeing how they look.. jesus
I highly suspect that those FPS recommendations _are_ averages, just in more demanding situations. Don't forget that Daniel has not tested how frame rates behave in a combat scenario.
@@badpuppy3 That is not a good thing; it shows poor scalability by the engine.
it's very useful and interesting. Thank you for all your hard work!
Love the videos, keep up the great work! Would like to see how to the game would perform on the 1080ti.. would also be interested in seeing more content on the path tracing specifically..
30:07 It was not only at this point that you selected the wrong Ray Tracing setting; Even earlier when you thought you were testing the Medium RT Preset, you had actually enabled just the Medium Path Traced Indirect Lighting.
True. I've updated my pinned comment to mention this error. Thanks for pointing it out!
@@danielowentech I think you forgot to pin the comment, Dan! Just letting you know
Looks like editing the comment unpinned it. Fixed. Thanks.
@@danielowentech just reupload the video
@@valerabaglej7437 give the man a break bruh he works as a full time high school math teacher and makes and edits videos all on his own.
Alan Wake 2 saw Cities Skylines 2 performance and said "Hold my beer!"
2023 and we're bending over backwards for 60 fps 😂😂😂😂
Great video. Thanks for the very detailed and informative breakdown.
Upscaling had 3 purposes: to make a 1080p card reach 1440p, to turn 60 fps into 100 fps, and to give new life to old GPUs. Now it's impossible to reach 1080p 60 without upscaling. We went back to the PS4 Pro.
Corruption.
I believe upscaling's only actual intent was to create scenarios where GPU owners felt forced into upgrades they don't actually need.
It's a tech that allows developers to skip the line in terms of what we can actually deliver in this generation, and it gives Nvidia a means of saying "Welp, you should really upgrade that old card you bought 9 months ago, which is top of its series lineup."
Maybe update your GPU
@@n.erdbeerso everyone needs to have a $1k GPU to play games at 1080p? What a shitty take dude
yup
It's exactly how I imagined it would be. I wrote a comment under Daniel's other video. They simply don't want to compromise the artistic and creative intent by drastically lowering the graphics. Hence the poor scalability. The low-to-high difference is 11 fps on a 3060 12GB.
It's classic Remedy. It was the same with Quantum Break, and it was the same with Control. Control was unplayable at 1440p on a 2080 with RTX. Anyway, the game looks crazy good on low, even without RT. And because it's path tracing only, RT will only really be playable on 40-series cards.
I don't believe you're a real human. The game looks utterly terrible. It's blurry and underwhelming at 4K on ultra, let alone 1080p with any graphics preset. Ugly-ass linear game @@gabber_
RDR2, RE4 Remake, Dying Light 2, and many more titles look nice from a consumer perspective and play excellently. Yes, from a technical standpoint Alan Wake 2 is superior, but after 5 minutes of playing, those advanced techniques will be forgotten, and nobody cares while struggling with low fps and input lag. I'm not gonna push my 3070 to the roof to play this when it looks only marginally better than those titles.
Sad thing is, Control looks better, likely because of the lack of upscaling needed for Control. People are going on like this is the new Crysis, when in reality it doesn't look noticeably better than most modern games, and even some from 4+ years ago have basically this level of visual quality.
Starting to feel like people are getting used to fuzzy, low frame rate games. They are forgetting what 1080p is supposed to look like.
@@Stephen__White lack of up scaling needed for control??? lol most people don't have 4080s man. On a 3070 control DEFINITELY needs upscaling, but with it on, game looks and runs amazing. I'm certainly not complaining how control runs, but not using upscaling is foolish
Oh wait. I forgot a lot of you PC gamers still use 1440p monitors. My bad.
@@HugoStiglitz88 I played Control on a 2080 Super with no fps issues
Are you joking? Dying Light 2? Trash optimization. Assassin's Creed Mirage has the best optimization and open world.
Also take into account that the new GPUs run extremely hot in some games on stock settings. Just now a friend of mine with a 4090 saw it go into the 100s Celsius, and he has to make a lot of adjustments that an average user would never know how to do, or what to do at all when they encounter an issue like this. It's mind-boggling how user-unfriendly everything is, and it keeps getting more demanding.
Wow, man, so much commitment! Day job, children, powerlifting, and he still finds time to do this for us. Respect!
Thanks for taking the time to make this video.
Thanks for the preview Daniel! good luck with the classes. So here we are 2023 and 30 - 40 fps is now "ok" LOL. There is always a setting that will work for people and get better than 60. I remember when Jensen walked around on stage repeating over and over.... frames win games. Ok then. But seriously. This is a heavy one.
Can't even achieve 30 fps on the lowest possible settings with very aggressive upscaling on a 1070 is insane
That card was state of the art 7 years ago... time to upgrade. It's coming up on a decade old lol. @@The_MEMEphis
Point and Click Games will be back soon, running on impressive 5fps and DLSS Ultra Performance.
I was hoping they, and indeed all studios, would put out a benchmark to test at home; however, I'm grateful for all your work showing what I could realistically get :)
Love how this channel tests a variety of gpus instead of just focusing on top tier
It's just sad what the PC gaming world has become: all about graphics and not performance. Either that, or they just directly port the console version to PC without optimizing it at all. They're just cash-grabbing with most games nowadays; we shouldn't have to upgrade our systems just to get 60 fps. It's ridiculous.
Thank you for putting this out so quickly. I've been trying to see if a 1070 Ti would even boot the game.
My GTX 1080 and i7 8700K @ 4.2 GHz cannot run this game at a locked 30 FPS.
The opening stinger sequence showed a little promise, where the framerate was above 40 in a dark woods.
But as soon as bright volumetric lighting (flashlights) kicked in the scene, the performance tanked to 20s and teens.
After the story stinger, the two FBI main characters enter the scene, it's in the forest at daytime, and I never managed to get a framerate above 26, and that's with everything absolutely at the lowest, and at 720p, with FSR2 set to Ultra Performance.
The game looks like smeared sh*t and I still can't get 30fps.
So forget about it, friend.
Bro I also have a gtx 1080. It's rough. I'm almost giving up on gaming lmaoo @@felisasininus1784
@@felisasininus1784 What resolution? (internal? Native?)
@@AdrianMuslim Already said, 720p, that's display resolution, and FSR 2 at ultra performance, so that's very low internal resolution.
No, of course it's not native. Doesn't matter, can't run this sh*t.
10-series cards don't support mesh shaders, so a 1650 can run it better than a 1080 Ti.
Really appreciate your videos!
Was looking forward to this. Cheers
Hi Daniel,
I liked your video, but I would have liked if you tested the settings for the RX7800XT or the RX7900XT (for example) in this video, instead of focusing only on Nvidia GPUs. Just to get an idea.
the xtx as well please!
At 42:35 he said he wanted to, but the video was already long and he had to teach in the morning.
@@CMJPSYCHO
All he had to do was test a Radeon instead of one of the five or six Geforces he tested in this video, and it wouldn't have taken any longer.
It's gonna be better performance than its Nvidia counterpart without RT, and tank with RT on, just like all other games. What else are you expecting? 🤷♂
@@cariboumagique2234 Probably would have been significantly more complicated due to driver shenanigans.
The Post Processing setting was always set to High, no matter what graphics preset you chose. I believe manually setting it to low, plus turning off motion blur and a couple of other options, would squeeze some more fps out at the lowest settings.
Motion blur makes the game feel smooth.
@@AdrianMuslim Subjective..Just seeing the video you can barely see anything when he moves thanks to motion blur.
For testing it should be off so you can actually see and get a better understanding of what the settings do, we're not here for feelings.
One of the reasons I couldn't be arsed with Forza Horizon 2, not only 30fps but I can't see half the time...Motion blur also causes the feeling of slowness.
In all my time playing, 10fps unstable, 10fps stable, 20fps unstable and so on... never has motion blur done anything to smooth out the image.
Should never turn off important features like Motion Blur.
That's literally the first option I turn off in 99% of games. It looks like dogshit and robs FPS. @@aweigh1010
@@aweigh1010 How in the world can you consider motion blur important? 90% of us disable that in games before doing anything else. Motion Blur, DoF, vignetting - auto-off in all my games.
The mouse panning stutter was much better below 85 fps and when setting the lowest possible in-game mouse sensitivity while raising the sensitivity on the mouse itself instead; that was in Control. I expect it's the same here, since the engine hasn't changed much.
Amazing work as always!!!!
Thanks for all the hard work on this video. Will you be producing a Radeon-based series of tests as well?
Have you gotten a chance to look at the new Intel APO feature on the 14th generation CPUs yet? Looks like it can boost game performance by up to 20% and have seen good results.
Supported by only two games? And the 7800X3D beats the 14900K in most games.
Thanks for sharing. This is why anyone who wants to buy it should wait, especially on PC, and see the benchmarks first. Thanks Daniel Owen 👍
Amazing work as always! Hope my GPU will do its work 🤞
It's not just DLSS itself having a performance cost; some effects may actually be rendered at higher quality when a higher output resolution is selected, regardless of DLSS. One example of a setting that is supposed to be corrected for upscaling is bloom (though it has negligible performance cost).
Hope the 7% of pc players get to enjoy this game
90% were never going to buy this game no matter how it runs
pirate or die
Not that Remedy cares; they already got millions to keep the game off Steam. Remember, Gearbox got 120 million for Borderlands 3. @@ms3862
4090 can't handle 1440p at max settings lmfao what a joke of a game.
Yo Daniel, I mean Mister Daniel! Your videos are always more informative and direct to the point than many so-called tech YouTubers. I rock an RTX 3070 Ti in my system and you did comfort me here. I've liked this franchise since back in 2010, and I'm very excited to play this sequel now that I know my GPU can keep up with the new version of the Northlight engine. Keep up the good work, mister, and take care of your kids and students!
So excited for this video! 👀
DF just released the PS5 review and it looks like Performance mode is using the PC equivalent low settings at 847p upscaled to 1440p (balanced FSR2). The 3060 is slower than a PS5 and achieves 50+ with those settings. So it doesn't look like a bad console port at least, just a very demanding game that's not optimized for anything below PS5 equivalent PC hardware.
Seems like an okay console port and a shit PC port tbh
@@HugoStiglitz88 Did you even read my comment? PC performance is exactly equivalent to PS5 and they added features to make use of high end PC hardware.
Alex tested a 3070, not a 3060. The 3070 is significantly faster compared to a PS5.
@@dampflokfreund I wasn't referencing Alex's tweet, only Daniel's and Oliver's video on DF.
Seems like bad optimization tbh.
Just letting people know, according to digital foundry, the woods area where the 3070 gets tested is the most demanding area in the game that saw the most drops in performance
I just want to say thank you i know you work hard and just want to tell you we appreciate it 👍
EXCELLENT WORK!
Thanks! Was really interested to see how this would run on a 10-series card, if at all. Surprised there are no graphics glitches! This feels like the game to finally kill those GPUs, like the day Quake killed the 486. Truly a historical moment lol.
I'm not ready for my 1050 laptop to go to the grave yet. Sorely in need of an upgrade.
@@LutherTaylor Well, I have bad news for you, sir: you should probably replace it by next year.
@@LutherTaylor ngl, It probably already should've lol
My GTX 1080 with i7 8700k @4.2ghz cannot manage a locked 30 fps, even when set to 1280x720 display resolution and Ultra Performance FSR2.
At that resolution the game looks like smeared sh*t, and still, 20 something fps at best in the woods with daylight.
@@felisasininus1784 Yeah, that's nearly my setup but with a 5820K. Had a good run. I think we're in a transitional period where the next gen or two of GPUs will feature some crazy DLSS version that makes 540p upscaled to 4K look amazing, and that's where things are going lol.
the good news is 120 fps in menu
starfield didn't have that
Pretty sure it did. That's the only place most people will get to see that game get 120 FPS, in fact.
@@HunterTracks Hmmm, must be some kind of animation issue then; there were so many 60/120 FPS mods to smooth it out and make it feel like 120.
Pretty bad for Bethesda if their 120fps menus are stuttering
Thanks Daniel, you saved me 20 minutes of settings faffing tomorrow.
Hi Daniel. Shame you didn't test it on any of the top line AMD cards. Will you do a video for the 7800XT and 7900XTX?
I don't think the game looks good enough to justify having the highest spec requirements to date. Still looks amazing though!
Yeah A LOT of people are saying this game is insane and that it's pushing graphical boundaries... I just don't see it.
@@DelgaDude I agree there isn't much of a graphics boost that's revolutionary, as some people are saying.
@@DelgaDude You're absolutely insane if you think this game maxed out doesn't look better than literally any game ever made. Like you actually don't have eyeballs.
I think we are just blind, or these companies are forcing GPU upgrades down our throats.
It's definitely not better looking than last of us part 1
Oof 70-90w draw on the 1070? Something's severely limiting the usage which really should be maxed on a card that old to have any chance
Mesh shading. The 1070 can't render it properly.
I was curious to see what AMD was like as well; hopefully we'll get another video soon!! Thank you for this one!!
Thanks for the video. I would love to see more on the 4070 vs 3080 12GB in this game with PT at 1440p.
I'll buy the game in 5 years when there are cards that will do higher resolutions. Plus the game will be $10. Thank God for my massive game backlog.
Lol. Disadvantages of high resolution. This is why 1080p is better than 1440p & 4K, especially with a high-end GPU. You can play whatever you want, whenever you want, however you want, with more frames.
I can appreciate that the game looks good no matter how low you have the settings. I just wonder which settings could be dialed back even further (beyond what the game settings allow) without really hurting immersion. I suppose we will have to wait for a "potato mode" mod or something.
Exactly my thoughts!
Just the first game to baseline the PS5. If one's PC is weaker, they can't play. Finally.
Yeah, i'm sure someone will tinker with an ini file to get the games textures to render at oil painting resolution so people who want to run games on outdated hardware can enjoy...that.
@@baronsengir187
Not sure why you want people to have to upgrade to play a game. Maybe if we're talking consoles, sure. But PC, there's little reason to not make it as accessible as possible.
Making a game look good on high settings doesn't necessarily mean you have to raise the minimum requirements for low settings. The shader tech not supported by older cards is one thing, but this game doesn't seem to scale much even on current gen PC hardware.
@@MrNside You serious with the "no reason"?
There are tons of reasons. Even a little CPU instruction change in the hardware can boost rendering by many percent, like with Intel's SSE way back. At some point you have to let go in order to get rid of inefficient code.
Love that you are always out here doing the work for us PC people. You never leave anything out that I would want to test myself. I wish someone could explain to me why there are so many games coming out that don't seem to have the visuals to back up the insane system requirements. Is there seriously some sort of scheme between Nvidia and devs to force us to buy 40 series GPUs? DLSS is amazing, but I'm getting so sick of it being used as a crutch. Also, going back and replaying Cyberpunk just this week, I don't believe anything looks better than that game, and it actually runs really well. So again, confusion. Control uses the same game engine, looks incredible, and I maintain over 120fps in that game with all the insane destruction and particle effects. Someone make it MATH! I wish I had started a YouTube channel 4 years ago when I planned to, to do exactly what you're doing now.
Just shitty YouTube bitrate. All games look far worse than IRL.
The hardware cost of improving graphics isn't linear, as we get closer to true photorealism we need to rely more and more on fully simulating graphical elements, which is incredibly taxing. Many of the tools and tricks that developers used to approximate stuff like lighting, shadows, and reflections are no longer relevant now that the bar has been set for more complex techniques. Consoles are also much closer in raw power to most PCs these days, and that means games are being designed from the ground up with a higher hardware target in mind. There's also the whole pandemic thing, dealing with the transition to remote work, and the general unrest in most labor forces around the world, especially in creative fields. So there are many reasons that games have been releasing with less than stellar optimization, and I don't think any of them are an Nvidia conspiracy theory.
I also have to emphasize that writing good software, and especially something as complex as a videogame, is really fucking hard. I don't know if you've ever programmed anything, but if you have you'd know that even something as simple as a calculator app can be a challenge. I sincerely doubt that the people actually working on these games were incompetent or lazy, far more likely they simply didn't have enough time or money allocated to solve the incredibly complicated technical challenge of optimization. And yeah, it would be great if devs got infinite time to polish their games, but sadly that's not how it works when you're beholden to shareholders. I'm sure they've run the numbers and determined that they'd lose more money with a delay than releasing the thing unpolished.
@@myel8531 All your points above are totally valid and predominantly true in most cases. The issue I have is when older titles with better visuals and more complex game worlds run significantly better than a newer title (AW2) on a seasoned game engine that should be mature and optimized at this point, since the dev team knows how to get the most out of it. So when I see 55fps on an RTX 3080 using DLSS at 1440p in a tiny forest environment in Alan Wake 2, versus the open-world ray-traced city of Cyberpunk, I can't help but scratch my head at how one is doing it so much better. So, like one of your points: either not enough time to optimize, or maybe not utilizing engine assets intelligently. Another huge frustration is that actual game settings have less and less impact on performance scalability, with the only settings that have a big impact being some form of ray tracing or image reconstruction.
Also, the current consoles are nowhere near the power of current PCs. Closer than in the past, especially compared to the half-step consoles of last gen, but a 3080 is practically 65% more powerful than the Series X or PS5. It's like 30 TFLOPS vs 12 TFLOPS. I KNOW, NOT EXACTLY THE BEST MEASUREMENT.
I guess I just want well optimized and beautiful fun games. I'm sick of having to upgrade my pc 2 years later just to finally experience all my favorite PC games fully maxed out and above 100fps at all times.
Let's not even talk about 4K. 4K is basically a lie in 90% of games and only achievable with image reconstruction or dynamic res. Yet we've been acting as if 4K gaming has been a thing for years. Maybe in Doom Eternal or rare gems like that. But try that in Alan Wake 2 or Starfield and your 4090 slows down to sub-30fps. THAT'S NOT HOW THIS SHOULD WORK!!! Especially since Starfield looks like Fallout 4 with some lipstick.
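The raw-spec gap cited a couple of comments up can be sanity-checked with quick arithmetic. This is a sketch: the TFLOPS values are approximate public FP32 figures, and Ampere's dual-issue FP32 inflates its paper number relative to real-world gaming performance, which is why the practical gap is closer to the ~65% mentioned than the paper gap below.

```python
# Paper-spec FP32 throughput comparison (approximate public figures).
# Note: Ampere counts dual-issue FP32, so its TFLOPS number overstates
# real-world gaming performance relative to the consoles.
RTX_3080_TFLOPS = 29.8
SERIES_X_TFLOPS = 12.15
PS5_TFLOPS = 10.3

def percent_faster(a: float, b: float) -> float:
    """How much faster (in percent) spec a is than spec b."""
    return (a / b - 1) * 100

if __name__ == "__main__":
    print(f"3080 vs Series X: {percent_faster(RTX_3080_TFLOPS, SERIES_X_TFLOPS):.0f}% more TFLOPS")
    print(f"3080 vs PS5:      {percent_faster(RTX_3080_TFLOPS, PS5_TFLOPS):.0f}% more TFLOPS")
```

On paper the 3080 lands around 145% ahead of the Series X, so the "65% more powerful" figure is a real-world estimate, not a spec-sheet one.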
Good analysis. Next time, try maxing out the volumetric lighting too, which stays on medium even with the high preset.
So, a game requiring a $2k GPU and aggressive upscaling to approach 60 fps at 4K with high settings + RT? And requiring frame generation to actually get there?
Yes, because it's doing things we wouldn't be able to do in 10 years without AI. You don't get it.
I'm starting to think that all the game developers are only catering to content creators with their high-end computers. I, for one, won't even buy games like that. Why should I play games at 40 fps when there are so many older games that usually look even nicer and run at 100fps+?
Superb video, thank you!
Very underrated channel. You remind me of a young ThioJoe.
Even with the low settings looking that good, it doesn't look "below 30 fps on a 1070" kind of good. Like, seriously, it doesn't look much better than RE2R or RE3R, or RDR2 for that matter, and all those games run great at 1080p on a 1070. What the hell is this game doing that its low settings are so demanding?
I wonder how a 3090 or 3090 Ti would compare to the 4070, since they trade blows, yet the 3090 and 3090 Ti both have a ton more VRAM.
I'll buy this in 10 years when it is playable.
By then it will also be outdated. It's like playing Crysis 1 today.
It's playable now. Did you even watch the video? The 10 series is almost a decade old. They make games aiming for console specs, so maybe you should just get a console. PS5 has been out for 3 years now.
My bro 🐐🫂
@@itsaUSBline If I wanted low, crappy frame rates, I'd buy a console. I spend good money for good performance. This is hot garbage.
@@itsaUSBline Bro, Digital Foundry just uploaded a video talking about the performance on console and it's really bad, wdym? Stop defending badly optimized games.
Good video, thanks!
I'm curious how a 5700 XT would perform.
My computer uses a 1070 graphics card. After seeing this test, I decided I need to budget for a new computer. It's time to give the old one to my sister.
I sigh at how fast time flies. It seems like only yesterday that I was given this computer.
Why aren't you testing the 4070 at 1440p instead of 1080p? It's not a 1080p card.
Power usage on the GTX 1070 is suspiciously low (jumping between 70 and 100W with occasional 140W peaks for a 150W TDP GPU). It looks like it's not being properly utilized (even though clocks and load % look okay), which IMHO hints at lacking driver support/optimization.
Not sure it would make a big difference if it were fully loaded, but I found that interesting.
It's because it doesn't support Mesh Shading.
I always wondered whether it was the hardware OR software that was holding up advancements in gaming. I think they should push the boundaries, but only if there are perceivable benefits.
Love games like this where I can slow down and enjoy the visuals, especially with RT. At what point would it be worth it to turn on frame generation in games like this? And what preset/settings are you running to achieve your desired framerates? thanks
I wonder how 5700 xt does in this game. It should theoretically outperform the 2060, but the question is if missing mesh shader support makes it perform worse.
Sadly yes, I saw someone who tested it saying that in this game the 5700 XT is way below the 2060.
RIP my 5600 XT :/
@@DescomplicandoHW This is, unfortunately, the future of how a lot of devs will release games. The console RDNA2 hardware is now the minimum. That means any graphics card older than what the XBSX/S and PS5 use will not run new games well. It doesn't have to be this way, as the Ratchet and Clank PC release illustrated, but that seems to be the business decision a lot of these companies are making.
@@hueypautonoman One game uses a feature not supported by RDNA1 and suddenly you think that this "seems to be the business decision a lot of these companies are making"?!
I would suggest an alternative theory, which is that most AAA games aren't going to require mesh shaders, or other features not supported on RDNA1 or the GTX 1000 series for proper functionality over the next two years. I would guess that a relatively small number of games which don't work properly on those older architectures will be released over the next two years.
Games which really push hardware hard like Alan Wake 2 have been extremely rare in the last 10 years. There does seem to be more of it in the last couple of years, but it's still extremely rare for a game to not work properly on an architecture that's only 4 years old.
@@syncmonism It's not just mesh shaders though. A lot of these new UE5 games are also listing PS5-level GPUs as their minimum system requirements. I'm talking about the performance level. So, a 5700xt might be able to run some of these new games, but less powerful GPUs will be below the minimum.
I even mentioned Ratchet and Clank to clarify that it's not being done across the board. Some companies are just making that business decision, probably to cut costs, since optimizing for a wider range of hardware takes more manpower, and they're greedy capitalists.
I am using a 5700 XT and was getting about 18-20fps at 1080p with all settings pinned to LOW. I refunded the game, as it is just unplayable at that frame rate. I love my 5700 XT though; it looks great in all other games.
I think Remedy made a typo in the recommended specs for the RTX 3070. They wrote 1080p+DLSS Performance but they probably meant 1440p+DLSS Performance.
I don't think so. Look below: it's recommended for 60 fps, while the one for the 3060 is recommended for 1440p 30 fps gameplay.
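For context on what these presets mean internally, the render resolution implied by each upscaler quality mode can be estimated from the commonly documented per-axis scale factors. This is a sketch; the exact factors can vary by game and by DLSS/FSR version.

```python
# Commonly documented per-axis render-scale factors for DLSS/FSR 2
# quality modes. Exact values can differ per game and upscaler version.
SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Approximate internal render resolution for a given output
    resolution and upscaler quality mode."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

if __name__ == "__main__":
    # 1080p Performance renders internally at roughly 960x540;
    # 1440p Performance at roughly 1280x720.
    for out in ((1920, 1080), (2560, 1440)):
        print(out, "->", render_resolution(*out, "performance"))
```

So 1080p + DLSS Performance means an internal render around 960x540, which is why a 1080p Performance recommendation for a 3070 reads as unusually low.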
Yeah... looks pretty impressive. Getting strong Stephen King vibes from that small country town and misty wooded hills.
Is 1080p the lowest native resolution you can go, or can you select 720p with a render resolution of 540p to squeeze out even more performance? I wish to hold onto my 1070 a bit longer, since the computer cost me close to $2,000 back in 2018.
I don't know where you live, but RX 6600 XTs have been as low as 220 USD. 6700 XTs have been as low as 300. 6800's have been around 400. 7800 XTs can be found for 500, and RTX 4070's can be found for 550.
If you are willing to pay 500 or more, then you might want to wait for the RTX Super/ ti refreshes to come out. That should help to push prices on current hardware down more, and then hopefully there will be a 16GB version of the RTX 4070 (with more memory bandwidth as well), which would be really nice.
I'd get it if the game looked next gen++, but... is it just me or everything looks extra shiny like early UE3 games that have their specular maps cranked to eleven?
The materials look low-detail and rubbery too: take a screenshot of a rock and a tree trunk and they're borderline the same! Wtf is happening?! Where does all the performance go all of a sudden? There has to be a reason, since I doubt every major game developer studio just forgot how to make games at once.
36:19 Basically you need an RTX 4090 just to run 1080p native Ultra at a stable 60 fps. PC Master Race? *4na1 Wake*, more like. :D
Some testing on AMD 6000 - 7000 would've been appreciated. Nice work though!
Here I was thinking to myself upgrading from a 3090ti to a 4090 was overkill, but damn, it really made a lot of a difference in playing recent titles in higher resolutions, settings, RT and with a bunch of mods, Reshade, etc. Still, I would not have expected games to become that taxing that fast after the release of the 40/70 series and new consoles. I do expect it to slow down at this point for the next years, since games still have to be able to run on a PS5/Xbox series s/x with reasonable settings.
It's impossible for me to even get a 3090 Ti.
Remedy is my favorite developer. I am playing through Control again right now with RTX on and high settings, it looks absolutely insane for a game that has been out for a while, can't imagine how good this looks.
Control is gorgeous. Also one of the best games ever made imo. The world design and storytelling is insane. The combat is great. It's basically perfect.
@@zaidlacksalastname4905 No it's not. Graphics are the only part of Control that's better than average. "Perfect" is a way overexaggerated way of saying "I don't play many games and don't have taste, so it amazed me."
It would be good to see how AMD cards work in this Nvidia-sponsored game.
Prob still fine. Depends on the driver team like Starfield’s abysmal NV performance.
Oh, there's probably a reason he got early code to review performance without AMD present!
Fine without RT
Would be cool to see a 5700 XT 8GB... EDIT: just checked, not a single AMD test on YouTube lol
Works just fine on AMD using Mesa on Linux through Heroic. Can't comment on Windows performance, but it's probably fine.
I'm not surprised about RDNA1 missing out. The lack of hardware RT/VRS support was bound to bite them in the ass eventually.
Darn, was waiting on that 4080 testing. Great video though!
The hell with this DLSS/FSR nonsense.
It should only be needed for older or entry level GPUs, not so $800 cards can barely hit 60fps at 1080p.
Any game developer that thinks this is acceptable should be ashamed of themselves for releasing what amounts to a broken game.
The game should have an ultra-low/potato mode, since the current low settings don't seem to reduce the performance cost or the aesthetics by much.
As he mentioned in the video, the low settings are likely targeting consoles, and since that means targeting the PS5 and Series X instead of the PS4 and Xbox One, the bar for low settings has gone up. Basically, consoles improved faster than most people's PCs did.
@@itsaUSBline Because people will buy a shitty $400 GPU that can't run anything, instead of a console priced the same as that card...
@@itsaUSBline Yeah, low now is basically 2080 performance. Maybe even 2080 Super.
@@ancientflames Yep, consoles are the minimum now, and they have RTX 2070S/2080-level GPUs in them. It is not the devs' fault that PC players didn't upgrade. If your GPU is weaker than an RTX 2080, then you have no right to complain considering how good the game looks. People on GTX 1000 series or RTX 2060/2070 cards complaining are the same as PS4 players complaining about why Spider-Man 2 is not on PS4. Time goes on and hardware improves. You either upgrade and catch up, or get left behind and play older games instead. I am tired of hearing the word "unoptimized" being thrown around like nothing.
@@yusifalizade3394 Some games do look mediocre and perform like ass though even on €600 cards.
The best way is to wait 3 or 4 years after release. By then the hardware has become cheaper and affordable enough for the games.
I hoped that FSR and DLSS would break the rule. I loved the time when upscaling was available and games were not much more demanding. It was nearly possible to run new games like Cyberpunk 2077 and Forza Horizon 5 on iGPUs. On a 5700G the games ran acceptably well, like 40-50fps with FSR Medium at 1080p.
Too bad that game studios found out about the benefits of upscaling and use it in place of optimization. So a game that would usually run at native 1080p now runs at 1080p with FSR/DLSS Performance and needs the same new hardware.
Got a 3070 Ti and I'm happy with that fps. Can't wait to play it!
I don't get it. This game does not look good enough to bring a 1070 or a 2060 to its knees. At all. You have to have a $600 video card to even get TEN frames over 60fps at 1080p. In 2023.
I don't get it.
The way some people were talking about it, lol, I thought my mind was going to implode just by looking at the graphics.
Benchmarks were released showing the 4070 Ti hitting over 80fps at 4K using frame gen and DLSS Quality, which is better performance than Cyberpunk with RT Overdrive 😂 People are mad because they don't understand the cost of path tracing.
Sure, but how many people are playing Cyberpunk with RT Overdrive? It feels like all of the performance optimization in this game has gone towards RT, which is not the right way to use those man hours if you ask me.
Was gonna comment "careful, AMD fanboys are gonna attack you," but then this huntet guy shows up. Yep.
@@Eleganttf2 I don't know who you're trying to suck up to here, Nvidia or Remedy, but I can assure you neither of them will be impressed.
Amazing video.
I would like to see comparisons between similar GPUs, such as the RX 5700 XT and the RX 6600/XT, or the RTX 3060 and GTX 1080, to see what kind of difference mesh shader support makes.
And/or the classic AMD crying in path tracing