Should You Buy A 4080 Super for 1440p Gaming?
- Published 4 Jul 2024
- Is the 4080 Super a waste when gaming at 1440p? Is it overpowered, making it an overly expensive choice when compared to more sensibly priced cards on the market?
We’ll be testing a variety of games at 1440p, including Diablo 4, Cyberpunk, Starfield and Gray Zone Warfare to see if a 4080 Super can be maxed out when pursuing a high end, high refresh rate experience at 1440p.
▼ My PC Specs and Gear ▼
DISCLAIMER: I may earn money on qualifying purchases through affiliate links below:
CPU: AMD 5800X3D amzn.to/49giGm5 (affiliate link)
GPU: Gigabyte Nvidia 4080 Super amzn.to/48fzIQj (affiliate link)
RAM: 32GB Corsair Vengeance RGB 3600 CL16 amzn.to/469E7nM (affiliate link)
Motherboard: MSI B550-A Pro amzn.to/46ez45C (affiliate link)
SSD: 1TB Samsung 980 Pro amzn.to/3LHvg4D (affiliate link)
PSU: Corsair rm850e amzn.to/4byHacl (affiliate link)
Case: NZXT H510 amzn.to/3RDZyZP (affiliate link)
Thanks so much for taking the time to watch this video. If you liked the video, please consider leaving a like, it helps the channel more than you can possibly imagine :)
00:00 - Introduction
01:03 - Diablo 4 - RT Ultra
02:10 - Diablo 4 - RT Ultra - DLSS Quality
03:49 - Diablo 4 - Ultra
04:20 - CoD Warzone 3 - Extreme Preset
06:30 - CoD Warzone 3 - Ultra Preset
06:59 - CoD Warzone 3 - Balanced Preset
08:39 - Forza Horizon 5 - Extreme Preset
09:41 - Gray Zone Warfare - Epic Preset
10:48 - Gray Zone Warfare - Epic Preset - DLSS Quality
11:58 - Gray Zone Warfare - Low Preset
12:50 - Gray Zone Warfare - Epic Preset - DLSS Quality - Frame Gen ON
13:47 - Starfield - Ultra Preset (No Upscaling or Dynamic Res)
14:31 - Starfield - Ultra Preset - DLSS Quality (No Dynamic Res)
15:09 - Starfield - Medium Preset (FSR 3)
16:27 - Cyberpunk - Ultra RT (No Resolution Scaling)
17:32 - Cyberpunk - Ultra RT - DLSS Quality
17:59 - Cyberpunk - Ultra RT - DLSS Quality - Frame Gen ON
18:31 - Cyberpunk - Ultra (No Resolution Scaling)
19:04 - Cyberpunk - Ultra - DLSS Quality
19:32 - Cyberpunk - Ultra - Crowd Density Low - DLSS Quality
20:32 - Counter Strike 2 - Very High
21:50 - Counter Strike 2 - High
22:50 - Counter Strike 2 - Low
23:54 - Halo Infinite - Ultra
24:30 - Closing Thoughts
#4080 #1440p #4080Super - Science & Technology
Not even a 4090 is overkill for 1440p anymore when you've got high refresh rate displays and super GPU-heavy stuff like ray tracing
I think the 4090 barely hits around 25 FPS with path tracing in 4k. RT and PT are still too much for the hardware of this generation.
@@Greenalex89 ye. RT is cool and all but I say it's still 2 GPU generations away from maturing properly in order to be consistently worth using on mid to high end tier GPUs. Path tracing is literally a benchmark at this point, that's at least 3-4 generations away. Both RT and PT are cool but they're usually still not worth the performance hit
@@Greenalex89 exactly! People forget about PT, and PT is literally the future. Honestly though, it's not that far from possible BECAUSE OF frame generation!
As long as the cards can pull off 60fps, FG will bump it to 120fps. So I really am expecting real PT performance in the next 3-5 years
All depends on what kind of monitor you're using and what games. I have a 1440p 360Hz monitor, so an RTX 4090 is perfect for that
4K native / 8K native is what I care about most. Path tracing etc. is cool, but I can live without it. I don't want to rely on any DLSS/FG tricks; native is always best and you can tell the difference.
I'd rather have an overkill card now so that down the line it's still a good card.
Makes sense to me. Pay double now, you get a premium experience and no need to upgrade any time soon.
I would disagree. Hardware (and, more concerning, A.I.) advances too fast at this point. It's also easy to just sell your old hardware online. There's really no point in getting something you don't actually need nowadays.
Basically, buying more than you need today, banking on the idea that it will last multiple generations, is a hard gamble and just isn't worth the risk of an actual potential loss
@@ThisIsExile91 First, why is the 4080 Super not needed? We are talking about gaming PCs; a gaming PC is not needed xD. Seriously, it is going to be a premium experience, really nice fps, and the card will last way longer than a 4070, for example.
2. I think it depends on how long you want your PC to last. For example, if you expect your PC to last 6 years, you cannot expect it to run games smoothly in 4 years without headroom now, so you probably want to buy something more powerful than what's just enough right now. That way it will be a premium experience for 4 years, and 2 years of an ok experience.
@@daniil3815 you completely missed my point. You can't predict how long a PC and its hardware are going to last.
If you are getting about the same fps with a 4070ti and a 4080s at 1440p with DLSS on, it's objectively not worth it
@ThisIsExile91 I went from a 1080 to a 3090 Ti, so going premium for the longer term is worth it. I'm considering getting one only because I can put my current card in my kid's rig.
I run a regular 4080, it is a 1440p beast, I run max everything in nearly every game I own, super happy....
Raw performance in every game for me...
I do not enable any type of frame generation software...
The 4080 is plenty powerful, make sure you match it with a good enough chip...
CHEERS!
Same here; mine also runs undervolted and silent, using max 220W and delivering a capped 144fps everywhere
No Overdrive path tracing in Cyberpunk? What an insult to the game.
Yeah, waste of a 4080.
Yeah, and path tracing is much faster than ray tracing!
@@MrBalrogos much faster? No, it needs even more power
@@Gamba16374 r/woosh
Aye. I got the 4080S to play Cyberpunk with PT in 1440p :)
IMO, it's not an overkill, but rather a 'piece of mind' view set. You have piece of mind knowing that which ever game you install and play you can crank up the settings to max for the next couple of years.
Exactly this. With a 4080 Super you’re covered on any game in existence at 1440p at max settings plus if you ever want to bump up to 4k gaming the card can do that too.
@@RedShogun13 Not true , plenty of games struggle even at 1440p with a 4080s. You HAVE to use dlss for many games
@@Dempig 4080+DLSS or even Frame Gen is still a 4080 using its own feature though imo. Personally I count that as the GPU being able to handle it as it’s coming from the GPU. I’ve played Cyberpunk at 1440p native on my Pc with a 4080S and 7800x3D. I really think it would handle anything at 1440p with or without DLSS fine depending on what you deem acceptable FPS. For example if you’re someone coming over from console gaming that feels like 60fps minimum is perfectly fine then I can’t think of a game without path tracing (maybe Alan Wake 2? Idk) that a 4080S wouldn’t be able to handle native at 1440p where it would be hard to play/unplayable. I reserve the right to be mistaken but this has just been my experience with it. Like if your point is that it won’t hit every 1440p game in existence at 120fps native than you may have a point that I agree with but like for me personally I don’t mind approximately 60 or 90 fps depending on the game and depending on what the graphics fidelity is like.
You mean "peace"?
@@jocerv43 I did indeed.
Takeaways for me:
- 5800X3D is an amazing chip. Got one paired with a 6800 XT right now and it's great to know that I've got so much headroom to go in terms of GPU upgrades. 4080 Super is more than 50% stronger and it looks like this CPU will handle it just fine for 1440p :)
- Gray Zone Warfare and Starfield have no excuse to run this poorly lol
Also did you undervolt your 5800X3D? Mine needed undervolting to actually boost close to its advertised 4.5 GHz. Out of the box it ran very hot and throttled like crazy. Now it only goes past 80 C in stress tests, which should be a heavier load than Starfield. I have a Noctua NH-D14 cooler on it.
I revived my AM4 platform with a 5800X3D, 32GB of 3600 CL16 memory and a 7900 XT. Just going from the old 2600X, to the X3D CPU- was almost like the difference between a HDD and an SSD for responsiveness and the overall smoothness of game play. The 7900 XT just took it to a whole other level. Most games are 1440p, high or maxed- and giving me at least 120 of the 240Hz my monitor does.
is your 5800X3D water or aircooled?
@@gzeus7534 air cooled. The NH-D14 is beefy though, it's like the second-best air cooler Noctua makes I think
Isn't past 80 C high? The max temp for AMD CPUs is only 90 C.
@@Magentak it plateaus at like 81 in stuff like AIDA that loads all cores at 99%, and my 3dmark runs usually report average temps in the high 60s. I think it's fine. It holds 4450 in games.
OP was hitting low-mid 80s in Starfield which is probably not good and could benefit from undervolting
What kind of monitor have you got for this setup?
What do you recommend for triple 1440 144hz? There is a ton of us running triples on our sim rigs.
get a 4070 or 7800xt (:
Running a regular 4080, 32 gigs at 3200MHz, and a 5900X; I use an LG 65 inch C2 for a monitor. It runs almost every game I throw at it at maxed settings, 4K 120Hz. Since I don't play competitive shooters, I'm not missing out on any frames. Totally happy with the setup.
is that coil whine in the background?
You briefly mentioned testing with the 5600X and it sounded like you were saying the CPU was a bottleneck for ... ? I have a 5800X and wonder what your thoughts are on how it would compare to your 5800X3D in terms of CPU usage with that 4080 Super?
Ah yes, I have another video where I run the 5600X with the 4080 to see if it bottlenecked it.
So based on testing the 5800x3d and the 5600x both with the 4080, do you have an opinion on how well a 5800x will do at 1440p?
What air cooler do you use for the CPU? I bought an R7 5700X3D and I don't know which cooler would be good for it.
An AK620 is pretty good, make sure your ram will fit. I had to raise one of the fans to fit.
Is it fair to say that turning on DLSS usually (always?) increases CPU usage in a noticeable way?
It may allow your CPU to work harder if the GPU limit is decreased due to DLSS
I use a 4080 Super for Full HD, what are you talking about, what overkill? I have a 180Hz refresh rate and I play at an average of 200 fps, but some games like Cyberpunk with path tracing run at 60 fps without frame generation
18:27 why haven't you tested RT Overdrive/path tracing?
4080 does 70fps in CP2077 with path tracing at 1440p w/ DLSS quality, there's your answer
is a 4080 super/4090 card & 7800x3d enough for gaming for 5yrs+? i only play destiny 2 and yes i know it's an overkill but i do stream as well and don't wanna think abt upgrading or anything for a long time. rn ive got a 3090ti and a 540hz screen thinking of getting a 1440p second monitor
Keep your 3090 ti...
Wait for 50xx series, keep the 3090ti, trying to upgrade from flagship to next gen is a complete waste of money considering the 3090 is going to give anything this gen a good run.
Can someone explain to me how you're getting 156 average FPS in Warzone 3 with the extreme preset at 1440p while I'm dropping to like 30-40 fps?
Dude showed us what we actually wanted to see. Real-world realtime results, not just a bunch of graphs. Thank you, dude. Bought my 4080 Super OC TUF a couple months ago, and looking forward to getting my build done. I'll be doing 1440p, and never with ray tracing
Currently rocking an RTX 2080; it plays everything I need. Yes, I may not be able to toggle the highest settings, but it's fine for now. I actually risked getting Cities: Skylines 2, and with the patches it runs ok now. I'm going to hold off till the RTX 5080 comes out, and then I might grab that, or a 4090 on sale if the prices are insane.
just swapping to a 7800X3D will give you a substantial boost in fps ;)
Yup !!
Regarding the 5800X3D, I would throw a -20 all-core undervolt on there in BIOS; I saw it hitting TjMax at 90 C a couple of times. However, I have to leave mine at -17 for max stability.
What cooler are you running?
@@anitaremenarova6662 Noctua NH-U12A
I have an RTX 4080 and I don't see it being able to hit 144fps at ultrawide 1440p without DLSS & FG.
I run a 4090 on 1440p ultra wide and I don’t always get 144hz. I like to try to hit the 100hz mark as that’s generally smooth enough for me
7800X3D, 4080 Super, UW 1440p: I get 90-100fps in Cyberpunk with maxed graphics, ultra ray tracing and path tracing on, with DLSS Quality and frame gen, and it runs smooth. I do feel slight lag from the latency, but it's not bad and definitely playable
Will 3900XT give roughly the same performance?
Not even close
How does he enable the readings in the top left?
MSI Afterburner
Yes, it's perfect for 1440p, but I will say benchmarkers have confused the community with comparisons. It's always ultra everything, jacked way up, and people go "oh, so it's not doing 4K amazingly." I'm like, try it yourself. Use high mixed with ultra and it gets insane performance. Use it at 4K on high and it just gets insane fps. Then most show it with ray tracing etc. So just keep in mind the 4080 Super is a monster 4K gaming GPU. I don't know when this new age went all ultra and ray tracing, when 4K high is still better than 1440p and it looks and feels fantastic. Turn ray tracing off and it's just outstanding. So guys, if you get this or even the 4070 Super and realize "wow, this is actually good at 4K," that's because, well, you can adjust settings! Just like we always have. It was never judged on just ultra, yet this modern era is. It's ridiculous. Do what we always have: mess with settings and get your mind blown.
I'm a sim racer, and ACC, the most modern AAA sim racing title, is pumping out 80fps on ultra, and over 100fps on high. Then with DLSS I was able to get 140 to 150fps.
Also, who wouldn't want to play 4K on single player games? But yeah, for multiplayer use 1440p and enjoy. None of these cards are overkill for 1440p, but many do better at 4K than some may show. I'm very happy whenever I can use even high settings at 4K and get 80 to 100fps.
Once I realized how well I perform at 4K in my hobby, I can't go back. It hasn't affected me at all, besides enjoying the image more.
The only thing holding me back now is that I'm CPU limited. So 1440p is hard on me; I can do 4K better because my CPU utilization goes way too high at 1440p. That's on me though. It's what made me realize just how good these 4070s and up are at 4K.
I would never suggest 4K for competitive shooters.
I got a 7900xtx for 1440 because it gives me plenty of vram and I won’t have to replace it for 5 years.
If anything, after doing a crude ton of tuning on my gaming laptop, I was able to get a 3DMark benchmark score of around 8751, which put me at number 5 in their free version
What is the buzzing noise I'm hearing in your mic ? I hope it's not coil whine from your FE card because it would drive me absolutely crazy.
Seems fine the 4080 super.
I'm building a new gaming computer soon and I'm looking at a 7800X3D and either a 4070 Ti Super, or I might go for a 7900 XTX. But I mostly play CS2 and Valorant, and it seems that many have issues with AMD cards and CS2.
They dont have issues with amd cards
They have had issues at certain points, so if that's all you play I'd go for the 4070 Ti Super, but if you play literally anything else, like story games, I'd go 7900 XTX as it's much faster
@@An-intricate-man what issues have they had other than the Anti-Lag+ thing?
@@stefannita3439 yeah now that you mention that’s really it lmao and hasn’t had major issues for years
I have the exact same setup, RAM at 3200 CL16, and I get the same type of results. I'm using 2 monitors: one is a 240Hz LG OLED and the other is a 27 inch 1080p monitor. It works like butter.
Solid video.
Well I have a 240hz monitor. People should stop testing everything for 60 fps and claiming overkill for anything above. Monitor technology changed a lot too.
A 4080 Super is notably faster than my 7900 XT. But I have little trouble either maxing out my monitor's frame limit (240Hz) or maxing the settings in game while still maintaining good frame rates.
Not necessarily true; an overclocked 7900 XT can basically match the card and even outdo it in certain titles. If you don't care about RT it's a really good alternative.
Maxing AAA titles' graphics settings, you aren't getting 240 fps at 1440p or 4K with a 7900 XTX, let alone a 7900 XT. Not even a 4090 can pull that off. If you are using FSR and dial back antialiasing a bit, maybe.
Unless it’s an fps shooter, yeah 240hz on max 1440p/4k ain’t happening. My XTX can do 240fps on max on shooter games, but not big AAA single player titles. Sure it does well but 240hz on max settings is still a generation or 2 till then I’d say.
@@anitaremenarova6662 you mean.. 7900xtx?
@@beilence No, I mean 7900XT. AMD cards can be overclocked a lot higher than nvidia ones.
I main a 1440p monitor with a 1080p monitor as a supplemental. I also have an LG C3 and would love to play the upcoming Elden Ring DLC at native resolution. That's why I've ordered a 4080. I didn't get the Super because there was a hefty discount on the OC Vulcan version. The 4090 sounds great, but I can't justify the price to myself.
Counter Strike 2 has a FPS workshop. It's consistent and I recommend you use this for valid FPS benchmarks for this title.
My 3090 runs WZ @ 120ish extreme in 1440p. GZW epic @ 75fps with DLSS quality. I can't quite justify the 4080, I think I'm going to aim for a 4K setup, with a 1440p second panel for crushing fps.
I appreciate seeing the performance in these games!
Upgrade your CPU then. I can't play at 1080p because my CPU is the bottleneck for my GPU at 1080p when I turn everything to ultra and max out, so I have to play at 1440p. Get the 7800X3D
Too expensive for me so i copped a 4070 super off of ebay
Still failed; I would go with the 7900 GRE for 500 USD
@@hasiktak1544 I wanted the rtx performance for cyberpunk. It cost me £510, dunno what that is in usd.
@@UnimportantAcc That is a lot of grease burgers in US money
That is a very nice card, but they messed up with the 12GB VRAM. So if you want Nvidia features and a nice card, go for the 4070 Ti Super. The 4080/4090 are overkill for 1440p in my opinion.
If you can afford any card then at that point none of this price to performance crap matters. 4070ti s, 4080,4090,7900xt,xtx they will do great to have fun and enjoy gaming for a long time.
@@John562Shalom yes bummer abt the 12g vram. I tend to play w settings on high and not ultra though, I turn down anything not making a noticeable difference.
The cost for a Ti Super was almost £200 more; I just couldn't justify spending 40% more for 15% more fps. Nvidia really went full Apple with this product stack 😒
I thought the monitor refresh rate capped the frame rates?
Caps what you can see, but not what is rendered. There's a school of thought that even on a 60Hz screen, super high frame rates have a benefit, as the frames you see are newer frames, if that makes sense?
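The arithmetic behind that school of thought can be sketched quickly. This is an idealized model (it ignores vsync queueing and scan-out timing): rendering faster shrinks the frame time, so whichever frame the 60Hz display grabs was finished more recently.

```python
# Idealized sketch: at higher render rates, the frame shown at each
# scan-out was completed more recently, so it reflects newer input.
def frame_time_ms(fps: float) -> float:
    """Time to render one frame, in milliseconds."""
    return 1000.0 / fps

# Worst-case age of the newest completed frame at scan-out is roughly
# one render frame time; the average age is about half of that.
for fps in (60, 120, 240):
    ft = frame_time_ms(fps)
    print(f"{fps} fps: frame time {ft:.2f} ms, avg frame age ~{ft / 2:.2f} ms")
```

In this model, going from 60 to 240 fps on the same 60Hz panel cuts the average age of the displayed frame from about 8.3 ms to about 2.1 ms, which is the latency benefit the comment alludes to; real numbers depend on the game's frame queueing.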
Kinda surprised that my 5700X + 4070 Super setup is doing about as well as this in some of the games I play O_O
Mainly Starfield, Cyberpunk and Diablo 4.
Would have expected this setup to do a lot more considering the price difference.
He is playing Cyberpunk with ray tracing ultra at 1440p.
@@adlibconstitution1609 correct i somehow failed to notice that
@@gmoddude12 Something is wrong with his card; power draw is nonexistent when it should be 280W and up.
I thought it's a 4K card; oh boy, we're getting robbed blind
It is a 4K card, but not at max settings at native 4K. It will run most games at max settings with 60+ fps at native 4K, and then there are DLSS Quality and frame generation in supported games at 4K to max out your games and get high refresh rates
@@TheOne214 It's such a blanket term, "4K card" or "1440p card" or "1080p card". What do people actually mean by that? A card's performance will differ from game to resolution to settings to FPS, and whether that FPS is good or bad to you depends on what you are looking for and your display.
I run my 4080 non-Super at 4K native max settings in Horizon Forbidden West at 60 fps vsync, no upscaling or DLSS etc. Runs super sweet; I don't notice any issues or lag. I do have a 7800X3D and 32GB RAM etc. though.
I've got a 13700K and a 4080 and easily play any game in 4K on high/ultra + ray tracing + DLSS Quality without frame gen, always 60+ fps
@@ivananetdoma8382 if you use dlss, you are not playing at 4k. It’s no different than a console for 500 dollars.
I'm having the same performance with my 3090 and 10850K CPU in Warzone. No need to upgrade; mind you, I'm at 1440p on a 165Hz Samsung Odyssey G5 monitor and still getting more fps than my monitor can handle... nice vid
I bought a GameRock 4080 OC for my 1440p Cooler Master 180Hz mini LED monitor. It was worth it, as I can max out every setting
I have a 4090 paired with the 5800x3d. I think it is fantastic combo. I can play anything at max settings native resolution of 1440p on an 34 inch ultra wide monitor, with zero upscaling. I capped my FPS at the monitor’s max refresh rate of 144hz. The only game I have to use upscaling is Cyberpunk. With that game, and full path tracing enabled, I peg the 144hz cap all day long.
The 5800x3d is a very capable CPU, paired with the 4090.
Your CPU is holding you back in GZW. I get around 130 fps with a 6950 XT and a 7800X3D in that game; natively around 90.
I got a 7800X3D but a 3440x1440 165Hz monitor, so no CPU limit; it will be good for a few years before cycling
Thats my problem with the 4080:
You essentially buy it for the RT experience (or you would have bought the 7900 XTX instead, since it's cheaper with more VRAM). But in the very few games with actual hardware-based RT, the 4080 just isn't strong enough (as seen in CP2077). And by using upscalers you basically degrade your 4080 to one tier below what you paid for: DLSS Quality at 1440p renders at 1707 × 960, DLSS Performance at 1440p becomes 1280 × 720 (720p), etc. Path tracing is even too much for the 4090 at native 4K.
Don't get me wrong, the 4080 is a nice card, but it can't handle its own features without decreasing the resolution or using artificial frames with other downsides.
I personally don't see much difference with RT on in CP2077 unless I stop and look very carefully. Most other games have software-based RT, which is even worse. So, for my needs, going for a 7900 XTX was a no-brainer, and the 24GB of VRAM is more than enough that I won't have to lower the resolution in 2-3 years, because with 16GB you will have to without a doubt (and I think that's intended by Nvidia).
Yes, the 7900 XTX draws more power, but come on, who cares? The difference is a joke, even if you play 8 hours a day, 5 days a week at full load. Do the math for yourself in your region. I undervolted and overclocked my 7900 XTX and it draws 280-320W at full load, never getting hotter than 50°C (65 hotspot), with about 3 fps difference in CP2077 compared to stock settings and 400+W draw.
So, imo, the only use for the 4080 is if you are an absolute RT enthusiast and willing to buy a new GPU sooner rather than later, or you work a lot with programs like Blender (and you are scared of the 4090 burning your house down).
If RT matters that much to you, go for a 4090; if you are ok with using upscaling, go for the 4070 Ti Super instead of the 4080. If you don't play the handful of games that use actual RT and/or you don't see much of a visual improvement, go for a 7900 GRE, 7900 XT or XTX for raw power and longevity. That's what I think.
But what does it matter what I think? It's your money! If the card suits your needs, that's awesome. Happy gaming y'all :)
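For anyone checking the resolution figures in the thread above: they follow from DLSS's commonly cited per-axis render scale factors (roughly 2/3 for Quality, 0.58 for Balanced, 0.5 for Performance; exact values can vary per game). A quick sketch:

```python
# DLSS renders internally at a fraction of the output resolution and
# upscales to it. Scale factors are the commonly cited per-axis values;
# individual games may differ.
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(width: int, height: int, mode: str):
    """Approximate internal render resolution for a DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

print(internal_res(2560, 1440, "Quality"))      # (1707, 960), as in the comment
print(internal_res(2560, 1440, "Performance"))  # (1280, 720), i.e. 720p
```

So DLSS Quality at 1440p is rendering a bit below 1080p internally, which is why it recovers so much performance.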
Same as my thoughts... AMD is better value... I don't like fake frames, so forget DLSS and FSR. And in raw performance the 7900 XTX is way better value, no-brainer. People will tell you DLSS is better than FSR and all, but dude, why would I even use it? I want the native experience
That's incorrect, DLSS *UPSCALES* from a lower resolution using AI. You're not getting any downgrades; the only downside is the occasional artifact, depending on game implementation. Cyberpunk RT Overdrive + DLSS Quality is incredible looking, and in most games you won't even need upscaling for RT.
amd copium lol
@@Zombie101 Look at 3070 8gb lol....fuck of a card nowadays with that 8gb of shit ram...Its own ram is its bottleneck while paying double the price...naah am good with my raw performance and more vram
@@user-tb2zk2ip4f "Look at irrelevant example" nobody cares, 8GB was too little in 2020 already and anyone who bought it for more than 1080p is a fool. We're talking about good nvidia cards here.
I got two 4090 for 1080 gaming max settings on Minecraft . Doesn’t everyone buy a 4080 for 4k gaming ultra settings
How? nvidia killed SLI
@@Fiwek23452 Nvidia didn't kill it. SLI killed itself.
Well, now try Hellblade 2 on an ultrawide 1440p monitor without FG or DLSS with your 4080 and let me know if you still think it's overkill lol.
DLSS looks better than native, especially when you mix it with DLDSR(you should try it!). what do you not like about it if I may ask?
I feel like tons of people are missing out on DLDSR combined with DLSS. I use it on a 34 inch ultrawide monitor at 1.75x DLDSR... it gets much better
Had a few issues in Warzone where some bushes started shimmering, so, thinking it was an enemy, I shot at it, but that revealed my position and I got sniped! I'll take a look into what you say, thanks
@@davidnott_ warzone is a bit different, instead of dlss I use fidelity cas. In warzone the cas works very well.
I got a 4080 Super a few weeks ago and judging by how much system requirements are getting increased every year for (AAA) games in a year or two it might be underpowered for 1440p. For me at least 60FPS is not "good enough" in terms of input lag so I always aim for at least 100-120FPS.
Something is wrong with your 4080; stock power consumption at 100% usage must be about 300W, not 180-200W. Your 4080 Super is not performing as it should. You probably set the power limit very low
It's at 280W in Cyberpunk; 1440p just isn't stressing it, and it might be a CPU bottleneck
Yep, this
But in Diablo IV with RT on, it was showing 230W at 99% GPU utilization... 40 series power usage is odd
*I watched another video of a 4080 SUPER playing D4 at 4K max, 1440p max with and without DLSS and RT. At 1440p they got 250-270 watt draw with maxed settings. But at 4K, they were hitting 270-320 watts.*
Looks like you're CPU bound a lot. Time to upgrade :P
currently using it at 1080p, saving to get a great 4k monitor
13900k with 4080 serves me very well with maxed out settings 2077 with pathtracing and mods.
now what about the 4070 ti super?
Would love to test it. If only money grew on trees!
I have a 6750 XT and can easily push 1440p and I was just playing Ghost of Tsushima in 4K with FSR 3 and a little frame generation.
Only game I've found to be more taxing was Alan Wake 2, but that games engine is pretty unoptimized no matter your hardware. Though it's the exception and not the rule.
Alan Wake 2 has built-in software-based RT, so you can't turn off RT even if you wanted to. That's why it runs better on Nvidia cards.
Yeah I'm aware of that, it is a really good looking game though, I'll give it that
@@Greenalex89 It doesn't; with RT off you get better frames on AMD cards.
I have a 165Hz 1440p HDR1000 monitor and a 55 inch 144Hz 4K HDR1500 TV (yes, it has all the gaming features for PC), so I want a 4080 for options. I have a 3070 build right now
I only got a 4080 because I found a good deal on it, and I just so happen to be stuck on 1440p lol
I'll get a 4K monitor down the line, but for now I'm chilling
I have a 4070ti paired with a 5800X3D and its more than enough for fps games. I have a 1440p 240hz KTC oled and i try to saturate the 240hz whenever i can
Should have bought the 7900 XT, a much better combo with SAM. Regardless, the 7900 XT stock is faster than the 4070 Ti.
@@UKKNGaming indeed it is faster. I bought the 4070ti in April of 2023 so the prices all around for the gpus sucked. But so far the 4070ti has been treating me well. I do prefer the more efficient power draw of the 4070ti though. But I mostly play fps games and it handles it well at 1440p. My last amd card was a r9 380x and it treated me well as a first pc build. I wouldnt mind going amd in the future. I love Amd cpus and plan to stick with them in the future
What if I want to play minecraft with mods
Short term, no. Long term, yes, because games will get more and more demanding. You will also be able to unlock more of its potential with a CPU upgrade down the line (since it's a CPU bottleneck). But again, an $1100 4080 is considered a long term card
Also, to add on: nobody will be able to buy a 5090 next gen because they will be sold as AI cards, so unless you've got a huge chunk of cash lying around, and considering your luck with availability, YOU'RE NOT GETTING ONE!
You were CPU temp limited; you need better cooling. As it approached 90C, the CPU would drop power draw and not go above 90W vs the usual 115W.
An overkill GPU at 1080p/1440p resolutions with FPS properly capped will run cool, quiet & energy efficient, & the 1% lows won't be a thing to worry about, because the GPU is just casually jogging along instead of sprinting full-tilt & has plenty of processing power in reserve to spare. Less stress, wear & tear on components.
I have a 7900xtx and a 5800x3d I cap my fps at 170 in diablo 4 and it consistently is at 170. If I uncap my fps it's around 300fps in 1440p ultra. Glad I went with amd lol.
Is the xtx much cheaper where you live? It's basically the same price as the 4080 super in my country, so the choice isn't very easy
@@Henrique-wv9xq over here xtx is about 120 or 150 euro less
Regular 4080 + 7900x for me is great for ultrawide 1440p gaming on newer titles at around 100-120 fps with higher settings.
It's an ideal gpu to pair with the new alienware 1440p 360hz OLED panel
EA WRC 1440p max settings = 65fps. 80fps with "car reflections" set to low. 150 fps with absolute minimum settings and DLSS ultra performance. Even WRC 10 i have to turn down graphics to get 200 fps (which seems to be the cap in this game). I got something like 380 fps average in Assetto Corsa benchmark (max settings), and in Assetto Corsa Competizione i get between 200 and 240 fps without DLSS and otherwise pretty good (not ultra) settings.
After using an MSI Suprim X 4080 for 7 months and playing almost 40 games:
It's 5 times as powerful as my old 1070.
It's very fast for my 1440p monitor.
It's future proof.
I paid 1525 USD for the 4080.
The problem is the 16gb VRAM.
Crazy how diablo 4 uses nearly 16GB.
Alan Wake 2 is also one that uses around 11GB. The Last of Us, around 11GB. Cyberpunk, 12.5GB. Quite a few games easily go beyond 11GB without RT enabled.
Imo it depends on the 1440p and your expectations; ultrawide 1440p is a lot more demanding than normal 1440p, and that's why I got a 7900 XTX
Even my 4090 doesn't hit my monitor's max G-Sync refresh in the games I play at 1440p
I play with a 4080 Aero at 1440p and record in 4K.
The 4080 can do this with no trouble.
I strongly recommend it
You're not even mentioning the fact that the 5800X3D bottlenecks the 4080 Super at 1440p and even at 4K.
The answer is - YES. 90fps is the bare minimum if you have a high-grade monitor. The difference with fewer fps (honestly, the difference when you can't hit 120fps) is visible.
Owner of a 7800x3d-7900gre- 34" 3440x1440 175hz oled screen combo.
You can see that in Cyberpunk: when he turns on frame gen, there's a massive difference in smoothness
I have a Sapphire Pulse RX 7900 XT 20GB and a 7800X3D; for playing at native graphics at 1440p it's overkill, and I think the card will last another 3 to 4 years.
It's not just the CPU; Unreal Engine 5 puts a big load on the GPU now.
9:51 me watching in 480p: "wow, insane graphics, beautiful"
RTX 4070 TI super, RTX 4070 super, RTX 4070 are the best cards for 1440p gaming
I have a 12600k/ RTX 4070 combo and it is working really well at 1440p.
Above 144Hz it's 8-bit depth anyway. I could run 165Hz on my screen, but every red would look orange again. Ain't nobody got time for that.
The 15.2gb of vram usage in diablo 4 tho D:
Yeah, that's what we said back then about the 3090. Look at where we are now.
The 3090 still holds up pretty well today; it's not like it's garbage now.
@@greaves8332 Never said it was garbage. People called it overkill back then; now it's just nice for 1440p gaming. The title is misleading; I'd rather have an "overkill" card now than have to upgrade every year just to keep up with the industry's standards.
Need to get myself a 40 series GPU 😅
Martonic! Loving your content lately, been quietly watching from a distance. Keep at it!
I just bought a 4080 Super paired with an i9 14900K and an ASUS PG34WCDM OLED 240Hz monitor. I hit my 240fps all day... I'D RATHER FUTURE PROOF.
I'm a huge fan of DLDSR with my 4080S. If I don't have the performance overhead I'll leave my monitor at 1440p, but in anything but the most demanding games I run DLDSR 2.25x to get a damn near 4K quality picture. Even better when the game supports DLAA. I'm playing Ghost of Tsushima rendered at 4K with DLDSR on my 1440p monitor, DLAA, max settings, getting a steady 120fps. Looks so damn good.
It's not pointless. Why would it be pointless when it takes advantage of a 1440p 240Hz display? Plus there are people out there who want to play with ray tracing at 1440p, over 100fps, in triple-A games.
Well, the 4080 Super is only around 50% faster than the rumored PS5 Pro GPU.
Heavy modern triple-A titles are heavy. (And yes, I know the reply of "In the games I play it runs at...")
First of all, a 5800X3D and 4080 Super combo is going to bottleneck; the 5800X3D is too weak for that GPU at 1440p 😂. A processor bottleneck is not good, and it's considered worse than a graphics card bottleneck. With a processor bottleneck, the processor's utilization reaches its maximum capacity, potentially causing other programs running in the background to suffer performance issues. This can lead to reduced responsiveness and efficiency in multitasking scenarios.
Additionally, due to the processor bottleneck, you may not experience the maximum performance your graphics card is capable of. The card's potential will not be fully realized, resulting in limitations in rendering graphics, frame rates, and the overall gaming experience.
Ray tracing in D4 is a big joke. It's so unoptimized, it hammers the CPU, and you can barely tell the difference on screenshots, let alone when the whole screen is covered in explosions and spell effects. Now that they've disabled Frame Gen (it was causing crashes), it's not even worth enabling on a 4090 at 1440p.
I turned RT off and it's just soo much more responsive and smooth at 144 FPS.
The RT reflections are very low resolution, btw, and cause a lot of stutter for no reason.
Imho the only real difference RT would make in D4 would be global illumination.
Mister... I have a 3090 and I get 28 fps at 1080p in modded Skyrim with ray tracing...
I use a 4080S to game on 1440p ultrawide. I still wish I had more. HD2 on ultra runs about 100 fps avg. MW3 MP runs between 170-200 fps on ultra. It's wonderful.
Nvidia still has ray tracing homework to do for the RTX 50 series.
I am cursed (blessed?) with framerate, resolution, and ray tracing blindness. I have a 2k 144hz monitor, a 4k 60hz monitor, and a system powerful enough to max out both. But if I'm honest I can't tell the difference between 2k, 4k, 60, or 144 fps or raytracing either for that matter. I can tell the difference between 1080p and 1440 but that's about it. Or maybe I just need glasses.
That's one hell of a low power target on your 4080S. My 4070 Ti pulls about 5 fewer frames at 1440p native in D4, but also runs at 280W, and up to 300W when overclocked. My first thought was a CPU bottleneck for you, but the 5800X3D should be leagues faster than my 11600K at stock. RT must not take much advantage of the X3D part of AMD CPUs at all.
My second choice after the 4070 Ti was going to be the 4080, but it was $500 CAD more... I could have also bought a 4060 along with my 4070 Ti for the savings. Not saying it's a bad card, it just wasn't worth the extra cost (and size) to me.
Me with a 9th gen i7 and a 1660 Ti, looking to upgrade to a 3080 Ti / i9 12900H 🗿
My 1440p 360Hz QD-OLED says it's worth it.
Today's 4K card is tomorrow's 1440p card, and eventually a 1080p card; then it's time to replace, rinse and repeat.
Why not just buy an RX 7900 XTX if you hate all the AI stuff? It beats the 4080 Super in all rasterization.