Will RTX 5090 Be Too Fast For Any Current CPU?

  • Published 22 Apr 2024
  • ► Watch the FULL Video: • DF Direct Weekly #159:...
    ► Support us on Patreon! bit.ly/3jEGjvx
    ► Digital Foundry YouTube: / digitalfoundry
    ► Digital Foundry Merch: store.digitalfoundry.net
    ► Digital Foundry at Eurogamer: eurogamer.net/digitalfoundry
    ► Follow on X/Twitter: / digitalfoundry

Comments • 659

  • @UnimportantAcc
    @UnimportantAcc 27 days ago +291

    Just replace the CPU with a GPU silly 🥰🥰

    • @tsorakin
      @tsorakin 27 days ago +12

      Like duh 😂

    • @vitordelima
      @vitordelima 27 days ago +22

      This is kind of possible but it takes too long to explain. IBM Cell was one failed attempt at it.

    • @jmssun
      @jmssun 27 days ago +18

      The more you buy, the more you save ~

    • @daniil3815
      @daniil3815 27 days ago +9

      Exactly. You have money for a 5090, but not for a CPU upgrade lol

    • @photonboy999
      @photonboy999 27 days ago +2

      @@daniil3815
      I think you missed the joke.
      Anyway, I remember Intel trying to go the other way with Larrabee, using stripped-down x86 cores to create a GPU. It was interesting, but predictably very inefficient for GPU tasks. So I'm curious WHY they put the money into it; I'm not going to just assume there was no purpose whatsoever based on my limited computer experience.

  • @byronfranek2706
    @byronfranek2706 27 days ago +129

    32" 4K/240hz OLED displays would be an obvious target for the 5090.

    • @xpodx
      @xpodx 26 days ago +13

      8k 120hz/144hz

    • @GatsuRage
      @GatsuRage 26 days ago +19

      Even a 5090 wouldn't be able to push 4K 240fps... unless you're only playing CS and LoL lmao, so I seriously see no point in looking at those displays yet. 1440p still makes way more sense for high refresh rates.

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 26 days ago +1

      That's all well and good, but I've just gotten a 4K 240Hz monitor, I have a 4090 and 5900X, and even games from 2015 are CPU bottlenecked; 240Hz should never be a target. Making 8K gaming doable seems to be what it'll be targeting, we don't need better GPUs for 4K right now.

    •  26 days ago +4

      @@GatsuRage Well, the more FPS the better. 4K looks way better than 1440p, and OLED is better than any other display on the market.

    • @xpodx
      @xpodx 26 days ago +1

      @GatsuRage I get 180-220 in CoD Vanguard with my 4090 at 4K max, no DLSS. But yeah, the 5090 won't be able to do Cyberpunk 2077 maxed at 4K 240, and other similar games. But easier games, for sure.

  • @leo_stanek
    @leo_stanek 27 days ago +49

    What a world we live in, where we're concerned our GPUs have gotten so good that the CPU is the bottleneck at 4K high refresh rates. I remember when getting 1080p60 was the dream for high-end hardware.

    • @stephenmeinhold5452
      @stephenmeinhold5452 20 days ago +4

      It still is for me on a 2080 Ti, although with DLSS I can go up to 1440p.

    • @Veganarchy-Zetetic
      @Veganarchy-Zetetic 16 days ago +4

      @@stephenmeinhold5452 Yeah, Alan Wake 2 can barely run at 1080p on my 4090 with raytracing on lol.

    • @UTFapollomarine7409
      @UTFapollomarine7409 15 days ago

      My 3900X falls short in 4K in some areas, believe it or not.

    • @LordKosmux
      @LordKosmux 9 days ago +1

      @@Veganarchy-Zetetic You know that this is due to developers' laziness in optimizing the game, right?

    • @Veganarchy-Zetetic
      @Veganarchy-Zetetic 9 days ago +1

      @@LordKosmux I would say it has a lot to do with Ray Tracing.

  • @professorJorge11
    @professorJorge11 27 days ago +156

    I need a 5090 for my 1080p monitor like a fish needs a bicycle

    • @darkdraconis
      @darkdraconis 26 days ago +8

      Hey hey hey hey now!
      What kind of "fish-racist" are you?
      Does a fish not have the right to evolve into a bike-riding creature?
      Incredible, you anti-fishists!

    • @professorJorge11
      @professorJorge11 26 days ago +1

      @@darkdraconis it's a song bruh

    • @darkdraconis
      @darkdraconis 26 days ago +4

      @@professorJorge11 it's a joke brah

    • @CeceliPS3
      @CeceliPS3 25 days ago +4

      Let me teach you, professor. There's this thing called DLDSR. You can render games at 1440p or 1620p, maintain high-af FPS, and still get much better, crispy-clean graphics on your 1080p display. Sure, a 4090 user (or even a 5090 user) could do with a 1440p monitor, but your analogy is entirely wrong in this case, as there is a use for those GPUs with a 1080p monitor. (The render-factor math is sketched after this thread.)

    • @professorJorge11
      @professorJorge11 25 days ago

      @@CeceliPS3 I have a Radeon 7600. There's no DLSS, it's FSR2
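
A quick check of the DLDSR render factors mentioned in the reply above, relative to a 1920x1080 output. The resolutions come from the comment; treat this as a minimal back-of-the-envelope sketch, not vendor documentation:

```python
# Pixels rendered at each DLDSR resolution, relative to native 1080p.
base = 1920 * 1080
for w, h in ((2560, 1440), (2880, 1620)):
    print(f"{w}x{h}: {w * h / base:.2f}x the pixels of 1080p")
# -> 1.78x and 2.25x, i.e. the two standard DLDSR preset factors.
```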

  • @wickfut8917
    @wickfut8917 27 days ago +145

    VR needs more power. Always. My 4090 isn't good enough for high-resolution headsets in graphically intense games. The new headsets on the horizon with 3800x3800 resolution per eye will easily chew through the power of the next few generations of GPUs. (Rough pixel math after this thread.)

    • @clockworklegionaire2135
      @clockworklegionaire2135 27 days ago +12

      Real

    • @mattzun6779
      @mattzun6779 26 days ago +9

      Who in their right mind would develop a game that NEEDS something faster than a 4090 to be good?
      If VR needs that much power, VR games would need to sell for several thousand dollars each to make a profit with current tech.
      One can hope that next-gen consoles and NVidia 6000-series midrange cards get to that level.
      Hopefully there are tricks, like frame generation and rendering higher resolution only where you are looking, that help.

    • @rahulahl
      @rahulahl 26 days ago +26

      @@mattzun6779 Not official games. But the UEVR mod allows you to play non-VR UE games in VR mode. Imagine running the latest UE5 games at about 6K resolution, aiming for a stable 90FPS. My 3080 couldn't even run simple games like Talos Principle 2 at a playable quality or frame rate. Best I got was a blurry mess equivalent to 720p at low settings at about 80ish FPS. This is why I am waiting for the 5080/90, so I can finally play those UE games in VR.

    • @xpodx
      @xpodx 26 days ago +6

      Yeah, and tons of games can do higher render scaling, and the 4090 is not strong enough for 4K max with 8K render at 144Hz+.

    • @nossy232323
      @nossy232323 26 days ago

      @@mattzun6779 I personally would hope games will be scalable enough to use all the power from the lower end up to the ultra high end.
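
A rough back-of-the-envelope on the headset figures quoted at the top of this thread. The per-eye resolution comes from the comment; the 90 Hz target and the lens-distortion supersampling factor are typical VR assumptions, not measured numbers:

```python
# Pixel throughput: one 4K frame vs. a 3800x3800-per-eye headset frame.
UHD = 3840 * 2160        # ~8.3 Mpix per 4K frame
VR = 2 * 3800 * 3800     # ~28.9 Mpix for both eyes

print(f"4K frame:  {UHD / 1e6:.1f} Mpix")
print(f"VR frame:  {VR / 1e6:.1f} Mpix ({VR / UHD:.1f}x a 4K frame)")
print(f"VR @ 90Hz: {VR * 90 / 1e9:.2f} Gpix/s")
# Headsets also typically render ~1.2-1.4x the panel resolution to
# compensate for lens distortion, so the real cost is higher still.
```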

  • @EmblemParade
    @EmblemParade 27 days ago +95

    As a 4K/120 gamer I can promise you that we're still GPU limited with the 4090. I often have to compromise on AAA games by enabling DLSS 3 or lower settings, and sometimes just hit 60 FPS. At the same time, I do think upscaling is changing our requirements and expectations, so I hope the silicon can be optimized around that. We don't necessarily need more pixel shader performance if we assume upscaling. The die space is better spent on other features.

    • @StarkR3ality
      @StarkR3ality 27 days ago +4

      The one title I can think of where you would be fully GPU limited on that card is Cyberpunk 2077's path tracing mode... and that game is an outlier and also extremely CPU heavy. I'm CPU bottlenecked on that title in certain areas on a 4070S, so I'd get your card looked at, because something's wrong there.

    • @lorsch.
      @lorsch. 27 days ago +9

      And to max out high end VR headsets these days a 6090 is probably not enough...

    • @ghostofreality1222
      @ghostofreality1222 26 days ago +4

      @EmblemParade - What CPU and RAM are you running? CPU and RAM have a lot to do with it as well. But I also agree with @starkr3ality: a 4090 at 4K 120 should be running fine in all AAA games, with a couple of exceptions like Cyberpunk or Microsoft Flight Sim. A 4090 should be maxing out all AAA at 4K x 120Hz. You have got to be hitting a CPU bottleneck, and that is why you're having to lower graphics settings to get your desired results. Again, this is mostly assumption at this point, as I have no idea what CPU or RAM you're running, but this is what makes sense in my mind given what you stated in your post.

    • @EmblemParade
      @EmblemParade 26 days ago

      @@ghostofreality1222 Look at benchmarks from HardwareUnboxed, GamersNexus, and others, and see that you are very far from the mark. I have a 5800X3D and high-end DDR4 RAM. I'm not saying that I'm not having a great time with this setup, but forget about ultra settings AAA at 4K without the help of DLSS. The 4090 is great, but 4K is a lot of pixels.

    • @blackcaesar8387
      @blackcaesar8387 25 days ago +4

      @@StarkR3ality Cyberpunk is no longer an outlier... Alan Wake 2 made sure of that. I am guessing Hellblade 2 will further confirm it.

  • @markusmitchell8585
    @markusmitchell8585 26 days ago +53

    Diminishing returns at this point. Games are unoptimized; that's the only reason these ridiculously stupid specs are needed.

    • @saliva1305
      @saliva1305 24 days ago +4

      so true

    • @justhomas83
      @justhomas83 19 days ago +1

      Correct. I have an RTX 4080; I am not upgrading for another 3 years.
      It's getting pointless.

    • @saliva1305
      @saliva1305 19 days ago +1

      @@justhomas83 I plan to get a 50 series since I'm on a 3070 Ti, but maybe AMD is an option too... sad that we have to buy top-tier GPUs to play games that should run on a 3080 with no frame gen.

    • @justhomas83
      @justhomas83 19 days ago +3

      @@saliva1305 I feel you, I just can't do it anymore. I have more bills coming in now and my daughter is graduating college.
      I understand though, you're right, the top tier is the only path now. We are chasing rabbits in Alice in Wonderland at this point 😔😔😔 damnit

    • @Koozwad
      @Koozwad 17 days ago +1

      Yes, exactly, and the fact that RT/PT exists - it's there to make people $pend, $pend and $pend some more.
      Amazing graphics are possible without RT/PT - just look at RDR2, from what, 2018(?).
      I've been saying for years now that devs should use RT/PT as a TOOL to see how scenes should be lit, and then recreate them with non-RT means, which would give players a ton of performance and be much friendlier to their wallets.
      Plus hand-crafting it can look nicer.

  • @heyguyslolGAMING
    @heyguyslolGAMING 27 days ago +117

    I'll only consider the 5090 if it puts my house at risk of burning down. If it can't do that, then it's not powerful enough.

    • @ThePlainswalker13
      @ThePlainswalker13 27 days ago +13

      Nvidia Power Plug Engineer: "Hold my half-caf soy milk grande caramel macchiato."

    • @EdNarculus
      @EdNarculus 27 days ago +10

      I'm an overheating enthusiast myself and would like to see products that carry risk of spontaneous human combustion.

    • @murray821
      @murray821 26 days ago

      Easy, just put steel wool on it while playing

    • @chillnspace777
      @chillnspace777 26 days ago

      Just get a 14900KS, then you're good to go.

    • @dieglhix
      @dieglhix 26 days ago

      It will be more efficient than a 4090, which can be power-capped at 70% while running at 98% performance... meaning a 5090 will be able to run at its full potential at under 300W. Stock power is already too much.

  • @hoverbike
    @hoverbike 27 days ago +25

    The 4090 is still hugely GPU limited in games like MSFS 2020 in VR, and I expect 2024 to be utterly devastating in VR - and I'm all for it. We get closer and closer to Star Trek computer simulations every year.

    • @2drealms196
      @2drealms196 26 days ago +2

      Visually, yeah, a single flagship video card is getting closer and closer when it comes to rasterization. But the holodeck's virtual NPCs have GPT-5-or-higher-level AI, the physics simulations are orders of magnitude more realistic and computationally demanding, plus true path tracing without the need for any denoising and ultra-realistic animations. Latency is also an issue with these LLM responses. So you'd need an entire futuristic datacenter's worth of power with futuristic networking that provides exponentially quicker responses; even a single flagship video card from 2045 wouldn't be enough.

    • @numlock9653
      @numlock9653 24 days ago +3

      Actually the CPU is definitely the bottleneck when using a 4090 in MSFS VR, in my experience. I have a 7800X3D and a 4090 using a Vive Pro 2 on highest settings, and no matter what graphical setting I change it makes little difference in frame rate, but if I lower traffic I get a huge boost, which is all CPU-based calculation. Very poor CPU optimization, unfortunately.

    • @hoverbike
      @hoverbike 24 days ago

      @@numlock9653 The Vive Pro 2's res is very low.

    • @hoverbike
      @hoverbike 23 days ago

      @@numlock9653 Huh, well you must've either clogged up that CPU with poor BIOS settings, or your Vive Pro 2 just has subpar resolution.

    • @Myosos
      @Myosos 21 days ago

      ​@@numlock9653 get a better VR headset

  • @mchits9297
    @mchits9297 25 days ago +23

    [It's 2030; I go to buy an RTX 9090 with all of my savings.]
    Me: Hey, do you have the latest GPU, maybe an RTX 9090?
    Dealer: (goes inside and comes back with an 8-foot server rack) Here's your GPU, sir. Just $100k.

    • @akam9919
      @akam9919 19 days ago

      Rah. It'll be a tiny quantum board... but you'll need to buy a giant freezer-sized cooler... and not like the fridge you have at your house... like a giant walk-in freezer. You will also have to pay $500K to turn it on, wait 2 days for the thing to get cold enough, and then spend $2.3M to run Crysis, $2.345 for Doom, and $76B for Fortnite... no, the game will not look more realistic.

    • @mchits9297
      @mchits9297 19 days ago

      @@akam9919 I might create physical sets for all those games with that kinda money 🤑💰

    • @mchits9297
      @mchits9297 19 days ago

      @@akam9919
      Virtual reality ❌
      Reality ✔️

    • @GamingXPOfficial
      @GamingXPOfficial 15 days ago +1

      This was somehow very hard to read/understand, but I got it at the end of the day.

    • @LordKosmux
      @LordKosmux 9 days ago +2

      What if they get smaller instead? A GPU the size of your smartphone. And the price of a house.

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk 27 days ago +15

    There is always a bottleneck in a pc depending on the resolution and game settings. Doesn't matter if it's a $400 pc or a $4000 pc.

    • @ZackSNetwork
      @ZackSNetwork 27 days ago +1

      Not exactly. If your hardware pairs together fine, the bottleneck can be the software rather than the hardware.

    • @EspHack
      @EspHack 26 days ago +1

      That's why I aim for a monitor bottleneck.

    • @user-rt4ct6tq3r
      @user-rt4ct6tq3r 25 days ago +3

      Such a dumb argument; there is a difference between something BARELY hindering the performance of another part vs a HUGE hindrance.

  • @tomthomas3499
    @tomthomas3499 27 days ago +41

    50%, 70%, or even double the power of the 4090 - as long as it's not melting its connector, it's fine by me.

    • @ZackSNetwork
      @ZackSNetwork 27 days ago +6

      I expect 60% faster rasterization performance and 2.5x better raytracing, while pushing the same power output.

    • @iansteelmatheson
      @iansteelmatheson 27 days ago +6

      @@ZackSNetwork exactly. efficiency is really underrated

    • @nicane-9966
      @nicane-9966 27 days ago +1

      Gonna pack 2 of those filthy-ass connectors man lol

    • @nicane-9966
      @nicane-9966 27 days ago +1

      @@iansteelmatheson It's not; it's just that those very high-tier cards are made to draw the maximum amount of power possible. For efficiency you have the 80-class and below.

    • @aberkae
      @aberkae 26 days ago

      @@iansteelmatheson 4N node - how much efficiency can they squeeze out of a similar node? 🤔

  • @holyknighthodrick5223
    @holyknighthodrick5223 26 days ago +6

    Parallelism is badly needed in many game engines, and modern consumer CPUs really need more cores. Single core performance uplift is too small to keep up anymore. More game engines need to adopt the strategy of running tasks in parallel, instead of just using a render thread, lighting thread, logic thread etc. Easier said than done, but it is the only real way forward.
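
A minimal sketch of the task-based approach the comment describes, in Python for brevity (a real engine would do this in C++ with a job system, and Python's GIL means threads here illustrate the structure rather than true CPU parallelism; the entity-update function and counts are invented for the example):

```python
# One frame's logic split into many small independent tasks that any
# worker can pick up, instead of a few fixed per-subsystem threads.
from concurrent.futures import ThreadPoolExecutor

def update_entity(entity_id: int) -> int:
    # Stand-in for per-entity work: animation, AI, physics step, etc.
    return entity_id * 2

entities = range(10_000)

with ThreadPoolExecutor(max_workers=8) as pool:
    # The frame is a flat list of tasks, not a handful of long-lived
    # dedicated threads, so idle workers always have something to take.
    results = list(pool.map(update_entity, entities))

print(f"updated {len(results)} entities this frame")
```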

  • @eugkra33
    @eugkra33 27 days ago +18

    Alex said that especially with RT it'll be CPU bound, because of the BVH workload. But what the next generation could offer is moving BVH maintenance to the GPU, alleviating a whole bunch of CPU work. (A toy sketch of that refit pass follows this thread.)

    • @Hi-levels
      @Hi-levels 26 days ago +1

      NPUs will also come to the rescue, either on GPUs or CPUs.

    • @yesyes-om1po
      @yesyes-om1po 26 days ago

      @@Hi-levels I don't think that has anything to do with RT's BVH workload, the only thing an NPU could do better is AI denoising, but nvidia already has dedicated hardware to do that on RT cards, and I'm pretty sure an NPU would introduce too much latency for realtime denoising as a separate piece of hardware.

    • @BaieDesBaies
      @BaieDesBaies 24 days ago

      RT is so GPU intensive that I don't see how it could CPU-limit games.
      If I activate RT in games, CPU load tends to drop because the GPU is struggling.
      I have an i5 and a 3080.
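
To make the "BVH maintenance" cost concrete, here is a toy 1-D sketch of the per-frame refit pass: after objects move, every interior node's bounds must be recomputed bottom-up. The tree layout and sizes are invented for illustration; real engines refit 3-D AABBs over deformable geometry, and this kind of uniform bottom-up pass is exactly the work that maps naturally onto a GPU compute kernel:

```python
# Toy per-frame BVH refit: recompute every interior node's bounds
# bottom-up after the leaves (moving objects) have changed.
import random

NUM_LEAVES = 1 << 12  # 4096 leaves -> 8191 nodes in a full binary tree

# Implicit full binary tree: node i has children 2i+1 and 2i+2.
boxes = [None] * (2 * NUM_LEAVES - 1)

def refit():
    # Leaves: one random moving point each (stand-in for a mesh AABB).
    for i in range(NUM_LEAVES - 1, 2 * NUM_LEAVES - 1):
        x = random.random()
        boxes[i] = (x, x)  # (min, max), 1-D for brevity
    # Interior nodes: union of children, processed bottom-up.
    for i in range(NUM_LEAVES - 2, -1, -1):
        (lmin, lmax), (rmin, rmax) = boxes[2 * i + 1], boxes[2 * i + 2]
        boxes[i] = (min(lmin, rmin), max(lmax, rmax))

refit()  # this whole pass runs every frame that geometry animates
print("root bounds:", boxes[0])
```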

  • @chrisguillenart
    @chrisguillenart 26 days ago +5

    Ampere didn't have price cuts of any kind, it maintained the price hikes of Turing.

  • @LeoDavidson
    @LeoDavidson 27 days ago +18

    Until it does triple 4K at 240Hz without DLSS or frame generation, there's always room for more. :) Whether there's enough of a market for that outside of the sim-racing and flight-sim niches, I don't know, but I'd buy one.

  • @Torso6131
    @Torso6131 27 days ago +54

    I mean, 4k120 has to be on the table for most games. Even 4k90, throw on some DLAA, call it good. Aside from totally broken PC ports I feel like we're still GPU limited 99% of the time, especially if you have something like a 7800x3D.

    • @StarkR3ality
      @StarkR3ality 27 days ago +4

      Depends what res, like you've said, but in a lot of the games Alex mentioned and others I'm bottlenecked on a 5800X3D with a 4070 Super at 1440p in a lot of modern titles, which is crazy, right?
      For me, the only use case for 90-class and even 80-class GPUs is if you're going to be using 4K and ray tracing in every title available. You're getting GPUs like the 4090 that are doubling performance gen on gen, and then you get what, a 20% increase in perf going from a 5800X3D to the 7800X3D?
      CPUs cannot keep up and it's really starting to show.

    • @WhoIsLost
      @WhoIsLost 27 days ago

      @@StarkR3ality Something must be wrong with your PC if you're getting bottlenecked with that hardware. The 5800X3D is only 12% slower than the 7800X3D at 1440p.

    • @StarkR3ality
      @StarkR3ality 27 days ago

      @@WhoIsLost Defo not, pal, I can assure you. I still get good performance, and I'm not talking about every title, only the recent big ones: Baldur's Gate 3, Cyberpunk, Witcher 3 next-gen.
      My point is, if I'm CPU bottlenecked at times in some titles, what's a 4090 gonna be, which I think offers 2.5x the performance? Not to even mention the 5090.

    • @JBrinx18
      @JBrinx18 27 days ago

      @@StarkR3ality The 4090 is only ~60% stronger than a 4070 Super. But yes, CPU performance is an issue... I think 4K will be a problem, and there's just not a market for 8K... The only avenue that's available might be VR.

    • @Plasmacat91
      @Plasmacat91 27 days ago +1

      @@WhoIsLost Negative. I have a 5800x3D and 6900XT and am CPU limited most of the time at 1440p.

  • @gnoclaude7945
    @gnoclaude7945 26 days ago +4

    I'll ride my 4070 Super until next-gen consoles drop. That's when the jump to AI NPUs for gaming and other new features will push me to upgrade. Honestly it's overkill for 1440p in the games I play at the moment.

  • @garethperks7032
    @garethperks7032 26 days ago +3

    Thankfully AMD have opened a door for us with X3D. CPUs finally start becoming useful for gaming with ultra-high-speed memory (e.g. on-die cache).

  • @ijustsawthat
    @ijustsawthat 27 days ago +8

    Not if it burns your whole house down.
    Can't they design a better connector instead?

    • @user-rt4ct6tq3r
      @user-rt4ct6tq3r 25 days ago +1

      Can you have more strength than a 12yr old virgin and properly seat the connector?

    • @ijustsawthat
      @ijustsawthat 25 days ago

      @@user-rt4ct6tq3r If you think it's force related, you need to watch/read more on the topic. Gamers Nexus did a full investigation of these connectors and clearly demonstrated how they are flawed by design.

    • @rambo9199
      @rambo9199 24 days ago +5

      @@user-rt4ct6tq3r Has this ever been a problem in the past for you? Are you speaking from experience?

    • @user-rt4ct6tq3r
      @user-rt4ct6tq3r 24 days ago

      @@rambo9199 deflect more virgin boy.. wanna see a video of my 4090 ive had since launch working perfectly fine? I bet you dont 😂

    • @Oliver-sn4be
      @Oliver-sn4be 4 days ago

      @@user-rt4ct6tq3r What if you move your PC and it comes out a bit, hmm? It's not like you always open the case to check; there is always a chance of it. And most of all, even if you push it all the way in, it can still burn 🔥 There is never a 100% guarantee that it won't 😂

  • @Zapharus
    @Zapharus 22 days ago +1

    "Hi guys exclamation point"
    LOL Dafuq! That was hilarious.

  • @Monsux
    @Monsux 27 days ago +17

    I will just use DLDSR + DLAA on a 4k 120 Hz TV/monitor. CPU won't be the limiting factor, and I'm always getting graphical upgrade. Add path tracing with maxed out settings and the GPU (even RTX 5090 TI Super) would scream for help. I just love DLDSR and how versatile it is for all type of games… Doesn't matter if I'm playing new or older titles.

  • @Boss_Fight_Wiki_muki_twitch

    Considering the 5090 will be $1700 at least, it needs to be twice as fast as the 4090 to be worth it.

    • @SuperSavageSpirit
      @SuperSavageSpirit 27 days ago +6

      Rumor is it's 70%.

    • @Chuck15
      @Chuck15 27 days ago +14

      twice? 🤣🤣🤣🤣

    • @squirrelsinjacket1804
      @squirrelsinjacket1804 27 days ago +8

      A 70% raw performance improvement, along with a better version of frame gen to boost frame rates even more than the 40-series version, would be worth it.

    • @daniil3815
      @daniil3815 27 days ago +4

      that's a weird argument.

    • @dpptd30
      @dpptd30 27 days ago +9

      I don't think so; it will likely be above $2000 due to them using the same node as Ada and Hopper. Their data center B100 is already twice as large as the H100 in order to get a decent performance uplift on the same node, and twice as large on the same node should mean twice as expensive, especially when they still aren't using chiplets.

  • @ewitte12
    @ewitte12 17 days ago

    Put some kind of onboard ARM cores with direct access to the VRAM. Run calculations there.

  • @CrashBashL
    @CrashBashL 12 days ago +1

    That's why we need an ARM/RISC architecture as soon as possible.

  • @MarcReisSyllogism
    @MarcReisSyllogism 11 days ago

    VR performance would be one thing: devices like the Pimax 8K and 12K HMDs can sure burn some GPU time (and CPU). Of course there's also AI - maybe an AI API for gaming - the mooted PhysX return, and ever more RTX (even in VR).

  • @dystopia-usa
    @dystopia-usa 27 days ago +3

    Once you hit certain quality-experience performance thresholds in gaming, it doesn't matter and becomes overkill for the sake of giggles. It only matters to professional benchmarkers & internet braggarts.

  • @KontikiChrisSK8FE
    @KontikiChrisSK8FE 12 days ago

    I honestly wouldn't mind if it were too fast, or even more power efficient with the new architecture, because I would mainly use a card like that for rendering purposes.

  • @British_Dragon-Simulations

    My RTX 4090 GPU usage percentage is all over the place in 4K and in the Pimax Crystal (2x4K) with an i9-12900k.
    It never stays at 99% or even 98%.
    It usually varies from 100% to 80% constantly in 4K.
    My RTX 3080 usually stayed at 99% to 98% in 4K.
    In GPU-Z I’m now limited by Voltage and the limiting factor line is always blue instead of green. Even when I overclock my P-Cores to 52 and E-Cores to 40.
    My next PC upgrade may need to be the 14900ks.

  • @Vincornelis
    @Vincornelis 27 days ago +3

    The CPU side is interesting because CPU limitations seem to be universal to all modern CPUs. Pretty much every major release will either run perfectly fine on the now-iconic 3600, or, if it's struggling on that, it's struggling on everything. Having a faster CPU with more threads seems not to make much of a difference at this point. Most modern CPUs don't look in any inherent danger of being underpowered for the job. It's just game developers struggling to get to grips with multithreading, and that affects all modern CPUs pretty much just as badly.

    • @bricaaron3978
      @bricaaron3978 26 days ago +1

      *It's just game developers struggling to get to grips with the multithreading..."*
      No, it's just that ever since the Great Consolization of 2008 all AAA games have been designed and coded for console HW.
      Current consoles have only 6 cores available to games. But further, those cores are considerably less powerful than even the cores of an 11-year-old 4770K.

    • @orlandoluckey5978
      @orlandoluckey5978 12 days ago

      @@bricaaron3978 cap, ps5 and new xbox both have cpu's equivalent to a ryzen 7 3700x. it features 8 cores and 16 threads. idk where you got your info but its wrong my guy

    • @orlandoluckey5978
      @orlandoluckey5978 12 days ago

      @@bricaaron3978 running at 3.8GHz

    • @bricaaron3978
      @bricaaron3978 11 days ago +1

      @@orlandoluckey5978 *"cap, ps5 and new xbox both have cpu's equivalent to a ryzen 7 3700x. it features 8 cores and 16 threads. idk where you got your info but its wrong my guy"*
      I repeat: Both the PS5 and the XBox Series X have only 6 cores available to games, just like the PS4 and XBox One.
      Each of those cores has a significantly lower FLOPS than a 4770K from 2013.

  • @jaredangell5017
    @jaredangell5017 27 days ago +2

    The 9800x3d will be able to handle it. Nothing else will though.

  • @Goblue734
    @Goblue734 27 days ago +24

    I have an RTX 4090; the only way you would see me with a 5090 is if we can get 4K visuals with RT and at least 60 FPS rasterized, with no frame gen or DLSS.

    • @SafMan89
      @SafMan89 27 days ago +9

      You're the 0.1% who upgrade top end GPUs each generation

    • @ZackSNetwork
      @ZackSNetwork 27 days ago +5

      What games do you play? I can do that already with my 4090.

    • @JeremyFriebel
      @JeremyFriebel 26 days ago

      @@ZackSNetwork Same, but with a 4080.

    • @kevinerbs2778
      @kevinerbs2778 26 days ago +4

      Never going to happen in the next 3 years. RT takes about 100x more computational power than rasterization does, & most RT relies on rasterization as a base. Expect Blackwell to be only 20%-30% faster than an RTX 4090 at most. Unless Blackwell comes out with a massive 224 ROPs or more, it isn't going to be that fast.

    • @aberkae
      @aberkae 26 days ago

      DLSS set to DLAA is where it's at though, imo. Better than TAA at native resolution.

  • @vexun11
    @vexun11 15 days ago

    Will the i9 14900k be able to work well with a 5090 even at 35000 watts?

  • @powerpower-rg7bk
    @powerpower-rg7bk 26 days ago

    The thing I'd be hoping for on a RTX 5090 would be two 8 pin power connectors and a return to more sane power consumption. More/faster VRAM would be nice too as I feel that that has been the RTX 4090's bottleneck, especially at 4K.
    Other grab bag of features would be the return of nvLink/SLI support to scale up via multi-GPU and integrating some Thunderbolt 5 controllers. It'd be nice to be able to plug a USB-C monitor directly into the GPU without external cabling and get full USB support on the display and other peripherals connected to it. Similarly with Thunderbolt 5, it'd be clever to include a mode where you could use the GPU externally for a laptop without the need for a Thunderbolt bridge board in an external chassis. Literally just the card, power supply and a power switch to turn it on. The PCIe slot connector would go unused.

  • @phizc
    @phizc 26 days ago

    Cyberpunk 2077 with path tracing in VR using VorpX/Luke Ross with a render resolution of ~10000x5000 at 90+ Hz would probably break the 5090 Ti Egregious Super too. Maybe when 8090 TiTi Super Duper comes around.

  • @diggler64
    @diggler64 26 days ago

    I play DCS and MSFS 2020 with a 12900K and 4090 in VR, and I can get the fps I want; then it's all a matter of turning knobs for how much visual quality you want, and visual quality makes you GPU limited, so... for me a 5090 would probably help.

  • @chrissoucy1997
    @chrissoucy1997 27 days ago +20

    GPUs are getting faster at a pace that CPUs can't quite keep up with. I have an RTX 3090 paired with a Ryzen 7 5800X3D, and in games with heavy ray tracing like Spider-Man Remastered, my 3090 is CPU limited even at 1440p, and to some degree at 4K. I am itching for a CPU upgrade right now way more than a GPU upgrade; my 3090 is still fine. I am looking to upgrade to a 9800X3D when it comes out.

    • @vitordelima
      @vitordelima 27 days ago +9

      Stupid methods of rendering that move too much data around, need to rebuild complex data structures all the time, ...

    • @StreetPreacherr
      @StreetPreacherr 27 days ago +5

      It sounds like the game engines aren't designed 'properly'... Can't the (RTX) GPU handle most of the processing necessary for high quality ray tracing? I didn't realize that even with RTX that Ray Tracing was most often restricted by CPU performance! Isn't the GPU supposed to be doing all the additional Ray Tracing processing?

    • @vitordelima
      @vitordelima 27 days ago +4

      @@StreetPreacherr There are a lot of intermediate steps that need a lot of CPU; in the case of raytracing, it seems spatial subdivision is one of them, for example. Uncompressed assets, assets with excessive detail, poor code parallelism... are other examples of causes of bottlenecks in general.

    • @ZackSNetwork
      @ZackSNetwork 27 days ago

      That's because the Ampere GPUs handled data weirdly. The 4070 Super is way faster than a 3090 at 1080p, and faster at 1440p as well.

    • @mackobogdaniec2699
      @mackobogdaniec2699 27 days ago +3

      Spider-Man is a very specific game - it is heavily CPU-limited (but at high fps, not like DD2) - but it is an exception. It is hard to find visible CPU bottlenecks in most games at 4K with a 4090 and a top CPU (or even a new mid-range one, or something like the 5800X3D).
      If we're talking about RT, I think it differs heavily from game to game. It's very CPU demanding in S-M:R or especially Hogwarts Legacy, but not at all in CP2077.

  • @DefinitelyNotPedro
    @DefinitelyNotPedro 27 days ago +2

    Honestly, maybe a slight increase in performance over the 4090 but with much more efficiency would be awesome. Imagine 4090-level performance at only 200 watts, for example - it would be crazy!

    • @ZackSNetwork
      @ZackSNetwork 27 days ago +1

      Dude, then just get a 5080. The 5090 will have 60% more rasterization performance and 2.5x better raytracing than the 4090, while pushing the same amount of power. It's called "performance per watt", not "low watt output".

    • @DefinitelyNotPedro
      @DefinitelyNotPedro 27 days ago +1

      @@ZackSNetwork I'm not going to get either; I was commenting on what I think would make a good product.

    • @lharsay
      @lharsay 27 days ago

      That might happen in 2 or 3 generations, not in one. The 4060 Ti just reached the 2080 Ti's performance under 200W, but the 2080 Ti was a 350W card at most, not 450W.

    • @DefinitelyNotPedro
      @DefinitelyNotPedro 27 days ago

      @@lharsay Maybe! I'm just speculating here; the 4060 has around the same performance as the 3060 but uses 50% less power.

    • @user-rt4ct6tq3r
      @user-rt4ct6tq3r 25 days ago

      Imagine undervolting... derp. My 4090 runs at 3GHz with an undervolt, and in most games I'm at 300 watts or less; peaks are 350ish.

  • @user-qv2wd2jc6m
    @user-qv2wd2jc6m 1 day ago

    Nice conversation, but did they really answer the question? I think the user was asking how the most powerful consumer graphics card money can buy can really enhance gaming experiences when we also need equally powerful CPU performance to make that happen, which often chokes the GPU.

    I can name a plethora of things video games lack today that REALLY need improvement compared to what we have had for the past decade or two: real-time physics (this will always need improvement), real-time water physics (extremely held back by the limits of CPU calculation), 3D volumetric special effects (I believe this is both GPU and CPU focused), as well as robotic NPCs that pull you out of immersion (heavily CPU bound as well). The point being: we can have these powerful GPUs, but if video game experiences don't improve, we are stuck with the same mediocre gameplay with shiny polished graphics at 556 FPS.

    Also VR, to me at least, IS the absolute FUTURE OF GAMING - being able to actually transport yourself into these incredible worlds - and that requires much more powerful hardware, both GPU and CPU. My hope is that solutions are found to fix these concerns and allow developers to have more fun creating games, not be constantly limited by hardware. The PS5 was really exciting when it released because they saw SSDs bottlenecking a lot of freedom for developers and found a solution that even pushed the PC SSD market forward, since most PC SSDs at that time weren't hitting the speeds the PS5 was because of I/O throughput issues. My hope is Cerny and his amazing team come up with bottleneck solutions for improving CPU performance, allowing more efficient multithreaded performance in games, that stretch out into the industry and become standard, you know? And I imagine AI will also assist in numerous ways with improving performance, user experience, and helping developers make games more polished and advanced. SO exciting to think about...

  • @altaresification
    @altaresification 24 days ago

    I wonder if an ASIC on the GPU would be able to compile shaders on the fly, provided a new API is exposed for that.

  • @vertigoz
    @vertigoz 27 days ago +1

    What I want is a 240Hz 4K CRT.

  • @odiseezall
    @odiseezall 15 days ago +1

    It's all about VR... the 50x0 series will be the first that's really capable of fully immersive, high-quality, hi-res VR.

    • @Skrenja
      @Skrenja 5 days ago

      Yep. PCVR is where these overkill cards will shine.

  • @smurfjegeren9739
    @smurfjegeren9739 26 days ago

    I just hope I'll be able to afford a 5000-series card. And be able to fit it into my tiny case.

  • @jackpowell9276
    @jackpowell9276 21 days ago

    I mean, with more GPU power comes more scope to add VFX, textures, higher resolutions, frames, VR etc.

  • @R-yb6xt
    @R-yb6xt 25 days ago

    high end VR-oriented features? not melting?

  • @capslock247gaming9
    @capslock247gaming9 22 days ago

    I have a Samsung Odyssey G8 4K 240Hz. I'm running a 14th-gen i9 with a 3090 Ti; I'm hitting 140 fps on ultra settings in story games, and in shooters like CoD and Apex on low settings I peak at 240. And once you get used to 4K, good luck going back to a lower resolution.

  • @ymi_yugy3133
    @ymi_yugy3133 26 days ago

    Improvements are probably gonna be modest, both in software and in hardware.
    More interpolated frames is not bad, but it's in the marginal-gains category.
    I see a couple of fronts where they could make progress, though most of this probably won't come with Blackwell:
    Predicted generated frames - instead of interpolating between real frames, the next one is simply predicted. This gets rid of the latency issue.
    A more integrated neural rendering approach.
    Lots of deep-learning-powered NPCs in games.

  • @mkreku
    @mkreku 27 days ago +1

    I use an RTX 3090 right now and I would actually be interested in upgrading if they would start making mini versions of GPU's again. The 40XX series (and AMD's 7900 series) are all gigantic and since I build ITX rigs, they're just not interesting to me. But imagine if they used the smaller processor nodes to keep the performance the same and instead used the node advantage to shrink GPU's. One can dream.

    • @Katastra_
      @Katastra_ 14 days ago

      Part of the reason why I went with the NR200 for my first itx build. I still remember seeing someone with a 4090 in theirs lol makes my 3080 ftw3 look tiny

  • @kaslanaworld4746
    @kaslanaworld4746 20 days ago +1

    Well, the 9800X3D should be releasing soon, around when the 5090 launches.

  • @xXmobiusXx1
    @xXmobiusXx1 25 days ago

    The problem is x86: no matter how low-level your API is, you still have to run everything through the CPU due to how the PCI bus works. Basically we would need a bypass of some sort, akin to what AGP did.

  • @mushroom4051
    @mushroom4051 27 days ago

    Lazy optimization makes hardware upgrades a must. Look at MGSV - the Fox Engine can run on old CPUs.

  • @johndzwon1966
    @johndzwon1966 24 days ago

    CPUs only bottleneck at low resolutions/high refresh rates. It just means that when it's time to upgrade the CPU, I won't have to worry about updating the GPU (future proofing).

  • @johndavis29209
    @johndavis29209 27 days ago +3

    Rich is a gem.

  • @FMBriggs
    @FMBriggs 27 days ago +1

    I love questions like this because they assume on some level that large companies (Nvidia, AMD, Microsoft, Intel etc) wouldn't be thinking about bottlenecks or be actively working on developing new ways to utilize cutting edge hardware.

  • @SuprUsrStan
    @SuprUsrStan 26 days ago +1

    Just get a G9 57" monitor or any other 4K+ monitor. You'll instantly be GPU bound again.

  • @tommyrotton9468
    @tommyrotton9468 27 days ago

    Maybe not, if the 5090's GPU is used to take AI calculations off the CPU, like it did with PhysX. It could well max out the Hz of 1080p and 1440p monitors, but 4K is getting to 144Hz.

  • @RafaelSilva-yv3oh
    @RafaelSilva-yv3oh 23 days ago

    Can it run Cyberpunk at 480Hz 1440p? Cause that's gonna be my next monitor.

  • @deathstick7715
    @deathstick7715 27 days ago +1

    What are they going to do after the 9090 comes out? Will it be the 10k90?

    • @Alp577
      @Alp577 26 days ago +1

      Probably go back to small numbers, like AMD did with the HD 7970 > R9 290 > R9 390 etc.

  • @1stcrueledict
    @1stcrueledict 19 days ago

    The best current pairing for a 5090 will be the 7800X3D. It's only like 300 bucks, and the AM5 platform is gonna be around for a few more years, so we'll get a more capable one at some point in the future. No use wasting the extra 300 on a 7950, or wasting 700 on having to replace an Intel board and CPU.

  • @alexis1156
    @alexis1156 27 days ago

    I think it's too hard to tell.
    At 4K probably not; at lower res, maybe.
    But the next gen of CPUs is also looking great.

    • @GabrielPassarelliG
      @GabrielPassarelliG 26 days ago

      Who'd buy a 5090 to play at less than 4K?

    • @alexis1156
      @alexis1156 26 days ago

      @@GabrielPassarelliG Probably no one i would say.

    • @FantasticKruH
      @FantasticKruH 17 days ago

      I really doubt it at 4K. The 4090 is pretty much the bottleneck at 4K even if you have an older CPU.

  • @jaffaman99
    @jaffaman99 27 days ago +2

    Enough of this now; it's becoming clickbait.

  • @HielUFF
    @HielUFF 25 days ago

    I have no idea why they are talking about 1080p. This is a GPU designed for 4K or more, and at those resolutions it is very clear it will not have problems with current-generation processors, even mid-range ones like the 7600 or an i5. Hi from Venezuela!!

  • @yumri4
    @yumri4 25 days ago

    The idea of expecting the game developer to write DX12 code that lets the CPU hand GPU work off better has an issue. A major one right now: DX12 is a wrapper, and to code for what they are talking about, the wrapper would have to change, and that change would mean limited CPU support. Even CPUs in the same SKU family might not be supported if they do it wrong and strip away too much of the wrapper in order to let the GPU talk to itself rather than going GPU-to-CPU-to-GPU.
    There are so many ways it can be messed up; there is a reason the API was made. Yes, skipping the API and coding straight for the CPU - which is part of what DX12 allows - is quicker, but I think that was made for the data center, not the consumer. In a data center you know the CPU model in every server, so you know exactly what to target. For games, there are literally millions of possible CPUs that might be in the system, so a game that only works on one generation of CPUs dies once that generation is no longer sold. Good idea for the server market, bad for the consumer market.
    Going through the DX12 wrapper is slower than direct access to the hardware. How much slower? 1 to 100 cycles, which isn't much when you consider the CPU is usually sitting idle waiting for data anyway. In exchange, the code runs on any x86 CPU instead of one specific model; every API trades some speed for that.
    The RTX 5090 will most likely be bottlenecked by the game code, not the CPU. Under a pure GPU load it will most likely be bottlenecked by PCIe Gen 4 x16 bandwidth, not the CPU. Better-optimized code is needed, but schools teach easily readable code, and compilers don't always emit the objectively best binary.

  • @neti_neti_
    @neti_neti_ 7 days ago

    It is unlikely that even the upcoming Intel Arrow Lake 15900K or AMD Ryzen 9 will be able to keep up with the RTX 5090. Intel, in particular, will need to introduce its Nova Lake (16 P-cores, xx E-cores) with DDR6 RAM to support the RTX 5090 ASAP.

  • @mahouaniki4043
    @mahouaniki4043 25 days ago +2

    $2500 for +15% max over 4090.

  • @0wl999
    @0wl999 7 days ago

    "4K screens coming online now..." rofl, as if they haven't already been out for a year or two.

  • @brkbtjunkie
    @brkbtjunkie 23 days ago

    Depends on your target framerate obviously. Also not everyone likes the DLSS reconstruction techniques specifically in motion. A 5080 is probably going to be my upgrade from a 3080.

  • @ainzoalgownz1570
    @ainzoalgownz1570 2 days ago

    The 7800X3D is rarely used above 50-60% in games with the 4090 (at 4K).
    I don't think the 7800X3D will bottleneck it in gaming at all. It will still be an amazing CPU for it.

  • @doityourself3293
    @doityourself3293 3 days ago

    The optical GPU is just around the corner. So is the optical CPU, 10K times faster than what we have now.

  • @nephilimslayer
    @nephilimslayer 19 days ago

    AAA titles at 4K with RT on will be the sweet spot for the 5090.

  • @vonbleak101
    @vonbleak101 13 days ago

    I play at 1080p and have an i9-13900K and a 3080... I have no issues with any modern game and probably won't for another couple of years at least lol... The 5090 would be insane for me haha...

  • @mrmrgaming
    @mrmrgaming 24 days ago

    If they rush a 2024 launch, I wonder more about 4090 owners needing to upgrade. Normally there is some big, melts-your-PC (no pun intended) game that calls for the upgrade, but the only one I can see that might gain from a 5090 is Stalker 2. I would have thought mid-2025 would have been better.

  • @BlueRice
    @BlueRice 24 days ago

    Any performance gain from the CPU and GPU is always a gain. Now the age-old question: how much are people willing to pay for that little (or large) gain?

  • @FreakyAndrew428
    @FreakyAndrew428 27 days ago

    I wish my games would be CPU limited when I get an RTX 5090 to use on my Neo G9 S57 at 7680x2160 @ 240Hz 😅
    So here you have a real use case.

  • @nhmgmmmmhjm
    @nhmgmmmmhjm 25 days ago

    Games used to be GPU bound; over the past few years they've become both CPU and GPU bound. Impressive regression in how games are developed.

  • @charizard6969
    @charizard6969 24 days ago

    This is a non-issue at 4K once Ryzen 9000 comes out; it is set to improve gaming performance by 17.3% across the board.

  • @boedilllard5952
    @boedilllard5952 16 days ago

    I'd love to see less power draw - not gonna happen; a lower price - not gonna happen; thinner - not gonna happen. So that leaves about 4 times the ray tracing performance - probably not gonna happen either.

  • @hartyeah
    @hartyeah 14 days ago

    I'm gaming at 7680x2160 on a Samsung G9 57" with a 4090 and 7800X3D. I sure hope I can get more fps with the 5090.

  • @kathleendelcourt8136
    @kathleendelcourt8136 27 days ago +9

    People getting GPU limited in 95% of their games: ...
    The same people getting CPU limited in the remaining 5%, of which only 1% actually results in a sub 100fps framerate: OH NO I'M CPU BOTTLENECKED!!

    • @GabrielPassarelliG
      @GabrielPassarelliG 26 days ago +1

      There are ways to spend the GPU's extra power, like monitors with higher resolution and refresh rates. And if you care a lot about a specific game where a CPU bottleneck is a thing, then invest more in the CPU and less in the GPU. Not hard, given high-tier GPUs cost multiples of high-tier CPUs.

    • @johnnymosam7331
      @johnnymosam7331 21 days ago

      Yeah pretty much. CPU rarely bottlenecks unless you're playing an RTS.

    • @FantasticKruH
      @FantasticKruH 17 days ago

      Not to mention that the 4090 and 5090 are 4K cards; even older CPUs rarely bottleneck the 4090 at 4K.

  • @SpOoNzL
    @SpOoNzL 26 days ago +5

    The 5090 needs a built-in smoke detector above the 12VHPWR cable.

    • @tamish3551
      @tamish3551 25 days ago +2

      Nvidia is bundling it with a popcorn attachment so if your gpu burns at least you got a snack

    • @eliadbu
      @eliadbu 25 days ago

      I thought about some elegant solution, like a 12VHPWR port with an integrated temperature sensor, or some way to sense resistance in the connector. Or just make sure the new 12V-2x6 connectors work as they're supposed to.

  • @yourhandlehere1
    @yourhandlehere1 2 days ago

    Frame gen is like margarine gaming.
    I like real butter.

  • @armandocardona6975
    @armandocardona6975 24 days ago

    1440p 32:9 ultra widescreen at 240 hz is the goal

  • @divertiti
    @divertiti 24 days ago

    This question doesn't even make sense. Every game is GPU limited at 4K with RT on a 4090

  • @whitecometify
    @whitecometify 17 days ago

    Honestly, since the Vega 56 / RX 580 era the leaps have been small. Vega 56 vs, say, a 4090 is only about a 3x jump; a 5090 might be 4-4.5x. For perspective: go from 40 to 80 fps and double the graphics load, and you max out a 5090 vs a Vega 56. They need to at least 3x it from here. We are 10 years behind the curve and getting ripped off.

  • @OldMobility
    @OldMobility 24 days ago +1

    No it's not. Gaming in 4K with ray tracing, even with DLAA or DLSS, is the most beautiful graphics I've ever seen, and that's just with my 3090. With a 5090 I could turn off DLSS, go native, and still get spectacular FPS.

  • @dieglhix
    @dieglhix 26 days ago

    The 5090 is useful for flight sim VR at 4K per eye... for Fortnite-like games that people love at 240Hz and 1080p, a less powerful GPU will do.

  • @numlock9653
    @numlock9653 24 days ago

    The problem still remains a lack of multithreaded optimization in games. Modern CPUs just can't fix the fundamental nature of games being programmed to rely heavily on a single thread. It would be in Nvidia's best interest to explore the use of AI to potentially solve this issue. Otherwise there is little reason to scale much beyond a 4090 without a miracle in single-threaded CPU performance.
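
The single-thread ceiling the comment describes is just Amdahl's law. A small worked sketch; the parallel fractions and core counts are illustrative assumptions, not measurements of any real engine:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p is the parallelizable fraction and n the core count.
def amdahl(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.5, 0.8, 0.95):
    row = ", ".join(f"{n} cores -> {amdahl(p, n):.1f}x" for n in (4, 8, 16))
    print(f"parallel fraction {p:.0%}: {row}")
# With only half the frame parallelized, even 16 cores give under 2x:
# the serial "main thread" dominates, which matches the comment.
```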

  • @aohjii
    @aohjii 25 days ago +1

    RTX 5090 with AMD new 9000 X3D series cpu

  • @GabrielPassarelliG
    @GabrielPassarelliG 26 days ago

    I think there will be more demanding games that could potentially make the 5090 work at 100%. Also, when we think of 4K and the new high refresh rates, a 5090 has a lot more use cases. With a 3090, I can barely do 60fps in modern games at 4K; I think a 5090 would definitely push that to 120fps before my Ryzen 5900X has any trouble, and it's not even a gaming CPU - some of the new-generation X3D ones from AMD are a lot more powerful in games. Obviously not all games require such a powerful GPU, but it will not be more powerful than what we can take advantage of.

  • @robertbolzicco9995
    @robertbolzicco9995 25 days ago

    I'm going to say the 5090 is going to struggle to push 144 fps at 4K in the games coming out by the time it's released.
    I'm sure it will do fine in Call of Duty and other fast-paced online shooters.
    For graphically demanding games at 4K, you are going to have to use DLSS.

  • @Nintenboy01
    @Nintenboy01 26 days ago

    Yeah, I think even Zen 5/9800X3D and the strongest 15th-gen Core i9 will be bottlenecks in a lot of games, even at 4K 120Hz or faster. Plus we tend to get a lot of titles that are unoptimized on the CPU side, or where RT eats a lot of CPU time.

  • @RiderZer0
    @RiderZer0 24 days ago

    I'd be most interested in how much quicker the 5090 renders in Blender than the 4090. My 3090 still takes forever on highly detailed projects.

  • @Brent_P
    @Brent_P 24 days ago

    CPUs are TWO generations behind now.

  • @superthrustjon
    @superthrustjon 18 days ago

    No joke, just cashed in about 1,000,000 Marriott points for over $3,000 in Best Buy gift cards 😂 getting ready for the 5090

  • @tofu_golem
    @tofu_golem 26 days ago +1

    If no one can afford it, does it really matter?

  • @kilroy987
    @kilroy987 14 days ago

    Oh, I'm sure I can make a 5090 choke with PCVR.

  • @PaulRoneClarke
    @PaulRoneClarke 10 days ago

    As someone with no interest in resolutions above 1440p, who can personally perceive no benefit from frame rates above about 80... none of this bothers me... at all.

  • @ThreaT650
    @ThreaT650 25 days ago

    The point of 4K 240Hz is totally defeated if you use frame gen. The whole idea is a responsive refresh rate and lower latency between frames, and that is totally ruined by upscaling. It has to be native at that resolution, or you might as well just play at 144Hz.

  • @InternetOfGames
    @InternetOfGames 8 days ago

    I'm getting ready to add a new CPU to my 4090 before adding a 6090 to my next CPU.

  • @SpielMitMirSession
    @SpielMitMirSession 21 days ago

    With the highest-end GPU/CPU combination out now, I feel like the only games that bottleneck are poorly optimized ones.
    It seems like optimization isn't even a top-10 priority when making a game anymore. Is this some sort of conspiracy between game developers and hardware manufacturers?