Is Upscaling A Crutch For Lack Of Optimisation?

  • published 6 Sep 2024
  • ► Watch the FULL video here: • DF Direct Weekly #125:...
    ► Support us on Patreon! bit.ly/3jEGjvx
    ► Digital Foundry YouTube: / digitalfoundry
    ► Digital Foundry at Eurogamer: eurogamer.net/...
    ► Follow on Twitter: / digitalfoundry

Comments • 354

  • @Evaquiel
    @Evaquiel a year ago +66

    I think they kinda missed the point of what people are complaining about:
    it's not the fault of the upscaling tech. People are just saying that the game CAN'T run well without it, and the dev THEN recommends you USE upscaling just to get the performance it should have WITHOUT it. That's what they mean by "using it as a crutch", because it takes LESS effort than actually optimizing.

    • @bltzcstrnx
      @bltzcstrnx a year ago +16

      The same thing can be said for cube maps, bump maps, tessellation, and all the other graphical tricks used to achieve "better" graphics. All of them are the same: tricks, aka techniques to achieve reasonable results with lower compute requirements.
      People just know about upscaling technology, so they complain about it. They don't even realize that the entirety of real-time graphics is really just smoke & mirrors.

    • @pixelslaughter3492
      @pixelslaughter3492 a year ago +6

      @@bltzcstrnx You're 100% correct. No idea why everyone hates the idea of using upscaling to achieve good performance. What would be the point of creating this tech and maturing it if you can't use it to benefit your game?

    • @SnakeEngine
      @SnakeEngine a year ago +2

      Well, if it's supposed to run well without upscaling, then what's the point of upscaling? Obviously cutting-edge games recommend you use it until the hardware catches up.

    • @fall-neckthetallneck2977
      @fall-neckthetallneck2977 a year ago +9

      @@bltzcstrnx You completely missed the point. The techniques you described take developers time to optimize in order to make a solid game. They have to make the high-quality assets, retopologize them into lower-poly objects, bake maps, and then set it all up in the engine. That's how you properly optimize a game.
      With UE5, Nanite and FSR 2, devs can literally throw in the high-poly unoptimized assets (at least the objects that don't have to deform), let Nanite do the work, set the internal resolution to 720p, and upscale it. Having the engine do the optimization for you in real time is extremely hard on the hardware, and it's why you need a 2080 Super to play Immortals of Aveum at minimum settings.

    • @bltzcstrnx
      @bltzcstrnx a year ago +1

      @@fall-neckthetallneck2977 The 2080 Super is a four-year-old card. Makes sense that it's now the baseline target instead of the premium one. Not only that: in 2020, consoles were upgraded to around 2070 Super performance. Every developer out there targets consoles as their absolute lowest common denominator. The previous gen (PS4/Xbone) is around RX 470 performance or even lower, so the jump is quite high. Keep in mind that these consoles don't run at max settings; they're usually around medium to high. So really, having a gaming PC with less performance than current-gen consoles is kind of pointless; just use a console in that case.

  • @apostolos8734
    @apostolos8734 a year ago +88

    I don't doubt that when Nvidia, AMD, and Intel developed these upscaling technologies, they expected them to be used to make up for the performance loss from ray tracing. The problem is that developers are simply not using them for that anymore. A lot of new titles need some kind of upscaling to run well on $300+ GPUs at 1080p, and it's annoying a lot of the community. They'd rather slap upscaling tech on their game and release it immediately instead of taking a few extra months to actually fix their product.

    • @George-um2vc
      @George-um2vc a year ago +17

      And ironically their product still releases busted, which makes sense: if they cut corners in one area, they have doubtless cut corners in others. DLSS will not save them.

    • @mondodimotori
      @mondodimotori a year ago +7

      Stop blaming developers. They answer to publishers; they can't do what they want. If a publisher tells them "this is the deadline", that's it.

    • @hoaxlayer
      @hoaxlayer a year ago

      @@mondodimotori These morons who just blame devs for everything barely understand what a publisher is. They just love repeating what they hear or see on social media. They've never seen a single line of code in their lives, and yet they claim that the games they play are "unoptimized". I've given up trying to educate them.

    • @tuc5987
      @tuc5987 a year ago

      "it’s annoying a lot of the community" - it's annoying you, fine, just please don't assume you speak for all of us. Neither should I, of course, but I don't see this widespread annoyance, at all.

  • @SpnCosti
    @SpnCosti a year ago +31

    We bought HDMI 2.1 TVs to play at 20 fps with LFC!

    • @penumbrum3135
      @penumbrum3135 a year ago +15

      Meanwhile 8K is still on the box of the PS5...

    • @SpnCosti
      @SpnCosti a year ago +4

      @@penumbrum3135 😂😂😂
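For anyone unfamiliar with the LFC joke above: Low Framerate Compensation has a VRR display repeat each frame enough times to keep the effective refresh inside the panel's supported window. A minimal sketch of the idea; the 48-120 Hz range is an assumed example, not any specific TV's spec:

```python
# Low Framerate Compensation (LFC): when the game's frame rate falls below a
# VRR display's minimum refresh, the display repeats each frame so the
# effective refresh stays inside the supported window.
def lfc_multiplier(fps, vrr_min=48):
    """Smallest whole-frame repeat count that lifts fps * n to >= vrr_min."""
    n = 1
    while fps * n < vrr_min:
        n += 1
    return n, fps * n

print(lfc_multiplier(20))  # (3, 60): each 20 fps frame is shown 3 times, at 60 Hz
```

So a 20 fps game on an HDMI 2.1 VRR set is displayed as a 60 Hz signal with every frame tripled, which is exactly the situation the comment is mocking.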

  • @cemsengul16
    @cemsengul16 a year ago +6

    Upscaling was the perfect gift for these lazy developers.

  • @McBoj
    @McBoj a year ago +10

    Valid points raised by the team, but I think they glossed over the actual question somewhat. Perhaps because we don't have specific examples yet where the premise is true.
    However, we DO see poorly optimised games at launch (TLOU Part 1) that are improved over time.
    I think it's fair to suggest that developers COULD be lazier with optimisation, given the benefits of upscaling.
    DF pretty much answered it with "upscaling is good", which is not a direct answer.

    • @MrEffectfilms
      @MrEffectfilms a month ago

      Horizon Zero Dawn wasn't well optimized at launch either; now that game will run on a 4-core/8-thread processor and a GTX 1650.

  • @GFSCN69
    @GFSCN69 a year ago +15

    I want real pixels. Even DLSS at "Quality" introduces unwanted blurring.
    Maybe it's different at 4K, but like many, I play at 1440p (27"),
    so the blur is very much visible. I just want games to perform well at native 1440p and offer MSAA,
    the only non-downsampling AA solution that doesn't smear Vaseline on your screen or produce ghosting.

    • @enricod.7198
      @enricod.7198 a year ago +2

      MSAA is actually downsampling (higher res to lower res), while all the other techniques, DLSS included, are upsampling (from lower res to higher res).

    • @tbunreall
      @tbunreall a year ago

      MSAA doesn't work on modern games; all those fancy effects you get come from compute shaders, which MSAA doesn't affect.
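For context on the resolutions this thread keeps arguing about: temporal upscalers render internally at a fixed fraction of the output resolution per quality mode. A quick sketch of the arithmetic; the per-axis factors below are the commonly published DLSS/FSR 2 presets, and the assumption is that a game uses these standard ratios rather than custom ones:

```python
# Internal render resolution implied by an upscaler quality mode.
# Per-axis scale factors match the commonly published DLSS / FSR 2 presets;
# individual games may expose custom ratios (assumed standard here).
MODES = {
    "Quality": 1 / 1.5,         # ~0.667 per axis
    "Balanced": 1 / 1.724,      # ~0.58 per axis
    "Performance": 1 / 2.0,     # 0.5 per axis
    "Ultra Performance": 1 / 3.0,
}

def internal_resolution(out_w, out_h, mode):
    """Return the (width, height) the GPU actually renders before upscaling."""
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

# A 1440p monitor on "Quality" is really rendering well below 1080p:
print(internal_resolution(2560, 1440, "Quality"))      # (1707, 960)
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

Which is why the blur complaint above is plausible at 1440p: "Quality" mode there reconstructs the image from a 960-line render.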

  • @michaelmonstar4276
    @michaelmonstar4276 a year ago +18

    7:10 - What they probably meant is that the "generated pixels" are basically made up, as in guessed by an algorithm, and not the "real" ones that were originally designed by the developers. - I think you know what they mean, but just choose to play ignorant. - This is a very real thing: like upscaling any image, the algorithm makes up a bunch of stuff that isn't there. The proof is in all the issues that upscaling has shown us, whether it's the trails in early DLSS or the speckles in FSR. - Because of that, the image isn't really completely as intended anymore. It might become more and more "as intended" as these technologies improve, but it's a completely valid idea that it's all becoming "faked". - Come on now. It's not about the literal pixels being rendered, it's about whether they're true to the source or not. - Let's be real: movie purists wouldn't accept an upscaled video, only a 1:1 master of a film. - Video games are a bit different in that they're more dynamic and you move the camera almost all the time, which varies the experience. But some games might become so "post-generated" that they don't really look like they should anymore, especially for people who use heavy upscaling. I've seen some real-use screenshots on Steam, and oh boy... And you know it's because most people don't have a powerful enough GPU.

    • @smittyvanjagermanjenson182
      @smittyvanjagermanjenson182 a year ago +5

      They aren't playing ignorant lmao. When all is said and done, the end user is still seeing pixels on screen; it doesn't matter past that point. You can cry about AI upscaling tricks and break it down any way you want to feel justified, but it's still putting physical pixels on the display. Try to enjoy games and not be so damn uptight about something that has become a standard in graphics technology.

  • @Skay.
    @Skay. a year ago +15

    Holy damn dudes, prime example of "I think I know what I'm talking about, so I'll talk about it". This was unironically embarrassing to watch.
    Also, literally the moment upscaling is a requirement (and still works like shit), they DO depend on it. How is that even an argument?

    • @Wobbothe3rd
      @Wobbothe3rd a year ago +3

      No one is arguing that modern engines don't depend on upscaling. That's not the point.

  • @3k4landrew
    @3k4landrew a year ago +40

    I believe upscaling can definitely be used as a crutch to get away with not optimizing games properly, but it can also be used as a tool to get more out of your game.

    • @George-um2vc
      @George-um2vc a year ago +7

      It's both. What makes Alex's point of view crumble is games such as Gollum, which looked like absolute trash in every way yet performed so badly that even with DLSS the fps couldn't be justified.
      edit: could they have just made a technically incompetent game that happened to implement upscaling? Theoretically yes, but really, what do you think actually happened? They probably botched the optimisation (if they even tried) and knew upscaling and DLSS 3 were at least a way out so people could at least run the game (they still failed btw). Which brings up another point: frame generation and upscaling will not suddenly make a game playable if the frame time graphs are all over the place.

    • @roundearthshill248
      @roundearthshill248 a year ago

      what's an example of a game where DLSS was used as a crutch?

    • @joshm60
      @joshm60 a year ago +1

      Yeah. I think "IS upscaling a crutch for a lack of optimization" is different from "ARE SOME DEVELOPERS USING upscaling AS a crutch for a lack of optimization". DF is answering the first question here.

    • @Raging
      @Raging a year ago +2

      @@George-um2vc But you're talking about a game that WAS bad, that WAS panned, to the point it killed the studio. That's not a good example.

    • @George-um2vc
      @George-um2vc a year ago +1

      @@Raging Case in point: this is the perfect example.
      It shows why half-assing optimisation is a bad idea, and how upscaling won't save them.
      Most studios won't die because of it, especially if the gameplay is good, as with Jedi: Survivor; that PC release was abominable, yet the game underneath is great.
      Gollum was both badly optimised and a trash game altogether, so the studio went under, but that has nothing to do with what we're talking about. We're talking about whether some devs are skipping crucial optimisation passes and relying on upscaling to provide playable frame rates. It's happening, and it's bad; that's the point.

  • @Luizanimado
    @Luizanimado a year ago +60

    I think I disagree with this gentleman: the moment you make DLSS a requirement, you are in fact depending on the upscaler to optimize your game.

    • @Klaeyy
      @Klaeyy a year ago +6

      Well yeah, but that wasn't the point.
      They're saying that graphics have been pushed so much further, so quickly, that GPUs can't keep up with the improvements, and that DLSS and the like therefore isn't just a requirement: it's the reason you can have these graphics at all.
      As in: upscaling IS the reason for better graphics at those output resolutions, and you simply couldn't have this kind of graphics on common gaming PCs without using upscaling, unless you want to go back to playing at 720p-900p. Not using it at all would hold us back.
      ...I do sort of agree, though, that there are probably devs and games that make a "little too much" use of upscaling.
      However, this is a "two sides of the same coin" situation.
      1. You get games and engines that ARE well optimized but that can be pushed even further on common gaming PCs, for even better graphics, by using upscaling. Especially when using modern options like raytracing/pathtracing.
      2. But you also might get games that are absolute crap performance-wise, and they just slap upscaling on to get them to run "well" AT ALL. And they still might not run that well.
      (3. And probably some games in between.)

    • @Art0r1a
      @Art0r1a a year ago +7

      @@Klaeyy I think saying 'GPUs can't keep up' is wrong.
      It's the developers who adjust games to the capability of new hardware, not the other way around.
      Nowadays more and more games are what Crysis was: a game developed not for the current GPUs, but for the ones of the future.

    • @Klaeyy
      @Klaeyy a year ago +1

      @@Art0r1a True, but you could argue that good upscaling now IS part of the capabilities of modern GPUs, because they have dedicated hardware to handle upscaling, which accelerates the process without taxing the "traditional parts" of the GPU.
      So keeping something like DLSS in mind is, for example, a way of targeting the capabilities of modern Nvidia GPUs.
      I personally agree with you, however, that they should develop all games without upscaling in mind BUT THEN STILL implement it at the end.
      So that the game runs great natively on current gaming hardware, while you can turn on DLSS to push resolution and/or framerate far further, or to get the game running at the same settings on much weaker hardware with similar performance.
      The only issue is heavy use of raytracing and/or pathtracing. Those simply require upscaling (and sometimes even framegen) to work well at all; they're just that taxing.

    • @Luizanimado
      @Luizanimado a year ago +5

      @@Klaeyy Well, consoles don't have DLSS, and the graphics have pretty much the same geometry, the same engines and so on, so I don't know, saying that GPUs can't handle it is not very accurate.

    • @Klaeyy
      @Klaeyy a year ago +4

      @@Luizanimado Consoles do have FSR 2 and temporal upscaling techniques, which essentially do the same thing.
      FSR 2 is pretty much AMD's version of DLSS, and even if it isn't always as good, it's very comparable.
      Consoles make use of this all the time, and when they don't, they usually use TAAU (temporal anti-aliasing upscaling), or checkerboard rendering, which also upscales an image from a lower resolution using the previous frame.
      ...and when they don't use any of that, it's just a game that's easy enough to run, and those don't require it on PC either.
      So consoles suffer from the same "issues" here.
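Since checkerboard rendering comes up here: the core idea is that each frame shades only half the pixels in an alternating checkerboard pattern, and the holes are filled from the previous frame. A toy sketch under that simplification; real console implementations also reproject the previous frame with motion vectors and blend, which this deliberately omits:

```python
# Toy checkerboard reconstruction: each frame renders only half the pixels
# (alternating checkerboard phases) and fills the holes from the previous
# reconstructed frame. Shows the halved per-frame shading cost only.
def render_checkerboard(scene, w, h, phase):
    """Shade only pixels where (x + y) % 2 == phase; None elsewhere."""
    return [[scene(x, y) if (x + y) % 2 == phase else None
             for x in range(w)] for y in range(h)]

def reconstruct(current, previous):
    """Fill unrendered pixels from the previous reconstructed frame."""
    return [[c if c is not None else previous[y][x]
             for x, c in enumerate(row)] for y, row in enumerate(current)]

# A trivial static "scene": two alternating frames cover every pixel.
scene = lambda x, y: (x, y)
full = [[scene(x, y) for x in range(4)] for y in range(4)]  # bootstrap frame
f1 = render_checkerboard(scene, 4, 4, 1)                    # half the pixels
out = reconstruct(f1, full)
print(out == full)  # True: on a static scene it matches the full render
```

Motion is exactly where the artifacts come from: when the camera moves, the previous frame's pixels no longer line up, which is why reprojection quality decides how good checkerboarding looks.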

  • @marklarz4399
    @marklarz4399 a year ago +73

    Optimization? What's that? Devs just blame the Xbox Series S

    • @deathtrooper2048
      @deathtrooper2048 a year ago +14

      To be fair, the Series S is a piece of junk and is way weaker than the Series X and PS5.

    • @PapaClay115
      @PapaClay115 a year ago +15

      @@deathtrooper2048 Though it isn't as powerful, it's cool being able to get into the current generation for only $300. Would have loved that as a kid. I don't have one, but I see the value.

    • @abdalazeemal-mashaqba6600
      @abdalazeemal-mashaqba6600 a year ago

      Forget about consoles; it's a real problem on PC with old hardware.

    • @Fuuntag
      @Fuuntag a year ago +7

      Jokes aside, I think time will be kind to the Series S. Genuinely a bold endeavour and a well-designed piece of hardware for what it is.

    • @Wobbothe3rd
      @Wobbothe3rd a year ago +2

      @@deathtrooper2048 The PS5 is really just a beefed-up Series S. At the end of the day all the consoles are just AMD APUs, way weaker than any Intel/Nvidia PC combo.

  • @ni55an86
    @ni55an86 a year ago +47

    I think the layman's idea, though, is that if your game cannot perform at acceptable framerates even with upscalers, then it's your JOB as a dev to either optimize or DOWNGRADE to the point where performance is acceptable. And that doesn't seem to be what is happening.

    • @Harvester88
      @Harvester88 a year ago +1

      If they downgrade, people will complain; either way the studio is going to catch flak. I do agree with your overall sentiment, though, but I still remember the stupid puddle-gate from Spider-Man. That was just dumb, but it's the vocal minority that is always voicing those opinions.

    • @Wobbothe3rd
      @Wobbothe3rd a year ago +1

      You can't downgrade your way out of CPU limitations, though; that's a fundamental programming and architectural issue. And AMD Zen 2 CPUs are relatively weak.

    • @Wobbothe3rd
      @Wobbothe3rd a year ago

      @spiral7399 Those things actually change core gameplay; the topic of this video is GPU optimization (i.e. upscaling). If the game is multiplayer, then all the platforms have to suffer for the lowest common denominator, or the game can't work at all. Most people are not on weaker-than-Zen 2 CPUs; read the Steam Hardware Survey lol. But you're on to something: maybe it's AMD that is holding back gaming, not Epic...

    • @yewtewbstew547
      @yewtewbstew547 a year ago +3

      @spiral7399 _Reduce AI complexity, number of npcs, cutbacks on physics etc. And they should be doing this considering most people on PC are on older CPUs than Zen 2 and a lot of spec requirements list that architecture as compatible._
      Yep.
      Baldur's Gate 3 Act 3 be shuffling nervously with its Ryzen 3600 recommended spec lol.

    • @smittyvanjagermanjenson182
      @smittyvanjagermanjenson182 a year ago +1

      So you want every GPU to be limited to the strengths of a GTX 1650? Because if they were to butcher a game in that way, they'd be targeting the most popular Steam survey card, not the 12 people with a 4090 Ti.

  • @Ajstyle48
    @Ajstyle48 a year ago +21

    Games in 2023: no story, no emotion, no optimization, just graphics graphics graphics, and 20fps and stutter. 😂

    • @SpnCosti
      @SpnCosti a year ago +8

      No story, no emotion, the simplest gameplay ever hailed by reviewers as impressive, and the worst blurry PS3-era graphics! We bought HDMI 2.1 TVs to play at 20 fps!

    • @deathtrooper2048
      @deathtrooper2048 a year ago +8

      Not even the graphics are getting that much better.

    • @Ajstyle48
      @Ajstyle48 a year ago +1

      @@SpnCosti Graphically, PS4 and PS5 games have minor differences like reflections and shadows; otherwise they're the same games at different resolutions. I believe PS2 to PS3 was the biggest leap in graphics; since then, the graphics upgrade from gen to gen has been small.

    • @Ajstyle48
      @Ajstyle48 a year ago

      @@deathtrooper2048 Yeah, every gen the upgrade just focuses on resolution and fps, but consoles still can't run every game at 60fps.

  • @TheSwayzeTrain
    @TheSwayzeTrain a year ago +36

    I listened to the whole video and I understand what you guys are saying about UE5 performance per pixel. I'm obviously not as technically minded as you guys, but subjectively, to me, Immortals of Aveum and Remnant 2 (I own both) don't actually look graphically better than older titles such as Jedi Survivor, Fallen Order or Doom Eternal, while at the same time running significantly worse on my current system.
    That's not to say UE5 won't improve in time, but in their current state it appears the games aren't very well optimized.

    • @MrPragnienie1993
      @MrPragnienie1993 a year ago +12

      I agree 100%. Not everybody cares that these newer games are much more refined from a technical point of view. If these new and more advanced games don't have markedly better visuals, where even a casual gamer could spot the difference, then to a lot of people the technologies at work under the hood simply won't be meaningful and won't justify their performance cost.

    • @Wobbothe3rd
      @Wobbothe3rd a year ago +1

      Doom Eternal has static baked lighting and is one of the best-threaded engines in modern gaming. Remnant 2 has DYNAMICALLY GENERATED LEVELS UNIQUE TO EVERY GAME, and Aveum has completely dynamic lighting and shadowing. You're BLIND if you think those games have inferior tech; there's nothing "subjective" about it at all.

    • @yewtewbstew547
      @yewtewbstew547 a year ago +8

      @@Wobbothe3rd He didn't say they have inferior tech; he said he doesn't notice much difference visually. Which is both subjective and relevant here, I think, because sometimes using older, less computationally expensive technology is a worthwhile trade-off in exchange for increased performance. One example is how Starfield uses cube maps for some of its reflections instead of screen space, which I think DF talked about in one of their videos on that game.

    • @Wobbothe3rd
      @Wobbothe3rd a year ago

      @@yewtewbstew547 Dynamic lighting opens up entire new gameplay concepts that aren't even possible with older technologies, and like I said before, Doom Eternal is an extreme outlier. I stand by my point 100%: if you don't see much of a difference, you're either blind or completely missing the forest for the trees.

    • @yewtewbstew547
      @yewtewbstew547 a year ago +9

      @@Wobbothe3rd Yes, but in games that DON'T utilize their dynamic lighting in gameplay (lighting interactions and the like), which is a surprisingly large number of them, it is almost visually indistinguishable from static lighting whilst being significantly more computationally expensive. At that point it's really just acting as an unnecessary performance vacuum.

  • @DeesBees85
    @DeesBees85 a year ago +61

    Alex is very quick to suggest devs are not relying on these upscaling techs to make up for optimization, but I think reality says otherwise. I'm not saying devs are literally lazy, but if management says ship the game now, suggesting DLSS as the answer to poor optimization certainly seems like the next step, and I think we've seen this happen a lot this past year.

    • @Isaiahchannel
      @Isaiahchannel a year ago +4

      You’re 100% correct. I like Alex. But he is dead wrong on this.

    • @mondodimotori
      @mondodimotori a year ago +3

      No, Alex is right. These upscaling technologies would be there no matter what. They aren't added at the end of the dev cycle to make up for a supposed "lack of optimization".
      Tech-incompetent people are always ready to jump on the "muh no optimization" train (like when they discovered their overpriced 8GB GPUs weren't good enough for next gen), and it's an argument already covered in previous DF videos.

    • @Isaiahchannel
      @Isaiahchannel a year ago +4

      @@mondodimotori I disagree. When developers openly admit that upscaling can be used to overcome optimization, that's all the proof we need. Btw, just because you disagree with someone doesn't mean you should call them "tech incompetent." Use logic here: what do you think would happen if upscaling weren't around? Would developers just stop making games? No, of course not. They would make games more in line with today's hardware and take the extra time needed to ensure proper optimization. But instead, publishers force developers to hit deadlines, so optimization is overlooked because it can be negated with upscaling. This is well documented. I'm not sure what there is to argue about here. No one is saying upscaling is bad. We're just saying it should be the icing on the cake, not the cake itself.

    • @mondodimotori
      @mondodimotori a year ago +3

      @@Isaiahchannel They wouldn't use modern rendering technologies.
      And yes, I call out the ignorance because it's true.
      Devs have to answer to a publisher. Y'all blaming devs have no idea how the industry works, let alone how the tech behind it works.
      It's not a crutch, and Alex knows it because, would you look at that, he's a bit more competent than the average gamer.
      When a developer reads comments like this, they cringe.

    • @Isaiahchannel
      @Isaiahchannel a year ago +4

      @@mondodimotori lol, right, ok. Instead of talking about the issue logically, you hurl insults at your fellow gaming community and cry about the feelings of developers as if you know. You don't know. Stop worshipping at the altar of Alex. The majority of this comment section disagrees with you and him. Maybe you should take a step back and reevaluate.

  • @skieth9999
    @skieth9999 a year ago +8

    The basic issue is that these upscaling techniques were meant to aid in making raytraced games playable, not to replace native-resolution rasterization.
    But devs are using them for that because it's genuinely easier to get their games to 4K60 (or whatever) that way than to go for native 4K60, and the average consumer cannot tell the difference.
    And the irony is that I think it's leading to a situation where even the upscaled performance and resolution will drop. I can see devs at a certain point dropping the internal resolution to 720p and using Ultra Performance mode in DLSS or FSR as the baseline for how they develop games, or starting to target 2K instead of 4K as the upscaled resolution.

    • @techsamurai11
      @techsamurai11 a year ago +3

      I think Sony and Microsoft should enforce a minimum internal resolution for games. You can't have the PS5 and Xbox Series X rendering at 720p; it almost constitutes fraud when the consoles have 8K labels on them.

    • @smittyvanjagermanjenson182
      @smittyvanjagermanjenson182 a year ago +2

      I'm very doubtful of performance going back down while using upscaling. DLSS 3.5 and FSR 3 were already announced; I don't believe they'll stop working on upscaling performance options. I do think we'll hit a point where it finally breaks all these weird elitists who refuse to use it, because it'll become completely indistinguishable. That's far more likely.

    • @techsamurai11
      @techsamurai11 a year ago +1

      @@smittyvanjagermanjenson182 Well, DF did a video where they separated the frame-gen frames from the rendered frames, and they are impossible to tell apart.
      There's nothing wrong with these upscaling solutions. Most games should render between 1080p and 1440p and let the upscaler do its job to make it look like 4K.
      The issue is when the internal resolution drops to 720p: at 1080p, FSR 2.0 has 2.25 times more pixels to work with in upscaling the image, so the results are much better. At 1440p upscaled to 4K, the results are practically indistinguishable and in some ways better than native 4K.
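The pixel-budget numbers in this comment check out; a quick sketch of the arithmetic:

```python
# How much data the upscaler has to work with at different internal
# resolutions, and the per-axis stretch needed to reach a 4K output.
def pixels(w, h):
    return w * h

# 1080p internal vs 720p internal:
print(pixels(1920, 1080) / pixels(1280, 720))  # 2.25x the samples

# Per-axis upscale factor to a 3840x2160 output:
print(3840 / 1920)  # 2.0 from 1080p
print(3840 / 1280)  # 3.0 from 720p
```

So a 720p internal frame not only carries less than half the samples of a 1080p one, each of them also has to be stretched 50% further per axis, which compounds the quality loss.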

  • @matthewprothero2088
    @matthewprothero2088 a year ago +43

    I don't think the optimization problem is necessarily due to upscaling, because even with upscaling most of these games don't run well. I think the biggest problem is that these games require hardware that is out of reach for the vast majority of people. If we really want consumers to embrace new technologies like frame gen and ray tracing, then we need hardware that can actually do them well for a reasonable price. Back in 2016, you could get a graphics card for $200 that could run almost anything at max settings at 1080p. Now, for $200, MAYBE your GPU is barely capable of doing ray tracing at a 540p internal res. Most of these games require at least a $350 GPU to run acceptably on PC now, which is unacceptable compared to the consoles. TLDR: the main issue for 9th-gen PC games is that the hardware to run them properly is out of reach for most people (and that's ignoring all the releases with poor CPU perf or shader/PSO stutter).

    • @Ender-im5ez
      @Ender-im5ez a year ago +14

      Sorry for the comment, I kinda misread it. But I still want to add to your argument.
      Back when the PS4 and Xbox One came out, you could buy a GTX 760, a 60-class card that was $250 at launch, to get a comparable or better experience than those systems. When the PS5 and Series X came out, you needed a 2070 Super or even a 2080 to match them. And that was AT LEAST $500 for the card alone!
      It took four years for cards under $300 to be competitive with those systems power-wise, with the 4060 and 7600, and even then they have disadvantages like VRAM. But four years after the last-gen systems came out, you could get a 1060 for $300, which towers over a PS4 and even competes with the pro systems power-wise.
      That is why I like to say that the golden age of PC is over. The market is just not able to catch up quickly enough. It's no longer about getting max settings at 60, but about trying to get a similar experience to consoles.
      I do agree with the CPU and stutter stuff, though; it's the only thing I'd call an "optimization" problem.

    • @nikitos3558
      @nikitos3558 a year ago

      @@Ender-im5ez You're right.

    • @Wobbothe3rd
      @Wobbothe3rd a year ago +4

      Read the Steam Hardware Survey; more people have high-end GPUs than you think. BUT EVEN IF YOU WERE RIGHT, so what? Games pull technology forward. Even Doom itself did not run at 35fps on most people's machines when it was first released; this is how PC gaming has ALWAYS BEEN. 2016 was a bizarre outlier during a time of relative stagnation due to an extended console cycle. 2016 was NOT a "golden age" of PC gaming; it was actually the NADIR of graphics tech in many ways.

    • @kenroyforrester156
      @kenroyforrester156 a year ago +6

      @@Wobbothe3rd What a load of crap 😂😂😂😂😂😂

    • @Ender-im5ez
      @Ender-im5ez a year ago +2

      @@Wobbothe3rd True, but it was the fact that graphics didn't move forward that made the "golden age" of PC gaming. It made it accessible for a lot of people to just add a graphics card to an office PC and get a better experience than a console. But now the new consoles are good, we're starting to see what can be done with them with Lumen, and PC hardware can't keep up the way it did last gen, so you can no longer get that experience like you could in 2016. But so many people jumped into PC gaming around 2016 that when they see the new status quo, they think it's the game's fault. It's annoying to hear.
      Also, on the hardware survey: I checked, and only around 35 percent of people have GPUs that can compete with the current-gen systems (2070 and above). Perhaps a lot more if you add Series S-competitive hardware, but those people are going to have to lower their expectations even more than usual, because the Series S is either only at 30fps or at really low internal resolutions (if Immortals of Aveum is 720p on Series X, I can't imagine how low it is on S). Nothing wrong with 30 fps imo, but PC gamers are, to put it nicely, spoiled from last gen.

  • @joseyparsons7270
    @joseyparsons7270 a year ago +13

    Devs should just bake their lighting and use quality AO to make up for the lighting of dynamic objects. Done, problem solved. We don't need more polygons; that's it. A game like Far Cry 5 looks better than 99% of the games that came out this year and still runs like a dream, and that game came out in 2018.

    • @enricod.7198
      @enricod.7198 a year ago +3

      Yeah, they don't even look that good. Take Hogwarts Legacy, a UE4 game: it looks on par with Days Gone, also UE4, but runs at half the framerate for no reason. Also, games that gain 70% more fps by dialing down some settings with basically no visual loss... that means they're not optimized and something is crippling performance for nothing.

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

      Baked lighting affects GAMEPLAY. You're missing the forest for the trees.

    • @joseyparsons7270
      @joseyparsons7270 Před rokem +3

      @@Wobbothe3rd you are so right, it does affect gameplay: it gives more FPS. Seriously, almost none of the games with realistic real-time lighting NEED it. Realistic real-time lighting was only a necessity in games with dynamic environments, like Quantum Break. Does Immortals of Aveum have such dynamic environments? No, it doesn't, so why does it need realistic real-time lighting? Simple answer: it doesn't.

  • @OutrunCid
    @OutrunCid Před rokem +49

    The question is both ironic and highly valid at the same time. Ironic, because the same could be asked of techniques like cube maps and SSR, which are fake pixels in terms of lighting, or tessellation, which is a cheap trick substituting for actual geometry creation. We've done this for years and we were fine with it, at least until Alex showed us the shortcomings of SSR. Thanks Alex.
    Looking at the past, neither anisotropic filtering nor anti-aliasing techniques have stopped gaming from trying to achieve higher (texture) resolution.
    What I am actually more concerned about is companies like Nvidia skimping on silicon, since it's possible to achieve the same framerates with smaller chips. When developers optimize games with upscaling in mind, while graphics cards treat DLSS not as an added feature but as a downright replacement of pixel rendering with pixel generation, gamers will end up in a situation where the person who asked the original question has a very valid point.

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

      Yeah, that's what Nvidia is known for, "skimping on the silicon..." when everybody complained about Turing including RT and Tensor cores at the expense of pure raster 🙄

  • @jmtradbr
    @jmtradbr Před rokem +3

    When a game is releasing at 720p on the best consoles of the newer generation... Yes.

  • @imniallg
    @imniallg Před rokem +7

    After playing some of Immortals of Aveum…yes.

  • @lovelybitofsquirrel8272
    @lovelybitofsquirrel8272 Před rokem +9

    my hot take is that games should be optimized well WITHOUT DLSS turned on.
    DLSS should be additive, not something that is required.
    I don't mind games that have settings for the future, but they need to remember that these games also need to be playable on current hardware.

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

      Next time watch the video. Some technologies are so demanding that they require upscaling to even be playable.

    • @lovelybitofsquirrel8272
      @lovelybitofsquirrel8272 Před rokem +4

      @@Wobbothe3rd I did watch the video.
      If they require upscaling, then perhaps the hardware of the userbase isn't there yet? Most people don't even own capable cards that can do upscaling; they're making games for a very small userbase.
      It's alright if they want a really good-looking game, but it needs to scale the other way too, and they don't do that at all at the moment. 9 times out of 10 the low settings don't look much worse than the highest, and the performance increase isn't enough.
      The fact that they have a whole "optimized settings" section in all their reviews should tell you enough.

    • @steven7834
      @steven7834 Před rokem +1

      @@Wobbothe3rd Those can be optional you know. And it doesn't need those technologies to make the game still look good. So many games in the 360/PS3 era still look good or even better than most games nowadays.

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

      ​​@@lovelybitofsquirrel8272the hardware of the user base is NEVER "there yet" and that's OK. That's how PC gaming has always been going back to the late 80s. Gaming tech pulls forward from the future. Hint: people with newer hardware tend to buy the most games.

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

      ​@@steven7834 if you think any game from the PS3/360 days looks anything close to games today... get an eye test, and don't drive yourself there lol

  • @jzharayjarol2543
    @jzharayjarol2543 Před rokem +12

    I just want to separate resolution scaling and frame generation in this conversation, as they are two techniques that are NOT equal. Upscaling is a great way to get more effects on screen while sometimes looking better than native. Frame generation is a very fancy method of frame doubling which can certainly ease issues of erratic frame times, but it is not at all the true frame rate of the game itself. It cannot smooth over issues in low-level systems such as AI, pathfinding, and object interactivity, which are more often being done with non-native code (Blueprints in Unreal), resulting in substandard performance. We cannot just gloss over both of these techniques without putting down a hard line between what is helping generate "hard" frames and what is not.

  • @olotocolo
    @olotocolo Před rokem +11

    That is such a weird take...
    - Yes, DLSS is good when it enables new experiences, like with Cyberpunk path-traced. But it's ridiculously optimistic to say no one will use it to be lazy. Companies, AND ESPECIALLY AAA, are notorious for cutting corners for the sake of cutting corners. What in Immortals of Aveum is new? What experience does it provide that justifies it not being able to run natively on a 4090? How is that not an absolute cop-out from making your game right? Especially as was said in this very clip: UE5 is very much dependent on real pixels (and that is the definition of a real pixel, btw). And Immortals cannot be played without DLSS, so it actually detracts from the new technology of UE5.
    - "DLSS is making a better-looking image": then why did Nvidia only now introduce DLSS 3.5, which fixes a lot of the issues that ray tracing in combination with DLSS creates? After all, DLSS exists to give us access to those new technologies, right? But when pushed to upscale with all the effects on, it still has many issues. And we are already forced to use it when it's far from perfect.
    None of this adds up. "DLSS allows us to use new technology", only not really, because the baseline requirements shift: where we used to need DLSS only if we wanted RT, now we need it simply to play. "It produces an even better image", unless you want the best visuals, in which case it introduces a lot of errors.
    This is why people are saying this: they are forced to use a very imperfect tool and gain basically nothing in return beyond what was expected to begin with.
    czcams.com/video/O5B_dqi_Syc/video.html
    And Hardware Unboxed shows many ways in which DLSS produces, at best, a slightly worse image than native.

    • @Teja
      @Teja Před rokem

      thanks for the hardware unboxed video !
      and i agree with the rest of your comment

  • @deathtrooper2048
    @deathtrooper2048 Před rokem +23

    We will never see ports as impressive as Half-Life 2 or Doom 3 on the OG Xbox again. The 8th gen is going to be the last generation with devs who aren't lazy or incompetent. The biggest problem is that devs will force upscaling on consoles and downgrade from the resolutions we were getting last gen, with the same or even worse framerate, while having very similar graphics to last-gen systems.

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

      That's because thanks to technologies like UE5 we don't have to have "ports" AT ALL, games are now designed for consoles from the ground up. You're missing the forest for the trees.

    • @IHeart16Bit
      @IHeart16Bit Před rokem +2

      The Switch says hi for modern miracle ports :P

    • @enricod.7198
      @enricod.7198 Před rokem

      Yeah, FF16 at 720p 60fps with framerate dips is like PS2 territory lmao

  • @luismyWH
    @luismyWH Před rokem +6

    Please tell me where that bump in quality you talk about at 5:13 is, because games (especially Unreal Engine ones) still look like somebody rubbed Vaseline on my screen, and they're a stutter fest. They look the same as or worse than 3 years ago when I got my 3080, which is now considered useless by devs because it's not a 4090.

    • @joseyparsons7270
      @joseyparsons7270 Před rokem +5

      yeah games today somehow look worse and are harder to run, they should bring back baking

    • @SpnCosti
      @SpnCosti Před rokem +1

      Yeah, I really don't know what he's talking about! Just spouting stuff!

    • @steven7834
      @steven7834 Před rokem +3

      We need to ask developers for more accountability on optimization. So much time and so many tools have been available for almost 20 years; we shouldn't still be making these kinds of mistakes. Being a developer myself, I can say we aren't always right and can't blame it on "gamers" who demand a decent product.

  • @flink1231
    @flink1231 Před rokem +4

    What I think upsets people is the combination of hardware prices going up at the same time as game requirements (and, depending on where you live, the economy diving and high inflation). DLSS and FSR are great technologies for reducing the effect of this combination...

    • @FreeloaderUK
      @FreeloaderUK Před rokem +1

      Hardware prices go up a lot, game hardware requirements go up, GPUs performance only improve slightly. DLSS becomes essential to get decent framerates.

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

      The 4060 is cheaper than the 3060. Hardware prices will probably NEVER be as relatively high as they were during the pandemic, and yet gaming grew faster than ever. The 3dfx Voodoo 1 cost $450 in 1996 dollars, not even adjusting for inflation. People just love to whine.

    • @johnmoore1495
      @johnmoore1495 Před 7 měsíci

      @@Wobbothe3rd $450 in 1996 is still only about $894 in today's dollars, a far cry from a $1600 MSRP that translates to $1800-$2000 for a 4090. The age of being able to buy a 60-class card for $200-$250 is seemingly dead, which is a problem. According to the Steam hardware survey, there are almost as many 4090s as 4060s, which goes to show that people aren't finding value in the lower-class cards, so they're waiting.
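      For what it's worth, the inflation adjustment being argued over in these replies is just a ratio of consumer price index values; a minimal sketch (the CPI figures below are approximate assumptions, not official data):

      ```python
      # Rough inflation adjustment using a consumer price index ratio.
      # The CPI values here are approximate US annual averages (assumptions).
      CPI = {1996: 156.9, 2023: 304.7}

      def adjust_dollars(amount, from_year, to_year):
          """Scale a dollar amount by the ratio of the two years' CPI values."""
          return amount * CPI[to_year] / CPI[from_year]

      # The $450 Voodoo 1 figure from the thread, in ~2023 dollars:
      print(round(adjust_dollars(450, 1996, 2023)))  # in the ballpark of $870-$900
      ```

      The exact result depends on which month's index you pick, which is why different commenters land on slightly different figures.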

  • @anthonyrizzo9043
    @anthonyrizzo9043 Před rokem +8

    Regarding what Alex is saying, it seems to me that UE5 games gain more from upscaling than other games do, and that makes sense.

  • @manicfoot
    @manicfoot Před rokem +7

    I've yet to play a game where DLSS hasn't been beneficial, so I have nothing against upscaling technology at all. It allows GPUs to run games at settings they'd otherwise struggle with. I really hope Nintendo ends up using DLSS in the next Switch; it'd really help it punch above its weight.
    I do, however, think that optimisation of console ports is a big problem on PC. Shader compilation stutter is far too common. Over the last couple of years I've had so many bad experiences of games randomly hitching in ways they simply don't on consoles.

    • @Conner06
      @Conner06 Před 9 měsíci +2

      You don’t have eyes? DLSS looks horribly blurry at anything under 1440p. The 720p switch screen would be a ghosting blurry mess.

  • @uleertel
    @uleertel Před rokem +26

    I'm not sure I agree with their claim that quality per pixel has dramatically improved in recent years and that this justifies using DLSS to improve performance. For ray tracing, okay (DLSS was more or less created to make RT manageable), but outside of that, what are the improvements? It's just one example out of many, but I've been playing Red Dead Redemption 2 on PC lately, and quite frankly I don't see anything groundbreaking since then (outside of RT, again). Another example: I played Jedi Survivor; it's good, but I don't see anything justifying its lack of performance. DLSS is great, no issue with that, but I think the original question was led by the number of badly optimized titles DF has exposed lately.

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem +2

      You're blind lol.

    • @uleertel
      @uleertel Před rokem +2

      @@Wobbothe3rd what an interesting response!

    • @haroldsmith4179
      @haroldsmith4179 Před rokem

      I remember when Quake 2, Doom 3, Crysis, etc. came out; they were huge jumps in graphics, so the performance cost was justified. Remnant 2 barely looks better than UE4 games like Gears 5 and runs at a third of the framerate. Maybe it's bad optimization, maybe it's diminishing returns, or maybe a bit of both.

  • @docmars
    @docmars Před rokem +12

    To answer the video title: Yes, unfortunately.
    It really should be used to take 60-90fps games and propel them up to 120fps+ for higher refresh rate monitors and resolutions. Modern games, even with their visual upgrades, should be targeting 60fps on modern PC hardware, without the need for upscaling. Raw native raster performance.
    If GPU makers aren't making GPUs powerful enough to hit those targets, given the demands of newer games, then they should prioritize raw raster performance, so upscalers can do what they do best: opening the way to smooth 4K @ 120Hz (or higher) gaming.

    • @SummerStars504
      @SummerStars504 Před rokem

      Yes, I agree, I want raw raster performance, and I don't play at ultra settings; I usually set textures to high and everything else to low, and if I can't hit the fps with everything else on low, something is wrong. I do like upscalers: FSR2, if implemented correctly, looks great at 1440p quality mode. But to me DLSS is hit or miss depending on the game, and most of the time, even on quality, I still see the ghosting effect, which is why I prefer FSR2 over DLSS.

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

      Blame AMD for their duopoly over BOTH console vendors with their weaker GPUs.

  • @unamedjoe830
    @unamedjoe830 Před rokem +5

    It absolutely is. Considering we were doing everything natively 5 years ago.

  • @Zinkolo
    @Zinkolo Před rokem +5

    I don't think the comparison with CGI and practical effects is a very good one given that with video games literally everything is computer-generated so it doesn't really matter how things are done.
    There's a meaningful argument against CGI when it still, to this day, looks pretty terrible in most cases and just isn't believable compared to practical, especially when it's overused, whereas practical effects are basically solid 100% of the time.
    There have been examples where studios have opted to use CGI when practical effects were already used just for the sake of "modernizing"
    Practical effects will always be better wherever possible because they're real.
    You don't have to worry about it looking overly smooth or uncanny because they have to physically move it in the real world.
    There's a legitimate argument there and I'd hate for it to be buried and mixed in with video games and upscaling because those are two entirely different arguments.
    You can turn off upscalers. No one's forcing you to.
    Of course it's going to look bad when you put it on ultra performance, because you're literally using the most aggressive preset, but at the very least it's upscaling as opposed to no upscaling.

  • @Teatime4Tom
    @Teatime4Tom Před rokem +2

    Love the clips format. I'm watching much more of your content now.

  • @steven7834
    @steven7834 Před rokem +9

    We can't depend on expensive technology to make a game look "good". That technology should be optional, to make the game look better; the base game still needs to look good and at least run decently. More important than that is art direction. I want the game to look good and be playable, not to be a tech demo that runs like shit, so that I have to pay $1000 for another card when I just bought one a year ago. It doesn't seem fair.

  • @edwardperkins1225
    @edwardperkins1225 Před rokem +2

    I wonder how the various upscaling methods affect different graphical styles. Do they look better on realistic-style games and worse on stylized Nintendo-style graphics? What do they do to the lines and fills in cel-shaded styles? Presumably they might be optimized for realism, since that would be most demanding.

  • @ryanh.739
    @ryanh.739 Před rokem +6

    💯🤔🤔 They keep saying DLSS looks better than 4K native. They need to do a video where they show 4K DLSS looking better than 4K native. You can't just say that and not show any evidence.
    Now, I do think DLSS looks very good, no doubt, and I'm sure there may be a few games that look better with DLSS than at native 4K, but I think that's not the majority of games.

    • @jorismak
      @jorismak Před rokem +2

      Every game with a bad TAA implementation, of which there are many. If you're playing at 4K, setting DLSS to quality isn't that much of a drop in internal resolution, and you get that detail back. TAA always seems to give a soft presentation, so with soft native 4K vs DLSS quality, DLSS wins depending on how soft the TAA is.
      Also, in games where the native anti-aliasing just isn't doing a good job, replacing it with DLSS can yield better results. Or the native TAA has ghosting artifacts, which DLSS handles better, etc.
      I don't think the phrase "better than native" applies when you're talking about DLSS performance mode, even at 4K :).
      Death Stranding was one of the first examples of bad native TAA losing out against DLSS 2. In my experience, Ratchet & Clank on the Sargasso level wins with DLSS or DLAA vs the native TAA solution (though DLAA isn't upscaling, of course). God of War on PC was better with DLSS if I recall, and the latest RE Engine titles with a DLSS injection mod actually fix issues/artifacts that are there in native rendering.
      Personally, I really like solutions where modern upscalers are used for dynamic resolution targeting a certain framerate. I'd much rather have a few frames of heavy upscaling than a few frames with high frametimes.
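      To put numbers on "not that much of a drop": assuming the commonly cited DLSS 2 per-axis scale factors (these are assumptions; exact ratios can vary per title and SDK version), the internal render resolutions work out roughly like this:

      ```python
      # Approximate per-axis render scale factors for DLSS 2 presets
      # (assumed, commonly cited values; treat as illustrative only).
      PRESETS = {
          "Quality": 2 / 3,            # ~0.667 per axis
          "Balanced": 0.58,
          "Performance": 0.5,
          "Ultra Performance": 1 / 3,
      }

      def internal_resolution(width, height, preset):
          """Return the rounded internal render resolution for a given output size."""
          s = PRESETS[preset]
          return round(width * s), round(height * s)

      # At 4K output, Quality mode renders internally at roughly 1440p:
      print(internal_resolution(3840, 2160, "Quality"))      # -> (2560, 1440)
      print(internal_resolution(3840, 2160, "Performance"))  # -> (1920, 1080)
      ```

      So 4K Quality reconstructs from roughly a 1440p image, which is why the detail loss is modest, while Ultra Performance starts from around 720p and artifacts become much more visible.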

    • @lordadz1615
      @lordadz1615 Před rokem

      Not every single person plays at 4K. If you play at 1080p or 1440p, it looks terrible with DLSS.

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

      ​@@lordadz1615depends on the native TAA. Some games still look better with DLSS Quality at 1080p

    • @lordadz1615
      @lordadz1615 Před rokem +1

      @@Wobbothe3rd idk what games those are. They all look blurry as shit in my experience.

  • @FR4M3Sharma
    @FR4M3Sharma Před rokem +5

    I remember when Randy Linden ported Doom to the SNES, hardware even Carmack thought had no business running it.

  • @EggBastion
    @EggBastion Před rokem +7

    _"graphics getting better and people getting upset about that"_ !?
    how many 200-400 £/$ cards are packing worthwhile DLSS or RTX?

    • @smittyvanjagermanjenson182
      @smittyvanjagermanjenson182 Před rokem

      You live under a rock? FSR is for the poor man's card. As it works on something as awful as a 1050ti lol

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

      There are plenty of cheap RTX cards. Most of the top twelve most used GPUs are RTX. The 4060 costs $300

  • @harleyx7332
    @harleyx7332 Před 8 měsíci +1

    Some of these games have no business using up most of our GPUs when they look terrible even at 1440p.

  • @H4GRlD
    @H4GRlD Před rokem +3

    It would be interesting to compare these new UE5 titles with similar older games in their respective genres, like comparing Doom Eternal to Immortals of Aveum, etc. Let's disregard the size of the dev teams, because that's not important to the end user, and just compare the actual products' visuals and performance. Is it really worth it to get, let's say, 60 fps instead of 300, based on what you see?
    I don't have the answer to this, so I'm just curious about your opinion.

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

      Don't miss the forest for the trees: Doom Eternal's lighting is BAKED IN; there are no truly dynamic environment lights. Don't forget the obvious fact that the new tech allows for things the old tech could not, while also looking better.
      My favorite analogy is how Quake arguably looked worse than Doom, but you could build a bridge over water in Quake whereas you could not in Doom. Game technology isn't just about making nice-looking videos on YouTube; it's about enabling new interactive experiences.

    • @H4GRlD
      @H4GRlD Před rokem +2

      @@Wobbothe3rd Enjoy the new interactive effects in 4k upscaled from a 480p base resolution 😂 I'm all in for RT and everything but there has to be a reasonable sacrifice for this.

  • @NintenJon64
    @NintenJon64 Před rokem +8

    I don’t think DLSS and/or other AI rendering techniques are ‘always’ used in place of a lack of optimization. It can definitely be the case, as with any other rendering or post-processing technique used to clear up image quality or improve performance. However, by-and-large it has had its primary use case be in the output of higher image quality at a lower cost to both developers and consumers.

  • @meh23p
    @meh23p Před 8 měsíci

    What *I* badly want to know is: Could these upscaling and up-framing technologies be used in a dock-console system for hybrid consoles? So you get maybe your 1080p30 on the Switch 2 on the go, and then if you connect it to your Switch Dock at home you get it upscaled to 4K @60 fps. Could that become a standard for hybrid consoles, or is the split hardware too much of a hurdle?

  • @jet613
    @jet613 Před rokem +5

    I dunno, man, there are devs out there that do great optimization. Look at Doom: it looks gorgeous and runs well (felt great on my 980 Ti). I feel like this problem stems from how projects are managed: pushing for a time target and fumbling coordination on optimization. I don't have any evidence to back that up, but I have a view from the programming side, having had issues with product management. PMs push down requirements and want things out faster, so we just scramble to get it done and move on to the next thing. We end up dealing with the fallout as bugs later, but the PM only cares about it being out; as long as it meets basic functionality, ship it! I'd love to hear actual devs from a couple of different studios talk about this. Better if they're ex-devs of a studio so we don't get PR lies lol.

    • @enricod.7198
      @enricod.7198 Před rokem +3

      Immortals of Aveum looks like shit compared to Doom Eternal, or the same AT BEST, but on the same PC, Immortals can drop under 60fps while Doom can do over 200fps.

    • @AL2009man
      @AL2009man Před rokem

      And DOOM Eternal has DLSS.
      No FSR, but there's mod for that.

    • @tbunreall
      @tbunreall Před rokem

      @@enricod.7198 You guys got some rose tinted glasses on, doom eternal definitely looks worse

    • @ScubaSnacks
      @ScubaSnacks Před rokem

      @@tbunreall yeah, this is some "literally looks like a PS2 game"-tier hyperbole; IoA looks quite a bit better than Doom Eternal.
      Perhaps not so much better that it's worth the performance differential, but that's neither here nor there.

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

      ​@@tbunreall regardless of which you think looks better, the environment lighting in Doom Eternal is BAKED IN, i.e. STATIC. Don't miss the forest for the trees: actually interacting with dynamic lighting is more important than how the game looks in a still image or on YouTube.

  • @pixelvahl
    @pixelvahl Před rokem +1

    I respectfully disagree with that. I think they absolutely are.

  • @FreeloaderUK
    @FreeloaderUK Před rokem +1

    I think upscaling is a crutch for video game demands advancing faster than GPU performance increases.

  • @SpaniardNL
    @SpaniardNL Před rokem +2

    Wake me up when the first console and games release that can do 1080 60fps all the time...
    Because that was the dream when the Xbox 360 and PS3 came out, and to this day we still can't seem to pull it off, yet we keep putting effort into pointless 4K resolution stuff that makes games feel worse to play at 30 fps.
    Devs/studios, stop with the 4K nonsense and actually make functioning 1080p 60fps titles, please...

    • @mistabrown830
      @mistabrown830 Před 6 měsíci

      They should focus on 1440p instead of 4k

  • @bowi1332
    @bowi1332 Před rokem +1

    According to you, FSR 2 in RDR on PS4 is a great AA solution. I tried FSR 2 in Death Stranding, and I would definitely use it if I could disable the upscaling. (I have a GTX 970, so max res is 1080p.)

    • @steel5897
      @steel5897 Před rokem

      You can use DSR to run the game at a higher resolution and then use FSR2 quality to bring it back down to your intended resolution; that'd be FSR2 purely as AA.
      I understand if it's too much of a bother to do, though. I used to do this to use DLSS at native resolution before DLAA was a thing.

  • @ltxr9973
    @ltxr9973 Před rokem

    12:32 No, the '77 Death Star explosion is still the best.

  • @yourdignity513
    @yourdignity513 Před rokem +3

    Watching them shill for UE5, trying to excuse the trash engine and devs using it is painfully cringe.
    Oh, how I miss thee, MSAA, and proper, stable images. Instead I get pATHtraCeD lighting with tons of shimmering and artifacting.
    That's to say nothing of CPU performance, which doesn't seem to be optimized for at all nowadays.
    Bad take, shameful elitism. I guess if you can't splurge 2k on a single fucking GPU, you don't get to enjoy the latest games.
    Tbh I almost expected Alex to say something like "Just stop being poor and buy a 4090 lol"

  • @user-ec8zd8zp4x
    @user-ec8zd8zp4x Před rokem +1

    Eh... I don't care if games use DLSS, FSR, TAA, ABC, DEF, GHIJ or comes with a genie in a bottle you have to rub 3 times for a proper framerate and looks, as long as the game is fun!

  • @ryanh.739
    @ryanh.739 Před rokem +13

    💯 ABSOLUTELY TRUE. Most PC gamers could not live without DLSS (or even FSR2). All that power means nothing without optimization, especially when devs are trying to take their games to the next level. DLSS is the second coming of Christ to PC gamers. Alex says differently... but in almost every major new game he plays, he uses it.

    • @lovelybitofsquirrel8272
      @lovelybitofsquirrel8272 Před rokem +7

      he should only review games without DLSS now and then we will see how quickly he changes his mind lol

    • @ryanh.739
      @ryanh.739 Před rokem +3

      @@lovelybitofsquirrel8272 there is not one modern major game that drops below 60fps (even if it's just one single frame) that he would play without DLSS. DLSS is like crack to most PC gamers; they wouldn't be caught dead without it. I use it too, but I'm not afraid to admit that we are hopeless without it. I mean, even Alex said himself that we are in an era of bad optimization for AAA games like never before. Just look at this year. If it wasn't for DLSS... I can only imagine the horror.

    • @SpnCosti
      @SpnCosti Před rokem +4

      So true, man. And ray tracing: he keeps at it with ray tracing when RDR2 has great photorealistic lighting without it!

    • @bltzcstrnx
      @bltzcstrnx Před rokem +1

      ​@@lovelybitofsquirrel8272 how about reviewing games without SSR, cube maps, bump maps, tessellation, shadow maps, and countless other "cheap tricks" in real-time graphics? I think you'd be surprised how bad graphics would be without these techniques. FSR, TSR, DLSS, and XeSS are in the same category: techniques to reduce compute requirements while maintaining or increasing perceived quality.

    • @squidstone8399
      @squidstone8399 Před 8 měsíci +1

      @@bltzcstrnx BS, you're comparing raster graphics to fake pixels. The techniques you mentioned don't actually take away from the fidelity of the game; RDR2 is a case in point. But fake resolutions make everything mushy.

  • @akanar_1924
    @akanar_1924 Před rokem +1

    How else are consoles going to make the 4K claim? lmao

  • @garethperks7032
    @garethperks7032 Před rokem +1

    Virtually all videos/movies we create and watch today consist of mostly "fake" pixels. That's how encoding ("compression") algorithms work. Motion prediction & compensation, picture estimation, deblocking, sample adaptive offset, etc. It's all image reconstruction techniques that have been used for decades. People aren't complaining left, right and center about not seeing the original raw camera sensor pixels. The reconstruction techniques are good enough. Maybe some gamers are just too sensitive.

  • @ChrisCarlos64
    @ChrisCarlos64 Před rokem

    I'm less worried about upscaling; in fact, I welcome it when it helps those with less powerful specs still get a better image in the end. But the way frame generation works right now isn't a performance boost. It requires the right hardware, expensive even, and it's more of a crutch for getting "better" performance. It doesn't resolve bottlenecks and doesn't help shift focus to the right parts. If your framerate is struggling, generating more images at the end isn't solving the critical issue. Touting "3x performance" doesn't help when that struggling 30 FPS game is struggling because the budget wasn't taken into account.

  • @drawfull
    @drawfull Před rokem +5

    Please find a way to differentiate between 'clips' and full DF. I can't be the only one that is overwhelmed by the amount of new videos with the same logo. Please!

    • @Prinz_Kasper
      @Prinz_Kasper Před rokem +3

      If you regularly watch videos on the main channel, just unsubscribe from the clips channel?

    • @drawfull
      @drawfull Před rokem

      @@Prinz_Kasper Yeah I suppose, but I want to support both channels.

  • @Dangerousdaze
    @Dangerousdaze Před rokem

    Props for saying champing instead of the usual (and wrong) chomping!! (It's the little things.) On topic: it's not about lazy optimisation. There's only so much "capacity" available and you can either use that capacity to make a natively high-res mediocre experience, or an upscaled *great* experience. As long as the upscaling impact is acceptably low, why wouldn't you take advantage of that?

  • @Anna_Rae
    @Anna_Rae Před 11 měsíci

    People who say upscaling is a crutch need to realize that poorly optimized games these days are mostly CPU- or VRAM-limited, things that upscaling can't fix.

  • @pf100andahalf
    @pf100andahalf Před rokem +1

    But Remnant 2 doesn't look much better than games that are less intensive. How is that not wasted resources more commonly known as unoptimization?

  • @moghater
    @moghater Před rokem

    Now, regarding frame gen: I also did not like the answer. I'm not against frame gen; I want to see where the tech goes. But it should not be used as a 60fps fix because, at least for now, the cost in responsiveness is too high. Framerate and responsiveness should be analyzed together; one is only good because of the other. Smoothness is good, but not when the input lag is that of 30 fps, or whatever the baseline is.
    My point is, it's a good technology to have, but it should not be the norm for achieving acceptable performance.

  • @EverythingPS5Pro
    @EverythingPS5Pro Před rokem +2

    FSR 3.0 is going to make PS5 and Xbox Series X look way better with upscaling and run 60-80% faster in interpolated fps, this is sooo exciting.
    If the PS5 runs 4K 60fps right now, why not max all of the fidelity settings and turn on FSR 3 to get a better frame rate at higher fidelity?

  • @kostasvapo69
    @kostasvapo69 Před rokem +5

    Remnant 2 has more real-time effects than pre-rendered effects. That means when you lower the resolution, you get extremely big FPS improvements. Look how many particles are in the air.

    • @enricod.7198
      @enricod.7198 Před rokem

      Particles like that were possible with PhysX like 11 years ago lmao, and the games that used them run at like 200fps now. Everyone drinking the Kool-Aid from ngreedia and devs is delusional.

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

​@@enricod.7198 lol if you think 2007 PhysX technology could achieve what modern games do, you're delusional. Nvidia bought PhysX and integrated it directly into their hardware.
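The claim above, that lowering the rendering resolution brings big FPS gains when most effects are computed per pixel, can be sanity-checked with simple arithmetic. A rough sketch, assuming GPU cost is proportional to shaded pixel count (a simplification: vertex work, CPU cost, and the upscaler's own overhead are ignored):

```python
# Why rendering at a lower internal resolution helps so much when a game is
# fill-rate / pixel-shader bound. Cost is assumed proportional to pixel count.

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)       # 8,294,400 pixels per frame
internal_1440p = pixels(2560, 1440)  # typical "Quality" upscale input for 4K output
internal_1080p = pixels(1920, 1080)  # typical "Performance" upscale input for 4K

print(native_4k / internal_1440p)    # 2.25x fewer pixels to shade
print(native_4k / internal_1080p)    # 4.0x fewer pixels to shade
```

In practice the speedup is smaller than these ratios because not all per-frame work scales with resolution, which is exactly why CPU-bound games see little benefit.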

  • @alrightylol
    @alrightylol Před rokem

Without DF's analysis, it's not like a lot of people would know Quantum Break's output resolution.

  • @wrentube
    @wrentube Před rokem +1

How does this discussion fit with seeing very optimized games released, like Gran Turismo 7 running at native 4K 60fps on PS5?

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem +1

      It doesn't run native at 4K on PS5...

    • @wrentube
      @wrentube Před rokem

      @@Wobbothe3rd umm it does.... czcams.com/video/KwpdmSKgMkY/video.htmlsi=wey3JnbanOnEm6sr&t=258

  • @kazumakiryu7559
    @kazumakiryu7559 Před rokem +3

Really don't agree with Alex here. Graphics have hardly improved in the last 5 years, and there are so many games that look way better than Remnant 2 and Aveum and don't require a GPU from NASA to run.

    • @oo--7714
      @oo--7714 Před rokem +1

      Lol no

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

      If you honestly think graphics have hardly improved in the last 5 years, you're a blind clown.

    • @aftereight9143
      @aftereight9143 Před rokem

@@Wobbothe3rd Remnant 2 and Immortals of Aveum look incredibly middling. Then there's also Gotham Knights, Forspoken, Redfall and so on. Some of the most demanding games out today are barely any better than games that came out years ago; some are actually worse.
Not to mention, since a game like IoA requires upscaling to run, it's going to look worse than older games (especially on console). There is no improvement, only an absolute visual mess of ghosting and artifacts.

  • @returningwhisper
    @returningwhisper Před rokem

    Do these frame injection methods have a negative impact on input polling?

    • @mungojerrie86
      @mungojerrie86 Před rokem

Not sure if this is what you're asking, but a game frame-generated from 60 to 120 FPS feels, input-wise, like it's at 60. So the gain is in visual smoothness, not responsiveness.
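This point can be made concrete with frame-time arithmetic. A simplified model, assuming 2x interpolation where input is only sampled on the "real" rendered frames (actual frame-generation pipelines add some extra queueing latency on top):

```python
# Frame generation doubles the number of displayed frames, but input is still
# sampled only on the base frames, so responsiveness tracks the base frame rate.

def frame_time_ms(fps):
    return 1000.0 / fps

base_fps = 60
displayed_fps = base_fps * 2            # 120 with 2x frame generation

print(frame_time_ms(displayed_fps))     # visual smoothness: a new image every ~8.3 ms
print(frame_time_ms(base_fps))          # input cadence: still sampled every ~16.7 ms
```

So the display updates twice as often, but your inputs still only influence the image at the base 60 FPS cadence.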

  • @TheDarkguide
    @TheDarkguide Před rokem +2

No matter what, DLSS and AI techniques should always be bonus performance, not something that you need in order to get acceptable performance when you already have a 13900 and a 4090.

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem +1

      No matter what developers will always push the limits of what's possible on hardware. What's new today will ALWAYS eventually be pushed to its limit. NOTHING will ever be "future proof" for any particular resolution or setting. You whiners are essentially demanding that gaming innovation stop.

    • @TheDarkguide
      @TheDarkguide Před rokem

      @@Wobbothe3rd Wow I can't agree more I almost want to see you...

  • @Freehotdawgs
    @Freehotdawgs Před rokem

    Thank goodness for the DF guys. I’m so tired of hearing clowns crying about this topic. It’s a breath of fresh air to hear people who know what they are talking about.

  • @MartinJab
    @MartinJab Před rokem

While I agree with the sentiment, it really muddies what 4K 120Hz actually means. The results of these practices will be vastly different in something like Flight Simulator versus something like Counter-Strike.

  • @Melsharpe95
    @Melsharpe95 Před rokem +1

    DLSS 3 won't be a problem once AMD have their own version.
    Then we'll see all these "Frame gen isn't real rasterisation!" comments drying up.
    Yes I'm saying the majority of the negative comments on frame gen are coming from AMD owners with increased sodium levels.

  • @HotGritz910
    @HotGritz910 Před rokem

I really trust these guys and their analysis; however, I think the answer is in the middle. Games take more time to develop, which leaves less time to optimize. So that's not the devs' fault; they lean on power and tools such as DLSS to get them over the hump. If we didn't have the PS5 and Series X or 4090s etc., games would still improve, but the dev cycle might be longer, or less ambitious games would come out.

  • @6ch6ris6
    @6ch6ris6 Před rokem +4

Not interested in any of these technologies as I play at 1080p. Upscaling doesn't work that well when the source resolution is under 1080p.
And frame gen is just stupid. It is not giving you better performance.

    • @KingKrouch
      @KingKrouch Před rokem +2

      Frame gen should be treated the same way that black frame insertion is. Once consoles start to adopt that technology, game over for optimization.

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

      My god the ignorance. These people have no idea wtf they're talking about... bfi = frame generation WTF!?!? Clowns leading the blind...

    • @Kiyuja
      @Kiyuja Před rokem

A friend of mine showed me back in the 20-series days, when Control came out, and we compared native 1080p vs DLSS 2 upscaling from an internal 540p to 1080p... DLSS looked way better: concrete walls had more detail and fences still looked sharp in the distance, while native looked worse. It's especially useful at lower resolutions. Frame gen doesn't give you better performance, nobody claimed that; it just gives you more output, aka smoothness.

    • @Kiyuja
      @Kiyuja Před rokem

@@KingKrouch Frame gen doesn't work properly on an unoptimized game... stutters and artifacts will, in fact, get worse with frame gen. Thus you can sleep safely knowing that frame gen won't take away optimization.

  • @MadsonOnTheWeb
    @MadsonOnTheWeb Před rokem

    It depends. But overall I prefer my raw pixels

  • @jorismak
    @jorismak Před rokem

'Real' or not is BS. It's the quality on screen that matters, along with the input latency. How you get there...
That being said, modern techniques are heavily, HEAVILY reliant on the number of pixels. So the number of effects and light sources, and their quality, matter less, while the number of pixels (resolution) matters more. The latter can be made performant with upscaling; the former cannot. It's A way forward to bring another way of rendering that can (will eventually?) exceed the quality of today's games.

  • @freaksuyash
    @freaksuyash Před rokem +3

This video made me not trust DF anymore. Thankfully, the like-to-dislike ratio is slipping too. Slowly, enough people will realize that these guys are being less than genuine.

    • @papichuckle
      @papichuckle Před rokem +2

They all play on essentially the best hardware you can get, and are obviously rich enough for it, so all the badly optimised games we have to put up with aren't a problem for them; they just brute-force their way through the bad optimisation.

  • @Ajstyle48
    @Ajstyle48 Před rokem +9

Unreal Engine makes devs lazy. When you have ready-made food you get lazy; you don't want to spend time cooking and gaining experience. The same thing applies to games: when you use other software to build games and any hurdle comes up, you don't know how to fix it and optimize it.

    • @KingKrouch
      @KingKrouch Před rokem +5

Most Unreal Engine games ship without PlayStation controller support (because the engine doesn't have good controller support outside of XInput) and have Vert- FOV scaling (something which could be fixed in 2 minutes with a config tweak), so it's ass-cancer on ultrawide displays. Not to mention HDR support in UE on PC is still flawed and produces radically different results due to the lack of tonemapping. Vulkan in UE is still a bloody mess that performs poorly and caps at 60FPS when you enable VSync. Not to mention, Blueprints is still mostly single-threaded and not that performant, and Epic refuses to add C# to the engine, instead opting for their godawful Verse scripting language that is still years away. Or the fact that (if you are a developer) it's challenging to add a proper shading model (like toon shading) that reacts properly to shadows and lighting without hacking together a flawed solution in a 200GB fork of the engine. Or the fact that Epic's policies on distributing modding tools are still garbage: you are stuck with the binary builds of the engine if you want to make modding easier, and even then there's poor documentation on how to go about it, and you're stuck with Blueprints for any game code that you want to be modular.
Honestly, considering how Epic has fixed none of the issues with their engine on PC outside of shader compilation stutter, I hope the industry pulls what it did with RenderWare. Such a bad engine that's leading to monopolization, favored only because of the fast iteration and prototyping times. Between that and the Epic Games Store stuff, it's almost as if they're trying to kill PC gaming, setting aside Microsoft and NVIDIA for a moment. Go to the Unreal Slackers Discord and there are people constantly sucking off Epic, thinking they're doing no wrong.
Legit, I'm cautious about buying any Unreal Engine game, and any game that uses the upcoming Denuvo for Unreal (which means no mods or fixes for common problems) is an immediate boycott. I'm done modding/fixing Unreal Engine games because 95% of them share the same problems.

    • @Ajstyle48
      @Ajstyle48 Před rokem +1

@@KingKrouch That's why I've always called it the stutter engine.

    • @KingKrouch
      @KingKrouch Před rokem +1

@@Ajstyle48 Yeah, and even if there's no shader comp stutter, there's asset-streaming-related stuttering, hence why Returnal acts the way it does on PS5 and PC. Fortnite takes an eternity to load on PC compared to PS5. It's funny, Unreal Engine still hasn't added DirectStorage on PC despite it being out.

  • @ezg8448
    @ezg8448 Před rokem

Going back to Remnant 2, and that latest patch with the large performance boost from detailed shadows:
That was an option that should have been there in the first place if they knew the game wouldn't be running well at launch. You shouldn't expect people to just use screen-scaling techs (DLSS, FSR, etc.) if they need more performance. Those low, medium, and high options did almost nothing to the framerate.
Call me crazy, but shouldn't things like this be done during optimization? Making the game run smoothly at any given point with whichever settings are used. After all, not everyone is running a high-tier graphics card, or even a mid-tier one.
I'm getting 2080 Battlefield flashbacks all over again.

  • @mikejenkins4924
    @mikejenkins4924 Před 11 měsíci

I can't agree with Alex here. Sure, I get that stuff isn't free, but when most games can't match the graphical fidelity of, say, Ratchet & Clank: Rift Apart, Demon's Souls Remake, and HFW, which have been out for a bit, then you just have to start asking the question. Whatever features and extra pixels and graphical quality they are pushing in these latest games that are running like shit, I can't see or feel them. Quality per pixel and higher geometry are things I saw in the games mentioned. All ran fine and looked amazing... so what gives? These are games on a PS5, a 2070 Super equivalent. A 3070 and above shouldn't be struggling with shit.
Out of all the UE5 games, the only one that looks anything like what I'd call a graphics leap is the Matrix demo.
A Plague Tale: Requiem looks really good too, so I expect that game to struggle a bit. But when you get the likes of FFXVI struggling to hit 60 at 1080p and under, there just doesn't seem to be a clear defence.

  • @05acoyn
    @05acoyn Před 9 měsíci

I do agree with the principle of the video; however, in practice and reality this is not what's happening. DLSS and FSR / FG etc. shouldn't be required to make a game playable (60fps). I simply do not believe that we've come so far ahead in quality that a $500-1000 GPU now needs these technologies just to be able to play games well at good quality. I can take a game like RDR2 and play at 4K/1440p at 120 fps or above on the high/ultra preset and it looks phenomenal. The only two games I can hand-on-heart say look better by a mile are Cyberpunk and Alan Wake 2. Many other games simply do not look better than games that came out five years ago. If they did, and they were fully functional and optimised, then I could concede that DLSS is a fantastic technology that pulls lower tiers of hardware up to modern gaming at acceptable fps and quality. That is not what's happening, though; people with really high-end hardware are having to use it to get a game to an acceptable level of quality to play. Someone is overshooting somewhere, and it isn't the customer; devs are either trying to be too fancy for the technology available or it's being implemented very, very poorly.

  • @SinisterPuppy
    @SinisterPuppy Před rokem +2

I can't wrap my head around why engine features were created that current hardware can't properly run. I really hope some kind of tricks will be found as the tech matures.
I waited until DF put out their Immortals review before keeping this comment here. I've been watching the discourse around Immortals and Fort Solis on PC all week.
From my perspective, it's moot to add more detail per pixel if you need to turn it into a blurry, smudgy mess to make it performant. I think this is going to be a painful remainder of this generation.

    • @oo--7714
      @oo--7714 Před rokem +1

Because Epic was able to get new technology and decided to ship it in their game engine. Simple. This has always been the case, going back to Crysis.

    • @ScubaSnacks
      @ScubaSnacks Před rokem +1

      I thought people have been complaining about devs not future-proofing their games since Crysis though

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

That's how the video game industry has always worked, since forever.

    • @SinisterPuppy
      @SinisterPuppy Před rokem +1

@@ScubaSnacks I think it's suicide to make games that don't target smooth gameplay on current consoles and/or mid-range PC hardware.
Since Nanite and Lumen most likely speed up development (no longer needing LODs, and easier lighting), I worry all these pretty games will be made that no one can run.
I prefer gameplay over graphics. I might be the oddball here.

  • @SpnCosti
    @SpnCosti Před rokem +8

Publishers and developers need to stop pushing the 30fps story! Nobody bought a Series X or PS5 to play at 30fps! Especially when there are a lot of gorgeous games that run at 60fps! Don't support these kinds of publishers!

  • @AdmiralBison
    @AdmiralBison Před rokem +1

I still don't agree.
These game-development "shortcuts" are advertised as a feature and used AFTER a game is released.
Game developers don't really advertise their shortcuts, and the shortcuts are done during game development.
- It's not a substitute for true advancements in GPU technology.
- It helps in development and optimization?
Sure, but it seems the biggest help in game development is getting ample time, resources (workers and tools) and support from upper management and the corporate publisher, instead of squeezing development with fewer workers and smaller launch windows to the point that they have to rely on upscaling.
The games industry and its business decisions are the bleeding cut, and FSR, DLSS etc. are just band-aids.
There's probably going to be a point where most high-end games will only give you half of the performance and visuals, and gamers will have to do the upscaling and optimizing themselves... or rely on mods, like with Bethesda games.

  • @ryansheldon6046
    @ryansheldon6046 Před rokem

    100% yes.

  • @SpnCosti
    @SpnCosti Před rokem +10

Of course, graphics are the same as they are on PS4, although the power is a few times greater! They don't care about optimization! Once they sell it they don't even update their games! The developers don't care, to the extent that games don't run at 60 fps even with upscaling!

    • @Chasm9
      @Chasm9 Před rokem +6

      I suppose you did not watch this clip or Alex's explanation, correct?

    • @Gloamy17
      @Gloamy17 Před rokem +2

What graphics? In Immortals? They are way more advanced than PS4-level 😂 and it's running at twice the frame rate of 99% of PS4 games.
If you mean graphics in general, then that's an even bigger load of BS.

    • @gv2212
      @gv2212 Před rokem +3

​@@Gloamy17 God of War 2018 looks better than Immortals on PS5. Way less blurry, too.

    • @SpnCosti
      @SpnCosti Před rokem +4

@@gv2212 I am amazed by Uncharted 4 to this day! And TLOU2! PS4 graphics! God of War! All ran at 60 with just a patch on PS5! Nobody cares about ray tracing! Only Digital Foundry!

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

​@@SpnCosti You live in an echo-chamber bubble. If nobody cared about ray tracing, then Microsoft and Sony would not have fought AMD to get RT cores onto their consoles. If nobody cared about ray tracing, RTX cards would not make up the entire top ten most-used GPUs on the Steam Hardware Survey.

  • @mrb3rt811
    @mrb3rt811 Před rokem +1

Upscaling is fine on a console but not on a PC; otherwise, what's the point of spending so much money to play on PC? As for optimization, it should be at the top of the list, right up there with gameplay.

  • @moghater
    @moghater Před rokem +1

I'm a bit disappointed with the answer. I can see the upscaling benefits; what we have today works great as decent anti-aliasing. Most people have no idea how jagged games would look at native resolution without it. But there are MANY CASES of upscaling being used in suboptimal ways, of which Aveum is a great example. There is also the fact that DLSS upscaling, although really good, is an Nvidia-exclusive PC feature, and FSR is very far from looking as good as DLSS. I think you should have focused on those aspects of the upscaling conversation.
So, using upscaling in a way that hurts the visuals, to the point that everything looks bad, is not good at all, and it is becoming more common; it is not a good tradeoff.
What pisses me off the most in the upscaling scenario is that AMD should have introduced AI cores (tensor cores) years ago, but chose not to. If they had, maybe consoles would have them, and they could give devs tools to use them for upscaling/AA solutions, so we would not be so dependent on Nvidia for DLSS. And since consoles dictate the norm, it would have been interesting. But no, we don't have that... =/

  • @ItsCith
    @ItsCith Před rokem +1

    Poor developers... They can never win against the mighty voices of the PC gamer.

  • @freaksuyash
    @freaksuyash Před rokem

The last comment: "it does 60 FPS, I thought". No it doesn't; your own video says it doesn't! It sucks.

  • @GarryMah85
    @GarryMah85 Před rokem

To me it's like asking if pre-baked lighting is less real, or a crutch. It's just another way of achieving high-quality images with the hardware capabilities we have today.

  • @millennial_weeb2382
    @millennial_weeb2382 Před rokem +2

9:19 Okay Richard, a 4060 Ti cannot "do" RT Ultra. It can show you a smooth, albeit interpolated, path-traced experience with up to **87ms of input delay** with Reflex ON.
With delay that high, the gameplay input lag is literally 70% worse than with a 30FPS cap.
You would be better off watching RAW captured footage of path tracing than buying the RTX 4060 Ti specifically to play path-traced games with DLSS Frame Generation + PT.
Nobody buying a GPU today should be looking into future-proofing for path tracing.
Response to:
"The 4060 Ti can do Cyberpunk path tracing" & "That GPU (4060) is producing a path-traced Cyberpunk"

    • @millennial_weeb2382
      @millennial_weeb2382 Před rokem

In my opinion *you're being disingenuous*, because you didn't have the confidence to publicly say the RTX 4060 (non-Ti) can do path tracing,
specifically by saying "*That GPU* (4060) is producing a path-traced experience".
The RTX 4060 (non-Ti) literally hits over 100ms of input delay for 60FPS (interpolated).
That's 3x worse input lag than 30FPS...
Nobody would want to play like that just to use path tracing 10+ years early. For cards like the 4060 class it's literally a gimmick feature, and you're parroting it as an argument in favor of your point.

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

Bullshit. The 4060 runs path tracing at 1080p about as well as a 4090 does at 4K. Richard is 100% correct; that is an incredible achievement. The 4050 laptop GPU runs Quake 2 RTX about as well as a 2080!
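The latency figures thrown around in this thread can be put in proportion with a back-of-envelope check. Note the delay numbers are the commenter's claims, not measurements, and a frame time is not the same thing as end-to-end input latency, so this only sanity-checks the ratios being argued about:

```python
# Comparing the claimed end-to-end input delays against the frame time
# of a 30 FPS cap (~33.3 ms between frames).

frame_time_30fps = 1000.0 / 30           # ~33.3 ms per frame at 30 FPS

claimed_delay_4060ti = 87.0              # ms, claimed for 4060 Ti with frame gen + PT
claimed_delay_4060 = 100.0               # ms, claimed for the non-Ti card

print(claimed_delay_4060ti / frame_time_30fps)  # ~2.6x a 30 FPS frame time
print(claimed_delay_4060 / frame_time_30fps)    # ~3.0x, matching the "3x worse" claim
```

A real 30 FPS game's total input latency is itself several frame times, so the "X times worse" framing depends heavily on what baseline you measure against.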

  • @SplitScreamOFFICIAL
    @SplitScreamOFFICIAL Před rokem

    6:30 And then DLSS 3.5 was announced

  • @SimonStewart75
    @SimonStewart75 Před rokem +3

    The very idea that some pixels are more real than others is nuts.

    • @michaelmonstar4276
      @michaelmonstar4276 Před rokem +2

Nuts?... OK, here we go: what they probably mean is that the "generated pixels" are ones that are basically made up, as in guessed by an algorithm, and not the "real" ones that were originally designed by the developers. I think you and John know what they mean, but just choose to play ignorant. This is a very real thing: like upscaling any image, an algorithm makes up a bunch of stuff that isn't there. The proof is in all the issues that upscaling has shown us, whether it's the trails in early DLSS or the speckles in FSR. And because of that, the image isn't really completely as intended anymore. It might become more and more "as intended" as these technologies improve, but it's a completely valid idea that it's all being "faked". Come on now. It's not about the literal pixels being rendered; it's about whether they're true to the source or not. Let's be real and not pretend that movie purists would accept an upscaled video rather than a 1:1 mastering of a film.

    • @SimonStewart75
      @SimonStewart75 Před rokem

      @@michaelmonstar4276 Yeah, no.

  • @yupperdude1
    @yupperdude1 Před rokem

All I know is Immortals looks like you dipped the visuals in Vaseline, thus taking away any awesomeness of UE5. It is BAD. Sorry, I know this was off subject 😀

  • @smittyvanjagermanjenson182

It's really not up for debate. DLSS and FSR are here for the long run. Either you senselessly hate them and don't use them (leaving your $1200 card to suck balls), or you're enjoying games at fairly high quality and frame rates and getting your money's worth out of your whole graphics card.

  • @mal-avcisi9783
    @mal-avcisi9783 Před rokem

The guy who asked this is a total noob.

  • @MMTT7sALd
    @MMTT7sALd Před rokem +1

Agree with what you guys are saying; there has been a lot of complaining about upscalers lately, when they're giving you image quality similar to native.

    • @lordadz1615
      @lordadz1615 Před rokem +4

Absolutely not at lower resolutions. If you're at 1080p, for example, and you use upscaling, it looks terrible.

    • @MMTT7sALd
      @MMTT7sALd Před rokem

@@lordadz1615 I agree with that; under 1440p I can't look at it.

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

​@@lordadz1615 Doom Eternal looks better with Quality DLSS at 1080p. It depends on each game's particular TAA.

    • @lordadz1615
      @lordadz1615 Před rokem

@@Wobbothe3rd Idk, I need to check that example before I give a verdict.

  • @FlyBird
    @FlyBird Před rokem +1

See, I thought so. UE5 offers technologies that do way more in real time compared to previous engines, particularly with geometric detail and lighting. These are going to be HEAVY on current GPUs. It is what it is. DLSS allows these technologies to thrive a bit more on your 4-year-old GPU.
I understand getting upset that your $500+ purchase is getting its ass kicked while you refuse to appreciate what's going on in the engine technology, but calling devs "lazy"? Come on.

  • @AdmiralBison
    @AdmiralBison Před rokem

    Because of corporate publishers the games industry will come to a point gamers will have to do the following themselves:
    - Q&A Beta testing and unofficial patching
    - Do the upscaling and optimizations post launch
    - Fill in the missing content and "cosmetics" with mods

    • @Wobbothe3rd
      @Wobbothe3rd Před rokem

That's always been true to some extent. You blame "corporate publishers", but independent development is often WORSE.

    • @AdmiralBison
      @AdmiralBison Před rokem

@@Wobbothe3rd Often worse?
MMMMMMbullcrap.
We know many of the business decisions are made by the higher-ups at corporate publishers.

  • @_n8thagr8_63
    @_n8thagr8_63 Před rokem

Doesn't help that UE5 is just ridiculously demanding. I feel like the engine is just a hog. Has there even been a game that runs well on it?

  • @necuz
    @necuz Před rokem +1

    I think people are just pissed that they're going to have to buy new hardware to play new games. I mean, where was the outrage when TAA started popping up in every game? That's just as much of a crutch and has similar issues to spatial upscaling. In every game I've played, I've found DLAA/DLSS Quality preferable to whatever the games are doing by default (a.k.a. "native", but good luck finding games that re-render every pixel every frame).