The future of upscaling?

  • Published 30 Oct 2022
  • DLSS 3's framerate doubling feature is all well and good, but the added lag is painful. What if there were a way to reduce it? Welcome to the wonders of Framerate Independent Mouse Movement.
    Download Comrade Stinger's demo here: drive.google.com/file/d/1lSON...
    It took him one evening to make, and another few hours to implement the position reprojection.
  • Gaming

Comments • 841

  • @MFKitten
    @MFKitten Před rokem +1250

    YES! THIS IS THE THING I HAVE WANTED! Ever since Oculus demoed reprojection with a dev kit prototype years ago, I have been begging the universe to make developers realize the brilliance of this!

    • @dtrjones
      @dtrjones Před rokem +34

      Exactly, this technology has been around for years in VR. DLSS 3, though, is hardware accelerated, which is nice: with Asynchronous Timewarp (Meta) or Asynchronous Reprojection (SteamVR) there was always an initial penalty before the extra frames became noticeable, which would sometimes mean lowering graphical settings. That said, even hardware-accelerated reprojection has been around before DLSS 3. Jeri Ellsworth, CEO of Tilt 5, has hardware reprojection in the product of the same name, which allows the AR projectors in each pair of glasses to run at an incredible 180 Hz.

    • @Caynug
      @Caynug Před rokem +2

      so funny seeing you here! I remember listening to your agile intrepid demo songs on sevenstring over 10 years ago hahah.

    • @gazooc
      @gazooc Před rokem +10

      It's actually not easy to implement: you need to integrate it with the game's camera controller, with physics, etc.
      In VR it's easier because all player movement already comes from outside the game, via tracking.

    • @sguploads9601
      @sguploads9601 Před rokem +6

      Reprojection is not in any way connected to framerate. You need to know how rendering works to understand it.

    • @abyssalczech6719
      @abyssalczech6719 Před rokem +4

      @@sguploads9601 Isn't reprojection just some fancy algebra?

  • @Astraben
    @Astraben Před rokem +1249

    What I fear most is that companies will just target the same hardware requirements to run things at 60fps, for example, and use all these tricks to simply save themselves the bother of optimizing the game

    • @Derpynewb
      @Derpynewb Před rokem +201

      With how things are going, making it run at least 60 fps is actually quite good. But I get your point. At the end of the day, what matters is the actual experience. Everything in 3D graphics is a lie: most shadows are baked and not real time, and if you take your clothes off in game, there's just an empty void underneath.
      If lettuce tasted 100% like pizza, I'd be fine with lettuce.
      The only issue is if they make the lettuce taste like wet pizza and most people don't give a shit except me. THEN it becomes an issue.

    • @zarwil
      @zarwil Před rokem +52

      You could say the same about every technology or hardware upgrade ever

    • @Astraben
      @Astraben Před rokem +36

      @@zarwil And I do

    • @Derpynewb
      @Derpynewb Před rokem +78

      @@zarwil On that note, actually, it's proof of Astra's argument. Ever play an old game on modern hardware and see how high the FPS flies (if it's not CPU bound)?
      The old games run 10 times faster even though they don't look 10 times worse.

    • @TheMrLeoniasty
      @TheMrLeoniasty Před rokem +2

      It's what they're already doing now with DLSS and other such tech.

  • @DeSinc
    @DeSinc Před rokem +179

    Why hadn't I thought of this? Reprojection is such a good idea, but I never even considered putting it into normal games. Cool idea; it will undoubtedly be coming in the future as GPUs become more reliant on tricks and software to provide improvements, rather than on ever-diminishing gains in raw chip speed.

    • @starlight_garden
      @starlight_garden Před rokem +5

      Sinky-sama.

    • @TheoHiggins
      @TheoHiggins Před rokem +7

      And the cool thing is rendering as we know it is just the culmination of software tricks upon software tricks to bodge our 3D world into something that can be reasonably calculated 10s-100s of times a second
      Down the line looking back at tech like this once it's adopted, iterated upon, and eventually matured, it'll be hard to imagine a time when games didn't make use of it

    • @jamesboyce7467
      @jamesboyce7467 Před rokem

      Why are wee sooo incrediable stuuoooopedd?

    • @mavzikk
      @mavzikk Před rokem +2

      @@TheoHiggins You can simulate mouse input in a separate thread (Diabotical, GoldSrc) at a fixed frequency. I don't see how the same couldn't work for VR.

  • @Bambiindistress
    @Bambiindistress Před rokem +858

    The combination of all these technologies is the perfect mix for gaming

    • @Systox25
      @Systox25 Před rokem +6

      Casual gaming

    • @Dionyzos
      @Dionyzos Před rokem +18

      @@Systox25 I think some of this can also benefit competitive games. If you have a 240 Hz monitor and run a game at 144 fps, input lag could be reduced by polling your movements at 240 Hz.

    • @brett20000000009
      @brett20000000009 Před rokem +13

      @@Dionyzos The whole "can't benefit competitive games" idea is dumb. This is going to enable 1000 fps at 1000 Hz (yes, they are actually working on 1080p 1000 Hz monitors), which will give superior, perfectly consistent mouse feel. It will be better for muscle memory with little to no sacrifice to base framerate.

    • @ShadowMKII
      @ShadowMKII Před rokem

      I prefer it for baking.

    • @Marcos42563
      @Marcos42563 Před rokem

      @@Dionyzos Yep, I find CSGO utterly unplayable since it drops below 40 fps on my 60 Hz PC every now and again. It's not that the image looks any more or less stuttery (my eyes can't perceive that difference), but my muscle memory can. Tracking in that game is near impossible for me without weird flicks interspersed through otherwise smooth movement. Meanwhile in Quake Champions, a game I can run at a solid 60 fps, I have no issues with tracking. It all comes down to how responsive the input is, and if the input matches what I see, boom, frame drops and hitches become a thing of the past.
      I yearn for the day something like DLSS or FSR 2.0 is implemented at a hardware level, and even more so for the day something like this spatial warping is. Especially as a developer: not having to worry about optimisation as much would be great, so I could focus on things like gameplay, sound, UI, etc. Y'know, the actually fun stuff.

  • @Remyie
    @Remyie Před rokem +310

    That's so clever. After testing out the demo, I can easily say that I cannot see or feel any difference between uncapped 750 fps and 60 fps capped with both tweaks. Even if I move the mouse super fast, it's not noticeable, and if we added slight motion blur to both, it would literally be the same. The timewarp 3D screenshot is also such a brilliant idea for interpolating between frames. Now I'm wondering why these weren't a thing all these years. Finally, I did try out 30 fps: the input lag is definitely a lot better, but the ghosting is very visible and distracting.

    • @sawyergrimm287
      @sawyergrimm287 Před rokem +27

      It's noticeable, but it's up to each player to decide whether they want ghosting or laggy 30 fps.

    • @Sergeeeek
      @Sergeeeek Před rokem +19

      I'm sure this same technique could be improved even further to hide reprojection artifacts. This would be great to have for console games because they run at 30fps a lot of the time.

    • @StanleyKubick1
      @StanleyKubick1 Před rokem +1

      per-pixel motion blur

    • @AriinPHD
      @AriinPHD Před rokem +8

      do not add motion blur to anything. ever.

    • @rdmz135
      @rdmz135 Před rokem +19

      @@AriinPHD motion blur is great for racing games

  • @Isaax
    @Isaax Před rokem +142

    This is one of your most important videos in recent times. Thanks a lot for bringing this to attention! I remember how I HATED the idea of VR because of "the lag that would be there if you just moved your head", which was before I knew that it was already a solved issue. I was so shocked (positively) when I got my first Oculus in early 2019 to see how responsive it felt after all.

    • @fireaza
      @fireaza Před rokem +2

      Yeah, this was one of the things they were laser-focused on when developing the Rift and Vive. They obviously knew from the VR experiments of the 90s that it was going to be ESSENTIAL to have zero lag when turning your head, otherwise motion sickness was going to be as rampant an issue as, well, people who have never tried VR think it is.

    • @Junya01
      @Junya01 Před rokem +1

      @@fireaza motion sickness still is an issue for many VR users. It’s why the Zuck is wrong and it’ll never be fully adopted by the general public

    • @Derpynewb
      @Derpynewb Před rokem +3

      @@Junya01 Nah, it's gonna be fully adopted. There are people addicted to this stuff like people were addicted to social media; that's exactly why Meta, a social media company, is investing so heavily in it. The tech has also already found a foothold in industrial settings, and it's already adopted by the military and some police forces because of how versatile it is. It's just in its infancy. With the cost of everything rising, it's easier to have pretend virtual things than real ones.
      I like VR, but at the same time I know for sure it's going to be hell. Imagine Ready Player One but without the good ending. I started realising all the little details after actually using VR; it's annoying me how much I didn't notice in that movie. Smaller IRL houses, since everyone lives online.

    • @Junya01
      @Junya01 Před rokem +1

      @@Derpynewb Motion sickness, dude. Three out of four VR users already experience it in some form or another; you think that's going to go away just because the tech is getting better? People aren't going to pay extra to do things in VR when they can do them cheaper, faster, and without feeling nauseous by just doing them IRL or through a screen.

    • @Derpynewb
      @Derpynewb Před rokem +5

      @@Junya01 There are a few things you cannot replicate from VR on a desktop. I've met people who had horrible, and I mean horrible, motion sickness. They brute-forced their way through until they no longer had it. You can lose motion sickness; I initially got a bit motion sick and personally lost it too.
      The other thing is that the motion sickness is caused by a conflict between your visual senses and your biological accelerometers, and there's research being done on that too.
      VR does give you motion sickness, but you can get used to it. Right now there's just no compelling content to convince the masses to bother.
      FPS games give some people motion sickness and people got used to that too. However, the sample size isn't big enough to say whether most people would be able to get over VR motion sickness, since there are some people who can't get used to desktop motion sickness either.
      Everything I've said is anecdotal and the sample sizes aren't big enough for a concrete conclusion. My personal opinion is that it will take off.

  • @luukvanoijen7082
    @luukvanoijen7082 Před rokem +313

    Hey, the name for that "3D screenshot" thingy for frame smoothing is "temporal reprojection", I think. I used this technique in a ray-tracing shader for Minecraft once; it's actually really cool and not demanding at all either, just a little bit of matrix math.

    • @TheMrKeksLp
      @TheMrKeksLp Před rokem +10

      Correct!

    • @luukvanoijen7082
      @luukvanoijen7082 Před rokem +36

      @@bluescorpian Lol, it's actually not that much math to do this, but there's of course more to it than that, like discarding pixels so you don't accidentally try to reproject when you shouldn't (think a pixel being occluded or something).

    • @miguelpereira9859
      @miguelpereira9859 Před rokem +3

      @@bluescorpian Matrix based calculations aren't that complex

    • @logitech4873
      @logitech4873 Před rokem +3

      It's also similar to the framerate smoothing done in Oculus VR
      If a game happens to hang up, you'll see the same kind of weird artifacts as seen at 6:25

    • @Girugi
      @Girugi Před rokem

      This is not correct. Temporal reprojection is to take an element from the current frame and transform it into the space of the last frame (using the inverse of the current projection and the last frame's projection). But this is all completely different. This is rather using something called ray marching, which can be expensive, but there are tricks to speed it up, and the closer the two projections are, the larger the ray-marching steps you can take and the cheaper it gets. Honestly, the implementation here looks a bit rough.
      If you want to compare this technique to something common in games, compare it to screen-space reflections. It's the same technique, but the rays are generated from the camera near plane in the projection direction. So all the tricks you use to get smoother results and not miss details in SSR can be used here too.
      But SSR is a relatively expensive technique; many games do it at quarter resolution to save performance. Of course, it will be cheaper here if we can take larger steps, but that is also why we see the image break up into the big slices visible in the video.
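
A minimal sketch of the "temporal reprojection" case debated in this thread, assuming a standard depth buffer and column-vector view-projection matrices (the names below are illustrative, not taken from the demo). The ray-marched, SSR-like variant Girugi describes would replace the direct transform with a search along the new camera's ray:

```python
import numpy as np

def reproject_pixel(uv, depth, inv_viewproj_prev, viewproj_new):
    """Map a pixel from the previous frame into the new camera.

    uv                -- pixel position in [0,1]^2 in the previous frame
    depth             -- its depth-buffer value in [0,1]
    inv_viewproj_prev -- inverse of the previous frame's view-projection matrix
    viewproj_new      -- the new camera's view-projection matrix
    Returns the pixel's [0,1]^2 position in the new frame, or None if it
    left the view frustum (a disocclusion the warp has to fill somehow).
    """
    # Reconstruct the world-space position from the previous frame's
    # clip space (NDC in [-1,1], using the depth buffer as NDC z).
    ndc = np.array([uv[0] * 2 - 1, uv[1] * 2 - 1, depth * 2 - 1, 1.0])
    world = inv_viewproj_prev @ ndc
    world /= world[3]                     # perspective divide

    # Project that same world point with the new camera.
    clip = viewproj_new @ world
    if clip[3] <= 0:
        return None                       # behind the new camera
    ndc_new = clip[:3] / clip[3]
    if np.any(np.abs(ndc_new[:2]) > 1):
        return None                       # off-screen: needs hole filling
    return (ndc_new[:2] + 1) * 0.5        # back to [0,1] texture coords
```

In practice this runs per pixel in a shader, usually in the gather direction (for each pixel of the new frame, find where it came from in the old one); the scatter form above is just the simplest way to show the matrix math.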

  • @Xarius
    @Xarius Před rokem +210

    The rhythm game "osu!" has this feature! Audio, graphics and input are all on separate threads, meaning drawing can run at 240Hz, while input is polled at 1000Hz

    • @AveryChow
      @AveryChow Před rokem +22

      this is only true for the ‘lazer’ version I believe. I haven’t really noticed much of a difference as the game runs fast on my computer anyways (it’s not a very demanding game)

    • @Derpynewb
      @Derpynewb Před rokem +7

      I don't think that actually does anything, because the screen displays the incorrect position of your cursor, meaning you cannot adjust its movement path as fast as if it were actually native. It doesn't matter if the input is at 240 Hz, because you can't accurately see where anything is and hence you cannot react accordingly, IMO.

    • @Xarius
      @Xarius Před rokem +10

      @@Derpynewb True! However, it means the actual cursor position is much more accurately mapped to your real movement. For example, if your game runs too slowly, one frame your cursor is outside the hit object on one side, and the next it's outside on the other side, even though you did aim correctly :p

    • @Aesthesis92
      @Aesthesis92 Před rokem +8

      An older arena game called Reflex Arena also has this; hit registration is processed at 1000 Hz.

    • @Aidiakapi
      @Aidiakapi Před rokem +13

      @@Derpynewb Our own brains have significant processing latency. Even the people with the fastest reaction times do not beat 100 ms on visual stimuli. (We can respond faster to audio cues.)
      If you played a game like osu! by moving your cursor, observing its new position, and then clicking, you'd be much too late.
      Similarly in shooters, you don't aim by moving your cursor, seeing whether it's on top of the enemy, and then clicking; even if your click-to-photon latency were 0 ms, that would be much too slow.
      Point is, it matters a lot.
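
A minimal sketch of the decoupling discussed in this thread: mouse input accumulated on its own ~1000 Hz thread while rendering runs at a lower rate. `read_raw_mouse` and `render` are stand-ins for whatever raw-input and drawing APIs a real game would use; the video's proposal goes one step further and also warps the displayed image at the higher rate.

```python
import threading, time

class MouseState:
    """Latest accumulated mouse deltas, shared between threads."""
    def __init__(self):
        self.lock = threading.Lock()
        self.dx = 0.0
        self.dy = 0.0

    def add(self, dx, dy):
        with self.lock:
            self.dx += dx
            self.dy += dy

    def take(self):
        with self.lock:
            dx, dy = self.dx, self.dy
            self.dx = self.dy = 0.0
            return dx, dy

def input_thread(state, read_raw_mouse, hz=1000):
    """Poll the device at roughly 1000 Hz, independent of rendering."""
    period = 1.0 / hz
    while True:
        state.add(*read_raw_mouse())   # read_raw_mouse() stands in for
        time.sleep(period)             # whatever raw-input API is available

def render_thread(state, render, hz=60):
    """Render at its own (lower) rate; the camera uses the freshest input."""
    period = 1.0 / hz
    yaw = pitch = 0.0
    while True:
        dx, dy = state.take()          # everything polled since last frame
        yaw += dx * 0.002
        pitch += dy * 0.002
        render(yaw, pitch)
        time.sleep(period)
```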

  • @DarkSwordsman
    @DarkSwordsman Před rokem +54

    This was actually quite eye-opening. It makes sense now why turning my head with my Rift headset still feels smooth despite a game's framerate sitting around 25-40 FPS, but when I move around it feels juddery and laggy with visual artifacts (because of Asynchronous Space Warp).

  • @Dionyzos
    @Dionyzos Před rokem +23

    I remember being blown away by the concept of Foveated Rendering back when I saw the Oculus presentation. This is just as mindblowing to me right now. Just imagine the visual fidelity we can achieve and make playable with this.

  • @Tony14-_-
    @Tony14-_- Před rokem +10

    Congrats on the LTT mention Phillip! Glad two worlds collide

  • @RazielXT
    @RazielXT Před rokem +35

    It's like how RTS games draw the mouse cursor outside the game render loop to keep it responsive, just now we want it for the whole screen :)

    • @Trikipum
      @Trikipum Před rokem +4

      They have been using hardware cursors in these kinds of games for eons... that is why they feel responsive: the mouse input is direct. But this works for the cursor and nothing else; the game will still feel unresponsive if it is running at 10 fps, especially if you move the camera around.

    • @RazielXT
      @RazielXT Před rokem +2

      @@Trikipum Well, of course at 10 fps, but without a hardware cursor even 30 fps would be horrible.

  • @beastbum
    @beastbum Před rokem +9

    Kliks empire expands to Canada

  • @ganja_hotdog
    @ganja_hotdog Před rokem +11

    Big papa Linus feature

  • @The_Viktor_Reznov
    @The_Viktor_Reznov Před rokem +13

    Holy fuck LTT straight up shouted out Philip, what a timeline

  • @earthling_parth
    @earthling_parth Před rokem +22

    You have possibly inspired a real use case for this, thanks to Comrade Stinger and LTT pushing your vision, Philip.

  • @RandomBruh
    @RandomBruh Před rokem +37

    Incredible! I just tried it with a render target of 60 with a 280hz (yeah, don't ask) monitor, and aside from the little bit of weird ghosting on objects close to the camera while moving the character, it's amazing.
    I would have little to no reason not to turn this tech on in literally every game, which, as others have pointed out, could be a problem when you think about the progress of computing power and game optimisation in general.

    • @starlight_garden
      @starlight_garden Před rokem

      Why do you have a 280hz monitor?

    • @RandomBruh
      @RandomBruh Před rokem +2

      @@starlight_garden I did say don't ask.
      (It was available for around the same price as a 240hz one so why not.)

    • @keppycs
      @keppycs Před 4 měsíci

      @@RandomBruh AW2723DF?

  • @insu_na
    @insu_na Před rokem +5

    Your video was mentioned on LTT (and properly credited). Cool stuff

  • @TheLifeFruit
    @TheLifeFruit Před rokem +8

    You made it to LTT

  • @YannYoel
    @YannYoel Před rokem +13

    Congratulations on making it into and inspiring the newest LTT video with this one :)

  • @jeffmccloud905
    @jeffmccloud905 Před rokem +43

    The Unity game engine already has this feature ("frame independent input"), but only if you use the "New Input System" (Unity devs will know what that means). Unity introduced a new way to read user input in the last ~2 years that implements this feature. VR games made in Unity (which is most of them) use the new system.

  • @Girugi
    @Girugi Před rokem +129

    Just remember that everything shown here will only work reasonably well in an FPS where the camera just rotates around a fixed point when you move the mouse. Any third-person camera would need the last technique shown here, a lot, and would break much more easily.

    • @elin4364
      @elin4364 Před rokem +2

      YUP, still cool though

    • @truffles
      @truffles Před rokem +11

      i hate the games where your view orbits slightly off-center from your character and does not stay centered, and stuff that is very close to your view moves in a very strange way while you look around

    • @Dionyzos
      @Dionyzos Před rokem +3

      Developers can tweak the peripheral resolution and size of the image to a point where it's less noticeable or even give the option to tweak it yourself. This would of course come with a bigger performance impact but as there is a net benefit with Frame Generation it's an overall improvement. Let's say we're going to the extreme without reduced peripheral resolution and render a 4K Image but only display it on a 1440p display. This should be enough to even compensate for quick camera movements while still giving an overall Framerate increase with Frame Generation as 4K does not halve the framerate compared to 1440p while FG doubles it. Add Foveated rendering to this method, render the peripheral pixels at DLSS Ultra Performance or something, and the results will be even better as noticeability decreases with increased camera speed.

    • @Girugi
      @Girugi Před rokem +2

      @@Dionyzos You do know that scaling resolution does not grow the pixel count linearly, right? 1440p is about 3.6 million pixels, 4K is about 8.3 million, so you will pay over double the cost for rasterization and all post effects. The gains you speak of will not be as big as you think. There is a reason why no game with SSR renders outside the frame to improve edge reflections: it is just not that cheap.
      And no matter your buffer sizes, the third-person camera issue will not be solved. Depth peeling might be stronger for that, like a low-resolution, wider-FOV depth peel, but I can see a lot of issues with that too. For a standard FPS that rotates around a fixed spot, though, this technique has interesting potential. One more thing to remember: with DLSS 3, no new depth buffer is generated as far as I know, so the depth reprojection mentioned here would not work without changing DLSS 3 significantly and making it more expensive. And we can easily see that all transparency will have big issues; imagine a muzzle flash.
      So what would probably be best is to not render the weapon and weapon effects into the regular scene buffer, but to render those parts at 100% frame rate and overlay them on top after reprojecting.
      That would solve other issues too, like making feedback from the weapon have consistent timing with your click instead of depending on whether you clicked on a generated frame or not.

    • @Girugi
      @Girugi Před rokem +2

      @@Dionyzos Oh, and also: claiming 4K would be enough buffer for 1440p is just bogus. Maybe for gamepad players, but a mouse can easily turn 90 or 180 degrees in one frame, especially when we're talking about competitive players.

  • @beenguy5887
    @beenguy5887 Před rokem +41

    Holy. That demo is actually super impressive!! I'm mostly interested in that skewing effect thing. AMD DLSS 3 ALTERNATIVE COMING SOON?!!! I'm shocked that few (if any) real games actually use mouse-independent movement; honestly, in hindsight it seems so obvious.

  • @Armi1P
    @Armi1P Před rokem +20

    It's amazing how combining fairly simple tricks can give such an impressive result!

  • @bernegerrits8394
    @bernegerrits8394 Před rokem +64

    Stormworks: Build & Rescue implements something like this. When a big/laggy object or vehicle is loaded, the physics engine slows down, but the camera movement stays locked at 60 fps. It is quite obvious when the physics are running at 15 fps, but it helps reduce camera stutter and makes the game easier to control.

    • @geemcspankinson
      @geemcspankinson Před rokem

      It's really good

    • @512TheWolf512
      @512TheWolf512 Před rokem +2

      The Source engine has had this all along: physics in multiplayer runs way slower than in singleplayer, yet is independent of the scene's rendering frame rate.

    • @hotmailcompany52
      @hotmailcompany52 Před rokem

      KSP also did this, though nowhere near to the same level as Stormworks.

    • @Trikipum
      @Trikipum Před rokem +4

      this has nothing to do with what the video talks about. What you are talking about is called time dilation and has been done for decades in certain games. The engine simply goes in slow motion so it has time to process everything and keep a decent frame rate.

    • @shepardpolska
      @shepardpolska Před rokem +3

      It's kind of not the same thing. By that logic you could say Minecraft does the same thing, but the real reason is that the world you play in runs on an internal server and is disconnected from the render you are seeing.

  • @existentialselkath1264
    @existentialselkath1264 Před rokem +7

    Overscan (rendering content slightly off screen) is already occasionally used as a solution to certain screen-space artifacts from SSR etc.
    It would certainly be a good idea for this kind of thing.
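
A rough sketch of how such an overscan margin could be sized for reprojection, under the assumption that you pick a maximum camera turn you want to cover between two real frames (as Girugi notes elsewhere in the comments, a fast mouse flick can exceed any practical margin, and the extra pixels are not free):

```python
import math

def overscan_fov(display_fov_deg, max_turn_deg_per_real_frame):
    """Horizontal FOV to render so a reprojected frame still has pixels
    after the camera yaws by up to max_turn_deg between real frames."""
    return display_fov_deg + 2.0 * max_turn_deg_per_real_frame

def overscan_resolution(display_width_px, display_fov_deg, render_fov_deg):
    """Width needed to keep the same angular pixel density at a wider FOV
    (for a simple perspective projection, width scales with tan(fov/2))."""
    half_d = math.radians(display_fov_deg) / 2.0
    half_r = math.radians(render_fov_deg) / 2.0
    return int(round(display_width_px * math.tan(half_r) / math.tan(half_d)))

# Example: 90 degree display FOV, covering up to 15 degrees of turn per real frame.
fov = overscan_fov(90, 15)                      # 120 degrees
print(fov, overscan_resolution(1920, 90, fov))  # roughly 3326 px wide
```

For those example values, keeping the same angular pixel density already needs about 3300 horizontal pixels instead of 1920, which is why overscan margins are usually kept small.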

  • @ToonamiAftermath
    @ToonamiAftermath Před rokem +10

    Great video! I'm hoping that we don't have to wait too long for more advancements like this, I think a game engine can combine all of these technologies into 1 smooth and configurable experience that would advance gaming graphics and reduce the barrier to entry at the same time. Nice callout to Foveated rendering too.

  • @user-ub1on7ui7t
    @user-ub1on7ui7t Před rokem +3

    Congrats on being featured in LTT video!

  • @guest0191
    @guest0191 Před rokem +5

    Who would have thought this would be featured on LTT? I'm stoked this technology is getting the attention it deserves.

  • @gamingmarcus
    @gamingmarcus Před rokem +4

    It's amazing to see how your very unique perspective on tech topics gets appreciated more and more in the scene.

  • @cyjan3k823
    @cyjan3k823 Před rokem +5

    Beginning of this video was very confusing but I am glad I was there to see this demo, really impressive stuff

  • @splatlingsquid4595
    @splatlingsquid4595 Před rokem +7

    Congrats on being showcased and mentioned on LTT! Really love the new content and approach to tech you've been doing.

    • @treeinafield5022
      @treeinafield5022 Před rokem +1

      Which LTT video was it? I'd like to hear what they said.

  • @Sprixitite
    @Sprixitite Před rokem +3

    This feels like the kind of thing I go down a rabbit hole of when implementing mouse movement in a fresh project lol, good job

  • @denisruskin348
    @denisruskin348 Před rokem +4

    Seeing DLSS 2.x and FSR 2.x simultaneously in a game always cheers me up. Being able to enjoy "4K" and a big boost to fps (especially for high refresh rate panels) is just amazing. Zero or minimal loss to image quality and the game feels much smoother. It really looks like magic at times.

  • @sirreoser5668
    @sirreoser5668 Před rokem +15

    CSGO major starting and a 2kliks video. Oh boy we're spoiled

  • @AmiciCherno
    @AmiciCherno Před rokem

    This was amazing to witness even just as a demonstration, thank you for posting this video, let's hope the future of budget options in gaming looks better like this.

  • @psycl0n3
    @psycl0n3 Před rokem +2

    Really like this kind of content, these advancements have their uses and make sense. Super interesting topics lately!

  • @stezzzza
    @stezzzza Před rokem +1

    Love this video! Absolutely love VR and never even considered its uses for flatscreen gaming. Great job! ❤

  • @a_blind_sniper
    @a_blind_sniper Před rokem +6

    I think this is it. This video clearly states my exact frustration with low framerates and offers a clear solution that already works. I hope someone at Nvidia sees this because this is a fantastic idea.

  • @SirSicCrusader
    @SirSicCrusader Před rokem

    That is fascinating, and that there are already examples of the idea in action is really cool.

  • @Invisibilitylock
    @Invisibilitylock Před 4 měsíci

    Thanks dude, this really helped me understand the concept; it wasn't like most videos where I can't ever keep up with what is being said.

  • @mattdavis1475
    @mattdavis1475 Před rokem

    Great video, I had been thinking about reprojection outside VR and it's great to see that demo!

  • @elin4364
    @elin4364 Před rokem +7

    After seeing this I almost instantly wanted to try to implement it in the game I'm working on right now, but after thinking about it there are a few big issues with adding it to an existing project made in an existing game engine like Unity:
    1. Unity doesn't support async rendering, and I don't think most modern game engines do either. This means you can't really push multiple frames per frame like that in a good way.
    2. One possible way around that could be to split the work of one "real" frame into parts, i.e. if you're doing 4 "fake" frames per "real" frame you'd maybe do the CPU calculations on the first 2 frames and then the actual rendering on the next 2 (you could even do this in parallel, i.e. while you're rendering frame #1 you're also doing the CPU work of frame #2). This would still probably be slower than if you could actually do async rendering properly, though.
    3. Even if you did that, most existing games rely on doing operations "per frame", all of which you'd have to redo for this new system, which is A LOT of work.
    So you'd really need to write your own engine to do this effectively, which you can 100% do, but it suddenly becomes a huge project, sadly. It sounds like a really fun side project and I'm 100% gonna try it out haha.
    (Also, the demo shown off here is made in Unity, but it just renders the whole "real" frame in the last "fake" frame possible, meaning you don't really get anything extra out of it, sadly (except maybe lower GPU usage on more powerful GPUs, which is nice I guess). It certainly shows how cool the technology could be if engines properly supported it, though!)

    • @Qrzychu92
      @Qrzychu92 Před rokem +1

      Yeah, this should be supported at the engine level. The only thing you have to separate from the game logic is camera movement, so that the engine can take the "3D screenshot". Imagine the next Unreal adding a checkbox for this: you'd just have to implement camera movement in this specific way, and boom, unlocked.
      It would be WILD, especially for consoles.

  • @TheFirstObserver
    @TheFirstObserver Před rokem +9

    Interesting idea. IIRC, the RTS Supreme Commander did something similar, decoupling input and graphics processing from simulation processing. This was done to let updates process at 0.1 s intervals to prevent the simulation from lagging under load, while the higher refresh rate still kept the game smooth for the time and made inputs feel responsive (since clicks and UI still responded as fast as the engine could render).

    • @mrfatuchi
      @mrfatuchi Před rokem +2

      What you are talking about has been done since the dawn of video games. What the guy in the video is talking about is something different.

    • @TheFirstObserver
      @TheFirstObserver Před rokem

      @@mrfatuchi Well, I get that different items are often split off from the main gameplay loop into their own threads, and often update at their own frequencies, but I was under the impression this specific proposal was a decoupling of the engine's simulation from the graphics and UI specifically? So responses to player inputs are tied specifically to the graphics refresh rate?
      I guess I may be confused...?😅

    • @alex15095
      @alex15095 Před rokem +1

      ​@@TheFirstObserver What you're describing is doing game logic in a slower thread, and interpolating between states in the render thread while rendering as fast as possible. The video's proposal is to slap on another thread and transform the previously rendered frame to the new camera transform whenever the render thread hasn't produced a frame in time.

    • @TheFirstObserver
      @TheFirstObserver Před rokem

      @@alex15095 Oh, OK. I think I get it now. Thanks for the clarification. 🙂
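
For contrast, a minimal sketch of the older decoupling alex15095 describes: a fixed-timestep simulation with the renderer interpolating between the last two simulation states. `step_simulation`, `render`, and `read_input` are placeholders for engine-provided callables; the video's technique instead warps an already-rendered image to the newest camera pose.

```python
import time

SIM_DT = 0.1                      # fixed simulation step (10 Hz, SupCom-style)

def game_loop(initial_state, step_simulation, render, read_input,
              now=time.perf_counter):
    prev_state = curr_state = initial_state
    accumulator = 0.0
    last = now()
    while True:
        t = now()
        accumulator += t - last
        last = t

        # Advance the simulation in fixed steps, input sampled every step.
        while accumulator >= SIM_DT:
            prev_state, curr_state = curr_state, step_simulation(
                curr_state, read_input(), SIM_DT)
            accumulator -= SIM_DT

        # Render as often as the GPU allows, blending between sim states
        # so motion looks smooth even though the sim only ticks at 10 Hz.
        alpha = accumulator / SIM_DT
        render(prev_state, curr_state, alpha)
```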

  • @Mr.Electricc
    @Mr.Electricc Před rokem +2

    Hey glad people get to see the brother of 3clicksphillip

  • @ThreeCatProductions
    @ThreeCatProductions Před rokem +1

    Fantastic video, really interesting stuff, perhaps because I studied real-time 3D technologies at University. This is really well researched, and cleanly worded. Great work!

  • @SpidFightFR
    @SpidFightFR Před rokem +2

    That looks nice! I look forward to seeing technologies like this in games!

  • @Dribbleondo
    @Dribbleondo Před rokem +4

    I think Stinger's onto something with that 3D projection tech.

  • @truly8644
    @truly8644 Před rokem

    I've been thinking about this as a possibility for a while now, but having someone with this much charisma explain it convinces me far more than I ever convinced myself :)

  • @olivercuddeford8920
    @olivercuddeford8920 Před rokem +43

    This is similar to how a phone’s refresh rate can be 60Hz but its touch sample rate can be triple that, which makes 60Hz feel much smoother and more responsive than it would if touch detection was pinned to screen refresh rate

    • @Jona69
      @Jona69 Před rokem +22

      I don't think that's a good comparison. That would be similar to how mouse sampling rates are usually 500 or 1000 Hz.

    • @shanroxalot5354
      @shanroxalot5354 Před rokem

      Johnathan is correct. This is already how our inputs work.

  • @Matt-nv2qg
    @Matt-nv2qg Před rokem

    Can't believe you convinced me, but yeah, this is a great idea. Nicely done; the demo really did it.

  • @58geniusplayer
    @58geniusplayer Před rokem +2

    Wow the demo is really great

  • @JoshM7
    @JoshM7 Před rokem +3

    I have wanted something like this forever. For a very long time I felt like something like this should have existed.
    It could be my mind remembering games as better than they were, but I feel like older games (can't put an exact date on it, but maybe around the PS1/PS2 era or earlier?) were more responsive than most games today even at low frame rates. But I could be wrong. Either way, I really want this. It reminds me of playing at 120-240 fps on a 60 Hz screen: you can't see the extra frames, but the game feels drastically quicker/smoother because of the low frame times.
    But if you can make inputs a separate thing polling at a higher rate then you don't have to draw a lot more frames to feel this effect.

  • @atrombonist
    @atrombonist Před rokem +1

    LTT didn't bring me here, but you getting a shoutout once more proves your in-depth videos are quality content!

  • @Blaxpoon
    @Blaxpoon Před rokem +2

    I think we will always need these tricks no matter how good the hardware becomes, because artists always get more ambitious and push it to its limits.

  • @raychua
    @raychua Před rokem +2

    Linus Tech Tips redirected me here. This makes even better sense for cloud gaming, where images are rendered and streamed and your inputs clearly lag from the network round trip. There's something called GGPO rollback that dealt with this for fighting games; put together with this, it could improve things even further.

  • @uwunora
    @uwunora Před 11 měsíci +2

    What an awesome demo. I'm amazed this hasn't taken over gaming

  • @fireaza
    @fireaza Před rokem +1

    Ooooh, this has cleared up a few things about synthesized frames in VR for me. I always assumed it was like motion interpolation in TVs, where a "fake" frame is generated, but it's actually just moving the viewpoint on the current frame? That would explain why head movement always feels smooth, even when you're getting a bad framerate. It would also explain why, when the FPS gets really low, you can sometimes turn your head and see the edges of the image. This must be one of the tricks Valve came up with when they were developing the Vive and reached the point where none of their playtesters were getting motion sick anymore.

  • @JDSPonYT
    @JDSPonYT Před rokem +17

    This already exists with SpecialK and its Latent Sync + input offset.
    EDIT: the input not being tied to framerate for games, that is.

    • @brett20000000009
      @brett20000000009 Před rokem

      Err, no, SpecialK does not have async timewarp. Wtf, 17 people upvoted this.

    • @JDSPonYT
      @JDSPonYT Před rokem

      @@brett20000000009 read the EDIT (that was made seconds after the comment was made)

  • @jpeg8982
    @jpeg8982 Před rokem

    I'd never heard of this concept before. I love it.

  • @jakesdd5401
    @jakesdd5401 Před rokem +1

    This, this is what we need!

  • @vtvitibttboybobtottbo
    @vtvitibttboybobtottbo Před rokem +1

    Great video. I'd never heard of these approaches to smoothing out a 30 FPS experience, but now I'm very interested.

  • @stephenkamenar
    @stephenkamenar Před rokem

    Very cool idea for sure. Responsive mouse movement really is the most important thing.

  • @sadago2690
    @sadago2690 Před rokem +4

    Someone got the Linus bump

  • @elliejohnson2786
    @elliejohnson2786 Před rokem +1

    Oh my god, this is genius! I actually didn't even realise the game was running at 15 FPS when you had 144 Hz mouse movement, and at a glance the extended rendering Comrade Stinger's demo comes with is good enough. The only time I noticed was when you moved the character itself.

  • @sirbughunter
    @sirbughunter Před rokem

    Really fascinating!

  • @sashamakeev7547
    @sashamakeev7547 Před rokem +1

    Well, for the first third of the video I was like "you can't do that...", but having watched the whole thing: these are great ideas and tricks that can make gaming better!

  • @dadgamer6717
    @dadgamer6717 Před rokem

    excellent video! very interesting! thanks!

  • @Caynug
    @Caynug Před rokem

    Wow, I want this in all games. Truly revolutionary; I've thought about this stuff for years because VR makes it work so well.

  • @blinded6502
    @blinded6502 Před rokem +1

    Actually, now that I recall it, I saw this reprojection thing years ago in some VR demo.
    You could pilot a Mars rover while standing in a dark room, with a small screen of what the rover sees floating in front of you, and you could use your controller to steer the camera, updating that view as if it were a flashlight of sorts.
    At least that's how I remember it.

  • @mariuspuiu9555
    @mariuspuiu9555 Před rokem +105

    The problem is that regardless of how you try to decouple the mouse input from the "gameplay", input lag will be the same: you are still stuck waiting for your input to manifest in the game. UNLESS you want the game to calculate things without them appearing on screen, which introduces a whole lot of other issues; clicks won't register on screen at the correct visual timings. (It will just end up feeling like lag compensation, like how you can get kills or get killed after moving behind a wall.)

    • @AGuy-vq9qp
      @AGuy-vq9qp Před rokem +5

      It’ll be very small though, so not much of an issue.

    • @TestingInProgress1234
      @TestingInProgress1234 Před rokem

      Just update raycasting on the same thread as the mouse movement; problem solved.

    • @samnicholson9047
      @samnicholson9047 Před rokem +10

      There's no reason that a game can't process mouse clicks faster than the base framerate. There might be a delay in seeing the physical bullet, but the position on screen will be correct. It shouldn't be too difficult to run the game engine faster than the graphics, since we're already doing that with the interpolation anyways.

    • @luddeoland
      @luddeoland Před rokem +5

      As he said, this should not be used for competitive games. The "click problem" would really only arise when you make an inhuman flick and don't even know whether you should have hit or not.

  • @amarynthos519
    @amarynthos519 Před rokem

    I love videos like these.

  • @devlin1991
    @devlin1991 Před rokem +2

    When I wrote the renderer for JanusVR (which did have a 2D KBM-driven mode) I went with a system that prioritized mouse/head movement latency over keyboard or mouse click latency. Rather than being pipelined like most renderers, it would try to always render to vsync and would re-render a previous frame's content at the new camera position if the CPU "frame builder" thread was unable to provide a new frame in time. Our scene content was user controlled, so badly behaving scripts or poorly built avatars (per-triangle unique materials......) were common, and this kept motion-to-photon latency more stable. It shouldn't be too difficult to find or modify one of the timewarp or spacewarp implementations to work in a purely mouse-driven scene. I'd love to try out a prototype of that.
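
That loop translates roughly into a sketch like the one below, where `build_frame`, `warp_to`, `present`, `poll_camera`, and `vsync_wait` are placeholders for whatever the engine provides (an illustration of the idea, not JanusVR's actual code):

```python
import queue, threading

def display_loop(build_frame, warp_to, present, poll_camera, vsync_wait):
    """Always present at the newest camera pose; fall back to warping the
    previous frame whenever the slow frame builder misses a refresh.
    All five arguments are placeholders for engine-provided callables."""
    requests, results = queue.Queue(maxsize=1), queue.Queue(maxsize=1)

    def frame_builder():
        while True:                        # slow path: full renders
            results.put(build_frame(requests.get()))
    threading.Thread(target=frame_builder, daemon=True).start()

    last_frame = None
    while True:
        vsync_wait()                       # once per display refresh
        camera = poll_camera()             # freshest mouse/head pose

        if not results.empty():
            last_frame = results.get()     # a full frame arrived in time
        if requests.empty():
            requests.put(camera)           # kick off the next full render

        if last_frame is not None:
            # Either a fresh frame or the old one warped to the new pose.
            present(warp_to(last_frame, camera))
```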

  • @paulwolff2121
    @paulwolff2121 Před rokem

    I was super excited when Unity released the feature that decouples input from rendering. I loved developing mobile apps with Unity, but the biggest drawback was always that they killed your battery suuuper quickly, even if you were only rendering the main menu. This feature lets you significantly reduce the frame rate in menus when there is no input from the user, and only raise it again when the user is scrolling or anything is animating. It was possible before, but you always had to wait for the next frame for user input to be detected, and thus had lag.
    Didn't think about FPS at the time. Great use of that feature!

  • @jcquints3364
    @jcquints3364 Před rokem

    holy shit, i'm pretty much unaware of vr tech since i'm not really interested in vr gaming. but holy this is some next level stuff lol. game optimization is such a work of art

  • @Cup_70
    @Cup_70 Před rokem

    Lots of good ideas. Anything to make games FEEL good is what I want.

  • @triplebog
    @triplebog Před rokem

    This is a fascinating idea. Games already generally have multiple internal refresh rates.

  • @Gardor
    @Gardor Před rokem

    I cant wait for this to become standard, this is so sick!!!

  • @Snoozie
    @Snoozie Před rokem +4

    The best part is that this is a worst-case scenario in what is essentially a tech demo.
    Imagine this stuff at 60 fps, or 72 fps.
    While you would probably not get a competitive advantage from the frames updating, with 144 Hz mouse movement this would, in a sense, completely bypass the reaction-time penalty of games running poorly.
    I hope this catches on.

  • @JacobLukasiewicz
    @JacobLukasiewicz Před rokem

    This video should blow up and go viral so we can get a response from real AAA game developers. Great vid!

  • @dominikrohel2546
    @dominikrohel2546 Před rokem +2

    This goes off the topic of upscaling, but I recently found out about DXVK (DirectX over Vulkan), which as far as I know helps stabilise and/or increase framerate (mostly) in older games. I reckon people would be thankful to learn that they can finally run GTA 4 on max settings with a smooth framerate, which DXVK helped me achieve, so a video on this would be awesome.

  • @FauzanARB
    @FauzanARB Před rokem

    Excellent video 👍🏻

  • @tdreamgmail
    @tdreamgmail Před rokem +2

    It actually makes low-framerate gaming much more appealing, like on laptops and old computers with old graphics cards.

  • @aaronlink127
    @aaronlink127 Před rokem +2

    This kind of tangentially reminds me: Overwatch 2 has an interesting approach to mouse input. It lets you shoot in between frames and actually calculates where your camera *would* have been when you shot, even if it never spent a single frame there. Very interesting stuff. That tech plus this visual interpolation could help too.
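
The comment doesn't say how Overwatch implements it, but conceptually it could look like this sketch: keep a short history of timestamped camera angles recorded at the mouse polling rate, and when a click arrives between frames, reconstruct the orientation at the click's timestamp before running the hit test (the data layout here is hypothetical):

```python
from bisect import bisect_left

def camera_at(click_time, samples):
    """samples: list of (timestamp, yaw, pitch) recorded at the mouse
    polling rate, sorted by timestamp. Returns the camera angles the
    player was actually aiming with when they clicked, even if no frame
    was ever rendered at that moment. (Ignores 360-degree wrap-around
    for simplicity.)"""
    times = [t for t, _, _ in samples]
    i = bisect_left(times, click_time)
    if i == 0:
        return samples[0][1:]
    if i == len(samples):
        return samples[-1][1:]
    (t0, y0, p0), (t1, y1, p1) = samples[i - 1], samples[i]
    a = (click_time - t0) / (t1 - t0)      # linear blend between samples
    return y0 + a * (y1 - y0), p0 + a * (p1 - p0)

# Example: mouse sampled at 1 kHz, click lands between two samples.
samples = [(0.000, 10.0, 0.0), (0.001, 10.4, 0.1), (0.002, 10.9, 0.2)]
print(camera_at(0.0015, samples))   # -> (10.65, 0.15)
```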

  • @JustLukeMinecraft
    @JustLukeMinecraft Před rokem

    This is a fantastic idea! Game designers please take notes

  • @isaiahbriggs2093
    @isaiahbriggs2093 Před rokem +3

    This would be great for old games that have FPS-limited physics.

  • @qudruplem8570
    @qudruplem8570 Před rokem

    very cool thank you big man phillip

  • @MrAlaxUA
    @MrAlaxUA Před rokem +3

    As a game developer, one important issue comes to mind: what happens to the game elements that are tied directly to the camera rotation, like your gun model? VR games can freely use warping because there is never anything of value attached to your face, and your arms' fluidity is not as important as the camera's. The problem with warping is that it's fairly simple, in the sense that it just takes the last frame as a textured plane and transforms it. That means your on-screen gun, which is usually tied directly to your camera movement, will feel rubbery and laggy. And I don't see a simple way of addressing that, except trying to reduce the time to the next frame as much as possible by using frame interpolation.

    • @RazielXT
      @RazielXT Před rokem +1

      With velocity vectors you would know not to move those pixels anywhere. Of course, it would have to be a "smart" warp.

  • @EnderCrypt
    @EnderCrypt Před rokem

    this would be amazing

  • @WayStedYou
    @WayStedYou Před rokem +5

    This video just got a shoutout on LinusTechTips channel. 13:50

  • @BrainTako
    @BrainTako Před rokem

    Sounds like a great idea, I'm nowhere near informed enough to know if it's viable. But as always, you bring something interesting to the table.

  • @Snoozy96
    @Snoozy96 Před rokem +1

    One thing I really want people to know is that on top of all of this, there is AI-generated facial animation, AI-generated voices, and AI-generated text that can adhere to storylines and world rules. Combine all of this together and we will get an amazing VR experience like none before. You could genuinely create an entire VR MMO with continuous questing that is random yet follows the world's laws, while also being unique and generating animations for the specific body types of characters in the world.
    We are so close to achieving these types of games. I hope that in the next decade something close to an alpha version that combines it all and just tests these foundations will come out.

  • @Pfaeff
    @Pfaeff Před rokem

    I love this idea.

  • @brett20000000009
    @brett20000000009 Před rokem +2

    Stinger released a new version with improved player reprojection, and it works even better.

  • @powermaxx11
    @powermaxx11 Před rokem

    I need this in my veins right now

  • @timmybus21
    @timmybus21 Před rokem +1

    this is actually genius!!

  • @Matthew-.-
    @Matthew-.- Před rokem +1

    This stuff would almost certainly require some wizardry at the engine level to allow interaction between the player and the world if they're being rendered separately.

  • @GlassFoxGear
    @GlassFoxGear Před rokem

    God I love this channel

  • @MarioRCarbonell
    @MarioRCarbonell Před rokem +1

    To address some of the criticism: the scene and the user interface (minimap, stats, etc.) should be rendered separately and composited in response to mouse input.
    The video card should render the 3D scene, modify it as shown in the demo, and then put the superimposed user interface on top. That would address a lot of the critical comments. The gun or the hands in a first-person game could also be treated as part of the user interface so they don't end up in the wrong position.
    A further enhancement of the technique could be to render different scene planes at different frame rates:
    - Long-distance objects, like mountains, sky, etc., could be rendered at 5 fps.
    - Medium-distance objects, rendered at 15 fps.
    - Near objects, at 60 fps.
    (The fps numbers are made up, just an example.)
    Those planes would move according to the camera perspective immediately after the user input.
    The final image would be a combination of long distance + medium + near + user interface, each layer updated at its own frame rate and combined into a new frame after the user input.
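
A minimal sketch of the scheduling part of that layered idea, assuming each layer can be fully re-rendered on some display frames and merely warped to the current camera on the rest (`render`, `warp`, and `blend` are placeholder callables):

```python
def composite_frame(layers, frame_index, camera, warp, blend):
    """layers: list of dicts such as
         {"name": "far", "fps_divisor": 12, "image": None, "render": fn}
       A layer with fps_divisor N is re-rendered every Nth display frame
       and only warped to the new camera pose on the frames in between."""
    output = None
    for layer in layers:                   # back to front: far, mid, near, UI
        if layer["image"] is None or frame_index % layer["fps_divisor"] == 0:
            layer["image"] = layer["render"](camera)       # full re-render
        warped = warp(layer["image"], camera)              # cheap, every frame
        output = warped if output is None else blend(output, warped)
    return output
```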

  • @yellowlemonmothfreak
    @yellowlemonmothfreak Před rokem

    For the first time ever for any thing or opinion you have ever explained I agree absolutely and without Limits. Pretty much any modern graphics chip can get I'm more than reasonable frame rate in a grand majority of games except Apex Legends or cyberpunk. It really only makes sense to look at game optimization as a function of of input to frame consistency now. This is the true factor for game performance now in my opinion. Now I need to find a way to make this same optimization with my body and my brain as it's the ultimate skill in life I think