I’m Embarrassed I Didn’t Think of This.. - Asynchronous Reprojection

  • Published 23. 04. 2024
  • Get up to 25% Off Pulseway's IT Management Software at lmg.gg/PulsewayLTT
    Save up to *40% off and get Free Worldwide Shipping until Dec. 22nd at www.ridge.com/LINUS
    What if you didn’t need the best frame rate to reduce input latency? What if your display’s refresh rate was enough all on its own? With Async reprojection, anything is possible… Even turning 30 FPS into 240.
    Discuss on the forum: linustechtips.com/topic/14713...
    Comrade Stinger's original video + download links: • Async Reprojection out...
    2kliksphilip's video about async reprojection: • The future of upscaling?
    Purchases made through some store links may provide some compensation to Linus Media Group.
    ► GET MERCH: lttstore.com
    ► SUPPORT US ON FLOATPLANE: www.floatplane.com/ltt
    ► AFFILIATES, SPONSORS & REFERRALS: lmg.gg/sponsors
    ► PODCAST GEAR: lmg.gg/podcastgear
    FOLLOW US
    ---------------------------------------------------
    Twitter: / linustech
    Facebook: / linustech
    Instagram: / linustech
    TikTok: / linustech
    Twitch: / linustech
    MUSIC CREDIT
    ---------------------------------------------------
    Intro: Laszlo - Supernova
    Video Link: • [Electro] - Laszlo - S...
    iTunes Download Link: itunes.apple.com/us/album/sup...
    Artist Link: / laszlomusic
    Outro: Approaching Nirvana - Sugar High
    Video Link: • Sugar High - Approachi...
    Listen on Spotify: spoti.fi/UxWkUw
    Artist Link: / approachingnirvana
    Intro animation by MBarek Abdelwassaa / mbarek_abdel
    Monitor And Keyboard by vadimmihalkevich / CC BY 4.0 geni.us/PgGWp
    Mechanical RGB Keyboard by BigBrotherECE / CC BY 4.0 geni.us/mj6pHk4
    Mouse Gamer free Model By Oscar Creativo / CC BY 4.0 geni.us/Ps3XfE
    CHAPTERS
    ---------------------------------------------------
    0:00 Intro
    1:36 Play along at home!
    1:57 I'm sorry, what is this? Demo time!
    3:40 That looks bad, can it be improved?
    5:12 Why haven't we been using this??
    6:38 Blind frame rate tests
    9:44 If frame rate is so important, why can't they tell?
    11:52 Showing how the sausage is made
    12:31 This could be a game-changer!
  • Science & Technology
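The description's "turning 30 FPS into 240" pitch boils down to decoupling the game's render rate from the display's refresh rate: every refresh reprojects the newest rendered frame against the latest input. A minimal scheduling sketch in Python (my own illustration; the function name and rates are assumptions, not from the video):

```python
# Sketch of the core scheduling idea, assuming a 30 Hz renderer feeding a
# 240 Hz display: most refreshes reuse (reproject) the last real frame.

def schedule_frames(render_hz: int, display_hz: int) -> list:
    """For one second of display refreshes, mark which refreshes get a
    freshly rendered game frame and which reproject the previous one."""
    schedule = []
    frames_rendered = 0
    for i in range(display_hz):
        # A new game frame is due once i/display_hz reaches
        # frames_rendered/render_hz (compared with integer math).
        if i * render_hz >= frames_rendered * display_hz:
            schedule.append("rendered")
            frames_rendered += 1
        else:
            schedule.append("reprojected")
    return schedule

sched = schedule_frames(render_hz=30, display_hz=240)
```

Of the 240 refreshes in a second, only 30 carry a full render; the other 210 are cheap warps of the latest frame, which is why input can feel 240 Hz responsive while the GPU only works at 30 FPS.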

Comments • 3K

  • @LinusTechTips
    @LinusTechTips a year ago +366

    Thanks to Ridge for sponsoring today's video! Save up to *40% off and get Free Worldwide Shipping until Dec. 22nd at www.ridge.com/LINUS

    • @theboogerbomb
      @theboogerbomb a year ago +4

      thank you ridge

    • @nukeshine
      @nukeshine a year ago

      ridge sucks!!!!!!!!

    • @gibbyhale4217
      @gibbyhale4217 a year ago

      Why does it look like inside of a vagania

    • @YTxGalaxy
      @YTxGalaxy a year ago +1

      6:43 "Top 5 - 10%", me who's top 0.1% & still haven't become a pro player...

    • @0opsyt
      @0opsyt a year ago +3

      if you render a little bit off screen you might not need stretching

  • @tillson8686
    @tillson8686 a year ago +4881

    I asked in a VR subreddit about a year ago why nobody is making Async for computer games and people gave me shit about it like "wouldn't work that way, the idea is stupid, just not possible, etc." so I gave up. Glad I asked the right people

    • @InfernosReaper
      @InfernosReaper a year ago +766

      There are a lot of people who like to make the seemingly safe bet of saying "it won't work" without actually knowing, because they aren't the experts they want to pretend they are.
      If a person speaks in absolutes without even trying to explain why, chances are they are not truly experts. They might know some things and even genuinely think themselves to be experts, but in reality, they have much more to learn.

    • @kristmadsen
      @kristmadsen a year ago +211

      People that reply to things on the internet tend to respond that way to new ideas.

    • @kazioo2
      @kazioo2 a year ago +89

      Maybe you got this answer because it was discussed and tested like a hundred times since John Carmack invented it in 2012. There are serious issues with this that are much less problematic in VR.

    • @ffwast
      @ffwast a year ago +262

      The right idea is not asking redditors anything.

    • @DJayFreeDoo
      @DJayFreeDoo a year ago +75

      @@kristmadsen It was the same when the first computer mouse was invented. The higher-ups said it's useless, why would anyone need this? And bam! Everyone has a mouse or a trackpad

  • @tubehellcat
    @tubehellcat a year ago +4080

    "He owns a display" - that's gotta haunt him forever like the "you're fired" for Colton 😂
    Love it 😁

    • @fsendventd
      @fsendventd a year ago +33

      which video is "he owns a display" from again? having a hard time finding it

    • @Jeffrey_Wong
      @Jeffrey_Wong a year ago +98

      @@fsendventd He's been doing a lot of monitor unboxing videos on ShortCircuit, I think it's from one of the Alienware monitor videos

    • @GigaSeadramon
      @GigaSeadramon a year ago +58

      @@fsendventd it's from the 8k gaming video

    • @AnuraagDaniel
      @AnuraagDaniel a year ago +94

      @@Jeffrey_Wong yeah, it's also a direct quote, he says "I own a display" in the dlss 3.0 video

    • @TwinShards
      @TwinShards a year ago +22

      Yeah, I got a really good laugh out of those 4 words under his name.

  • @Blap7
    @Blap7 a year ago +1359

    2kliksphilip and LTT is a crossover I never knew I needed. Make it happen.

    • @stupot46
      @stupot46 a year ago +16

      Bump lol

    • @Mraz565
      @Mraz565 a year ago +10

      Wonder if it can be used with CSGO, whether Valve allows it or you brute-force it.

    • @morfgo
      @morfgo a year ago +4

      They won't
      They just use him and his ideas with not even 1 full second of credit

    • @mipacem
      @mipacem a year ago +4

      @@morfgo it's not malicious

    • @isaacolukanni
      @isaacolukanni a year ago +17

      @@morfgo Dude, they literally credited him and his video in the description!

  • @thepillowmancer
    @thepillowmancer a year ago +233

    Not mentioned in the video: you can render frames at a slightly higher FOV and resolution than the screen, so that there's some information "behind" the monitor corner.
    It won't save you from turning 180 degrees, but it will fix most of the pop-in for a very slight hit to performance.

    • @lyrilljackson
      @lyrilljackson a year ago +1

      this is not what pc gaming is supposed to be about. using vr handmedown techs. and vr and pc scene shouldn't be segregated and minding their own scene either if yall sudden mindlessly hurrah at this crossover weirdofiesta

    • @martinkrauser4029
      @martinkrauser4029 a year ago +29

      @@lyrilljackson What do you mean?

    • @lyrilljackson
      @lyrilljackson a year ago

      @@martinkrauser4029 asmh

    • @JustSomeDinosaurPerson
      @JustSomeDinosaurPerson a year ago +27

      @@lyrilljackson Brainlet take

    • @latinodollar
      @latinodollar a year ago +15

      @@JustSomeDinosaurPerson Careful... thats a [Lvl. 163] PC Master-Supremacist, the bane of mobile, console, and vr gamers.....

  • @rodrigoteles1409
    @rodrigoteles1409 a year ago +942

    Plouffe's "He owns a display" gag is always going to crack me up.

    • @Lu-db1uf
      @Lu-db1uf a year ago +1

      Don't all of them own displays? It's a tech media company, I'd hope they do.

    • @Fay7666
      @Fay7666 a year ago +57

      There was a video a couple of weeks ago about his _display._

    • @bigdoggo5827
      @bigdoggo5827 a year ago +25

      @@Lu-db1uf You know.. that's the joke

    • @reeepingk
      @reeepingk a year ago +37

      @@Lu-db1uf But his display is.... *special*

    • @thebyzocker
      @thebyzocker a year ago +7

      @@Lu-db1uf he bought the Alienware mini-LED one and he's proud that he was one of the first to get it, and now it's a meme

  • @mauromerconchini
    @mauromerconchini a year ago +967

    I'm so happy Phil put a spotlight on this concept, and I'm even happier that a channel like LTT is carrying that torch forwards.

    • @SirDragonClaw
      @SirDragonClaw a year ago +12

      I tried to build something like that demo a few years ago, but I was trying to use motion vectors + depth to reproject my rendered frame, which I never got to work correctly. In my engine I rendered a viewport larger than the screen to handle the issue with the blackness on the edges, and then was going to use tier 2 variable rate shading to lower the render cost of the parts beyond the screen bounds. But VRS was not supported in any way on my build of MonoGame, which is what my engine was built upon, so that was another killer for the project.
      I am so glad that Phil popularised the idea, and it's awesome that someone else managed to get something like this working. How he did it in one day I will never know; I spent like 3 weeks on it and still failed to get it working correctly. I should find my old demo and see if I can get it compiling again.

  • @heeerrresjonny
    @heeerrresjonny a year ago +355

    You might be able to hide a lot of the edge warping by basically implementing overscan where the game renders at a resolution that's like 5-10% higher than the display resolution, but crops the view to the display resolution. It should in theory be only a very minor frame rate hit since you're just adding a relatively thin border of extra resolution.

    • @carlo6953
      @carlo6953 a year ago +46

      The size of border you would need to eliminate the edge warping would probably impact performance more than just using a higher refresh rate to lower the amount of warping in the first place.

    • @ABonYT
      @ABonYT a year ago +36

      The magic combo there would be foveated rendering alongside the async reprojection with overscan. The games this would make sense for will inevitably be a case-by-case thing, but the performance gains would be massive.

    • @batuhancokmar7330
      @batuhancokmar7330 a year ago +78

      @@carlo6953 That assumes you'd need the same resolution for the overscan. If the game is rendered at 45deg FOV at 1440p, render an overscanned area between 45 and 90deg FOV at 360p. You don't need a lot of detail, just something to make valid guesstimates within that motion blur until the proper frame fills up the screen.

    • @WiggyWamWam
      @WiggyWamWam a year ago +1

      Yes, definitely. Surprised they don’t do this.

    • @ccd03c
      @ccd03c a year ago

      I’m glad I wasn’t the only one thinking this
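The overscan suggestions in this thread are easy to put numbers on. A back-of-the-envelope sketch (my own arithmetic; the resolutions and border size are assumed, not from the video or the comments):

```python
def overscan_pixel_cost(width: int, height: int, border_frac: float) -> float:
    """Fractional increase in rendered pixels when a symmetric border of
    border_frac * dimension is added on every edge of the frame."""
    over_w = width * (1 + 2 * border_frac)
    over_h = height * (1 + 2 * border_frac)
    return (over_w * over_h) / (width * height) - 1.0

# A 5% border on each edge of a 2560x1440 frame costs about 21% more pixels,
# which is why replies suggest rendering the border at reduced resolution.
cost = overscan_pixel_cost(2560, 1440, 0.05)
```

The cost grows quadratically with the border width, which supports carlo6953's caution above: a border big enough to fully hide edge warping is not free.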

  • @boysenbeary
    @boysenbeary a year ago +115

    As someone who plays VR constantly, it’s nice to see this brought up for non-VR stuff

  • @Nabalazs
    @Nabalazs a year ago +1508

    I am so happy that Philip managed to get the message THIS far out. I do fear that this tech might have issues with particles and moving objects and the like, but when you mentioned that we could use DLSS to ONLY FILL IN THE GAPS, my jaw dropped. That's so genius! I really hope that this is one of those missed-opportunity oversights in gaming, and there isn't some major issue behind it not being adopted yet.

    • @antikz3731
      @antikz3731 a year ago +33

      Exactly. On Linux this exact setup has been available for the last year. It makes a massive difference

    • @hubertnnn
      @hubertnnn a year ago +96

      You don't need to worry about particles, just render them later.
      The whole idea behind this solution is to split rendering into two phases:
      1. Render the scene (expensive 3D phase)
      2. Render the final frame from pictures of the scene (cheap 2D rendering)
      Just move all particle and HUD rendering to phase 2.
      To be honest, I would suggest going even further and adding a phase `1.1` where you use DLSS to draw the less important background stuff. This way you can render the important objects in 4K and background objects (buildings, grass, trees etc) in 720p or lower and just upscale with DLSS.
      Or go even further and render each layer at a different framerate:
      background at 30fps, objects at 60fps, and the final image at 120fps.

    • @GeneFJacket
      @GeneFJacket a year ago +39

      This was my first thought, too. Combining Async with DLSS/FSR could potentially be the actual magic bullet we're looking for.

    • @MrDavibu
      @MrDavibu a year ago +24

      @@hubertnnn
      I mean, cheap is relative; if all animated and moving objects and particles have to be rendered later on, it won't be that cheap.
      Especially if it means that transparent objects have to be rendered after that.
      Also, screen-space reflections of animated objects will disappear if they are not part of an animated object themselves.
      Not saying it's not interesting, but it's definitely not a solution without compromises.

    • @tralphstreet
      @tralphstreet a year ago +12

      @@antikz3731 ??? Explain. I use Linux and there's no such thing.
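The layered-framerate idea in hubertnnn's reply above (background at 30, objects at 60, the composite at 120) amounts to a per-layer update schedule. A toy sketch (my own code; the layer names and rates are assumptions, and each rate is assumed to divide the display rate evenly):

```python
def layers_to_update(frame_index: int, display_hz: int, layer_rates: dict) -> list:
    """Return the layers that must be re-rendered on a given display frame.
    Assumes every layer's rate divides display_hz evenly."""
    due = []
    for name, hz in layer_rates.items():
        period = display_hz // hz  # display frames between this layer's updates
        if frame_index % period == 0:
            due.append(name)
    return due

rates = {"background": 30, "objects": 60, "composite": 120}
```

On a 120 Hz display, frame 0 renders everything, odd frames only recomposite the existing layers, and the expensive background is touched once every 4 frames.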

  • @moonsicklegaming7990
    @moonsicklegaming7990 a year ago +626

    2kliksphilip is an unsung hero; his DLSS coverage is also some of his best content

    • @MrChanw11
      @MrChanw11 a year ago +23

      His upscaling content is the best ;)

    • @Aeroxima
      @Aeroxima a year ago +2

      Never seen either of those but I agree

    • @Diie89
      @Diie89 a year ago +38

      Personally super excited to see 2klicksphilip's video referred to in a LTT video, a lot of Philip's content is really high quality, especially the ones where he covers DLSS and upscaling as mentioned earlier. Can't recommend checking it out enough!

    • @elise3455
      @elise3455 a year ago +21

      2kliksphilip had a good idea, but 3kliksphilip is more advanced in every way!

    • @HonoredMule
      @HonoredMule a year ago +4

      @@elise3455 3klicksphilip is just more work. Both will be _automatically_ obsolete when 0clicksphilip releases.

  • @palaashatri
    @palaashatri a year ago +176

    Take this as a compliment: I love how LTT has now transformed more into a Computer Science/Electronics for Beginners channel than just another "Hey we got a NEW GPU [REVIEW]" channel.

    • @PrograError
      @PrograError a year ago +6

      Well... They had covered everything on that aisle...

    • @sirspamalot4014
      @sirspamalot4014 a year ago +4

      It's why I keep watching them, I got tired of watching reviews of hardware I can't afford/don't really need yet. Though my VR rig is getting very tired.

  • @TheTrainMaster15
    @TheTrainMaster15 a year ago +273

    Philip is revolutionising the way we think about gaming and game dev just with common sense

    • @Neurotik51
      @Neurotik51 a year ago +5

      what? nothing here is new

    • @TheTrainMaster15
      @TheTrainMaster15 a year ago +55

      @@Neurotik51 using technology for VR with conventional monitors? I haven’t heard of that before

  • @KalkuehlGaming
    @KalkuehlGaming a year ago +308

    I know philip will see this and I know he will feel awesome.
    You have come a long way Philip. I am proud to be part of your community since your first tutorial videos.

    • @charliegroves
      @charliegroves a year ago +9

      Here's to Philip, love his videos on all 3 of his channels

    • @TuRmIx96
      @TuRmIx96 a year ago +3

      Love him. His tutorials laid the base for my environment artist gamedev job.

    • @fargoth92
      @fargoth92 a year ago +1

      kliki boy i love you

    • @jo_kil9753
      @jo_kil9753 a year ago +2

      @@charliegroves more like 14 lol

  • @KingBowserLP
    @KingBowserLP a year ago +311

    One thing that Philip's video covers that this one does not, and which I'm personally really excited about: combining this with a low shading rate border around the viewport (the fully rendered frame). Since peripheral vision is more trained on movement than detail, this is fine quality wise, and it means the screen doesn't have to guess what's at the edges - the information is already there, just in lower quality than the main viewpoint would have. That would, if not eliminate, significantly reduce the stretching artifacts.

    • @666Tomato666
      @666Tomato666 a year ago +22

      like doing actual FOVeated rendering, where the "sharp part" is the whole normal viewport, while the low resolution is just around it, like extra 5-10% or so

    • @ffsireallydontcare
      @ffsireallydontcare a year ago +2

      I haven't seen Philip's video but I'm guessing you'd need eye tracking as well. It'd be pointless to render the fringes of the "screen" at a lower quality if you can point your eyeball directly at it...

    • @gabrielenitti3243
      @gabrielenitti3243 a year ago +12

      @@ffsireallydontcare what if the lower quality rendered parts are actually outside your screen? You would trade a bit of framerate for more accurate projection predictions which would recoup the lost performance and give you a better experience

    • @rileyn2983
      @rileyn2983 a year ago +5

      @justathought No because it will be fixed in 1/30th of a second. It's obviously not perfect, but that's what this technique is about, compromises. Lower resolution fringes would be way better than stretching.

    • @ffsireallydontcare
      @ffsireallydontcare a year ago +1

      @@gabrielenitti3243 Ahh ok, yeh that makes more sense.
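KingBowserLP's low-shading-rate border can also be costed out roughly. A sketch of the arithmetic (my own assumed numbers: a 10% border per edge, shaded at quarter rate as variable rate shading would allow):

```python
def bordered_frame_cost(width: int, height: int,
                        border_frac: float, border_shading_rate: float) -> float:
    """Relative shading cost (1.0 = no border) when the overscan border
    around the viewport is shaded at a reduced rate."""
    full = width * height
    over = width * (1 + 2 * border_frac) * height * (1 + 2 * border_frac)
    border_pixels = over - full
    return (full + border_pixels * border_shading_rate) / full

# 10% border per edge at quarter shading rate: ~11% extra cost,
# versus ~44% if the same border were shaded at full rate.
cost = bordered_frame_cost(1920, 1080, 0.10, 0.25)
```

This is the thread's point in numbers: coarse peripheral pixels give the reprojection real data at the edges for a fraction of the cost of a full-rate overscan.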

  • @mrogalski
    @mrogalski a year ago +42

    Just think about that: you can see the difference in a YouTube video! Granted, it's 60FPS, but it's still compressed video streamed from YouTube. I can only imagine how much of the difference you can see live, running it yourself. This makes it even more amazing!

  • @stephenmurray5276
    @stephenmurray5276 a year ago +22

    Y’all have done a stupid good job recently researching and explaining difficult concepts. Between this video and the recent windows sleep/battery video, my (already high) respect for LMG’s tech knowledge has gone through the roof! And y’all didn’t even discover this hack! Thanks for sharing (and explaining)

    • @MrPaxio
      @MrPaxio a year ago

      older videos were more technical, now they suck up to the chump who doesn't know how to navigate a settings menu

  • @hkoizumi3134
    @hkoizumi3134 a year ago +119

    This explains the weirdest thing I've felt in VR. The game itself lagged for some random reason, but my head tracking and the responsiveness of the controls weren't affected. I remember thinking that if the head tracking had lagged along with the other lag, I would have had severe motion sickness.

    • @HappySlappyFace
      @HappySlappyFace a year ago +17

      Yeah it was honestly a very amazing thing when my quest 2 froze, I was like "oh no please no motion sickness" for the first time but it was so normal

    • @rpavlik1
      @rpavlik1 a year ago +2

      Yep, it's pretty standard and required for HMD-based VR. (At least some sort of reprojection or time warp.) There are a lot of different variations.

  • @monkeywithocd
    @monkeywithocd a year ago +398

    This is great, it really explains some odd behavior I've noticed while playing VR games, and using it for flat games sounds like an awesome idea, especially for consoles.

    • @TechnologistAtWork
      @TechnologistAtWork a year ago +16

      Handhelds too. This would make any game on the Switch or Steam Deck run near perfectly without having to tap into too much hardware power.
      Why are we not funding this? GPUs are the size of a gaming console nowadays, but they can't be bothered to solve these issues with much simpler and cheaper solutions?

    • @nktslp3650
      @nktslp3650 a year ago +4

      Yes! Sometimes when the game is stuttery, you can still move freely but you can see the black screen! Such a cool tech. It works really well; input latency is really important.

    • @nebelwaffel8174
      @nebelwaffel8174 a year ago +6

      @@nktslp3650 yeah, when he showed the black borders I had a strong feeling of "i have seen this before", but i couldn't put a finger on it, until he mentioned VR.

  • @dezzydayy4608
    @dezzydayy4608 a year ago +411

    I actually was thinking about writing an injector to apply this to existing games a few years ago, when I saw the effect on the HoloLens. A few limitations though: camera movement with a static scene can look near perfect, however if an animated object moves, depth reprojection cannot fix it properly, and you would need motion vectors to guess where objects will go, but that will cause artifacts near object edges.

    • @ambassador.to.Christ
      @ambassador.to.Christ a year ago +8

      couldn't you just zoom in a little bit so you would never see the artifact from the edges of screen and then use a higher resolution or AI to compensate for the crop?

    • @lawnmower16
      @lawnmower16 a year ago +55

      @@ambassador.to.Christ I think the problem they're talking about is that this works great when objects are holding still, because the algorithm knows where the object should be in the next frame. But if it's a moving object, it has to adjust not only for the altered perspective of the object but also the altered position, and since many moving objects in games are random or player-controlled, there's no way to know for sure where the object will be on the next frame. So the information the player is getting is not necessarily the most up-to-date, accurate information, which could mean the result is actually worse than a low frame rate. Because slow information that's always correct is better than fast information that's sometimes wrong.

    • @yensteel
      @yensteel a year ago +2

      Yeah, the parallax effect is a big issue to address.

    • @blitzkriegiv1169
      @blitzkriegiv1169 a year ago +16

      This technique could be used only for the background or environment, while the additional frames for the subjects of a scene are rendered through the GPU. This way you can get the best of both techniques

    • @xogmaster
      @xogmaster a year ago

      Not if you can somehow use light ray data from sectors in an environment to determine depth (or lack thereof)
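The moving-object limitation discussed in this thread can be shown with one line of extrapolation. A made-up 1-D example (all numbers mine): carrying an object forward along its last known motion vector is exactly right while the motion continues, and simply wrong the moment the object changes course between real frames.

```python
def extrapolate(pos: float, vel: float, dt: float) -> float:
    """Reprojection-style guess: carry the object forward along the
    motion vector captured at the last real frame."""
    return pos + vel * dt

dt = 1 / 30                        # one 30 FPS frame interval
guess = extrapolate(10.0, 5.0, dt)  # warp assumes motion continues

# If the object actually reversed direction right after the last real
# frame, the truth diverges from the guess; the warp cannot know this.
truth = extrapolate(10.0, -5.0, dt)
error = abs(guess - truth)          # 2 * |vel| * dt of misprediction
```

The error is proportional to both the object's speed and the time since the last real frame, which is why low base framerates make these artifacts worse.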

  • @KeyT3ch
    @KeyT3ch a year ago +29

    This technology on handhelds will ABSOLUTELY be a game changer; not only will it "look" better, it will be even more difficult to spot artifacts on a way smaller screen.

    • @aTron0018
      @aTron0018 a year ago

      We need this integrated into Steam Deck OS!

  • @aydmu
    @aydmu a year ago +3265

    3kliks and LTT collabing is what the INTERNET NEEDS!!!!

    • @Qimchiy
      @Qimchiy a year ago +162

      2kliks but either way Yes.

    • @RennyChuggs
      @RennyChuggs a year ago +10

      no it isn't

    • @cora2887
      @cora2887 a year ago +61

      @@Qimchiy it's the same guy

    • @pair_of_fins
      @pair_of_fins a year ago +246

      @@cora2887 Erm, if you paid attention you would know they were brothers 🙄

    • @aydmu
      @aydmu a year ago +114

      @@Qimchiy might as well get kliksphillip in here too ;)

  • @randxalthor
    @randxalthor a year ago +248

    This is probably my favorite type of video from LTT. Highlighting and explaining interesting technology is fascinating.

    • @deolamitico
      @deolamitico a year ago

      oh wait, time for another balls to the wall computer build! only the third this week. /s
      But for real, they've been doing a great job with not doing what I just said

    • @thebaum64
      @thebaum64 a year ago

      it's up there for sure

  • @Bonez32186
    @Bonez32186 a year ago +28

    Realistically, if you render a percentage beyond the FOV you are displaying, you would have enough scene overshoot for it to not really be a problem unless you have extremely low frame rates and incredibly fast movements.

  • @sound123mine
    @sound123mine a year ago +25

    An LTT video at 60fps?! My god, the little animations they put in, like the outro card, look so good 👍

    • @MrPaxio
      @MrPaxio a year ago +3

      yeah, doesn't the dummy know 60fps is better than 8K uploads

  • @alexmathewelt7923
    @alexmathewelt7923 a year ago +376

    As a hobby game engine/GFX developer, I developed this technique with some tweaks: static geometry is only re-rendered every few frames, but characters, grass, and particles are rendered every frame. With the depth sort and extended viewport, it feels like native rendering, and one can really aim precisely at a target, since that is always up to date. As mentioned in the video, DLSS uses motion vectors, but it has to guess the motion and static geometry. With a proper implementation, this guess is not required but can be calculated by the same hardware as the AI.

    • @j1000a
      @j1000a a year ago +9

      Does this end up looking like motion blur?

    • @Rhedox1
      @Rhedox1 a year ago +2

      What happens if rendering your static geometry takes 20ms on the GPU? How do you schedule the reprojection to ensure it's executed in time?
      Also, which graphics API did you use to implement this?

    • @Winston-1984
      @Winston-1984 a year ago +1

      What I'm wondering is, does the GPU in any way know what it wouldn't need to render, i.e. sections of the screen that can persist using this tech, only re-rendering additional frames for the sections that require more updates?
      Does this make sense? It's hard to put down.

    • @CathelijneLadyRider
      @CathelijneLadyRider a year ago +3

      I want it, and i want it now

    • @alexmathewelt7923
      @alexmathewelt7923 a year ago +8

      @@Winston-1984 that's what I'm currently working on, since this is now a common technique for ray tracers. Currently I'm trying to create the formulas I need and prove them for small movements. But with this fixed splitting it works for first-person shooters or something like that with lots of static geometry. Static geometry is really fast to render nowadays.

  • @cometor1
    @cometor1 a year ago +279

    Yay, 2kliksphilip and his brother 3kliksphilip finally get some well deserved attention!

    • @kazioo2
      @kazioo2 a year ago +4

      The inventor already suggested using it for normal games in 2012. Then many people made experiments and demos over the last decade. This one finally got some traction, so kudos for that, but it's nothing new.

    • @cometor1
      @cometor1 a year ago +24

      @@kazioo2 truth be told, I'm a long-time klik empire supporter, and I'm always happy if anything good happens to him, like getting mentioned by another creator I like.
      The technology is interesting, and it needs traction to take off, but I actually care more about Philip than the tech.

    • @semick4729
      @semick4729 a year ago +3

      Brother?

    • @MrALjo0oker
      @MrALjo0oker a year ago +12

      @@semick4729 yeah he has two brothers kliksphilip and 3kliksphilip

    • @saladgreens912
      @saladgreens912 a year ago +7

      @@MrALjo0oker Not sure if that is necessarily true, someone should get to the bottom of that. Valve, please fix.

  • @lucasthompson6405
    @lucasthompson6405 a year ago +51

    This feels like a slightly hacky optimisation you would see in older games, and I personally find that really cool. I always admired hearing about the clever ways game devs overcame the limitations of hardware.
    Whereas these days it feels like we rely on an abundance of processing power. That abundance is generally a good thing, but it feels like these sorts of optimisations are becoming a lost art.

  • @SaucedTech
    @SaucedTech a year ago +52

    I love how much labs has instantly matured this channel. I have watched LTT for a long time, but recently it's really boosted its level.

    • @TikoyTV
      @TikoyTV a year ago

      But… they are not even done?!

    • @HonoredMule
      @HonoredMule a year ago

      It's been far from instantaneous, but we are starting to see the returns and it is definitely nice.

  • @malgaras6204
    @malgaras6204 a year ago +411

    Seems like interesting tech. Two immediate thoughts:
    1. What about moving objects? Seems like the illusion falls apart there as this only really simulates fake frames of camera tilt, not any changes to things already in your FOV.
    2. What if you just slightly over-rendered the FOV? Then you actually have some buffer when tilting the camera where you have an actual rendered image to display before you need to start stretching things at the edges of the screen. Now obviously since you're rendering more geometry, you are going to take a further FPS hit, but is there a point where the tradeoff is a net gain?

    • @user9267
      @user9267 a year ago +91

      2. You can do that I think.
      1. They will move at the actual framerate. All asynchronous reprojection does is make the game feel more responsive.

    • @Sunisway
      @Sunisway a year ago +19

      I mean, VR uses it and it runs pretty well

    • @Pro720HyperMaster720
      @Pro720HyperMaster720 a year ago +22

      In the 2kliksphilip video he mentioned how interesting it would be if, instead of stretching the borders or showing the void left by not-yet-rendered areas, we rendered a bit beyond the display area (quasi like overscan) but at low resolution, to impact performance as little as possible; since our peripheral vision is not great, we'd barely notice that a small area at the corners is momentarily lower resolution when moving fast.
      So yes, we would have plenty of ways to improve the illusion. For example, you could boost the on-display area with DLSS or FSR, and maybe even the extra area (though not always a good idea: depending on your main resolution, having the extra area be 480p upscaled from 240p with DLSS is not the same as having it be 240p upscaled from 120p, probably a bad idea at least for the latter). Or, if the resolution of the extra area is not suitable for DLSS or FSR, you could use those for the on-display area but apply only the frame generation of DLSS 3.0 (and the future FSR 3.0) to the extra area, filling the gaps with mostly fake frames deduced by anticipating what your movement would lead to showing.

    • @chanceslaughter3237
      @chanceslaughter3237 a year ago +8

      moving objects still have poor framerates; that's how it is in VR as well. Your hands feel much more jittery than the rest of the game when your fps drops... in my experience anyway.

    • @reptarien
      @reptarien a year ago +19

      1. Yes, moving objects are still noticeably 30fps, but coming from someone who has spent time with VR reprojection in a game like Skyrim, with lots of moving actors, you don't notice that nearly as much when your actions are still so instant, as shown in the demo. It's crazy how much you can find yourself forgiving if your head and hand movement is still smooth as butter.
      2. That is another technique that VR absolutely uses, and it works very well to solve that issue. Easily implementable and workable.

  • @mirage809
    @mirage809 a year ago +131

    As per usual: John Carmack is the king of optimizing rendering in games. He first implemented this tech for the Oculus Rift and has a long history of coming up with awesome solutions for problems like this. This is the man who made Doom; he knows his stuff.
    He's probably laughing right now and having a big "I told you so" moment.

    • @Felipemelazzi
      @Felipemelazzi a year ago +9

      John Carmack is responsible for asynchronous reprojection!?!?!?!?!?!
      This living god never ceases to amaze the world of technology!

    • @imdurc
      @imdurc a year ago +5

      @@Felipemelazzi I thought JC was the one who had seen it somewhere and wanted to bring it to Oculus, but, I don't think he was responsible for its actual creation. Anyone know?

    • @HamguyBacon
      @HamguyBacon a year ago

      he basically mimicked how your eye works in real life. I thought of this too, but I thought it was already implemented.

    • @CJMAXiK
      @CJMAXiK Před rokem +1

      Meta improved on this tech, now it is called Asynchronous Spacewarp and bundled with Oculus Quest 2. And let me tell you, it is really cool.

  • @kingsizemedal
    @kingsizemedal Před rokem +43

    2KP is such an amazing channel, he always has very interesting, out of the box ideas, and I love to see more of his wacky stuff being picked up!

  • @zach99999
    @zach99999 Před rokem +16

    This is really cool! I play shooter games a lot and the most annoying thing about low fps in games is the input lag. Slow visual information is more of an annoyance as long as it's above 30, but the slow input response times at anything below 60 fps drives me insane.

  • @BlameDavid
    @BlameDavid Před rokem +265

    I'm so happy to see phillip reach this far out of the csgo bubble with this

  • @Neoxon619
    @Neoxon619 Před rokem +466

    This actually reminds me of the input delay reduction setting that Capcom added for Street Fighter 6. The game itself still runs at 60fps, but the refresh rate is 120Hz for the sake of decreasing input latency.

    • @sm7085
      @sm7085 Před rokem +20

      Good point. That's one of the added benefits of a high refresh rate monitor. Even though you might not reach a high fps, a high refresh rate monitor can still give you reduced input latency.

    • @MudHut67
      @MudHut67 Před rokem +19

      that's not how any of this works...

    • @Kaisogen
      @Kaisogen Před rokem +2

      this tech won't really be useful for fighting games specifically, and I think it would be more counterproductive tbh.

    • @MrNicePotato
      @MrNicePotato Před rokem +2

      What does that even mean... The async shown, as I understand it, essentially shifts your point of view before the GPU produces a new frame. But for a fighting game, it would have to make the new frame no matter what, to show your input turning into a move.

    • @j1000a
      @j1000a Před rokem

      The *effect* (not reality, which is a bit different) also reminds me a little bit of QuakeWorld (and to a lesser extent, Quake and Doom). Even when the framerate is high the models use low-FPS animations, and with QuakeWorld I seem to recall objects in motion skipping frames based on your network settings. Meanwhile the movement was still buttery.

  • @ToasterTom
    @ToasterTom Před rokem +9

    I stumbled upon 2kliksphilip’s channels when I was researching how to make maps in Hammer. So glad you guys have mentioned him in multiple videos now!

  • @jamiekerr5514
    @jamiekerr5514 Před rokem +4

    Wow, not sure who wrote this one but such a good explanation. So clear and well presented, good job!

  • @twinklesprinkle1318
    @twinklesprinkle1318 Před rokem +165

    I was amazed watching Philip's video when it came out. I'm happy that it has reached you now! Hopefully the game developers will get the message. I'd be really happy to see this implemented in actual games, because at the moment, unless you have the most recent hardware, you have to choose between high resolution and very high framerate...

    • @SendFoodz
      @SendFoodz Před rokem +2

      I wanna see the video in question. 2kliksphilip is the channel, right? What's the video? I'm guessing around to find this guy/video; what's the title so I can show him some love?

    • @Wanklacus
      @Wanklacus Před rokem +1

      @@SendFoodz it's linked in this video's description my man (if you haven't found it yet)

  • @odytrice
    @odytrice Před rokem +4

    I literally did a spit take at 6:01 Now I have coffee all over my keyboard 😂😂

  • @sirbughunter
    @sirbughunter Před rokem

    This is so so incredible! I hope this will be the next-gen image helper in all upcoming and older games!

  • @Czllvm
    @Czllvm Před rokem +9

    THIS IS INSANE. I already use this on Assetto Corsa in VR, so I play at 120Hz but it renders 60fps. Such a light-bulb moment at the start. I really wish this could catch on, because I've already seen first-hand how great it is.

  • @afroninjaen
    @afroninjaen Před rokem +818

    I always had a feeling that tech like this is actually the real future of gaming / VR performance. And not just raw rtx 4090 performance.

    • @Thezuule1
      @Thezuule1 Před rokem +48

      This tech has been a part of VR for years and it's awful. They need to take a new approach and have developers actually implement it at the game level rather than it being an after effect because as it stands now it doesn't work worth a shit. Awful.

    • @-NoodleBoy
      @-NoodleBoy Před rokem +11

      @@Thezuule1 I use it in RE8 VR so I can run RTX while in VR. It doesn't feel great, but it feels better than native.

    • @possamei
      @possamei Před rokem +20

      @@Thezuule1 On quest, they've built support for it in-engine, it's called SSW. It's actually better than ASW on PC because it has motion data for the image, so the interpolation is quite good. Sure, real frames are still better, but the tech is getting better

    • @Thezuule1
      @Thezuule1 Před rokem +2

      @@possamei you've got that a little twisted up but yeah. SSW is the Virtual Desktop version, AppSW is the native Quest version. It works better but still not well enough to have picked up support from any real number of devs. Step in the right direction though.

    • @DJayFreeDoo
      @DJayFreeDoo Před rokem +2

      @@Thezuule1 But what if DLSS and FSR only had to correct the flaws of this instead of making whole frames. DLSS and FSR might get you even more performance.

  • @methejuggler
    @methejuggler Před rokem +91

    I'd imagine that a lot of the edge stretching could be mitigated by rendering slightly more than is displayed on the screen, so there's a bit extra to use when turning before having to guess

    • @Blancdaddy
      @Blancdaddy Před rokem +12

      that's an interesting thought. this would bring us back to the age of overscan i feel lol

    • @RyoLeo
      @RyoLeo Před rokem +24

      @@Blancdaddy reject dlss, return to overscan

    • @LasticDJ
      @LasticDJ Před rokem +20

      I think I remember 2kliksphilip talking about / showing this in his video: just have the part just outside of your FOV rendered at a lower resolution and use that instead of most of the stretching, because you can't see the detail anyway.

    • @odinsplaygrounds
      @odinsplaygrounds Před rokem +6

      I just commented something similar. Just have it render out 10% extra which is cropped off by your display anyway, so whatever "stitching" it's doing is outside your view. Would love to see this. Combine that with foveated rendering, so the additional areas rendered outside view is lower resolution.

    • @Blancdaddy
      @Blancdaddy Před rokem +3

      imagine the longevity of GPUs if this technology ever becomes standardized. man.
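
The overscan idea in this thread can be put in rough numbers. A minimal sketch (all figures hypothetical, not from the video): given the display FOV, the fastest camera turn you expect, and the real render rate, the margin the renderer needs is the worst-case rotation between two real frames, on each side:

```python
def required_render_fov(display_fov_deg, max_turn_deg_per_s, render_fps):
    # Worst-case camera rotation between two fully rendered frames
    max_turn_between_frames = max_turn_deg_per_s / render_fps
    # Add that margin on both sides so the warp never runs out of pixels
    return display_fov_deg + 2 * max_turn_between_frames

required_render_fov(90, 180, 30)   # -> 102.0 (12 degrees of overscan)
required_render_fov(90, 180, 120)  # -> 93.0 (faster real frames need less margin)
```

Since the extra pixels sit at the screen edge where detail matters least, rendering the margin at reduced resolution, as suggested in this thread, keeps the added cost small.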

  • @SirDragonClaw
    @SirDragonClaw Před rokem +3

    I tried to build something like that demo a few years ago, but I was trying to use motion vectors + depth to reproject my rendered frame, which I never got to work correctly. In my engine I rendered a viewport larger than the screen to handle the issue with the blackness on the edges, and then I was going to use tier 2 variable rate shading to lower the render cost of the parts beyond the screen bounds. But VRS was not supported in any way on my build of MonoGame, which is what my engine was built upon, so that was another killer for the project.
    I am so glad that Phil popularised the idea, and it's awesome that someone else managed to get something like this working. How he did it in one day I will never know; I spent like 3 weeks on it and still failed to get it working correctly.

  • @saquial
    @saquial Před 9 měsíci

    That really explains some quirks of the Quest and streaming, especially how, when something's loading, you can move your head freely and the VR picture stays still in space as a single frame, just like you showed here.

  • @txsurvivalandcreations
    @txsurvivalandcreations Před rokem +75

    Really cool stuff. When I’m in VR and the frames drop during loading or something, it does exactly what you showed in the 10fps demo. You can see the abyss behind the projected image on the edges, with the location of the image updating to return right in front of you with each new frame. I had no idea that that’s what it was for.

    • @DiamondDepthYT
      @DiamondDepthYT Před rokem +6

      I've been using VR for 3 years now, and I had no idea what it was until today either! Super cool to learn more about that stuff.

    • @MrScorpianwarrior
      @MrScorpianwarrior Před rokem +2

      Oooohhh. You're absolutely right and not once did that occur to me! Imagine if that didn't happen and everywhere you looked was the same loading screen...

    • @merryjerry69
      @merryjerry69 Před rokem

      @@MrScorpianwarrior how to get motion sickness lol.

  • @ruix
    @ruix Před rokem +60

    Glad seeing Philip getting bigger every day. He's amazing

  • @chrisc1140
    @chrisc1140 Před rokem +6

    One thing I wondered about when I first saw that video is whether the PERCEIVED improvement is good enough that you could lose a couple more frames in exchange for rendering a bit further outside the actual FOV, but at a really low resolution. Basically like a really wide foveated rendering. It would mean the warp would have a little more wiggle room before things started having to stretch.

  • @JakalairVG
    @JakalairVG Před rokem

    This was just fascinating to see. I love it when a tech solution is used in a new way.

  • @Lunch-b0x
    @Lunch-b0x Před rokem +707

    been watching 3kliks for years. I'm glad he's getting some recognition.

    • @TehF0cus
      @TehF0cus Před rokem +39

      2kliks

    • @GreenzQe
      @GreenzQe Před rokem +19

      @@TehF0cus kliks

    • @veganssuck2155
      @veganssuck2155 Před rokem +10

      @@TehF0cus same person

    • @nbt1254
      @nbt1254 Před rokem +60

      @@TehF0cus but don't confuse him with his evil brother kliksphilip

    • @2opmataron991
      @2opmataron991 Před rokem

      @@veganssuck2155 Its a joke....

  • @RocketSlug
    @RocketSlug Před rokem +39

    Having just started getting into VR, I only recently learned what asynchronous reprojection is. Really cool to see it getting mentioned, because when I heard about it, it seemed like what DLSS 3 wanted to do, only it's been here already for quite some time.
    Your description of how it decouples player input from the rendering makes me think of rollback netcode for fighting games and how that also decouples player input and game logic, and I'm really excited for what that means for the player experience

  • @ScorgeRudess
    @ScorgeRudess Před rokem +1

    7:06 I love how editors put: "he owns a display"

  • @davidbakersound
    @davidbakersound Před rokem +10

    I’ve never understood why this hasn’t been done before. I’ve thought it should be done since 2016 when I got my VR headset. Like you said, extremely obvious!

  • @QuaziInc
    @QuaziInc Před rokem +14

    I feel like I just had my mind blown wide open at the possibilities. This is one of my favorite ltt videos. Its hard for me to find such a technical concept so well explained. well done.

  • @waybove
    @waybove Před rokem +22

    I think we just witnessed one of those rare moments when an elegant solution clicks and starts a revolution

    • @kazioo2
      @kazioo2 Před rokem +6

      This video is poorly researched. Timewarp was invented by John Carmack and described in his post "Latency Mitigation Strategies" in early 2012, more than 10 years ago. In his original article, games other than VR were already mentioned. I remember seeing normal desktop demos many years ago, but it never gained traction despite that.

  • @SHINYREDBULLETS
    @SHINYREDBULLETS Před rokem +2

    I'd be very interested in a followup on this in the form of interviews/queries with the likes of nvidia/amd/nintendo/steam as to whether this is something they're aware of/considering/etc! With ridiculous power-draw for graphics cards being accepted as necessary, this seems like a gigantic sidestep with machine learning assistance to amazing benefits for end-users!

  • @blackstar-genX
    @blackstar-genX Před rokem

    Whoever's PC y'all used to show the games list at the start: nice to see you're a man of culture. 0:55

  • @Yalden_
    @Yalden_ Před rokem +17

    Holy shit, Philip MADE IT

  • @joshuacook2
    @joshuacook2 Před rokem +25

    Your demo should really have included any animated object. That would have shown some serious limitations and also would be present in nearly all games.

    • @sandmaster4444
      @sandmaster4444 Před rokem +1

      Seriously though.

    • @MrPhillian
      @MrPhillian Před rokem +5

      I keep seeing this concern, and while it might be true at low FPS, I don’t think that’s really where it would be aimed. I’d imagine most would still aim for 60+ rendering.
      Keep in mind that VR headsets are already using this method and are they experiencing these problems? I legitimately don’t know.

    • @matsv201
      @matsv201 Před rokem +1

      It probably wouldn't have been as obvious as you might think, at least not at 30fps. The reason why 60 or even 120fps is so obviously faster has to do with light retention in the eye when you move the mouse. But you can't see that with animation. Have you ever been in the cinema? 24fps... yes, 24... do you think cinema is choppy? IMAX has 48 fps.

    • @amysteriousviewer3772
      @amysteriousviewer3772 Před rokem +2

      @@matsv201 Movies don’t look or feel choppy because there is natural motion blur to everything and you also don’t control anything in them. It’s an apples to oranges comparison. Also IMAX doesn’t “have” 48 fps, IMAX is simply a format. A movie has to be shot at that framerate to display in that framerate. If it’s shot at 24 then it will be 24 in IMAX.

    • @TOGSolid
      @TOGSolid Před rokem

      @@MrPhillian Depends on the game and what sort of post processing is going on, from my experience. It can work well, but if there's a bunch of fancy effects going on, it can be very noticeable that the smoothness is being faked.

  • @wiggenvan
    @wiggenvan Před rokem +1

    I would be very curious if a hybrid solution would be possible, such as in a fps game, synchronously drawing the environment, but asynchronously drawing the players? I’m sure there’s some limitations involved with that, but it does sound intriguing

  • @The1Radakill
    @The1Radakill Před rokem +3

    Would love to see some analytics on image scaling, i.e. software vs hardware (NIS vs DLSS), various resolutions, % variance between GPU-generated resolutions vs monitor native, how effective they are, etc. Comparisons between Nvidia, AMD and Intel tech with all of the above.

  • @matiitam111
    @matiitam111 Před rokem +16

    On the other side:
    this is a static scene: no animated textures, no characters moving around, no post-process effects, no particles etc. Porting this to a modern game would be similar to what Assassin's Creed Syndicate (or a later one, I don't remember) did with clothing physics: capped it at 30fps while the game ran at 60. The effect would look similar to what modern games do to animations when characters are too far away for the game engine to update them as frequently as the game's current fps. So I'm skeptical
    Also, nice GPU you got there, can’t wait for the review ;)

    • @jacobc5747
      @jacobc5747 Před rokem +2

      it is effective in VR games designed with it, so it will probably be effective on normal PC games if it's kept in mind.

    • @kazahesto
      @kazahesto Před rokem

      Yeah, this isn't new tech. Pretty much every web browser does something similar when scrolling or zooming, where most content is static, and it looks terrible when a heavy webpage tries to do parallax scrolling on an underpowered system. The whole "nobody thought about it" angle in this video is strange and patronising.

  • @gamebuster800
    @gamebuster800 Před rokem +130

    Cloud gaming would be great with this!
    You handle the reprojection locally and use the delayed frames as a source.
    It will basically eliminate the input lag.

    • @thewhywhywhy4302
      @thewhywhywhy4302 Před rokem +20

      This is a sick idea

    • @MrMoon-hy6pn
      @MrMoon-hy6pn Před rokem +3

      You would still need to send a depth buffer and probably other information to the computer playing the game. So that means more load on the internet connection, but it still sounds interesting.

    • @khhnator
      @khhnator Před rokem +2

      Not that it will do nothing, but it will do less than you think.
      Even if it works, and I'm not sure it does, at least not in this form: the time until you receive a frame that fills the stretched gaps you just created by moving the camera around is so much higher, compared to a local computer, which will just fill that gap in the next 34ms if you are running at 30fps.
      Then you might have super smooth camera turning, but the time to shoot, jump, etc. will still be the same.
      Heck, because of the higher disparity between the camera and everything else, it might even worsen the experience instead of making it better.

    • @gamebuster800
      @gamebuster800 Před rokem +2

      @@khhnator You're right, but the latency for cloud gaming is already not that high. The most noticeable effect at latency is moving the mouse to look around.

    • @fayenotfaye
      @fayenotfaye Před rokem +2

      This is already done with VR cloud gaming services, when you use oculus air link, you’re basically doing the same thing but over LAN. If you drop a frame, you can still move your head around and it’s perfectly playable all the way down to 30 fps for most games.
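
A back-of-the-envelope latency model for the cloud-gaming idea in this thread (all numbers illustrative): with local reprojection, look latency collapses to roughly one local refresh interval, while actions that need a new server frame (shooting, jumping) still pay the full round trip, as the reply above notes:

```python
def look_latency_ms(network_rtt_ms, server_frame_ms, display_hz, reprojected):
    # Rough time from a mouse move to the camera visibly turning on screen
    refresh = 1000 / display_hz
    if reprojected:
        # The warp runs on the client, so only the local refresh matters
        return refresh
    # Otherwise the input round-trips to the server and waits for a frame
    return network_rtt_ms + server_frame_ms + refresh

look_latency_ms(40, 33.3, 120, reprojected=True)   # about 8.3 ms
look_latency_ms(40, 33.3, 120, reprojected=False)  # about 81.6 ms
```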

  • @saart2212
    @saart2212 Před rokem +8

    Now that's a public interest video! Raising awareness of this technique will certainly go a long way, especially in open source. I hope the manufacturers don't shy away from it out of fear that it would diminish interest in their high-end GPUs.

    • @dnitz9608
      @dnitz9608 Před rokem

      High-end?
      What r u talking abt, they can make AAA game 8k 240fps without 6slot GPU

  • @EritoKaio
    @EritoKaio Před rokem

    This is pretty awesome. I can totally imagine this working together with something like DLSS in the future. Exciting.

  • @elin4364
    @elin4364 Před rokem +95

    Something worth noting is that Comrade Stinger's demo does not really do what they say it does (mostly due to how Unity is made being pretty incompatible with this sort of demo). The GPU draws the entire frame during the last frame, so work is NOT split up over several frames. Doing that would be a pretty complicated task in existing game engines like Unity.

    • @comradestinger
      @comradestinger Před rokem +60

      This!
      The demo only distorts the *simulated* bad framerate from the slider. If you ran the demo with an actual bad framerate, it would just lag like normal.
      To actually implement it properly is much harder than what I did; in Unity's case it might require some severe shenanigans, or straight-up engine modification.

    • @TrackmaniaKaiser
      @TrackmaniaKaiser Před rokem

      @@comradestinger Do you think that something like that could be a driver feature, like the DLSS 2 stuff? Where the GPU gets some motion vectors and shifts existing objects around, more or less like sprites, until a new real frame gets created?

    • @comradestinger
      @comradestinger Před rokem +11

      @@TrackmaniaKaiser I think both could work, though I lean towards it being done by the devs themselves rather than by the driver. Since games vary so much, different scenes and camera modes would benefit/suffer from the effect in different ways.
      To be honest, it's all very complicated.

    • @devzozo
      @devzozo Před rokem

      Wonder if using DOTS and scriptable render pipeline would allow for it, can't imagine figuring all that out in an evening though. I wouldn't trust a solution that leverages Unity's undersupported APIs to be that stable though...

    • @leongao5120
      @leongao5120 Před rokem

      @@comradestinger Good work man
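
What "implementing it properly" would mean can be sketched without touching any engine: two decoupled loops, a slow one finishing full renders and a fast one that, every refresh, warps the newest finished render to the freshest camera pose. A toy single-threaded simulation of that scheduling (all rates illustrative):

```python
def simulate(render_fps=30, display_hz=240, seconds=1.0):
    # Each presented refresh pairs the freshest camera pose with the pose
    # of the last *finished* full render; the warp bridges the difference.
    presented = []
    last_render_t = None
    frame_pose = None
    dt = 1.0 / display_hz
    t = 0.0
    while t < seconds:
        pose = t  # stand-in for "camera pose sampled right now"
        if last_render_t is None or t - last_render_t >= 1.0 / render_fps:
            frame_pose = pose  # a new full render finishes
            last_render_t = t
        presented.append((pose, frame_pose))
        t += dt
    return presented

frames = simulate()  # ~240 refreshes, each warped to an up-to-date pose,
                     # even though only ~30 full renders happened
```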

  • @Adam-em1mf
    @Adam-em1mf Před rokem +48

    I knew about this because of my Oculus Rift, and as you mentioned, in racing games asynchronous spacewarp (as Oculus calls it) is quite noticeable: moving your head around while driving at 100 mph can be quite jarring. But Oculus updated the feature, and the visual bugs aren't as noticeable anymore. It's quite interesting to see how this works. Excellent video, guys.

    • @grumbel45
      @grumbel45 Před rokem +12

      With the latest "Application Spacewarp" on Quest 2, games can now send motion vectors, so the extrapolated frames no longer have to rely on so much guesswork.

    • @Dimondminer11
      @Dimondminer11 Před rokem

      Yeah, the visual artifacts can actually cause MORE issues in VR than not, at least in some very specific games. It's not super noticeable in VRChat, but in the Vivecraft mod for Minecraft the screen turns into a wavy, smeary mess. I actually hated that WORSE than running at 40fps natively, which is what my system could do with my settings at the time.

  • @edenem
    @edenem Před rokem +4

    I'm very interested in using a form of overscan where you're viewing a cropped in frame of the whole rendered image, so when you're panning your screen around it doesn't have the issue of stretching, unless you pan outside of the rendered frame.

    • @DaVince21
      @DaVince21 Před rokem

      Well, that's as simple as rendering a little more outside of the screen area.

  • @happysmash27
    @happysmash27 Před rokem +3

    Asynchronous projection is great in VRChat (in PCVR on my relatively old PC) where I usually get 15 fps and often even far less than that! I don't really mind the black borders that much in that case especially since the view usually tends to go a bit farther than my FOV making them usually only appear if things are going _extremely_ slow, like, a stutter or any other time I'm getting over 0.25 seconds per frame. So, perhaps another way to make the black bars less obvious, would be to simply increase the FOV of the rendered frames a little bit so that there is more margin. Would make lower frame rates, but it might be worth it in any case where the frame rates would be terrible anyways.

  • @50REN
    @50REN Před rokem +18

    I have said for a very long time that when it comes to refresh rate, I don't mind lower frame rates from a visual standpoint, but the input delay is more what I love about high refresh rate gaming. I'm excited to see where this technology goes.

  • @ValenteXD
    @ValenteXD Před rokem +35

    I remember watching 2klik's video last month and saying wow this is amazing and mind blowing, but thought I was just excited for it because I'm a programmer, guess not

    • @harrasika
      @harrasika Před rokem

      I was also excited for it but thought nothing would come of it since I've only ever seen him talk about it. Now perhaps there's a chance of this actually becoming popular and coming into games.

  • @ShadowNexis
    @ShadowNexis Před rokem +2

    It would be interesting to explore the increased rendering cost of running a higher internal resolution with increased fov, then hiding the edges of the rendered resolution to eliminate the screen edge artifacting.

  • @jabadahut50
    @jabadahut50 Před rokem

    It really is amazing how fast game technology is getting better now that we've hit diminishing returns on fidelity, and we're starting to see a wider gamut of techniques being reached for to make a certain experience better, reusing techniques that others have used for a while in different ways.

  • @user-hk3ej4hk7m
    @user-hk3ej4hk7m Před rokem +44

    The main issue with these workarounds is that they depend on the Z buffer, they break down pretty quickly whenever you have objects superimposed like something behind glass, volumetric effects or screen space effects

    • @lucky-segfault4219
      @lucky-segfault4219 Před rokem

      Ya, that sounds like it could be a big issue...

    • @DavidGoodman
      @DavidGoodman Před rokem +1

      You technically only need the depth buffer for positional reprojection (eg. stepping side-to-side). Rotational reprojection (eg. turning your head while standing still) can be done just fine without depth, and this is how most VR reprojection works already, as well as electronic image stabilization features in phone cameras (they reproject the image to render it from a more steady perspective).
      It might sound like a major compromise but try doing both motions, and you'll notice that your perspective changes a lot more from the rotational movement than the positional one, which is why rotational reprojection is much more important (although having both is ideal).
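
Why rotation needs no depth can be shown directly: for a pinhole camera, every pixel maps to a viewing angle, and a pure rotation just shifts that angle, regardless of how far away the geometry is. A one-axis sketch in plain Python (assumed square-pixel pinhole model; numbers illustrative):

```python
import math

def reproject_yaw(px, width, fov_deg, yaw_deg):
    """Where a pixel column of the old frame lands after the camera
    yaws by yaw_deg, using a pinhole model (no depth buffer needed)."""
    # Focal length in pixels from the horizontal FOV
    f = (width / 2) / math.tan(math.radians(fov_deg) / 2)
    # Pixel -> viewing angle relative to the old optical axis
    angle = math.atan((px - width / 2) / f)
    # Same ray expressed in the rotated camera, converted back to pixels
    return width / 2 + f * math.tan(angle - math.radians(yaw_deg))

reproject_yaw(960, 1920, 90, 0.0)  # center stays put: 960.0
reproject_yaw(960, 1920, 90, 2.0)  # turning right shifts content left
```

Positional reprojection, by contrast, needs per-pixel depth, because how far a pixel shifts under a sideways step depends on how far away its surface is.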

  • @splatlingsquid4595
    @splatlingsquid4595 Před rokem +41

    I absolutely love the recognition that kliksphilip and his brothers have been getting. It really is an amazing idea and would make everything so much better!

  • @ScorgeRudess
    @ScorgeRudess Před rokem +1

    I have been in VR for almost 3 years, and as soon as Meta implemented ASW and ATW, I wanted this for PC games... I have been, and still am, waiting for this, for years.

  • @chriszuko
    @chriszuko Před rokem

    Things like particle FX and transparency are where the issues really show up, not just at the edge. A spinning rocket, for instance, may warp in the inner sections where the fins were in the previous frame, especially if you pause the game, since the predicted velocity remains the same but the object has stopped moving. They've solved this for Oculus spacewarp by calculating velocities per object in order to have better prediction; however, that requires the game engine to support this on a per-object basis for all materials and edge cases. We wish we could have used this on Mothergunship: Forge, but had to disable it due to the visible warping. It has the potential to be an absolute savior though, since trying to hit 72 or 90 fps on basically a phone attached to your face is a huge performance challenge.

  • @JoshsBookishVoyage
    @JoshsBookishVoyage Před rokem +9

    This is interesting. I'd love to see this feature compared to actual higher fps to see whether the perceived gain renders an actual competitive advantage.

  • @c6m
    @c6m Před rokem +9

    Whoa 2kliksphilip getting a shoutout on LTT before any of his brothers, imagine.

  • @semmu93
    @semmu93 Před 2 měsíci

    It would have been really nice to also do a very basic shooting accuracy test in each case with each candidate, but I know that the demo app itself unfortunately didn't contain such a feature. It would have been REALLY interesting to see actual, hard numbers.

  • @leonardobosnar7194
    @leonardobosnar7194 Před rokem

    "In the early days of VR" - damn, that one hits hard to hear
    5:51

  • @MichaelBlock
    @MichaelBlock Před rokem +4

    This actually seems interesting to try out, some games I play, even on a 2060 may struggle to have even just a stable fps, so perhaps this kind of thing will help for those kind of situations

  • @tomatus270389
    @tomatus270389 Před rokem +3

    Since you explored this topic, you should also do a video about foveated rendering.

  • @justinkruger
    @justinkruger Před rokem +4

    If Async Timewarp becomes popular on desktop, I wonder if it makes sense to spend the extra time rendering the z-frames behind some of the foreground layers, or behind special models that are, say, 'tagged' or earmarked to have render-behind turned on.
    So for example you might want to render behind a fast-moving player model or a narrow light pole, but you wouldn't want to render behind a wall. Or maybe you only need to render behind a small pixel distance. I'm not sure how easy that is to add to game engines?

  • @lightphobe
    @lightphobe Před rokem +96

    I'm curious if you'd still get the same results if you add moving objects into the scene. Since the objects update their position at the true frame rate, I bet they would look super choppy.

    • @NichtDu
      @NichtDu Před rokem +7

      Yeah, you're right.
      If you've played the Teardown lidar mod, this is kinda the same (at least the mod has the same downsides as this).
      Honestly though, it can't get any choppier, because the framerate stays the same.
      The examples were with really low framerates, but if you had your normal 120 fps and a 360 Hz monitor, this technology makes a big difference.

    • @R3BootYourMind
      @R3BootYourMind Před rokem +15

      Some games already separate physics fps from rendering fps.

    • @DMitsukirules
      @DMitsukirules Před rokem +16

      The thing is, you always want consistent input no matter what. The alternative in your scenario is the objects are still choppy, but your mouse movement is also sluggish

    • @timmbruce99
      @timmbruce99 Před rokem +11

      I'm actually surprised LTT got a demo version without the moving objects (or at least didn't show them). 'Cause yes, animation still looks choppy according to the fps cap, but the perception of moving the camera is smoother than butter 0-0

    • @comradestinger
      @comradestinger Před rokem +4

      The latest version of the demo has moving objects, so you can see for yourself. (they look laggy, moving at the true framerate, as expected) x)

  • @GlorifiedGremlin
    @GlorifiedGremlin Před rokem +4

    This could have insane potential for handhelds. The Steam Deck can usually pull at least 30 fps in new triple-A titles, which seems to be perfect for making the picture much smoother.

  • @digitized_fyre
    @digitized_fyre Před rokem

    The coffee guy, the tech news guy, hi, he owns a display, Mark. Such incredible descriptions of the people and their roles

  • @StormyDoesVR
    @StormyDoesVR Před rokem

    As a huge VR fanatic, seeing the tech that makes standalone VR possible put to use on a flat screen game is amazing!

  • @jonasmostert3294
    @jonasmostert3294 Před rokem +6

    I think one caveat here, which has not been mentioned, is that dynamic objects in the focus/center of the screen will also only be updated by whatever frame rate your GPU allows. I wonder how to handle those scenarios. Still a very worthwhile improvement for a lot of games, for sure!

    • @99domini99
      @99domini99 Před rokem +3

      From my experience with VR, although looking around is perfectly smooth, animated characters on the screen update slower. But that really isn't much of a problem. There is no input lag from your HMD; you can look around perfectly fine, and the 45fps the headset produces when reprojecting is still smooth enough to track targets.
      The only real caveat is the input lag from your controllers. Moving your hands will feel less responsive when reprojecting than when running at native framerate.
      I wonder how this will carry over to desktop reprojection.

  • @klayplayz
    @klayplayz Před rokem +124

    Linus: It's for free
    Also linus: Makes no difference

    • @SelecaoOfMidas
      @SelecaoOfMidas Před rokem

      Your point is?

    • @shib5267
      @shib5267 Před rokem +1

      @@SelecaoOfMidas Makes no difference

    • @OrRaino
      @OrRaino Před rokem +3

      It can make games smoother for low-end PC gamers. The PC market will discourage it, because then high-end GPUs would become irrelevant; only medium-tier GPUs would be enough if the tech were further developed for PC gaming.

  • @maelonais
    @maelonais Před rokem +1

    man, this is a really interesting and intriguing technology! hope it's gonna be noticed by GPU manufacturers and game developers and integrated into regular games too!

  • @MadmanLink
    @MadmanLink 1 year ago

    You guys were so young and fresh faced back in the early VR days!

  • @sparkfullgaming
    @sparkfullgaming 1 year ago +3

    Plouff: owns a display

  • @RurouTube
    @RurouTube 1 year ago +4

    This kind of thing is what I've been thinking about ever since motion interpolation became common in TVs, plus I'm familiar with 3D (I don't do real-time 3D rendering, only offline). My thought was that the motion data, depth, etc. a game already has should be enough for some kind of in-game motion interpolation, except not really interpolation but prediction of a future frame. Even without taking controller input into account, creating that extra frame from the previous frame's data should be enough to add visual smoothness (you basically end up with the same latency as the original FPS). And since it works directly within the game, we should be able to account for controller input, AI, physics, etc. when creating the fake frames and get an actual latency benefit: effectively the game engine runs at double the rendering FPS so the extra data can be used to generate the fake frames.
    For the screen-edge problem, the simple fix is to overscan the rendering (or just zoom the rendered image a bit) so the game has extra data to work with. Tied to this is the main problem with motion interpolation and frame generation in general: disocclusion, meaning something that was hidden in the previous frame becomes visible in the current one. How can the game fill that gap when there is no data to fill it with? Nvidia, I believe, uses AI to fill those gaps, and even with AI it can look terrible. But as DLSS 3 users point out, you don't really see it in motion, which is actually good news for non-AI solutions: if people don't notice those defects in motion, then a non-AI fill (a simple warp or something) should be good enough in most situations. You also shouldn't need the optical flow accelerator. Nvidia uses optical flow to get motion data for elements that aren't in the game's motion vectors (like shadow movement), but in practice most people won't notice if a shadow just moves with the surface it falls on (rather than with its own motion) in those in-between fake frames.
    For a more advanced application, I'm imagining a hybrid approach where most things are rendered at half the FPS and the in-between frames reuse previous-frame data to lessen the rendering burden. Unlike motion interpolation or frame generation, this approach would still render the in-between frame, just render less of it: perhaps only the disoccluded parts, and maybe decouple screen-space effects and shadows so those render at full FPS instead of half. The game ends up alternating between high-cost and low-cost frames.
    When I first thought about this, AI wasn't a thing, so I didn't include any AI in the process. Now that it is, some steps could probably be done better with AI. For the disocclusion problem, for example, rather than rendering the disoccluded parts normally, you could render them with flat textures as a simple guide for the AI to blend into the surrounding image, which might be faster.

    • @LutraLovegood
      @LutraLovegood 1 year ago

      Interpolation for the future is called extrapolation
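
The extrapolation idea discussed in this thread can be sketched in a few lines. This is a minimal illustrative sketch, not any engine's actual implementation: the camera yaw is approximated as a plain horizontal pixel shift (a real reprojection would warp by a depth-aware homography), and the revealed edge is clamp-filled, which is exactly where the "screen edge problem" comes from.

```python
import numpy as np

def reproject_yaw(frame: np.ndarray, yaw_delta_deg: float, hfov_deg: float = 90.0) -> np.ndarray:
    """Approximate a small camera yaw by shifting the frame horizontally.

    Zeroth-order reprojection: crude, but it shows how rotation response can
    be decoupled from render rate, and where edge gaps appear.
    """
    h, w = frame.shape[:2]
    # Small-angle approximation: pixels of shift per degree of yaw.
    shift = int(round(yaw_delta_deg / hfov_deg * w))
    out = np.zeros_like(frame)
    if shift > 0:                            # camera turned right -> image slides left
        out[:, :w - shift] = frame[:, shift:]
        out[:, w - shift:] = frame[:, -1:]   # clamp-fill the revealed edge
    elif shift < 0:
        out[:, -shift:] = frame[:, :w + shift]
        out[:, :-shift] = frame[:, :1]
    else:
        out[:] = frame
    return out

frame = np.arange(12, dtype=np.float32).reshape(3, 4)             # toy 3x4 "frame"
warped = reproject_yaw(frame, yaw_delta_deg=22.5, hfov_deg=90.0)  # 1 px shift
```

On a real frame this produces the signature artifact the thread describes: crisp rotation response, with a smeared clamp-filled strip on the trailing edge that overscan rendering tries to hide.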

  • @vrstoned7870
    @vrstoned7870 1 year ago +3

    Whenever VR is mentioned I get happy... just super sad LTT doesn't do more VR

  • @TacticalFunnyMan
    @TacticalFunnyMan 1 year ago

    I thought this was VR when I first clicked, because VR has had this since at least 2017. Awesome to see this finally make its way to the flatscreen :D

  • @marvinvogtde
    @marvinvogtde 1 year ago +7

    I honestly think this might be really great for ultrawide and super-ultrawide gaming, since even more of the rendered frame is already in your peripheral vision, so the suboptimal edges will be even less noticeable

    • @Imevul
      @Imevul 1 year ago

      As a dude with a 32:9 monitor, I'd settle for games actually supporting my monitor's resolution AND aspect ratio. Most games I have to play in windowed mode, because even when they let me go full screen (and don't add black bars), the camera zoom is usually completely fucked and/or the UI elements aren't positioned properly.
      Heck, even a newer game like Elden Ring just gives me two 25% black bars, one on each side, yet with mods it actually supports 32:9. It wouldn't have been much work to support it by default: a checkbox for disabling the black bars and vignette, plus an option to push the UI elements out to the sides, and all would be fine. But for some reason most developers don't care to support newer monitors with unusual resolutions.

    • @huttonberries768
      @huttonberries768 1 year ago

      @@Imevul It did support it by default, but FromSoftware disabled it intentionally because of "competitive advantage" or some bullshit

    • @marvinvogtde
      @marvinvogtde 1 year ago +1

      @@Imevul I also run a 32:9 display and feel you, but don't be surprised that with Elden Ring FromSoft doesn't care about proper PC support. As for the UI thing, I actually can't think of a game off the top of my head that supports 32:9 without also having an option to adjust the UI elements; in my experience that's very common and has been a thing even on consoles 10 years ago

  • @shawn2780
    @shawn2780 1 year ago +15

    The issue with asynchronous reprojection is that with complex scenes or fast action it creates visible artifacts and weirdness. This is where AI comes in, like DLSS 3 frame generation. By using deep learning, it can insert additional frames more accurately and realistically. That's really the way I see the future going, along with AI upscaling. It has to be, otherwise we're going to need a nuclear reactor to power the future RTX 7090 or whatever.

    • @miikahweb
      @miikahweb 1 year ago

      An even bigger issue is that in VR games (where the camera can just move through walls if you move your head in the wrong place) the only thing async timewarp has to do is take the latest headset position and reproject there.
      In a regular game, however, you can't just take keyboard/controller input and reproject to a new position based on that, or your character would go through the floor, walls, or obstacles in the map. Instead you would have to run full collision detection and physics simulation in order to tell where the camera is supposed to be in the reprojected frame.
      This not only makes it massively harder to implement than in VR, where it can be automatic; it also increases the chance of hitting a CPU bottleneck and not gaining that much performance anyway. Combine that with the visual artifacts and other issues and you start to see why game developers haven't spent their time implementing this before.
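
The collision point above can be made concrete with a toy sketch (hypothetical names, 2D only, axis-aligned walls): before reprojecting to an extrapolated camera position, a flat-screen implementation has to step the camera along its motion and stop before it enters solid geometry, which is work that VR's rotation-only timewarp never has to do.

```python
from dataclasses import dataclass

@dataclass
class AABB:
    """Axis-aligned box representing a solid wall, in 2D for brevity."""
    min_x: float
    min_y: float
    max_x: float
    max_y: float

    def contains(self, x: float, y: float) -> bool:
        return self.min_x <= x <= self.max_x and self.min_y <= y <= self.max_y

def extrapolate_camera(pos, vel, dt, walls):
    """Predict where the camera will be dt seconds ahead, but refuse to
    reproject into solid geometry: substep along the motion and stop early.
    """
    steps = 8                                   # coarse substepping
    x, y = pos
    for _ in range(steps):
        nx = x + vel[0] * dt / steps
        ny = y + vel[1] * dt / steps
        if any(w.contains(nx, ny) for w in walls):
            break                               # would clip through a wall
        x, y = nx, ny
    return x, y

walls = [AABB(2.0, -1.0, 3.0, 1.0)]                            # a wall ahead
free = extrapolate_camera((0.0, 0.0), (1.0, 0.0), 1.0, [])     # open space
blocked = extrapolate_camera((0.0, 0.0), (4.0, 0.0), 1.0, walls)
```

A real engine would sweep the full player capsule and run physics, which is exactly the CPU cost the comment warns about; this sketch only shows why the check cannot be skipped.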

    • @meowmix705
      @meowmix705 1 year ago

      Oculus/Meta is already doing that with their third-generation reprojection tech, Application Space Warp (AppSW). It meshes reprojection with game-engine motion vector data and a splash of machine learning to generate the best-looking VR reprojection yet. The recently released Among Us VR on the Quest 2 is a good example of a game using the latest AppSW techniques. All thanks to John Carmack, the granddaddy of Asynchronous Time Warp (ATW), the first mainstream asynchronous reprojection.

    • @shawn2780
      @shawn2780 1 year ago +1

      @@meowmix705 I don't play enough Quest 2 games to be able to speak on AppSW (I play PC through Link, and admittedly I almost always disable regular ASW because of the inherent visual ghosting). Carmack may be the GOAT... I also get a kick out of the fact that he's probably the most reluctant Meta employee ever, but he sticks around because he just loves VR that much

    • @meowmix705
      @meowmix705 1 year ago

      @@shawn2780 Yup, Carmack is the GOAT. As for PC ASW, the Rift mostly uses ASW 1.0 (with the typical ghosting artifacts). ASW 2.0 is only used by a handful of games (it greatly diminished the ghosting, but required depth data from the game, which many games didn't provide). ASW 3.0 is unfortunately a Quest exclusive (for now?); they've rebranded it as AppSW to avoid confusing it with PC ASW.

  • @geargrinder6511
    @geargrinder6511 1 year ago

    An idea: if you force the game to render at a wider FOV (like +20 degrees each way), you could potentially reduce the edge weirdness and make for a smoother experience. Obviously with 90-degree-plus flicks it'll still be obvious, but I feel like just a little tinkering could drastically improve the user experience.
    I want to say it was either Doom or Quake that used a similar technique with its raycasting, intentionally drawing just a little beyond the player's FOV to prevent black spots at the edges of the screen in fast-paced play.
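
A rough sizing exercise for the overscan idea above (a sketch assuming a symmetric pinhole projection; the function name is made up): the extra pixels needed grow with the tangent of the half-angle, so a 20-degree guard band per side costs far more than the naive 20/90 of the frame would suggest.

```python
import math

def overscan_width(display_px: int, hfov_deg: float, margin_deg: float) -> int:
    """Pixels needed to render hfov + 2*margin at the same pixel density
    (pixels per unit of tan(angle)) as the base horizontal FOV."""
    base = math.tan(math.radians(hfov_deg) / 2)
    wide = math.tan(math.radians(hfov_deg / 2 + margin_deg))
    return math.ceil(display_px * wide / base)

# A 1920 px wide view at 90 degrees hfov with a 20 degree guard band per side:
px = overscan_width(1920, 90.0, 20.0)   # tan(65°)/tan(45°) ≈ 2.14x the pixels
```

This is why in practice engines overscan only slightly, or zoom the existing image: the cost of a generous guard band at a perspective projection's edges is steep.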

  • @klipschorny
    @klipschorny 1 year ago

    Lol. Don't worry Mark, the camera confidence will come. You just gotta be in a few more videos and you'll be a regular Riley.