Next Level Triplanar Mapping: Assets, Skeletals and Dithered [UE5]

  • Published 25 Jul 2024
  • This video is the continuation of my previous one, in which we go through a correct implementation of the triplanar projection mapping technique to locally map textures in our shaders.
    Previous Video: • Your Triplanar is wron...
    Download: jumpeter.gumroad.com/l/tri-pe...
    Discord: / discord
    Patreon: / visualtechart

Comments • 51

  • @HowieDay82
    @HowieDay82 3 months ago +8

    Awesome - to prevent the "blurriness" at the blends of a single-sample dithered triplanar UV set, make sure the sampler it's plugged into has its DDX and DDY inputs manually set - if you set them both to "1" you'll get a crisp, non-blurry blend. :)
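    A minimal HLSL sketch of what that override amounts to, roughly what a UE Custom node would contain; Tex, TexSampler, DitheredUV and StableUV are assumed names, not ones from the video. The dithered UV jumps between projections from pixel to pixel, so the hardware-computed derivatives are wrong right at the blend, and supplying explicit gradients sidesteps that:

        // Sample with explicit gradients so mip selection ignores the dither discontinuity.
        float2 gx = ddx(StableUV);   // gradients taken from a continuous UV set
        float2 gy = ddy(StableUV);   // (a constant such as 1 also works, as suggested above)
        return Tex.SampleGrad(TexSampler, DitheredUV, gx, gy);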

    • @HowieDay82
      @HowieDay82 3 months ago +4

      Also - to reduce the shader complexity, consider passing "0" to the "Random" input of the Temporal AA Dither - that keeps it from sampling a texture to get the dither and falls back to a simple Bayer matrix instead.
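      For reference, a sketch of the kind of ordered (Bayer) dither that replaces the noise-texture lookup; this is generic HLSL, not necessarily the exact matrix the engine uses internally:

          // 4x4 Bayer matrix: returns a screen-space threshold in [0, 1) per pixel.
          float BayerThreshold4x4(uint2 pixel)
          {
              const float m[16] = {
                   0.0,  8.0,  2.0, 10.0,
                  12.0,  4.0, 14.0,  6.0,
                   3.0, 11.0,  1.0,  9.0,
                  15.0,  7.0, 13.0,  5.0
              };
              return m[(pixel.y % 4u) * 4u + (pixel.x % 4u)] / 16.0;
          }

          // Usage: pick projection A or B per pixel, giving a cheap screen-space blend.
          // float2 uv = (blendWeight > BayerThreshold4x4(pixelPos)) ? uvA : uvB;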

    • @ethanwasme4307
      @ethanwasme4307 3 months ago +3

      @HowieDay82 w0w

    • @VisualTechArt
      @VisualTechArt 3 months ago +4

      I need to try the Dither without the noise :D

  • @PrinceBrosnan
    @PrinceBrosnan 1 month ago

    I'd been thinking about how to implement this for more than two weeks, or rather, I kept running into problems. In short, I could cry watching this video; now I can move on. Thank you, lol...

  • @3D_TUTS
    @3D_TUTS 3 months ago +2

    I can't thank you enough man, hope your channel grows more, appreciate your dedication!

  • @Tempus0
    @Tempus0 3 months ago +2

    Thanks for another amazing video! Your videos helped me a lot when I created my own world align material with 6 axes and 2 samplers (one sampler for the world X, Y, Z projections and one sampler for a "45 degree world projection", then blending between them at the edges). It finally got rid of all the annoying stretch artifacts I had with my existing world align materials.
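    A loose HLSL sketch of one reading of that setup (the commenter's idea, not the video's method): alongside the usual X/Y/Z projections, add projections along 45-degree diagonal axes and weight them in near the seams, where plain triplanar stretches the most. WorldPos and WorldNormal are assumed inputs, and only one diagonal axis is shown:

        // One extra projection along the +X/+Y diagonal, blended in by how much
        // the normal faces that diagonal (where the standard projections stretch).
        const float3 axisXY = normalize(float3(1.0, 1.0, 0.0));
        float3 u = normalize(cross(axisXY, float3(0.0, 0.0, 1.0))); // in-plane tangent
        float3 v = float3(0.0, 0.0, 1.0);                           // in-plane bitangent
        float2 uvDiag = float2(dot(WorldPos, u), dot(WorldPos, v)); // diagonal projection UVs
        float  wDiag  = saturate(abs(dot(WorldNormal, axisXY)));    // blend weight near the seam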

  • @d.pereladov
    @d.pereladov 3 months ago +1

    I was literally thinking about this yesterday. And today YouTube recommended this video to me. Thanks a lot! Now I can use triplanar for animated objects.

  • @Fafmagic
    @Fafmagic 3 months ago +1

    Amazing! Thanks for sharing, this is so useful!

  • @Madlion
    @Madlion 3 months ago +1

    Ur channel is golden!

  • @Darcvizer
    @Darcvizer 3 months ago

    Amazing thank you!

  • @MonsterJuiced
    @MonsterJuiced 3 months ago

    OH MY GOD IT'S HERE!

  • @JR3D
    @JR3D 3 months ago +1

    Oh neat I'm in this one :)

  • @nintenjo64
    @nintenjo64 3 months ago

    Nice work as ever... Out of interest, for skinned meshes that are triplanar, what state of the model is it applying that mapping to... its bind pose? As this would differ greatly from the first frame of the anim... And it also might not work with Alembic geo caches, as they don't really have a bind pose, just a weird array of point cloud data stored when no animation / blend shape is being applied?

    • @VisualTechArt
      @VisualTechArt 3 months ago

      Since we're using the "Pre-Skin" data, we're referring to the bind pose :)
      About the Blend Shapes... As long as they're applied on the GPU this approach should still work; not sure though, I would need to test. No idea about Alembic, but since we're talking about an array of different meshes I feel like it probably wouldn't work
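      To make the idea concrete, here is a generic triple-fetch triplanar sketch in HLSL driven by bind-pose data. PreSkinPos, PreSkinNormal, Tex, TexSampler and Tiling are assumed Custom-node inputs (the pre-skinned data would come from the Pre-Skinned Local Position / Normal material nodes); this is the plain textbook version rather than the video's exact graph:

          // Triplanar sampling driven by bind-pose position/normal, so the mapping
          // sticks to the surface instead of swimming while the skeleton animates.
          float3 n = abs(normalize(PreSkinNormal));
          float3 w = n / (n.x + n.y + n.z);        // normalized blend weights
          float2 uvX = PreSkinPos.yz * Tiling;     // projection along local X
          float2 uvY = PreSkinPos.xz * Tiling;     // projection along local Y
          float2 uvZ = PreSkinPos.xy * Tiling;     // projection along local Z
          return Tex.Sample(TexSampler, uvX).rgb * w.x
               + Tex.Sample(TexSampler, uvY).rgb * w.y
               + Tex.Sample(TexSampler, uvZ).rgb * w.z;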

  • @nicholass3430
    @nicholass3430 22 days ago

    Hi, thanks for this time-saving material, I've bought it already. Apparently, to get correct normals, we are supposed to disable Tangent Space Normals in the Material graph. But then how would you blend a regular untiled normal map with the triplanar normals? Thanks again for the advice!

    • @VisualTechArt
      @VisualTechArt 22 days ago +1

      Depends on the order of blending :)
      If you use the triplanar one as the "base" to add on top of, you can use the BlendAngleCorrectedNormal function and they should come out nice and clean :D
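      For reference, the angle-corrected ("reoriented") blend that function is generally described as performing looks roughly like this in HLSL; the engine's exact implementation may differ, and both normals are assumed unpacked to the [-1, 1] range and expressed in the same space:

          // Reorient the detail normal around the base (triplanar) normal.
          float3 BlendAngleCorrected(float3 base, float3 detail)
          {
              float3 t = base   + float3(0.0, 0.0, 1.0);
              float3 u = detail * float3(-1.0, -1.0, 1.0);
              return normalize(t * dot(t, u) / t.z - u);
          }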

  • @Lunzium
    @Lunzium 2 months ago

    First off, this is just brilliant!
    Secondly, I'm struggling to figure out how to make the Texture (T2d) input accept a texture parameter and not just a texture object, so I can change the texture in the material instance. Am I overlooking something obvious, or how do I fix this?

    • @VisualTechArt
      @VisualTechArt 2 months ago

      It's the TextureObjectParameter node, but you can just right-click on your TextureObject one and convert it to a parameter :)

    • @Lunzium
      @Lunzium 2 months ago

      @VisualTechArt Thanks mate. Didn't even think of that after it wouldn't let me hook up a texture parameter to begin with. Just using a Texture Object and then converting it is working fine :) Thanks

  • @IIStaffyII
    @IIStaffyII 3 months ago

    I feel a bit foolish for not realizing I could have used dithering to blend the UVs like this. I often used dithering while working on VR titles, because using translucent materials significantly impacts performance, especially in VR.
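    The dithered-transparency trick alluded to here, sketched in HLSL: on an opacity-masked (not translucent) material, compare a fade value against a per-pixel dither threshold so the surface appears to fade without the translucency cost. Opacity and pixelPos are assumed inputs, and the BayerThreshold4x4 helper from the earlier sketch could supply the threshold:

        // Binary visibility per pixel; fed into the Opacity Mask input it reads as a fade.
        float threshold = BayerThreshold4x4(pixelPos);
        return (Opacity > threshold) ? 1.0 : 0.0;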

  • @user-ev4ud5xs2j
    @user-ev4ud5xs2j 3 months ago

    Does this triplanar material function not support the new Substrate material system? What does the missing link mean when it reports an error?

    • @VisualTechArt
      @VisualTechArt 3 months ago

      This MF relates only to how you map a texture in the shader; it shouldn't care whether you're using Substrate or not :)

  • @Virtuality
    @Virtuality 2 months ago

    Hmm... In my testing, your method for WA textures on Skeletal Meshes still gives artifacts on the X and Z plane (only on the exact plane, when previewing on a cube). It happens both with your 'Normalize' method before the transform and with the original. What gives? Do you get the same?

    • @VisualTechArt
      @VisualTechArt 2 months ago

      Mmm, I'll have to double-check... It shouldn't happen anymore, as with the Derive Tangent Basis there shouldn't be a case where you get null vectors; I remember checking for that. But maybe I missed some exceptions

  • @juliocargnin
    @juliocargnin 21 days ago

    Hello! I'm wondering about the normal maps being attached to the surface in a way that shadows are cast in multiple different directions. If you look at the thumbnails you will notice the shadows are not consistent in relation to the source light... am I missing something or is this the intended result? I wonder if we could get the normal maps interacting in a way that they auto-correct the expected direction of the projected shadows with respect to the orientation and deformations of the models... 🤔 Thank you for this master class!!

    • @VisualTechArt
      @VisualTechArt 21 days ago +1

      Not sure I understood your comment tbh xD can you re-elaborate? Sorry

    • @juliocargnin
      @juliocargnin 19 days ago

      @VisualTechArt I've made 2 images elaborating my point. I posted them here but they disappeared for some reason... how can I send them to you so you can understand better what I mean? Thanks!!

  • @baudmarch3925
    @baudmarch3925 16 days ago

    I wonder though, now that you have added the dithering to allow for smoother transitions between the projections in the "Simple Fetch" version, what additional advantage does the "Triple Fetch" retain?

    • @VisualTechArt
      @VisualTechArt 15 days ago

      It still has better visual quality, and depending on the texture set, transition and render size, the dithering might be noticeable or cause ghosting artefacts :)

    • @baudmarch3925
      @baudmarch3925 15 days ago

      @VisualTechArt Ok, thank you for your answer!

  • @user-pe6cw4xq3o
    @user-pe6cw4xq3o 3 months ago

    Is this skeletal mesh triplanar optimized enough for games?

    • @VisualTechArt
      @VisualTechArt 3 months ago

      If your frame renders in less than 16.6ms, yes :D

  • @user-ev4ud5xs2j
    @user-ev4ud5xs2j 3 months ago

    This triplanar material function has been updated so that the texture can be fixed on the object. How do I link the nodes? The original method no longer seems to work...

    • @VisualTechArt
      @VisualTechArt 3 months ago

      What do you mean?

    • @user-ev4ud5xs2j
      @user-ev4ud5xs2j 3 months ago

      @VisualTechArt How do I use this material function and link it into the material?

    • @VisualTechArt
      @VisualTechArt 3 months ago

      If you download it by following the link, you get a sample project that shows it :)

    • @user-ev4ud5xs2j
      @user-ev4ud5xs2j 3 months ago

      @VisualTechArt I've opened the example project, but it's still the same as before and hasn't been updated... Following the tutorial link, the first step fails

  • @MaVmAnRoS
    @MaVmAnRoS 3 months ago

    Thanks! That's probably the most complete tutorial/explanation on triplanar I've seen, and I will definitely borrow some ideas and methods for my own shaders.
    An interesting thing that would have been nice to talk about is the performance cost compared to other approaches. I will try by myself to see if it can be, in addition to being more accurate, also less instruction-intensive than the shader I currently use (which is quite demanding, especially as I use layered materials...). For example, this guy proposes a way to reduce the sample count: czcams.com/video/dFApswlwsRc/video.htmlsi=Vsmh4ioF9JTo8qOg
    Maybe it would be interesting to do some research and talk about triplanar shader optimization in a future video?

    • @VisualTechArt
      @VisualTechArt 3 months ago +1

      It could be interesting to measure with RenderDoc and see if there's any measurable difference between the different approaches :)
      The single-sampling thing the guy in the link is doing is pretty much the same stuff I do with my Single Sample Mode (which I go through in my previous video) + the Dithering I add at the end of this one :D

    • @MaVmAnRoS
      @MaVmAnRoS 3 months ago

      @VisualTechArt Oh yes, I wasn't sure anymore whether it was in your previous video, and indeed your dithering addition reminds me of the video I linked ;)

    • @nintenjo64
      @nintenjo64 3 months ago

      @VisualTechArt This is an interesting topic simply because of the unlimited ways people use Unreal these days. For example, I use it simply as a renderer, using MRQ and Sequencer to build scenes and render them out; no real-time rendering, in fact there is no logic at all except on scene render. I am therefore less concerned by performance and simply want the best visual approach (within reason) so my renders complete in a timely manner... whereas other folk who are using it for real-time purposes should be greatly concerned by performance :D

  • @FishMan1nsk
    @FishMan1nsk 1 month ago

    Have you tested this inside Packed Level Actors?
    Something tells me it will break, because a PLA has a different local pivot position and scale, so all the transformations will be broken.

    • @VisualTechArt
      @VisualTechArt 1 month ago

      It shouldn't have any problems :)
      The packed level actor is just a container, so it doesn't overwrite the original transforms of every mesh it is referencing, it just stores what it contains and where.
      But let me know if you test and discover that it doesn't!

    • @FishMan1nsk
      @FishMan1nsk 1 month ago

      Well, I have a different version of triplanar and yeah, it works differently in the world and in a PLA.
      For example, when you compensate for the scale of an object inside a PLA it will work fine, but when you exit the editing mode it will have the same scaling as the PLA (which is equal to one).
      I believe this is because local transforms are now calculated for the PLA, not for its components.

    • @FishMan1nsk
      @FishMan1nsk 1 month ago

      Also, a PLA is not a container, it's a Blueprint that converts SM to HISM. And for Unreal I believe it's the same as a single object with its own transformations.
      I will take a look at your version though; maybe it's just badly done math in my version. But I tested your way of calculating local scale in a new material with a debug node, and inside the PLA it shows 2 2 2, but when I exit it shows 1 1 1.

  • @schrottiyhd6776
    @schrottiyhd6776 3 months ago

    how many days? or weeks?

  • @user-dt7vf2yw9w
    @user-dt7vf2yw9w 3 months ago

    Very impressive series! I am experimenting with triplanar projection for a dirt layer in a layered material on a destructible mesh. After fracturing, the dirt layer starts "sliding" along the pieces. Is it even possible? I think it is because each piece is a kind of instance of the original object and the engine is trying to reproject the texture for every piece. Thank you!

    • @VisualTechArt
      @VisualTechArt 3 months ago

      Not sure, maybe the fragments move but are still considered part of one mesh?
      So even though you map locally, they slide around