Creating realistic foliage in Blender using Luma AI

  • Published 4 Jan 2023
  • In this tutorial, I will show you how you can use the new NeRF (neural radiance field) technology to create realistic foliage 3D models that can be used to populate your 3D scenes.
    Music: Depths by Ethan Leitch ( • depths (FULL ALBUM) )
    Lightbulb photo by Jack Moreh from Freerange Stock
    Buy my high quality photoscan of a rusty lawnmower: alex209.gumroad.com/l/rustyla...
    Download my photoscanned fruit asset pack here: alex209.gumroad.com/l/Photosc...
  • How-to & Style

Comments • 19

  • @DaanWassen • 1 year ago

    Lovely tutorial man! Keep it up

  • @DriesduPreez • 1 year ago

    Thank you, this was very useful

  • @syntheticfauna • 1 year ago

    Thank you!🍺

  • @EliSpizzichino • 1 year ago

    Nice advice on using RawTherapee!
    Can you tell me what combination of shaders you ended up using for the leaves? I've seen you used Add; did you mix a transparent shader (with a colour texture) and a diffuse one? Or glass/diffuse? Did you use SSS? Can you export the material as is, without having to recreate it in Unity?

    • @sudo_alex_209 • 1 year ago

      Hi Eli, thanks for the comment! I ended up using a mix shader between a Principled BSDF and a translucent shader, both of which had the image texture plugged in as the base colour. I also plugged the image texture into a colour ramp and crunched the values to separate the veins of the leaves from the rest of the leaf, which is much thinner, and used this as the factor for the mix shader, so that the veins would be less translucent than the rest of the leaf. I'm not sure if this is the best way to go about it, but it worked pretty well for me. Unfortunately, if you wanted to import the model into Unity, you would lose all of this shading. I imported foliage into a Unity project I was working on a while ago and found that you can't really use translucency (unless you want to write your own shaders, that is!). The best way I found to fake translucency was to use emission and have it emit the image texture, although I'm sure there's a much better way of doing this. I can't remember whether it's possible to have emission influenced by a texture in Unity, but if so, you could do the same thing as in Blender to mask the veins of the leaves and have that influence it. Hopefully this helps!
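      For anyone who wants to script that setup, here is a rough bpy sketch of the node tree described in this reply. It is not taken from the video; the material name, image path, and colour-ramp positions are placeholders to tweak per scan.

```python
# Sketch of the leaf shader described above (not the exact setup from the
# video). Assumes a leaf colour image at "//leaf_texture.png" (a placeholder
# path) and builds: Image Texture -> Principled/Translucent -> Mix Shader,
# with a colour ramp masking the veins as the mix factor.
import bpy

mat = bpy.data.materials.new("LeafMaterial")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links
nodes.clear()

output = nodes.new("ShaderNodeOutputMaterial")
mix = nodes.new("ShaderNodeMixShader")
principled = nodes.new("ShaderNodeBsdfPrincipled")
translucent = nodes.new("ShaderNodeBsdfTranslucent")
tex = nodes.new("ShaderNodeTexImage")
ramp = nodes.new("ShaderNodeValToRGB")

tex.image = bpy.data.images.load("//leaf_texture.png")  # placeholder path

# The scan texture drives the colour of both shaders.
links.new(tex.outputs["Color"], principled.inputs["Base Color"])
links.new(tex.outputs["Color"], translucent.inputs["Color"])

# The colour ramp "crunches" the values so the darker veins separate out,
# then drives the mix factor so the veins end up less translucent.
links.new(tex.outputs["Color"], ramp.inputs["Fac"])
ramp.color_ramp.elements[0].position = 0.4  # tweak per scan
ramp.color_ramp.elements[1].position = 0.6
links.new(ramp.outputs["Color"], mix.inputs["Fac"])

links.new(principled.outputs["BSDF"], mix.inputs[1])
links.new(translucent.outputs["BSDF"], mix.inputs[2])
links.new(mix.outputs["Shader"], output.inputs["Surface"])
```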

  • @openlink9958 • 1 year ago

    I'm kind of curious how much processing power you're saving with this?

  • @anson_kao • 1 year ago +3

    In this scenario, are NeRFs superior to photogrammetry?

    • @sudo_alex_209 • 1 year ago +3

      Yes: you'll find that with photogrammetry, the foliage won't even show up in the scan. This method certainly isn't perfect, but with a bit of tweaking of the raw meshed results from Luma AI, you can get some pretty good results in Blender.

  • @user-vt3yz3lr4n • 1 year ago

    However, you can't have the same control over the model as with traditional modelling methods. For example, if you scan a car, can you move the wheels or any other part, since it's all set as one piece? Or is there a way to solve this?

    • @sudo_alex_209 • 1 year ago +1

      It wouldn't be particularly easy to solve this, though you potentially could by retopologising the wheels separately. If you want a model of a specific car, there will generally be one somewhere online that you can download; otherwise you can use traditional hard-surface modelling, which will get you much better results than a photoscan or a NeRF scan. Hope this helps!

    • @user-vt3yz3lr4n • 1 year ago

      @sudo_alex_209 thank you for your reply

  • @AlphaPackProductions • 10 months ago

    My viewport shading doesn't have the color options; I'm not sure how to find it.

    • @sudo_alex_209 • 10 months ago

      Click on the arrow next to the shading options (top right-hand corner of the viewport) and change the color from 'Material' to 'Texture'. Hope this helps.
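      For anyone who prefers to do this from the Python console, the same switch can be made with a small bpy snippet (my own addition, not shown in the video):

```python
# Switch Solid-mode viewport shading colour from Material to Texture,
# assuming a 3D Viewport is open in the current screen.
import bpy

for area in bpy.context.screen.areas:
    if area.type == 'VIEW_3D':
        for space in area.spaces:
            if space.type == 'VIEW_3D':
                space.shading.color_type = 'TEXTURE'
```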

  • @leoviolet5827 • 1 year ago

    I'm getting a lot of different textures when downloading my OBJ, rather than one single texture for the whole object. Do you have any idea how to fix this?

    • @sudo_alex_209 • 1 year ago

      Hi Leo. By default, Luma AI will output multiple texture files. I usually prefer to retopologize my scans anyway, in which case the textures get baked down to a single texture using Selected to Active (shown at 9:09). However, if you don't want to retopologize your model, create a new image texture node in one of the materials and select it, then add that same texture to all of the other materials, making sure it's the selected node in each of them. Then you can simply do a smart UV project by hitting U > Smart UV Project. The baking settings should be the same as what I show in the video, but don't enable Selected to Active. Hope this helps!
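      As a rough illustration of that workflow (not shown in the video), the bpy sketch below adds a shared bake image to every material, runs Smart UV Project, and bakes the diffuse colour without Selected to Active; the image name, resolution, and use of Cycles are assumptions.

```python
# Rough sketch of the single-texture bake described above (not from the
# video). Assumes the Luma AI scan is the active object and Cycles is
# available; the image name and 4K resolution are arbitrary choices.
import bpy

obj = bpy.context.active_object
bake_img = bpy.data.images.new("LumaBake", width=4096, height=4096)

# Add the same image texture node to every material and make it the
# active node, so the bake writes into this one image everywhere.
for slot in obj.material_slots:
    mat = slot.material
    mat.use_nodes = True
    node = mat.node_tree.nodes.new("ShaderNodeTexImage")
    node.image = bake_img
    mat.node_tree.nodes.active = node

# Smart UV Project on the whole mesh (U > Smart UV Project in the viewport).
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.smart_project()
bpy.ops.object.mode_set(mode='OBJECT')

# Bake diffuse colour only, without Selected to Active.
scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.render.bake.use_pass_direct = False
scene.render.bake.use_pass_indirect = False
scene.render.bake.use_selected_to_active = False
bpy.ops.object.bake(type='DIFFUSE')
```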

    • @user-cf2jo7mn9n • 8 months ago

      Did you solve it? I don't understand your explanation.

  • @iPEMiC. • 1 year ago

    Does it also export the camera movement data?

    • @sudo_alex_209 • 1 year ago

      No, sorry. There may be another way of exporting it from inside Luma AI, though. Otherwise you might want to try another NeRF tool and see if there's an option for exporting that, although I'm not sure.