Adobe Substance 3D Tutorial - Put your 3D Models inside the Apple Vision Pro

  • Published 10 Sep 2024

Comments • 21

  • @nayandurge8877 • 7 months ago +2

    Fabulous, sir, very informative. Thank you so much ❤❤❤

  • @KonstantinRozumny • 7 months ago +2

    Is there a way to change the lighting of the scene?

    • @michaeltanzillo • 7 months ago +1

      Great question. I don't believe you can inside Reality Composer directly; you would need to script it using Swift and RealityKit:
      developer.apple.com/documentation/realitykit/realitykit-cameras-lighting
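
      For context, a minimal sketch of what that scripting might look like with RealityKit's light entities (iOS/macOS-style API; the intensity, shadow values, and anchor setup are illustrative placeholders, not code from the video or the linked docs):

      ```swift
      import RealityKit

      // Minimal sketch: add a directional light to a RealityKit scene in code.
      // All values here are illustrative placeholders.
      let light = DirectionalLight()
      light.light.intensity = 5000 // lux
      light.light.color = .white
      light.shadow = DirectionalLightComponent.Shadow(
          maximumDistance: 4,
          depthBias: 2.5
      )
      // Tilt the light to shine down at roughly 45 degrees.
      light.orientation = simd_quatf(angle: -.pi / 4, axis: [1, 0, 0])

      // Parent it to a world anchor alongside your model.
      let anchor = AnchorEntity(world: .zero)
      anchor.addChild(light)
      ```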

  • @VIREELY • 6 months ago +1

    Would an animated object work just as well? I'm using Blender. Would I be able to import it the same way? Thank you in advance.

    • @michaeltanzillo • 6 months ago +1

      Yes! I haven't done it yet, but the USDZ data contains all the animation created in your 3D application. The only question I have is about Blender's capability to export USD. I am confident it can, but I just haven't tried yet.
      I'm hoping to put together a video soon to demonstrate.
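
      For anyone who tries it, a minimal RealityKit sketch of playing back whatever animation clips a USDZ carries ("robot" is a placeholder asset name, not a file from the video):

      ```swift
      import RealityKit

      // Minimal sketch: load a USDZ exported from a 3D app and loop any
      // animation clips baked into it. "robot" is a placeholder asset name.
      let model = try Entity.load(named: "robot")
      for animation in model.availableAnimations {
          // repeat() loops the clip indefinitely.
          model.playAnimation(animation.repeat())
      }
      ```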

    • @VIREELY • 6 months ago

      @michaeltanzillo Thank you so much for your response! All your posts are very helpful and informative, and there is such limited info out there. Thanks, and I look forward to more of your videos!

  • @Hobnockers • 4 months ago +1

    Thank you 🙏🏻 so much. Very cool. Can you also add lights? I am interested in designing 3D spaces, and the lighting is super important. Or perhaps there are other tools for the Apple Vision Pro that can do this? Thx a bunch and keep rocking, cheers 🥂

    • @michaeltanzillo • 4 months ago

      Great question! I explored it a bit in this video.
      czcams.com/video/fEIdDahjGjQ/video.htmlsi=TDqYiqWOai_XbA3X&t=336
      The Vision Pro does a great job of creating a real-time environment map to drive lighting.
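
      For anyone who wants to override that automatic lighting from code, a hedged sketch using RealityKit's image-based-light components on visionOS ("studio" is a placeholder name for an environment resource in your app bundle, not an asset from the video):

      ```swift
      import RealityKit

      // Minimal sketch (visionOS): replace the automatic environment lighting
      // on one entity with a custom image-based light. "studio" is a
      // placeholder environment resource bundled with the app.
      func applyCustomLighting(to entity: Entity) async throws {
          let environment = try await EnvironmentResource(named: "studio")
          entity.components.set(
              ImageBasedLightComponent(source: .single(environment))
          )
          entity.components.set(
              ImageBasedLightReceiverComponent(imageBasedLight: entity)
          )
      }
      ```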

    • @Hobnockers • 4 months ago

      @michaeltanzillo Got it, thank you so much. I guess my question is then all about VR instead of AR.
      When compiling a VR project with Unreal Engine, I can deploy for desktop devices (deferred rendering | better quality) or for a mobile device using forward shading (Vulkan, etc.).
      However, forward shading has its limitations, so deferred rendering will give you the ultimate shader quality and VR experience.
      Now here comes the interesting part. Since the Apple Vision Pro was marketed as its own powerful supercomputer (CPU+GPU), can you deploy a VR experience to it with deferred-rendering settings?
      That would be nothing other than compiling your VR experience for a desktop device.
      If that's possible, you would fix shader issues like emission or translucency; I guess even SSS would work.
      But the question is: can or will the Apple Vision Pro handle deferred rendering?
      If not, you would still need to use your PC to run a deferred VR application and just use the goggles for the vision.
      But the computing would still all happen on the PC.
      If the Apple Vision Pro can't handle that, bruuhh… then this is a real bummer.
      Because that would have been the ultimate reason to buy one. Otherwise, the Meta Quest is much cheaper in comparison.

    • @michaeltanzillo • 4 months ago

      @Hobnockers For fully immersive VR experiences, you would probably want to go down the Unity path. Unity is complex, but it's a common tool for creating VR experiences.

    • @Hobnockers • 4 months ago +1

      @michaeltanzillo Thx. I would use Unreal Engine.

    • @michaeltanzillo • 4 months ago

      Gotcha! Yes, you could use Unreal as well (most XR peeps use Unity, but I prefer Unreal too) @Hobnockers

  • @hekuranmusli1072 • 6 months ago +1

    Great tutorial, thank you ❤.
    A stupid question: how do you import the USDZ file inside the Vision Pro to view it? What app inside the Vision Pro can do that, or is the process the same as on the iPad? I don't know yet since I don't have a Vision Pro to test, but I'm planning to buy one for architectural visualizations.

    • @michaeltanzillo • 6 months ago

      No stupid questions! This is all brand new.
      It's super easy! You can either AirDrop it from your computer and it magically appears, or you can save the USDZ in your "Files" and double-click it and it appears. Here's a quick little video I did on it:
      x.com/michaeltanzillo/status/1755038435117764844?s=20
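
      If you later want to show a USDZ from inside your own visionOS app rather than through Files or AirDrop, a minimal SwiftUI sketch ("chair" is a placeholder asset name):

      ```swift
      import SwiftUI
      import RealityKit

      // Minimal sketch (visionOS): display a bundled USDZ with SwiftUI's
      // Model3D view. "chair" is a placeholder asset name.
      struct USDZViewer: View {
          var body: some View {
              Model3D(named: "chair") { model in
                  model
                      .resizable()
                      .scaledToFit()
              } placeholder: {
                  ProgressView()
              }
          }
      }
      ```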

  • @victorgomezsa • 7 months ago +1

    Thank you very much for this information. By the way, do you know if you can export to USDz from Blender? Regards!

    • @michaeltanzillo • 7 months ago +1

      I believe you have two options. You can export FBX, Alembic, or USD and then use the Mac's Reality Converter app to convert to USDZ. Or you could search GitHub for a plugin like this one:
      github.com/robmcrosby/BlenderUSDZ
      (I did not verify this plugin, so use it at your own peril.)

    • @victorgomezsa • 7 months ago

      @michaeltanzillo Thank you very much for your reply!!! 👍

  • @mahmoudabdulaziz8811 • 7 months ago

    Nice one, Michael