Advanced Hand Tracking & Lightning-Fast Interaction Setup | Meta’s huge v62 Update
- Added: 7 Jul 2024
- Hi XR Developers! In this quick video we look at the improvements that version 62 of Meta’s XR SDK brings to hand tracking and the Interaction SDK. We cover Multimodal, Wide Motion Mode (WMM for short), and Capsense. Furthermore, we take a quick look at Meta’s new comprehensive interaction sample and how to set up interactions in seconds!
Resources:
🔗 Discover the benefits of new hand tracking capabilities available in SDK v62: developer.oculus.com/blog/han...
🔗 Multimodal Documentation: developer.oculus.com/document...
🔗 Wide Motion Mode Documentation: developer.oculus.com/document...
🔗 Capsense Documentation: developer.oculus.com/document...
Other Videos:
📹 Mixed Reality Utility Kit: Build spatially-aware apps with Meta XR SDK: • Mixed Reality Utility ...
📹 Meta’s Building Blocks | Develop Mixed Reality Apps lightning fast: • Meta’s Building Blocks...
📹 How to use Lights & Shadows in Passthrough Mode: • How to use Lights & Sh...
📹 Get Started with Meta Quest Development in Unity: • Get Started with Meta ...
📹 Meta Quest Scene API: • Meta Quest Scene API |...
📹 Meta XR Simulator: • Meta XR Simulator | Sy...
Join the Discord!
👾 / discord
Support the Channel:
🫂 Patreon: / blackwhalestudio
🐦 Twitter: / xrdevrob
🤝 Join this channel to get access to perks:
/ @blackwhalestudio
Need professional XR & Multiplayer development for your project?
👨💻 Get in touch with us: bento.me/blackwhale
Chapters:
0:00 Intro
0:43 Project Setup
1:22 Use Simultaneous Hands and Controllers (Multimodal)
3:53 Use Wide Motion Mode (WMM)
4:58 Use Capsense
6:22 Add interaction in 15 seconds!
7:22 Comprehensive Rig Example Scene
8:40 Outro
Wide motion and inside-out body tracking are really exciting.
I hope they bring these features over to the home environment, and let you pull stuff out of the web to show your friends.
I think this plays into the whole OS dilemma they have at the moment. Currently the Quest doesn't have an OS that lets you open multiple apps alongside each other, like visionOS for example. I'm curious to see how Meta will answer!
Hey there, can you make an in-depth UI tutorial for the new SDK? I'm very confused about how the comprehensive rig is set up and how it can be changed for customized UI. Thank you!
Distance grabbing is more like VR in VR. We live in a simulation, folks 😮
Hey, great content. One question: where can I find the comprehensive sample scene?
Thanks in advance :)
Thanks, glad you like it! The sample scene is part of the Meta XR Interaction SDK OVR Samples package, which I install at the very beginning of the tutorial!
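For reference, packages like this can also be declared directly in your Unity project's `Packages/manifest.json` instead of going through the Package Manager UI. This is only a sketch: the package name `com.meta.xr.sdk.interaction.ovr.samples` and the version `62.0.0` are assumptions based on the SDK version discussed in the video, so check the Package Manager window for the exact identifier your install uses.

```json
{
  "dependencies": {
    "com.meta.xr.sdk.interaction.ovr.samples": "62.0.0"
  }
}
```

After editing the manifest, Unity resolves the dependency on the next focus of the Editor, and the samples then appear under the package's Samples tab in the Package Manager.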
Great video, but how can I change the scene to the MR environment? When I open it, it's still VR.
Stay tuned for that!
Hi. I'm using a Quest 2 and I'm failing to get hand tracking working in the Unity Editor using the Meta XR v62 Unity plug-in. Hand tracking only works after Build & Run. Can you give me some advice? Please help.
- Desktop OS: Windows 11
- Software version of Quest 2: 63.0.0
- Meta XR Unity plug-in version: 62.0.0
Yes, unfortunately, as Meta states in their documentation, the sample scene does not currently work reliably in the Editor (though I had no issues myself). You are right: you will need to make a build for it to work consistently!
I didn't know there was such a thing in their documentation. Thank you so much for letting me know.😊
Let me ask you one more question: which version of Windows do you use? I saw a post on the Meta community forum saying we have to use Windows 10 instead of Windows 11 to get hand tracking working in the Unity Editor. Do you think that's accurate?
Help me please. I have a problem installing the samples. I get authentication errors, and it tells me that the package is licensed and I don't have its license.
What kind of license? You don't need any license for these packages.
@blackwhalestudio I get multiple errors while trying to install the package 😢