AR Foundation With Meta OpenXR Support Is Here! (Unity Setup & Demos)

  • Uploaded 28 Aug 2024

Comments • 116

  • @danreid70
    @danreid70 1 year ago +4

    VERY cool! Will be playing with this soon! Can't wait to get a Quest 3 eventually to really start playing with color passthrough and AR functionality in a Quest!
    Thanks as always, Dilmer!!!

    • @dilmerv
      @dilmerv 1 year ago

      Couldn’t agree more, this is the beginning of a lot of great things. I am also very excited about the Quest 3 and Apple Vision Pro!

  • @alec_almartson
    @alec_almartson 1 year ago +1

    Thank you for the detailed explanation of the New Features that Meta OpenXR brings to us.
    This is just the beginning of a New Era...

    • @dilmerv
      @dilmerv 1 year ago +1

      Couldn’t agree more Alec 🚀 thanks for your comment!

  • @HassanKaraouni
    @HassanKaraouni 1 year ago +2

    Great stuff Dilmer - this is exactly what I was looking for!

    • @HassanKaraouni
      @HassanKaraouni 1 year ago

      Also the Anchors scene works by default for me now. I'm using 5.1.0-pre.9. It seems that Unity updated the script for Anchor creation, so that worked for me without needing to use the AnchorCreatorV2 script.
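
      For context, here is a minimal sketch of what a raycast-based anchor placer can look like in AR Foundation 5.x. It is an illustrative stand-in, not Unity's actual AnchorCreatorV2 or the updated sample script; the class name, prefab field, and touch-based input are assumptions.

      ```csharp
      // Illustrative sketch only: anchor a prefab where a raycast hits a detected plane.
      using System.Collections.Generic;
      using UnityEngine;
      using UnityEngine.XR.ARFoundation;
      using UnityEngine.XR.ARSubsystems;

      [RequireComponent(typeof(ARRaycastManager))]
      public class SimpleAnchorPlacer : MonoBehaviour
      {
          [SerializeField] GameObject anchorPrefab; // placeholder prefab spawned at the hit pose

          ARRaycastManager raycastManager;
          static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

          void Awake() => raycastManager = GetComponent<ARRaycastManager>();

          void Update()
          {
              if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
                  return;

              // Raycast against detected planes and anchor a prefab at the first hit pose.
              if (raycastManager.Raycast(Input.GetTouch(0).position, hits, TrackableType.PlaneWithinPolygon))
              {
                  var pose = hits[0].pose;
                  var instance = Instantiate(anchorPrefab, pose.position, pose.rotation);
                  instance.AddComponent<ARAnchor>(); // the anchor subsystem tracks GameObjects carrying an ARAnchor component
              }
          }
      }
      ```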

    • @dilmerv
      @dilmerv 1 year ago

      That’s great news, thanks for the update!

    • @dilmerv
      @dilmerv 1 year ago

      Thank you for your feedback!

  • @walkerdev1
    @walkerdev1 1 year ago +2

    Nice tutorial as usual!
    Do you plan on doing a video with Homuler's MediaPipe? The instant motion tracking specifically, since it can be used as ARCore even for computers!
    For some reason nobody has touched it outside of hand tracking in tutorials.

    • @dilmerv
      @dilmerv 1 year ago +1

      The reason nobody has touched it is that there is no camera access available on Meta Quest devices; hand tracking and body tracking work only within the Meta ecosystem, but you can’t access the raw camera feed from those devices 🥲

  • @leoismaking
    @leoismaking 1 year ago

    Thanks for this video. Good breakdown of how to get started.

    • @dilmerv
      @dilmerv 1 year ago

      Glad it was helpful! Thank you

  • @LanceLarsen
    @LanceLarsen 1 year ago +3

    Well done as always - thank you! So... in your conversations with Meta -- any idea if there is any possibility of unfiltered camera stream access? It would certainly be opt-in -- but having the ability to do more advanced computer vision detection of objects, etc. would unlock so much more. Obviously that's balanced against privacy, but if users opted in -- it would be the same thing we do with phone cameras, etc. Just interested if you asked that question and if they gave any insights. :)

    • @dilmerv
      @dilmerv 1 year ago +3

      I haven’t heard anything related to stream access to the camera; that’s one feature that is a bit difficult to pull off due to privacy, as you mentioned, but I will ask around to see if there are any discussions around that topic, and perhaps during the Quest 3 launch we will know more.

    • @LanceLarsen
      @LanceLarsen 1 year ago

      @@dilmerv Thanks! I'll keep feelers out as well -- thinking about it, this really isn't any more access than every phone app that asks for camera permissions has already. Think Pokemon Go and every other phone AR application. You can do this today on HoloLens as well - but doing it on the Quest or the Apple Vision Pro with full FoV would be amazing.

    • @deepanshhh
      @deepanshhh 1 year ago +3

      I also really want this feature to be available to developers. Imagine the number of applications that will be possible if we can get the camera feed.

    • @dilmerv
      @dilmerv 1 year ago +1

      @@deepanshhh Looking forward to it! Hopefully it becomes available with the MQ3. This is an expensive operation in terms of how much it pushes the hardware, which is why it could be about more than just privacy.

    • @HassanKaraouni
      @HassanKaraouni 1 year ago

      I was also at an event for Apple Vision Pro in San Francisco (SF Game Developers Meetup) and I don't believe Apple has plans to share camera access either. The Unity representative also didn't seem confident that would change but I don't think anyone can be certain.

  • @waihanlim6699
    @waihanlim6699 1 year ago +5

    Does Oculus Quest 2 support this feature?

    • @dilmerv
      @dilmerv 1 year ago +6

      Yes, AR Foundation with the Meta OpenXR package supports all of the features shown in this video. Great question!

    • @NianticLighter
      @NianticLighter 10 months ago

      Yup, but in Black and White...

  • @patrick_wade
    @patrick_wade 11 months ago +1

    This is a great lesson as always, thank you. Could I ask, did you manage to develop and test using Quest Link, or did you have to push an APK each time?

    • @dilmerv
      @dilmerv 10 months ago +1

      Thanks Patrick, yes, with this I had to push to the device every time. I don’t believe Oculus Link works with this just yet, but I talked to Unity and at the time of this video they were working on a solution; I will check and see if that is available to be tested.

  • @jeffg4686
    @jeffg4686 1 year ago +1

    Thanks for the video. Lots of great info packed in there.
    Rather than there being a "Meta OpenXR" dependency, shouldn't there just be an "OpenXR" dependency?
    Is this like a temporary fill-in for functionality not yet in the regular OpenXR support or something?

    • @dilmerv
      @dilmerv 1 year ago

      Great question and comment. Meta OpenXR was created mainly because Unity needs a place to implement the AR Foundation subsystems that are specific to Meta. This works the same way with ARKit and ARCore: they both have their own plugin because each has its own implementation.

    • @jeffg4686
      @jeffg4686 1 year ago

      @@dilmerv - gotcha, thanks

  • @tompainter7167
    @tompainter7167 10 months ago +1

    Hey Dilmer, thanks for this tutorial, very nice.
    Seems quite a lot has changed since your tutorial if you run the latest version, which only seems to work on the 2023.2 beta.
    I'd love to know how you can get your CG objects to track to the spaces created on the Oculus Quest. I guess this would be a way to hack the functionality that Zuck showed off at Oculus Connect, the persistent anchors thing.

    • @dilmerv
      @dilmerv 10 months ago +1

      Thanks Tom, currently yes, you can use anchors, which is a feature provided by Meta and now exposed through AR Foundation. Also, with the Quest 2 you will be able to easily scan rooms without having to manually map everything; I haven’t tested it yet, but it will for sure be easier to implement than this current solution.

  • @Spacewarpstudio
    @Spacewarpstudio 1 year ago +1

    Hey Dilmer, love your content! I've been working with passthrough and anchors with the meta sdk, and just ran into the issue that we want to use hand tracking, but do not want the users to be able to access the hand tracking quick menu. This is for a small installation where users will not be given a controller, but would need some very simple interaction through hand tracking. I would rather not force them to stare at a button for 5 seconds just to select the language they want to use, hand tracking would be so much better! I thought you might be the guy to ask if this is currently possible? :)

    • @dilmerv
      @dilmerv 1 year ago

      Great question, but since that’s an OS feature, I am pretty certain that is not an option; that menu is the only way someone could exit an experience today.
      Also, thanks for your amazing feedback on my content!

  • @tompainter7167
    @tompainter7167 9 months ago +1

    Random question for you Dilmer, and the community.
    Do people do their dev with a way to test their code in the editor? i.e. press the play button, and whilst we use the linked Quest's input, we get all the info in the editor to use for debugging?
    Or is that not possible for this type of dev, and we have to just create some kind of text logger and debug GUIs of our own inside the scene so we can see that info inside the Quest?

    • @dilmerv
      @dilmerv 9 months ago +1

      It’s not currently possible when using AR Foundation with the Meta XR plugin, but yes, it's definitely possible when you use the “Oculus” plugin, which activates Oculus Link features.

  • @shawnhoareau7137
    @shawnhoareau7137 9 months ago +1

    Hi. I'm doing a time-sensitive project but I've been stuck on this error:
    Library\PackageCache\com.unity.xr.arfoundation@6.0.0-pre.4\Runtime\ARSubsystems\Util\AwaitableUtils.cs(11,25): error CS0246: The type or namespace name 'Awaitable' could not be found (are you missing a using directive or an assembly reference?)
    I haven't managed to resolve this. The error is the same between this cloned project and my personal one. Please help!

    • @dilmerv
      @dilmerv 9 months ago +1

      Thanks for your feedback. This is happening because AR Foundation 6.0 is compatible with Unity 2023.2 or greater. You can look at the table under “What version should I use?” at github.com/Unity-Technologies/arfoundation-samples to determine which Unity version to use.

  • @lautarolopez9992
    @lautarolopez9992 10 months ago +1

    Nice video! Do you know if there is any chance to play on the Quest 2 from the editor? I couldn’t run the AR Foundation samples, but I set up my own project with the docs, and I can’t get it to run in the headset without making a build.

    • @dilmerv
      @dilmerv 10 months ago

      Unfortunately that’s the only way to make it work. You could add the XR Toolkit + XR Sim to test in the editor, but not with Oculus Link; I know Unity is working on a solution as we speak.

  • @HassanKaraouni
    @HassanKaraouni 1 year ago +1

    How will you decide to use AR Foundation with Meta OpenXR vs the Meta Passthrough APIs moving forward?

    • @dilmerv
      @dilmerv 1 year ago +2

      AR Foundation, to me, will become useful when the Quest 3 releases, since we will have real-time depth (hopefully). I would also say that if I had an app built with AR Foundation, then it makes sense to also support it on Meta devices (if the use case applies to those devices).

  • @EgoJacky1
    @EgoJacky1 1 year ago +2

    nice

    • @dilmerv
      @dilmerv 1 year ago +1

      Thanks 🙏 glad you liked it!

  • @Aliakbar-is5ll
    @Aliakbar-is5ll 9 months ago

    Hi Dilmer. Should I use this package or Oculus Integration for an MR/AR application?

  • @brandroid2960
    @brandroid2960 9 months ago

    Excellent information, thanks. Dilmer, do you know if it is possible to create an application that will activate passthrough automatically while another VR application is running? It would be really useful if you could run a utility in the background that automatically switches into passthrough when looking in the direction of a door or a specific table while playing any VR game.

    • @dilmerv
      @dilmerv 9 months ago

      You could use scene understanding for this use case; basically that will tell you what’s in your environment, and then based on that information you could adjust your VR experience. For instance: if you are close to a door, you could activate passthrough programmatically, and the same goes for tables, lamps, etc.
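
      As a rough illustration of that idea, here is a minimal sketch assuming AR Foundation 5.1 plane classifications and the approach from the video of toggling passthrough by enabling/disabling the ARCameraManager; the class name, distance threshold, and serialized references are assumptions, not part of any official sample.

      ```csharp
      // Illustrative sketch only: enable passthrough near planes classified as doors.
      using UnityEngine;
      using UnityEngine.XR.ARFoundation;
      using UnityEngine.XR.ARSubsystems;

      public class DoorPassthroughToggle : MonoBehaviour
      {
          [SerializeField] ARPlaneManager planeManager;     // Room Setup planes exposed through AR Foundation
          [SerializeField] ARCameraManager cameraManager;   // enabling this component turns passthrough on in this setup
          [SerializeField] float activationDistance = 1.0f; // assumed threshold in meters

          void Update()
          {
              bool nearDoor = false;
              foreach (var plane in planeManager.trackables)
              {
                  if (plane.classification == PlaneClassification.Door &&
                      Vector3.Distance(Camera.main.transform.position, plane.center) < activationDistance)
                  {
                      nearDoor = true;
                      break;
                  }
              }
              cameraManager.enabled = nearDoor; // passthrough on near a door, off otherwise
          }
      }
      ```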

  • @litonavarro4400
    @litonavarro4400 1 year ago +1

    I was waiting for this!!! Thank you!!... I tested it out, but passthrough does not get activated on my Quest Pro. I can see how your script enables and disables the AR Camera Manager component in play mode, but on my Quest Pro it does not get activated; it's black all the time. I'm testing the SimpleAR scene, with clear flags on solid color and alpha zero. I took a look at the updates and everything is updated... What am I missing?

    • @dilmerv
      @dilmerv 1 year ago

      Thanks for the info, I saw your message in Patreon, let’s look into it there as we can DM each other.
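
      For anyone comparing setups, here is a minimal sketch of the "solid color, alpha zero" camera configuration described above, assuming the Built-in Render Pipeline; the component name is illustrative, and this only covers the camera side, not the OpenXR project settings.

      ```csharp
      // Illustrative sketch only: clear the AR camera to a fully transparent color so passthrough can show through.
      using UnityEngine;

      [RequireComponent(typeof(Camera))]
      public class PassthroughCameraSetup : MonoBehaviour
      {
          void Awake()
          {
              var cam = GetComponent<Camera>();
              cam.clearFlags = CameraClearFlags.SolidColor;
              cam.backgroundColor = new Color(0f, 0f, 0f, 0f); // black with alpha zero
          }
      }
      ```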

  • @PeterOleynikov
    @PeterOleynikov 3 months ago +1

    It doesn't work on Quest 3; after installing the app on the headset, I can only see the controllers and the dialog with buttons, and nothing else happens no matter what I do.

    • @dilmerv
      @dilmerv 3 months ago

      I will test this very soon but I do believe that it works well, more likely you are missing a setting. Thanks for the info.

    • @PeterOleynikov
      @PeterOleynikov 3 months ago

      @@dilmerv Apparently, when building with Unity 6, the Oculus app didn't ask for the permission to use spatial data, so the passthrough was empty. The permission was silently disabled by the Quest 3 - I didn't see any dialog asking for this permission when I started the app for the first time. However, when I switched to the older Unity 2022, the permission dialog appeared immediately on the first app start on the Quest 3. Manually setting the permission on my Quest 3 for the app built using Unity 6 finally made the app run in passthrough with planes and anchors correctly. Sorry for the false alarm.
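
      For anyone hitting the same issue, here is a minimal sketch of checking and requesting the spatial data permission at runtime. The permission string com.oculus.permission.USE_SCENE is my assumption of the permission Meta uses for scene/spatial data, so verify it (and the AndroidManifest requirements) against Meta's current documentation.

      ```csharp
      // Illustrative sketch only: request Meta's spatial data permission at startup if it hasn't been granted.
      using UnityEngine;
      using UnityEngine.Android;

      public class SpatialDataPermissionRequest : MonoBehaviour
      {
          // Assumed permission string for spatial data (scene) access; confirm against Meta's docs.
          const string UseScenePermission = "com.oculus.permission.USE_SCENE";

          void Start()
          {
              if (!Permission.HasUserAuthorizedPermission(UseScenePermission))
              {
                  // Shows the on-device permission dialog; without it, passthrough planes and anchors can come back empty.
                  Permission.RequestUserPermission(UseScenePermission);
              }
          }
      }
      ```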

  • @deepanshhh
    @deepanshhh 1 year ago +1

    How is it different from using the Oculus Integration SDK? Like the scene capture and scene anchors that are listed on their documentation page?

    • @dilmerv
      @dilmerv 1 year ago

      This is more focused on AR/mixed reality, so if you have an augmented reality app or game today, you can easily add support for Meta Quest devices by just using this package. Imagine all the AR apps available today that could make use of this package with a simple checkbox.
      Great question 😉

  • @JulienBaessens
    @JulienBaessens 9 months ago +1

    Nice video! Do we have access to meshing?

    • @dilmerv
      @dilmerv 9 months ago

      Not currently; 5.1 doesn't support meshing yet. Here's what's supported: docs.unity3d.com/Packages/com.unity.xr.arfoundation@5.1/manual/index.html
      Also, I will find out if there is a new version in progress; I am sure meshing will be available in the near future.

    • @JulienBaessens
      @JulienBaessens 9 months ago

      Thanks! Hope to have it quickly 😊

  • @whynotcode
    @whynotcode 1 year ago

    13:00 - How did you capture the passthrough video? Are you using scrcpy directly and just cropping a square manually from the eye buffer?

    • @dilmerv
      @dilmerv 1 year ago +1

      I took a regular screen recording from the headset, nothing complex just the built-in recording features.

    • @whynotcode
      @whynotcode 1 year ago

      @@dilmerv Did you try that on a Quest 2 as well? The passthrough video rendering is just black if I record natively from a Quest 2. Unfortunately I don't have a Quest Pro to try.

  • @tobi6758
    @tobi6758 2 months ago

    Does this work with the XR Simulator in Editor?

  • @thestonefox
    @thestonefox 1 year ago +1

    Does this allow for passthrough and AR on Pico and HTC devices without the need for the vendor SDK?

    • @dilmerv
      @dilmerv 1 year ago

      Currently this only works for Meta devices, but yes something similar could be done if Unity or the vendors were to create similar packages. Great question!

    • @thestonefox
      @thestonefox 1 year ago

      @@dilmerv That's a shame. Hoping for a time when no vendor SDKs are required and a single build works across multiple vendor headsets! So close to that with what OpenXR offers, but it's still missing some key elements like passthrough and controller models.

  • @tlskillman
    @tlskillman 10 months ago +1

    Holy f**k that's complicated!

    • @dilmerv
      @dilmerv 10 months ago

      Was there a part that you found needing more clarification? I could review this video and possibly add to it to make it easier. Thanks for your feedback!

  • @oemresagir
    @oemresagir 10 months ago +1

    How about object or tracker detection?

    • @dilmerv
      @dilmerv 10 months ago

      Currently those are not supported on Meta devices; Meta doesn’t provide access to the camera or offer such features in their SDK. Good question!

  • @ingmarwittmann2916
    @ingmarwittmann2916 10 months ago +1

    Does anyone know if image or object tracking will be possible with the Quest 3?
    For example, scanning a QR code to open a virtual model over the real model for a virtual service and maintenance manual.

    • @Colarocker
      @Colarocker 10 months ago +1

      It should be possible with the given sensors; it's just a question of where and how the images will be processed (on the Quest 3 or on a PC?) and when it will be implemented in OpenXR. I'm still looking for open-source alternatives for object tracking...

    • @dilmerv
      @dilmerv 10 months ago

      There are QR code capabilities at the OS level so I am sure those features are there, but just not exposed to developers yet.

    • @FP-in1kq
      @FP-in1kq 10 months ago

      I saw on the feature list that they support 2D image tracking. How would I add this to the setup you are showing? I'd like to spawn 3D objects based on a printed-out image in the room. @@dilmerv

  • @yuukiasuna3739
    @yuukiasuna3739 4 months ago

    Hi, I'm not sure why the SimpleAR and Anchors scenes are not working when built for the Quest 3. I have followed the settings, but there is no red cube or plane visualizer when I run on the Quest 3. Are there any other settings that need to be set before building the APK? Thanks.

    • @dilmerv
      @dilmerv 4 months ago

      What do you see when you run it on the Meta Quest 3?

  • @acwestmountain
    @acwestmountain 1 year ago

    Hmm, is it truly plane detection when you need to set up the planes yourself in an initial room setup? I was expecting it to find planes similarly to how plane detection works with AR Foundation on iOS.

    • @dilmerv
      @dilmerv 1 year ago

      There is no tech on the Meta Quest 2 that will allow you to do so, no depth cameras. The Quest 3 will provide that functionality.

  • @nicolasabondano9297
    @nicolasabondano9297 1 year ago +1

    Is it possible to use passthrough features for chroma keying? I couldn't find anything.

    • @dilmerv
      @dilmerv 1 year ago

      There is a mixed reality capture tool available which you can use with OBS; perhaps this can help: www.meta.com/help/quest/articles/in-vr-experiences/social-features-and-sharing/quest-2-mixed-reality-capture/

    • @nicolasabondano9297
      @nicolasabondano9297 1 year ago

      @@dilmerv No, but I mean chroma keying from the headset, so I can see the virtual world wherever there is a green screen and the real world everywhere else.

    • @dilmerv
      @dilmerv 1 year ago +1

      @@nicolasabondano9297 Got it, that makes sense, but no, that feature is not available on any of the Meta devices. Good question.

  • @AuxiliarCoordinacionCETI-my8mj
    @AuxiliarCoordinacionCETI-my8mj 9 months ago +1

    Hi, can the HP Windows Mixed Reality 1440 support this?

    • @dilmerv
      @dilmerv 9 months ago +1

      Hey, good question. Currently they only support ARCore, ARKit, HoloLens 2 (OpenXR), and Meta Quest (OpenXR); for HP Windows Mixed Reality I recommend looking into MRTK 3.

    • @AuxiliarCoordinacionCETI-my8mj
      @AuxiliarCoordinacionCETI-my8mj 9 months ago

      @@dilmerv Thank you, it is definitely a somewhat old headset, it was a donation to my university.

  • @seanthecommandblockenginee1238

    Could you make an updated tutorial on how to make a VR game?

    • @dilmerv
      @dilmerv 1 year ago

      Will do thanks for your feedback!

  • @underbelly69
    @underbelly69 1 year ago +1

    finally

    • @dilmerv
      @dilmerv 1 year ago

      Exciting times 🙏

    • @underbelly69
      @underbelly69 1 year ago

      @@dilmerv What is the best narrative use of AR passthrough you've seen so far? I'm excited to develop a film experience you are inside of - like an extra - having a lucid dream.

  • @Pinkokinawa84
    @Pinkokinawa84 10 months ago

    Thank you for the video. I loaded the SimpleAR scene onto the Quest 3, but I don't see the red cube. Do I need to change or add settings or prefabs to test this?

    • @dilmerv
      @dilmerv 10 months ago

      Did you verify that all OpenXR plugins were enabled as shown on this video? Also what rendering pipeline are you currently using?

    • @Pinkokinawa84
      @Pinkokinawa84 10 months ago

      @@dilmerv Yes, the plug-in provider OpenXR Meta Quest feature group is checked, and I also checked the OpenXR settings as you showed in the video. I am using the Built-in Render Pipeline. Do you know if we could incorporate the Geospatial Creator with OpenXR Meta?

  • @Exooku
    @Exooku 1 year ago +2

    So, it's not possible to use AR Foundation with the Oculus Integration SDK? Don't tell me I wasted my time 😂

    • @dilmerv
      @dilmerv 1 year ago

      You can use the XR Toolkit with Oculus Integration and that should work without problems. Is there a specific feature you are looking for that is not available within the XR Toolkit?

    • @Exooku
      @Exooku 1 year ago +1

      @@dilmerv I'm not really sure yet. I started with Oculus Integration recently. Maybe passthrough has more features in Oculus Integration, or hand tracking is better? (Idk.)
      It's just a matter of having to study again xD

  • @haraldgundersen7303
    @haraldgundersen7303 1 year ago +1

    Could this be hacked to work for the Pico4?

    • @dilmerv
      @dilmerv 1 year ago +1

      Unfortunately not currently, but I will find out from Unity if they plan to add other devices. For now, you can use the XR Toolkit with PICO and it works very well.

  • @ivansutherland7411
    @ivansutherland7411 1 year ago

    Is this the only way to enable basic passthrough support when using OpenXR + XRI for Quest Pro?

    • @dilmerv
      @dilmerv 1 year ago +1

      This is what Unity recommends; the workflow should always be to add AR Foundation (as a way to add environment awareness) to future XR experiences.

    • @ivansutherland7411
      @ivansutherland7411 1 year ago

      Thanks! A video showing off this work but with hand tracking using Unity XR Hands would be helpful too! @@dilmerv

  • @Sharramonz1
    @Sharramonz1 1 year ago

    Does casting from the Quest Pro show the passthrough?

    • @dilmerv
      @dilmerv 1 year ago

      Yes it does; sometimes I have issues showing it, but I made it work by using the Meta Oculus Developer Hub screen recording features. Good question!

    • @Sharramonz1
      @Sharramonz1 1 year ago

      @@dilmerv Thanks! Is there something special that needs to be done? I'm pretty much doing the "black is transparent" thing, and passthrough is showing in the Quest Pro, but I only see black in the cast.

  • @user-lk5le1ec7x
    @user-lk5le1ec7x 1 year ago

    Downgrading the Unity 2023 project to 2022, will it cause any issues?

    • @dilmerv
      @dilmerv 1 year ago

      As long as the packages support 2022, which they all do, then you should be fine. If you use git, just test downgrading and, if it doesn't work, revert it all; if you don't use git, then take a backup of your project.

  • @user-yu2oe6sx8m
    @user-yu2oe6sx8m 1 year ago

    Can't this project be viewed live in the Unity editor?

    • @dilmerv
      @dilmerv 1 year ago

      Currently AR Foundation with the Meta OpenXR package doesn’t support Unity play mode; I believe Unity is working on it, as they mentioned on Twitter. Good question!

    • @user-yu2oe6sx8m
      @user-yu2oe6sx8m 1 year ago

      @@dilmerv Thank you!

  • @user-yu2oe6sx8m
    @user-yu2oe6sx8m 1 year ago

    Why can't I enter the SampleAR scene with Quest Pro?

    • @dilmerv
      @dilmerv 1 year ago

      You can; deploy it and make sure the scene is the only one, or the first one, added under Build Settings.

    • @user-yu2oe6sx8m
      @user-yu2oe6sx8m 1 year ago

      @@dilmerv Thank you!

  • @afrosys11
    @afrosys11 1 year ago

    Does QR code tracking work?

    • @dilmerv
      @dilmerv 1 year ago +1

      Currently, there is no support for any type of tracking with Meta Quest devices.
      Plane detection works because we can use Room Setup, which is all done manually by the user at the OS level.
      Good question!
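
      To make that concrete, here is a minimal sketch of reacting to Room Setup planes as AR Foundation reports them, assuming AR Foundation 5.x; the component name and the logging are illustrative only.

      ```csharp
      // Illustrative sketch only: log planes (from Room Setup) as the plane subsystem adds them.
      using UnityEngine;
      using UnityEngine.XR.ARFoundation;

      public class PlaneLogger : MonoBehaviour
      {
          [SerializeField] ARPlaneManager planeManager;

          void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
          void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

          void OnPlanesChanged(ARPlanesChangedEventArgs args)
          {
              // Each added plane corresponds to a surface the user defined during Room Setup.
              foreach (var plane in args.added)
                  Debug.Log($"Plane added: {plane.trackableId}, classification: {plane.classification}");
          }
      }
      ```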