iPhone X Facial Expression Capture test PART 1

  • Published 5 Sep 2024
  • Test #1. A rough facial capture test using the iPhone X's front-facing depth camera. I took the WIP Beby character/rig I've been working on for our VR game Bebylon and drove some of its blendshapes using the iPhone's tracking data.
    Sorry for the long-ass video, apparently I couldn't stop.
    - Cory
    MORE INFO:
    Apple bought Faceshift a while back (facial capture software that tracked using depth data) and essentially made it mobile on the iPhone X. Through their ARKit API you can access the 52 facially tracked motion groups, which can drive your own character's blendshapes at 60fps!
    I'm interested in whether it can be used for cheap & fast facial motion capture for our VR game, which is overrun by wild and crazy immortal babies who all want to express themselves.
    This is a quick first-pass test, and there's a bit more to be done before I hit the quality ceiling with regard to the captured data. Of course, if it were meant to be an AR gag, a lot more could be done to improve the visual rendering and lighting quality.
    Still to try -
    - Getting the eyes tracked.
    - Re-sculpting some of the blendshapes from the Beby rig to be better suited for this setup.
    - Tracking performance is much smoother and more precise if you run it without lighting, which you would do if you were really capturing data this way.
    - Dialing in the data ratios coming from the iPhone to better match each muscle group's range. *Character-specific visual tuning
    - Figuring out why blinking causes the whole head to move.
    - Getting Ikrima to write this in native C++ (which will never happen unless we figure out how to freeze time)
    - Adding features to record the data.
    - Making a harness to mount the iPhone into a mocap helmet to record face and body at the same time. *More of this to come, as I just busted out our Xsens suit!!
    - Getting it working in UE4 once they add ARKit front-camera access. *I did this test in Unity; thanks to Jimmy and their ARKit team for already having access to the front camera.
    Last note* The baby's blendshapes & rig were created by eiskoservices.com and their Rigs on Demand service. I couldn't recommend it more. It's a really solid Maya-based rig, and the blendshapes are top-notch, based on scanned data. The beauty is I uploaded my baby's head model on a Thursday night and woke up to a finished rig! I love the times we live in! *More of the Eisko rig to come.
    #animation #motioncapture #iphone #xsens #bebylon #gamedev #arkit #apple #realtime
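The "dialing in the data ratios" item above is essentially a per-blendshape remap: each of ARKit's 0.0–1.0 coefficients gets scaled into a character-specific range before driving the rig. A minimal sketch of that idea in Python (the shape names follow ARKit's blendShapes keys, but the gain/limit numbers here are invented for illustration, not taken from the video):

```python
# Sketch: per-blendshape tuning of ARKit face-tracking coefficients.
# Each coefficient arrives in the 0.0-1.0 range; we scale it by a
# character-specific gain and clamp it so an exaggerated gain never
# overdrives the target blendshape.

def remap(coefficients, tuning):
    """Return a new dict of coefficients scaled per shape.

    coefficients: {shape_name: value in [0, 1]} from the tracker.
    tuning: {shape_name: (gain, limit)}; shapes without an entry
            pass through unchanged (gain 1.0, limit 1.0).
    """
    out = {}
    for name, value in coefficients.items():
        gain, limit = tuning.get(name, (1.0, 1.0))
        out[name] = min(max(value * gain, 0.0), limit)
    return out

# One frame of tracked data (values invented for illustration):
frame = {"jawOpen": 0.42, "eyeBlinkLeft": 0.95, "browInnerUp": 0.60}
tuning = {
    "jawOpen": (1.5, 1.0),       # give the baby's jaw more travel
    "browInnerUp": (2.0, 0.8),   # exaggerate brows, but cap the shape
}
print(remap(frame, tuning))
```

The same per-shape gain table is where character-specific fixes would live, e.g. damping whichever coefficient is dragging the head around during blinks.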

Comments • 42

  • @bradleywelch9290 • 2 years ago +8

    Coming from the future to see where it all began. So astonishing.

  • @KasultyZu • 2 years ago +8

    wow it's been a long way since this one 4 yrs ago...

  • @iamnft6201 • 2 years ago +1

    You was so far ahead of the world 🌎

  • @nokedlipudinggal8276 • 3 years ago +4

    What software do I need for this? Is it enough to buy an iPhone X? Is this a free app for iPhone?

  • @johnreymonceda6258 • 2 years ago

    This dude living in future

  • @gamecity7265 • 6 years ago +4

    Waooo ! Real time avatar is the future of social network communication.
    This is awesome thanx u so much

  • @TheWorld_2099 • 6 years ago +2

    Dude, this is amazing...
    I love how the Elvis baby is talking CG tech with a Brooklyn-ish psycho accent....

  • @malcriado3d • 6 years ago +2

    Cory you did it again! Always first! go high Kite!

  • @CHITUS • 2 years ago +1

    Coool

  • @BasemeshInfo • 6 years ago +1

    I so want to try this with my models. Awesome work !

  • @simplychyna2720 • 6 years ago +2

    Will you do a step-by-step video on what you did, and an intro for creating a character and/or using a character with this process?

  • @PikaPetey • 6 years ago +6

    oh man. Think of all the furries that would pay up the wazoo to have their Fursona animated using this.

    • @WilliamDildine • 6 years ago +1

      Pete quit hanging out in the same pages as me.

  • @StudioCapon • 6 years ago +2

    lol. This is great.

  • @computrik • 6 years ago +2

    hah.. when can we try it? :-)

  • @sanahlehwelere9056 • 2 years ago

    Which app are you using to capture the motion?

  • @fredericsidler • 6 years ago +1

    So fun. Subscribed ;-)

    • @pumpuppthevolume • 6 years ago

      btw google FaceRig or any of the thousands of apps doing face tracking

    • @fredericsidler • 6 years ago

      Many applications are better. Here, the application is included with the phone. People will discover it without searching. If they want more, like I do, they will find FaceRig ;-)

    • @pumpuppthevolume • 6 years ago

      yes the only difference is apple has installed the software on the phone....some people r being really absurd about it being revolutionary or whatever

    • @fredericsidler • 6 years ago

      pumpuppthevolume They make it mainstream. All other app developers will benefit from it.

    • @pumpuppthevolume • 6 years ago

      snapchat and all the other apps already made face tracking mainstream ....that's why they r putting it in

  • @agingmirror4524 • 6 years ago +2

    this is great

  • @DrumAddict324 • 6 years ago +1

    Duuuuuuuude!!!

  • @tehrahedwards4504 • 8 months ago

    Lol😂hay Xanadu like video big fan❤

    • @xanaduBlu • 5 months ago

      wow your watchin all the videos! Thx 💙 for digging them :)

  • @npaisfordummies2162 • 6 years ago +2

    A.i. channel is gonna have a bunch of competitors now

  • @user-tv5tf1cv2t • 6 months ago

    Is it free software?

  • @vladkostin7557 • 5 years ago

    tongue movement is probably pretty important

    • @jungoogie • 5 years ago +1

      Problem is Apple's ARKit doesn't have a blendshape for the tongue and that would be a bit tricky to track. Another blendshape they don't have set up is for individual eyebrow movement.

  • @semereff1 • 6 years ago

    Wow, great job 👍🏿 Can we also export the data to other software like 3ds Max?

  • @madjetey • 6 years ago

    Will you release the Maya plugin that reads the iPhone X data?

  • @Jamraj • 2 years ago

    ye

  • @epereyralucena • 6 years ago

    Can you extract the raw mocap data to use on a model?

    • @trickdiggidy • 6 years ago +1

      That's the next step. Should be fairly easy to do and then reformat it for importing into Maya.

    • @epereyralucena • 6 years ago +1

      Make a tutorial and tag me, please.
      Would it be possible to import it into Cinema 4D?

  • @MechNominal • 6 years ago +1

    Yo you like looking into your own eyes too? Must have had handsome parents.

  • @trapnationpro7034 • 2 years ago

    .

  • @powerblades • 6 years ago +1

    Coool