iPhone X Facial Expression Capture test PART 1
- added Sep 5, 2024
- Test #1. A rough facial-capture test using the iPhone X's front-facing depth camera. I took the WIP Beby character/rig I've been working on for our VR game Bebylon and drove some of its blendshapes using the iPhone's tracking data.
Sorry for the long-ass video; apparently I couldn't stop.
- Cory
MORE INFO:
Apple bought Faceshift a while back (facial-capture software that tracked using depth data) and essentially made it mobile on the iPhone X. Through their ARKit API you can access the 52 facially tracked motion groups, which can drive your own character's blendshapes at 60fps!
I'm interested in whether it can be used for cheap & fast facial motion capture for our VR game, which is overrun by wild and crazy immortal babies who all want to express themselves.
This is a quick first-pass test, and there's a bit more to be done before I hit the quality ceiling of the captured data. Of course, if it were meant to be an AR gag, a lot more could be done to improve the visual rendering and lighting quality.
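To make the pipeline concrete, here's a minimal sketch (not the actual project code) of the core idea: each tracked frame is a set of coefficients normalized 0–1, one per motion group, that gets copied onto the matching blendshape weights on the rig. The coefficient names mirror ARKit's BlendShapeLocation keys; `apply_frame` and the rig dictionary are hypothetical.

```python
# Minimal sketch, assuming a frame arrives as a dict of ARKit-style
# coefficients (0..1) and the rig exposes its blendshape weights as a dict.

def apply_frame(coefficients, rig_weights):
    """Drive the character rig with one frame of tracked coefficients."""
    for shape_name, value in coefficients.items():
        # Only drive shapes the rig actually has; ignore the rest.
        if shape_name in rig_weights:
            rig_weights[shape_name] = max(0.0, min(1.0, value))
    return rig_weights

frame = {"jawOpen": 0.42, "eyeBlinkLeft": 0.95, "browInnerUp": 0.10}
rig = {"jawOpen": 0.0, "eyeBlinkLeft": 0.0, "mouthSmileLeft": 0.0}
apply_frame(frame, rig)
```

At 60fps this loop just runs once per tracking callback; shapes the rig doesn't have (here, `browInnerUp`) are dropped silently.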
Still to try -
- Getting the eyes tracked.
- Re-sculpting some of the blendshapes from the Beby rig to be better suited for this setup.
- Tracking performance is much smoother and more precise if you run it without lighting, which you would do if you were really capturing data this way.
- Dial in the data ratios coming from the iPhone to better match each muscle group's range. *Character-specific visual tuning.
- Figure out why blinking causes the whole head to move.
- Get Ikrima to write this in native C++ (which will never happen unless we figure out how to freeze time)
- Add features to record the data.
- Make a harness to mount the iPhone into a mocap helmet to record face and body at the same time. *More of this to come, as I just busted out our Xsens suit!!
- Get it working in UE4 once they add ARKit front-camera access. *I did this test in Unity; thanks to Jimmy and their ARKit team for already having access to the front camera.
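The "dial in the data ratios" item above could be sketched as a per-shape tuning table: a gain and a clamp range for each muscle group, plus simple exponential smoothing to tame tracking jitter. The table values, function name, and smoothing factor below are all hypothetical illustrations, not the project's actual numbers.

```python
# Hypothetical per-shape tuning: gain and clamp range per muscle group.
TUNING = {
    "jawOpen":      {"gain": 1.3, "lo": 0.0, "hi": 1.0},
    "eyeBlinkLeft": {"gain": 1.0, "lo": 0.0, "hi": 1.0},
    "browInnerUp":  {"gain": 0.7, "lo": 0.0, "hi": 0.6},
}

def tune(raw, previous, alpha=0.5):
    """Remap raw 0..1 coefficients into character-specific ranges."""
    out = {}
    for name, value in raw.items():
        t = TUNING.get(name, {"gain": 1.0, "lo": 0.0, "hi": 1.0})
        v = max(t["lo"], min(t["hi"], value * t["gain"]))
        # Blend with the previous frame so small tracking noise is smoothed.
        out[name] = alpha * v + (1 - alpha) * previous.get(name, v)
    return out

prev = {"jawOpen": 0.5}
tuned = tune({"jawOpen": 0.5, "browInnerUp": 1.0}, prev)
```

Raising a shape's gain exaggerates the captured motion for a cartoony character; tightening the clamp range keeps a shape from overshooting its sculpt.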
Last note* The baby's blendshapes & rig were created by eiskoservices.com and their Rigs on Demand service. I couldn't recommend it more. It's a really solid Maya-based rig, and the blendshapes are top-notch, based on scanned data. The beauty is I uploaded my baby's head model on a Thursday night and woke up to a finished rig! I love the times we live in! *More of the Eisko rig to come.
#animation #motioncapture #iphone #xsens #bebylon #gamedev #arkit #apple #realtime
Coming from the future to see where it all began. So astonishing.
wow, it's come a long way since this one 4 yrs ago...
You were so far ahead of the world 🌎
what software do I need for this? Is it enough to buy an iPhone X? Is this a free app for iPhone?
This dude living in future
Waooo ! Real time avatar is the future of social network communication.
This is awesome thanx u so much
Dude, this is amazing...
I love how the Elvis baby is talking CG tech with a Brooklyn-ish psycho accent....
Cory you did it again! Always first! go high Kite!
Coool
I so want to try this with my models. Awesome work !
Will you do a step-by-step video on what you did, and an intro for creating a character and/or using a character with this process?
oh man. Think of all the furries that would pay up the wazoo to have their Fursona animated using this.
Pete quit hanging out in the same pages as me.
lol. This is great.
hah.. when can we try it? :-)
which app are you using to capture the motion?
So fun. Subscribed ;-)
btw google FaceRig or any of the thousands of apps doing face tracking
Many applications are better. Here, the application is included with the phone. People will discover it without searching. If they want more, like I do, they will find FaceRig ;-)
yes, the only difference is Apple has installed the software on the phone... some people are being really absurd about it being revolutionary or whatever
pumpuppthevolume they make it mainstream. All other app developers will benefit from it.
snapchat and all the other apps already made face tracking mainstream... that's why they're putting it in
this is great
Duuuuuuuude!!!
Lol😂hay Xanadu like video big fan❤
wow, you're watching all the videos! Thx 💙 for digging them :)
A.i. channel is gonna have a bunch of competitors now
Is it free software?
tongue movement is probably pretty important
Problem is Apple's ARKit doesn't have a blendshape for the tongue and that would be a bit tricky to track. Another blendshape they don't have set up is for individual eyebrow movement.
Wow, great job 👍🏿 Can we also import the data into other software like 3ds Max?
Will you release the maya plugin that reads the iphone X data?
ye
can you extract the mocap raw data to use on a model?
That's the next step. Should be fairly easy to do and then reformat it for importing into Maya.
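One plausible interchange format for that export step (an assumption, not what the author actually shipped): record each captured frame as a CSV row of time plus blendshape weights, which a small Maya script could then read back and key with `cmds.setKeyframe`. The shape list and `write_take` helper below are hypothetical.

```python
import csv
import io

# Hypothetical recorder: each captured frame becomes one CSV row of
# time + blendshape weights, suitable for re-keying in Maya later.
SHAPES = ["jawOpen", "eyeBlinkLeft", "eyeBlinkRight"]

def write_take(frames, fps=60):
    """Serialize a list of per-frame coefficient dicts to CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["time"] + SHAPES)
    for i, frame in enumerate(frames):
        # Missing coefficients default to 0.0 so every row has all columns.
        writer.writerow([round(i / fps, 5)] + [frame.get(s, 0.0) for s in SHAPES])
    return buf.getvalue()

take = write_take([{"jawOpen": 0.1}, {"jawOpen": 0.4, "eyeBlinkLeft": 1.0}])
```

Keeping the recorder as plain text means the same take can be retargeted in Maya, 3ds Max, or a game engine without a custom binary reader.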
make tutorial and tag me plz,
Would be possible to import in cinema 4d ?
Yo you like looking into your own eyes too? Must have had handsome parents.
Coool