iPhone X Facial Capture test PART 4 - GAME CHARACTERS
- added 21. 05. 2018
- Test #4 using the iPhone X depth camera for facial animation capture, along with the Xsens full-body motion capture suit and a ghetto DIY mocap helmet!
Starring a new Beby Character "Killahertz" from our game Bebylon!
This test shows new improvements to the face, made by using Apple's official blendshape set to generate a new set of baby-face blendshapes. It also includes a WIP of using the iPhone X to animate a non-human character, and lastly a rendered clip of actual game content using this pipeline that will soon make its way into the game engine. *Note: in this video this scene is rendered in V-Ray, not realtime... yet! I actually made 90% of this video months ago but have been rendering the zillion frames very slowly in the background!
Check out our blog if you want to download the Apple Blendshapes as a reference. blog.kiteandlightning.la
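The blendshape idea referenced above can be sketched in a few lines: ARKit reports a 0–1 coefficient per expression shape each frame (real coefficient names include "jawOpen" and "eyeBlinkLeft"), and the posed face is the neutral mesh plus the weighted per-shape deltas. This is a minimal illustration of that math, not Kite & Lightning's actual pipeline; the toy meshes and values are made up.

```python
import numpy as np

def apply_blendshapes(neutral, deltas, weights):
    """Pose a mesh as neutral + sum(weight * delta) over active blendshapes.

    neutral: (V, 3) vertex positions; deltas: {name: (V, 3) offsets from
    neutral}; weights: {name: coefficient in 0..1}, e.g. one ARKit frame.
    """
    mesh = neutral.copy()
    for name, w in weights.items():
        mesh += w * deltas[name]
    return mesh

# Toy two-vertex example: one "jawOpen" shape that drops vertex 0 by one unit.
neutral = np.zeros((2, 3))
deltas = {"jawOpen": np.array([[0.0, -1.0, 0.0], [0.0, 0.0, 0.0]])}
frame = {"jawOpen": 0.5}  # per-frame coefficient captured on the phone
posed = apply_blendshapes(neutral, deltas, frame)
```

With a half-open jaw, vertex 0 moves halfway along its delta; the raw capture data drives nothing but these per-frame weights, which is why it retargets to any character that has the matching shape set.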
Check out the previous tests for more info regarding the iPhone X capture process.
• iPhone X Facial Expres...
• iPhone X Facial Captur...
* As before, all the iPhone face capture data is raw and unaltered. The Xsens body capture data was post-processed in HD, otherwise unaltered.
Some context:
These are a series of tests I'm doing to find a fast pipeline for performance capture (face and body) that we can use for our upcoming game Bebylon: Battle Royale. bebylon.world
#animation #unrealengine #iphone #motioncapture #bebylon #xsens #gamedev #arkit #ue4 #apple
This is CRAZY!!! Can't wait for more of this!
Sooo much awesomeness!!
Epic work!!
Wooo, better and more fun this time!
Inspired by you, I've been making my own facial mocap solutions! Thank you dude! Will have it for different platforms. You are my hero!
You are a genius!! This is awesome! Love it!
Hyper Amazing!
aweeeeeesome job man! can't wait for the next video. thanks!
wow this is amazing! great work! will follow to see where this goes
This so Awesome
I love this guy
Just watched WWDC and thought of you and these babies! Can't wait to see what you end up with, and wonder if you and Apple have a similar vision.
Thanks! I'm excited to see what the face tracking improvements are, and I can't help but keep wanting to make a Bebylon app so peeps can build their game's beby character and use them as their memoji!
mind blowing... is this concept usable for creating short animation films too?
WHAT! I've been theorizing this since the X came out that it would be so cool to do for game dev or motion work. I KNEW someone would get it and of course it's you guys at kite & lightning! Do you ever think someone will release a guide on this kind of stuff or will I always be dreaming for an easy export. :p
please release it on a kickstarter, etc. I will def buy it
Impressive
This is impressive. And very clean too. As for your helmet, do you have access to a 3d printer?
super!!
Awesome. It would be great if you could publish some of the work for others to follow, like the ZBrush/Apple targets. Thanks for sharing all the great WIP info. Looks great.
thanks man. You can download the Apple targets from our blog. blog.kiteandlightning.la/iphone-x-facial-capture-apple-blendshapes/
Great work, it's nice seeing your development and milestones you're reaching for. Your passion and hard work really shows. Albeit the baby narration is creeping me out haha. What game engine are you feeding the live mocap data to?
lol, hopefully using real performers and bebyfied voices will reduce the creep factor! We're using UE4 for our game, though I haven't got live streaming going yet (using Maya as the middleman). I think UE4 4.19 will make live streaming the data easier to implement, so stay tuned!
Ha! "Davor"!
man this is pretty cool! gonna look out for the SIGGRAPH presentation :) btw can u shortly give a tip on how u generate the iPhone blendshapes for the alien guy or baby? what tool in Maya do u use? like just skin painting or wrap? shrinkwrap? cheers!
Thanks Stian! Yup, in Maya I used wrap and Delta Mush to generate the blendshapes. Before that stage I used shrink-wrap as well, to help fit the neutral pose to the alien face.
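The core idea behind that retargeting step can be sketched simply: once the source neutral has been fitted to the target neutral (the shrink-wrap step above), each source shape's per-vertex delta can be carried over onto the target. Maya's wrap deformer plus Delta Mush do this far more robustly, handling mismatched topology and smoothing artifacts; the version below is the naive point-to-point transfer, and it assumes matching vertex counts purely for illustration.

```python
import numpy as np

def transfer_blendshape(src_neutral, src_shape, dst_neutral):
    """Naive delta transfer: copy each vertex's offset onto the target mesh."""
    delta = src_shape - src_neutral   # per-vertex offset defining the expression
    return dst_neutral + delta        # same offset applied to the target neutral

# Toy example: a "browUp"-style shape lifts every vertex 0.2 units in Y.
src_neutral = np.zeros((3, 3))
src_shape = src_neutral + np.array([0.0, 0.2, 0.0])
dst_neutral = np.ones((3, 3))                      # e.g. the alien's neutral face
alien_shape = transfer_blendshape(src_neutral, src_shape, dst_neutral)
```

Because the transferred shape is just the target neutral plus the same deltas, the unmodified iPhone coefficients can drive the alien exactly as they drive the baby.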
Epppic
Please mention the equipment u use to do this...
Dude your mocap work is incredible but I'm mostly impressed at how stunning this looks! :O Is this all in-engine in realtime?
Thanks man! This video is rendered in V-Ray, but the last short one I posted was rendered in realtime in Unreal, which is shockingly close to V-Ray! Sadly YouTube compresses these things so much that a lot of the sweet details get lost, but it's pretty amazing seeing such high-quality subsurface rendering in Unreal.
Wow dude! The Unreal one actually looks amazing! I can't wait to see more from you dude, you're doing amazing work, especially with the iPhone for face cap. I popped into the animation studio for Star Citizen in the UK and got to check out what they were doing with Faceware. They had pretty much made tracking markers redundant because they were getting such amazing tracking right out of the box without any markers at all, but it's awesome to see that you are managing pretty much the same thing in real time on a damn phone haha! I also had the pleasure of being scanned into the game and had to go through all those blend poses myself. Probably the most unflattering thing I've ever done :D
Also, I may have missed this but how are you tracking your hands in real time with so many bones for full articulation? When you were throwing metal signs I was impressed by how accurate it was.
incredible. is it rendered in some GPU renderer? (redshift or octane?)
V-Ray, though not GPU.
wow!!!!
you should make an app
This is amazing!
Can you tell me the price of XSens Suit?
7k - 12k
Great Work! What helmet are you using for this? I would love to try this out too.
It's a cheap paintball helmet, and not very comfortable, but it gets the job done. Here are the helmet details: uploadvr.com/iphone-xsens-performance-capture-bebylon/
Is this type of thing possible with later iPhone models?
At the moment, the way I'm doing it, just on the iPhone X. Technically though, you can still do facial capture on any mobile with a camera (same idea as Snapchat) using the camera feed versus the depth map created on the iPhone X.
hi ...
amazing work ;]
where can i buy that helmet?
what is that helmet model?
thanks.
Standard Deviation is the helmet I'm using now… it's super awesome
Been about the metaverse
i have the full-body Perception Neuron system and it doesn't work reliably, it is such a bad system
What is that effect on the baby's face at 0:17?
2:30 I'm gonna get medieval probably
How much does the suit cost?
How do you AirDrop the recorded data to the Mac?
The recorded data is stored locally on the iPhone and I just use AirDrop to copy it to my desktop. I could potentially stream the data right to the desktop via wifi but haven't had a chance to dive into that.
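The wifi-streaming idea mentioned above could look something like this: serialize each frame's blendshape coefficients and push them over UDP to the desktop. This is purely a hypothetical sketch, not the actual pipeline; the port number, packet format, and coefficient values are all assumptions (the coefficient names, like "jawOpen", are real ARKit blendshape keys).

```python
import json
import socket

HOST, PORT = "127.0.0.1", 9845  # assumed desktop address; loopback for the demo

def send_frame(sock, timestamp, coeffs):
    """Serialize one capture frame as JSON and fire it over UDP."""
    packet = json.dumps({"t": timestamp, "coeffs": coeffs}).encode("utf-8")
    sock.sendto(packet, (HOST, PORT))

# Desktop side: bind a listener. Phone side: a plain UDP sender.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind((HOST, PORT))
send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

# One ~30 fps frame of (made-up) ARKit coefficients.
send_frame(send, 0.033, {"jawOpen": 0.41, "mouthSmileLeft": 0.12})
data, _ = recv.recvfrom(4096)
frame = json.loads(data.decode("utf-8"))
```

UDP fits this use case because a dropped frame of face weights is harmless at 30–60 fps, whereas TCP's retransmits would add latency to a live preview.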
Cory Strassburger your solution is much better… I'm using the Unity plugin to extract the data from the phone, which doesn't support "wireless"… the phone has to be physically connected to the Mac. But what I need is your solution. Is it possible to share it with me? :) Btw, I'm using Perception Neuron instead of Xsens because it tracks every detail of my finger movements.
Did you ever manage to build a facial rig to animate on top of the ARKit motion capture data? I'm in the same boat and I'm wondering if this is possible. Thanks
Not a rig per se; I've been doing it in a janky way when I need to do little fixes. I'm getting ready to migrate all my characters to full MetaHuman, which will have new challenges.
@@xanaduBlu Good luck lol. Love the work & the new video. Look forward to seeing more.
Please list all of the things u used, including software and equipment... please (mention price if possible)... please anyone reply
Xsens Link suit, iPhone X, Unreal Engine 4.21. Maya was used to make the beby model, and Maya & ZBrush for the blendshapes.
helmet: uploadvr.com/iphone-xsens-performance-capture-bebylon/
@@trickdiggidy can i use the iPhone X + Cinema 4D + Xsens for my character mocap?
I don't know enough about C4D and how it handles blendshapes and importing animation data. I'm quite sure getting Xsens body capture data in should be simple using FBX. @@emekaeffi
@@trickdiggidy please i need your advice on an animation project i'm working on, can i send you a mail?
What is the iphone app called?
This one is not available to the public. Take a look at MocapX app, it streams the data directly from the iPhone to Maya in real-time.
Bro Duck selling the game sell the app your making 😂
Why did he stop making videos
Who?
HAHAHAHAHA
Is this what misanthropy feels like?
Lol so is this brown face? Good job.