Handpose: Fully Articulated Hand Tracking
- Added Oct 2, 2014
- Introducing a new real-time articulated hand tracker which can enable new possibilities for human-computer interaction (HCI). Our system accurately reconstructs complex hand poses across a variety of subjects using only a single depth camera. It also allows for a high degree of robustness, continually recovering from tracking failures. However, the most unique aspect of our tracker is its flexibility in terms of camera placement and operating range.
Please note, we are using a standard Xbox One Kinect without any hardware modifications. The sunglasses are optional and were originally used for anonymity, and partly tongue-in-cheek.
research.microsoft.com - Science & Technology
Wow that's awesome, you know what isn't awesome? You guys keeping this to yourself all these years and still not having it in the SDK.
kinect just got deprecated and you never release this, please release code so we can try to create wrapper for unity
That's impressive. What is even more impressive is the fact that you guys have the best end-user test environment in the form of the XBOX, and your utter inability to communicate between divisions.
Over a year and this isn't in the SDK yet?
gar-bage
Over 6 years this isn't in the SDK yet :(
That's really cool. Maybe we'll see it in a commercial version of the Kinect one day...
THIS IS SO FREAKING AMAZING!
Do I have to wear sunglasses when using this method??
Looks fast and robust. However, the fast hand movements make it difficult to judge the amount of noise in tracking. Also, it will be interesting to know the CPU & memory requirements of the tracker (to see if it is actually usable on a laptop, game console, etc.)
Come on! Release this!!!! this is what msfs2020 needs!
Is there a publicly available API for this?
Please God integrate this with Oculus Rift. Having Digital in-game arms/hands would make immersion beyond belief.
Nice idea, but I think the latency will be too high. Even taking into account the lag of the display, you can see when he uses it in front of the monitor that there is a fair amount of lag, too much for VR.
napalmhardcore yea, um, this is new tech. When Oculus first started, the latency was over 125ms. Now it's about 20. It'll improve.
@@Unkn0wnC0mmand its now 2020, i want to see this kind of tech NOWWW
I so wanna work for Microsoft Research, they have the best lab among all tech companies!
1:08 he gives himself a pat on the back. alpha as fuck
This is what we were waiting for... !!
Is there open-source software for this tracking? Maybe a future release under the SDK? I have some coworkers interested in this technology with their Kinects.
Is this something that will be released in an upcoming sdk update?
Wow, great advancement.
Looks like the user's hands are much closer than the prescribed "minimum distance" of ~1.8m. Is this because that distance is required only in the case of needing the user's entire body to be within the camera's viewing frustum? If so, (assuming only the hands and/or face need to be captured,) what is the *actual* minimum distance?
When will this be available in the Kinect SDK?
When will this be integrated with Windows MR immersive headsets? Facebook's Oculus and the HTC Vive already have this functionality with RGB cameras on consumer devices, and from this video, this seems to have been functioning great in 2014 already, yet we still don't have it on the Windows MR platform (excluding HoloLens 2). Any plans on that?
Best work... is there any way I can have the source code with a full demonstration, for study only?
What if the hand is flat on a table or resting on the lap whilst sitting, how fast will it be recognized when it moves?
Is it an appearance-based or a model-based approach? Or maybe a mix of both approaches?
Where can I get the code to use it?
Interesting technology. Is the system really bright or something? Why is everyone wearing sunglasses indoors?
it's kinect....
> _Why is everyone wearing sunglasses indoors?_
Because they are hackers! (from cheesy 90s movies)
Seriously now, probably in order to improve the tracking, maybe eyes get mistaken for finger parts?
Because they changed the IR laser to a much stronger one and are worried about their eyes.
Shauli Zak Yes, this is the real reason. The normal Kinect will definitely not work like this. The IR must be pretty bright.
Hilarious how everyone is wearing the same sunglasses indoors
Oculus Rift + THIS = Awesome!
So this is like way way superior to Leap Motion. Right?
They didn't show how the system reacts when a hand is hidden behind the body, or whether it can track the four hands of two people (some of which may hide behind a body from time to time as well).
This+Oculus+UnrealEngine4=Epic Win!
Cool. But what's with the sunglasses?
So, does this have a release date? There is absolutely no information anywhere online as to when this software will be integrated into the Kinect's existing API, or if it ever will be... Is it already in there and I missed it?
Alex M it will be integrated into the hololens
There are many potential applications that devs could make from this that don't require a VR helmet. I hope MS includes it in the Kinect4W API also.
If I duct tape a kinect to my google cardboard
will I have a WMR HMD?
You have to wear the glasses? Or is everyone there a wannabe Tom Cruise?
The testers had to be anonymous, and so they decided to wear glasses, since they didn't want to blur out the eyes.
tdreamgmail A bag with two eye holes was the other option.
wow, is this in the SDK?
Very good indeed
Is this available?
Why the aviator sunglasses?
Is this possible with the xbox 360 kinect sensor?
What's up with 3D tracking and molecule building, always? I have seen this from okreylos and others before. Is this such an important field? I would consider brain surgery more important (and funnily enough I saw a presentation of a brain-surgery learning app [for tablets] at a recent conference). ;)
can't wait to use this with oculus its gonna be mental
can i use this for vr ?
Why do all the users have the same glasses?
All them glasses augmented reality?
Impressive.
I need this sooo badly.... make this export FBX files---- and Profit?
master hand's body!!!!! omg the internet is breaking.
Can this work on kinect v1
What's with the glasses?
Was there a paper?
Why are they all wearing the same sunglasses?
This would be SOOOO useful for animators - anyone has a link to built software like this? All I need is .bvh motions of hands for my project pwease
Amazing =)
why is this not in kinect
why are they all wearing sunglasses inside?
i want a scultping software using this
i want source!!!!!!!!!!!!
source code please
How tolerant to partial digit occlusion is this approach? The video showcases some "easy" cases, but the challenging cases are fists and grasping motions, especially away from the camera.
Finally I can give a finger to all the Kinect haters out there!
NSA will be happy.
Now nsa can see when i give them the finger :D
Now i know where the boss hand came from on super smash bros. 64 lol
"tested on a wide range of adult male and female subjects"
I counted seven males and one female.
lol now there is guys making boneworks saying they are the first to do this
Hand Language to Text anyone?
Where I come from we call that 'typing'
neatodd
OOOOOOOOOOOOOOH YEEEA... I forgot about actually having hands... I guess it'd be more useful as "Hand Language to Automated Voice" because it might be faster.
Already done nearly a year ago now...
do u mean sign language??
Latency latency latency!
Are the aviators a required component of the device's operation? If so, I don't care. They're bitchin'.
Bye bye privacy. The NSA is watching Xbox One/Kinect users.
Ah Kinect... so many possibilities, such lousy platform management.
It all ended up a total waste of time, as they never released an SDK or anything workable. Makes me wonder if it wasn't all fake.