FaceFX with Body Gestures

I'm currently working on a project that requires a significant amount of dialog across multiple characters. To keep that manageable, I'm trying to integrate some hand/arm gestures with the lip-sync animations so each line of dialog comes out more dynamic and varied. The results I'm getting are kind of stiff and robotic at the moment, and I haven't been able to find any good resources or examples on how to really make this work smoothly.

I'm using the BodyGestures analysis actor and tying "activateArmGestures", "gestureBump", and a few of the other curves to the characters' arm/hand motions. This is still a relatively new piece of software for me, and I'm sure it's just my lack of understanding that's giving me these unnatural-looking results.
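In case it's easier to read than a screenshot, here is roughly what my current links boil down to, written out as throwaway Python rather than an actual FaceFX graph. The curve names come from the BodyGestures analysis actor; the combiner and the scale value are just placeholders for what I have wired up:

    # Rough stand-in for my current face graph links, not real FaceFX code.
    # Each analysis curve is a 0..1 value per frame; I scale and sum them into
    # a single bone-pose weight, which is what ends up looking stiff.

    def arm_pose_weight(activate_arm_gestures, gesture_bump, bump_scale=0.5):
        """Placeholder combiner: activation opens the gate, the bump adds motion."""
        activation = max(0.0, min(1.0, activate_arm_gestures))
        bump = max(0.0, min(1.0, gesture_bump))
        return min(1.0, activation + bump * bump_scale)

    print(arm_pose_weight(0.8, 0.4))  # 1.0 -> the arm sits pinned at the full pose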

Does anyone have experience with hooking up dynamic hand/body motions in FaceFX and, if so, how are you setting up your face graph and analysis actor to drive them?

I've included a screenshot of my current face graph, which is driven by the BodyGestures analysis actor that comes with FaceFX by default. It's a pretty simple setup for the moment, and I would like to build a more complex graph in the future for more dynamic results. Of course, getting the simplest parts working first is my top priority.

If anyone has any thoughts, suggestions, critiques, or questions - I would greatly appreciate it.

Thanks!

Hi Alex,

It's a tough problem, creating body gestures from speech. I gave it my best shot with the BodyGestures analysis actor, and I'd rank the results somewhere above no animation at all, but well short of the believability we can achieve with the head and facial gestures. Body gestures have a lot of degrees of freedom, and you can deliver a lot more meaning with your arms and hands than you can with your neck bone and eyebrows, so it's tough to keep up the illusion of intelligence for long.

You can try to avoid the problem by limiting the degrees of freedom your character's hands have. Crossing their arms, for example, eliminates the problem entirely. In one early example, I made a pose where each fingertip touched the fingertip of the opposite hand and gave the character a pensive look. A simple wrist rotation of one hand, driven by a curve like gestureBump, was all that was needed to bring a bit of movement to the character. While this early example was encouraging, I wanted to avoid "self-touch" poses because they were difficult to transfer to another character without adjusting them. But perhaps something like that will work for your application.
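If it helps to picture it, that wrist link amounted to nothing more than the following sketch; the angles and names here are illustrative, not the actual values from the example:

    # Illustrative only: gestureBump, a 0..1 analysis curve, blends one wrist
    # between the pensive pose and a slightly rotated variant of it.

    def wrist_angle(gesture_bump, pensive_deg=0.0, rotated_deg=12.0):
        t = max(0.0, min(1.0, gesture_bump))
        return pensive_deg + t * (rotated_deg - pensive_deg)

    print(wrist_angle(0.3))  # 3.6 degrees, just enough movement to read as alive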

When I tried to provide more degrees of freedom with the BodyGestures analysis actor, it was difficult to "scale up" the simple case without creating a very complex event and face graph setup. Moreover, the more degrees of freedom you give the character's hands, the more meaningful the viewer expects the gestures to be.

It's still on my todo list to take another stab at body gestures using mocapped or hand-made animation clips, driven by a combination of analysis curves and an analysis text preprocessor (I think a good system would need to be language-specific, taking advantage of grammar to figure out the flow of a sentence and using that to drive the gestures in many cases). It would also require an animation system like Unity's: one that supports animations affecting only a subset of the bones in the skeleton, playing multiple copies of the same animation simultaneously, and clamping so the skeleton stays where the animation leaves it. Ogre's animation system does not have these features.
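To make those three requirements concrete, here is a very rough sketch of the kind of gesture layer I have in mind. None of this is real engine code; the class and method names are made up for illustration:

    # Sketch only: a gesture layer that (1) writes to a subset of bones,
    # (2) allows multiple simultaneous copies of the same clip, and
    # (3) clamps at the last frame instead of snapping back to the start.

    class ClipInstance:
        def __init__(self, clip_name, bone_mask, clamp=True):
            self.clip_name = clip_name        # which animation to sample
            self.bone_mask = set(bone_mask)   # only these bones are written to
            self.clamp = clamp                # hold the final pose when finished
            self.time = 0.0

    class GestureLayer:
        def __init__(self):
            self.instances = []

        def play(self, clip_name, bone_mask):
            # The same clip can be started any number of times on different bones.
            self.instances.append(ClipInstance(clip_name, bone_mask))
            return self.instances[-1]

        def update(self, dt, clip_lengths):
            for inst in self.instances:
                length = clip_lengths[inst.clip_name]
                if inst.clamp:
                    inst.time = min(inst.time + dt, length)
                else:
                    inst.time = (inst.time + dt) % length
                # Sampling and blending would happen here, restricted to
                # inst.bone_mask so the rest of the skeleton is untouched.

    # Example: the same flourish on each hand, independently.
    layer = GestureLayer()
    layer.play("hand_flourish", bone_mask=["r_wrist", "r_fingers"])
    layer.play("hand_flourish", bone_mask=["l_wrist", "l_fingers"])
    layer.update(0.033, {"hand_flourish": 1.2})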

That project would be a significant amount of work, however, so it's not something I would encourage you to take on while you have other commitments on your project. I'd really enjoy hearing and seeing how other people are approaching this problem, so if you do take it on, please keep me in the loop. My recommendation, though, would be to find a way to avoid the problem, because solving it in the general case is very difficult.

I hope that gives you a good idea where body gestures in FaceFX stand.

Doug

Thanks, Doug!

I definitely agree with you on limiting the degree to which the hands and arms can actually move. In my initial tests, setting up key poses that move the hands from the character's side up to almost chest height resulted in animations that I can only describe as "spastic" looking. Tweaking the face graph a bit more to dampen the values helped quite a bit, but I've also been working on limiting the range of movement required by changing the character's idle (neutral) position.

One thought along these lines is to use a curve like "activateArmGestures" to move a hand/arm into an initial position and then (like you mentioned) use values from "gestureBump" to create smaller motions that seem a bit more natural. Curves that ramp up to a constant 1.0 value can end up making the character seem very stiff, though, so planning a natural-looking initial pose for the smaller gestures to originate from is key. It would be great to get some really nice procedurally generated full-body gestures, but I think limiting the range of motion is the direction I'm going to head in for now.
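Here is a rough sketch of the blend I'm describing, again as throwaway Python rather than anything out of FaceFX; the angles, range, and damping value are placeholders:

    # Sketch of the idea: activateArmGestures eases the arm from idle into a
    # "ready" pose, and gestureBump then adds small, damped offsets around that
    # pose instead of driving the full range of motion.

    def arm_angle(activate_arm_gestures, gesture_bump,
                  idle_deg=0.0, ready_deg=35.0, bump_range_deg=6.0, damping=0.6):
        activation = max(0.0, min(1.0, activate_arm_gestures))
        base = idle_deg + activation * (ready_deg - idle_deg)
        # Small motion layered on top, scaled by activation so nothing moves
        # until the arm has actually been brought into position.
        return base + gesture_bump * damping * bump_range_deg * activation

    print(arm_angle(1.0, 0.5))  # the ready pose plus a subtle bump (36.8 degrees)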
I'd still like to hear from other people on this forum about their experiences with this. The timeframe for my current project is pretty tight, so I probably won't be able to devote as much time as I would like to figuring out a more complex solution. Nonetheless, I'll be working on it for a little while longer, and I'll be sure to post my findings here for anyone else who may be a bit confused about setting up body gestures.