Hello, I'd like to add new emoticons.
Looking at the Python scripts, there seems to be only one file for this: GestureLib.py. Into it I've added
emotions = [["happy", ":)"], ["sad", ":("], ["angry", ";("], ["surprised", ":@"], ["scared", ":%"], ["disgust", ":$"], ["contempt", ":^"]]
and on line 718:
emotionOutputCurves = ["happy", "sad", "angry", "surprised", "scared", "disgust", "contempt"]
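For reference, here is a quick sanity check in plain Python (not a FaceFX API) using the two lists above, confirming that every curve name in emotionOutputCurves matches an entry in emotions after the edit:

```python
# The two lists edited in GestureLib.py, copied from above.
emotions = [["happy", ":)"], ["sad", ":("], ["angry", ";("],
            ["surprised", ":@"], ["scared", ":%"], ["disgust", ":$"],
            ["contempt", ":^"]]
emotionOutputCurves = ["happy", "sad", "angry", "surprised",
                       "scared", "disgust", "contempt"]

# Every output curve name should correspond to one emotion entry,
# in the same order, or the two edits have drifted out of sync.
emotion_names = [name for name, tag in emotions]
assert emotionOutputCurves == emotion_names, "lists are out of sync"
```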
I've also edited the batch file, source fbx, and dialog lines to reflect these. However, after re-analyzing, the Studio still only recognizes the initial four emoticons from the dialog. Is there something I'm missing?
Edit1: After digging around a bit more, perhaps I'm missing an edit on the Analysis Actor? However, I'm having trouble saving this actor; it returns the error: "FaceFX Runtime Plugin: error: the actor temp does not contain any tracks".
An important thing to note is that you don't have to use the Analysis Actor system to use emoticons. You can create any emoticon you want from any combination of two or more of the allowed punctuation characters (https://facefx.com/documentation/2017/W230). The result is an "event" in your animation whose duration equals the distance between the emoticon tags (or a duration of 1 if there is no text between the opening and closing tags). If you double-click on this event, you can create the animation that will be driven by it.
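To picture the tag-pair rule, here is a plain-Python sketch of the idea, not FaceFX's actual implementation; the function name and the word-count measure of "distance" are illustrative assumptions:

```python
def emoticon_event_span(text, tag):
    """Find the opening/closing pair of an emoticon tag in dialog text
    and return the text between them, or None if no pair exists."""
    start = text.find(tag)
    if start == -1:
        return None
    end = text.find(tag, start + len(tag))
    if end == -1:
        return None
    return text[start + len(tag):end]

inner = emoticon_event_span("I am :) really quite happy :) today", ":)")
# The event's duration corresponds to the span between the tags;
# an empty span (adjacent tags) falls back to a duration of 1.
duration = max(1, len(inner.split())) if inner is not None else 0
```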
What the Analysis Actor system in GestureLib does is convert these events with duration into curves with friendly names like "angry". As the event gets shorter, the magnitude of the output curve gets smaller, so it is impossible to get an "angry" spike that ramps up unrealistically fast. This setup is stored in the Analysis Actor so you aren't polluting your primary actor with unnecessary animation groups and events; you just get a friendly output curve called "angry".
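That scaling behavior can be pictured with a toy function; this is an illustration of the concept only, since the actual curve shape lives in the Analysis Actor, and ramp_time here is a made-up parameter:

```python
def curve_magnitude(event_duration, ramp_time=0.5):
    """Toy model: the output curve needs ramp_time seconds to ramp up
    and ramp_time seconds to ramp back down, so very short events
    never reach full magnitude."""
    if event_duration <= 0.0:
        return 0.0
    return min(1.0, event_duration / (2.0 * ramp_time))

# A 2-second "angry" event reaches full strength...
full = curve_magnitude(2.0)   # 1.0
# ...while a 0.2-second event only peaks at 0.2.
spike = curve_magnitude(0.2)  # 0.2
```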
If you modify GestureLib to create new curves, then start a fresh actor and run the GestureLib script, you will create a new Analysis Actor. To use it, you will need to save it in your Analysis Actors folder and select it when you generate an animation (by default the Default.facefx analysis actor file is used, and that one only has the initial four emoticons).
The error you are getting on save from the Runtime plugin means there are no "tracks" in the actor because there are no bone poses or morph targets to drive, i.e. this would not be a valid actor to drive in your game. You can ignore it: this actor does not need to be compiled by the runtime compiler, since it will only be used by FaceFX Studio to generate curves.
Thanks for your detailed reply, Doug!
Following your suggestion, I created a new Analysis Actor by executing the GestureLib script, and it worked. Since we prefer as little hands-on work as possible on our project, this worked out great.
Thanks so much! :)