Pose Preset Animation for UE Live Link Face Capture

UpL8Rendering Posts: 129
edited November 2020 in Unreal Discussion

I made a Daz Pose Preset Animation for all the poses needed for the Unreal Engine Live Link Face capture.

They are all made from the basic Daz head poses. Some may still need work but it should get you started.

It is up on my GitHub page now. Documentation is in the ReadMe.md file, but I will add it below too.

https://github.com/GNVR-Dev/DazPosePreset-ArKitFacePoses

Instead of using morphs, this method exports the poses as an animation. The animation can be shared between characters while each maintains its unique character shape.

This also allows for animating facial hair, jewelry, or any other item that needs to be attached to a bone that follows the animation.

 

Post edited by UpL8Rendering on

Comments

  • UpL8Rendering Posts: 129
    edited November 2020

    Basic poses for ARKit face capture. Made with default Parameter settings. Saved as a Daz Pose Preset.

    Reference Pose:

    Eyes:

    Jaw:

    Mouth:

    Cheek, Nose, Tongue:

  • UpL8Rendering Posts: 129
    edited November 2020

    Unreal Engine Project Setup:

    In your Project Settings make sure the following Plugins are enabled:

    DazToUnreal

    Apple ARKit and Apple ARKit Face Support

    As well as Live Link

    Click Restart Now to restart your project.

  • UpL8Rendering Posts: 129
    edited November 2020

    In Daz Studio load a base character into the scene.

    Send the character mesh to Unreal by going to:

    File -> Send To -> Daz To Unreal

    Set the Asset Type to Skeletal Mesh

    Uncheck Enable Morphs

    Uncheck Enable Subdivisions

    and click Accept


     

    Using Pose Preset Animation

    Download and extract, or clone, the three files from GitHub into either:

    Daz3DFiles\Presets\Poses

    or

    My Library\Presets\Poses

    Open the ARKitFacePoseNames.txt file

    You will need these names later

    In the Content Library tab, navigate to and double click the ARKitFacePoses preset

    This will pop up a message asking if you want to add frames to the timeline. Click Yes.

    All the poses will be added to the timeline.

    Again, go to File -> Send To -> DazToUnreal

    Change the asset name to something like ARKitFacePoses

    This time set the Asset Type to Animation

    Uncheck Enable Morphs

    Uncheck Enable Subdivisions

    Click Accept

  • UpL8Rendering Posts: 129
    edited November 2020

    Once imported right click on the animation

    Go up to Create and click Create PoseAsset

    Go back to the ARKitFacePoseNames.txt file and copy all of the pose names

    Paste them into the Pose Asset dialog box and click Accept

    Open the Pose Asset

    In the Asset Details Tab under Additive

    Check Additive and set the Base Pose to the ARKit_Ref_Pose

    Click Convert to Additive Pose

    If you have other characters, you can change the animation's Preview Mesh to see what the poses look like on them
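    Conceptually, converting to additive stores each pose as a per-bone delta from the base (reference) pose, so the delta can be layered on top of any character's own pose at runtime. A simplified sketch of the idea (hypothetical names; real pose assets store full per-bone transforms, not the single scalar values used here for clarity):

```python
def make_additive(pose: dict, base_pose: dict) -> dict:
    """Store each bone's value as a delta from the reference pose."""
    return {bone: pose[bone] - base_pose[bone] for bone in pose}

def apply_additive(character_pose: dict, additive: dict, weight: float) -> dict:
    """Layer the weighted delta on top of a character's existing pose,
    preserving that character's unique shape."""
    return {bone: character_pose[bone] + weight * additive.get(bone, 0.0)
            for bone in character_pose}
```

    This is why the same pose animation works across different characters: only the offsets from the reference pose are applied, not absolute bone positions.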

  • UpL8Rendering Posts: 129
    edited November 2020

    Setting Up the Animation Blueprint

    Right click in the Content Browser

    Mouse over Animation and Select Animation Blueprint

    In the Animation Blueprint AnimGraph right click and create a Live Link Pose node:

    Set the Live Link Subject Name to the device you are using for face capture

    Pull out from the output pin and create a Modify Curve node

    Change the Apply Mode to Weighted Moving Average

    Pull off from the Alpha input and promote it to a variable

    Change the name of the variable to WMA_Alpha
    Compile the Blueprint
    and set WMA_Alpha's default value to 0.8
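    The Weighted Moving Average apply mode smooths the jittery per-frame capture values. As a rough illustration of what that smoothing does (an exponential-moving-average-style blend; the exact formula inside UE's Modify Curve node may differ):

```python
def smooth_curve(prev_value: float, new_value: float, alpha: float = 0.8) -> float:
    """Blend the incoming curve sample with the previous smoothed value.

    alpha near 1.0 favors the new sample (responsive but jittery);
    alpha near 0.0 favors the history (smooth but laggy).
    """
    return alpha * new_value + (1.0 - alpha) * prev_value

# Feed a noisy stream of (hypothetical) jawOpen samples through the filter.
samples = [0.0, 1.0, 0.2, 0.9, 0.85]
smoothed = 0.0
for s in samples:
    smoothed = smooth_curve(smoothed, s)
```

    A default of 0.8 keeps the face responsive while knocking down single-frame spikes; lower it if the capture still looks twitchy.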

    Pull off from the Modify Curve node and create a node to Evaluate your face poses Pose Asset

    Connect this node to the Output Pose and Compile to test that this part is working OK

    To add head movement pull a pin out and create a Local To Component Node

    Then create three Transform (Modify) Bone nodes and reconnect to the Output Pose (a Component To Local node will be created automatically)

    On each of the Transform (Modify) Bone nodes set the Rotation Mode to "Add to Existing" and the Rotation Space to "Bone Space"

    On the first Transform (Modify) Bone node set the Bone to Modify as neckLower

    On the second Transform (Modify) Bone node set the Bone to Modify as neckUpper

    On the third Transform (Modify) Bone node set the Bone to Modify as head

    Pull off one of the Rotation pins and create a Make Rotator node

    Connect the rest of the Rotation pins to the Make Rotator node

    Right click on the graph and create a Get Curve Value node

    Enter headPitch into the Curve Name
    add a Multiply node and enter -90.0
    add a Divide node and enter 3.0

    Duplicate these nodes twice and connect to the Y(Pitch) and Z(Yaw) of the Make Rotator Node

    On the second Get Curve Value node change the Curve Name to headYaw

    On the third Get Curve Value node change the Curve Name to headRoll

    Compile and test.
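    The Multiply/Divide chain above maps the incoming head curve value into degrees: -90.0 scales the normalized curve toward a quarter turn (negated to match the bones' rotation direction), and dividing by 3.0 splits the total evenly across the three bones (neckLower, neckUpper, head), so the neck bends naturally instead of hinging at one joint. A sketch of the per-bone value (curve range is an assumption based on typical normalized capture values):

```python
def head_rotation_per_bone(curve_value: float) -> float:
    """Map a normalized ARKit head curve value to degrees for ONE of the
    three neck/head bones; the three bones together sum to curve * -90."""
    return curve_value * -90.0 / 3.0

# A half-strength pitch puts -15 degrees on each of the three bones.
pitch_per_bone = head_rotation_per_bone(0.5)
```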

    If the head movement isn't working, it's probably because the skeleton doesn't have headPitch, headYaw, or headRoll anim curves set up yet.

    In the Skeleton setup of your character go to the Anim Curve tab and search for Head.

    If those curves aren't listed Right Click and "Add Curve"

    Add a curve for:
    HeadPitch
    HeadRoll
    HeadYaw

    Head movement should now be working

  • UpL8Rendering Posts: 129
    edited September 2020

    Placeholder - Just in Case

  • WendyLuvsCatz Posts: 38,312
    edited September 2020

    pity all these facial capture apps need Apple iPhones 😢

    I just absolutely refuse to buy one, or any expensive smartphone for that matter

    honestly, if Apple released a webcam with the same technology I would buy that. I just don't like linking my phone to that sort of data; I even unplug my webcam when not in use

     

  • WendyLuvsCatz said:

    pity all these facial capture apps need Apple iPhones 😢

    I wouldn't be surprised if there was something out there already that you could make work. Apple just makes it easy to use, and the FaceARSample project is already set up to go.

  • UpL8Rendering said:

    I made a Daz Pose Preset Animation for all the poses needed for the Unreal Engine ARKit face capture project.

    Wow, fantastic. :) Ty so much.

  • Wow, fantastic. :) Ty so much.

    You're welcome. I hope it works OK. I'm planning on doing the UE side of the tutorial this weekend.

    dc8989 Posts: 0

    Was wondering if you were planning on doing the UE part of this - I got it working on my own with one weird exception: I can't seem to get the brows to stop mirroring one another...

  • Very interesting, I was working on a similar approach for Unity.

    It is important to match ARKit's control points to Daz for facial animation, whether for motion capture or for real-time remapping.

    Thanks, I will check out your approach.

     

  • UpL8Rendering Posts: 129
    edited November 2020

    Sorry, I didn't realize this thread had gotten a couple more responses.

    I was trying to get this working using the UE FaceARSample project as a starting point, but I eventually gave up and then got distracted by a couple of other projects.

    I just updated the tutorial above so you don't have to use the UE FaceARSample project at all, you can just add this setup to your existing project or template.

    I haven't updated the GitHub ReadMe file with these changes yet.

  • Awesome stuff, thank you for all the documentation :) Is it possible to use this with the Daz to Unreal exporter or do you have to export via fbx?

  • klubbvisuals said:

    Awesome stuff, thank you for all the documentation :) Is it possible to use this with the Daz to Unreal exporter or do you have to export via fbx?

    You're welcome.  Yes, the Daz to Unreal exporter will work great for this.  The steps for using the exporter are included in the documentation.

  • Meeeehhhhhhh...

     

    I'm stuck on the "add a node to evaluate your pose" step.  That option isn't there.  Neither is the Genesis 8 pose that's in your example image :(  I went back, redid everything... and they still aren't showing up :-/  Help? 

  • Hello,

    It may be one of two issues.

    When sending to Unreal are you making sure you are changing the Asset Type to Animation?

    If yes, did you do the step where you right click on the Animation and Create a Pose Asset?

  • Hi! Thank you for your work.

    I have a problem: the avatar always has an open mouth and half-closed eyes. When I smile, she looks like a crying girl. What did I do wrong? Help, please!

    изображение_2021-07-11_001228.png
  • Her face looks like this: https://imgur.com/asiIbV6

     

  • Great work! Does this work for G8.1 BTW?

     

     
