
Face moving but not properly calibrated/synced #6

@iPsych

Description


It's amazing work!

I found that Live Link Face with MetaHuman-generated data doesn't sync or calibrate properly.
Below is the expression captured from a video of a smiling person; MediaPipe tracked it quite accurately.

Capture_3d

The MetaHuman's response to the MeFaMo-transferred data:
Capture_unreal

Is there any parameter or step I should check or improve?
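In case it helps narrow things down, here is a minimal sketch of the kind of per-blendshape calibration I had in mind: capture a neutral pose, subtract it as a baseline, and amplify the residual before sending values over Live Link. All names here (`calibrate`, `neutral`, `gain`) are hypothetical for illustration, not MeFaMo's actual API:

```python
# Hypothetical sketch: rescaling raw MediaPipe-derived blendshape values
# against a captured neutral pose before sending them over Live Link.
# The function and parameter names are illustrative, not MeFaMo's API.

def calibrate(raw: dict, neutral: dict, gain: float = 1.5) -> dict:
    """Subtract the neutral-pose baseline, amplify, and clamp to [0, 1]."""
    out = {}
    for name, value in raw.items():
        v = (value - neutral.get(name, 0.0)) * gain
        out[name] = min(max(v, 0.0), 1.0)  # ARKit blendshapes live in [0, 1]
    return out

# Example: a smile that the tracker reports weakly gets boosted.
neutral = {"mouthSmileLeft": 0.05, "mouthSmileRight": 0.05}
raw = {"mouthSmileLeft": 0.35, "mouthSmileRight": 0.40}
print(calibrate(raw, neutral))
```

If something like this already happens inside MeFaMo, a pointer to the relevant parameter would be great.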

In your demo video, your smile is quite well-synced.
https://www.reddit.com/r/unrealengine/comments/r8wbe3/my_livelink_facetracking_without_an_apple_device/

Does freshly exported MetaHuman data need Blueprint modification following the link you mentioned?
https://docs.unrealengine.com/4.27/en-US/AnimatingObjects/SkeletalMeshAnimation/FacialRecordingiPhone/

Another strange example: just an image as input, and the result.
Capture_2_Input
Capture_2_Unreal
