Facial blendshapes not driven after applying Meta face tracking scripts to GLB imported via UnityGLTF #852

@EfimBash

Description

Describe the bug 💬

I'm encountering an issue when instantiating a .glb file using the UnityGLTF loader from KhronosGroup. After loading the model, I apply Meta's face tracking setup script using:

AddComponentsHelper.SetUpCharacterForA2EARKitFace(gameObject, false, true);

The script components are correctly added to the avatar, but the face tracking does not animate or affect the blendshapes of the model. The blendshapes remain static, even though the tracking pipeline appears to be active.
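Since the tracking pipeline looks active but the mesh never deforms, one way to narrow this down is to compare what the two importers actually produce. The sketch below (a diagnostic I wrote for this report, not part of either SDK; the class name `BlendshapeDump` is made up) logs every SkinnedMeshRenderer and its blendshape names, so the UnityGLTF and GLTFast imports can be diffed against the names Meta's ARKit mapping expects:

```csharp
using UnityEngine;

// Hypothetical diagnostic: attach to the instantiated avatar root and compare
// the logged blendshape names between a UnityGLTF and a GLTFast import.
public class BlendshapeDump : MonoBehaviour
{
    void Start()
    {
        foreach (var smr in GetComponentsInChildren<SkinnedMeshRenderer>())
        {
            var mesh = smr.sharedMesh;
            Debug.Log($"{smr.name}: {mesh.blendShapeCount} blendshapes");
            for (int i = 0; i < mesh.blendShapeCount; i++)
                Debug.Log($"  [{i}] {mesh.GetBlendShapeName(i)}");
        }
    }
}
```

If the two importers report different blendshape names (e.g. a different prefix), a name-based mapping in the setup script could silently bind to nothing.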

I tested the exact same process using GLTFast instead of UnityGLTF, and in that case, everything worked as expected — the facial expressions were correctly applied through the blendshapes. However, due to some package constraints, I prefer using UnityGLTF.

Steps to reproduce 🔢

  1. Instantiate a .glb avatar using UnityGLTF.
  2. Apply SetUpCharacterForA2EARKitFace(...) from Meta's Movement SDK.
  3. Run the scene with face tracking active (an OVRCameraRig from Meta's All-In-One SDK, with "Record Audio for audio based Face Tracking" enabled), or use a script that modifies the model's blendshapes after instantiation.
  4. Build and run the app on a Quest 1, 2, or 3 device.
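The scripted variant of step 3 can be sketched as follows (a minimal repro, with assumed names, attached to the instantiated avatar). It drives the first blendshape directly through Unity's API, bypassing face tracking entirely, to test whether the imported mesh responds at all:

```csharp
using UnityEngine;

// Minimal repro sketch: oscillates the first blendshape weight between 0 and
// 100 every frame. If the mesh still does not deform, the problem lies in the
// imported mesh data itself rather than in Meta's tracking scripts.
public class BlendshapeWiggle : MonoBehaviour
{
    SkinnedMeshRenderer smr;

    void Start()
    {
        smr = GetComponentInChildren<SkinnedMeshRenderer>();
    }

    void Update()
    {
        if (smr == null || smr.sharedMesh.blendShapeCount == 0) return;
        float w = (Mathf.Sin(Time.time * 2f) * 0.5f + 0.5f) * 100f;
        smr.SetBlendShapeWeight(0, w);
    }
}
```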

Observe that no blendshape animation occurs.

Files to reproduce the issue ♻

ModelForTesting.zip

Editor Version 🎲

6000.0

Render Pipeline and version

URP 17.0.4

UnityGLTF Version

2.14.1

Operating System 👩‍💻

Windows

When does this problem happen?

  • Editor Import
  • Runtime Import
  • Editor Export
  • Runtime Export

Additional Info 📜

No response
