
Breakout test project 'face tracking' #178

Closed
wekitecs opened this issue Jan 7, 2021 · 1 comment
wekitecs commented Jan 7, 2021

In GitLab by @fwild on Jan 7, 2021, 12:15

  • download an example vocal tract OBJ for a specific phoneme (e.g. "P", "TH", "A") from https://www.vocaltractlab.de/index.php?page=vocaltractlab-download
  • use face tracking (AR Foundation?) to detect the position of the user's face
  • overlay a 3D face model (as in Lens Studio), with the correct lip-sync pose, on the detected face position
  • change the lip colour to blue :)
  • show a side view of the OBJ, as seen from above (rotated by 90 degrees), beside the head
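The first and last steps above can be sketched independently of the AR stack: parse the vertex positions out of the downloaded OBJ, then drop one axis to get the top-down side view. A minimal sketch in Python (the sample OBJ data here is hypothetical, not a real VocalTractLab export):

```python
# Sketch: extract vertices from a Wavefront OBJ and project them
# to a top-down view by dropping the vertical (y) axis.

def load_obj_vertices(lines):
    """Collect (x, y, z) tuples from OBJ 'v' lines; ignore faces, normals, etc."""
    verts = []
    for line in lines:
        parts = line.split()
        if parts and parts[0] == "v":
            verts.append(tuple(float(c) for c in parts[1:4]))
    return verts

def top_down_projection(verts):
    """View the mesh from above: keep x and z, discard height y."""
    return [(x, z) for (x, y, z) in verts]

# Hypothetical two-vertex OBJ fragment for illustration only.
obj = """\
v 0.0 1.0 2.0
v 1.5 0.5 -0.5
f 1 2 1
"""
verts = load_obj_vertices(obj.splitlines())
print(top_down_projection(verts))  # [(0.0, 2.0), (1.5, -0.5)]
```

The projected 2D points could then be drawn beside the tracked head as a flat silhouette; a real implementation in Unity would instead load the OBJ as a mesh asset and render it with a second camera.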
@wekitecs (Author)

In GitLab by @Wild on Sep 10, 2021, 12:40

MediaPipe and the MediaPipe Unity plugin seem to support this cross-platform. AR Foundation also offers this, but not yet on all platforms.

@fwild fwild added this to the More flexible calibration milestone Jun 8, 2022
@fwild fwild closed this as completed Jun 8, 2022