
Expression Deformation #24

Open
royrs opened this issue Feb 20, 2025 · 2 comments

Comments

@royrs

royrs commented Feb 20, 2025

In the paper you describe a mapping from the implicit keypoints to the control of "mesh movement", such as opening the mouth or eyes.

Is there any documentation about this mapping?
For example, in the code you use points 33, 34, 45, and 56 to control the gaze. Would you mind sharing some details about the mapping that you found?
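For context, the kind of per-index control being asked about could be sketched as follows. This is a hypothetical illustration, not the repository's actual code: the delta shape `[1, num_kp, 3]`, the keypoint count, the scale factor, and the `apply_gaze` helper are all assumptions modeled loosely on LivePortrait-style keypoint offsets; only the indices 33, 34, 45, 56 come from the comment above.

```python
import numpy as np

NUM_KP = 63  # assumed implicit-keypoint count (not confirmed by the repo)

def apply_gaze(delta, direction_x, gaze_indices=(33, 34, 45, 56), scale=0.001):
    """Nudge the x-coordinate of selected keypoint deltas by a gaze offset.

    `delta` has the assumed shape [1, num_kp, 3]; `scale` is a made-up
    magnitude chosen only to keep offsets small, as in LivePortrait.
    """
    delta = delta.copy()
    for idx in gaze_indices:
        delta[0, idx, 0] += direction_x * scale
    return delta

delta = np.zeros((1, NUM_KP, 3), dtype=np.float32)
out = apply_gaze(delta, direction_x=5.0)
```

The point of the sketch is simply that each "regional control" maps a scalar (gaze direction, blink amount) onto additive offsets at a handful of fixed keypoint indices.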

@digital-avatar
Collaborator

This mapping is inspired by LivePortrait's "regional control". You can refer to it here. More detailed mapping relationships can be explored by building on it.

@royrs
Author

royrs commented Feb 23, 2025

Thank you.
I noticed some differences between what you do and what LivePortrait does.

Looking at the blinking function, it seems to be based on `delta_eye_arr`, which is loaded from the config file.
If I understand correctly, you pre-calculated 15 deformation vectors, each representing a different stage of the blinking process.
Is there a reason you implemented it this way? In LivePortrait, the following specific indices are adjusted relatively:

```python
blink = -eyeball_direction_y / 2.

delta_new[0, 11, 1] += blink * -0.001
delta_new[0, 13, 1] += blink * 0.0003
delta_new[0, 15, 1] += blink * -0.001
delta_new[0, 16, 1] += blink * 0.0003
```
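For comparison, the pre-computed-table approach described above could be sketched like this. This is only a guess at the mechanism: the name `delta_eye_arr` and the count of 15 blink stages come from the comment above, but the array shape, the random stand-in data, and the linear blending between adjacent stages are assumptions.

```python
import numpy as np

NUM_KP = 63      # assumed implicit-keypoint count
NUM_STAGES = 15  # 15 pre-calculated blink stages, per the comment above

# Stand-in for the delta_eye_arr loaded from the config file:
# one full keypoint-delta per blink stage, assumed shape [15, num_kp, 3].
rng = np.random.default_rng(0)
delta_eye_arr = (rng.standard_normal((NUM_STAGES, NUM_KP, 3)) * 0.001).astype(np.float32)

def blink_delta(openness):
    """Look up (and linearly blend) the pre-computed deformation for a
    blink amount in [0, 1], where 0 = fully open and 1 = fully closed."""
    pos = openness * (NUM_STAGES - 1)
    lo = int(np.floor(pos))
    hi = min(lo + 1, NUM_STAGES - 1)
    w = pos - lo
    return (1.0 - w) * delta_eye_arr[lo] + w * delta_eye_arr[hi]

kp_delta = blink_delta(0.5)  # deformation halfway through the blink
```

The trade-off the question is getting at: a lookup table can encode an arbitrary, artist- or data-tuned trajectory for the whole eye region, whereas LivePortrait's per-index offsets are a fixed linear rule over four coordinates.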
