In the paper you describe a mapping from the implicit keypoints to "mesh movement" controls, such as opening the mouth or eyes.
Is there some documentation about this mapping?
For example, in the code you use points 33,34,45,56 to control the gaze. Do you mind sharing some details about the mapping that you found?
This mapping is inspired by LivePortrait's "regional control". You can refer to it here. More detailed mappings can be explored by building on it.
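As a rough illustration of the idea (the keypoint count, the use of indices 33/34/45/56, and the scale below are placeholders for illustration, not the exact values used in the code): the expression delta is an array of per-keypoint offsets, and a region such as the eyes is steered by nudging only the indices mapped to that region.

```python
import numpy as np

# Illustrative sketch of "regional control": nudge only the implicit
# keypoints mapped to a region (here the eye region) inside the
# per-keypoint expression delta. Indices and scale are assumptions.
GAZE_KP_INDICES = [33, 34, 45, 56]

def apply_gaze_offset(exp_delta: np.ndarray, dx: float, dy: float,
                      scale: float = 0.001) -> np.ndarray:
    """exp_delta: (num_kp, 3) implicit-keypoint offsets; returns a shifted copy."""
    out = exp_delta.copy()
    for idx in GAZE_KP_INDICES:
        out[idx, 0] += dx * scale  # horizontal shift -> look left/right
        out[idx, 1] += dy * scale  # vertical shift   -> look up/down
    return out

# Toy usage with an assumed 64-keypoint model
exp_delta = np.zeros((64, 3), dtype=np.float32)
print(apply_gaze_offset(exp_delta, dx=10.0, dy=-5.0)[GAZE_KP_INDICES])
```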
Thank you.
I noticed some differences between what you and LivePortrait do.
Looking at the blinking function, it seems to be based on `delta_eye_arr`, which is loaded from the config file.
If I understand correctly, you pre-calculated 15 deformation vectors, each representing a different stage of the blinking process.
Is there a reason you implemented it like this? In LivePortrait they adjust specific keypoint indices relative to the current values, e.g.:
`blink = -eyeball_direction_y / 2.`
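For concreteness, here is a rough sketch of the two approaches as I understand them. The array shapes, keypoint indices, and coefficients are my guesses for illustration only; apart from the `blink` line quoted above, none of this is the actual code of either project.

```python
import numpy as np

# (a) Table-based blink: interpolate between 15 pre-computed eye deformation
# vectors (delta_eye_arr) using a blink phase in [0, 1].
def blink_from_table(delta_eye_arr: np.ndarray, phase: float) -> np.ndarray:
    """delta_eye_arr: (15, num_kp, 3); returns the interpolated eye delta."""
    pos = phase * (delta_eye_arr.shape[0] - 1)
    lo, hi = int(np.floor(pos)), int(np.ceil(pos))
    w = pos - lo
    return (1.0 - w) * delta_eye_arr[lo] + w * delta_eye_arr[hi]

# (b) LivePortrait-style relative adjustment: derive a blink amount from the
# vertical gaze control and add small offsets to a few eyelid keypoints.
def blink_relative(delta_new: np.ndarray, eyeball_direction_y: float) -> np.ndarray:
    """delta_new: (num_kp, 3); indices/coefficients below are illustrative."""
    out = delta_new.copy()
    blink = -eyeball_direction_y / 2.  # line quoted above
    for idx, coeff in [(11, -0.001), (13, 0.0003), (15, -0.001), (16, 0.0003)]:
        out[idx, 1] += blink * coeff   # vertical nudge on eyelid keypoints
    return out

# Toy usage with an assumed 21-keypoint model
delta_eye_arr = np.zeros((15, 21, 3), dtype=np.float32)
print(blink_from_table(delta_eye_arr, phase=0.5).shape)                # (21, 3)
print(blink_relative(np.zeros((21, 3)), eyeball_direction_y=8.0)[11])
```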