512 vs 224 model #451

Open
x4080 opened this issue Oct 22, 2023 · 1 comment
x4080 commented Oct 22, 2023

Hi, I tried both models and the results from the 512 one are pretty bad. Has anyone gotten good results with it? Or has anybody successfully trained the 512 model and is willing to share? Thanks

@G-force78

I've tried a few different videos and found that the face in the image/video should be smaller than about 400 px, since the entire masked area is 512 px. Otherwise the forehead and hair can end up included in the transplant, which makes the end result look bad.
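For anyone who wants to pre-check their source material, here is a minimal sketch of that size check. It assumes OpenCV and its bundled Haar cascade; the 400 px threshold is just the rule of thumb from above, and the file name is made up:

```python
import cv2

# Frontal-face Haar cascade ships with opencv-python
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_small_enough(frame_bgr, max_side=400):
    """True if every detected face fits comfortably inside a 512 px crop."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # Each detection is (x, y, w, h); flag frames where the face is too large
    return all(max(w, h) <= max_side for (_, _, w, h) in faces)

frame = cv2.imread("frame.jpg")  # illustrative path
if frame is not None and not face_small_enough(frame):
    print("Face larger than ~400 px; consider downscaling the source before swapping.")
```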

It won't correct the blending of skin tones and colours. Recent apps like facefusion handle that much better, presumably through some sort of blending step and mask erosion.
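A rough sketch of the kind of mask erosion plus feathered blending being described, using plain OpenCV; the file names, kernel size and blur size are illustrative only, not anything taken from this repo or from facefusion:

```python
import cv2
import numpy as np

swapped = cv2.imread("swapped_512.png")       # raw 512x512 output of the swap
original = cv2.imread("target_crop.png")      # aligned target crop, same size
mask = cv2.imread("face_mask.png", cv2.IMREAD_GRAYSCALE)  # white where the face is
assert swapped is not None and original is not None and mask is not None

# Erode the mask so the forehead/hairline border is excluded from the paste,
# then feather it so the transition is gradual instead of a hard edge.
kernel = np.ones((15, 15), np.uint8)
eroded = cv2.erode(mask, kernel, iterations=1)
feathered = cv2.GaussianBlur(eroded, (31, 31), 0).astype(np.float32) / 255.0

alpha = feathered[..., None]                  # HxWx1 so it broadcasts over BGR
blended = (alpha * swapped + (1.0 - alpha) * original).astype(np.uint8)
cv2.imwrite("blended.png", blended)
```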
