I'm trying to train LLaVA v1.6 7B in LoRA mode, but I can't find the pretrain_mm_mlp_adapter file. Where can I find it?
@YQYI I guess you can just use the mlp_adapter of LLaVA v1.5 7B, as the blog says.
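A minimal sketch of what that could look like: fetch the v1.5 pretrained projector weights and pass them to the LoRA fine-tuning launch via `--pretrain_mm_mlp_adapter`. The Hugging Face repo name, checkpoint path, and v1.6 model name below are assumptions based on the official LLaVA v1.5 release, not something confirmed for v1.6:

```shell
# Assumption: the v1.5 projector is published as mm_projector.bin in the
# liuhaotian/llava-v1.5-mlp2x-336px-pretrain-vicuna-7b-v1.5 repo on the Hub.
huggingface-cli download \
    liuhaotian/llava-v1.5-mlp2x-336px-pretrain-vicuna-7b-v1.5 \
    mm_projector.bin \
    --local-dir ./checkpoints/llava-v1.5-7b-pretrain

# Point the LoRA fine-tune at the downloaded projector (flag name as in the
# repo's finetune_lora.sh; all other training arguments elided here).
deepspeed llava/train/train_mem.py \
    --lora_enable True \
    --model_name_or_path liuhaotian/llava-v1.6-vicuna-7b \
    --pretrain_mm_mlp_adapter ./checkpoints/llava-v1.5-7b-pretrain/mm_projector.bin \
    ...
```

Whether the v1.5 projector is actually compatible with a v1.6 base model is not guaranteed; it is what the comment above suggests trying.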