Support Several MLLM Models #4136
Conversation
All models are the official Hugging Face models.
Thanks for the clarification! Please let me know when you support the LLaVA-NeXT-Video model. Thank you!
This PR has not been merged into main yet, so none of its features can be tried with the latest LLaMA-Factory release.
I did, thank you for your help! Could you also ping hiyouga about this issue? Sorry to bother you. My lab (USC ISI) is working on a project, and we would really appreciate being able to use your tool with LLaVA-NeXT. Please let me know when this is done :D Thanks again!
…uning problem of idefics2
What about MiniCPM-V 2.5?
It's on the way!
Add special handling conditions to the llava-next-video model.
add visual model config for llava-next-video
What does this PR do?
This PR is now working!
If you are interested, you can use my branch https://github.com/BUAADreamer/LLaMA-Factory for now.
Supported models:
Features:
Before submitting