Load with transformers? #7
I'm having the same problem with table-llava-v1.5-7b, transformers==4.31.0.
This is probably because the saved Table-LLaVA checkpoints from the original LLaVA repository are not directly compatible with the Transformers library, as mentioned in this GitHub issue. I will try the provided conversion script and upload new checkpoints. For now, the checkpoints can only be loaded locally with the provided script rather than directly from Hugging Face, i.e., download the Table-LLaVA checkpoints and set 'model-path' to the local path of the model weights folder. Sorry for the inconvenience.
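A minimal sketch of the local-loading workaround described above, assuming the LLaVA repository's `load_pretrained_model` helper (the folder path and model name are placeholders, not values from this thread):

```python
# Sketch (assumption): validate that 'model-path' points at an existing local
# folder before handing it to the LLaVA repo's loader, so nothing silently
# falls back to fetching from the Hub.
import os

def local_model_path(path: str) -> str:
    """Resolve and validate a local checkpoint folder; fail early if missing."""
    resolved = os.path.abspath(os.path.expanduser(path))
    if not os.path.isdir(resolved):
        raise FileNotFoundError(f"checkpoint folder not found: {resolved}")
    return resolved

# Hypothetical usage, after cloning the LLaVA repo (names assumed):
# from llava.model.builder import load_pretrained_model
# tokenizer, model, image_processor, _ = load_pretrained_model(
#     model_path=local_model_path("./table-llava-v1.5-7b"),
#     model_base=None,
#     model_name="table-llava-v1.5-7b",
# )
```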
I downloaded the Table-LLaVA checkpoint from Hugging Face and set "model-path" to the local path of the model weights folder, but it still doesn't work.
Some weights of LlavaForConditionalGeneration were not initialized from the model checkpoint at table-llava-7b and are newly initialized: ['model.language_model.lm_head.weight.... |
I downloaded the checkpoints and set the model-path, but when I run inference it still tries to load files by connecting to 'https://huggingface.co', which leads to errors.
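One way to rule out unwanted Hub access is to force offline mode; a sketch using the environment variables that `huggingface_hub` and `transformers` honor (the checkpoint path below is a placeholder):

```python
# Sketch: force offline mode so a local checkpoint is never re-fetched
# from https://huggingface.co. Either variable is honored; setting both
# covers transformers and huggingface_hub.
import os

os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"

# Alternatively, pass local_files_only=True when loading (path is a placeholder):
# model = LlavaForConditionalGeneration.from_pretrained(
#     "/path/to/table-llava-7b", local_files_only=True
# )
```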
Hi,
I downloaded the checkpoint from this link, and I load it like this:
The loading fails. My transformers version is 4.42.4.
Part of the error messages looks like this:
Besides, it seems that there should be a
preprocessor_config.json
which has not been released. Could you please tell me how to use the released model the way we use ordinary LLaVA models?
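A small sanity check one could run against the downloaded folder, assuming the config files that transformers' Llava classes typically look for (the file list is an assumption; `preprocessor_config.json` is the one reported missing in this thread):

```python
# Sketch: list which expected config files are absent from a local
# checkpoint folder. AutoProcessor needs preprocessor_config.json,
# which the thread reports is missing from the released checkpoint.
import os

EXPECTED_FILES = [
    "config.json",               # model config
    "preprocessor_config.json",  # image processor config (reported missing)
    "tokenizer_config.json",     # tokenizer config
]

def missing_files(folder: str) -> list:
    """Return the expected config files that are absent from `folder`."""
    return [f for f in EXPECTED_FILES if not os.path.isfile(os.path.join(folder, f))]
```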