
Load with transformers? #7

Open
CoinCheung opened this issue Aug 2, 2024 · 5 comments

Comments

@CoinCheung

Hi,

I downloaded the checkpoint from this link, and I load it like this:

from transformers import AutoModelForVision2Seq
model = AutoModelForVision2Seq.from_pretrained('./table-llava-v1.5-13b/')

Loading fails. My transformers version is 4.42.4.

Part of the error message is shown below:
[screenshot of the error traceback]

Besides, it seems there should be a preprocessor_config.json, which has not been released. Could you please tell me how I can use the released model the same way we use ordinary LLaVA models?
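As a quick sanity check before debugging further, you can list which files Transformers typically expects that are absent from the local checkpoint folder. This is a minimal sketch; the `REQUIRED` list is an assumption about a typical `from_pretrained` layout, not the exact set this model needs:

```python
import os

# Assumed minimal set of files a Transformers checkpoint folder usually contains.
REQUIRED = ["config.json", "tokenizer_config.json", "preprocessor_config.json"]

def missing_files(model_dir, required=REQUIRED):
    """Return the expected files that are absent from a local checkpoint folder."""
    return [f for f in required if not os.path.exists(os.path.join(model_dir, f))]

# Example (hypothetical local path):
# print(missing_files("./table-llava-v1.5-13b/"))
```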

@sssuperrrr

I'm having the same problem with table-llava-v1.5-7b, transformers==4.31.0.

@SpursGoZmy
Owner

This is probably because checkpoints saved by the original LLaVA repository are not directly compatible with Transformers, as mentioned in this GitHub issue. I will try the provided conversion script and upload new checkpoints. For now, the checkpoints can only be loaded locally with the provided script rather than directly through Hugging Face: download the Table-LLaVA checkpoints and set 'model-path' to the local path of the model weights folder. Sorry for the inconvenience.
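A minimal sketch of the local-loading route described above, assuming the original LLaVA repository is installed (`pip install -e .` inside the LLaVA checkout) and the weights are already downloaded to a local folder. `MODEL_PATH` is a hypothetical path, and `model_name_from_path` is a small illustrative helper (the LLaVA repo also ships its own `get_model_name_from_path`):

```python
# Hypothetical local path to the downloaded Table-LLaVA weights.
MODEL_PATH = "./table-llava-v1.5-13b"

def model_name_from_path(model_path: str) -> str:
    """Derive the model name the LLaVA loader keys on (last path component)."""
    return model_path.rstrip("/").split("/")[-1]

if __name__ == "__main__":
    # Requires the original LLaVA repo to be installed.
    from llava.model.builder import load_pretrained_model

    tokenizer, model, image_processor, context_len = load_pretrained_model(
        model_path=MODEL_PATH,       # local folder, not a Hub repo id
        model_base=None,
        model_name=model_name_from_path(MODEL_PATH),
    )
```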

@sssuperrrr

I downloaded the Table-LLaVA checkpoint from Hugging Face and set "model-path" to the local path of the model weights folder, but it still doesn't work.

@xiamaozi11

Some weights of LlavaForConditionalGeneration were not initialized from the model checkpoint at table-llava-7b and are newly initialized: ['model.language_model.lm_head.weight....

@MrZhangmg

I downloaded the checkpoints and set the model-path, but when I run inference it still tries to load files by connecting to 'https://huggingface.co', which leads to errors.
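If the weights are fully downloaded, one way to stop any request to huggingface.co is to force offline mode before anything from `transformers` is imported. This is a sketch under that assumption; the model path is hypothetical:

```python
import os

# Force huggingface_hub / transformers to resolve everything locally,
# so no network request to huggingface.co is attempted.
# (Assumes the checkpoint folder is complete on disk.)
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"

if __name__ == "__main__":
    from transformers import AutoConfig

    # local_files_only is belt-and-braces on top of the env vars above.
    cfg = AutoConfig.from_pretrained("./table-llava-v1.5-13b/", local_files_only=True)
```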
