Successful installation leads to error during execution #55

Closed
steph280 opened this issue Nov 24, 2024 · 2 comments

Comments

@steph280

Expected Behavior:
The program launches.

Actual Behavior:
```
(MagicQuill) C:\Users\VR2\Desktop\MagicQuill> python gradio_run.py
Total VRAM 24576 MB, total RAM 32684 MB
pytorch version: 2.1.2+cu118
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 3090 : native
Using pytorch cross attention
['C:\Users\VR2\Desktop\MagicQuill', 'C:\Users\VR2\.conda\envs\MagicQuill\python310.zip', 'C:\Users\VR2\.conda\envs\MagicQuill\DLLs', 'C:\Users\VR2\.conda\envs\MagicQuill\lib', 'C:\Users\VR2\.conda\envs\MagicQuill', 'C:\Users\VR2\.conda\envs\MagicQuill\lib\site-packages', '__editable__.llava-1.2.2.post1.finder.__path_hook__', 'C:\Users\VR2\Desktop\MagicQuill\MagicQuill']
Traceback (most recent call last):
  File "C:\Users\VR2\.conda\envs\MagicQuill\lib\site-packages\transformers\utils\hub.py", line 385, in cached_file
    resolved_file = hf_hub_download(
  File "C:\Users\VR2\.conda\envs\MagicQuill\lib\site-packages\huggingface_hub\utils\_validators.py", line 106, in inner_fn
    validate_repo_id(arg_value)
  File "C:\Users\VR2\.conda\envs\MagicQuill\lib\site-packages\huggingface_hub\utils\_validators.py", line 160, in validate_repo_id
    raise HFValidationError(
huggingface_hub.errors.HFValidationError: Repo id must use alphanumeric chars or '-', '_', '.', '--' and '..' are forbidden, '-' and '.' cannot start or end the name, max length is 96: 'C:\Users\VR2\Desktop\MagicQuill\models\llava-v1.5-7b-finetune-clean'.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\VR2\Desktop\MagicQuill\gradio_run.py", line 21, in <module>
    llavaModel = LLaVAModel()
  File "C:\Users\VR2\Desktop\MagicQuill\MagicQuill\llava_new.py", line 26, in __init__
    self.tokenizer, self.model, self.image_processor, self.context_len = load_pretrained_model(
  File "C:\Users\VR2\Desktop\MagicQuill\MagicQuill\LLaVA\llava\model\builder.py", line 116, in load_pretrained_model
    tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False)
  File "C:\Users\VR2\.conda\envs\MagicQuill\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 758, in from_pretrained
    tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\VR2\.conda\envs\MagicQuill\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 590, in get_tokenizer_config
    resolved_config_file = cached_file(
  File "C:\Users\VR2\.conda\envs\MagicQuill\lib\site-packages\transformers\utils\hub.py", line 450, in cached_file
    raise EnvironmentError(
OSError: Incorrect path_or_model_id: 'C:\Users\VR2\Desktop\MagicQuill\models\llava-v1.5-7b-finetune-clean'. Please provide either the path to a local folder or the repo_id of a model on the Hub.

(MagicQuill) C:\Users\VR2\Desktop\MagicQuill>
```
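For context on the chained errors: when the string passed to `AutoTokenizer.from_pretrained()` is not an existing local folder, transformers falls back to treating it as a Hugging Face Hub repo id, and a Windows path (drive letter, backslashes) then fails repo-id validation. So the root cause is the missing `models\llava-v1.5-7b-finetune-clean` folder, not the path syntax. A minimal pre-flight check one could run before launch (the function name is mine, not part of MagicQuill):

```python
from pathlib import Path

def resolve_model_path(path_str: str) -> Path:
    """Fail fast with a clear message if the local checkpoint folder is missing.

    If the directory does not exist, from_pretrained() reinterprets the string
    as a Hub repo id, which produces the confusing HFValidationError above.
    """
    path = Path(path_str)
    if not path.is_dir():
        raise FileNotFoundError(
            f"Model folder not found: {path}. Download the checkpoints "
            "before launching gradio_run.py."
        )
    return path
```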


Additional Context/Details


Environment

  • OS: Windows
  • Version: 11
  • Any Relevant Dependencies:

@tristan88888

From the main installation steps:

> Follow this guide to set up the environment.
>
> Clone the repo. Don't forget the `--recursive` flag; otherwise the LLaVA submodule will be missing.

```
git clone --recursive https://github.com/magic-quill/MagicQuill.git
cd MagicQuill
```
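If the repository was already cloned without `--recursive`, re-cloning isn't necessary: `git submodule update --init --recursive` populates the missing submodule in place. A small sketch of that check (helper names are mine; the `MagicQuill\LLaVA` location matches the traceback above):

```python
import subprocess
from pathlib import Path

def has_content(d: Path) -> bool:
    """True if the directory exists and is non-empty."""
    return d.is_dir() and any(d.iterdir())

def ensure_llava_submodule(repo_root: str) -> bool:
    """Return True once the LLaVA submodule checkout looks populated.

    An empty MagicQuill/LLaVA directory is the symptom of cloning without
    --recursive; in that case, fetch the submodule into the existing clone.
    """
    llava = Path(repo_root) / "MagicQuill" / "LLaVA"
    if has_content(llava):
        return True
    subprocess.run(
        ["git", "submodule", "update", "--init", "--recursive"],
        cwd=repo_root, check=True,
    )
    return has_content(llava)
```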

@zliucz
Copy link
Member

zliucz commented Nov 24, 2024

Check issue #54.

@zliucz zliucz closed this as completed Nov 24, 2024