ExLlama (v1) was removed recently, so the extension would need to be updated.
Anyone else see the Playground extension break after a pull today? It was working fine yesterday, but after I just did a pull, I'm getting errors. Just curious if this is a known issue. Error:
Closing server running on port: 7860
18:47:59-142431 INFO Loading the extension "gallery"
18:47:59-144431 INFO Loading the extension "Playground"
18:47:59-147432 ERROR Failed to load the extension "Playground".
Traceback (most recent call last):
File "E:\StableLLM\Dec23\text-generation-webui\modules\extensions.py", line 37, in load_extensions
exec(f"import extensions.{name}.script")
File "", line 1, in
File "E:\StableLLM\Dec23\text-generation-webui\extensions\Playground\script.py", line 12, in
from modules.LoRA import add_lora_autogptq, add_lora_exllama, add_lora_exllamav2
ImportError: cannot import name 'add_lora_exllama' from 'modules.LoRA' (E:\StableLLM\Dec23\text-generation-webui\modules\LoRA.py)
--- Update ---
I was able to at least suppress this error by removing the reference to "add_lora_exllama" in Playground's script.py.
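In case it helps anyone else, here is roughly the change as a minimal sketch. It assumes add_lora_autogptq and add_lora_exllamav2 are still exported by modules.LoRA (the traceback only complains about add_lora_exllama) and that nothing in Playground actually needs the ExLlama v1 path anymore. Instead of deleting the reference outright, you could also guard the import so the name still resolves:

```python
# extensions/Playground/script.py (around the line the traceback points at)

# Old import, which now fails because ExLlama (v1) and its LoRA helper were removed:
# from modules.LoRA import add_lora_autogptq, add_lora_exllama, add_lora_exllamav2

from modules.LoRA import add_lora_autogptq, add_lora_exllamav2

try:
    # Older checkouts of the webui still export this; newer ones do not.
    from modules.LoRA import add_lora_exllama
except ImportError:
    def add_lora_exllama(*args, **kwargs):
        # ExLlama v1 was removed upstream, so treat this code path as a no-op.
        return None
```

Either way, the proper fix is for the extension itself to drop its ExLlama v1 handling, as noted in the reply above.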