Hello,
If I say anything wrong, please don't judge me; I'm very new to AI projects and apps.
I noticed that there is a GPU button in the app that is currently inaccessible (disabled). Is there a way to enable it with a model that supports GPU inference?
For example: https://huggingface.co/TheBloke/WizardLM-30B-Uncensored-GGML
Its model card mentions GPU + CPU inference, so could I somehow use that model with GPU acceleration in the local.ai app?
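From what I understand, GGML models like this one are usually run through llama.cpp, where "GPU + CPU Inference" means offloading some of the model's layers to the GPU while the rest stay on the CPU. Here is a rough sketch of what I mean, using the llama-cpp-python bindings (just my assumption for illustration; the file name and settings are made up, and this is not the local.ai app's actual code):

```python
# Minimal sketch (an assumption, not local.ai's actual code) of GPU + CPU
# inference for a GGML model: llama.cpp offloads a number of transformer
# layers to the GPU while the remaining layers run on the CPU.
# Requires llama-cpp-python built with GPU support (e.g. cuBLAS).
from llama_cpp import Llama

llm = Llama(
    model_path="WizardLM-30B-Uncensored.ggmlv3.q4_0.bin",  # hypothetical local file name
    n_gpu_layers=32,  # layers offloaded to the GPU; the rest stay on the CPU
    n_ctx=2048,       # context window size
)

output = llm("Q: What does GPU offloading do? A:", max_tokens=64)
print(output["choices"][0]["text"])
```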
Thanks.