
Add support for Ollama (via openai compatible APIs) #16

Merged
merged 1 commit into baalimago:main on Aug 19, 2024

Conversation

@raff (Contributor) commented Jul 14, 2024

Use `-cm ollama` to select the default model (llama3) or `-cm ollama:{model}` to select a different model.
Ollama should be running locally on the default port `http://localhost:11434`.

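For example, a quick smoke test could look something like this (the exact command shape is an assumption, and `mistral` is just a placeholder for any locally pulled model):

```sh
# Query the default Ollama model (llama3); Ollama must be listening on http://localhost:11434
clai -cm ollama query "hello"

# Query a specific locally pulled model instead
clai -cm ollama:mistral query "hello"
```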
@baalimago (Owner) left a comment

Looks good, and many thanks for the addition! One request though: could you please update the README.md with an entry similar to the other vendors? I think a link to a brief getting-started guide for Ollama, plus a note about the default Ollama port requirement, would be enough.

Also, I don't have a beefy enough computer to run any models served via Ollama, but I trust you've verified functionality yourself.

@raff (Contributor, Author) commented Jul 15, 2024

Sure, I'll update the README. And yes, I verified it myself (chat and query commands).

@baalimago (Owner) left a comment

This also needs to be updated: /baalimago/clai/blob/main/internal/text/querier_setup.go#L20, in order to support the config system for the different models. Right now, I suspect the configuration files for any Ollama model will overwrite each other at <os-config-dir>/.clai/VENDOR_NOT_FOUND.json.

I think <os-config-dir>/.clai/ollama_ollama_{model} would be an appropriate format, with {model} being just ollama for the default case.
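A rough sketch of that naming in Go (illustrative only, not the actual clai code; the helper name is made up, and the `.json` suffix is assumed from the VENDOR_NOT_FOUND.json example above):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// ollamaConfigFile is a hypothetical helper, not clai's real implementation:
// it builds the per-model config path following the suggested
// ollama_ollama_{model} naming, with {model} falling back to "ollama" for
// the default -cm ollama case.
func ollamaConfigFile(model string) (string, error) {
	if model == "" {
		model = "ollama" // default case: plain -cm ollama
	}
	confDir, err := os.UserConfigDir() // <os-config-dir>, e.g. ~/.config
	if err != nil {
		return "", err
	}
	file := fmt.Sprintf("ollama_ollama_%s.json", model)
	return filepath.Join(confDir, ".clai", file), nil
}

func main() {
	p, _ := ollamaConfigFile("llama3") // e.g. from -cm ollama:llama3
	fmt.Println(p)                     // <os-config-dir>/.clai/ollama_ollama_llama3.json
}
```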

You can verify the problem and the fix by checking <os-config-dir>/.clai/ (probably ~/.config/.clai), or by running `clai setup` -> 1 (model files) -> c (configure) and looking for ollama_ollama_... config files (and the lack thereof until the change).

EDIT: I think I'll refactor this so that it lives inside the respective vendors, but I'll do so after this PR to make it less confusing.

@baalimago (Owner) commented

Hello! I'm planning to make a new minor release soon and I'd really like to have this PR in it. Have you made any progress on the suggested changes?

I also thought we could just merge this as-is, and I'll refactor as mentioned above. But since I don't have Ollama myself, I'd appreciate it if you'd run it and verify that the configuration works as intended. Could this work?

@raff (Contributor, Author) commented Aug 15, 2024 via email

@baalimago (Owner) commented

I'll go ahead, merge this, and do the refactor! If something doesn't work, please open an issue and I'll check it out, or submit another PR to fix it.

Thanks for your contributions!

baalimago merged commit c61dd3f into baalimago:main on Aug 19, 2024
@baalimago (Owner) commented Aug 19, 2024

I double-checked my own suggested refactor and it didn't really make sense, apologies for the confusion! I updated the README.

@raff (Contributor, Author) commented Aug 19, 2024 via email
