Release 1.3.0 #471
TheR1D announced in Announcements
- Farkhod, Amazing work as always!!! Thank you for your work on this project!!
- Thank you. Are LiteLLM function calls using together.ai supported? They were added into LiteLLM a week ago. I tried it with the preinstalled example
What's Changed
Multiple LLM backends
ShellGPT can now work with multiple backends using LiteLLM. You can use locally hosted open-source models, which are available for free. To use local models, you will need to run your own LLM backend server, such as Ollama. To set up ShellGPT with Ollama, please follow this comprehensive guide. The full list of supported models and providers is here. Note that ShellGPT is not optimized for local models and may not work as expected❗️
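As a rough illustration of the local-backend setup (the extra name, model name, and config keys below are assumptions; the linked Ollama guide is the authoritative reference):

```shell
# Install ShellGPT with the optional LiteLLM dependency
# (extra name "litellm" assumed for this release).
pip install "shell-gpt[litellm]"

# Pull and serve a local model with Ollama (model name is illustrative).
ollama pull mistral
ollama serve

# Point ShellGPT at the local backend in ~/.config/shell_gpt/.sgptrc.
# Key names below are assumptions; check the guide for the exact options.
#   DEFAULT_MODEL=ollama/mistral
#   USE_LITELLM=true

# Then use sgpt as usual, now backed by the locally hosted model.
sgpt "What is the difference between apt update and apt upgrade?"
```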
Markdown formatting
Markdown formatting now depends on the role description. For instance, if the role includes "APPLY MARKDOWN" in its description, the output for this role will be Markdown-formatted. This applies to both default and custom roles. If you would like to disable Markdown formatting, edit the default role description in ~/.config/shell_gpt/roles.
Full Changelog: 1.2.0...1.3.0
This discussion was created from the release 1.3.0.