
ollama: add support for moderation model #3028

Merged · 6 commits merged into main from llama-guard on Nov 25, 2024
Conversation

rockwotj (Collaborator):
  • ollama: support saving the prompt as metadata
  • ollama: add ollama_moderation for llama-guard and shieldgemma

This allows us to classify LLM responses for safety.
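For context on the classification this enables: a Llama Guard style model replies with the literal word `safe`, or `unsafe` followed by a line of violated category codes such as `S1,S10`. The sketch below is a minimal, hypothetical parser for that reply format — the function name and return shape are illustrative and are not the actual `ollama_moderation` processor code.

```go
package main

import (
	"fmt"
	"strings"
)

// parseGuardResponse interprets a Llama Guard style reply: the first line
// is "safe" or "unsafe"; when unsafe, the next line lists category codes.
// Illustrative only; the processor's real output format may differ.
func parseGuardResponse(resp string) (safe bool, categories []string) {
	lines := strings.Split(strings.TrimSpace(resp), "\n")
	if strings.TrimSpace(lines[0]) == "safe" {
		return true, nil
	}
	if len(lines) > 1 {
		for _, c := range strings.Split(lines[1], ",") {
			categories = append(categories, strings.TrimSpace(c))
		}
	}
	return false, categories
}

func main() {
	safe, cats := parseGuardResponse("unsafe\nS1,S10")
	fmt.Println(safe, cats)
}
```

A downstream pipeline could route messages on the `safe` flag and attach the category codes as metadata.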

@rockwotj force-pushed the llama-guard branch 2 times, most recently from 87d26cf to bc5aa1a on November 21, 2024 at 22:28
mihaitodor (Collaborator) left a comment:
Nice job @rockwotj! Just left a few nits. Feel free to :shipit:

@@ -155,6 +155,7 @@ ockam_kafka ,input ,ockam_kafka ,0.0.0 ,commun
ockam_kafka ,output ,ockam_kafka ,0.0.0 ,community ,n ,n ,n
ollama_chat ,processor ,ollama_chat ,4.32.0 ,enterprise ,n ,n ,y
ollama_embeddings ,processor ,ollama_embeddings ,4.32.0 ,enterprise ,n ,n ,y
ollama_moderation ,processor ,ollama_moderation ,0.0.0 ,enterprise ,n ,n ,y
mihaitodor (Collaborator) commented:
I think we need to set the version here.

rockwotj (Collaborator, Author) replied:
Done

@@ -121,6 +122,9 @@ For more information, see the https://github.com/ollama/ollama/tree/main/docs[Ol
Optional().
Advanced().
Description(`Sets the stop sequences to use. When this pattern is encountered the LLM stops generating text and returns the final response.`),
service.NewBoolField(ocpFieldEmitPromptMetadata).
Default(false).
Description(`If enabled the prompt is saved as @prompt metadata on the output message.`),
mihaitodor (Collaborator) commented:
Should we also mention system_prompt?

rockwotj (Collaborator, Author) replied:
Thanks, yes.

Comment on lines 112 to 123
// Resolve the prompt and response interpolations, then ask the
// moderation model to classify the pair.
p, err := o.prompt.TryString(msg)
if err != nil {
	return nil, err
}
r, err := o.response.TryString(msg)
if err != nil {
	return nil, err
}
g, err := o.generateCompletion(ctx, p, r)
if err != nil {
	return nil, err
}
mihaitodor (Collaborator) commented:
Might be worth adding some extra context to these errors.

rockwotj (Collaborator, Author) replied:
Done!
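The fix suggested here is the standard Go pattern of wrapping each error with `fmt.Errorf` and `%w` so the caller learns which field failed while `errors.Is`/`errors.As` can still see the cause. A minimal sketch — `tryString` is a hypothetical stand-in for the `o.prompt.TryString(msg)` call above, not the processor's real helper:

```go
package main

import (
	"errors"
	"fmt"
)

// tryString is a hypothetical stand-in for the field interpolation in the
// snippet above; it always fails here so the wrapping is visible.
func tryString() (string, error) {
	return "", errors.New("interpolation failed")
}

// moderate wraps the error with context naming the field that failed,
// while %w keeps the original error available to errors.Is/errors.As.
func moderate() error {
	if _, err := tryString(); err != nil {
		return fmt.Errorf("prompt: %w", err)
	}
	return nil
}

func main() {
	fmt.Println(moderate())
}
```

Running this prints `prompt: interpolation failed`, so a log line immediately identifies which of the two interpolated fields produced the error.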

@rockwotj merged commit 1c54126 into main on Nov 25, 2024 · 3 checks passed
@rockwotj deleted the llama-guard branch on November 25, 2024 at 21:45