
googleai: fix options need add default value #625

Merged 1 commit into tmc:main on Feb 21, 2024

Conversation

@Abirdcfly (Contributor) commented Feb 21, 2024

You can reproduce the problem by first running the tests added in this PR.

Let me explain what happens: when an LLM is called via a chain, the call goes through this function:

```go
func getLLMCallOptions(options ...ChainCallOption) []llms.CallOption {
	opts := &chainCallOption{}
	for _, option := range options {
		option(opts)
	}
	if opts.StreamingFunc == nil && opts.CallbackHandler != nil {
		opts.StreamingFunc = func(ctx context.Context, chunk []byte) error {
			opts.CallbackHandler.HandleStreamingFunc(ctx, chunk)
			return nil
		}
	}
	chainCallOption := []llms.CallOption{
		llms.WithModel(opts.Model),
		llms.WithMaxTokens(opts.MaxTokens),
		llms.WithTemperature(opts.Temperature),
		llms.WithStopWords(opts.StopWords),
		llms.WithStreamingFunc(opts.StreamingFunc),
		llms.WithTopK(opts.TopK),
		llms.WithTopP(opts.TopP),
		llms.WithSeed(opts.Seed),
		llms.WithMinLength(opts.MinLength),
		llms.WithMaxLength(opts.MaxLength),
		llms.WithRepetitionPenalty(opts.RepetitionPenalty),
	}
	return chainCallOption
}
```

Before this function is called, only some of the ChainCallOptions may have been set, but the function always returns all 11 llms.CallOptions that are currently available (some of them carrying zero values), so the zero values need to be handled at the LLM level.
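
To make this concrete, here is a minimal, self-contained sketch (stand-in types, not the real langchaingo packages) of how emitting a setter for every field lets zero values clobber provider-side defaults:

```go
package main

import "fmt"

// Stand-ins for llms.CallOptions and its functional options.
type CallOptions struct {
	Model       string
	MaxTokens   int
	Temperature float64
}

type CallOption func(*CallOptions)

func WithModel(m string) CallOption        { return func(o *CallOptions) { o.Model = m } }
func WithMaxTokens(n int) CallOption       { return func(o *CallOptions) { o.MaxTokens = n } }
func WithTemperature(t float64) CallOption { return func(o *CallOptions) { o.Temperature = t } }

// Mimics getLLMCallOptions: the caller only chose a model, but the
// conversion still emits a setter for every field, so MaxTokens and
// Temperature reach the provider as zero values.
func chainOptions(model string) []CallOption {
	return []CallOption{
		WithModel(model),
		WithMaxTokens(0),   // never set by the caller
		WithTemperature(0), // never set by the caller
	}
}

func main() {
	opts := CallOptions{MaxTokens: 256, Temperature: 0.7} // provider defaults
	for _, opt := range chainOptions("gemini-pro") {
		opt(&opts)
	}
	fmt.Printf("%+v\n", opts) // {Model:gemini-pro MaxTokens:0 Temperature:0}
}
```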

I'm guessing openai handles this in:

```go
func (c *Client) setCompletionDefaults(payload *CompletionRequest) {
	// Set defaults
	if payload.MaxTokens == 0 {
		payload.MaxTokens = 256
	}
	if len(payload.StopWords) == 0 {
		payload.StopWords = nil
	}
	switch {
	// Prefer the model specified in the payload.
	case payload.Model != "":
	// If no model is set in the payload, take the one specified in the client.
	case c.Model != "":
		payload.Model = c.Model
	// Fallback: use the default model
	default:
		payload.Model = defaultChatModel
	}
}
```
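
This PR adds an analogous defaults pass on the googleai side (the g.setCallOptionsDefaults(&opts) call in the diff below). Its body isn't quoted in this thread, so the following is only a sketch of the approach, with illustrative default values rather than the project's real ones:

```go
// Sketch only: the actual setCallOptionsDefaults added by this PR is not
// shown here. Field names follow the llms.With* setters above; the default
// values are placeholders.
func (g *GoogleAI) setCallOptionsDefaults(opts *llms.CallOptions) {
	if opts.Model == "" {
		opts.Model = defaultModel // hypothetical package-level default
	}
	if opts.MaxTokens == 0 {
		opts.MaxTokens = 256
	}
	if opts.Temperature == 0 {
		opts.Temperature = 0.5
	}
	if opts.TopP == 0 {
		opts.TopP = 0.95
	}
}
```

Note that, like the openai version, this cannot tell an unset Temperature or TopP apart from an explicit zero, which is exactly the objection raised in the review below.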

For #626

cc @tmc @eliben

@Abirdcfly marked this pull request as ready for review on February 21, 2024, 14:23
@tmc (Owner) left a comment:

LGTM

@tmc merged commit 3aaa209 into tmc:main on Feb 21, 2024 (3 checks passed)
```diff
@@ -50,6 +50,7 @@ func (g *GoogleAI) GenerateContent(ctx context.Context, messages []llms.MessageC
 	for _, opt := range options {
 		opt(&opts)
 	}
+	g.setCallOptionsDefaults(&opts)
```
Collaborator commented on the diff:

The defaults were just set a few lines up; this doesn't seem right - we should be setting defaults in one place.

Also I don't understand the logic here -- for example, what if the user requested Temperature = 0 explicitly? That's a valid request, why do we override it with a default now?

@Abirdcfly (Contributor, author) commented Feb 22, 2024

> Temperature = 0

I later realized this issue: the current code cannot distinguish whether a zero value means an option is unset or was intentionally set to zero. Currently, only TopP and Temperature are options where zero is a meaningful value.

A more appropriate fix might be to modify getLLMCallOptions so that it converts only the parameters that were actually passed into llms.CallOptions (see the sketch below). But that would affect all LLMs, so more testing is probably needed.
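
For reference, the set-tracking idea could look something like the sketch below; the xxxSet flags are hypothetical and do not exist in the real chainCallOption struct:

```go
// Hypothetical sketch: record which options were explicitly provided and
// translate only those, so providers never see spurious zero values.
type chainCallOption struct {
	Model          string
	modelSet       bool
	Temperature    float64
	temperatureSet bool
	// ... remaining options elided
}

type ChainCallOption func(*chainCallOption)

func WithTemperature(t float64) ChainCallOption {
	return func(o *chainCallOption) {
		o.Temperature = t
		o.temperatureSet = true // zero is now distinguishable from unset
	}
}

func getLLMCallOptions(options ...ChainCallOption) []llms.CallOption {
	opts := &chainCallOption{}
	for _, option := range options {
		option(opts)
	}
	var out []llms.CallOption
	if opts.modelSet {
		out = append(out, llms.WithModel(opts.Model))
	}
	if opts.temperatureSet {
		out = append(out, llms.WithTemperature(opts.Temperature))
	}
	return out
}
```

Pointer fields (e.g. *float64, translating only non-nil ones) would achieve the same effect without the extra flags.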

Collaborator replied:

Please revert this PR until we figure out the right way to address this, and create an issue describing the problem where it can be discussed.

@Abirdcfly (Contributor, author) replied:

> create an issue

#626

> revert this PR

IMO, the PR that fixes this problem completely can revert this commit before submitting its own fix. The current approach has a small bug (TopP and Temperature can't be set to 0 on chain calls), but at least it ensures that googleai works and returns results. Without this PR, a chain call without the full set of ChainCallOptions just returns a 404:

```
TestLLMChainWithGoogleAI
Entering LLM with messages:
Role: human
Text: What is the capital of France
2024/02/22 10:15:23 googleapi: Error 404:
```

@eliben (Collaborator) commented Feb 22, 2024

Sorry, I really wanted to have the opportunity to review this PR before it was merged. I left a comment.

@eliben (Collaborator) commented Feb 22, 2024

I'm going to revert this change in googleai for now. Let's keep discussing how to solve the real issue in #626

eliben added a commit referencing this pull request on Feb 22, 2024: "Resurrect test from #625, but add explicit settings until #626 is resolved"
eliben added a commit referencing this pull request on Feb 23, 2024: "Resurrect test from #625, but add explicit settings until #626 is resolved"