googleai: fix options need add default value #625
Conversation
Signed-off-by: Abirdcfly <[email protected]>
LGTM
```diff
@@ -50,6 +50,7 @@ func (g *GoogleAI) GenerateContent(ctx context.Context, messages []llms.MessageC
 	for _, opt := range options {
 		opt(&opts)
 	}
+	g.setCallOptionsDefaults(&opts)
```
The defaults were just set a few lines up; this doesn't seem right. We should be setting defaults in one place.
Also, I don't understand the logic here. For example, what if the user requested `Temperature = 0` explicitly? That's a valid request; why do we override it with a default now?
> Temperature = 0

I later realized this issue: the current code cannot distinguish whether a zero value means the option is unset or was intentionally set to zero. Currently, only `topP` and `Temperature` can meaningfully be set to zero.

A more appropriate fix might be to modify `getLLMCallOptions` so that it only converts the parameters that were actually passed into `llms.CallOptions`. But in that case, it would affect all LLMs... Maybe more testing is needed...
Please revert this PR until we figure out the right way to address this, and create an issue describing the problem where it can be discussed.
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
> create an issue

> revert this PR

IMO, the PR that fixes this problem completely can revert this commit before submitting its own fix. The current approach has a small bug (`topP` and `Temperature` can't be set to 0 on chain calls), but at least it ensures that Google AI works and can return results. Without the current PR, a chain call without a full set of `ChainCallOption`s will only return a 404:
```
TestLLMChainWithGoogleAI
Entering LLM with messages:
Role: human
Text: What is the capital of France
2024/02/22 10:15:23 googleapi: Error 404:
```
Sorry, I really wanted to have the opportunity to review this PR before it was merged. I left a comment.
I'm going to revert this change in
You can find the problem by first executing the tests added in the current PR.

Let me explain what happened: when calling the llm via a chain, the call goes through this function:

langchaingo/chains/options.go, Lines 132 to 158 in 853fc04

Before this function, only some of the `ChainCallOption`s may have been set, but after it, all 11 currently available `llms.CallOption`s are returned (some of them with empty values), so the empty values need to be dealt with at the llm level. I'm guessing that openai's handling of this is in:

langchaingo/llms/openai/internal/openaiclient/completions.go, Lines 49 to 70 in 853fc04
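The chain-side behavior described above can be sketched as follows. This is a hypothetical simplification, not langchaingo's actual `getLLMCallOptions`: the point is only that if the chain emits an option for *every* field, set or not, then zero values reach the client and clobber any client-side defaults:

```go
package main

import "fmt"

// Simplified stand-ins for the chain-level and llm-level option sets.
type chainOpts struct {
	Temperature float64
	TopP        float64
	MaxTokens   int
}

type llmOpts struct {
	Temperature float64
	TopP        float64
	MaxTokens   int
}

type LLMOption func(*llmOpts)

// getLLMCallOptions (hypothetical) emits an option for EVERY field,
// including ones the caller never set, so their zero values are
// forwarded to the LLM client as if they were real requests.
func getLLMCallOptions(c chainOpts) []LLMOption {
	return []LLMOption{
		func(o *llmOpts) { o.Temperature = c.Temperature },
		func(o *llmOpts) { o.TopP = c.TopP },
		func(o *llmOpts) { o.MaxTokens = c.MaxTokens },
	}
}

func main() {
	// The caller only set MaxTokens; Temperature and TopP stay zero.
	opts := llmOpts{Temperature: 0.7} // pretend client-side default
	for _, opt := range getLLMCallOptions(chainOpts{MaxTokens: 256}) {
		opt(&opts)
	}
	fmt.Println(opts.Temperature) // 0: the default was clobbered by an unset field
}
```

Under this model, a client that rejects zeroed parameters (as the Google AI backend apparently does here) fails on any chain call that doesn't spell out every option, which matches the 404 seen in the test output above.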
For #626
cc @tmc @eliben