Best way to avoid "response too long" error #2399
Unanswered
DanaMartens asked this question in Extension Development QnA
I understand there are ways provided to gracefully prune input tokens to stay within the maximum, but what is the best way to prevent the response from exceeding the token limit and causing a 'response too long' error? When I ask GitHub Copilot, it suggests passing maxTokens, as in the call below, but that option doesn't seem to be valid:

const chatResponse = await model.sendRequest(messages, { maxTokens: maxOutputTokens }, token);
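For context, the stable LanguageModelChatRequestOptions type in the vscode API has no top-level maxTokens field; besides a justification string it only exposes an untyped modelOptions bag that is handed through to the provider. Below is a minimal sketch of two complementary workarounds: passing a max_tokens key via modelOptions (whether the provider honors it is an assumption, not a contract) and enforcing a self-imposed budget client-side with countTokens. sendWithOutputBudget and MAX_OUTPUT_TOKENS are illustrative names, not API:

import * as vscode from 'vscode';

// Hypothetical self-imposed output budget; the stable API exposes
// model.maxInputTokens but no output-side equivalent.
const MAX_OUTPUT_TOKENS = 1024;

async function sendWithOutputBudget(
    model: vscode.LanguageModelChat,
    messages: vscode.LanguageModelChatMessage[],
    parentToken: vscode.CancellationToken
): Promise<string> {
    // Our own cancellation source, linked to the caller's token, so the
    // stream can be stopped once the budget is spent.
    const cts = new vscode.CancellationTokenSource();
    const link = parentToken.onCancellationRequested(() => cts.cancel());
    try {
        const response = await model.sendRequest(
            messages,
            // modelOptions is passed through to the provider untyped;
            // whether it understands a max_tokens key depends on the model.
            { modelOptions: { max_tokens: MAX_OUTPUT_TOKENS } },
            cts.token
        );
        let accumulated = '';
        for await (const fragment of response.text) {
            accumulated += fragment;
            // countTokens is in the stable API; checking on every fragment
            // is simple but adds a call per chunk, so batching the checks
            // may be preferable in practice.
            if (await model.countTokens(accumulated) >= MAX_OUTPUT_TOKENS) {
                cts.cancel(); // stop the stream at the budget instead of erroring
                break;
            }
        }
        return accumulated;
    } finally {
        link.dispose();
        cts.dispose();
    }
}

The client-side cutoff trades a 'response too long' error for a possibly mid-sentence truncation, so in practice you would also shrink the prompt to leave the model room to finish.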
Replies: 1 comment

I thought prompt-tsx should help here. Maybe @roblourens has some advice.
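For reference, @vscode/prompt-tsx budgets the input side: elements are rendered to fit a token budget and lower-priority parts are pruned first, which for models with a shared context window also leaves room for the answer. A minimal sketch following the library's README; MyPrompt, userQuery, and OUTPUT_RESERVE are illustrative names, and the file needs the library's jsxFactory settings in tsconfig.json:

import { BasePromptElementProps, PromptElement, UserMessage, renderPrompt } from '@vscode/prompt-tsx';
import * as vscode from 'vscode';

// Illustrative reserve so the prompt leaves headroom for the reply;
// the library itself does not do this for you.
const OUTPUT_RESERVE = 1024;

interface MyPromptProps extends BasePromptElementProps {
    userQuery: string;
}

class MyPrompt extends PromptElement<MyPromptProps> {
    render() {
        // priority decides what is pruned first when the prompt is over budget
        return <UserMessage priority={100}>{this.props.userQuery}</UserMessage>;
    }
}

async function buildPrompt(model: vscode.LanguageModelChat, userQuery: string) {
    const { messages, tokenCount } = await renderPrompt(
        MyPrompt,
        { userQuery },
        { modelMaxPromptTokens: model.maxInputTokens - OUTPUT_RESERVE },
        model
    );
    return { messages, tokenCount };
}

That keeps the prompt deterministically under the input limit; the output side would still need something like the budget check sketched earlier.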