
[Important] Add max_prompt_tokens and max_completion_tokens for Runs API #765

Closed

neilord opened this issue Apr 14, 2024 · 3 comments
@neilord

neilord commented Apr 14, 2024

Confirm this is a feature request for the Node library and not the underlying OpenAI API.

  • This is a feature request for the Node library

Describe the feature or improvement you're requesting

Hi, I believe that max_prompt_tokens and max_completion_tokens are essential for using the full Assistants API. Without a way to cap tokens, the thread just keeps getting longer and running the model becomes very expensive. Please add the max_prompt_tokens and max_completion_tokens parameters to openai.beta.threads.runs.create. Thanks!

Additional context

No response

@rattrayalex
Collaborator

I believe @pstern-sl is working on this.

@pstern-sl
Collaborator

This is available starting in version 4.34, released today: #767
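For reference, a minimal sketch of a request body using the two new parameters (the thread and assistant IDs below are placeholders, and the live call is shown commented out since it needs an API key):

```javascript
// Request body for openai.beta.threads.runs.create (Node SDK v4.34+).
// Both caps limit per-run cost on long threads; IDs are placeholders.
const runParams = {
  assistant_id: "asst_abc123",   // placeholder assistant ID
  max_prompt_tokens: 2000,       // cap tokens pulled from the thread into the prompt
  max_completion_tokens: 1000,   // cap tokens the model may generate for this run
};

// With a configured client (OPENAI_API_KEY in the environment), the call would be:
// import OpenAI from "openai";
// const openai = new OpenAI();
// const run = await openai.beta.threads.runs.create("thread_abc123", runParams);

console.log(runParams);
```

If either cap is hit, the run ends with an incomplete status rather than failing outright, so callers should check the run status after it completes.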

@neilord
Author

neilord commented Apr 16, 2024

Awesome! Thanks a lot for implementing this! :)
