feat: support batches api #746
Conversation
Codecov Report
All modified and coverable lines are covered by tests ✅

Additional details and impacted files

@@            Coverage Diff             @@
##           master     #746      +/-   ##
==========================================
+ Coverage   98.46%   98.78%   +0.32%
==========================================
  Files          24       25       +1
  Lines        1364     1237     -127
==========================================
- Hits         1343     1222     -121
+ Misses         15        9       -6
  Partials        6        6

☔ View full report in Codecov by Sentry.
Can this be looked into and merged? It's quite a useful, cost-effective feature.
@eiixy thank you so much for the PR and sorry for such a long review!
// CreateBatchWithUploadFile — API call to Create batch with upload file.
func (c *Client) CreateBatchWithUploadFile(
I'm not sure we need this method. Do you think it would be used in >90% of cases when people use batches?
An alternative would be to have a README example of how to upload a file and create a batch.
I think providing this method would reduce complexity, and I have provided an example: https://github.com/eiixy/monorepo/blob/master/example/openai/main.go
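For context, a rough sketch of the alternative the reviewer describes: upload the JSONL input with the existing CreateFile call and then create the batch from the returned file ID. The CreateBatch request and response field names used below (InputFileID, CompletionWindow, ID) are assumptions based on this PR's draft and the OpenAI batch API, not the final merged signatures.

```go
package main

import (
	"context"
	"fmt"
	"log"

	openai "github.com/sashabaranov/go-openai"
)

func main() {
	client := openai.NewClient("your-api-key")
	ctx := context.Background()

	// Upload the JSONL file that contains one batch request per line.
	file, err := client.CreateFile(ctx, openai.FileRequest{
		FileName: "requests.jsonl",
		FilePath: "./requests.jsonl",
		Purpose:  "batch",
	})
	if err != nil {
		log.Fatalf("upload failed: %v", err)
	}

	// Create the batch from the uploaded file.
	// Field names here follow this PR's draft and may differ after review.
	batch, err := client.CreateBatch(ctx, openai.CreateBatchRequest{
		InputFileID:      file.ID,
		Endpoint:         openai.BatchEndpointChatCompletions,
		CompletionWindow: "24h",
	})
	if err != nil {
		log.Fatalf("create batch failed: %v", err)
	}
	fmt.Println("batch id:", batch.ID)
}
```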
// ListBatch API call to List batch.
func (c *Client) ListBatch(ctx context.Context, after *string, limit *int) (response ListBatchResponse, err error) {
Should we use the Pagination struct here?
The request parameters provided by OpenAI are inconsistent with the Pagination struct:
https://platform.openai.com/docs/api-reference/batch/list
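A minimal sketch of how the pointer parameters could drive cursor pagination; the response fields used below (Data, HasMore, LastID, and the per-item ID/Status) are assumptions following the OpenAI list schema rather than confirmed names from this PR.

```go
package main

import (
	"context"
	"fmt"
	"log"

	openai "github.com/sashabaranov/go-openai"
)

func main() {
	client := openai.NewClient("your-api-key")
	ctx := context.Background()

	// First page: pass only a limit, no "after" cursor.
	limit := 20
	page, err := client.ListBatch(ctx, nil, &limit)
	if err != nil {
		log.Fatalf("list batches failed: %v", err)
	}
	for _, b := range page.Data {
		fmt.Println(b.ID, b.Status)
	}

	// Subsequent pages: feed the last seen ID back as the "after" cursor.
	if page.HasMore {
		next, err := client.ListBatch(ctx, &page.LastID, &limit)
		if err != nil {
			log.Fatal(err)
		}
		fmt.Println("next page size:", len(next.Data))
	}
}
```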
batch.go (Outdated)

const BatchEndpointChatCompletions BatchEndpoint = "/v1/chat/completions"
const BatchEndpointCompletions BatchEndpoint = "/v1/completions"
const BatchEndpointEmbeddings BatchEndpoint = "/v1/embeddings"
Possibly:

type BatchEndpoint string

const (
	BatchEndpointChatCompletions BatchEndpoint = "/v1/chat/completions"
	BatchEndpointCompletions     BatchEndpoint = "/v1/completions"
	BatchEndpointEmbeddings      BatchEndpoint = "/v1/embeddings"
)
ok~
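To illustrate the suggestion, a self-contained version of the grouped declaration, shown outside the library so it compiles on its own; the constant values are the same as in the diff above.

```go
package main

import "fmt"

// BatchEndpoint is the typed string the review proposes; grouping the
// constants in a single const block keeps the type and its allowed
// values together.
type BatchEndpoint string

const (
	BatchEndpointChatCompletions BatchEndpoint = "/v1/chat/completions"
	BatchEndpointCompletions     BatchEndpoint = "/v1/completions"
	BatchEndpointEmbeddings      BatchEndpoint = "/v1/embeddings"
)

func main() {
	// Callers pass one of the constants; an arbitrary string needs an
	// explicit conversion, which makes misuse easier to spot in review.
	endpoint := BatchEndpointChatCompletions
	fmt.Println(endpoint)
}
```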
support batches api #724