[Feature]: json_schema support for Anthropic #6741
Comments
If I am not mistaken, using the Python SDK, I can see the JSON response from […] which I can use, but it's extra steps only if I use […]
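The workaround being hinted at sounds like Anthropic's tool-use path; a minimal sketch of that approach with the official anthropic SDK (the record_weather tool name and its schema are illustrative, not from this thread):

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

message = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[
        {
            "name": "record_weather",
            "description": "Record structured weather data.",
            "input_schema": {
                "type": "object",
                "properties": {
                    "city": {"type": "string"},
                    "state": {"type": "string"},
                    "weather": {"type": "string"},
                    "temp": {"type": "number"},
                },
                "required": ["city", "state", "weather", "temp"],
            },
        }
    ],
    # Forcing the single tool makes the model emit structured input
    # conforming to the schema above.
    tool_choice={"type": "tool", "name": "record_weather"},
    messages=[
        {"role": "user", "content": "city=San Francisco, state=CA, weather=sunny, temp=60"}
    ],
)

# The structured output comes back as a tool_use block; .input is already a dict.
tool_use = next(block for block in message.content if block.type == "tool_use")
print(tool_use.input)
```

Forcing tool_choice to a single tool effectively gives you JSON mode, but with the extra steps mentioned above.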
Can I see the request you're making, @Seluj78?
Able to repro with this request:

```python
litellm.completion(
    model='claude-3-5-sonnet-20241022',
    messages=[
        {'role': 'system', 'content': 'Your output should be a JSON object with no additional properties. '},
        {'role': 'user', 'content': 'Respond with this in json. city=San Francisco, state=CA, weather=sunny, temp=60'},
    ],
    response_format={'type': 'json_object'},
)
```
The exact problem was that […]
Fixed here: #6748
Amazing, thanks! Can you let me know once it's released on PyPI?
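Once the fix is on PyPI (pip install -U litellm), a quick sanity check is to rerun the repro above and parse the content; a sketch:

```python
import json
import litellm

response = litellm.completion(
    model='claude-3-5-sonnet-20241022',
    messages=[
        {'role': 'system', 'content': 'Your output should be a JSON object with no additional properties. '},
        {'role': 'user', 'content': 'Respond with this in json. city=San Francisco, state=CA, weather=sunny, temp=60'},
    ],
    response_format={'type': 'json_object'},
)

# json.loads raises if the returned content is not valid JSON.
data = json.loads(response.choices[0].message.content)
print(data)
```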
The Feature
Currently, you only support `json` mode for a small number of models: https://docs.litellm.ai/docs/completion/json_mode
Motivation, pitch
I would need to be able to do the same with Anthropic models, without having to tell the model in the prompt how to format its JSON output; a sketch of the kind of call this would enable follows.
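For illustration, the kind of call this feature asks for, using the OpenAI-style json_schema response_format (the weather schema is hypothetical, chosen to match the repro above):

```python
import litellm

response = litellm.completion(
    model="claude-3-5-sonnet-20241022",
    messages=[
        {"role": "user", "content": "city=San Francisco, state=CA, weather=sunny, temp=60"}
    ],
    # OpenAI-style json_schema response_format; the schema itself is illustrative.
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "weather",
            "strict": True,
            "schema": {
                "type": "object",
                "properties": {
                    "city": {"type": "string"},
                    "state": {"type": "string"},
                    "weather": {"type": "string"},
                    "temp": {"type": "number"},
                },
                "required": ["city", "state", "weather", "temp"],
                "additionalProperties": False,
            },
        },
    },
)
print(response.choices[0].message.content)
```

With strict schema enforcement, the caller gets back validated JSON without any "respond in JSON" prompt engineering, which is the ask here.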
Twitter / LinkedIn details
No response