release: 0.39.0 #723

Merged 8 commits on Nov 4, 2024
2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
{
".": "0.38.0"
".": "0.39.0"
}
2 changes: 1 addition & 1 deletion .stats.yml
@@ -1,2 +1,2 @@
configured_endpoints: 10
openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/anthropic-d95f5b98650cf1d0a75bd514eaa6705bef41aa89e8fe37e849ccdde57a91aaa2.yml
openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/anthropic-25f83d91f601c1962b3701fedf829f678f306aca0758af286ee1586cc9931f75.yml
30 changes: 30 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,35 @@
# Changelog

## 0.39.0 (2024-11-04)

Full Changelog: [v0.38.0...v0.39.0](https://github.com/anthropics/anthropic-sdk-python/compare/v0.38.0...v0.39.0)

### ⚠ BREAKING CHANGES

* **client:** remove legacy `client.count_tokens()` method ([#726](https://github.com/anthropics/anthropic-sdk-python/issues/726))

### Features

* **api:** add new haiku model ([#731](https://github.com/anthropics/anthropic-sdk-python/issues/731)) ([77eaaf9](https://github.com/anthropics/anthropic-sdk-python/commit/77eaaf9c76f9b267706c830a5f7c1d81df6013d9))
* **project:** drop support for Python 3.7 ([#729](https://github.com/anthropics/anthropic-sdk-python/issues/729)) ([7f897e2](https://github.com/anthropics/anthropic-sdk-python/commit/7f897e253ae09e6a85fe64ba8004c2c3a8133e4e))


### Bug Fixes

* don't use dicts as iterables in transform ([#724](https://github.com/anthropics/anthropic-sdk-python/issues/724)) ([62bb863](https://github.com/anthropics/anthropic-sdk-python/commit/62bb8636a3d7156bc0caab5f574b1fa72445cead))
* support json safe serialization for basemodel subclasses ([#727](https://github.com/anthropics/anthropic-sdk-python/issues/727)) ([5be855e](https://github.com/anthropics/anthropic-sdk-python/commit/5be855e20f40042f59e839c7747dd994dc88c456))
* **types:** add missing token-counting-2024-11-01 ([#722](https://github.com/anthropics/anthropic-sdk-python/issues/722)) ([c549736](https://github.com/anthropics/anthropic-sdk-python/commit/c5497360a385f5dbaa5ab775bc19a0d7eee713bc))


### Documentation

* **readme:** mention new token counting endpoint ([#728](https://github.com/anthropics/anthropic-sdk-python/issues/728)) ([72a4636](https://github.com/anthropics/anthropic-sdk-python/commit/72a4636a7798170d69e7551ba58a0213d82d1711))


### Refactors

* **client:** remove legacy `client.count_tokens()` method ([#726](https://github.com/anthropics/anthropic-sdk-python/issues/726)) ([14e4244](https://github.com/anthropics/anthropic-sdk-python/commit/14e4244752b656cedfe7d160088e9744d07470a1))

## 0.38.0 (2024-11-01)

Full Changelog: [v0.37.1...v0.38.0](https://github.com/anthropics/anthropic-sdk-python/compare/v0.37.1...v0.38.0)
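The breaking change above removes the tokenizer-based `client.count_tokens()` helper. A minimal migration sketch, based on the README example added later in this PR (the model name is simply the one used there), might look like:

```py
import anthropic

client = anthropic.Anthropic()

# Before 0.39.0 (removed): local, tokenizer-based, and only a rough estimate
# for recent models:
# count = client.count_tokens("Hello, world")

# From 0.39.0: server-side counting via the beta Messages API, taking the
# same `messages` list as `client.messages.create()`:
count = client.beta.messages.count_tokens(
    model="claude-3-5-sonnet-20241022",
    messages=[{"role": "user", "content": "Hello, world"}],
)
print(count.input_tokens)
```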
18 changes: 15 additions & 3 deletions README.md
@@ -2,7 +2,7 @@

[![PyPI version](https://img.shields.io/pypi/v/anthropic.svg)](https://pypi.org/project/anthropic/)

The Anthropic Python library provides convenient access to the Anthropic REST API from any Python 3.7+
The Anthropic Python library provides convenient access to the Anthropic REST API from any Python 3.8+
application. It includes type definitions for all request params and response fields,
and offers both synchronous and asynchronous clients powered by [httpx](https://github.com/encode/httpx).

@@ -165,7 +165,7 @@ Alternatively, you can use `client.messages.create(..., stream=True)` which only

## Token counting

You can see the exact usage for a given request through the `usage` response property, e.g.
To get the token count for a message without creating it, you can use the `client.beta.messages.count_tokens()` method. This takes the same `messages` list as the `.create()` method.

```py
count = client.beta.messages.count_tokens(
model="claude-3-5-sonnet-20241022",
messages=[
{"role": "user", "content": "Hello, world"}
]
)
count.input_tokens # 10
```

You can also see the exact usage for a given request through the `usage` response property, e.g.

```py
message = client.messages.create(...)
@@ -670,7 +682,7 @@ print(anthropic.__version__)

## Requirements

Python 3.7 or higher.
Python 3.8 or higher.

## Contributing

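The `usage` snippet above is truncated by the diff viewer; a short sketch of what reading the property looks like (field names follow the public `Usage` type on message responses, and the model name is only illustrative):

```py
import anthropic

client = anthropic.Anthropic()

message = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello, world"}],
)

# Exact counts for the request that was actually made
print(message.usage.input_tokens)
print(message.usage.output_tokens)
```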
8 changes: 3 additions & 5 deletions pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "anthropic"
version = "0.38.0"
version = "0.39.0"
description = "The official Python library for the anthropic API"
dynamic = ["readme"]
license = "MIT"
@@ -15,14 +15,12 @@ dependencies = [
"distro>=1.7.0, <2",
"sniffio",
"cached-property; python_version < '3.8'",
"tokenizers >= 0.13.0",
"jiter>=0.4.0, <1",
]
requires-python = ">= 3.7"
requires-python = ">= 3.8"
classifiers = [
"Typing :: Typed",
"Intended Audience :: Developers",
"Programming Language :: Python :: 3.7",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
@@ -146,7 +144,7 @@ filterwarnings = [
# there are a couple of flags that are still disabled by
# default in strict mode as they are experimental and niche.
typeCheckingMode = "strict"
pythonVersion = "3.7"
pythonVersion = "3.8"

exclude = [
"_dev",
20 changes: 0 additions & 20 deletions requirements-dev.lock
@@ -30,9 +30,6 @@ cachetools==5.5.0
certifi==2023.7.22
# via httpcore
# via httpx
# via requests
charset-normalizer==3.4.0
# via requests
colorlog==6.7.0
# via nox
dirty-equals==0.6.0
@@ -44,10 +41,7 @@ exceptiongroup==1.2.2
# via anyio
# via pytest
filelock==3.12.4
# via huggingface-hub
# via virtualenv
fsspec==2024.10.0
# via huggingface-hub
google-auth==2.35.0
# via anthropic
h11==0.14.0
@@ -57,12 +51,9 @@ httpcore==1.0.2
httpx==0.25.2
# via anthropic
# via respx
huggingface-hub==0.26.1
# via tokenizers
idna==3.4
# via anyio
# via httpx
# via requests
importlib-metadata==7.0.0
iniconfig==2.0.0
# via pytest
@@ -82,7 +73,6 @@ nodeenv==1.8.0
# via pyright
nox==2023.4.22
packaging==23.2
# via huggingface-hub
# via nox
# via pytest
platformdirs==3.11.0
@@ -109,10 +99,6 @@ python-dateutil==2.8.2
# via time-machine
pytz==2023.3.post1
# via dirty-equals
pyyaml==6.0.2
# via huggingface-hub
requests==2.32.3
# via huggingface-hub
respx==0.20.2
rich==13.7.1
rsa==4.9
@@ -129,13 +115,9 @@ sniffio==1.3.0
# via anyio
# via httpx
time-machine==2.9.0
tokenizers==0.20.1
# via anthropic
tomli==2.0.2
# via mypy
# via pytest
tqdm==4.66.5
# via huggingface-hub
types-awscrt==0.23.0
# via botocore-stubs
types-s3transfer==0.10.3
@@ -144,13 +126,11 @@ typing-extensions==4.12.2
# via anthropic
# via anyio
# via boto3-stubs
# via huggingface-hub
# via mypy
# via pydantic
# via pydantic-core
urllib3==1.26.20
# via botocore
# via requests
virtualenv==20.24.5
# via nox
zipp==3.17.0
22 changes: 0 additions & 22 deletions requirements.lock
@@ -25,17 +25,10 @@ cachetools==5.5.0
certifi==2023.7.22
# via httpcore
# via httpx
# via requests
charset-normalizer==3.4.0
# via requests
distro==1.8.0
# via anthropic
exceptiongroup==1.2.2
# via anyio
filelock==3.16.1
# via huggingface-hub
fsspec==2024.10.0
# via huggingface-hub
google-auth==2.35.0
# via anthropic
h11==0.14.0
@@ -44,19 +37,14 @@ httpcore==1.0.2
# via httpx
httpx==0.25.2
# via anthropic
huggingface-hub==0.26.1
# via tokenizers
idna==3.4
# via anyio
# via httpx
# via requests
jiter==0.6.1
# via anthropic
jmespath==1.0.1
# via boto3
# via botocore
packaging==24.1
# via huggingface-hub
pyasn1==0.6.1
# via pyasn1-modules
# via rsa
@@ -68,10 +56,6 @@ pydantic-core==2.23.4
# via pydantic
python-dateutil==2.9.0.post0
# via botocore
pyyaml==6.0.2
# via huggingface-hub
requests==2.32.3
# via huggingface-hub
rsa==4.9
# via google-auth
s3transfer==0.10.3
@@ -82,16 +66,10 @@ sniffio==1.3.0
# via anthropic
# via anyio
# via httpx
tokenizers==0.20.1
# via anthropic
tqdm==4.66.5
# via huggingface-hub
typing-extensions==4.12.2
# via anthropic
# via anyio
# via huggingface-hub
# via pydantic
# via pydantic-core
urllib3==1.26.20
# via botocore
# via requests
6 changes: 0 additions & 6 deletions scripts/test
@@ -57,9 +57,3 @@ rye run pytest "$@"

echo "==> Running Pydantic v1 tests"
rye run nox -s test-pydantic-v1 -- "$@"

# this is a separate script instead of a pytest test as we can't rely on the
# execution order, so a tokenizer test could be executed before this check which
# invalidates everything
echo "==> Verifying that \`tokenizers\` is lazily imported"
rye run python -c 'import anthropic, sys; assert "tokenizers" not in sys.modules; print("true")'
41 changes: 0 additions & 41 deletions src/anthropic/_client.py
@@ -28,11 +28,6 @@
from ._version import __version__
from ._streaming import Stream as Stream, AsyncStream as AsyncStream
from ._exceptions import APIStatusError
from ._tokenizers import (
TokenizerType, # type: ignore[import]
sync_get_tokenizer,
async_get_tokenizer,
)
from ._base_client import (
DEFAULT_MAX_RETRIES,
DEFAULT_CONNECTION_LIMITS,
@@ -267,24 +262,6 @@ def copy(
# client.with_options(timeout=10).foo.create(...)
with_options = copy

def count_tokens(
self,
text: str,
) -> int:
"""Count the number of tokens in a given string.
Note that this is only accurate for older models, e.g. `claude-2.1`. For newer
models this can only be used as a _very_ rough estimate, instead you should rely
on the `usage` property in the response for exact counts.
"""
# Note: tokenizer is untyped
tokenizer = self.get_tokenizer()
encoded_text = tokenizer.encode(text) # type: ignore
return len(encoded_text.ids) # type: ignore

def get_tokenizer(self) -> TokenizerType:
return sync_get_tokenizer()

@override
def _make_status_error(
self,
@@ -531,24 +508,6 @@ def copy(
# client.with_options(timeout=10).foo.create(...)
with_options = copy

async def count_tokens(
self,
text: str,
) -> int:
"""Count the number of tokens in a given string.
Note that this is only accurate for older models, e.g. `claude-2.1`. For newer
models this can only be used as a _very_ rough estimate, instead you should rely
on the `usage` property in the response for exact counts.
"""
# Note: tokenizer is untyped
tokenizer = await self.get_tokenizer()
encoded_text = tokenizer.encode(text) # type: ignore
return len(encoded_text.ids) # type: ignore

async def get_tokenizer(self) -> TokenizerType:
return await async_get_tokenizer()

@override
def _make_status_error(
self,
6 changes: 4 additions & 2 deletions src/anthropic/_compat.py
@@ -2,7 +2,7 @@

from typing import TYPE_CHECKING, Any, Union, Generic, TypeVar, Callable, cast, overload
from datetime import date, datetime
from typing_extensions import Self
from typing_extensions import Self, Literal

import pydantic
from pydantic.fields import FieldInfo
@@ -137,9 +137,11 @@ def model_dump(
exclude_unset: bool = False,
exclude_defaults: bool = False,
warnings: bool = True,
mode: Literal["json", "python"] = "python",
) -> dict[str, Any]:
if PYDANTIC_V2:
if PYDANTIC_V2 or hasattr(model, "model_dump"):
return model.model_dump(
mode=mode,
exclude=exclude,
exclude_unset=exclude_unset,
exclude_defaults=exclude_defaults,
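For context, the compat helper now forwards a `mode` argument and prefers the instance's own `model_dump()` whenever it exists (Pydantic v2, or a `BaseModel` subclass that defines one), only then falling back to the deprecated v1 `.dict()`. A rough, self-contained sketch of that dispatch pattern, not the SDK's actual helper:

```py
from typing import Any, Literal

import pydantic


def dump_model(model: pydantic.BaseModel, mode: Literal["json", "python"] = "python") -> dict[str, Any]:
    # Prefer the v2-style API when the instance provides it; this also covers
    # BaseModel subclasses that implement model_dump() themselves.
    if hasattr(model, "model_dump"):
        return model.model_dump(mode=mode)
    # Pydantic v1 fallback: .dict() has no `mode` parameter, so any json-safe
    # coercion has to happen separately (see the _models.py change below).
    return model.dict()
```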
9 changes: 6 additions & 3 deletions src/anthropic/_models.py
@@ -37,6 +37,7 @@
PropertyInfo,
is_list,
is_given,
json_safe,
lru_cache,
is_mapping,
parse_date,
@@ -279,8 +280,8 @@ def model_dump(
Returns:
A dictionary representation of the model.
"""
if mode != "python":
raise ValueError("mode is only supported in Pydantic v2")
if mode not in {"json", "python"}:
raise ValueError("mode must be either 'json' or 'python'")
if round_trip != False:
raise ValueError("round_trip is only supported in Pydantic v2")
if warnings != True:
@@ -289,7 +290,7 @@
raise ValueError("context is only supported in Pydantic v2")
if serialize_as_any != False:
raise ValueError("serialize_as_any is only supported in Pydantic v2")
return super().dict( # pyright: ignore[reportDeprecated]
dumped = super().dict( # pyright: ignore[reportDeprecated]
include=include,
exclude=exclude,
by_alias=by_alias,
Expand All @@ -298,6 +299,8 @@ def model_dump(
exclude_none=exclude_none,
)

return cast(dict[str, Any], json_safe(dumped)) if mode == "json" else dumped

@override
def model_dump_json(
self,
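The `json_safe` helper imported above isn't shown in this diff; assuming it does roughly what the name suggests, a stand-in that recursively coerces non-JSON types (such as `datetime`) could look like the sketch below, which is the behaviour that lets `model_dump(mode="json")` work on Pydantic v1 models:

```py
import json
from datetime import date, datetime, timezone
from typing import Any


def json_safe(data: Any) -> Any:
    # Stand-in for the helper referenced in the diff: recursively convert
    # values that json.dumps() cannot handle into plain JSON types.
    if isinstance(data, dict):
        return {key: json_safe(value) for key, value in data.items()}
    if isinstance(data, (list, tuple)):
        return [json_safe(value) for value in data]
    if isinstance(data, (datetime, date)):
        return data.isoformat()
    return data


dumped = json_safe({"created_at": datetime.now(timezone.utc), "tags": ("a", "b")})
print(json.dumps(dumped))  # serializable without a custom encoder
```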