Support for generate and return tool_calls with Anthropic models #50
base: main
Conversation
Changes are good.
@bigbernnn
Thanks for making this change. A minor suggestion to improve readability; looks good otherwise.
"role": role, | ||
"content": message_content, | ||
} | ||
) | ||
return system, formatted_messages |
A lot of this can be improved with early returns and by refactoring into separate functions. I have not tested the validity of this change, so do check it against the existing logic before taking these changes as-is. Adding some unit tests to check the validity of these would be even better.
def _handle_anthropic_system_message(message):
    if not isinstance(message.content, str):
        raise ValueError(
            "System message must be a string, "
            f"instead was: {type(message.content)}"
        )
    return message.content, _message_type_lookups["human"]


def _handle_anthropic_message(message):
    role = _message_type_lookups[message.type]
    if isinstance(message.content, str):
        return message.content, role
    assert isinstance(
        message.content, list
    ), "Anthropic message content must be str or list of dicts"
    content = []
    for item in message.content:
        if isinstance(item, str):
            content.append({"type": "text", "text": item})
        elif isinstance(item, dict):
            if "type" not in item:
                raise ValueError("Dict content item must have a type key")
            if item["type"] == "image_url":
                source = _format_image(item["image_url"]["url"])
                content.append({"type": "image", "source": source})
            else:
                content.append(item)
        else:
            raise ValueError(
                f"Content items must be str or dict, instead was: {type(item)}"
            )
    return content, role


def _format_anthropic_messages(
    messages: List[BaseMessage],
) -> Tuple[Optional[str], List[Dict]]:
    """Format messages for Anthropic."""
    if isinstance(messages, BaseMessage):
        # Single-message input: return early rather than iterating.
        if messages.type == "system":
            system, _ = _handle_anthropic_system_message(messages)
            return system, []
        content, role = _handle_anthropic_message(messages)
        return None, [{"role": role, "content": content}]
    system = None
    formatted_messages = []
    for i, message in enumerate(messages):
        if message.type == "system":
            if i != 0:
                raise ValueError("System message must be at beginning of message list.")
            system, _ = _handle_anthropic_system_message(message)
            continue
        content, role = _handle_anthropic_message(message)
        formatted_messages.append({"role": role, "content": content})
    return system, formatted_messages
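To illustrate the intended behavior of the suggestion above, here is a minimal, self-contained sketch of the list-handling path. The `Msg` class and the `_message_type_lookups` table are stand-ins assumed for illustration; the real code uses LangChain's `BaseMessage` and the module-level lookup table.

```python
from dataclasses import dataclass
from typing import List, Union

# Assumed stand-ins for illustration only.
_message_type_lookups = {"human": "user", "ai": "assistant"}

@dataclass
class Msg:
    type: str
    content: Union[str, List]

def format_messages(messages):
    """Mimics _format_anthropic_messages for plain-text content."""
    system = None
    formatted = []
    for i, m in enumerate(messages):
        if m.type == "system":
            if i != 0:
                raise ValueError("System message must be at beginning of message list.")
            if not isinstance(m.content, str):
                raise ValueError(
                    f"System message must be a string, instead was: {type(m.content)}"
                )
            system = m.content
            continue
        formatted.append({"role": _message_type_lookups[m.type], "content": m.content})
    return system, formatted

system, msgs = format_messages([Msg("system", "You are terse."), Msg("human", "Hi")])
# system == "You are terse."; msgs == [{"role": "user", "content": "Hi"}]
```

A unit test of this shape would also catch the list-valued system-message error reported later in this thread.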
Implemented and tested.
Hi, is there anything we can do to help get this PR merged?
Hi, is the tool_choice argument of the bind_tools method functioning? I would like to force the LLM to use a tool, but it does not seem to work. Are there any workarounds? Thank you in advance.
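For context on the question above: the Anthropic Messages API expresses forced tool use via a `tool_choice` object of the form `{"type": "tool", "name": ...}`. Whether ChatBedrock's `bind_tools` forwards this field depends on this PR, so the payload below is a sketch of the request shape only, not of the library's wiring.

```python
def build_tool_choice(name=None):
    """Return an Anthropic-style tool_choice payload.

    With no name, the model decides whether to call a tool ("auto");
    with a name, the model is forced to call that specific tool.
    """
    if name is None:
        return {"type": "auto"}
    return {"type": "tool", "name": name}

# Hypothetical request fragment; the tool name is illustrative.
payload = {
    "model": "anthropic.claude-3-sonnet-20240229-v1:0",
    "tool_choice": build_tool_choice("get_weather"),
}
```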
For those coming across this PR in search of a solution: I had an issue where the tools were not actually being called. I have created a PR against the fork created by @bigbernnn, which can be found here. This has resolved my related issue here.
Can this please be reviewed and merged? I can't switch my existing codebase to Bedrock due to this error:

# llm = ChatOpenAI(model="gpt-4o", openai_api_key=OPENAI_API_KEY)  # works fine
# llm = ChatAnthropic(model="claude-3-opus-20240229")  # works fine
llm = ChatBedrock(
    model_id="anthropic.claude-3-sonnet-20240229-v1:0",
    model_kwargs=dict(temperature=0),
)  # raises:
ValueError: System message must be a string, instead was: <class 'list'>
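The error above is raised because `_format_anthropic_messages` only accepts string-valued system content. Until the PR is merged, one possible workaround is to flatten list-style system content to a plain string before building the message list. This helper is hypothetical, not part of the library:

```python
def flatten_system_content(content):
    """Collapse list-style system content into the plain string the
    Anthropic message formatter expects (hypothetical workaround)."""
    if isinstance(content, str):
        return content
    parts = []
    for item in content:
        if isinstance(item, str):
            parts.append(item)
        elif isinstance(item, dict) and item.get("type") == "text":
            parts.append(item["text"])
        else:
            raise ValueError(f"Cannot flatten system item: {item!r}")
    return "\n".join(parts)
```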
The proposed changes include:
1/ Ability to use tools with .generate()
2/ Returning stop_reason and tool_calls to AIMessage in the response metadata when using tools

Support for function calling via the generate function is not directly implemented in ChatBedrock.
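As a rough illustration of point 2/ above (a sketch, not the PR's actual implementation): Anthropic responses carry tool invocations as `tool_use` content blocks, which can be mapped into `tool_calls` alongside `stop_reason` roughly like this.

```python
def extract_tool_calls(response: dict) -> dict:
    """Map Anthropic tool_use content blocks to tool_calls metadata."""
    tool_calls = [
        {"name": b["name"], "args": b["input"], "id": b["id"]}
        for b in response.get("content", [])
        if b.get("type") == "tool_use"
    ]
    return {"stop_reason": response.get("stop_reason"), "tool_calls": tool_calls}

# Sample response shape for illustration (ids and names are made up).
sample = {
    "stop_reason": "tool_use",
    "content": [
        {"type": "text", "text": "Let me check."},
        {"type": "tool_use", "id": "toolu_1", "name": "get_weather",
         "input": {"city": "Paris"}},
    ],
}
meta = extract_tool_calls(sample)
```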