feat!: llama.cpp - unified support for tools + refactoring #1357
Conversation
Looks 99% there, some minor clarification needed
```python
formatted_msg["content"] = message.tool_call_result.result
return formatted_msg
text_contents = message.texts
```
We should externalize the next 15 or so lines of code into some util, as we repeat them in many chat generators. Not now, though.
Basically, this is very similar to the OpenAI format.
In the past we decided to move away from a single generic conversion function: this allows more freedom, but implies a bit of duplication.
```python
self.model_kwargs["tokenizer"] = tokenizer
if self._model is None:
    self._model = Llama(**self.model_kwargs)
```
Wait, this can easily fail, no? No one mandates that `model_kwargs` must be provided in `__init__`; maybe I'm off here...
This should be safe.
(I tried not to change `__init__` much. If I could start from scratch, I would have handled this differently.)

In `__init__` we have these assignments:

```python
model_kwargs = model_kwargs or {}
# ...
model_kwargs.setdefault("model_path", model)  # model is a mandatory init param
model_kwargs.setdefault("n_ctx", n_ctx)
model_kwargs.setdefault("n_batch", n_batch)
# ...
self.model_kwargs = model_kwargs
```
Am I missing something?
No, I did. That comment was hidden in GitHub and I didn't expand it. 👍
Related Issues
Proposed Changes:
- `Tool` abstraction (breaking)
- `to_dict`/`from_dict` methods
- `warm_up` (instead of `__init__`)

How did you test it?
CI, several new tests
Checklist
- The PR title uses one of the conventional commit types: `fix:`, `feat:`, `build:`, `chore:`, `ci:`, `docs:`, `style:`, `refactor:`, `perf:`, `test:`.