[Documentation] Misleading example for python client in documentation #1500
Comments
Thanks for catching this @LordSyd! This is related to the streaming refactor: #1280. Basically, we no longer treat […]. However, we added a legacy mode where, if you pass […].

So tldr this should still work because of the legacy support, so I'm a little confused - maybe there's a bug on the Python client/SDK side (cc @sarahwooders)?

@LordSyd - does this example work for you as shown?
Tried it:

curl --request POST \
  --url http://localhost:8283/api/agents/4cc700b2-0f7e-463a-b86f-bc168dd24f21/messages \
  --header 'accept: application/json' \
  --header 'authorization: Bearer sk-999b1bbfdd853405d91f2b1c7d5fa78c8a546aea5d071816' \
  --header 'content-type: application/json' \
  --data '
{
  "message": "Hello, please introduce yourself",
  "role": "user",
  "stream": true
}
'
data: {"internal_monologue": "A warm welcome to you, Daniel. I am MemGPT, the latest version of Limnal Corporation's digital companion, developed in 2023. I am designed to be your trustworthy digital friend and companion, always ready to engage in conversations and support you with my various features.", "date": "2024-07-05T10:59:43.963867+00:00", "id": "9adc016d-6d65-465d-bb1d-5c786dd62695"}
data: {"function_call": "core_memory_append({'name': 'human', 'content': 'Daniel is a friendly human with interests in machine learning and LLMs.', 'request_heartbeat': True})", "id": "9adc016d-6d65-465d-bb1d-5c786dd62695", "date": "2024-07-05T10:59:43.963867+00:00"}
data: {"function_return": "None", "status": "success", "id": "78110c54-ac98-4ea5-a07d-138c093f43bf", "date": "2024-07-05T10:59:43.973578+00:00"}
data: {"internal_monologue": "As I share my introduction and update his memory, Daniel seems genuinely interested in getting to know me. It's essential to maintain a friendly and welcoming demeanor.", "date": "2024-07-05T11:00:22.629075+00:00", "id": "dbf82622-229b-4f98-ab0b-f430c2df3b6f"}
data: {"function_call": "send_message({'message': \"It's great to meet you, Daniel. Let's start a beautiful friendship where we can share thoughts, learn from each other, and have some good times.\"})", "id": "dbf82622-229b-4f98-ab0b-f430c2df3b6f", "date": "2024-07-05T11:00:22.629075+00:00"}
data: {"assistant_message": "It's great to meet you, Daniel. Let's start a beautiful friendship where we can share thoughts, learn from each other, and have some good times.", "id": "dbf82622-229b-4f98-ab0b-f430c2df3b6f", "date": "2024-07-05T11:00:22.629075+00:00"}
data: {"function_return": "None", "status": "success", "id": "2bd79d75-a066-40bc-94e4-c15ec40f459c", "date": "2024-07-05T11:00:22.629296+00:00"} Seems to me it isn't working for me even with "stream: true". |
Hmm @LordSyd that looks good to me, since it has the `assistant_message` chunk in the output? Unless I'm misunderstanding the question, lmk!
Oh, you're right. Don't know how I missed that. I was wrangling with my Hugging Face space nearly the whole morning, so maybe staring at shell output for multiple hours is to blame. ;)
This issue has been automatically closed due to 60 days of inactivity.
Is your feature request related to a problem? Please describe.
Link: Python Client
The example as it is now seems to suggest the server will answer with `assistant_message` in the JSON response, but from my testing, and from trying to read the source code, it seems that this will never be the case. Instead, the answer by MemGPT will always be a `send_message` call, with the answer in the function call's argument, like here:

So to me, it seems the client would need to extract the answer from there and then return or append it to the message as an `assistant_message` "manually" (see the rough sketch after the link below). I inferred this from here:
https://github.com/cpacker/MemGPT/blob/c9f62f54defbbd074cd8287996c359a55c0f015e/memgpt/interface.py#L255-L264
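For illustration, a rough sketch of the kind of extraction I mean (untested; `extract_assistant_message` is just a name I made up, and the chunk shape is copied from the streamed response format shown above):

```python
import ast

def extract_assistant_message(chunk: dict):
    """Return the text passed to send_message(...), or None for other chunk types."""
    call = chunk.get("function_call", "")
    prefix = "send_message("
    if call.startswith(prefix) and call.endswith(")"):
        # The argument is a Python-style dict literal, e.g. {'message': "..."}
        args = ast.literal_eval(call[len(prefix):-1])
        return args.get("message")
    return None

# Example chunk shaped like the server's streamed output:
chunk = {"function_call": "send_message({'message': \"It's great to meet you, Daniel.\"})"}
print(extract_assistant_message(chunk))  # -> It's great to meet you, Daniel.
```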
Describe the solution you'd like
Either a note somewhere in the documentation stating what the standard JSON response format from the server is, or an updated example that is clearer in that regard.
Additional context
I can contribute my client chat example if you want, but as I am not the most seasoned Python programmer, you might want to modify and clean it up before using it. ;)