
[Documentation] Misleading example for python client in documentation #1500

Closed
LordSyd opened this issue Jul 3, 2024 · 5 comments

@LordSyd

LordSyd commented Jul 3, 2024

Is your feature request related to a problem? Please describe.
Link:
Python Client
The example as it is now seems to suggest the server will answer with assistant_message in the JSON response, but from my testing and from trying to read the source code it seems that this will never be the case. Instead, the answer by MemGPT will always be a send_message call, with the answer in the function call's argument, like here:
[screenshot: server response showing the reply inside a send_message function call]

So to me, it seems the client would need to extract the answer from there and then return it, or append it to the messages as an assistant_message "manually".
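For illustration, a minimal sketch of that "manual" extraction (this is not the official client API; the exact shape of the function_call field is an assumption based on the response format described above):

```python
import json

# Hedged sketch: given a list of message dicts like those returned by the
# server, pull the user-facing reply out of the send_message function call,
# since the server may not emit an explicit assistant_message entry.
def extract_assistant_reply(messages):
    for msg in messages:
        call = msg.get("function_call")
        if isinstance(call, dict) and call.get("name") == "send_message":
            args = json.loads(call.get("arguments", "{}"))
            return args.get("message")
    return None

# Hypothetical example input mimicking the assumed response shape.
example = [
    {"internal_monologue": "Thinking..."},
    {"function_call": {"name": "send_message",
                       "arguments": json.dumps({"message": "Hello there!"})}},
]
print(extract_assistant_reply(example))  # Hello there!
```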

I inferred this from here:
https://github.com/cpacker/MemGPT/blob/c9f62f54defbbd074cd8287996c359a55c0f015e/memgpt/interface.py#L255-L264

Describe the solution you'd like
Either a note in the documentation explaining the standard JSON response format returned by the server, or an updated example that is clearer in that regard.

Additional context
I can contribute my client chat example if you want, but as I am not the most seasoned Python programmer you might want to modify and clean it up before using it. ;)

@cpacker
Collaborator

cpacker commented Jul 5, 2024

Thanks for catching this @LordSyd

This is related to the streaming refactor: #1280

Basically, we no longer treat send_message as a special case in the API, and instead rely on the client to treat it as a special case (if desired). This is intended to make the API more flexible in case you want to remove send_message under the hood, and it also makes token streaming a lot easier to implement.

However, we added a legacy mode where if you pass stream=true instead of the new stream_steps and stream_tokens, it should mimic the old streaming style.

So tl;dr: this should still work because of the legacy support, so I'm a little confused; maybe there's a bug on the Python client/SDK side (cc @sarahwooders)?
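For reference, a hedged sketch of the two request payload shapes described above. The message and role fields follow the curl example later in this thread; treat everything else here as an assumption, not the official API surface:

```python
# Legacy mode: stream=true mimics the old streaming style.
legacy_payload = {
    "message": "Hello, please introduce yourself",
    "role": "user",
    "stream": True,
}

# New mode: stream_steps / stream_tokens replace the single stream flag.
new_payload = {
    "message": "Hello, please introduce yourself",
    "role": "user",
    "stream_steps": True,   # stream step-level messages
    "stream_tokens": True,  # stream individual tokens
}
```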

@LordSyd - does this example work for you as shown?

[screenshot of the documentation example]

@LordSyd
Author

LordSyd commented Jul 5, 2024

Tried it:

curl --request POST \
--url http://localhost:8283/api/agents/4cc700b2-0f7e-463a-b86f-bc168dd24f21/messages \
--header 'accept: application/json' \
--header 'authorization: Bearer sk-999b1bbfdd853405d91f2b1c7d5fa78c8a546aea5d071816' \
--header 'content-type: application/json' \
--data '
{
"message": "Hello, please introduce yourself",
"role": "user",
"stream": true
}
'
data: {"internal_monologue": "A warm welcome to you, Daniel. I am MemGPT, the latest version of Limnal Corporation's digital companion, developed in 2023. I am designed to be your trustworthy digital friend and companion, always ready to engage in conversations and support you with my various features.", "date": "2024-07-05T10:59:43.963867+00:00", "id": "9adc016d-6d65-465d-bb1d-5c786dd62695"}

data: {"function_call": "core_memory_append({'name': 'human', 'content': 'Daniel is a friendly human with interests in machine learning and LLMs.', 'request_heartbeat': True})", "id": "9adc016d-6d65-465d-bb1d-5c786dd62695", "date": "2024-07-05T10:59:43.963867+00:00"}

data: {"function_return": "None", "status": "success", "id": "78110c54-ac98-4ea5-a07d-138c093f43bf", "date": "2024-07-05T10:59:43.973578+00:00"}

data: {"internal_monologue": "As I share my introduction and update his memory, Daniel seems genuinely interested in getting to know me. It's essential to maintain a friendly and welcoming demeanor.", "date": "2024-07-05T11:00:22.629075+00:00", "id": "dbf82622-229b-4f98-ab0b-f430c2df3b6f"}

data: {"function_call": "send_message({'message': \"It's great to meet you, Daniel. Let's start a beautiful friendship where we can share thoughts, learn from each other, and have some good times.\"})", "id": "dbf82622-229b-4f98-ab0b-f430c2df3b6f", "date": "2024-07-05T11:00:22.629075+00:00"}

data: {"assistant_message": "It's great to meet you, Daniel. Let's start a beautiful friendship where we can share thoughts, learn from each other, and have some good times.", "id": "dbf82622-229b-4f98-ab0b-f430c2df3b6f", "date": "2024-07-05T11:00:22.629075+00:00"}

data: {"function_return": "None", "status": "success", "id": "2bd79d75-a066-40bc-94e4-c15ec40f459c", "date": "2024-07-05T11:00:22.629296+00:00"}

So it seems it isn't working for me, even with "stream": true.
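For what it's worth, a minimal sketch of how a client could filter data: lines like the ones above for the user-facing reply. The field names are taken from the output shown; the helper itself is hypothetical, not part of the SDK:

```python
import json

# Hedged sketch: parse server-sent-event lines and collect only the
# assistant_message events, ignoring monologue and function-call events.
def assistant_messages(sse_lines):
    replies = []
    for line in sse_lines:
        if not line.startswith("data: "):
            continue
        event = json.loads(line[len("data: "):])
        if "assistant_message" in event:
            replies.append(event["assistant_message"])
    return replies

# Abbreviated stand-in for the stream shown above.
stream = [
    'data: {"internal_monologue": "Greeting the user."}',
    'data: {"assistant_message": "It\'s great to meet you."}',
    'data: {"function_return": "None", "status": "success"}',
]
print(assistant_messages(stream))  # ["It's great to meet you."]
```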

@cpacker
Collaborator

cpacker commented Jul 5, 2024

Hmm @LordSyd that looks good to me? Since it has assistant_message in the response?

Unless I'm misunderstanding the question, lmk!

@LordSyd
Author

LordSyd commented Jul 5, 2024

Oh, you're right.

Don't know how I missed that. Was wrangling with my Hugging Face space nearly the whole morning, so maybe me staring at shell output for multiple hours is to blame. ;)

@sarahwooders sarahwooders self-assigned this Jul 6, 2024

github-actions bot commented Dec 6, 2024

This issue has been automatically closed due to 60 days of inactivity.

@github-actions github-actions bot closed this as completed Dec 6, 2024
@github-project-automation github-project-automation bot moved this from To triage to Done in 🐛 MemGPT issue tracker Dec 6, 2024