Inference doesn't work (both letta-free and openai) #2329

Open

quantumcthulhu opened this issue Jan 5, 2025 · 1 comment

Comments

@quantumcthulhu

Describe the bug
When I try to use the letta-free model, it returns this error:
"requests.exceptions.HTTPError: HTTP error occurred: 500 Server Error: Internal Server Error for url: https://inference.memgpt.ai/chat/completions | Status code: 500, Message: {"detail":"Internal server error (unpack): "}"

When I try to use the gpt4o model, it returns this error:

httpx_sse._exceptions.SSEError: Expected response header Content-Type to contain 'text/event-stream', got 'application/json'
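
For reference, this is roughly how I hit it every time (a minimal sketch using the 0.6.x Python client; the create_client / create_agent / send_message calls are my assumption of the relevant API here, and I'm assuming default model settings are already configured):

from letta import create_client

# Connect to letta (LocalClient by default; adjust to your setup)
client = create_client()

# Create an agent with whatever default model is configured (letta-free or gpt4o)
agent = client.create_agent(name="repro-agent")

# Any user message is enough to trigger the failing /chat/completions request
response = client.send_message(agent_id=agent.id, role="user", message="hello")
print(response)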

Please describe your setup

  • How did you install letta?
    • pip install letta
  • Describe your setup
    • What's your OS (Windows/MacOS/Linux)?
      MacOS

    • How are you running letta? (cmd.exe/Powershell/Anaconda Shell/Terminal)
      Terminal

Letta Config
Please attach your ~/.letta/config file or copy paste it below.

[defaults]
preset = memgpt_chat
persona = sam_pov
human = basic

[archival_storage]
type = sqlite
path = /Users/user/.letta

[recall_storage]
type = sqlite
path = /Users/user/.letta

[metadata_storage]
type = sqlite
path = /Users/user/.letta

[version]
letta_version = 0.6.7

@pgiki

pgiki commented Jan 18, 2025

I have also been facing this issue for a couple of days now. After debugging, I realized it happens when one of the messages sent through make_post_request has the assistant role but no tool_calls. Here is a patch that works around it for now; I hope someone has a better solution, but at least it resolves the unpack error. The updated function:

from typing import Any

import requests

from letta.utils import printd  # letta's debug-print helper


def make_post_request(url: str, headers: dict[str, str], data: dict[str, Any]) -> dict[str, Any]:
    printd(f"Sending request to {url}")
    try:
        # Workaround: drop assistant messages that carry no tool_calls, since
        # these are what trigger the 500 "unpack" error on the inference server.
        cleaned_messages = []
        for message in data.get("messages", []):
            role = message.get("role")
            tool_calls = message.get("tool_calls")
            if role == "assistant" and not tool_calls:
                continue
            cleaned_messages.append(message)

        data["messages"] = cleaned_messages

        # Make the POST request
        response = requests.post(url, headers=headers, json=data)
        printd(f"Response status code: {response.status_code}")
        response.raise_for_status()
        return response.json()
    except requests.exceptions.HTTPError as http_err:
        # Re-raise with the server's error body attached (the message format seen in the traceback above)
        raise requests.exceptions.HTTPError(f"HTTP error occurred: {http_err} | Status code: {response.status_code}, Message: {response.text}") from http_err