404 Client Error when using Azure OpenAI endpoints #959
Hi @briantani - I believe this should be fixed by #982, which will be part of the upcoming 0.3.2 release (you can also grab it right now via the nightly package). If this still persists, please let me know by reopening the issue.
I installed pymemgpt-nightly-0.3.1.dev20240212103909 and ran into the same issue. This is the error message I get when I run my code using the AutoGen integration:

--------------------------------------------------------------------------------
Traceback (most recent call last):
File "C:\Users\tanibr\projects\automemgpt\app.py", line 171, in <module>
user_proxy.initiate_chat(manager, message=prompt_1)
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 793, in initiate_chat
self.send(self.generate_init_message(**context), recipient, silent=silent)
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 504, in send
recipient.receive(message, self, request_reply, silent)
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 679, in receive
reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 1637, in generate_reply
final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\autogen\agentchat\groupchat.py", line 526, in run_chat
reply = speaker.generate_reply(sender=self)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 1637, in generate_reply
final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\autogen\memgpt_agent.py", line 162, in _generate_reply_for_user_message
) = self.agent.step(user_message, first_message=False, skip_verify=self.skip_verify)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\agent.py", line 676, in step
raise e
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\agent.py", line 608, in step
response = self._get_ai_reply(
^^^^^^^^^^^^^^^^^^^
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\agent.py", line 359, in _get_ai_reply
raise e
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\agent.py", line 339, in _get_ai_reply
response = create(
^^^^^^^
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\llm_api_tools.py", line 352, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\llm_api_tools.py", line 461, in create
return azure_openai_chat_completions_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\llm_api_tools.py", line 288, in azure_openai_chat_completions_request
raise http_err
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\llm_api_tools.py", line 277, in azure_openai_chat_completions_request
response.raise_for_status() # Raises HTTPError for 4XX/5XX status
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\requests\models.py", line 1021, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: model_error for url: https://{resource}.openai.azure.com/openai/deployments/gpt-4/chat/completions?api-version=2023-03-15-preview

And this is the error I get when I run `memgpt run`:

An exception occurred when running agent.step():
Traceback (most recent call last):
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\main.py", line 347, in run_agent_loop
new_messages, user_message, skip_next_user_input = process_agent_step(user_message, no_verify)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\main.py", line 323, in process_agent_step
new_messages, heartbeat_request, function_failed, token_warning, tokens_accumulated = memgpt_agent.step(
^^^^^^^^^^^^^^^^^^
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\agent.py", line 676, in step
raise e
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\agent.py", line 596, in step
response = self._get_ai_reply(
^^^^^^^^^^^^^^^^^^^
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\agent.py", line 359, in _get_ai_reply
raise e
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\agent.py", line 339, in _get_ai_reply
response = create(
^^^^^^^
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\llm_api_tools.py", line 352, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\llm_api_tools.py", line 461, in create
return azure_openai_chat_completions_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\llm_api_tools.py", line 288, in azure_openai_chat_completions_request
raise http_err
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\memgpt\llm_api_tools.py", line 277, in azure_openai_chat_completions_request
response.raise_for_status() # Raises HTTPError for 4XX/5XX status
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\tanibr\Anaconda3\envs\memgpt\Lib\site-packages\requests\models.py", line 1021, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: model_error for url: https://{resource}.openai.azure.com/openai/deployments/gpt-4/chat/completions?api-version=2023-03-15-preview
? Retry agent.step()? (Y/n)

Is there anything I can provide to help better understand the issue? Or is it possible that, even though I installed the nightly version, it didn't include the update yet for some reason?
@briantani thank you for the update! My guess is that the credentials aren't saved properly (this was patched, but you'll still need to re-run `memgpt configure` to re-save them). Your saved credentials should have Azure fields covering the endpoint, key, API version, and deployment names (replace any placeholder values with your own).
If you're missing any of these fields, try setting the values there and then running MemGPT again. If you're not missing any of these fields, let me know and I can try to debug your stack trace.
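For reference, a minimal sanity-check sketch is below. The environment variable names are assumptions based on the endpoint, key, API version, and deployment referenced in this thread, not an authoritative list of what MemGPT reads:

# Hypothetical sanity check: confirm the Azure-related values are present
# before re-running `memgpt configure`. Variable names are assumptions.
import os

required = {
    "AZURE_OPENAI_KEY": "API key for the Azure OpenAI resource",
    "AZURE_OPENAI_ENDPOINT": "e.g. https://{resource}.openai.azure.com",
    "AZURE_OPENAI_VERSION": "API version string, e.g. a recent preview",
    "AZURE_OPENAI_DEPLOYMENT": "chat deployment name, e.g. gpt-4",
}

missing = [name for name in required if not os.environ.get(name)]
if missing:
    for name in missing:
        print(f"missing {name} ({required[name]})")
else:
    print("All Azure values are set.")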
The credentials have the correct values, and re-running `memgpt configure` didn't change anything. It seems to be trying to call chat completions (not even embeddings) and it fails. The AutoGen portion of the code, however, is able to contact the chat completions endpoint with no problem. So I placed a breakpoint where the request fails and inspected the response body:

{
"error": {
"message": "Unrecognized request argument supplied: tools",
"type": "invalid_request_error",
"param": null,
"code": null
}
}

I reproduced it using curl:

curl https://{resource}.openai.azure.com/openai/deployments/gpt-4/chat/completions?api-version=2023-05-15 -H "Content-Type: application/json" -H "api-key: ..." -d '@package.json'
{
"error": {
"message": "Unrecognized request argument supplied: tools",
"type": "invalid_request_error",
"param": null,
"code": null
}
}

The package.json was created by dumping the request data to a file. It seems the chat endpoint does not understand the `tools` argument.
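A Python equivalent of the curl reproduction above, for reference. This is only a sketch: the resource name, key, and deployment are placeholders, and the message and tool contents are illustrative, not the exact payload MemGPT sends:

# Reproduce the rejection of `tools` against an older api-version.
import requests

resource = "{resource}"      # placeholder Azure resource name
deployment = "gpt-4"         # chat deployment name from the failing URL
api_version = "2023-05-15"   # older API version that rejects `tools`
url = (
    f"https://{resource}.openai.azure.com/openai/deployments/"
    f"{deployment}/chat/completions?api-version={api_version}"
)
body = {
    "messages": [{"role": "user", "content": "hello"}],
    # the request argument the older API version does not recognize
    "tools": [{
        "type": "function",
        "function": {"name": "send_message", "parameters": {"type": "object", "properties": {}}},
    }],
}
resp = requests.post(url, headers={"api-key": "..."}, json=body)
print(resp.status_code)  # 4XX on this deployment/version combination
print(resp.json())       # error: "Unrecognized request argument supplied: tools"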
Ah OK, this makes more sense now - it seems like you're using an older Azure OpenAI API version (your curl call uses 2023-05-15). According to the Azure OpenAI REST API docs, the `tools` argument is only supported by the more recent API versions. Is it possible for you to upgrade your Azure OpenAI API version to one of those? (The docs list the available versions, their retirement dates - e.g. 2023-06-01-preview retires on 2024-04-02 - and the corresponding Swagger specs.) If not, no worries - I can also add some sort of fallback inside the MemGPT code that checks whether you're using an Azure API version too old to support `tools`.
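A rough sketch of what such a fallback could look like. This is not MemGPT's actual code; the cutoff version and the field mapping are assumptions:

# If the configured Azure api-version predates `tools` support, rewrite the
# request to use the legacy `functions` / `function_call` fields instead.
TOOLS_MIN_API_VERSION = "2023-12-01-preview"  # assumed earliest version accepting `tools`

def downgrade_tools_if_needed(request_body: dict, api_version: str) -> dict:
    # Naive lexicographic comparison; good enough for same-shaped ISO-dated versions.
    if api_version >= TOOLS_MIN_API_VERSION:
        return request_body  # new enough: send `tools`/`tool_choice` unchanged
    body = dict(request_body)
    tools = body.pop("tools", None)
    if tools:
        # Legacy format expects bare function specs under `functions`.
        body["functions"] = [t["function"] for t in tools if t.get("type") == "function"]
    tool_choice = body.pop("tool_choice", None)
    if tool_choice is not None:
        # Simplified mapping from `tool_choice` to the legacy `function_call` field.
        body["function_call"] = (
            tool_choice if isinstance(tool_choice, str) else tool_choice.get("function", tool_choice)
        )
    return body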
Using a more recent API preview did the trick and solves my problem. I had been using the same API version everywhere for consistency, but didn't know about the `tools` requirement since I wasn't testing tools yet. Thanks for taking the time to help me.
Awesome, really glad to hear it @briantani! Thank you so much for your detailed bug reporting.
Describe the bug
I use `memgpt configure` to set up the Azure OpenAI backend with the required environment variables. Then I run `memgpt run` and try to converse, and the 404 Client Error appears. I'm able to use this Azure OpenAI resource with AutoGen, for instance, with no problem, using essentially the same configuration.
I also tried using the AutoGen integration code, and I get the same 404 Client Error when the MemGPT agent attempts to call the API. The other agents respond as expected.
Please describe your setup
pip install -U pymemgpt
How are you running memgpt? (cmd.exe/Powershell/Anaconda Shell/Terminal)