MemGPT AutoGen with Local LLM Error #720
I tried creating a new environment with all the recommended steps; still the same API error. Here is the full error:

OpenAIError                               Traceback (most recent call last)
File ~/Documents/DataScience/installs/MemGPT/memgpt/autogen/memgpt_agent.py:95, in create_memgpt_autogen_agent_from_config(name, system_message, is_termination_msg, max_consecutive_auto_reply, human_input_mode, function_map, code_execution_config, llm_config, nonmemgpt_llm_config, default_auto_reply, interface_kwargs, skip_verify)
File ~/Documents/DataScience/installs/MemGPT/memgpt/autogen/memgpt_agent.py:184, in create_autogen_memgpt_agent(agent_config, skip_verify, interface, interface_kwargs, persistence_manager, persistence_manager_kwargs, default_auto_reply, is_termination_msg)
File ~/Documents/DataScience/installs/MemGPT/memgpt/autogen/memgpt_agent.py:204, in MemGPTAgent.__init__(self, name, agent, skip_verify, concat_other_agent_messages, is_termination_msg, default_auto_reply)
File ~/.conda/envs/memgpt/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py:117, in ConversableAgent.__init__(self, name, system_message, is_termination_msg, max_consecutive_auto_reply, human_input_mode, function_map, code_execution_config, llm_config, default_auto_reply)
File ~/.conda/envs/memgpt/lib/python3.11/site-packages/autogen/oai/client.py:83, in OpenAIWrapper.__init__(self, config_list, **base_config)
File ~/.conda/envs/memgpt/lib/python3.11/site-packages/autogen/oai/client.py:138, in OpenAIWrapper._client(self, config, openai_config)
File ~/.conda/envs/memgpt/lib/python3.11/site-packages/openai/_client.py:92, in OpenAI.__init__(self, api_key, organization, base_url, timeout, max_retries, default_headers, default_query, http_client, _strict_response_validation)
OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
This error happens because AutoGen uses the openai package (even when using local LLMs), and the openai package requires OPENAI_API_KEY to be set to something, even if it's just a dummy value. First do:

export OPENAI_API_KEY="null"

Then do:

python agent_docs.py
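For scripted runs, the same workaround can be applied in Python, as long as the variable is set before AutoGen constructs its OpenAI client. A minimal sketch (the "null" value is an arbitrary placeholder; any non-empty string should satisfy the check when requests go to a local backend):

```python
import os

# AutoGen's OpenAIWrapper instantiates openai.OpenAI(), which raises
# OpenAIError when no API key is available. For local LLM backends a
# dummy value is enough to get past that constructor check.
os.environ["OPENAI_API_KEY"] = "null"

# Build the agent only AFTER the variable is set, e.g.:
# from memgpt.autogen.memgpt_agent import create_memgpt_autogen_agent_from_config
# memgpt_agent = create_memgpt_autogen_agent_from_config(...)
print(os.environ["OPENAI_API_KEY"])  # -> null
```

Note the ordering: if the agent-construction imports run first and a client is created at import time, setting the variable afterwards is too late for that client.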
Describe the bug
When trying to run examples from the autogen directory with local LLMs (WebUI and LM Studio), I get the following OpenAI API error:
File "/home/user1/Documents/DataScience/installs/MemGPT/memgpt/autogen/examples/agent_docs.py", line 148, in <module>
memgpt_agent = create_memgpt_autogen_agent_from_config(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user1/.conda/envs/memgpt/lib/python3.11/site-packages/memgpt/autogen/memgpt_agent.py", line 95, in create_memgpt_autogen_agent_from_config
autogen_memgpt_agent = create_autogen_memgpt_agent(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user1/.conda/envs/memgpt/lib/python3.11/site-packages/memgpt/autogen/memgpt_agent.py", line 184, in create_autogen_memgpt_agent
autogen_memgpt_agent = MemGPTAgent(
^^^^^^^^^^^^
File "/home/user1/.conda/envs/memgpt/lib/python3.11/site-packages/memgpt/autogen/memgpt_agent.py", line 204, in __init__
super().__init__(name)
File "/home/user1/.conda/envs/memgpt/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 117, in __init__
self.client = OpenAIWrapper(**self.llm_config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user1/.conda/envs/memgpt/lib/python3.11/site-packages/autogen/oai/client.py", line 83, in __init__
self._clients = [self._client(extra_kwargs, openai_config)]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user1/.conda/envs/memgpt/lib/python3.11/site-packages/autogen/oai/client.py", line 138, in _client
client = OpenAI(**openai_config)
^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user1/.conda/envs/memgpt/lib/python3.11/site-packages/openai/_client.py", line 92, in __init__
raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
Please describe your setup
OS: Ubuntu 22.04 desktop
HW: HP Z8 Fury with 4 RTX 6000 ADAs and 1TB RAM
Python version: 3.11.3 (I have also tried 3.10.9) with Conda
MemGPT version: 0.2.10
How did you install MemGPT? pip install pymemgpt, and also from source
Your terminal/shell (cmd.exe/Powershell/Anaconda Shell/Terminal): Terminal, and also SSH from a Windows machine

Screenshots
If applicable, add screenshots to help explain your problem.
Additional context
Add any other context about the problem here.
If you're not using OpenAI, please provide additional information on your local LLM setup:
Local LLM details
If you are trying to run MemGPT with local LLMs, please provide the following information:
dolphin-2.1-mistral-7b.Q6_K.gguf