
Create LLM agents with long-term memory and custom tools 📚🦙


MemGPT logo

Try out our MemGPT chatbot on Discord!

Discord arxiv 2310.08560 Documentation

Create perpetual chatbots 🤖 with self-editing memory!


MemGPT demo video

Chat with your data 🗃️ - try talking to the LlamaIndex API docs!

MemGPT demo video for llamaindex api docs search

ChatGPT (GPT-4) when asked the same question:

GPT-4 when asked about llamaindex api docs
(Question from run-llama/llama_index#7756)

Quick setup

Join Discord and message the MemGPT bot (in the #memgpt channel). Then run the following commands (messaged to "MemGPT Bot"):

  • /profile (to create your profile)
  • /key (to enter your OpenAI key)
  • /create (to create a MemGPT chatbot)

Make sure your privacy settings on this server are open so that MemGPT Bot can DM you:
MemGPT β†’ Privacy Settings β†’ Direct Messages set to ON

set DMs settings on MemGPT server to be open in MemGPT so that MemGPT Bot can message you

You can see the full list of available commands when you enter / into the message box.

MemGPT Bot slash commands

What is MemGPT?

Memory-GPT (or MemGPT for short) is a system that intelligently manages different memory tiers in LLMs in order to provide extended context beyond the LLM's limited context window. For example, MemGPT knows when to push critical information to a vector database and when to retrieve it later in the chat, enabling perpetual conversations. Learn more about MemGPT in our paper.
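The tiering idea can be sketched in a few lines of plain Python. This is a toy illustration, not MemGPT's actual implementation: the class and method names are hypothetical, and real MemGPT retrieves from archival storage by embedding similarity against a vector database, not by keyword match.

```python
class TieredMemoryAgent:
    """Toy sketch of MemGPT-style memory tiers (hypothetical API)."""

    def __init__(self, context_limit):
        self.context_limit = context_limit
        self.main_context = []  # what fits inside the LLM's context window
        self.archival = []      # unbounded external storage (a vector DB in MemGPT)

    def add_message(self, msg):
        self.main_context.append(msg)
        # When the context window fills up, evict the oldest messages
        # into archival storage instead of dropping them.
        while len(self.main_context) > self.context_limit:
            self.archival.append(self.main_context.pop(0))

    def archival_search(self, keyword):
        # MemGPT uses embedding similarity; keyword match keeps the toy simple.
        return [m for m in self.archival if keyword.lower() in m.lower()]

agent = TieredMemoryAgent(context_limit=2)
for msg in ["My name is Alice.", "I live in Berlin.", "I like hiking."]:
    agent.add_message(msg)
print(agent.main_context)              # ['I live in Berlin.', 'I like hiking.']
print(agent.archival_search("alice"))  # ['My name is Alice.']
```

In the real system, the LLM itself decides when to call memory functions like these, so memory management happens inside the conversation rather than by a fixed eviction rule.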

Running MemGPT locally

Install dependencies:

pip install -r requirements.txt

Add your OpenAI API key to your environment:

# on Linux/Mac
export OPENAI_API_KEY=YOUR_API_KEY
# on Windows
set OPENAI_API_KEY=YOUR_API_KEY

To run MemGPT as a conversational agent in CLI mode, simply run main.py:

python3 main.py

To create a new starter user or starter persona (that MemGPT gets initialized with), create a new .txt file in /memgpt/humans/examples or /memgpt/personas/examples, then use the --persona or --human flag when running main.py. For example:

# assuming you created a new file /memgpt/humans/examples/me.txt
python main.py
# Select me.txt during configuration process

-- OR --

# assuming you created a new file /memgpt/humans/examples/me.txt
python main.py --human me.txt
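The file-creation step can be done entirely from the shell; the persona/human text below is purely illustrative.

```shell
# Create a custom "human" file (the contents here are just an example)
mkdir -p memgpt/humans/examples
cat > memgpt/humans/examples/me.txt <<'EOF'
Name: Sam
Occupation: software engineer
Interests: distributed systems, hiking
EOF
# then launch with:
# python main.py --human me.txt
```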

GPT-3.5 support

You can run MemGPT with GPT-3.5 as the LLM instead of GPT-4:

python main.py
# Select gpt-3.5 during configuration process

-- OR --

python main.py --model gpt-3.5-turbo

Note that this is experimental gpt-3.5-turbo support. It's quite buggy compared to gpt-4, but it should be runnable.

Please report any bugs you encounter regarding MemGPT running on GPT-3.5 to letta-ai#59.

Local LLM support

You can run MemGPT with local LLMs too. See instructions here, and report any bugs or suggest improvements in letta-ai#67.

main.py flags

--first
  allows you to send the first message in the chat (by default, MemGPT will send the first message)
--debug
  enables debugging output
Legacy configuration flags:
--model
  select which model to use ('gpt-4', 'gpt-3.5-turbo-0613', 'gpt-3.5-turbo')
--persona
  load a specific persona file
--human
  load a specific human file
--archival_storage_faiss_path=<ARCHIVAL_STORAGE_FAISS_PATH>
  load in document database (backed by FAISS index)
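Under the hood, a FAISS index stores embedding vectors for your document chunks and returns the nearest neighbors to a query embedding. Below is a stdlib-only sketch of the same brute-force L2 lookup; the filenames and vectors are made up, and a real setup would embed text with a model and use FAISS for speed.

```python
import math

def l2(a, b):
    """Euclidean distance between two vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Toy "document embeddings" (real ones come from an embedding model).
docs = {
    "intro.txt": [0.9, 0.1, 0.0],
    "api.txt":   [0.1, 0.8, 0.3],
    "faq.txt":   [0.0, 0.2, 0.9],
}

def search(query_vec, k=2):
    # FAISS's flat L2 index performs exactly this ranking, just much faster.
    ranked = sorted(docs, key=lambda name: l2(docs[name], query_vec))
    return ranked[:k]

print(search([0.0, 0.9, 0.2]))  # ['api.txt', 'faq.txt']
```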

Interactive CLI commands

These are the commands for the CLI, not the Discord bot! The Discord bot has separate commands, which you can see in Discord by typing /.

While using MemGPT via the CLI you can run the following commands:

//
  toggle multiline input mode
/exit
  exit the CLI
/save
  save a checkpoint of the current agent/conversation state
/load
  load a saved checkpoint
/dump
  view the current message log (see the contents of main context)
/memory
  print the current contents of agent memory
/pop
  undo the last message in the conversation
/heartbeat
  send a heartbeat system message to the agent
/memorywarning
  send a memory warning system message to the agent

Use MemGPT to talk to your Database!

MemGPT's archival memory lets you load your database and talk to it! To motivate this use case, we have included a toy example.

Consider the test.db already included in the repository.

id name age
1 Alice 30
2 Bob 25
3 Charlie 35

To talk to this database, run:

python main_db.py 

Then enter the path to your database and your query:

Please enter the path to the database. test.db
...
Enter your message: How old is Bob?
...
🤖 Bob is 25 years old.
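The toy table can be reproduced with Python's built-in sqlite3 module. Note the table name people and the exact schema here are assumptions for illustration; check main_db.py for what it actually expects.

```python
import sqlite3

conn = sqlite3.connect("test.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS people (id INTEGER PRIMARY KEY, name TEXT, age INTEGER)"
)
conn.executemany(
    "INSERT INTO people (id, name, age) VALUES (?, ?, ?)",
    [(1, "Alice", 30), (2, "Bob", 25), (3, "Charlie", 35)],
)
conn.commit()

# The agent answers questions by issuing queries like this one:
row = conn.execute("SELECT age FROM people WHERE name = ?", ("Bob",)).fetchone()
print(row[0])  # 25
conn.close()
```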

Support

If you have any further questions, or have anything to share, we are excited to hear your feedback!

  • By default MemGPT will use gpt-4, so your API key will require gpt-4 API access
  • For issues and feature requests, please open a GitHub issue or message us on our #support channel on Discord

Datasets

Datasets used in our paper can be downloaded at Hugging Face.

🚀 Project Roadmap

  • Release MemGPT Discord bot demo (perpetual chatbot)
  • Add additional workflows (load SQL/text into MemGPT external context)
  • Integration tests
  • Integrate with AutoGen (discussion)
  • Add official gpt-3.5-turbo support (discussion)
  • CLI UI improvements (issue)
  • Add support for other LLM backends (issue, discussion)
  • Release MemGPT family of open models (e.g. a finetuned Mistral) (discussion)

Development

You can install MemGPT from source with:

git clone git@github.com:cpacker/MemGPT.git
poetry shell
poetry install

We recommend installing pre-commit to ensure proper formatting during development:

pip install pre-commit
pre-commit install
pre-commit run --all-files

Contributing

We welcome pull requests! Please run the formatter before submitting a pull request:

poetry run black . -l 140
