Commit 4073667: Update local_llm_settings.md (letta-ai#765)

cpacker authored and norton120 committed Feb 15, 2024
1 parent 323ccd1 commit 4073667
Showing 1 changed file with 4 additions and 4 deletions.

8 changes: 4 additions & 4 deletions docs/local_llm_settings.md
@@ -61,11 +61,11 @@ Let's try changing the temperature to `1.0`. In our `completions_api_settings.js

```json
{
"temp": 1.0
"temperature": 1.0
}
```

- Note how we're using the naming conventions from llama.cpp. In this case, using `"temperature"` instead of `"temp"` will also work.
+ Note how we're using the naming conventions from llama.cpp. In this case, using `"temp"` instead of `"temperature"` will also work.

Now if we save the file and start a new agent chat with `memgpt run`, we'll notice that the LM Studio server logs now say `"temp": 1.0`:

@@ -96,7 +96,7 @@ If your parameters are getting picked up correctly, they will be output to the t
Found completion settings file '/Users/user/.memgpt/settings/completions_api_settings.json', loading it...
Updating base settings with the following user settings:
{
"temp": 1.0
"temperature": 1.0
}
...(truncated)...
```
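The log above shows the user file being overlaid onto MemGPT's base settings. That merge can be pictured as a plain dictionary update — a hypothetical sketch of the behavior the log suggests, not MemGPT's actual loader (the function name and signature here are invented):

```python
import json

def load_completion_settings(base_settings, settings_path):
    """Overlay user completions API settings onto the defaults.

    Hypothetical sketch of the merge suggested by the log output;
    the real MemGPT loader may differ.
    """
    with open(settings_path) as f:
        user_settings = json.load(f)  # e.g. {"temperature": 1.0}
    merged = dict(base_settings)
    merged.update(user_settings)  # user values override the defaults
    return merged
```

Any key present in `completions_api_settings.json` wins over the corresponding default; keys you don't set keep their base values.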
@@ -143,7 +143,7 @@ Now copy the following to your `completions_api_settings.json` file:
{
"top_k": 1,
"top_p": 0,
"temp": 0,
"temperature": 0,
"repeat_penalty": 1.18,
"seed": -1,
"tfs_z": 1,
...(truncated)...
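These keys follow llama.cpp's sampler naming. With `top_k` set to `1` and `temperature` at `0`, sampling collapses to greedy decoding: the single most likely token is always chosen. The sketch below is purely illustrative of what those two parameters control; it is not MemGPT or llama.cpp code:

```python
import math
import random

def sample_token(logits, temperature=1.0, top_k=0):
    """Pick a token index from raw logits.

    Illustrative only: top_k=1 or temperature<=0 degenerates
    to greedy decoding (always the highest-logit token).
    """
    indexed = sorted(enumerate(logits), key=lambda p: p[1], reverse=True)
    if top_k > 0:
        indexed = indexed[:top_k]  # keep only the k most likely tokens
    if temperature <= 0 or len(indexed) == 1:
        return indexed[0][0]  # greedy: highest-logit token
    # softmax with temperature, then a weighted random draw
    scaled = [logit / temperature for _, logit in indexed]
    m = max(scaled)
    weights = [math.exp(s - m) for s in scaled]
    ids = [i for i, _ in indexed]
    return random.choices(ids, weights=weights, k=1)[0]
```

Under these fully greedy settings the generation is deterministic, so the `seed` value (where `-1` normally asks llama.cpp for a random seed) no longer affects the output.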
