Bot Generates Multiple Replies Due to DEFAULT_MAX_TWEET_LENGTH #3394

Open

naiveai-dev opened this issue Feb 9, 2025 · 0 comments

Labels
bug Something isn't working
Eliza Version
latest v0.1.9

Issue Summary
When testing a bot that replies to its followers, I noticed that it sometimes generates multiple replies to a single tweet. The API logs indicate that the bot only makes two calls per response (one for the medium model and one for the large model). However, due to DEFAULT_MAX_TWEET_LENGTH, the generated response is split into multiple tweets to fit within the 280-character limit.

This is usually a minor compliance issue, but in some cases, the agent detects these split responses as incomplete sentences and attempts to generate additional replies repeatedly. As a result, the API sometimes makes 4 to 6 calls instead of 2, leading to excessive LLM token usage and increased costs.
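The splitting described above can be pictured with a short sketch (a hypothetical helper for illustration, not Eliza's actual implementation; only DEFAULT_MAX_TWEET_LENGTH comes from the codebase):

```typescript
const DEFAULT_MAX_TWEET_LENGTH = 280;

// Hypothetical splitter illustrating the behavior described above:
// break a long reply into tweet-sized chunks at word boundaries.
function splitIntoTweets(
    text: string,
    maxLength: number = DEFAULT_MAX_TWEET_LENGTH
): string[] {
    const chunks: string[] = [];
    let remaining = text.trim();
    while (remaining.length > maxLength) {
        // Cut at the last space before the limit so words stay intact.
        let cut = remaining.lastIndexOf(" ", maxLength);
        if (cut <= 0) cut = maxLength;
        chunks.push(remaining.slice(0, cut).trim());
        remaining = remaining.slice(cut).trim();
    }
    if (remaining.length > 0) chunks.push(remaining);
    return chunks;
}
```

Every chunk except the last typically ends mid-sentence, which matters for the failure mode in the steps below.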

Steps to Reproduce

  1. Enable the bot to reply to follower tweets.
  2. Observe API logs, which show only two calls being made.
  3. Due to DEFAULT_MAX_TWEET_LENGTH, the reply is split into multiple tweets.
  4. In some cases, the agent perceives the split content as incomplete and generates additional responses (see the sketch after this list).
  5. The bot may end up making 4–6 API calls per tweet, leading to unnecessary token consumption.
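The check that misfires in step 4 presumably looks something like this (my guess at the logic, not the actual continue.ts code):

```typescript
// Illustrative guess at the failure mode: a completeness heuristic that
// treats any chunk cut mid-sentence as an unfinished thought.
function looksIncomplete(tweetText: string): boolean {
    // Consider the reply "done" only if it ends in terminal punctuation.
    return !/[.!?]["')]?$/.test(tweetText.trim());
}

// A 400-character reply split at 280 characters usually ends mid-word or
// mid-sentence, so looksIncomplete(chunk) returns true and the agent
// issues another LLM call, repeating until a chunk happens to end cleanly.
```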

Expected Behavior

  • The bot should generate only one response per tweet and appropriately split it if needed, without triggering unnecessary additional responses.

Actual Behavior

  • The bot sometimes misinterprets split responses as incomplete and keeps generating additional replies, leading to multiple API calls (4–6 instead of 2).

Impact

  • High token usage, leading to increased API costs.
  • Potential rate limits or compliance issues with excessive replies to a single tweet.

Temporary Fixes

  • Set DEFAULT_MAX_TWEET_LENGTH to a much higher number (e.g., 500); see the example below.
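For example, assuming the MAX_TWEET_LENGTH environment override (verify the variable name against your version's .env.example):

```env
# Raise the split threshold so typical replies fit in a single tweet.
# Note this only hides the problem; longer replies will still be split.
MAX_TWEET_LENGTH=500
```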

I think this mainly comes from interactions.ts and continue.ts.
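One possible direction, sketched here with invented names rather than proposed as a patch: have the splitter mark replies it turned into a thread, so the continuation check can skip them.

```typescript
// Sketch of a fix idea (field and function names are invented):
// a reply that was deliberately split is complete by construction,
// so the incomplete-sentence heuristic should never apply to it.
interface ReplyResult {
    chunks: string[];
    wasSplit: boolean; // set by the splitter when the text exceeded maxLength
}

function shouldGenerateFollowUp(reply: ReplyResult): boolean {
    if (reply.wasSplit) return false;
    const lastChunk = reply.chunks[reply.chunks.length - 1] ?? "";
    return !/[.!?]["')]?$/.test(lastChunk.trim());
}
```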

Screenshots

(screenshot attached in the original issue)
naiveai-dev added the bug label on Feb 9, 2025