This repository has been archived by the owner on Sep 30, 2024. It is now read-only.

preInstructions get Cody prompts flagged #63293

Open
vdavid opened this issue Jun 17, 2024 · 2 comments

Comments

@vdavid
Contributor

vdavid commented Jun 17, 2024

Context:
We have an option to add a preInstruction text to Cody in the VS Code extension. Cody adds the preInstruction to the prompt here.

Problem:
We appear to be banning users who use the preInstruction config. Anecdotally it seems quite widespread: we've seen 3 cases in the last 24h.

Solution ideas:

  1. Disable preInstructions and ship a stable patch release. If this is an experimental feature (need to check!) then this is OK; otherwise it's a bad idea.
  2. Move the preInstruction into a separate message.
  3. Check our logic in Cody Gateway to confirm why prompts with the preInstruction are being flagged.
@chrsmith
Contributor

Are you sure this is the problem? Cody Gateway just checks that the prompt contains a specific string. So adding anything in addition to the prompt prefix won't cause any problems. (It doesn't even need to be a prefix.)
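To illustrate the point above, a minimal sketch of a containment-style check, assuming the gateway only tests whether the prompt contains a known string (the function and constant names here are hypothetical, not the actual Gateway code):

```typescript
// Hypothetical sketch: the check passes as long as the prompt *contains*
// one of the known prompt strings, so text added before or after the
// known intro does not break it. Names are illustrative only.
const KNOWN_PROMPTS: string[] = [
    'You are Cody, an AI coding assistant from Sourcegraph.',
]

function isKnownPrompt(prompt: string): boolean {
    return KNOWN_PROMPTS.some(known => prompt.includes(known))
}

// A preInstruction appended after the intro still contains the known string:
isKnownPrompt(
    'You are Cody, an AI coding assistant from Sourcegraph. Always answer in French.'
) // true
```

Under that assumption, a prompt with a preInstruction appended would still pass, since `includes` does not require the known string to be a prefix.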

@chrsmith
Contributor

Yes, this seems highly unlikely to cause any problems.

  1. We only append the preInstruction to the intro sent in the LLM prefix.
    const intro = ps`You are Cody, an AI coding assistant from Sourcegraph. ${
        preInstruction ?? ''
    }`.trim()
  2. We then look for existing prompt strings here:
    https://github.com/sourcegraph/sourcegraph/blob/f716d7aa11bad3bf2d2be74b1d91cd3e535942d0/cmd/cody-gateway/internal/httpapi/completions/flagging.go#L80

  3. With the actual configuration data here:
    https://github.com/sourcegraph/infrastructure/blob/main/cody-gateway/envs/prod/cloudrun/main.tf#L54

So using this wouldn't cause the unknown_prompt flag to trigger. However, the request could still get flagged if the preInstruction contained a "forbidden phrase", or if it were large enough to push the request over the token-size limit.
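The three flagging conditions discussed above could be sketched as follows. This is an illustrative TypeScript model, not the actual Go implementation in flagging.go; the interface, function names, and the 4-chars-per-token estimate are all assumptions:

```typescript
// Illustrative model of the three flagging conditions discussed in this
// thread. All names and the token heuristic are hypothetical.
interface FlaggingConfig {
    knownPrompts: string[]      // prompts the gateway recognizes
    forbiddenPhrases: string[]  // phrases that flag a request outright
    maxPromptTokens: number     // rough size limit on the request
}

function flagReasons(prompt: string, cfg: FlaggingConfig): string[] {
    const reasons: string[] = []

    // 1. unknown_prompt: prompt contains none of the known strings
    if (!cfg.knownPrompts.some(p => prompt.includes(p))) {
        reasons.push('unknown_prompt')
    }

    // 2. forbidden phrase anywhere in the prompt (case-insensitive)
    const lower = prompt.toLowerCase()
    if (cfg.forbiddenPhrases.some(p => lower.includes(p.toLowerCase()))) {
        reasons.push('forbidden_phrase')
    }

    // 3. token size: crude estimate of ~4 characters per token
    if (prompt.length / 4 > cfg.maxPromptTokens) {
        reasons.push('token_limit')
    }

    return reasons
}
```

In this model, a preInstruction appended to the known intro triggers no flag on its own, but can trigger conditions 2 or 3 depending on its content and length, which matches the diagnosis above.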
