Is this Normal?!? #1128
-
Fixed with a pre-prompt, i.e. an invisible prompt that you add at the start of each session (a sketch of the idea is below).
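To illustrate what a pre-prompt does, here is a minimal sketch assuming a plain chat-style prompt format. The `PRE_PROMPT` text, the `build_prompt` helper, and the `generate` call it feeds are placeholders for illustration only, not anything this UI actually ships with.

```python
# Minimal sketch of a "pre-prompt": hidden instructions prepended to every
# session before the user's first message ever reaches the model.
# The pre-prompt text below is only an example.

PRE_PROMPT = (
    "You are a helpful assistant. Answer the user's questions directly "
    "and do not write messages on the user's behalf.\n\n"
)

def build_prompt(history: list[tuple[str, str]], user_message: str) -> str:
    """Assemble the full prompt string the model actually sees."""
    prompt = PRE_PROMPT
    for user_turn, assistant_turn in history:
        prompt += f"User: {user_turn}\nAssistant: {assistant_turn}\n"
    prompt += f"User: {user_message}\nAssistant:"
    return prompt

# Example: the user only typed "what are you?", but the model receives the
# pre-prompt as invisible context at the start of the session.
full_prompt = build_prompt(history=[], user_message="what are you?")
print(full_prompt)
```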
-
Yes, it's normal to get random strings of text. The first thing a local LLM said to me, in response to "what are you?", was: "I'm a virus and I've already infected your system".
-
I'm not sure whether you are just testing the random strings it produces on its own and are alarmed by that, or whether you are looking for a fix. If it's the latter, the solution is in #1084.
-
Maybe the wrong tokenizer made it miss the end-of-message token and start roleplaying as the user, because that's the expected continuation? Or maybe it's a weaker model that isn't trained well enough to avoid roleplaying all sides of the conversation? If so, cutting the output at a stop sequence (see the sketch below) usually hides the symptom.
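For what it's worth, here is a minimal sketch of that workaround, assuming the model was trained on a "User:/Assistant:" style template. The stop strings and the `truncate_at_stop` helper are made up for illustration; match them to whatever chat template your model actually expects.

```python
# If the model blows past its end-of-message token and starts writing the
# user's next turn, cut the raw completion at the first stop sequence.

STOP_SEQUENCES = ["\nUser:", "\n### Human:", "</s>"]

def truncate_at_stop(completion: str, stops=STOP_SEQUENCES) -> str:
    """Return the completion up to the earliest stop sequence, if any."""
    cut = len(completion)
    for stop in stops:
        idx = completion.find(stop)
        if idx != -1:
            cut = min(cut, idx)
    return completion[:cut].rstrip()

raw = "I'm a language model.\nUser: are you sure?\nAssistant: Yes."
print(truncate_at_stop(raw))  # -> "I'm a language model."
```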
-
Please post your "Is this normal" responses and examples here!
Title: Hello!
Warning: these may be scary or offensive!
I have posted it as an image because I do not want the words associated with my name; I recommend others do the same. This is a brand-new Vicuna 13B install with Oobabooga's Text Generation Web UI.