print_llm_calls_summary() unexpectedly says No LLM calls were made #355
Comments
Hi @leehsueh! We need to investigate this. If everything works as expected and you see the right results, the only thing I can think of is that the "LLM callbacks" (e.g., […]). If you have a "normal" OpenAI key, can you test if using […]?
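To illustrate what the "LLM callbacks" remark above is getting at, here is a minimal, self-contained sketch of the callback-recording pattern that this kind of call logging relies on. The names (`LLMCallRecorder`, `on_llm_end`) are illustrative, not NeMo Guardrails internals: the point is that if the handler never gets registered with the LLM wrapper, the call list stays empty and the summary reports that no calls were made, even though the model was actually invoked.

```python
class LLMCallRecorder:
    """Collects one record per LLM call. If this handler is never
    registered with the LLM wrapper, llm_calls stays empty and the
    summary reports "No LLM calls were made" despite real calls."""

    def __init__(self):
        self.llm_calls = []

    def on_llm_end(self, prompt, completion):
        # Invoked by the LLM wrapper after each completion finishes.
        self.llm_calls.append({"prompt": prompt, "completion": completion})

    def print_summary(self):
        if not self.llm_calls:
            print("No LLM calls were made.")
        else:
            print(f"{len(self.llm_calls)} LLM call(s) were made.")


recorder = LLMCallRecorder()
recorder.print_summary()            # No LLM calls were made.
recorder.on_llm_end("hi", "hello")  # simulates a registered callback firing
recorder.print_summary()            # 1 LLM call(s) were made.
```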
Same issue.
Same issue. I just followed the official doc for setting up disallowed_topics.co. My config.yml:
prompts.yml:
config/rails/disallowed_topics.co:
and finally the Python file to run it:
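(The file contents above were not captured in this thread; for reference, the layout the comment describes, which matches the official topical-rails getting-started guide, is roughly the following. This is a sketch of the directory structure only, not the commenter's actual files.)

```
config/
├── config.yml
├── prompts.yml
└── rails/
    └── disallowed_topics.co
```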
I'd say it is most likely an error from LangChain. On top of not logging the LLM calls, it also returns an error about the logging. Pasting it here from the terminal:
The problem with seeing the LLM calls should have been fixed in #379. It was published with […]
@drazvan: the "TypeError: can only concatenate list (not "dict") to list" problem seems to be solved in 0.8.1.
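For readers hitting the TypeError quoted above: it is plain Python behavior, not anything Azure-specific. List concatenation with `+` only accepts another list, so code that appends a dict-shaped message directly to a list of prompt parts raises exactly this error. A minimal reproduction (the variable names are illustrative, not taken from the library's code):

```python
parts = ["You are a helpful assistant."]
message = {"role": "user", "content": "hi"}

try:
    # Concatenating a dict onto a list is invalid and raises TypeError.
    parts + message
except TypeError as exc:
    print(exc)  # can only concatenate list (not "dict") to list

# Wrapping the dict in a single-element list is the usual fix.
combined = parts + [message]
print(len(combined))  # 2
```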
@jackchan0528: can you check if #412 fixes this?
I'm trying to go through the getting-started tutorials, but I'm running with Azure OpenAI as the LLM endpoint. I was able to get the hello world example working with some dialog rails. However, after executing a message and running the
info = rails.explain()
portion, it says there are no LLM calls that were made, and the info.llm_calls
list is empty. Here's my code:
My config:
When I run the rails with verbose=True, I can see:
So I see the generate_user_intent action happening. This doesn't seem expected; could it be an issue specific to using Azure OpenAI?
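The contradiction the reporter describes, verbose logging shows the generate_user_intent action firing while the explain output says no calls were made, comes down to the summary reading from a call list that was never populated. A minimal, pure-Python stand-in for that summary logic (the `summarize_llm_calls` name and the "duration" record field are assumptions for illustration, not the library's actual schema):

```python
def summarize_llm_calls(llm_calls):
    """Return the one-line summary a caller would print. The empty-list
    branch is exactly the symptom reported in this issue."""
    if not llm_calls:
        return "No LLM calls were made."
    total = sum(call.get("duration", 0.0) for call in llm_calls)
    return f"{len(llm_calls)} LLM call(s) took {total:.2f} seconds."


print(summarize_llm_calls([]))  # No LLM calls were made.
print(summarize_llm_calls([{"task": "generate_user_intent", "duration": 0.5}]))
# 1 LLM call(s) took 0.50 seconds.
```

So an empty `info.llm_calls` does not prove the model was never called, only that nothing was recorded, which is consistent with the callback-registration bug later fixed in #379 rather than an Azure-specific problem.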