
Feature request for the possibility of adding user_id to the trace while using Haystack<>Langfuse connector #916

Closed
uvdepanda opened this issue Jul 23, 2024 · 5 comments
Labels: feature request (Ideas to improve an integration), integration:langfuse, P3

Comments

@uvdepanda

Hi there,

It seems there is currently no way to send a user_id to the trace while using the Haystack<>Langfuse connector. It would be lovely if you could add this feature to your roadmap.
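For reference, the low-level Langfuse Python SDK (v2) lets you attach a user_id when creating a trace directly; a minimal sketch of what we would like to be able to express through the connector (the trace name here is just an example):

from langfuse import Langfuse

langfuse = Langfuse()

# Attach the user identifier directly to the trace; this is what we would like
# the Haystack connector to expose per pipeline run.
trace = langfuse.trace(name="chat-example", user_id="user-123")
print(trace.id)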

Thanks.

Best regards,
Yubraj

@uvdepanda added the feature request label on Jul 23, 2024
@vblagoje
Member

vblagoje commented Sep 4, 2024

Let me make sure I understand you correctly, @uvdepanda, using an example of Langfuse tracing:

import os

# Content tracing must be enabled so prompts and replies are captured in the trace.
os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"

from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.connectors.langfuse import LangfuseConnector

if __name__ == "__main__":

    # Minimal chat pipeline with the Langfuse tracer attached.
    pipe = Pipeline()
    pipe.add_component("tracer", LangfuseConnector("Chat example"))
    pipe.add_component("prompt_builder", ChatPromptBuilder())
    pipe.add_component("llm", OpenAIChatGenerator(model="gpt-3.5-turbo"))

    pipe.connect("prompt_builder.prompt", "llm.messages")

    messages = [
        ChatMessage.from_system("Always respond in German even if some input data is in other languages."),
        ChatMessage.from_user("Tell me about {{location}}"),
    ]

    # Each run produces a Langfuse trace; the tracer component returns its URL.
    response = pipe.run(data={"prompt_builder": {"template_variables": {"location": "Berlin"}, "template": messages}})
    print(response["llm"]["replies"][0])
    print(response["tracer"]["trace_url"])

What you would like is a payload, say a dict of key/value pairs, that is passed to the LangfuseConnector, i.e. the "tracer" component, for each run invocation?

So that run invocation becomes:

response = pipe.run(
    data={
        "prompt_builder": {"template_variables": {"location": "Berlin"}, "template": messages},
        "tracer": {"id": {"user_id": "123"}},
    }
)

Is that correct?

@uvdepanda
Author

uvdepanda commented Sep 6, 2024

@vblagoje That is exactly what I would like to have. As of now, we are achieving it by doing the following:

Current Solution:

from langfuse import Langfuse

langfuse = Langfuse()
if context['user_id']:  # context comes from our application layer
    # Extract the trace id from the URL returned by the tracer component,
    # then attach the user identifier to that trace after the fact.
    trace_url = response["tracer"]["trace_url"]
    trace_id = trace_url.split('/')[-1]
    langfuse.trace(id=trace_id, session_id=context['user_id'])

It works, but from time to time we run into the following:

Problem:
All of our traces end up being appended to a single trace (one long trail of traces) instead of being standalone.

It would be cool if you could point out the right way of doing it.
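(For reference, one variant we have considered but not verified, assuming the Langfuse v2 Python SDK: passing the identifier as user_id rather than session_id, since Langfuse groups traces that share a session_id into one session, which might explain the long trail we see. A rough sketch; tag_trace_with_user is a hypothetical helper:)

from langfuse import Langfuse

langfuse = Langfuse()

def tag_trace_with_user(response, user_id):
    # The trace id is the last path segment of the trace URL returned by the tracer.
    trace_url = response["tracer"]["trace_url"]
    trace_id = trace_url.split("/")[-1]
    # Attach the identifier as user_id so each trace stays standalone; using
    # session_id would group every run for this user into one Langfuse session.
    langfuse.trace(id=trace_id, user_id=user_id)
    # Flush pending events so the update is sent even in short-lived scripts.
    langfuse.flush()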

Thanks.

@julian-risch added the P3 label on Sep 9, 2024
@vblagoje
Member

vblagoje commented Sep 17, 2024

Hey @uvdepanda @julian-risch, I experimented a bit with how we can solve this one easily and effectively. We can do this:

  1. Add an invocation_context: Dict[str, Any] run parameter to LangfuseConnector
  2. For every pipeline run invocation, users can set this parameter with whatever they need to identify that particular run
  3. The invocation context is then available in the tracer, as shown in the screenshot below
[Screenshot (2024-09-17): Langfuse trace showing the invocation context recorded for the run]

In the example above, my pipeline run invocation looks like:

response = pipe.run(
    data={
        "prompt_builder": {"template_variables": {"location": "Berlin"}, "template": messages},
        "tracer": {"invocation_context": {"user": "123"}},
    }
)

Users can pass whatever they need to identify a particular pipeline run, and it will be properly recorded in the Langfuse trace.
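For example, an application could wrap this in a small helper so every run is tagged with the caller's user id; a rough sketch assuming the proposed parameter lands (run_for_user is a hypothetical helper, not part of the integration):

def run_for_user(pipe, messages, location, user_id):
    # Pass the identifier per run; the tracer forwards the whole dict to the trace.
    return pipe.run(
        data={
            "prompt_builder": {"template_variables": {"location": location}, "template": messages},
            "tracer": {"invocation_context": {"user_id": user_id}},
        }
    )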

LMK your thoughts.

@vblagoje
Member

@uvdepanda you can try the proposal at #1089

@vblagoje
Member

vblagoje commented Oct 1, 2024

@vblagoje closed this as completed on Oct 1, 2024