How to add links as a response to LLM model #248
adityaworkfusion started this conversation in General
Replies: 1 comment 1 reply
-
Hi,
Could you kindly help me with the need below and how it can be done.
Need: I want my LLM model to provide links along with the text in its response. Is it possible for the text-bison-001 model to return links for the question asked?
I am using Chroma embeddings with LLMs; my dataset contains documentation texts.
@polong-lin, appreciate your time here.
Thanks,
Aditya
-
One way to do this: when you look up the embeddings, also look up the documentation source of each embedding; you can then feed that link as part of the prompt to the LLM and do some prompt engineering so it returns the documentation link alongside the response. You can also look into RetrievalQAWithSourcesChain from LangChain, and we have an example of doing this with Vertex AI Search.
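A minimal sketch of the RetrievalQAWithSourcesChain route, assuming a Chroma store built through LangChain, Vertex AI embeddings, and the text-bison model; the document texts, URLs, and model names below are placeholders, so swap in your own corpus and credentials:

```python
# Sketch only: attach a "source" URL to each chunk's metadata so the chain
# can surface the documentation link alongside the generated answer.
# Texts, URLs, and model names here are placeholders, not the original example.
from langchain.chains import RetrievalQAWithSourcesChain
from langchain.embeddings import VertexAIEmbeddings
from langchain.llms import VertexAI
from langchain.vectorstores import Chroma

docs = [
    "Chunk long documentation before embedding it into the vector store.",
    "RetrievalQAWithSourcesChain returns an answer plus the source links.",
]
metadatas = [
    {"source": "https://example.com/docs/chunking"},          # placeholder URL
    {"source": "https://example.com/docs/qa-with-sources"},   # placeholder URL
]

# Build the Chroma store; each chunk carries its documentation link in metadata.
vectorstore = Chroma.from_texts(
    docs,
    embedding=VertexAIEmbeddings(),
    metadatas=metadatas,
)

# The chain retrieves relevant chunks, answers the question, and reports
# the "source" values of the chunks it used.
chain = RetrievalQAWithSourcesChain.from_chain_type(
    llm=VertexAI(model_name="text-bison@001"),  # assumed model identifier
    chain_type="stuff",
    retriever=vectorstore.as_retriever(),
)

result = chain({"question": "How do I return links along with answers?"})
print(result["answer"])   # generated answer text
print(result["sources"])  # documentation links pulled from chunk metadata
```

The key is the `source` field in each chunk's metadata: the chain carries it through retrieval, so the response comes back together with the documentation links that grounded it.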