NeMo Guardrails depends on torch/CUDA to generate embeddings when using SentenceTransformers models, but that dependency isn't needed in other cases. Can we move the torch import into the SentenceTransformers class and make it optional for users who only use OpenAI embedding models?
Yes, we can definitely do that. Can you test whether this is the only change that is needed?
Making SentenceTransformers an optional dependency is on the roadmap, but it might take another couple of months. If the changes you suggested are sufficient, they can be included in the 0.7.0 release at the end of this month.
Concretely, move the import from https://github.com/NVIDIA/NeMo-Guardrails/blob/develop/nemoguardrails/embeddings/basic.py#L19 to https://github.com/NVIDIA/NeMo-Guardrails/blob/develop/nemoguardrails/embeddings/basic.py#L117.
The main reason is that when installing nemo-guardrails, the container image gets really big because of torch and CUDA, even though we might not need them.
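The deferred-import pattern being proposed could look something like the sketch below: the heavy `sentence_transformers`/`torch` import moves from module level into the class that actually needs it, so users of API-based embeddings never trigger it. Class and attribute names here are illustrative, not the actual code in `nemoguardrails/embeddings/basic.py`.

```python
import importlib


class SentenceTransformerEmbeddings:
    """Embedding backend that imports its heavy dependency only on first use."""

    def __init__(self, model_name: str = "all-MiniLM-L6-v2"):
        self.model_name = model_name
        self._model = None  # loaded lazily in _ensure_model()

    def _ensure_model(self):
        if self._model is None:
            try:
                # The import happens here, not at module import time, so
                # OpenAI-only users never pull in torch/CUDA.
                st = importlib.import_module("sentence_transformers")
            except ImportError as exc:
                raise ImportError(
                    "sentence-transformers (which depends on torch) is needed "
                    "for local embeddings; install it or configure an "
                    "API-based embedding model instead."
                ) from exc
            self._model = st.SentenceTransformer(self.model_name)
        return self._model

    def encode(self, documents):
        return self._ensure_model().encode(documents)
```

With this shape, merely constructing the class (or importing its module) stays cheap; the `ImportError` with an actionable message is only raised if a local embedding is actually requested without the extra dependency installed.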