Very good experience to use LangChain together with TensorRT-LLM backend #591
Replies: 3 comments 2 replies
-
Hi, we are glad to hear that TensorRT-LLM can help you achieve great performance and provide value for your use case. More new features, enhancements, and optimizations are on the way, and we hope you will like them :) You are welcome to share any feedback with the TensorRT-LLM team. Thanks
-
@npuichigo thanks a lot for the great feedback! Do not hesitate if June (@juney-nvidia) or myself can help you in any way. We want everyone to have a good experience with TensorRT-LLM.
-
@npuichigo awesome to see this integration. Is this something that could be added to the langchain repository? |
-
I built a service (https://github.com/npuichigo/openai_trtllm) that provides an OpenAI-compatible endpoint for the TensorRT-LLM Triton backend, so it can be used together with LangChain or other popular community LLM tools, and it shows really good performance compared to FastChat. Thanks for the great work on TensorRT-LLM.
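Since the service speaks the OpenAI Chat Completions protocol, any OpenAI-style client can target it by swapping the base URL. A minimal stdlib-only sketch of building such a request is below; the base URL and model name (`ensemble` is a common Triton ensemble name) are placeholders for your own deployment, not values taken from the openai_trtllm docs.

```python
import json
from urllib import request

# Hypothetical address where openai_trtllm is serving; adjust to your deployment.
BASE_URL = "http://localhost:3000/v1"

def build_chat_request(model: str, prompt: str) -> request.Request:
    """Build an OpenAI-compatible /chat/completions request.

    The payload shape follows the OpenAI Chat Completions API, which an
    OpenAI-compatible proxy mirrors so that clients such as LangChain's
    ChatOpenAI can point at it via a custom base URL.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        url=f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# "ensemble" here is a placeholder model name for a Triton ensemble.
req = build_chat_request("ensemble", "Hello!")
```

To actually send the request you would pass `req` to `urllib.request.urlopen`, or simply point an OpenAI SDK or LangChain chat model at `BASE_URL` instead of api.openai.com.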