Query local data PLUS the main ChatGPT LLM #3511
interzonez started this conversation in General
I can set up LangChain to ingest data or text into a vector database and then ask questions of that data.
How can I then call out to the full ChatGPT model while carrying that local memory state forward into my next query to the core LLM?
Example:
"Based on the document that has just been analysed, write some Python code to implement x, y, z."
OR
"Based on an understanding of the ingested data, describe other cases where there have been similar x, y, z results."
I guess this is about keeping a live memory state and building on top of it?
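One common way to get this behaviour is the retrieval-augmented pattern: run a similarity search against the local vector store, then stuff the retrieved chunks into the prompt you send to the hosted model. Here is a minimal sketch of that flow; the retriever and store are stubbed with plain Python (the function names `retrieve_relevant_chunks` and `build_prompt` are illustrative, not a LangChain API), and in practice the final prompt would be passed to your ChatGPT/LLM call.

```python
# Sketch of "query local data, then ask the main LLM with that context".
# The vector store is stubbed as a list of strings with keyword matching;
# a real setup would use a LangChain retriever over your vector database.

def retrieve_relevant_chunks(query, store):
    """Stand-in for a vector-store similarity search."""
    words = query.lower().split()
    return [text for text in store if any(w in text.lower() for w in words)]

def build_prompt(question, chunks):
    """Prepend the retrieved local context so the hosted LLM can use it."""
    context = "\n".join(f"- {c}" for c in chunks)
    return (
        "Use the following excerpts from a locally ingested document as context:\n"
        f"{context}\n\n"
        f"Question: {question}"
    )

# Hypothetical ingested document, split into chunks.
store = [
    "The experiment measured request latency under load.",
    "Results showed a 40% latency drop after caching was enabled.",
]

chunks = retrieve_relevant_chunks("caching latency", store)
prompt = build_prompt(
    "Based on the document that has just been analysed, "
    "write some Python code to implement the caching setup.",
    chunks,
)
print(prompt)
```

To keep a live, evolving state across turns rather than a one-shot lookup, you would also append each question/answer pair to a conversation history (LangChain's conversational memory classes do this) and include both the history and the freshly retrieved chunks in every new prompt.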