How do I log in to LangSmith? #20754
Replies: 1 comment
-
Please follow the instructions and use Q&A for usage questions. https://api.smith.langchain.com/ is the API endpoint rather than a page for users. Use https://smith.langchain.com/ to log in.
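To expand on this answer: LangSmith's API endpoint is normally configured through environment variables rather than opened in a browser. A minimal sketch of that configuration (the API key value below is a placeholder, and the key itself comes from the settings page after logging in at https://smith.langchain.com/):

```python
import os

# Placeholder values; the real key is generated in the LangSmith UI.
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_ENDPOINT"] = "https://api.smith.langchain.com"  # API base, not a login page
os.environ["LANGCHAIN_API_KEY"] = "ls-..."  # placeholder, not a real key

# LangChain reads these variables at runtime and sends run traces to
# the endpoint in the background; visiting the URL directly returns
# an API response, not a web page.
print(os.environ["LANGCHAIN_ENDPOINT"])
```

Setting these in the shell (or a `.env` file) before starting the app has the same effect.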
-
Why is this not working? I tried to open https://api.smith.langchain.com/, but the server couldn't be found, and I'm not sure why.
I attached the app.py file below. I used Windows PowerShell with the following commands, and everything worked fine except the API endpoint.
Does anyone know how api.smith.langchain.com works?
python -m venv .venv; .\.venv\Scripts\Activate.ps1 (To create and activate a virtual env.)
Get-Content requirements.txt (to see the list below)
chainlit
langchain
langchain_community
PyPDF2
chromadb
groq
langchain-groq
ollama
python -m pip install -r requirements.txt (To install all libraries listed above)
ollama pull nomic-embed-text (To pull nomic-embed-text)
python -m chainlit run app.py (To run the file app.py which is shown below)
At this point I was redirected to the chatbot, where I was able to upload a PDF file.
The chatbot then prompted: "Done. You may ask any questions regarding this pdf file's content".
When I asked questions, a 404 error was shown coming from the api.smith.langchain.com endpoint.
(a) If you know how this works, please show me how to make the api.smith.langchain.com endpoint work, since it currently does not.
(b) In a structured, logical way, can you explain how these libraries are linked / associated with each other? Chainlit vs. LangChain vs. chromadb vs. groq vs. api.langchain vs. ollama?
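Regarding (b), this is my understanding of how the pieces divide the work, sketched as a plain data structure rather than taken from any official documentation. Notably, api.smith.langchain.com (LangSmith) is only an optional tracing service, which is why a misconfigured key for it can surface as a 404 even though the rest of the pipeline is local:

```python
# Rough division of labor in this stack (my understanding, not
# authoritative documentation for any of these projects):
stack = {
    "chainlit": "chat UI / web front end that runs app.py and handles the PDF upload",
    "PyPDF2": "extracts raw text from the uploaded PDF",
    "langchain": "orchestration: text splitter, memory, ConversationalRetrievalChain",
    "chromadb": "vector store holding the embedded PDF chunks for retrieval",
    "ollama": "local model server; here it serves the nomic-embed-text embeddings",
    "groq": "hosted inference API; ChatGroq sends chat prompts to it",
    "LangSmith": "optional tracing/observability service at api.smith.langchain.com",
}

for lib, role in stack.items():
    print(f"{lib}: {role}")
```

In short: chainlit is the front end, PyPDF2 plus langchain plus chromadb plus ollama build and query the local vector index, groq answers the questions, and LangSmith only watches.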
app.py code:
import PyPDF2
from langchain_community.embeddings import OllamaEmbeddings
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain_community.vectorstores import Chroma
from langchain.chains import ConversationalRetrievalChain
from langchain_community.chat_models import ChatOllama
from langchain_groq import ChatGroq
from langchain.memory import ChatMessageHistory, ConversationBufferMemory
import chainlit as cl
# for chainlit, .env is loaded automatically
#from dotenv import load_dotenv
#load_dotenv() #
#groq_api_key = os.environ['GROQ_API_KEY']
llm_local = ChatOllama(model="mistral:instruct")
llm_groq = ChatGroq(
    # groq_api_key=groq_api_key,
    # model_name='llama2-70b-4096'
    model_name='mixtral-8x7b-32768'
)
@cl.on_chat_start
async def on_chat_start():
    ...  # function body truncated in the original post

@cl.on_message
async def main(message: cl.Message):
    ...  # function body truncated in the original post