NAVANEETHELITE/Ollama_CHAT


Q&A WITH LLAMA3

DESCRIPTION

This Q&A application leverages the Llama 3.2 (1B) model, served locally through Ollama, to provide intelligent responses to user queries.

TOOLS AND MODELS

  • LANGCHAIN
  • OllamaLLM
  • STREAMLIT

MODEL INSTALLATION

Download and install Ollama, then pull and start the required model (ollama run downloads the model on first use):

ollama run llama3.2:1b

Ollama must be running locally for this app to work.
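The application itself goes through LangChain, but a quick way to confirm the model is reachable is Ollama's local REST API, which listens on port 11434 by default. A minimal stdlib sketch (the function names here are illustrative, not part of this repository):

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt, model="llama3.2:1b"):
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(prompt):
    """Send a prompt to the locally running Ollama server and return its reply.

    Requires `ollama run llama3.2:1b` (or `ollama serve`) to be active.
    """
    data = json.dumps(build_payload(prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (needs a running Ollama server):
# print(ask("What is the capital of France?"))
```

If this raises a connection error, the server is not up and the Streamlit app will fail for the same reason.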

GENERATING API KEYS

Generate a LangSmith API key to enable interactive tracing of your requests to the LLM via LangChain.

Refer to the sample .env file in the project's root directory.
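The sample .env itself is not reproduced here; LangSmith tracing via LangChain conventionally uses the environment variables below (the key is a placeholder and the project name is illustrative):

```
LANGCHAIN_TRACING_V2=true
LANGCHAIN_API_KEY=<your-langsmith-api-key>
LANGCHAIN_PROJECT=Ollama_CHAT
```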

INSTALLING REQUIREMENTS

pip install -r requirements.txt
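The exact pins live in the repository's requirements.txt; for the stack listed above, the file would typically include at least the following (illustrative, not the repository's actual contents):

```
langchain
langchain-ollama
streamlit
python-dotenv
```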

RUNNING THE APPLICATION

streamlit run app.py

Navigate to the local URL that Streamlit prints (http://localhost:8501 by default) to interact with LLAMA3.

HOME PAGE:

[screenshot: HOME]

LANGSMITH DASHBOARD:

[screenshot: DASHBOARD]
