streamlit-ollama-llm


A Streamlit webapp UI for running a local LLM on Ollama. With just three Python libraries (torch, streamlit, ollama) you can have a local LLM to chat with. I'm running Ollama on Windows (new update as of 03/24) with the DuckDuckGo browser, and it works great as a coding assistant.

For coding, try codellama, dolphin-mixtral, or deepseek-coder. For everyday questions, try mixtral or mistral. There's a good selection available at https://ollama.com/library.
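To give a feel for what the app is doing underneath, here is a minimal sketch of a one-shot chat call through the ollama Python client. This is just an illustration, not the repo's llm_app.py, and it assumes you've already pulled the mistral model:

```python
import ollama

# One-shot question to a locally pulled model.
# Assumes the Ollama server is running and `ollama pull mistral` has been done.
response = ollama.chat(
    model="mistral",
    messages=[{"role": "user", "content": "Reverse a string in Python."}],
)
print(response["message"]["content"])
```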

How to use

  • choose whether you want to use GPU or CPU; `llm_app.py` works in both cases
  • create a folder for the project
  • change directory to this path: `cd /path/to/project/folder`
  • clone the repository: `git clone https://github.com/romilan24/streamlit-ollama-llm.git`
  • create a virtual environment with conda (`conda create -n ollama`) or venv
  • activate the environment (`conda activate ollama`) and you should see this:

(screenshot: terminal with the ollama environment activated)

  • install the requirements: `pip install -r /path/to/requirements.txt`
  • check that Ollama is running (a quick Python check is sketched after the screenshot)
  • download models from the Ollama library (https://ollama.com/library) if you haven't yet: `ollama pull model_name_here` in a cmd terminal

(screenshot: Ollama running, with models pulled)
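If you'd rather confirm from Python that the Ollama server is up and see which models are pulled, a small check like the sketch below works. `ollama.list()` queries the local server; the exact field layout of each entry varies a little between client versions, so this just prints the raw entries:

```python
import ollama

# Query the local Ollama server for the models it has pulled.
# This raises a connection error if the server isn't running.
for model in ollama.list()["models"]:
    print(model)  # entry layout ("name" vs "model" field) varies by client version
```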

  • change directory to the local path where `llm_app.py` is saved
  • in the terminal, enter `streamlit run llm_app.py` and you should see this:

(screenshot: streamlit run output with the local URL)

  • copy/paste the link into your browser and the app will load

  • select the model of your choice (a sketch of how a dropdown like this is built follows the screenshot):

(screenshot: model selection dropdown in the Streamlit UI)
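For reference, a model picker like the one in the screenshot takes only a few lines of Streamlit. This is a hedged sketch of the pattern, not the repo's exact code; the model names are just the ones suggested above, and the real app may populate the list dynamically via ollama.list():

```python
import streamlit as st

st.title("Local LLM chat")

# Offer the models you've pulled; a hardcoded list is the simplest
# version of the pattern (the real app may discover these dynamically).
model = st.selectbox("Choose a model", ["mistral", "codellama", "deepseek-coder"])
st.write(f"Selected: {model}")
```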

  • enter your question in the prompt and hit Enter (the underlying chat call is sketched after the screenshot)

(screenshot: question entered in the prompt box)
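Underneath, a prompt like this becomes a chat call against the selected model. Here is a minimal hedged version of that step, streaming tokens back into the page; it assumes a recent Streamlit (st.chat_input and st.write_stream) and is not the repo's exact implementation:

```python
import ollama
import streamlit as st

model = st.selectbox("Choose a model", ["mistral", "codellama"])

if prompt := st.chat_input("Ask something"):
    st.chat_message("user").write(prompt)
    # Stream the answer token-by-token instead of waiting for the full reply.
    stream = ollama.chat(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        stream=True,
    )
    with st.chat_message("assistant"):
        st.write_stream(chunk["message"]["content"] for chunk in stream)
```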

  • and the code works!

(screenshot: model response, with working code)

  • enjoy your local LLM and let me know if you come up with any additions
