
2.3.3 Satellite: Dify

av edited this page Sep 20, 2024 · 2 revisions

Handle: dify URL: http://localhost:33961/




Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.

Starting

# [Optional] Pull the dify images
# ahead of starting the service
harbor pull dify

# Start the service
harbor up dify

harbor open dify

When started for the first time, you'll be asked to create an admin account.

Dify enables implementing quite advanced workflows with a high degree of specialization. For example:

dify-harbor

In this screenshot, we have two Dify workflows:

Web Llama classifies the User input to detect whether a Web Search is needed or the input can be answered directly. If a search is needed, it launches the "Advanced Web RAG" workflow.

Advanced Web RAG workflow transforms the User input into a query for SearXNG, then uses code to scrape the links from the search results. The links are then passed to the Web Scraper and a summarization model to generate a response.

Then, Web Llama generates a final response based on the Web RAG.

Neat!
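The routing described above can be sketched in plain Python. This is purely an illustration of the control flow — in Dify these are visual workflow nodes, and every function name below is made up, not part of Dify's API:

```python
# Hypothetical sketch of the "Web Llama" routing logic described above.
# In Dify this is built visually; nothing here is Dify's actual API.

def needs_web_search(user_input: str) -> bool:
    """Stand-in for the LLM classifier node: decide whether the input
    needs fresh information from the web."""
    # A real classifier would prompt an LLM; this crude keyword check
    # only mimics the decision for illustration.
    fresh_markers = ("today", "latest", "news", "current")
    return any(marker in user_input.lower() for marker in fresh_markers)

def advanced_web_rag(user_input: str) -> str:
    """Stand-in for the 'Advanced Web RAG' sub-workflow:
    query SearXNG, scrape the result links, summarize."""
    return f"[summary of web results for: {user_input}]"

def web_llama(user_input: str) -> str:
    """Top-level workflow: answer directly or route to Web RAG."""
    if needs_web_search(user_input):
        context = advanced_web_rag(user_input)
        return f"Answer based on {context}"
    return f"Direct answer to: {user_input}"
```

The key design point is the classifier gate: only inputs that actually need fresh data pay the cost of the search-and-scrape sub-workflow.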

Configuration

Unfortunately, there's no direct way to configure Dify via the Harbor CLI, so all configuration has to be done through the Dify UI.

However, Dify provides good integration guides for relevant services.

Use the command below to list all Harbor configuration options specific to Dify:

user@os:~$ h config list | grep DIFY
DIFY_HOST_PORT                 33961
DIFY_DB_HOST_PORT              33962
DIFY_D2O_HOST_PORT             33963
DIFY_VERSION                   0.6.16
DIFY_SANDBOX_VERSION           0.2.1
DIFY_WEAVIATE_VERSION          1.19.0
DIFY_VOLUMES                   ./dify/volumes
DIFY_BOT_TYPE                  Chat
DIFY_OPENAI_WORKFLOW

Updating Dify

Due to the way Dify is set up, its versions are pinned in Harbor. You can use the dify.version config option to set the desired version.

# Set the version to a different one
harbor config set dify.version 0.6.16

# Re-pull the images
harbor pull dify

# Restart the service
harbor restart dify

Ollama

Dify + Ollama

# Get your ollama URL for dify
# Note the "-i" flag - the URL is internal to Docker
harbor url -i ollama

# See which models are available
harbor ollama list

dify-ollama

Make sure to also set up an embedding model, so that Dify's RAG pipeline is better equipped.

SearXNG

Dify can use SearXNG in a similar way to Open WebUI, however you can be much more specific about what the LLM will search for when configuring the workflow.

Dify + SearXNG

# SearXNG should be running to be
# available for Dify
harbor up searxng

# Get the URL for the SearXNG
harbor url -i searxng

# Open the Dify UI
harbor open dify

dify-searxng
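For reference, the query step of such a workflow typically hits SearXNG's JSON API (the format=json parameter, which must be enabled in the SearXNG instance's allowed formats). A minimal sketch of the request the workflow would make — the base URL below is a placeholder, use whatever harbor url -i searxng prints:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def searxng_query_url(base_url: str, query: str) -> str:
    """Build a SearXNG search URL that returns JSON results."""
    params = urlencode({"q": query, "format": "json"})
    return f"{base_url}/search?{params}"

# Example (replace the base URL with the output of `harbor url -i searxng`):
url = searxng_query_url("http://harbor.searxng:8080", "dify llm platform")

# Fetching would look like this (requires the service to be running):
# results = json.load(urlopen(url))
# links = [r["url"] for r in results["results"]]
```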

Integration

Once you've configured one or more interesting workflows, they can be integrated back with the rest of Harbor via an OpenAI-compatible (-ish) API.

Harbor runs a custom dify-openai proxy that translates requests from the OpenAI API to the Dify API.

To use it:

  • Configure the Workflow/Agent/Chatbot in Dify
  • Create an API key for your workflow and copy it
  • Point the Dify OpenAI proxy to the workflow:

# Use the API key from the previous step
harbor config set dify.openai.workflow <workflow-key>

# Remember to restart the stack
harbor restart dify webui

Done!
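Under the hood, the proxy accepts standard OpenAI-style chat completion requests. A minimal sketch of what a client sends — the port is an assumption based on DIFY_D2O_HOST_PORT from the config listing above, and the model name is a placeholder since the workflow is already selected via dify.openai.workflow:

```python
import json
from urllib.request import Request, urlopen

def build_chat_request(base_url: str, message: str) -> Request:
    """Build an OpenAI-style chat completion request for the
    dify-openai proxy. The model field is a placeholder here --
    the target workflow is chosen by the dify.openai.workflow option."""
    payload = {
        "model": "dify",
        "messages": [{"role": "user", "content": message}],
    }
    return Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# DIFY_D2O_HOST_PORT (33963 in the listing above) appears to be the proxy port:
req = build_chat_request("http://localhost:33963", "Hello, Dify!")
# response = json.load(urlopen(req))  # requires the stack to be running
```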

Here's a cool example based on the sample workflows from above:

dify-webui

  • WebUI uses Dify "Web Llama" workflow as a chat backend
  • Dify runs the whole workflow as outlined in the "Starting" section of this guide