Ability to add Multiple Dify.AI Models into OpenWebui #102

Open · ZacharyKehlGEAppliances opened this issue on Jan 17, 2025 · 2 comments
Labels: question (Further information is requested)

@ZacharyKehlGEAppliances
Contributor

Currently, from what I can tell, we can only add one model from Dify.ai into the Open WebUI Connections. It would be great/useful if we could add multiple.

@ZacharyKehlGEAppliances
Contributor Author

I think something like this would work?

Not sure; I can create a PR to test it, or there may be a better idea...

# compose.dify.yml
services:
  dify-weaviate:
    image: semitechnologies/weaviate:${HARBOR_DIFY_WEAVIATE_VERSION}
    container_name: ${HARBOR_CONTAINER_PREFIX}.dify-weaviate
    volumes:
      - ${HARBOR_DIFY_VOLUMES}/weaviate:/var/lib/weaviate
    env_file:
      - ./.env
      - ./dify/override.env
    networks:
      - harbor-network

  # Dynamic OpenAI service that can be scaled
  dify-openai:
    build:
      context: ./dify/openai
      dockerfile: Dockerfile
    container_name: ${HARBOR_CONTAINER_PREFIX}.dify-openai-${WORKFLOW_INDEX:-1}
    ports:
      - "${HARBOR_DIFY_D2O_HOST_PORT:-3000}-${WORKFLOW_INDEX:-1}:3000"
    env_file:
      - ./.env
      - ./dify/override.env
    volumes:
      - ./dify/openai/app.js:/dify2openai/app.js
    networks:
      - harbor-network
    environment:
      - DIFY_API_URL=http://harbor.dify:80
      - BOT_TYPE=${HARBOR_DIFY_BOT_TYPE}
      - WORKFLOW_ID=${WORKFLOW_ID}
    deploy:
      mode: replicated
      replicas: ${HARBOR_DIFY_WORKFLOW_COUNT:-1}
# Open WebUI Connections tab
Name: Your Workflow Name (e.g., "Dify Workflow 1")
API Endpoint: http://localhost:3000-X  (where X is your workflow number)
API Key: Your Dify API key
Context Length: Match your workflow's context length
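For completeness, the new variables referenced above would presumably live in Harbor's .env (a sketch using only names already present in the snippet; values are placeholders):

# .env (sketch; variable names taken from the compose snippet above)
HARBOR_DIFY_WORKFLOW_COUNT=2          # number of dify-openai replicas
HARBOR_DIFY_D2O_HOST_PORT=3000        # base host port for the proxy
WORKFLOW_ID=<your Dify workflow ID>   # workflow to proxy (placeholder)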

@av
Owner

av commented Jan 18, 2025

Hey 👋 thanks for the report!

Indeed, it's currently only possible to configure a single specific workflow to be proxied.

The whole thing is set up this way for a few reasons:

  • Dify doesn't have an OpenAI-compatible API, so I had to write a proxy of my own; none of the existing ones worked with Open WebUI due to missing streaming support and a missing models endpoint
  • Dify's own workflow API only has a single endpoint for all workflows, and it resolves which workflow to run automatically based on the API key

The custom proxy itself actually isn't bound to serve only a single workflow. The single-workflow limitation comes mostly from wanting to pre-connect things out of the box, as Harbor can't (yet) provision multiple instances of specific configs dynamically.
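For illustration, that means the same proxy instance could in principle serve different workflows purely based on the API key the client sends, e.g. (a sketch; the chat completions path assumes the usual OpenAI layout and the model name is a placeholder):

# Same proxy endpoint, different Dify API keys -> different workflows
curl http://dify-openai:3000/v1/chat/completions \
  -H "Authorization: Bearer <Dify API key for workflow A>" \
  -H "Content-Type: application/json" \
  -d '{"model": "dify", "messages": [{"role": "user", "content": "Hello"}]}'

curl http://dify-openai:3000/v1/chat/completions \
  -H "Authorization: Bearer <Dify API key for workflow B>" \
  -H "Content-Type: application/json" \
  -d '{"model": "dify", "messages": [{"role": "user", "content": "Hello"}]}'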

So, for Open WebUI, it means that we have a static set of files:

# Adds a volume mount when webui and dify are run together
compose.x.webui.dify.yml

# Connects Open WebUI to Dify
config.dify.json

So, you have a few options:

Option 1: Mount multiple configs in compose.x.webui.dify.yml

services:
  webui:
    volumes:
      # This one is already there
      - ./open-webui/configs/config.dify.json:/app/configs/config.dify.json
      # This one is custom
      - ./open-webui/configs/config.dify.custom.json:/app/configs/config.dify.custom.json

You can copy the contents from the original one and just hardcode the workflow key.
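For example, config.dify.custom.json could look roughly like this (a sketch assuming the same shape as config.dify.json shown below, with the workflow key hardcoded):

{
  "openai": {
    "api_base_urls": [
      "http://dify-openai:3000/v1"
    ],
    "api_keys": [
      "< custom workflow key here >"
    ],
    "enabled": true
  }
}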

Option 2: Add multiple endpoints with different API keys to config.dify.json

A single Open WebUI config can have multiple OpenAI-compatible endpoints defined, so the default one can be expanded as follows:

{
  "openai": {
    "api_base_urls": [
      "http://dify-openai:3000/v1",
      "http://dify-openai:3000/v1"
    ],
    "api_keys": [
      "${HARBOR_DIFY_OPENAI_WORKFLOW}",
      "< custom workflow ID here >"
    ],
    "enabled": true
  }
}

In other words, point multiple entries at the same proxy endpoint, each with its own API key.

Option 3: Use the existing setup for OpenAI-compatible endpoints (probably the most "Harbor" way to achieve this)

We already support adding an arbitrary number of OpenAI-compatible endpoints to Open WebUI via the harbor openai CLI commands:

harbor openai urls add http://dify-openai:3000/v1
harbor openai keys add <custom dify workflow ID>

It'll then be added to the Open WebUI config via open-webui/configs/config.override.json.
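For reference, after running those commands the override config would presumably contain something along these lines (a sketch assuming the same openai section shape as above):

{
  "openai": {
    "api_base_urls": [
      "http://dify-openai:3000/v1"
    ],
    "api_keys": [
      "<custom dify workflow ID>"
    ],
    "enabled": true
  }
}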

av added the question (Further information is requested) label on Jan 18, 2025