- Third-party API proxies: Use a unified API Base to call multiple models
- Local models: Models deployed locally via Ollama, vLLM, LocalAI, etc.
- Private deployments: Self-hosted model services within your organization
Unlike the `openai` provider, switching models under the Custom provider will not auto-switch the provider type; your custom API address is always preserved.

## Configuration
### Third-party API Proxy
| Parameter | Description |
|---|---|
| `bot_type` | Must be set to `custom` |
| `model` | Model name; any model supported by your proxy service |
| `custom_api_key` | API key provided by your proxy service |
| `custom_api_base` | API base URL; must be OpenAI-compatible |
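Putting the parameters together, a minimal configuration sketch might look like the following. This assumes a YAML-style config file; the model name, key, and proxy URL are placeholders, not real values:

```yaml
bot_type: custom                                # required: selects the Custom provider
model: gpt-4o                                   # any model your proxy service exposes
custom_api_key: sk-your-proxy-key               # key issued by the proxy service
custom_api_base: https://proxy.example.com/v1   # OpenAI-compatible base URL
```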
### Local Models
Local models typically don't require an API key; just set the API base.

## Switching Models
Under the Custom provider, switching models only changes `model`, without affecting `bot_type` or the API address:
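For example, with a local Ollama endpoint (the model names and the default port `11434` below are illustrative, and no `custom_api_key` is needed for a local server):

```yaml
# Before: a local model served by Ollama
bot_type: custom
model: llama3
custom_api_base: http://localhost:11434/v1

# After switching models: only `model` changes;
# bot_type and custom_api_base are preserved
bot_type: custom
model: qwen2.5
custom_api_base: http://localhost:11434/v1
```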
