Open WebUI APIBox Setup: Connect GPT, Claude, Gemini, and DeepSeek with One Base URL
A practical Open WebUI setup guide for APIBox. Configure an OpenAI-compatible connection, add model IDs, test chat, and fix common base URL or model list issues.
Open WebUI can connect to APIBox as an OpenAI-compatible provider. The key values are simple: API URL https://api.apibox.cc/v1, your APIBox API key, and the model IDs you want to expose in Open WebUI. This gives a self-hosted chat interface one entry point for GPT, Claude, Gemini, DeepSeek, and other models supported by APIBox.
1. The quick configuration table
| Open WebUI field | APIBox value |
|---|---|
| Provider type | OpenAI-compatible or OpenAI connection |
| API URL / Base URL | https://api.apibox.cc/v1 |
| API Key | Your APIBox key |
| Model IDs | Add the APIBox model names you want to use |
| First test | Send a short message with one model |
This setup is useful when you want to:
- connect Open WebUI to an OpenAI-compatible API
- use one API base URL for several model families
- add Claude, GPT, Gemini, or DeepSeek to a private chat interface
- fix model list or model ID issues
- give a team a controlled web UI for multiple models
2. Step-by-step setup in the UI
- Open your Open WebUI instance in a browser.
- Go to Admin Settings.
- Open Connections.
- Choose the OpenAI or OpenAI-compatible connection area.
- Add a new connection.
- Set API URL to https://api.apibox.cc/v1.
- Paste your APIBox API key.
- If model auto-detection does not show the models you need, add model IDs manually.
- Save the connection.
- Pick a model in the chat interface and send a short test message.
Recommended first test:
Reply briefly: APIBox is connected.
If that works, test streaming and longer prompts next.
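If you want to confirm the endpoint outside the UI first, a minimal sketch of the same test request is shown below. It only builds the request and prints the target URL; the model ID and API key are placeholders, and the commented-out lines show how you would actually send it.

```python
import json
import urllib.request

# Placeholders: substitute a real APIBox key and a model ID that is
# enabled in your APIBox account before sending.
BASE_URL = "https://api.apibox.cc/v1"
API_KEY = "your_apibox_key"
MODEL_ID = "MODEL_ID"

payload = {
    "model": MODEL_ID,
    "messages": [
        {"role": "user", "content": "Reply briefly: APIBox is connected."}
    ],
}
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
# with urllib.request.urlopen(req) as resp:  # uncomment to actually send
#     print(json.load(resp)["choices"][0]["message"]["content"])
print(req.full_url)
```

If this direct call works but Open WebUI does not, the problem is in the connection settings, not the provider.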
3. Optional Docker environment variables
If you prefer environment variables, Open WebUI documents OpenAI-compatible settings such as OPENAI_API_BASE_URL, OPENAI_API_BASE_URLS, OPENAI_API_KEY, and OPENAI_API_KEYS. Exact behavior can vary by Open WebUI version, so the admin UI is usually the clearest first setup.
A simple single-provider configuration often looks like this:
OPENAI_API_BASE_URL=https://api.apibox.cc/v1
OPENAI_API_KEY=your_apibox_key
For multi-provider setups, follow the Open WebUI version you are running and use its documented plural variables if needed.
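As a sketch, the two variables above can be passed to the Open WebUI container at startup. The image name, tag, and port mapping below follow the Open WebUI docs but may differ in your deployment, and the key is a placeholder.

```shell
# Sketch: run Open WebUI with the APIBox connection preconfigured.
# Adjust image tag, port, and volume name to match your deployment;
# replace your_apibox_key with a real key.
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL=https://api.apibox.cc/v1 \
  -e OPENAI_API_KEY=your_apibox_key \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```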
4. Which model IDs should you add?
Start with a small allowlist. Too many models make the picker harder to use and make evaluation less disciplined.
| Use case | Example model direction |
|---|---|
| General chat | GPT or Claude general model |
| Coding help | Claude or GPT coding model |
| Cost-sensitive chat | DeepSeek or lower-cost GPT/Gemini model |
| Long context reading | Model with strong long-context support |
| Reasoning tasks | Reasoning-capable model |
Use the APIBox pricing page to confirm currently available model names.
5. Common errors and fixes
Connection test fails
Open WebUI may verify a provider by calling /models. If model discovery fails but chat completions are supported, manually add model IDs in the filter or allowlist and test a chat request.
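To check discovery directly, you can call the /models endpoint yourself. A small sketch, again building the request without sending it; the key is a placeholder, and the commented lines show how you would list the returned model IDs.

```python
import urllib.request

# Placeholder key: replace before sending.
BASE_URL = "https://api.apibox.cc/v1"
API_KEY = "your_apibox_key"

# GET /models is the discovery call an OpenAI-compatible client makes.
req = urllib.request.Request(
    f"{BASE_URL}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
# with urllib.request.urlopen(req) as resp:  # uncomment to run
#     import json
#     print([m["id"] for m in json.load(resp)["data"]])
print(req.full_url)
```

If this call errors while chat completions succeed, manual model IDs in Open WebUI are the right workaround.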
401 Unauthorized
Check:
- the API key was copied correctly
- the key belongs to APIBox, not another provider
- there are no spaces before or after the key
- the connection was saved after editing
404 or model not found
Check:
- model ID spelling
- whether the model is enabled in your APIBox account
- whether the model ID was added manually to the Open WebUI allowlist
Base URL error
Use:
https://api.apibox.cc/v1
Do not omit /v1 for OpenAI-compatible chat completions.
Streaming does not feel smooth
Check:
- whether the selected model streams output
- whether your reverse proxy buffers responses
- whether your hosting layer has a low timeout
- whether the prompt asks for very long output
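To rule out the UI and proxy layers, you can stream from the API directly. This sketch sets the OpenAI-style "stream" flag and only builds the request; the key and model are placeholders, and the commented lines show one way to read the server-sent "data:" chunks.

```python
import json
import urllib.request

# Placeholders: replace the key and model ID before sending.
BASE_URL = "https://api.apibox.cc/v1"
payload = {
    "model": "MODEL_ID",
    "stream": True,  # ask the server to stream tokens as SSE "data:" lines
    "messages": [{"role": "user", "content": "Count to five."}],
}
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer your_apibox_key",
        "Content-Type": "application/json",
    },
)
# with urllib.request.urlopen(req) as resp:  # uncomment to run
#     for raw in resp:
#         line = raw.decode("utf-8").strip()
#         if line.startswith("data: ") and line != "data: [DONE]":
#             chunk = json.loads(line[len("data: "):])
#             print(chunk["choices"][0]["delta"].get("content", ""), end="")
print(payload["stream"])
```

If tokens arrive smoothly here but not in the browser, suspect a buffering reverse proxy rather than the model.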
6. When Open WebUI plus APIBox is a good fit
This setup is useful when:
- your team wants a private web chat UI
- you need more than one model family
- you want a simple OpenAI-compatible connection instead of many provider accounts
- you want to compare GPT, Claude, Gemini, and DeepSeek in one interface
- non-developers need a controlled model picker
It is less useful if:
- you only need a one-off API test
- you require a provider-native feature that Open WebUI does not expose
- your company needs a fully custom product UI
7. Good next steps after the first chat works
After the first response succeeds:
- Limit the visible model list.
- Add a cheaper default model for routine chat.
- Reserve stronger models for coding, reasoning, or long-context work.
- Document which model to use for each workflow.
- Track spending in the APIBox console.
- Test model behavior before letting a team rely on it.
8. Recommended setup
To connect Open WebUI to APIBox, add an OpenAI-compatible connection, set the API URL to https://api.apibox.cc/v1, paste your APIBox API key, and add the model IDs you want to use. If Open WebUI cannot auto-load the model list, manually add model names in the Model IDs filter and test chat completions directly.
Register for APIBox to get an API key for your Open WebUI setup.