Open WebUI APIBox Setup: Connect GPT, Claude, Gemini, and DeepSeek with One Base URL

A practical Open WebUI setup guide for APIBox. Configure an OpenAI-compatible connection, add model IDs, test chat, and fix common base URL or model list issues.

Open WebUI can connect to APIBox as an OpenAI-compatible provider. The key values are simple: API URL https://api.apibox.cc/v1, your APIBox API key, and the model IDs you want to expose in Open WebUI. This gives a self-hosted chat interface one entry point for GPT, Claude, Gemini, DeepSeek, and other models supported by APIBox.

1. The quick configuration table

Open WebUI field        APIBox value
Provider type           OpenAI-compatible or OpenAI connection
API URL / Base URL      https://api.apibox.cc/v1
API Key                 Your APIBox key
Model IDs               Add the APIBox model names you want to use
First test              Send a short message with one model
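The values in the table can also be sanity-checked outside Open WebUI. The sketch below builds a standard OpenAI-compatible chat completions request with only the Python standard library; the model ID gpt-4o-mini is a placeholder, so substitute a model that is enabled on your APIBox account:

```python
import json
import urllib.request

BASE_URL = "https://api.apibox.cc/v1"  # same value as the Open WebUI API URL field

def build_chat_request(api_key: str, model: str, message: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completions request against APIBox."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": message}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def send(req: urllib.request.Request) -> str:
    """Send the request and return the assistant's reply text."""
    with urllib.request.urlopen(req, timeout=30) as resp:
        data = json.loads(resp.read())
    return data["choices"][0]["message"]["content"]

# Example (requires a valid key, so not run here):
# print(send(build_chat_request("your_apibox_key", "gpt-4o-mini",
#                               "Reply briefly: APIBox is connected.")))
```

If this script returns a reply but Open WebUI does not, the problem is in the Open WebUI connection settings rather than the key or the endpoint.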

This setup is useful when you want to:

  • connect Open WebUI to an OpenAI-compatible API
  • use one API base URL for several model families
  • add Claude, GPT, Gemini, or DeepSeek to a private chat interface
  • fix model list or model ID issues
  • give a team a controlled web UI for multiple models

2. Step-by-step setup in the UI

  1. Open your Open WebUI instance in a browser.
  2. Go to Admin Settings.
  3. Open Connections.
  4. Choose the OpenAI or OpenAI-compatible connection area.
  5. Add a new connection.
  6. Set API URL to https://api.apibox.cc/v1.
  7. Paste your APIBox API key.
  8. If model auto-detection does not show the models you need, add model IDs manually.
  9. Save the connection.
  10. Pick a model in the chat interface and send a short test message.

Recommended first test:

Reply briefly: APIBox is connected.

If that works, test streaming and longer prompts next.
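Streaming can be checked against the same endpoint by setting "stream": true and reading the server-sent events line by line. A minimal sketch using only the standard library; the model ID is again a placeholder:

```python
import json
import urllib.request

def build_streaming_request(api_key, model, prompt,
                            base_url="https://api.apibox.cc/v1"):
    """Chat completions request with streaming enabled (SSE response)."""
    body = json.dumps({
        "model": model,
        "stream": True,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )

def print_stream(req):
    """Print tokens as they arrive; each SSE line looks like 'data: {...}'."""
    with urllib.request.urlopen(req, timeout=60) as resp:
        for raw in resp:
            line = raw.decode("utf-8").strip()
            if not line.startswith("data: ") or line == "data: [DONE]":
                continue
            delta = json.loads(line[len("data: "):])["choices"][0]["delta"]
            print(delta.get("content", ""), end="", flush=True)

# Example (requires a valid key, so not run here):
# print_stream(build_streaming_request("your_apibox_key", "gpt-4o-mini",
#                                      "Count from 1 to 5."))
```

If tokens arrive in one burst here but Open WebUI still feels choppy, look at the proxy and hosting checks in section 5.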

3. Optional Docker environment variables

If you prefer environment variables, Open WebUI documents OpenAI-compatible settings such as OPENAI_API_BASE_URL, OPENAI_API_BASE_URLS, OPENAI_API_KEY, and OPENAI_API_KEYS. Exact behavior can vary by Open WebUI version, so the admin UI is usually the clearest first setup.

A simple single-provider configuration often looks like this:

OPENAI_API_BASE_URL=https://api.apibox.cc/v1
OPENAI_API_KEY=your_apibox_key

For multi-provider setups, follow the Open WebUI version you are running and use its documented plural variables if needed.
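For a single-provider setup, the two variables above map onto a compose file like the sketch below. This assumes the official Open WebUI image and its default port mapping; the variable names may differ across Open WebUI versions, so check the documentation for the release you run:

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OPENAI_API_BASE_URL=https://api.apibox.cc/v1
      - OPENAI_API_KEY=your_apibox_key
    volumes:
      - open-webui:/app/backend/data

volumes:
  open-webui:
```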

4. Which model IDs should you add?

Start with a small allowlist. Too many models make the picker harder to use and make evaluation less disciplined.

Use case                Example model direction
General chat            GPT or Claude general model
Coding help             Claude or GPT coding model
Cost-sensitive chat     DeepSeek or lower-cost GPT/Gemini model
Long context reading    Model with strong long-context support
Reasoning tasks         Reasoning-capable model

Use the APIBox pricing page to confirm currently available model names.

5. Common errors and fixes

Connection test fails

Open WebUI may verify a provider by calling /models. If model discovery fails but chat completions are supported, manually add model IDs in the filter or allowlist and test a chat request.
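You can run the same discovery call yourself to see whether the problem is the endpoint or the UI. A small sketch against the standard /models route, using only the standard library:

```python
import json
import urllib.request

def build_models_request(api_key, base_url="https://api.apibox.cc/v1"):
    """GET /models, the endpoint Open WebUI typically calls for model discovery."""
    return urllib.request.Request(
        f"{base_url}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

def list_model_ids(req):
    """Return the model IDs from an OpenAI-style models response."""
    with urllib.request.urlopen(req, timeout=30) as resp:
        data = json.loads(resp.read())
    return [item.get("id") for item in data.get("data", [])]

# Example (requires a valid key, so not run here):
# print(list_model_ids(build_models_request("your_apibox_key")))
```

If this call lists models but Open WebUI shows none, the fix is usually on the Open WebUI side: re-save the connection or add the IDs manually.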

401 Unauthorized

Check:

  • the API key was copied correctly
  • the key belongs to APIBox, not another provider
  • there are no spaces before or after the key
  • the connection was saved after editing

404 or model not found

Check:

  • model ID spelling
  • whether the model is enabled in your APIBox account
  • whether the model ID was added manually to the Open WebUI allowlist

Base URL error

Use:

https://api.apibox.cc/v1

Do not omit /v1 for OpenAI-compatible chat completions.

Streaming does not feel smooth

Check:

  • whether the selected model streams output
  • whether your reverse proxy buffers responses
  • whether your hosting layer has a low timeout
  • whether the prompt asks for very long output
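If nginx sits in front of Open WebUI, response buffering is a common cause of output arriving in large chunks. A configuration sketch, assuming nginx proxies to Open WebUI on port 8080; adjust the address and location to your setup:

```nginx
location / {
    proxy_pass http://127.0.0.1:8080;
    proxy_http_version 1.1;
    # Open WebUI uses WebSockets; pass the upgrade headers through
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    # Deliver tokens as they arrive instead of buffering the full response
    proxy_buffering off;
    # Allow long-running completions to finish
    proxy_read_timeout 300s;
}
```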

6. When Open WebUI plus APIBox is a good fit

This setup is useful when:

  • your team wants a private web chat UI
  • you need more than one model family
  • you want a simple OpenAI-compatible connection instead of many provider accounts
  • you want to compare GPT, Claude, Gemini, and DeepSeek in one interface
  • non-developers need a controlled model picker

It is less useful if:

  • you only need a one-off API test
  • you require a provider-native feature that Open WebUI does not expose
  • your company needs a fully custom product UI

7. Good next steps after the first chat works

After the first response succeeds:

  1. Limit the visible model list.
  2. Add a cheaper default model for routine chat.
  3. Reserve stronger models for coding, reasoning, or long-context work.
  4. Document which model to use for each workflow.
  5. Track spending in the APIBox console.
  6. Test model behavior before letting a team rely on it.

To connect Open WebUI to APIBox, add an OpenAI-compatible connection, set the API URL to https://api.apibox.cc/v1, paste your APIBox API key, and add the model IDs you want to use. If Open WebUI cannot auto-load the model list, manually add model names in the Model IDs filter and test chat completions directly.

Register for APIBox to get an API key for your Open WebUI setup.

Try it now: sign up and start using 30+ models with one API key.

Sign up free →