Custom Provider

Connect any OpenAI-compatible API as a custom provider.

Setup

  1. In Pawz → Settings → Models → Add Provider → Custom
  2. Enter:
    • Base URL — the API endpoint (must be OpenAI-compatible)
    • API key — if required
    • Default model — the model name to use

Configuration

Setting        Default
Base URL       (none — you provide)
API key        Optional
Default model  (none — you provide)

Compatible services

Any service that implements the OpenAI chat completions API:
  • LM Studio — http://localhost:1234/v1
  • text-generation-webui — http://localhost:5000/v1
  • vLLM — http://localhost:8000/v1
  • LocalAI — http://localhost:8080/v1
  • Together AI — https://api.together.xyz/v1
  • Fireworks AI — https://api.fireworks.ai/inference/v1
  • Perplexity — https://api.perplexity.ai
  • Groq — https://api.groq.com/openai/v1
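Each base URL above points at the API root; the client appends the route (for example, /chat/completions) when it makes a request. A minimal sketch of that joining logic, assuming plain string handling rather than any particular client library:

```python
def chat_completions_url(base_url: str) -> str:
    # Join the provider's base URL with the chat completions route,
    # tolerating a trailing slash on the base URL.
    return base_url.rstrip("/") + "/chat/completions"

print(chat_completions_url("http://localhost:1234/v1"))
# http://localhost:1234/v1/chat/completions
```

This is why some base URLs include a /v1 suffix and others (such as Perplexity's) do not: the suffix is simply part of whatever prefix the service expects before the route.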

Requirements

The API must support:
  • POST /chat/completions endpoint
  • Streaming via stream: true
  • Tool/function calling (for agent tools to work)
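The three requirements above show up together in a single request body. The sketch below builds one in the OpenAI chat completions wire format; the model name and the get_weather tool are illustrative placeholders, not anything Pawz ships:

```python
import json

# Minimal chat completions request body exercising all three
# requirements: the /chat/completions route, streaming, and tool calling.
payload = {
    "model": "my-local-model",           # illustrative model name
    "messages": [
        {"role": "user", "content": "What is the weather in Paris?"}
    ],
    "stream": True,                      # requirement: streaming
    "tools": [                           # requirement: tool/function calling
        {
            "type": "function",
            "function": {
                "name": "get_weather",   # hypothetical tool definition
                "description": "Look up current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

# This JSON is what gets POSTed to <base URL>/chat/completions.
body = json.dumps(payload)
```

A server that accepts this body and streams back deltas (including tool-call deltas) satisfies all three requirements.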

Tips

  • Test with a simple chat message first to confirm connectivity
  • If tools don’t work, the API may not support function calling — the agent will still work for conversation
  • You can add multiple custom providers pointing to different services
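The first tip can be scripted: send one short chat message and check that a well-formed reply comes back. A standard-library sketch, where the base URL, API key, and model name are placeholders you would swap for your own values:

```python
import json
import urllib.request

def build_chat_request(base_url, api_key, model, text):
    """Prepare (but don't send) a minimal chat completions request."""
    headers = {"Content-Type": "application/json"}
    if api_key:  # API key is optional for many local servers
        headers["Authorization"] = f"Bearer {api_key}"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": text}],
    }).encode()
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=body, headers=headers, method="POST",
    )

if __name__ == "__main__":
    # Placeholder values: point these at your own service.
    req = build_chat_request("http://localhost:1234/v1", None,
                             "my-local-model", "Hello")
    with urllib.request.urlopen(req, timeout=10) as resp:
        reply = json.loads(resp.read())
        print(reply["choices"][0]["message"]["content"])
```

If this round-trips, the provider is wired up correctly; tool-calling problems can then be diagnosed separately.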