Connect Your First Provider

Pawz needs at least one AI provider to power your agents. You can use a local model (Ollama — free) or a cloud API.

Option A: Ollama (local, free)

The easiest way to get started — runs entirely on your machine.
  1. Install Ollama: ollama.com
  2. Pull a model:
    ollama pull llama3.1
    
  3. Launch Pawz — it auto-detects Ollama on localhost:11434
That’s it. Pawz will also auto-pull the nomic-embed-text embedding model for the memory system.
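If auto-detection doesn't seem to work, you can confirm Ollama is reachable from a terminal. A minimal check, assuming a default Ollama install on port 11434:

```shell
# Pull the chat model, plus the embedding model Pawz uses for memory
ollama pull llama3.1
ollama pull nomic-embed-text

# Confirm the Ollama API is up and lists the pulled models
curl http://localhost:11434/api/tags
```

If the `curl` call returns JSON listing your models, Pawz should detect Ollama on launch.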

Option B: Cloud provider

OpenAI

  1. Go to platform.openai.com/api-keys and create an API key
  2. In Pawz, go to Settings → Models
  3. Click Add Provider
  4. Select OpenAI, paste your API key
  5. Choose a default model (GPT-4o recommended)
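Before pasting the key into Pawz, you can sanity-check it from a terminal. A quick check against OpenAI's models endpoint, assuming the key is exported as `OPENAI_API_KEY`:

```shell
# Lists available models if the key is valid; returns a 401 error otherwise
curl https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```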

Anthropic

  1. Go to console.anthropic.com/settings/keys and create an API key
  2. In Pawz, go to Settings → Models
  3. Click Add Provider
  4. Select Anthropic, paste your API key
  5. Choose a default model (Claude Sonnet 4 recommended)
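As with other providers, you can verify the key from a terminal before adding it. A quick check against Anthropic's models endpoint, assuming the key is exported as `ANTHROPIC_API_KEY`:

```shell
# Lists available models if the key is valid; returns a 401 error otherwise
curl https://api.anthropic.com/v1/models \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01"
```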

Google Gemini

  1. Go to aistudio.google.com/apikey and create an API key
  2. In Pawz, go to Settings → Models
  3. Click Add Provider
  4. Select Google, paste your API key
  5. Choose a default model (Gemini 2.5 Pro recommended)
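You can likewise sanity-check a Gemini key from a terminal. A quick check against the Generative Language API's models endpoint, assuming the key is exported as `GEMINI_API_KEY`:

```shell
# Lists available models if the key is valid; returns an error otherwise
curl "https://generativelanguage.googleapis.com/v1beta/models?key=$GEMINI_API_KEY"
```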

Verify

After adding a provider, go to any agent chat and send a message. If you get a response, your provider is working. The model name with a checkmark (✓) in the chat header confirms which model is actually being used.

Multiple providers

You can add as many providers as you want. Each agent can use a different model — just change it in the agent’s settings.

Next steps

See the full Providers guide for all 10 supported providers, model routing, and advanced configuration.