API

OpenAI-Compatible API Through the Architectural Passage

Zaguán provides a fully OpenAI-compatible API at https://api.zaguanai.com/v1. Use any OpenAI SDK or library: simply change the base URL and API key. Supply a namespaced model identifier (e.g., openai/gpt-4o or anthropic/claude-sonnet-4-5) and we handle the translation and routing to the correct provider.

Request essentials

  • Base URL: https://api.zaguanai.com/v1
  • Headers: Send Authorization: Bearer <API_KEY> and Content-Type: application/json.
  • Model parameter: Use namespaced identifiers such as openai/gpt-4o-mini or anthropic/claude-sonnet-4-5-20250929.

Quick start samples

Each example below uses the official OpenAI client for the language shown (plain HTTP for cURL). Set your Zaguán API key in the environment, point the client at the Zaguán base URL, and drop the snippet into your project.

cURL

Official package: none (plain HTTP)

curl https://api.zaguanai.com/v1/chat/completions \
  -H "Authorization: Bearer $ZAGUAN_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-4o-mini",
    "messages": [
      {"role": "user", "content": "Say hello from Zaguán"}
    ]
  }'

Python

Official package: openai

import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["ZAGUAN_API_KEY"],
    base_url="https://api.zaguanai.com/v1",
)

response = client.chat.completions.create(
    model="openai/gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello from Zaguán"}],
)

print(response.choices[0].message.content)

Go

Official package: github.com/openai/openai-go

package main

import (
    "context"
    "fmt"
    "log"
    "os"

    "github.com/openai/openai-go"
    "github.com/openai/openai-go/option"
)

func main() {
    // Configure the client with the Zaguán API key and base URL.
    client := openai.NewClient(
        option.WithAPIKey(os.Getenv("ZAGUAN_API_KEY")),
        option.WithBaseURL("https://api.zaguanai.com/v1"),
    )

    resp, err := client.Chat.Completions.New(
        context.Background(),
        openai.ChatCompletionNewParams{
            Model: "openai/gpt-4o-mini",
            Messages: []openai.ChatCompletionMessageParamUnion{
                openai.UserMessage("Say hello from Zaguán"),
            },
        },
    )
    if err != nil {
        log.Fatal(err)
    }

    fmt.Println(resp.Choices[0].Message.Content)
}

TypeScript

Official package: openai

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.ZAGUAN_API_KEY,
  baseURL: "https://api.zaguanai.com/v1",
});

const response = await client.chat.completions.create({
  model: "openai/gpt-4o-mini",
  messages: [{ role: "user", content: "Say hello from Zaguán" }],
});

console.log(response.choices[0].message.content);

Extra Body Translation Layer

Our key differentiator: access provider-specific features through the OpenAI-compatible API by passing them in the extra_body parameter. Zaguán translates these fields into each provider's native request format.

Example: Anthropic Prompt Caching (up to 90% savings on cached input tokens)

response = client.chat.completions.create(
    model="anthropic/claude-3-5-sonnet-20241022",
    messages=[
        {"role": "user", "content": "Long document..."},
        {"role": "user", "content": "Summarize this"}
    ],
    extra_body={
        "system_cache_control": {"type": "ephemeral"}
    }
)
# First call: full price (writes the cache)
# Calls within the next 5 minutes read the cached input tokens at roughly 90% off

Example: Google Gemini Thinking

response = client.chat.completions.create(
    model="google/gemini-2.5-pro",
    messages=[
        {"role": "user", "content": "Explain quantum computing"}
    ],
    extra_body={
        "thinking_budget": 2048,
        "include_thoughts": True
    }
)
# Access the thinking via response.metadata

Example: xAI Stateful Conversations

response = client.chat.completions.create(
    model="xai/grok-beta",
    messages=[{"role": "user", "content": "What's the capital of France?"}],
    extra_body={
        "use_responses_api": True,
        "store": True
    }
)
# Context maintained server-side for 30 days

Learn more: See the Advanced Features guide for complete documentation on provider-specific capabilities.

Alternative API endpoint

If you experience connection issues with the standard API endpoint, you can use our alternative direct-connection URL:

https://api-eu-fi-01.zaguanai.com/v1

The standard URL (https://api.zaguanai.com/v1) is proxied through Cloudflare for enhanced security and performance. The alternative URL provides a direct connection to our servers, which can help resolve connectivity issues in certain network environments.
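
If you want to fall back automatically, the sketch below (Python, using the official openai package) retries once against the direct-connection URL when the proxied endpoint cannot be reached. The make_client helper and the retry-once policy are illustrative choices, not part of the SDK.

import os

from openai import OpenAI, APIConnectionError

PRIMARY_URL = "https://api.zaguanai.com/v1"
FALLBACK_URL = "https://api-eu-fi-01.zaguanai.com/v1"

def make_client(base_url: str) -> OpenAI:
    return OpenAI(api_key=os.environ["ZAGUAN_API_KEY"], base_url=base_url)

def say_hello(client: OpenAI) -> str:
    response = client.chat.completions.create(
        model="openai/gpt-4o-mini",
        messages=[{"role": "user", "content": "Say hello from Zaguán"}],
    )
    return response.choices[0].message.content

try:
    print(say_hello(make_client(PRIMARY_URL)))
except APIConnectionError:
    # Could not reach the Cloudflare-proxied endpoint; retry once directly.
    print(say_hello(make_client(FALLBACK_URL)))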

Error handling

Zaguán surfaces upstream provider errors without mutating the payload. Network and provider timeouts are propagated to your client so you can implement the retry or fallback strategy that matches your product. For idempotent operations, send a zaguan-request-id header to deduplicate retries on our side.
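
A minimal retry sketch (Python, official openai package) is shown below. The zaguan-request-id header is the one described above; the three-attempt loop and exponential backoff are illustrative choices, not requirements.

import os
import time
import uuid

from openai import OpenAI, APIConnectionError, APIStatusError

client = OpenAI(
    api_key=os.environ["ZAGUAN_API_KEY"],
    base_url="https://api.zaguanai.com/v1",
)

# One ID per logical request; reusing it across retries lets Zaguán deduplicate.
request_id = str(uuid.uuid4())

for attempt in range(3):
    try:
        response = client.chat.completions.create(
            model="openai/gpt-4o-mini",
            messages=[{"role": "user", "content": "Say hello from Zaguán"}],
            extra_headers={"zaguan-request-id": request_id},
        )
        print(response.choices[0].message.content)
        break
    except APIConnectionError:
        # Network error or timeout propagated by Zaguán: back off and retry.
        time.sleep(2 ** attempt)
    except APIStatusError as exc:
        # Upstream provider error passed through unchanged; inspect and decide.
        print(exc.status_code, exc.message)
        raise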