Supported Providers

Proxle supports five major LLM providers: OpenAI, Anthropic, Cohere, Google Gemini, and Azure OpenAI. Each provider can be used through its SDK wrapper or by calling the proxy directly.

OpenAI

SDK wrapper: OpenAI / AsyncOpenAI

Supported endpoints:

  • /chat/completions - Chat completions (GPT-4o, GPT-4, GPT-3.5, o1, o3)
  • /embeddings - Text embeddings

Setup:

from proxle import OpenAI

client = OpenAI(
    api_key="sk-...",
    proxy_key="pk_live_..."
)

Proxy URL: POST /v1/proxy/openai/chat/completions
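The request body is the standard OpenAI chat-completions shape; the same JSON works whether you go through the SDK wrapper or POST to the proxy URL directly. A minimal sketch using only the standard library:

```python
import json

# Standard chat-completions request body (OpenAI format).
body = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}],
}

payload = json.dumps(body)
print(payload)
```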

Anthropic

SDK wrapper: Anthropic / AsyncAnthropic

Supported endpoints:

  • /messages - Messages API (Claude Opus, Sonnet, Haiku)

Setup:

from proxle import Anthropic

client = Anthropic(
    api_key="sk-ant-...",
    proxy_key="pk_live_..."
)

Proxy URL: POST /v1/proxy/anthropic/messages
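The Messages API body differs from the OpenAI shape in one important way: max_tokens is a required field. A minimal sketch (the model id here is illustrative):

```python
import json

# Messages API request body (Anthropic format). Note that max_tokens is
# required by the Messages API, unlike OpenAI's chat completions.
body = {
    "model": "claude-3-5-sonnet-latest",  # illustrative model id
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello!"}],
}

print(json.dumps(body))
```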

Cohere

SDK wrapper: Cohere (lightweight, no official SDK required)

Supported endpoints:

  • /chat - Chat completions (Command R+, Command R)
  • /generate - Text generation
  • /embed - Text embeddings

Setup:

from proxle import Cohere

client = Cohere(
    api_key="...",
    proxy_key="pk_live_..."
)

Proxy URL: POST /v1/proxy/cohere/chat

Google Gemini

SDK wrapper: Gemini (lightweight, no official SDK required)

Supported endpoints:

  • /generateContent - Content generation (Gemini 2.0 Flash, 1.5 Pro, 1.5 Flash)
  • /embedContent - Text embeddings
  • /countTokens - Token counting

Setup:

from proxle import Gemini

client = Gemini(
    api_key="...",
    proxy_key="pk_live_..."
)

Proxy URL: POST /v1/proxy/gemini/models/{model}:generateContent
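Unlike the other providers, the Gemini proxy path embeds the model name, so direct calls need the {model} slot filled in per request. A small helper sketch, assuming the api.proxle.dev base URL from the direct-call example later in this document:

```python
# The Gemini proxy path embeds the model name; fill the {model} slot per call.
BASE = "https://api.proxle.dev/v1/proxy/gemini"

def gemini_url(model: str, method: str = "generateContent") -> str:
    """Build the Gemini proxy URL for a given model and method."""
    return f"{BASE}/models/{model}:{method}"

print(gemini_url("gemini-2.0-flash"))
# https://api.proxle.dev/v1/proxy/gemini/models/gemini-2.0-flash:generateContent
```

The same helper covers the other two endpoints, e.g. gemini_url("text-embedding-004", "embedContent") or gemini_url("gemini-1.5-pro", "countTokens").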

Azure OpenAI

SDK wrapper: AzureOpenAI / AsyncAzureOpenAI

Supported endpoints:

  • /chat/completions - Chat completions
  • /embeddings - Text embeddings

Setup:

from proxle import AzureOpenAI

client = AzureOpenAI(
    api_key="...",
    azure_endpoint="https://your-resource.openai.azure.com",
    api_version="2024-02-01",
    proxy_key="pk_live_..."
)

Azure requires two additional request headers, X-Azure-Endpoint and X-Azure-Api-Version, corresponding to the azure_endpoint and api_version arguments above.

Proxy URL: POST /v1/proxy/azure/openai/deployments/{deployment}/chat/completions
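For direct Azure calls, the deployment name goes in the path and the two Azure headers go alongside the usual proxy headers. A sketch assuming the api.proxle.dev base URL from the direct-call example below; the deployment name and key values are illustrative placeholders:

```python
# Build the direct-call URL and headers for the Azure OpenAI proxy route.
def azure_proxy_request(deployment: str, endpoint: str, api_version: str):
    url = (
        "https://api.proxle.dev/v1/proxy/azure/openai/deployments/"
        f"{deployment}/chat/completions"
    )
    headers = {
        "X-Api-Key": "pk_live_your-key",      # your Proxle API key
        "X-Provider-Key": "your-azure-key",   # your Azure OpenAI key
        "X-Azure-Endpoint": endpoint,
        "X-Azure-Api-Version": api_version,
        "Content-Type": "application/json",
    }
    return url, headers

url, headers = azure_proxy_request(
    "gpt-4o-deploy",  # illustrative deployment name
    "https://your-resource.openai.azure.com",
    "2024-02-01",
)
print(url)
```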

Using the Proxy Directly

You can use Proxle without an SDK by calling the proxy endpoints directly:

curl -X POST https://api.proxle.dev/v1/proxy/openai/chat/completions \
  -H "X-Api-Key: pk_live_your-key" \
  -H "X-Provider-Key: sk-your-openai-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'

Required headers:

| Header | Description |
|--------|-------------|
| X-Api-Key | Your Proxle API key |
| X-Provider-Key | Your provider API key (pass-through, never stored) |
| Content-Type | application/json |
| X-Metadata | Optional: JSON-encoded metadata object |
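The curl example above translates directly to the standard library's urllib. This sketch also shows the optional X-Metadata header, which carries a JSON-encoded object; the metadata keys here are illustrative:

```python
import json
import urllib.request

# Direct proxy call (OpenAI chat completions) built with stdlib urllib.
body = json.dumps({
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}],
}).encode()

req = urllib.request.Request(
    "https://api.proxle.dev/v1/proxy/openai/chat/completions",
    data=body,
    headers={
        "X-Api-Key": "pk_live_your-key",
        "X-Provider-Key": "sk-your-openai-key",
        "Content-Type": "application/json",
        # Optional: JSON-encoded metadata attached to the request.
        "X-Metadata": json.dumps({"team": "search", "env": "prod"}),
    },
    method="POST",
)
# response = urllib.request.urlopen(req)  # uncomment to actually send
```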

For Azure, also include:

| Header | Description |
|--------|-------------|
| X-Azure-Endpoint | Your Azure resource endpoint |
| X-Azure-Api-Version | Azure API version |