JavaScript SDK

The Proxle JavaScript SDK provides drop-in replacements for popular LLM client libraries, with full TypeScript support. Swap your import, add a proxy key, and every request is automatically logged, cached, and tracked.

Installation

npm install proxle

Requires Node.js 18+. Full TypeScript definitions included.

Configuration

Constructor Arguments

import { OpenAI } from "proxle";

const client = new OpenAI({
  apiKey: "sk-...",           // Your provider API key
  proxyKey: "pk_live_...",    // Your Proxle API key
  proxyUrl: "https://...",    // Optional: custom proxy URL
});

Environment Variables

PROXLE_API_KEY=pk_live_...
PROXLE_URL=https://api.proxle.dev  # optional
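When these variables are set, the constructor options can presumably be omitted and the SDK falls back to the environment. The resolution logic would look roughly like this sketch (the resolveProxyKey helper is illustrative, not part of the SDK's public API):

```typescript
// Sketch: resolve the Proxle key from an explicit option, falling back to
// the PROXLE_API_KEY environment variable documented above.
function resolveProxyKey(explicit?: string): string {
  const key = explicit ?? process.env.PROXLE_API_KEY;
  if (!key) {
    throw new Error(
      "Missing Proxle API key: pass proxyKey or set PROXLE_API_KEY",
    );
  }
  return key;
}
```

An explicit proxyKey passed to the constructor takes precedence over the environment variable.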

Provider Wrappers

OpenAI

import { OpenAI } from "proxle";

const client = new OpenAI({
  apiKey: "sk-...",
  proxyKey: "pk_live_...",
});

// Chat completions
const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
  metadata: { feature: "chat", userId: "user_123" },
});

// Embeddings
const embeddings = await client.embeddings.create({
  model: "text-embedding-3-small",
  input: "Hello world",
});

Anthropic

import { Anthropic } from "proxle";

const client = new Anthropic({
  apiKey: "sk-ant-...",
  proxyKey: "pk_live_...",
});

const response = await client.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello!" }],
  metadata: { feature: "assistant" },
});

Azure OpenAI

import { AzureOpenAI } from "proxle";

const client = new AzureOpenAI({
  apiKey: "...",
  endpoint: "https://your-resource.openai.azure.com",
  apiVersion: "2024-02-01",
  proxyKey: "pk_live_...",
});

const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
});

Cohere

import { Cohere } from "proxle";

const client = new Cohere({
  apiKey: "...",
  proxyKey: "pk_live_...",
});

// Chat
const response = await client.chat({
  model: "command-r-plus",
  message: "Hello!",
  metadata: { feature: "chat" },
});

// Embeddings
const embeddings = await client.embed({
  model: "embed-english-v3.0",
  texts: ["Hello world"],
  inputType: "search_document",
});

Gemini

import { Gemini } from "proxle";

const client = new Gemini({
  apiKey: "...",
  proxyKey: "pk_live_...",
});

const response = await client.generateContent({
  model: "gemini-2.0-flash",
  contents: [{ role: "user", parts: [{ text: "Hello!" }] }],
  metadata: { feature: "chat" },
});

Metadata

The metadata parameter lets you tag requests for cost attribution:

const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [...],
  metadata: {
    feature: "chat_assistant",
    userId: "user_123",
    environment: "production",
  },
});

Metadata is limited to 4KB when serialized.
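If you build metadata dynamically (e.g. from user input), it can be worth guarding the limit client-side before sending. A minimal sketch, assuming the limit applies to the UTF-8 byte length of the JSON-serialized object (the assertMetadataSize helper is illustrative, not part of the SDK):

```typescript
// Sketch: reject metadata that would exceed the documented 4KB limit.
const METADATA_LIMIT_BYTES = 4 * 1024;

function assertMetadataSize(metadata: Record<string, unknown>): void {
  // Measure the UTF-8 byte length of the serialized payload.
  const bytes = Buffer.byteLength(JSON.stringify(metadata), "utf8");
  if (bytes > METADATA_LIMIT_BYTES) {
    throw new Error(
      `Metadata is ${bytes} bytes when serialized; limit is ${METADATA_LIMIT_BYTES}`,
    );
  }
}
```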

Streaming

All providers support streaming:

// OpenAI streaming
const stream = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Tell me a story" }],
  stream: true,
});

for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content;
  if (content) process.stdout.write(content);
}

// Anthropic streaming
const stream = await client.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Tell me a story" }],
  stream: true,
});

for await (const event of stream) {
  if (event.type === "content_block_delta") {
    process.stdout.write(event.delta.text);
  }
}

Error Handling

import { OpenAI, ProxleError, ProxleConfigError } from "proxle";

try {
  const client = new OpenAI({ apiKey: "sk-...", proxyKey: "pk_live_..." });
  const response = await client.chat.completions.create({...});
} catch (error) {
  if (error instanceof ProxleConfigError) {
    console.error("Configuration error:", error.message);
  } else if (error instanceof ProxleError) {
    console.error("Proxle error:", error.message);
  }
}
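A common pattern on top of this is to retry transient failures with backoff. A generic sketch (the withRetry wrapper is illustrative, not SDK API; adapt the retry policy to the error fields your SDK version actually exposes):

```typescript
// Sketch: retry an async call with exponential backoff before giving up.
async function withRetry<T>(
  fn: () => Promise<T>,
  retries = 3,
  baseDelayMs = 200,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (error) {
      lastError = error;
      if (attempt === retries) break;
      // Exponential backoff: 200ms, 400ms, 800ms, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}
```

Usage: `await withRetry(() => client.chat.completions.create({ ... }))`. Configuration errors (ProxleConfigError) should not be retried, since they will fail identically every time.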

TypeScript

Full type definitions are included. All methods return properly typed responses:

import type { OpenAIConfig, Metadata } from "proxle";

const config: OpenAIConfig = {
  apiKey: "sk-...",
  proxyKey: "pk_live_...",
};

const metadata: Metadata = {
  feature: "chat",
  userId: "user_123",
};