Providers
BundleLLM supports multiple AI providers. Users choose which provider to connect to; your integration code stays the same.
OpenRouter
| | |
| --- | --- |
| Auth | OAuth PKCE (one-click) or API key |
| Models | 200+ models: Claude, GPT, Gemini, Llama, and more |
| Context | Varies by model |
| Streaming | SSE (OpenAI-compatible) |
OpenRouter is the recommended provider. Users click one button, authorize in a popup, and they're connected; no API key copy-paste is required. A single OpenRouter account gives access to models from every major provider.
Alternatively, users can supply an API key from openrouter.ai/keys.
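However the user authenticates, OpenRouter streams responses as standard `text/event-stream` lines in the OpenAI-compatible format. This is what a single line of that stream carries (a standalone parser, independent of BundleLLM, shown only to make the wire format concrete):

```typescript
// Parse one SSE line from an OpenAI-compatible stream into the text
// delta it carries, or null for comments, blank lines, and the
// [DONE] sentinel that ends the stream.
function parseSSELine(line: string): string | null {
  if (!line.startsWith('data: ')) return null
  const payload = line.slice('data: '.length)
  if (payload === '[DONE]') return null
  const chunk = JSON.parse(payload) as {
    choices: { delta: { content?: string } }[]
  }
  return chunk.choices[0]?.delta.content ?? null
}
```

The SDK handles this parsing for you; you only ever see the resulting text deltas.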
Available models in the SDK:
- Claude Sonnet 4 (`anthropic/claude-sonnet-4`)
- Claude Haiku 4.5 (`anthropic/claude-haiku-4-5-20251001`)
- GPT-4o (`openai/gpt-4o`)
- GPT-4.1 Mini (`openai/gpt-4.1-mini`)
- Gemini 2.5 Pro (`google/gemini-2.5-pro-preview`)
- Llama 4 Maverick (`meta-llama/llama-4-maverick`)
Anthropic
| | |
| --- | --- |
| Auth | API key + model selector |
| Models | Claude Sonnet 4, Haiku 4.5, Opus 4 |
| Context | 200K tokens |
| Streaming | SSE |
| Vision | Yes |
For users who want to use their Anthropic account directly. API keys are available from console.anthropic.com.
Available models in the SDK:
- Claude Sonnet 4 (`claude-sonnet-4-20250514`)
- Claude Haiku 4.5 (`claude-haiku-4-5-20251001`)
- Claude Opus 4 (`claude-opus-4-20250514`)
Provider-Agnostic Code
Your integration doesn’t need to know which provider the user chose:
```typescript
// This works the same regardless of provider
const stream = ai.chat({
  messages: [{ role: 'user', content: 'Hello' }],
})

stream.on('delta', (text) => {
  // Same event format from every provider
})
```
The SDK normalizes all provider differences (streaming protocols, message formats, and auth mechanisms) into a unified API.
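To make the normalization concrete, here is a sketch of the kind of mapping the SDK performs internally. The chunk shapes follow the public OpenAI-compatible and Anthropic streaming formats; `normalizeDelta()` itself is illustrative, not BundleLLM's actual internals:

```typescript
// Two providers, two wire formats, one output: the text delta.
type OpenAIChunk = { choices: { delta: { content?: string } }[] }
type AnthropicChunk = { type: string; delta?: { text?: string } }

function normalizeDelta(
  provider: 'openrouter' | 'anthropic',
  chunk: unknown,
): string {
  if (provider === 'openrouter') {
    // OpenRouter streams in the OpenAI-compatible format
    return (chunk as OpenAIChunk).choices[0]?.delta.content ?? ''
  }
  // Anthropic emits typed events; only content_block_delta carries text
  const c = chunk as AnthropicChunk
  return c.type === 'content_block_delta' ? c.delta?.text ?? '' : ''
}
```

Whichever branch runs, your `delta` handler receives a plain string.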
Switching Models
Users can switch models anytime while connected:
```typescript
// Widget: dropdown in header (automatic)
// Custom UI: use setModel()
ai.setModel('anthropic/claude-haiku-4-5-20251001')

// Get available models
const models = ai.getModels()
// [{ id: 'anthropic/claude-sonnet-4', name: 'Claude Sonnet 4' }, ...]
```
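For a custom UI, those two calls are all a model picker needs: list models with `getModels()`, switch with `setModel()`. A minimal sketch (the `ai` object is mocked here so the snippet runs standalone; in a real app it is the connected SDK instance):

```typescript
// Mock standing in for the connected SDK instance
type ModelInfo = { id: string; name: string }

const ai = {
  currentModel: 'anthropic/claude-sonnet-4',
  getModels(): ModelInfo[] {
    return [
      { id: 'anthropic/claude-sonnet-4', name: 'Claude Sonnet 4' },
      { id: 'anthropic/claude-haiku-4-5-20251001', name: 'Claude Haiku 4.5' },
    ]
  },
  setModel(id: string) {
    this.currentModel = id
  },
}

// Turn getModels() into dropdown options, then switch on "selection"
const options = ai.getModels().map((m) => ({ value: m.id, label: m.name }))
ai.setModel(options[1].value)
```

The `name` field is for display; always pass the `id` back to `setModel()`.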