Users bring their own LLM. One script tag, any provider, your UI. Zero AI costs.
Add AI chat to any website in 3 lines.
<script src="https://cdn.bundlellm.com/sdk.js"></script>
<div id="chat"></div>
<script>
  BundleLLM.init().renderChat('#chat')
</script>
Three steps to AI-powered features on your site.
1. Include the SDK and call renderChat() for a drop-in widget, or use the event API for full control over your UI.
2. Your users choose OpenRouter (200+ models, one-click OAuth) or connect Anthropic with an API key.
3. Your site sends prompts and receives streaming responses. The SDK connects directly to the provider, so there's no middleman and no AI costs for you.
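The streaming round trip can be pictured with plain events. FakeStream below is a stand-in for the stream object the SDK returns, not the real implementation; it only shows how delta events assemble into a full reply on your side.

```javascript
// Minimal sketch of the prompt/stream round trip. "FakeStream" is a
// stand-in for the SDK's stream object; the real SDK streams tokens
// from whichever provider the user connected.
class FakeStream {
  constructor() { this.handlers = {}; }
  on(event, fn) { (this.handlers[event] ??= []).push(fn); return this; }
  emit(event, payload) { (this.handlers[event] ?? []).forEach(fn => fn(payload)); }
}

// Your site's side: append each delta as it arrives.
function collectReply(stream) {
  let reply = '';
  stream.on('delta', (text) => { reply += text; });
  stream.on('done', () => console.log('assistant:', reply));
  return () => reply;
}

const stream = new FakeStream();
const getReply = collectReply(stream);

// Simulate the provider streaming three chunks back.
['Hello', ', ', 'world'].forEach(chunk => stream.emit('delta', chunk));
stream.emit('done');
console.log(getReply()); // 'Hello, world'
```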
Drop-in widget or full custom UI — your choice.
Complete chat UI in 3 lines. Sign-in, streaming, sign-out — all handled.
<script src="https://cdn.bundlellm.com/sdk.js"></script>
<div id="chat" style="height:500px"></div>
<script>
  BundleLLM.init().renderChat('#chat', {
    context: 'Product page for...',
    placeholder: 'Ask about this product',
  })
</script>

Full control with event-driven API. Build your own chat experience.
const ai = BundleLLM.init()

ai.renderSignIn('#btn')

ai.on('connected', ({ provider }) => {
  // Show your chat UI
})

const stream = ai.chat({
  messages: [{ role: 'user', content: '...' }],
  context: 'Your data...',
})

stream.on('delta', (text) => {
  // Append to your container
})
</stream>
Your users choose their provider. You write zero provider-specific code.
OpenRouter: 200+ models, including Claude, GPT, Gemini, and Llama
Anthropic: Claude Sonnet 4, Haiku 4.5, and Opus 4
More providers added as they support OAuth. The API key fallback works with any provider.
Everyone wins when users own their AI.
What we're building next.
See how users interact with AI on your site. Track connections, messages, token usage, and provider breakdown per domain.
Upload your docs, product data, or knowledge base. BundleLLM automatically retrieves relevant context for every chat message.
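The retrieval step can be pictured as scoring stored chunks against the user's message and prepending the best matches as context. The keyword-overlap ranker below is purely illustrative and is not BundleLLM's actual retriever; it only shows the shape of the feature.

```javascript
// Illustrative only: rank uploaded doc chunks by keyword overlap with
// the user's message. BundleLLM's real retriever is unspecified here;
// this sketch just shows what "relevant context per message" means.
function retrieveContext(chunks, message, topK = 2) {
  const words = new Set(message.toLowerCase().split(/\W+/).filter(Boolean));
  return chunks
    .map(chunk => ({
      chunk,
      score: chunk.toLowerCase().split(/\W+/).filter(w => words.has(w)).length,
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, topK)
    .filter(({ score }) => score > 0)
    .map(({ chunk }) => chunk);
}

const docs = [
  'Shipping takes 3-5 business days.',
  'Returns are accepted within 30 days.',
  'Our headquarters are in Berlin.',
];
const context = retrieveContext(docs, 'How long does shipping take?');
// context[0] is the shipping chunk
```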
Want early access? Get in touch