# Custom UI

For full control over the chat experience, use the raw SDK instead of the drop-in widget.
## Setup

```html
<script src="https://cdn.bundlellm.com/sdk.js"></script>

<button id="sign-in"></button>
<button id="sign-out" style="display:none">Sign out</button>
<div id="status"></div>
<div id="messages"></div>
<input id="input" placeholder="Ask something..." />
<button id="send">Send</button>
```
## Initialize

```js
const ai = BundleLLM.init({ siteId: 'my-app' })

// Render the sign-in button
ai.renderSignIn('#sign-in')
```
## Handle Auth State

```js
const signOutBtn = document.getElementById('sign-out')
const statusEl = document.getElementById('status')

ai.on('connected', ({ provider, model }) => {
  statusEl.textContent = `Connected to ${provider} (${model})`
  signOutBtn.style.display = 'inline'
})

ai.on('disconnected', () => {
  statusEl.textContent = 'Not connected'
  signOutBtn.style.display = 'none'
})

signOutBtn.addEventListener('click', () => ai.disconnect())
```
## Send Messages

```js
const history = []

function sendMessage(text) {
  history.push({ role: 'user', content: text })

  const stream = ai.chat({
    messages: [...history],
    context: 'Your page context here...',
  })

  let fullText = ''

  stream
    .on('delta', (text) => {
      fullText += text
      // Update your UI with the accumulated text
    })
    .on('done', ({ usage }) => {
      history.push({ role: 'assistant', content: fullText })
      // Show token usage if desired
    })
    .on('error', ({ message }) => {
      // Show the error in your UI
    })
}
```
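The `history` array above grows without bound. Before each request you may want to cap how many messages are sent; a minimal sketch (the `trimHistory` helper and the cap of 20 are our own, not part of the SDK):

```js
// Keep only the most recent messages so each request stays within the
// model's context window. The cap of 20 is an arbitrary choice.
function trimHistory(history, maxMessages = 20) {
  if (history.length <= maxMessages) return [...history]
  return history.slice(history.length - maxMessages)
}
```

Call it where `sendMessage` builds the request, e.g. `ai.chat({ messages: trimHistory(history) })`.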
## Cancel a Stream

```js
const stream = ai.chat({ messages: [...] })

// Cancel after 5 seconds
setTimeout(() => stream.cancel(), 5000)
```
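A bare `setTimeout` will still fire after the stream finishes normally. If you want the timer cleared once the stream completes, a small wrapper helps; a sketch (`cancelAfter` is our own helper, built only on the `on`/`cancel` surface shown above):

```js
// Cancel the stream if it has not completed within `ms` milliseconds.
// Works with any object exposing on('done'|'error') and cancel().
function cancelAfter(stream, ms) {
  const timer = setTimeout(() => stream.cancel(), ms)
  stream.on('done', () => clearTimeout(timer))
  stream.on('error', () => clearTimeout(timer))
  return stream
}
```

Usage: `cancelAfter(ai.chat({ messages: [...history] }), 5000)`.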
## Check Status Programmatically

```js
const status = await ai.getStatus()

if (status.connected) {
  console.log(`Using ${status.provider} / ${status.model}`)
}
```
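For rendering, the status object can be folded into a single display string; a sketch (`formatStatus` is our own helper, assuming only the `connected`, `provider`, and `model` fields shown above):

```js
// Turn the object returned by ai.getStatus() into a display string.
function formatStatus(status) {
  if (!status.connected) return 'Not connected'
  return `Using ${status.provider} / ${status.model}`
}
```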
## Framework Examples

### React

```jsx
import { useEffect, useState } from 'react'

function Chat() {
  const [ai] = useState(() => BundleLLM.init())
  const [connected, setConnected] = useState(false)

  useEffect(() => {
    ai.renderSignIn('#sign-in-btn')
    ai.on('connected', () => setConnected(true))
    ai.on('disconnected', () => setConnected(false))
    return () => ai.destroy()
  }, [ai])

  const send = (text) => {
    const stream = ai.chat({ messages: [{ role: 'user', content: text }] })
    stream.on('delta', (chunk) => { /* update state */ })
  }

  return (
    <div>
      {!connected && <div id="sign-in-btn" />}
      {connected && <YourChatUI onSend={send} />}
    </div>
  )
}
```
### Vue

```vue
<template>
  <div v-if="!connected" id="sign-in-btn" />
  <YourChatUI v-else @send="send" />
</template>

<script setup>
import { ref, onMounted, onUnmounted } from 'vue'

const ai = BundleLLM.init()
const connected = ref(false)

onMounted(() => {
  ai.renderSignIn('#sign-in-btn')
  ai.on('connected', () => connected.value = true)
  ai.on('disconnected', () => connected.value = false)
})

onUnmounted(() => ai.destroy())

function send(text) {
  const stream = ai.chat({ messages: [{ role: 'user', content: text }] })
  stream.on('delta', (chunk) => { /* update state */ })
}
</script>
```
## Required User Protections

When building a custom UI, you must provide the following to comply with the BundleLLM Terms of Service.

### 1. Disconnect Button

Users must be able to disconnect their Provider at any time:

```js
ai.disconnect()
```
### 2. Token Usage Display

Show token usage after each response so users can see their cost:

```js
stream.on('done', ({ usage }) => {
  if (usage) {
    showTokens(`${usage.inputTokens} in / ${usage.outputTokens} out`)
  }
})
```
### 3. Provider Identity

Show which Provider the user is connected to:

```js
ai.on('connected', ({ provider }) => {
  showStatus(`Connected to ${provider}`)
})
```
### 4. No Key Interception

Do not access, log, or transmit the user's API key. The `connected` event only includes `{ provider, model }`; the API key is never exposed to your code. It is stored in the user's browser `localStorage` and used internally by the SDK.
## Cleanup

Always call `destroy()` when your component unmounts:

```js
ai.destroy()
```

This removes event listeners, cancels pending streams, and cleans up internal state.