# SDK Reference

## BundleLLM.init(config?)

Creates a new BundleLLM instance.

```js
const ai = BundleLLM.init({ siteId: 'my-site' })
```

| Parameter | Type | Description |
| --- | --- | --- |
| `config.apiUrl` | `string` (optional) | OAuth redirect handler URL |
| `config.siteId` | `string` (optional) | Site identifier for analytics (auto-derived from the domain if omitted) |

Returns: `BundleLLMInstance`


## Instance Methods

### renderSignIn(selector)

Renders a provider picker in the target element, showing OpenRouter (OAuth) and Anthropic (API key) with model selection.

```js
ai.renderSignIn('#sign-in-container')
```

### renderChat(selector, options?)

Renders a complete chat widget with provider picker, model selector, streaming, token usage, and sign-out.

```js
ai.renderChat('#chat', {
  placeholder: 'Ask anything...',
  context: 'Page content here...',
  welcomeMessage: 'Connect your AI to start.',
  theme: 'light',
})
```

See Chat Widget for full options.


### chat(request)

Starts a streaming chat. Returns a `ChatStream`.

```js
const stream = ai.chat({
  messages: [{ role: 'user', content: 'Hello' }],
  context: 'Optional system prompt...',
  model: 'anthropic/claude-sonnet-4',
  maxTokens: 4096,
  temperature: 0.7,
})
```

| Parameter | Type | Description |
| --- | --- | --- |
| `messages` | `Array` | Conversation history; each entry has `role` and `content` |
| `context` | `string` (optional) | Sent as the system prompt with every message |
| `model` | `string` (optional) | Model override; uses the currently selected model if omitted |
| `maxTokens` | `number` (optional) | Maximum response tokens (default: 4096) |
| `temperature` | `number` (optional) | Sampling temperature |

Returns: `ChatStream`
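The `messages` parameter carries the full conversation history, so for multi-turn chat the page appends each exchange itself: push the user turn before calling `chat()`, then push the assistant's accumulated reply once the stream finishes. A minimal sketch of that history handling (the helper names are illustrative; the commented-out line is where the SDK call from this section would go):

```js
// Maintain conversation history across turns: each call to chat() receives
// everything said so far.
const messages = []

function addUserTurn(content) {
  messages.push({ role: 'user', content })
}

function addAssistantTurn(content) {
  messages.push({ role: 'assistant', content })
}

addUserTurn('Hello')
// const stream = ai.chat({ messages })   // SDK call; reply arrives via 'delta'/'done'
addAssistantTurn('Hi! How can I help?')   // text accumulated from the stream
addUserTurn('What is BundleLLM?')         // the next chat() call sees the full history

console.log(messages.length) // 3
```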


### getStatus()

Returns the current connection status.

```js
const status = ai.getStatus()
// { connected: true, provider: 'anthropic', model: 'claude-sonnet-4-20250514' }
```

Returns: `{ connected: boolean, provider?: string, model?: string }`


### setModel(modelId)

Changes the active model while connected. The choice persists to localStorage.

```js
ai.setModel('anthropic/claude-haiku-4')
```

### getModels()

Returns the available models for the connected provider.

```js
const models = ai.getModels()
// [{ id: 'claude-sonnet-4-20250514', name: 'Claude Sonnet 4' }, ...]
```

Returns: `Array<{ id: string, name: string }>`


### disconnect()

Signs the user out of their AI provider and clears the stored connection.

```js
ai.disconnect()
```

### on(event, callback)

Listens for events.

```js
ai.on('connected', ({ provider, model }) => { ... })
ai.on('disconnected', () => { ... })
ai.on('error', ({ message }) => { ... })
```

| Event | Callback Data | Description |
| --- | --- | --- |
| `connected` | `{ provider, model }` | User connected a provider |
| `disconnected` | (none) | User signed out |
| `error` | `{ message }` | An error occurred |

### off(event, callback)

Removes an event listener.
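`off()` only works when given the same function reference that was passed to `on()`, so handlers you plan to remove should be named functions rather than inline arrows. A sketch of the pattern (it uses a tiny stand-in emitter so it runs outside the SDK; in a real page `ai` is the instance from `BundleLLM.init()`):

```js
// Stand-in emitter with the same on/off shape as BundleLLMInstance
// (illustration only; the real `ai` comes from BundleLLM.init()).
const listeners = new Map()
const ai = {
  on(event, cb) { listeners.set(event, [...(listeners.get(event) ?? []), cb]) },
  off(event, cb) { listeners.set(event, (listeners.get(event) ?? []).filter((f) => f !== cb)) },
}

// Named handler: keeping the reference is what makes removal possible.
function handleConnected({ provider, model }) {
  console.log(`connected via ${provider} (${model})`)
}

ai.on('connected', handleConnected)

// Later, e.g. when tearing the page down:
ai.off('connected', handleConnected)

// An inline arrow could not be removed this way — there is no second
// reference to pass to off().
```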


### destroy()

Cleans up all listeners, cancels pending streams, and removes internal state.

```js
ai.destroy()
```

## ChatStream

Returned by `ai.chat()`. Supports chained `.on()` calls.

```js
stream
  .on('start', ({ messageId }) => { ... })
  .on('delta', (text) => { ... })
  .on('done', ({ usage }) => { ... })
  .on('error', ({ message }) => { ... })
```

| Event | Callback Data | Description |
| --- | --- | --- |
| `start` | `{ messageId }` | Stream started |
| `delta` | `string` | Text chunk received |
| `done` | `{ usage? }` | Stream complete; `usage` has `inputTokens` and `outputTokens` |
| `error` | `{ message }` | Stream error |
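Putting the events together, a typical consumer accumulates `delta` chunks into the full reply and reads `usage` on `done`. The sketch below swaps in a small stand-in emitter so it runs without the SDK; in a real page, `stream` is the object returned by `ai.chat()`:

```js
// Stand-in for a ChatStream: fires start, three deltas, then done
// (illustration only; the real stream comes from ai.chat()).
function fakeStream(chunks) {
  const handlers = {}
  const stream = { on(event, cb) { handlers[event] = cb; return stream } }
  setTimeout(() => {
    handlers.start?.({ messageId: 'msg-1' })
    for (const c of chunks) handlers.delta?.(c)
    handlers.done?.({ usage: { inputTokens: 3, outputTokens: chunks.length } })
  }, 0)
  return stream
}

// Accumulate delta chunks into the final assistant message.
let reply = ''
fakeStream(['Hel', 'lo', '!'])
  .on('delta', (text) => { reply += text })
  .on('done', ({ usage }) => {
    console.log(reply)                // 'Hello!' once the stream ends
    console.log(usage.outputTokens)   // 3
  })
```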

### stream.cancel()

Cancels the active stream.

```js
stream.cancel()
```