agentskit.js
Cookbook

Streaming chat

useChat + abort + back-pressure: the shortest path to a real streaming chat with cancel support and smooth rendering.

import { useChat } from '@agentskit/react'
import { openai } from '@agentskit/adapters/openai'

const adapter = openai({ model: 'gpt-4o-mini' })

export function Chat() {
  const { messages, input, setInput, send, stop, status } = useChat({ adapter })
  return (
    <form
      onSubmit={(e) => {
        e.preventDefault()
        if (!input.trim()) return // ignore empty submissions
        send(input)
        setInput('')
      }}
    >
      {messages.map((m) => (
        <p key={m.id} data-role={m.role}>
          {m.content}
        </p>
      ))}
      <input value={input} onChange={(e) => setInput(e.target.value)} />
      {status === 'streaming' ? (
        <button type="button" onClick={stop}>
          Stop
        </button>
      ) : (
        <button type="submit">Send</button>
      )}
    </form>
  )
}
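Under the hood, a `stop()` like the one above typically wraps the platform's standard AbortController pattern. This is a generic sketch of that pattern, not agentskit's actual internals; `makeCancelable` is a hypothetical helper for illustration:

```typescript
// Generic AbortController wiring — the standard way to make a streaming
// request cancelable. Not agentskit's internals; `makeCancelable` is
// a hypothetical name used only for this sketch.
function makeCancelable() {
  const controller = new AbortController()
  return {
    // Pass this signal into fetch(url, { signal }) so the in-flight
    // network request is torn down when stop() is called.
    signal: controller.signal,
    stop: () => controller.abort(),
  }
}

const chat = makeCancelable()
chat.stop()
// chat.signal.aborted is now true; an in-flight fetch using this signal
// would reject with an AbortError.
```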

Tip

The adapter is a pure function — no singletons, no side effects. Safe to create per-request.
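To see why a pure factory is safe per-request, here is a self-contained sketch in the same spirit. `mockAdapter` is hypothetical (it is not part of agentskit); the point is that each call returns an independent object that closes over its own options and touches no shared state:

```typescript
// A hypothetical adapter factory illustrating the "pure function, no
// singletons" property: every call returns a fresh, independent instance.
type Adapter = {
  model: string
  complete: (prompt: string) => string
}

function mockAdapter(opts: { model: string }): Adapter {
  // No module-level state is read or written here — the returned object
  // depends only on `opts`, so concurrent requests cannot interfere.
  return {
    model: opts.model,
    complete: (prompt) => `[${opts.model}] ${prompt}`,
  }
}

const a = mockAdapter({ model: 'fast' })
const b = mockAdapter({ model: 'slow' })
// a and b share nothing: safe to create one per incoming request.
```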

⚡ Performance

useChat batches stream chunks on requestAnimationFrame, which is why fast token streams still render smoothly: the UI is never asked to re-render faster than the browser paints.
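The batching idea can be sketched in a few lines. This is not agentskit's implementation — it makes the scheduler injectable (`schedule`) so the pattern runs outside a browser; in the browser you would pass `requestAnimationFrame` as the scheduler:

```typescript
// Sketch of rAF-style chunk batching: buffer incoming chunks and flush
// them at most once per "frame". `schedule` stands in for
// requestAnimationFrame so the pattern is testable outside a browser.
type Schedule = (flush: () => void) => void

class ChunkBatcher {
  private buffer: string[] = []
  private scheduled = false

  constructor(
    private onFlush: (text: string) => void,
    private schedule: Schedule,
  ) {}

  push(chunk: string) {
    this.buffer.push(chunk)
    if (!this.scheduled) {
      // Only one flush is queued per frame, no matter how many chunks arrive.
      this.scheduled = true
      this.schedule(() => this.flush())
    }
  }

  private flush() {
    this.scheduled = false
    this.onFlush(this.buffer.join(''))
    this.buffer = []
  }
}

// Drive it by hand: three pushes, one simulated frame, one render.
const renders: string[] = []
const pending: Array<() => void> = []
const batcher = new ChunkBatcher((t) => renders.push(t), (f) => pending.push(f))

batcher.push('Hel')
batcher.push('lo, ')
batcher.push('world')
pending.shift()!() // the frame fires: renders receives one joined string
```

Swapping `(f) => pending.push(f)` for `requestAnimationFrame` gives the browser behavior: many tokens per frame, one re-render per paint.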

