# Streaming chat
`useChat` + abort + back-pressure: the minimum viable streaming chat, production-ready. This recipe is the shortest path to a real streaming chat with cancel support and smooth rendering.
```tsx
import { useChat } from '@agentskit/react'
import { openai } from '@agentskit/adapters/openai'

const adapter = openai({ model: 'gpt-4o-mini' })

export function Chat() {
  const { messages, input, setInput, send, stop, status } = useChat({ adapter })
  return (
    <form
      onSubmit={(e) => {
        e.preventDefault()
        send(input)
        setInput('')
      }}
    >
      {messages.map((m) => (
        <p key={m.id} data-role={m.role}>
          {m.content}
        </p>
      ))}
      <input value={input} onChange={(e) => setInput(e.target.value)} />
      {status === 'streaming' ? (
        <button type="button" onClick={stop}>
          Stop
        </button>
      ) : (
        <button type="submit">Send</button>
      )}
    </form>
  )
}
```
*Live playground: basic streaming chat with a mock adapter, runs in your browser.*
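The Stop button works because the hook races the stream against an abort signal. A hedged sketch of that pattern, where `streamChunks` is a hypothetical helper standing in for the real transport (not part of `@agentskit`):

```typescript
// Hedged sketch of the cancel pattern behind stop(): an AbortController
// whose signal gates the read loop. streamChunks is a hypothetical
// helper standing in for the real transport, not part of @agentskit.
const delay = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms))

function streamChunks(chunks: string[], onChunk: (chunk: string) => void) {
  const controller = new AbortController()
  const done = (async () => {
    for (const chunk of chunks) {
      if (controller.signal.aborted) return // stop() was called mid-stream
      onChunk(chunk)
      await delay(10) // stand-in for waiting on the network
    }
  })()
  return { stop: () => controller.abort(), done }
}
```

Calling `stop()` does not tear anything down mid-chunk; the loop simply checks the signal before emitting the next chunk and exits cleanly.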
> **Tip:** The adapter is a pure function — no singletons, no side effects. Safe to create per-request.
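A minimal sketch of the pure-factory property the tip relies on. The shape below is illustrative, not the real `openai()` adapter interface:

```typescript
// Sketch of the "pure factory" property the tip describes. The shape is
// illustrative; the real openai() adapter has a richer interface.
type Adapter = { model: string; complete: (prompt: string) => string }

function makeAdapter(opts: { model: string }): Adapter {
  // Touches no module-level state: every call returns a fresh,
  // independent object, so creating one per request is safe.
  return {
    model: opts.model,
    complete: (prompt) => `[${opts.model}] ${prompt}`,
  }
}

const a = makeAdapter({ model: 'gpt-4o-mini' })
const b = makeAdapter({ model: 'gpt-4o-mini' })
// a and b share nothing; mutating one cannot affect the other.
```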
## ⚡ Performance
`useChat` batches stream chunks on `requestAnimationFrame`. That's why fast token streams still render smoothly: the UI never re-renders more often than the browser paints, no matter how quickly chunks arrive.
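A sketch of that frame-based batching, assuming a buffer that is flushed once per animation frame (illustrative, not the actual `useChat` internals; `raf` is injected so the sketch stays testable outside a browser):

```typescript
// Sketch of requestAnimationFrame chunk batching (illustrative, not the
// actual useChat source). Chunks arriving between paints are buffered
// and flushed together, at most once per frame.
type Flush = (batch: string[]) => void

function createChunkBatcher(flush: Flush, raf: (cb: () => void) => void) {
  let buffer: string[] = []
  let scheduled = false

  return function push(chunk: string) {
    buffer.push(chunk)
    if (scheduled) return // a flush is already queued for the next frame
    scheduled = true
    raf(() => {
      scheduled = false
      const batch = buffer
      buffer = []
      flush(batch) // one state update per frame, however many chunks arrived
    })
  }
}
```

In the browser `raf` would simply be `requestAnimationFrame`, and `flush` would be a single React state update.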
## Explore nearby

- **Cookbook**: copy-paste recipes for the things every agent app needs. Each recipe stands on its own.
- **Tools + memory together**: the "chat with state and actions" loop — persistent memory plus tool execution.
- **Auth in tool calls**: scope tool execution to the current authenticated user. Never trust the model with who is calling.