@agentskit/adapters
20+ LLM chat + embedder adapters, plus router / ensemble / fallback. Swap providers with one import line.
When to reach for it
- You need to talk to a hosted LLM (Anthropic, OpenAI, Gemini, Mistral, Cohere, Groq, Together, Fireworks, OpenRouter, Hugging Face, …).
- You want local-only (Ollama, LM Studio, vLLM, llama.cpp).
- You want to compose multiple candidates (createRouter, createEnsembleAdapter, createFallbackAdapter).
- You want to test agents without hitting a real LLM (mockAdapter, recordingAdapter, replayAdapter).
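The composition helpers are easiest to understand by their behavior. Below is a toy, self-contained sketch of the fallback pattern that createFallbackAdapter implements; the Adapter interface and fallback() function here are illustrative stand-ins, not the package's real types:

```typescript
// Conceptual sketch of the fallback pattern -- not the package's
// implementation. An adapter is modeled here as a single complete() call.
interface Adapter {
  complete(prompt: string): Promise<string>
}

// Try each candidate in order; return the first success.
function fallback(candidates: Adapter[]): Adapter {
  return {
    async complete(prompt) {
      let lastError: unknown = new Error('no candidates')
      for (const candidate of candidates) {
        try {
          return await candidate.complete(prompt)
        } catch (err) {
          lastError = err // remember the failure, move on to the next candidate
        }
      }
      throw lastError
    },
  }
}

// Demo: the first candidate always fails, the second answers.
const flaky: Adapter = { complete: async () => { throw new Error('rate limited') } }
const stable: Adapter = { complete: async (p) => `echo: ${p}` }

fallback([flaky, stable]).complete('hi').then(console.log) // prints "echo: hi"
```

A router chooses one candidate per request and an ensemble fans out to several; all three helpers return something with the same adapter shape, so they nest.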
Install
npm install @agentskit/adapters
Hello world
import { anthropic } from '@agentskit/adapters'
const adapter = anthropic({ apiKey: process.env.ANTHROPIC_API_KEY!, model: 'claude-sonnet-4-6' })
Surface
Hosted: anthropic · openai · gemini · grok · deepseek · kimi · mistral · cohere · together · groq · fireworks · openrouter · huggingface · langchain · langgraph · vercelAI · generic
Local: ollama · lmstudio · vllm · llamacpp
Embedders: openaiEmbedder · geminiEmbedder · ollamaEmbedder · deepseekEmbedder · grokEmbedder · kimiEmbedder · createOpenAICompatibleEmbedder
Higher-order: createRouter · createEnsembleAdapter · createFallbackAdapter
Testing: mockAdapter · recordingAdapter · replayAdapter · inMemorySink · simulateStream · chunkText · fetchWithRetry
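The testing helpers follow a record/replay pattern. A toy, self-contained sketch of the idea behind recordingAdapter and replayAdapter (the types and function names here are illustrative, not the package's API):

```typescript
// Toy sketch of record/replay -- not the package's implementation.
// recording() wraps a live adapter and logs each exchange; replay()
// serves the log back so tests never touch the network.
interface Adapter {
  complete(prompt: string): Promise<string>
}

function recording(inner: Adapter, tape: Map<string, string>): Adapter {
  return {
    async complete(prompt) {
      const reply = await inner.complete(prompt)
      tape.set(prompt, reply) // save the exchange for later replay
      return reply
    },
  }
}

function replay(tape: Map<string, string>): Adapter {
  return {
    async complete(prompt) {
      const reply = tape.get(prompt)
      if (reply === undefined) throw new Error(`no recording for: ${prompt}`)
      return reply
    },
  }
}
```

Record once against a live adapter, persist the tape, then run the suite against the replay adapter for deterministic, offline assertions.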
Related
@agentskit/core
The shared contract layer for AgentsKit: TypeScript types, the headless chat controller, stream helpers, and building blocks used by @agentskit/react, @agentskit/ink, @agentskit/runtime, and @agentskit/adapters.
@agentskit/runtime
Standalone agent runtime. ReAct loop, durable execution, multi-agent topologies, speculative execution, background agents.
Source
npm: @agentskit/adapters · repo: packages/adapters
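The ReAct loop mentioned in the @agentskit/runtime blurb above alternates reasoning and tool calls until the agent can answer. A toy, self-contained sketch of that pattern (the stub policy() stands in for an LLM; none of this is @agentskit/runtime's actual API):

```typescript
// Each step either invokes a named tool or produces a final answer.
type Step = { tool: string; input: string } | { answer: string }

const tools: Record<string, (input: string) => string> = {
  add: (input) => String(input.split('+').map(Number).reduce((a, b) => a + b, 0)),
}

// Stub policy: call the add tool once, then answer with the observation.
// In a real runtime an LLM chooses the step from the question and history.
function policy(question: string, observations: string[]): Step {
  if (observations.length === 0) return { tool: 'add', input: question }
  return { answer: observations[observations.length - 1] }
}

function react(question: string): string {
  const observations: string[] = []
  for (let i = 0; i < 5; i++) { // cap the loop to avoid running forever
    const step = policy(question, observations)
    if ('answer' in step) return step.answer
    observations.push(tools[step.tool](step.input)) // act, then observe
  }
  throw new Error('no answer within step budget')
}

console.log(react('2+3')) // prints "5"
```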