# Providers

## Local runtimes

Run fully offline using Ollama, LM Studio, vLLM, or llama.cpp.

| Adapter   | Import     | Default URL                  |
| --------- | ---------- | ---------------------------- |
| Ollama    | `ollama`   | `http://localhost:11434`     |
| LM Studio | `lmstudio` | `http://localhost:1234/v1`   |
| vLLM      | `vllm`     | `http://localhost:8000/v1`   |
| llama.cpp | `llamacpp` | `http://localhost:8080`      |

## Usage

```js
import { ollama } from '@agentskit/adapters'

const adapter = ollama({ model: 'llama3.2', url: 'http://localhost:11434' })
```
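The table above suggests each adapter is a factory that takes a model name and an optional URL, falling back to the runtime's default endpoint when no URL is given. The sketch below illustrates that pattern in isolation; it is an assumption about the shape of the API, not the actual `@agentskit/adapters` source, and the `makeFactory` helper is hypothetical.

```typescript
// Hypothetical sketch of the adapter factory pattern implied by the table:
// each factory accepts a model plus an optional url, and falls back to the
// runtime's default endpoint. Not the real @agentskit/adapters implementation.
interface AdapterConfig {
  model: string;
  url?: string;
}

interface Adapter {
  name: string;
  model: string;
  url: string;
}

// Build a factory bound to an adapter name and its default URL.
function makeFactory(name: string, defaultUrl: string) {
  return ({ model, url }: AdapterConfig): Adapter => ({
    name,
    model,
    url: url ?? defaultUrl, // fall back to the runtime's default endpoint
  });
}

// Default URLs taken from the table above.
const lmstudio = makeFactory('lmstudio', 'http://localhost:1234/v1');
const vllm = makeFactory('vllm', 'http://localhost:8000/v1');

const a = lmstudio({ model: 'qwen2.5-7b-instruct' });
console.log(a.url); // http://localhost:1234/v1
```

Passing a `url` overrides the default, which is useful when a runtime is bound to a non-standard port or running on another host.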
