agentskit.js

WebLlmConfig

Auto-generated API reference for WebLlmConfig.

Interface: WebLlmConfig

Defined in: webllm.ts:14

Browser-only adapter backed by [WebLLM](https://github.com/mlc-ai/web-llm). Models run on-device via WebGPU, so no network requests are made for inference. The MLCEngine is loaded lazily on the first stream so apps can ship the import without paying the wasm cost up front.

`@mlc-ai/web-llm` is an optional peer dependency; install it alongside this package when you opt into browser-only inference.
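For orientation, a minimal config might look like the sketch below. The interface shape is taken from the properties documented on this page; mirroring it locally just keeps the example self-contained — in an app you would import the type from the package instead.

```typescript
// Local mirror of the documented interface, so the sketch compiles
// standalone; in a real app, import WebLlmConfig from the package.
interface WebLlmConfig {
  model: string; // model id from MLC's catalog
  engine?: unknown; // optional pre-loaded MLCEngine
  onProgress?: (info: { progress: number; text: string }) => void;
}

// Inference is on-device, so the only required field is the model id.
const config: WebLlmConfig = {
  model: "Llama-3.1-8B-Instruct-q4f16_1-MLC",
  onProgress: ({ progress, text }) =>
    console.log(`${Math.round(progress * 100)}% ${text}`),
};
```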

## Properties

### engine?

optional engine?: WebLlmEngineLike

Defined in: webllm.ts:21

Override the engine to inject a pre-loaded one (the MLCEngine spin-up is non-trivial; apps usually warm it once, not per turn).
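One way to warm the engine exactly once is to memoize the loader at module scope so the wasm download/compile never runs twice. A sketch, where the `once` helper and `getEngine` name are illustrative, not part of this package:

```typescript
// Illustrative helper: memoize an async loader so the expensive
// engine creation runs at most once per page load.
function once<T>(load: () => Promise<T>): () => Promise<T> {
  let cached: Promise<T> | undefined;
  return () => (cached ??= load());
}

// Usage sketch with WebLLM (needs a WebGPU-capable browser, so left
// commented out here):
// import { CreateMLCEngine } from "@mlc-ai/web-llm";
// const getEngine = once(() =>
//   CreateMLCEngine("Llama-3.1-8B-Instruct-q4f16_1-MLC"));
// const config = {
//   model: "Llama-3.1-8B-Instruct-q4f16_1-MLC",
//   engine: await getEngine(),
// };
```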


### model

model: string

Defined in: webllm.ts:16

Model id from MLC's catalog, e.g. `Llama-3.1-8B-Instruct-q4f16_1-MLC`.


### onProgress?

optional onProgress?: (info) => void

Defined in: webllm.ts:23

Engine progress callback (model download / compile percent).

#### Parameters

`info`: the progress report object.

| Field | Type |
| ------ | ------ |
| `info.progress` | `number` |
| `info.text` | `string` |

#### Returns

void
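A small sketch of a callback body: turn a progress report into a status line. The `info` shape (`{ progress, text }`) follows the parameter list above; treating `progress` as a 0..1 fraction is an assumption.

```typescript
type ProgressInfo = { progress: number; text: string };

// Format a report as a one-line status string, e.g. "42% Fetching params".
function formatProgress(info: ProgressInfo): string {
  const pct = Math.round(info.progress * 100);
  return `${pct}% ${info.text}`;
}

// Wire it up as the onProgress callback, e.g.:
// onProgress: (info) => { statusEl.textContent = formatProgress(info); }
```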
