agentskit.js

# VLLMConfig

Auto-generated API reference for VLLMConfig.

Interface: VLLMConfig

Defined in: vllm.ts:3

## Extends

- `OpenAICompatibleConfig`

## Properties

### apiKey

`apiKey: string`

Defined in: openai.ts:5

#### Inherited from

`OpenAICompatibleConfig.apiKey`


### baseUrl?

`baseUrl?: string`

Defined in: openai.ts:7

#### Inherited from

`OpenAICompatibleConfig.baseUrl`


### includeUsage?

`includeUsage?: boolean`

Defined in: openai.ts:16

Ask the provider to include token usage in the final stream chunk via `stream_options: { include_usage: true }`. Off by default because some OpenAI-compatible providers (e.g. OpenRouter, which proxies to a long tail of backends) reject unknown parameters with a 4xx error and break the whole stream. Turn this on for vanilla `api.openai.com`.

#### Inherited from

`OpenAICompatibleConfig.includeUsage`
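A minimal sketch of how this flag maps onto the request body. The shape follows the OpenAI Chat Completions wire format; `buildStreamBody` is a hypothetical helper for illustration, not part of agentskit.js:

```typescript
// Request-body fragment for a streaming chat completion
// (OpenAI Chat Completions wire format).
interface StreamBody {
  model: string;
  stream: true;
  stream_options?: { include_usage: boolean };
}

// Hypothetical helper: only attach stream_options when opted in,
// so providers that reject unknown params never see it.
function buildStreamBody(model: string, includeUsage?: boolean): StreamBody {
  const body: StreamBody = { model, stream: true };
  if (includeUsage) {
    body.stream_options = { include_usage: true };
  }
  return body;
}
```

With `includeUsage` left unset, the body carries no `stream_options` key at all, which is what keeps strict providers happy.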


### model

`model: string`

Defined in: openai.ts:6

#### Inherited from

`OpenAICompatibleConfig.model`


### retry?

`retry?: RetryOptions`

Defined in: openai.ts:8

#### Inherited from

`OpenAICompatibleConfig.retry`
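Putting the properties above together, a config literal might look like the sketch below. The interface is re-declared inline to keep the example self-contained; in real code you would import `VLLMConfig` from agentskit.js. The `RetryOptions` shape and the specific values (key, model name, URL) are assumptions for illustration:

```typescript
// Assumed shape for RetryOptions; the real type lives in the library.
interface RetryOptions {
  maxAttempts?: number;
}

// Mirrors the documented fields of VLLMConfig.
interface VLLMConfig {
  apiKey: string;
  model: string;
  baseUrl?: string;
  retry?: RetryOptions;
  includeUsage?: boolean;
}

const config: VLLMConfig = {
  apiKey: "EMPTY", // local vLLM servers typically accept any key
  model: "meta-llama/Llama-3.1-8B-Instruct", // example model id
  baseUrl: "http://localhost:8000/v1", // vLLM's OpenAI-compatible endpoint
  includeUsage: true, // safe against your own vLLM server; see caveat above
};
```

Only `apiKey` and `model` are required; everything else can be omitted and left to the library's defaults.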
