Class: VLLM
Unified language model interface
Extends
OpenAI
Constructors
new VLLM()
new VLLM(params): VLLM
Parameters
• params: VLLMParams
Returns
VLLM
Overrides
OpenAI.constructor
Defined in
packages/providers/vllm/dist/index.d.ts:16
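A minimal construction sketch. The package name `@llamaindex/vllm` and the exact `VLLMParams` fields are assumptions inferred from the inherited OpenAI options documented below; vLLM's OpenAI-compatible server defaults to port 8000.

```ts
import { VLLM } from "@llamaindex/vllm"; // assumed package name

// Point the inherited OpenAI client at a local vLLM server.
// additionalSessionOptions forwards ClientOptions (minus apiKey/maxRetries/timeout).
const llm = new VLLM({
  model: "Qwen/Qwen2.5-7B-Instruct", // any model your vLLM instance serves (assumed field)
  temperature: 0.2,
  maxTokens: 512,
  additionalSessionOptions: {
    baseURL: "http://localhost:8000/v1",
  },
});
```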
Properties
additionalChatOptions?
optional additionalChatOptions: OpenAIAdditionalChatOptions
Inherited from
OpenAI.additionalChatOptions
Defined in
packages/providers/openai/dist/index.d.ts:240
additionalSessionOptions?
optional additionalSessionOptions: Omit<Partial<ClientOptions>, "apiKey" | "maxRetries" | "timeout">
Inherited from
OpenAI.additionalSessionOptions
Defined in
packages/providers/openai/dist/index.d.ts:244
apiKey?
optional apiKey: string
Inherited from
OpenAI.apiKey
Defined in
packages/providers/openai/dist/index.d.ts:241
lazySession()
lazySession: () => Promise<LLMInstance$1>
Returns
Promise<LLMInstance$1>
Inherited from
OpenAI.lazySession
Defined in
packages/providers/openai/dist/index.d.ts:245
maxRetries
maxRetries: number
Inherited from
OpenAI.maxRetries
Defined in
packages/providers/openai/dist/index.d.ts:242
maxTokens?
optional maxTokens: number
Inherited from
OpenAI.maxTokens
Defined in
packages/providers/openai/dist/index.d.ts:239
model
model: string & object | ChatModel
Inherited from
OpenAI.model
Defined in
packages/providers/openai/dist/index.d.ts:236
temperature
temperature: number
Inherited from
OpenAI.temperature
Defined in
packages/providers/openai/dist/index.d.ts:237
timeout?
optional timeout: number
Inherited from
OpenAI.timeout
Defined in
packages/providers/openai/dist/index.d.ts:243
topP
topP: number
Inherited from
OpenAI.topP
Defined in
packages/providers/openai/dist/index.d.ts:238
Accessors
metadata
Get Signature
get metadata(): LLMMetadata & object
Returns
LLMMetadata & object
Inherited from
OpenAI.metadata
Defined in
packages/providers/openai/dist/index.d.ts:252
session
Get Signature
get session(): Promise<LLMInstance$1>
Returns
Promise<LLMInstance$1>
Inherited from
OpenAI.session
Defined in
packages/providers/openai/dist/index.d.ts:246
supportToolCall
Get Signature
get supportToolCall(): boolean
Returns
boolean
Inherited from
OpenAI.supportToolCall
Defined in
packages/providers/openai/dist/index.d.ts:251
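The accessors above are read-only views of the configured instance. A short inspection sketch, assuming the `llm` instance from the constructor example and that `LLMMetadata` carries `model`, `temperature`, and `contextWindow` fields:

```ts
// Read the resolved configuration from the metadata accessor.
const { model, temperature, contextWindow } = llm.metadata;
console.log(`model=${model} temperature=${temperature} context=${contextWindow}`);

// Guard tool-use code paths on the provider's capability flag.
if (llm.supportToolCall) {
  console.log("This provider reports tool-call support.");
}
```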
Methods
chat()
chat(params)
chat(params): Promise<AsyncIterable<ChatResponseChunk<ToolCallLLMMessageOptions>, any, any>>
Get a chat response from the LLM
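The AsyncIterable return type above corresponds to the streaming overload. A hedged usage sketch; the `stream: true` flag, the message shape, and the `delta` field on chunks follow the common LlamaIndexTS chat API and are assumptions here:

```ts
// Streaming chat: this overload yields ChatResponseChunk values as they arrive.
const stream = await llm.chat({
  messages: [{ role: "user", content: "Summarize vLLM in one sentence." }],
  stream: true, // selects the AsyncIterable<ChatResponseChunk> overload (assumed flag)
});

for await (const chunk of stream) {
  process.stdout.write(chunk.delta); // delta carries the incremental text (assumed field)
}
```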