Constructor parameters:
id: the unique id for this language model. It will be used to identify the model in the UI.
model: the unique model name as used in the Ollama environment.
Optional tokenUsageService: TokenUsageService
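A minimal sketch of how these constructor parameters relate to each other. The shape below is a stand-in for illustration only; the actual constructor signature and class name are not part of this excerpt, and the values shown are invented examples.

```ts
// Placeholder for Theia's TokenUsageService, which this page references but does not define here.
type TokenUsageService = unknown;

// Illustrative only: groups the three documented constructor parameters.
interface OllamaModelParams {
    /** Unique id for this language model; used to identify the model in the UI. */
    id: string;
    /** Model name exactly as the Ollama server knows it, e.g. 'llama3.1:8b'. */
    model: string;
    /** Optional service for recording the token usage of requests. */
    tokenUsageService?: TokenUsageService;
}

const params: OllamaModelParams = {
    id: 'ollama/llama3.1',
    model: 'llama3.1:8b'
    // tokenUsageService is optional and omitted here.
};
```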
Properties:
Protected Readonly DEFAULT_…
Protected host
Readonly id: the unique id for this language model. It will be used to identify the model in the UI.
Protected …
Readonly model: the unique model name as used in the Ollama environment.
Readonly provider…
Protected Optional Readonly tokenUsageService
Readonly vendor
Methods:
Protected check…
Check if the Ollama server supports thinking. Uses the Ollama 'show' request to get information about the model, so that its capabilities can be checked for the 'thinking' capability.
Parameters: the Ollama client instance; the name of the Ollama model.
Returns: a boolean indicating whether the Ollama model supports thinking.
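The method's full name is cut off above. The following is a minimal sketch of such a check, assuming the 'ollama' JavaScript client and a server recent enough to report a capabilities list in its show response; older servers omit the field, in which case the check falls back to false.

```ts
import { Ollama } from 'ollama';

// Sketch of the capability check described above. The function name is an
// assumption; the documented method's full name is truncated on this page.
async function supportsThinking(client: Ollama, model: string): Promise<boolean> {
    // 'show' returns metadata about the model; newer Ollama servers include a
    // capabilities array such as ['completion', 'tools', 'thinking'].
    const info = await client.show({ model });
    const capabilities = (info as { capabilities?: string[] }).capabilities ?? [];
    return capabilities.includes('thinking');
}
```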
Protected dispatch… (Optional cancellation: CancellationToken)
Protected get…
Retrieves the settings for the chat request, merging the request-specific settings with the default settings.
Parameter: the language model request containing specific settings.
Returns: a partial ChatRequest object containing the merged settings.
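A minimal sketch of the merge described above, assuming the 'ollama' package's ChatRequest type and a request object that carries an optional settings bag. The method name, the default values, and the full name of the DEFAULT_… property are assumptions; request-specific settings take precedence over the defaults.

```ts
import type { ChatRequest } from 'ollama';

// Assumed defaults; the documented DEFAULT_… property's full contents are not
// shown in this excerpt.
const DEFAULT_REQUEST_SETTINGS = { keep_alive: '15m', options: {} };

// Merge the request-specific settings over the defaults into a partial ChatRequest.
function getSettings(request: { settings?: Record<string, unknown> }): Partial<ChatRequest> {
    return {
        ...DEFAULT_REQUEST_SETTINGS,
        ...(request.settings ?? {})
    } as Partial<ChatRequest>;
}
```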
Protected handle… (Optional cancellation: CancellationToken)
Protected handle… (Optional cancellation: CancellationToken)
Protected handle…
Protected initialize… (Optional cancellationToken: CancellationToken)
Protected to…
Protected to…
Protected to…
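Several of the methods above accept an optional cancellation: CancellationToken. The sketch below shows one way such a token can be wired to the Ollama client, assuming the 'ollama' JavaScript client and Theia's CancellationToken; it is illustrative, not the documented implementation, and the function name is invented.

```ts
import { Ollama } from 'ollama';
import { CancellationToken } from '@theia/core';

// Illustrative wiring: abort the streamed request as soon as cancellation is requested.
async function streamCompletion(
    client: Ollama,
    model: string,
    prompt: string,
    cancellation?: CancellationToken
): Promise<string> {
    cancellation?.onCancellationRequested(() => client.abort());
    let text = '';
    const stream = await client.generate({ model, prompt, stream: true });
    try {
        for await (const chunk of stream) {
            // Each chunk carries the next piece of the generated response text.
            text += chunk.response;
        }
    } catch (err) {
        // client.abort() makes the iterator throw; treat that as a cancelled request.
        if (!cancellation?.isCancellationRequested) {
            throw err;
        }
    }
    return text;
}
```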
See also: VS Code's ILanguageModelChatMetadata.