Properties

proxy?: string (Optional)
reasoningSupport?: ReasoningSupport (Optional)
DEFAULT_… (Protected, Readonly)
host (Protected)
id (Readonly)
    The unique id for this language model. It will be used to identify the model in the UI.
model (Protected, Readonly)
    The unique model name as used in the Ollama environment.
provider (Readonly)
reasoning (Optional)
vendor (Readonly)

Methods

check… (Protected)
    Check if the Ollama server supports thinking.
    Uses the Ollama 'show' request to retrieve information about the model, so that its capability list can be checked for the 'thinking' capability.

    Parameters:
        The Ollama client instance.
        The name of the Ollama model.

    Returns:
        A boolean indicating whether the Ollama model supports thinking.
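The capability check described above can be sketched as follows. This is a minimal, hypothetical illustration: the `OllamaClientLike` interface only mirrors the shape of the 'show' endpoint, and the `capabilities` field name is an assumption based on recent Ollama `/api/show` responses, not taken from this class.

```typescript
// Sketch: probe an Ollama server for 'thinking' support via the 'show' request.
// The 'capabilities' field is an assumption about the Ollama show response.
interface ShowResponse {
  capabilities?: string[];
}

interface OllamaClientLike {
  show(request: { model: string }): Promise<ShowResponse>;
}

async function supportsThinking(client: OllamaClientLike, model: string): Promise<boolean> {
  try {
    const info = await client.show({ model });
    // The model supports thinking if its capability list contains 'thinking'.
    return info.capabilities?.includes('thinking') ?? false;
  } catch {
    // If the server or model cannot be queried, assume no thinking support.
    return false;
  }
}
```

Treating a failed 'show' request as "no thinking support" is a deliberately conservative choice here, so a missing or unreachable model never enables a feature it may not have.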
dispatch… (Protected)
    Parameters:
        cancellation?: CancellationToken (Optional)
        reasoning?: ReasoningSettings (Optional)

get… (Protected)
    Retrieves the settings for the chat request, merging the request-specific settings with the default settings.

    Parameters:
        The language model request containing specific settings.

    Returns:
        A partial ChatRequest object containing the merged settings.
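The merge described above can be sketched like this. The field names and default values are illustrative assumptions, not the actual defaults of this class; only the merge order (request-specific settings override defaults) comes from the description.

```typescript
// Sketch: merge request-specific chat settings over the defaults.
// Fields and the DEFAULT_SETTINGS value are hypothetical.
interface ChatSettings {
  temperature?: number;
  top_p?: number;
  num_ctx?: number;
}

const DEFAULT_SETTINGS: ChatSettings = { temperature: 0.7 };

function mergeSettings(requestSettings: ChatSettings = {}): ChatSettings {
  // Spread order matters: request-specific values take precedence.
  return { ...DEFAULT_SETTINGS, ...requestSettings };
}
```

For example, `mergeSettings({ top_p: 0.9 })` keeps the default `temperature` while adding the request's `top_p`.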
get… (Protected)

handle… (Protected)
    Parameters:
        cancellation?: CancellationToken (Optional)
        reasoning?: ReasoningSettings (Optional)

handle… (Protected)
    Parameters:
        cancellation?: CancellationToken (Optional)
        reasoning?: ReasoningSettings (Optional)

handle… (Protected)

initialize… (Protected)
    Parameters:
        cancellationToken?: CancellationToken (Optional)

requires… (Protected)
    Checks if the model requires effort levels instead of a boolean for think.
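The distinction between effort levels and a boolean for think can be sketched as follows. Everything here is an assumption for illustration: the model-name prefix list, the function names, and the `'low' | 'medium' | 'high'` levels are hypothetical, reflecting only the idea that some models expect an effort level where others expect a boolean.

```typescript
// Sketch: some models expect a reasoning effort level for 'think',
// others a plain boolean. The prefix list below is a hypothetical example.
type EffortLevel = 'low' | 'medium' | 'high';
type ThinkValue = boolean | EffortLevel;

const EFFORT_LEVEL_MODELS = ['gpt-oss'];

function requiresEffortLevels(model: string): boolean {
  return EFFORT_LEVEL_MODELS.some(prefix => model.startsWith(prefix));
}

function toThinkValue(model: string, enabled: boolean, effort: EffortLevel = 'medium'): ThinkValue {
  if (!enabled) {
    return false;
  }
  // Effort-level models get a level string; everyone else gets a boolean.
  return requiresEffortLevels(model) ? effort : true;
}
```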
to… (Protected)
to… (Protected)
to… (Protected)
See also: VS Code ILanguageModelChatMetadata.