Implements

  • LanguageModel

Constructors

  • new OllamaModel(id, model, host): OllamaModel

    Parameters

    • id: string

      The unique id for this language model. It will be used to identify the model in the UI.

    • model: string

      The unique model name as used in the Ollama environment.

    • host: (() => undefined | string)

        • (): undefined | string
        • Returns undefined | string

    Returns OllamaModel
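
    A minimal construction sketch based on the parameters above. The id, model name, and host resolver are illustrative values, and the import path is an assumption about where OllamaModel is exported in your setup:

        // Import path is an assumption; adjust it to where OllamaModel is exported.
        import { OllamaModel } from '@theia/ai-ollama/lib/node/ollama-language-model';

        const model = new OllamaModel(
            'ollama/llama3',               // unique id, shown in the UI
            'llama3',                      // model name as known to the Ollama server
            () => process.env.OLLAMA_HOST  // host resolver; may return undefined
        );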

Properties

DEFAULT_REQUEST_SETTINGS: Partial<Omit<ChatRequest, "stream" | "model">> = ...
host: (() => undefined | string)

Type declaration

    • (): undefined | string
    • Returns undefined | string

id: string

The unique id for this language model. It will be used to identify the model in the UI.

model: string

The unique model name as used in the Ollama environment.

providerId: "ollama" = 'ollama'
vendor: string = 'Ollama'
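
To illustrate the type of DEFAULT_REQUEST_SETTINGS above: a Partial<Omit<ChatRequest, "stream" | "model">> may carry any ChatRequest field except stream and model, with ChatRequest coming from the ollama package. The values below are an example shape only, not the defaults actually shipped with this class:

    import { ChatRequest } from 'ollama';

    // Example shape only; the concrete default values are not documented on this page.
    const exampleDefaults: Partial<Omit<ChatRequest, 'stream' | 'model'>> = {
        keep_alive: '5m',                  // keep the model loaded between requests
        options: { temperature: 0.2 }      // sampling options forwarded to Ollama
    };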

Methods

  • Parameters

    • ollama: Ollama
    • ollamaRequest: ExtendedChatRequest
    • structured: boolean
    • Optional cancellation: CancellationToken

    Returns Promise<LanguageModelResponse>

  • Retrieves the settings for the chat request, merging the request-specific settings with the default settings.

    Parameters

    • request: LanguageModelRequest

      The language model request containing specific settings.

    Returns Partial<ChatRequest>

    A partial ChatRequest object containing the merged settings.
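
    The merge described above can be pictured as spreading the request-specific settings over the defaults. This is only a hedged sketch, not the class's actual implementation; the helper name is made up, and ChatRequest is imported from the ollama package:

        import { ChatRequest } from 'ollama';

        // Sketch: request-specific settings override the defaults.
        function mergeSettings(
            defaults: Partial<Omit<ChatRequest, 'stream' | 'model'>>,
            requestSettings: Partial<ChatRequest> = {}
        ): Partial<ChatRequest> {
            return { ...defaults, ...requestSettings };
        }

    In the class itself, the defaults presumably correspond to DEFAULT_REQUEST_SETTINGS listed under Properties, while the overrides come from the incoming LanguageModelRequest.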

  • Parameters

    • ollama: Ollama
    • chatRequest: ExtendedChatRequest
    • Optional prevResponse: ChatResponse

    Returns Promise<LanguageModelResponse>

  • Parameters

    • request: LanguageModelRequest
    • Optional cancellationToken: CancellationToken

    Returns Promise<LanguageModelResponse>
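
    This signature matches the request entry point of the implemented LanguageModel interface, so a hedged usage sketch could look as follows. The method name request, the import path, and the pre-built languageModelRequest value are assumptions for illustration:

        import { CancellationTokenSource } from '@theia/core/lib/common/cancellation';

        // Assumption: OllamaModel and LanguageModelRequest are imported from their
        // respective packages; `languageModelRequest` is built by the caller.
        async function ask(model: OllamaModel, languageModelRequest: LanguageModelRequest) {
            const tokenSource = new CancellationTokenSource();
            // The token can be cancelled to abort the pending call.
            const response = await model.request(languageModelRequest, tokenSource.token);
            return response; // LanguageModelResponse; typically streamed or plain text
        }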