Interface LanguageModelChatRequestOptions

Options for making a chat request using a language model.

interface LanguageModelChatRequestOptions {
    justification?: string;
    modelOptions?: {
        [name: string]: any;
    };
    toolMode?: LanguageModelChatToolMode;
    tools?: LanguageModelChatTool[];
}

Properties

justification?: string

A human-readable message that explains why access to a language model is needed and what feature is enabled by it.

modelOptions?: {
    [name: string]: any;
}

A set of options that control the behavior of the language model. These options are specific to the language model and must be looked up in the respective model's documentation.

Type declaration

  • [name: string]: any
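As a self-contained sketch, the snippet below constructs a request-options object. The local type declaration mirrors the interface above so it compiles outside a VS Code extension host (in a real extension the type comes from the `vscode` module), and the `temperature` and `maxTokens` keys are hypothetical examples; which keys a model honors is model-specific.

```typescript
// Local mirror of the interface above, so this sketch is
// self-contained; in an extension, import the type from 'vscode'.
interface LanguageModelChatRequestOptions {
    justification?: string;
    modelOptions?: { [name: string]: any };
}

const options: LanguageModelChatRequestOptions = {
    // Shown to the user when consent for model access is requested.
    justification: 'Summarize the open file for the Quick Summary feature.',
    modelOptions: {
        // Hypothetical keys -- consult the target model's
        // documentation for the options it actually supports.
        temperature: 0.2,
        maxTokens: 512,
    },
};
```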

toolMode?: LanguageModelChatToolMode

The tool-selecting mode to use. LanguageModelChatToolMode.Auto by default.

tools?: LanguageModelChatTool[]

An optional list of tools that are available to the language model. These could be registered tools available via lm.tools, or private tools that are implemented only within the calling extension.

If the LLM requests to call one of these tools, a LanguageModelToolCallPart will appear in LanguageModelChatResponse.stream. It is the caller's responsibility to invoke the tool; if the tool is registered in lm.tools, that means calling lm.invokeTool.

Then, the tool result can be provided to the LLM by creating an Assistant-type LanguageModelChatMessage with a LanguageModelToolCallPart, followed by a User-type message with a LanguageModelToolResultPart.
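The round trip described above can be sketched with simplified local stand-ins for the vscode classes (the classes below mirror the names in the text but are not the real API, so the snippet runs on its own; in an extension, use the constructors from the `vscode` module):

```typescript
// Simplified stand-ins for the vscode classes named in the text.
class LanguageModelToolCallPart {
    constructor(
        public callId: string,
        public name: string,
        public input: object,
    ) {}
}

class LanguageModelToolResultPart {
    constructor(public callId: string, public content: unknown[]) {}
}

type ChatRole = 'Assistant' | 'User';

interface LanguageModelChatMessage {
    role: ChatRole;
    content: (LanguageModelToolCallPart | LanguageModelToolResultPart | string)[];
}

// After the stream yields a tool call, the caller invokes the tool
// (via lm.invokeTool for tools registered in lm.tools) and then
// appends two messages before sending the next request:
function appendToolRoundTrip(
    history: LanguageModelChatMessage[],
    call: LanguageModelToolCallPart,
    result: unknown[],
): LanguageModelChatMessage[] {
    return [
        ...history,
        // 1. Assistant-type message echoing the model's tool call.
        { role: 'Assistant', content: [call] },
        // 2. User-type message carrying the tool result, matched by callId.
        {
            role: 'User',
            content: [new LanguageModelToolResultPart(call.callId, result)],
        },
    ];
}

// Hypothetical tool call and result, for illustration only.
const call = new LanguageModelToolCallPart('call-1', 'getWeather', { city: 'Oslo' });
const messages = appendToolRoundTrip([], call, ['Sunny, 18 degrees']);
```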