Theia API Documentation v1.68.0

    An abstract chat agent that supports mode selection for selecting prompt variants.

    Agents extending this class define their available modes via modeDefinitions. The modes getter dynamically computes which mode is the default based on the current prompt variant settings. When a request is made with a specific modeId, that mode's prompt variant is used instead of the settings-configured default.
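
    For illustration, a subclass might provide its modes roughly as sketched below. This is a minimal sketch only: the subclass and base class names, any ChatMode fields beyond id, and the prompt variant IDs are placeholders, not part of this API listing.

    ```typescript
    // Hypothetical sketch only: the subclass and base class names, the ChatMode
    // fields beyond `id`, and the prompt variant IDs are placeholders. Imports
    // for the placeholder types are omitted.
    import { injectable } from '@theia/core/shared/inversify';

    @injectable()
    export class MyChatAgent extends AbstractModeAwareChatAgent { // placeholder base class name
        id = 'MyAgent';
        name = 'MyAgent';

        // `isDefault` is omitted: the `modes` getter derives the default mode
        // from the current prompt variant settings.
        protected override modeDefinitions: Omit<ChatMode, 'isDefault'>[] = [
            { id: 'my-agent-concise' },  // placeholder prompt variant ID
            { id: 'my-agent-detailed' }  // placeholder prompt variant ID
        ];
    }
    ```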

    Hierarchy

    Index

    Constructors

    Properties

    additionalToolRequests: ToolRequest<ToolInvocationContext>[]
    agentSpecificVariables: AgentSpecificVariables[]

    The list of local variable identifiers that can be made available to this agent during execution. These variables are context-specific and do not exist for other agents.

    This array is primarily used for documentation purposes in the AI Configuration View to show which variables can be made available to the agent. Referenced variables are NOT automatically handed over by the framework; this must be done explicitly in the agent implementation or in prompts.
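
    A hypothetical declaration is sketched below; the AgentSpecificVariables field names are assumptions and the variable itself is a placeholder.

    ```typescript
    // Hypothetical sketch, inside a chat agent subclass. The AgentSpecificVariables
    // field names are assumptions, and the variable is a placeholder. Declaring it
    // here documents it in the AI Configuration View; the agent (or a prompt) must
    // still resolve and pass the value explicitly.
    protected override agentSpecificVariables: AgentSpecificVariables[] = [
        {
            name: 'selectedText',  // placeholder variable identifier
            description: 'The text currently selected in the editor.',
            usedInPrompt: true     // assumed field
        }
    ];
    ```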

    chatToolRequestService: ChatToolRequestService
    contentMatchers: ResponseContentMatcher[]
    defaultContentFactory: DefaultResponseContentFactory
    defaultLanguageModelPurpose: string
    description: string

    A markdown description of the agent's functionality and its privacy-relevant requirements, including function call handlers that access some data autonomously.

    functions: string[]

    The list of global function identifiers that are always available to this agent during execution, regardless of whether they are referenced in prompts.

    This array is primarily used for documentation purposes in the AI Configuration View to show which functions are guaranteed to be available to the agent. Referenced functions are NOT automatically handed over by the framework; this must be done explicitly in the agent implementation.
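
    As a hypothetical example (the function IDs are placeholders):

    ```typescript
    // Hypothetical sketch, inside a chat agent subclass. The function IDs are
    // placeholders. Listing them only documents them; the agent implementation
    // must still make them available explicitly.
    protected override functions: string[] = ['getWorkspaceFileList', 'getFileContent'];
    ```

    In Theia AI prompt templates, tool functions are commonly referenced with the ~{functionId} syntax; whether that applies here depends on the concrete agent and prompt setup.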

    iconClass: string
    id: string

    Used to identify an agent, e.g. when it is requesting language models, etc.

    This parameter might be removed in favor of name. Therefore, it is recommended to set id to the same value as name for now.

    languageModelRegistry: LanguageModelRegistry
    languageModelRequirements: LanguageModelRequirement[]

    Required language models. This includes the purpose and optional language model selector arguments. See #47.
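
    A hypothetical requirement declaration is sketched below; the purpose string and the model identifier are placeholders, and identifier is assumed to be the optional selector field mentioned above.

    ```typescript
    // Hypothetical sketch, inside a chat agent subclass. The purpose string and
    // the model identifier are placeholders; `identifier` is assumed to be the
    // optional selector field.
    protected override languageModelRequirements: LanguageModelRequirement[] = [
        { purpose: 'chat', identifier: 'openai/gpt-4o' }
    ];
    ```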

    languageModelService: LanguageModelService
    locations: ChatAgentLocation[]
    logger: ILogger
    modeDefinitions: Omit<ChatMode, "isDefault">[]

    Mode definitions without the isDefault property. Subclasses must provide their specific mode definitions. Each mode's id should correspond to a prompt variant ID.

    name: string

    Human-readable name shown to users to identify the agent. Must be unique. Use short names without "Agent" or "Chat" (see tags for adding further properties).

    prompts: PromptVariantSet[]

    The prompts introduced and used by this agent.

    promptService: PromptService
    systemPromptId: undefined | string
    tags: string[]

    A list of tags to filter agents and to display capabilities in the UI.

    toolCallResponseContentFactory: ToolCallChatResponseContentFactory
    variables: string[]

    The list of global variable identifiers that are always available to this agent during execution, regardless of whether they are referenced in prompts.

    This array is primarily used for documentation purposes in the AI Configuration View to show which variables are guaranteed to be available to the agent. Referenced variables are NOT automatically handed over by the framework; this must be done explicitly in the agent implementation.

    Accessors

    Methods

    • Creates a ToolCallChatResponseContent instance from the provided tool call data.

      This method is called when parsing stream response tokens that contain tool call data. Subclasses can override this method to customize the creation of tool call response contents.

      Parameters

      Returns ChatResponseContent

      A ChatResponseContent representing the tool call.
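
      A possible override is sketched below; the method name and the tool call parameter type are assumptions, as the signature is not shown in this listing.

      ```typescript
      // Hypothetical sketch: the method name and the tool call parameter type are
      // assumptions, as the signature is not shown in this listing.
      protected override createToolCallResponseContent(toolCall: unknown): ChatResponseContent {
          // Log the raw tool call, then delegate to the default behavior.
          this.logger.debug(`Tool call received: ${JSON.stringify(toolCall)}`);
          return super.createToolCallResponseContent(toolCall);
      }
      ```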

    • Determines the effective variant ID, taking a mode override into account. If modeId is provided and is a valid variant for the prompt set, it takes precedence; otherwise, the method falls back to settings-based selection.

      Parameters

      • Optional modeId: string

      Returns undefined | string

    • Returns undefined | { [key: string]: unknown }

      The settings, such as temperature, to be used in all language model requests. Returns undefined by default.
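
      A subclass could override this to pin request settings, as sketched below; the method name is an assumption, as only its return type is shown in this listing.

      ```typescript
      // Hypothetical sketch: the method name is an assumption; only its return
      // type is shown in this listing.
      protected override getLlmSettings(): { [key: string]: unknown } | undefined {
          // Ask for near-deterministic completions in every language model request.
          return { temperature: 0 };
      }
      ```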

    • Invoked after the response from the LLM has completed successfully.

      The default implementation sets the state of the response to complete. Subclasses may override this method to perform additional actions or keep the response open for processing further requests.

      Parameters

      Returns Promise<void>
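
      One way a subclass might keep the response open is sketched below; the hook name and its parameter type are assumptions.

      ```typescript
      // Hypothetical sketch: the hook name and its parameter type are assumptions.
      protected override async onResponseComplete(request: MutableChatRequestModel): Promise<void> {
          // Deliberately not calling super, which would set the response state to
          // complete; the response stays open for processing further requests.
          this.logger.info('Keeping chat response open for follow-up processing');
      }
      ```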