Theia API Documentation v1.65.0
    interface OpenAiModelDescription {
        apiKey: undefined | string | true;
        apiVersion: undefined | string | true;
        deployment?: string;
        developerMessageSettings?:
            | "user"
            | "system"
            | "developer"
            | "mergeWithFollowingUserMessage"
            | "skip";
        enableStreaming: boolean;
        id: string;
        maxRetries: number;
        model: string;
        supportsStructuredOutput: boolean;
        url?: string;
    }

    Properties

    apiKey: undefined | string | true

    The API key for the model. If 'true' is provided, the global OpenAI API key will be used.

    apiVersion: undefined | string | true

    The version of the API. If 'true' is provided, the global OpenAI API version will be used.

    deployment?: string

    Optional deployment name for Azure OpenAI.
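
    A hypothetical description for an Azure OpenAI deployment might combine deployment, apiVersion, and url as sketched below. The resource name, deployment name, and version string are placeholders, not values taken from this documentation.

    const azureExample: OpenAiModelDescription = {
        id: 'azure-gpt-4o',
        model: 'gpt-4o',
        url: 'https://my-resource.openai.azure.com',   // placeholder Azure endpoint
        deployment: 'my-gpt-4o-deployment',            // placeholder deployment name
        apiVersion: '2024-06-01',                      // placeholder API version
        apiKey: true,                                  // reuse the global OpenAI API key
        enableStreaming: true,
        maxRetries: 3,
        supportsStructuredOutput: true
    };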

    developerMessageSettings?:
        | "user"
        | "system"
        | "developer"
        | "mergeWithFollowingUserMessage"
        | "skip"

    Configures the developer message of the model. Setting this property to 'user', 'system', or 'developer' uses that string as the role for the system message. Setting it to 'mergeWithFollowingUserMessage' prefixes the following user message with the system message, or converts the system message to a user message if the following message is not a user message. 'skip' removes the system message altogether. Defaults to 'developer'.
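
    To make these rules concrete, the sketch below shows one way the setting could be applied to an outgoing message list. The types and the function are illustrative only and are not part of Theia's API.

    type DeveloperMessageSetting = 'user' | 'system' | 'developer' | 'mergeWithFollowingUserMessage' | 'skip';
    interface ChatMessage { role: string; content: string; }

    function applyDeveloperMessageSetting(
        systemMessage: ChatMessage,
        rest: ChatMessage[],
        setting: DeveloperMessageSetting
    ): ChatMessage[] {
        switch (setting) {
            case 'skip':
                // Remove the system message altogether.
                return rest;
            case 'mergeWithFollowingUserMessage': {
                const [next, ...tail] = rest;
                if (next?.role === 'user') {
                    // Prefix the following user message with the system message.
                    return [{ role: 'user', content: `${systemMessage.content}\n${next.content}` }, ...tail];
                }
                // Otherwise convert the system message to a user message.
                return [{ role: 'user', content: systemMessage.content }, ...rest];
            }
            default:
                // 'user', 'system', or 'developer': use that string as the role.
                return [{ role: setting, content: systemMessage.content }, ...rest];
        }
    }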

    enableStreaming: boolean

    Indicates whether the streaming API should be used.

    id: string

    The identifier of the model, which will be shown in the UI.

    maxRetries: number

    Maximum number of retry attempts when a request fails. Default is 3.

    model: string

    The model ID as used by the OpenAI API.

    supportsStructuredOutput: boolean

    Flag to configure whether the OpenAI model supports structured output. Default is true.

    url?: string

    The OpenAI-API-compatible endpoint where the model is hosted. If not provided, the default OpenAI endpoint will be used.
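
    Putting the properties together, a minimal description for a self-hosted, OpenAI-compatible endpoint might look as follows. All values are hypothetical; only the shape is taken from the interface above.

    const localExample: OpenAiModelDescription = {
        id: 'local-llama',                        // identifier shown in the UI
        model: 'llama-3.1-8b-instruct',           // model ID as used by the API
        url: 'http://localhost:8000/v1',          // custom endpoint; omit to use the default OpenAI endpoint
        apiKey: 'no-key-required',                // a literal key, or true to reuse the global OpenAI API key
        apiVersion: undefined,                    // only relevant for Azure OpenAI
        developerMessageSettings: 'user',         // e.g. for servers that do not accept the 'developer' role
        enableStreaming: true,
        maxRetries: 3,
        supportsStructuredOutput: false
    };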