Models

Use this page to set up custom models for AI Assistant features and to configure local models for offline mode.

Third-party AI providers

Enable LM Studio

Specify the URL to connect to the local LM Studio service.

For additional information, refer to Connect AI Assistant chat to your local LLM.
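
Before pointing AI Assistant at LM Studio, it can help to confirm that the local server is actually reachable. The following is a minimal sketch, assuming LM Studio's default OpenAI-compatible server address http://localhost:1234 (the port is configurable in LM Studio, so adjust it to match your setup):

```python
import json
import urllib.request

# Assumed default address of the LM Studio local server;
# adjust it if you changed the port in LM Studio's server settings.
LM_STUDIO_URL = "http://localhost:1234"

def list_lm_studio_models(base_url: str = LM_STUDIO_URL) -> list[str]:
    """Return the IDs of models exposed by LM Studio's OpenAI-compatible API."""
    with urllib.request.urlopen(f"{base_url}/v1/models", timeout=5) as resp:
        payload = json.load(resp)
    return [model["id"] for model in payload.get("data", [])]

if __name__ == "__main__":
    # Prints the loaded models, e.g. ['qwen2.5-7b-instruct'].
    print(list_lm_studio_models())
```

If the call fails with a connection error, start the server in LM Studio first, then enter the same URL in this setting.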

Enable Ollama

Specify the URL to connect to the local Ollama service.

For additional information, refer to Connect AI Assistant chat to your local LLM.
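
The same kind of connectivity check works for Ollama. A minimal sketch, assuming Ollama's default address http://localhost:11434; its /api/tags endpoint lists the models you have pulled locally:

```python
import json
import urllib.request

# Assumed default address of the local Ollama service.
OLLAMA_URL = "http://localhost:11434"

def list_ollama_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Return the names of locally available Ollama models."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        payload = json.load(resp)
    return [model["name"] for model in payload.get("models", [])]

if __name__ == "__main__":
    # Prints the pulled models, e.g. ['llama3.1:8b', 'codellama:7b'].
    print(list_ollama_models())
```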

Local models

Core features

Select the custom local model to use for in-editor code generation, commit message generation, chat responses, and other core features. To confirm that a selected model responds, see the sketch at the end of this section.

Instant helpers

Select the custom local model to use for lightweight features, such as name suggestions.

Offline mode

Enable this setting to prioritize local models over cloud-based ones.

For additional information, refer to Switch to offline mode.
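
Once a local model is selected, you may want to verify that it responds to requests at all. The sketch below sends a single non-streaming request directly to Ollama's /api/generate endpoint. The model name llama3.1 is a placeholder for whatever model you selected above, and this illustrates only a direct API call, not AI Assistant's internal protocol:

```python
import json
import urllib.request

# Assumed default address of the local Ollama service.
OLLAMA_URL = "http://localhost:11434"

def generate(prompt: str, model: str = "llama3.1") -> str:
    """Send one non-streaming generation request to the local Ollama service."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    # Local generation can be slow on first load, so allow a generous timeout.
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    print(generate("Suggest a name for a function that parses ISO-8601 dates."))
```

If this returns a completion, the model is loaded and ready for AI Assistant to use in offline mode.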
