Switch to offline mode
If you want AI Assistant to work locally, you can switch from using predefined cloud-based models to custom local models provided by third parties. In this case, most cloud model calls will be blocked, and all AI-related features will rely on the local models instead.
Before switching to offline mode, you need to set up custom models from third-party AI providers, which will be used to process your requests instead of cloud-based ones.
Click the JetBrains AI widget located in the toolbar in the window header.
Hover over the Offline Mode option and click Set Up Models.
Alternatively, you can go to .
In the Third-party AI providers section, select your LLM provider, specify your local host URL, and click Test Connection.
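Before clicking Test Connection, it can help to confirm that your local server is actually listening at the host URL you entered. The sketch below is a minimal, hypothetical reachability check; the URL shown is Ollama's default local address, which is an assumption, and this is not part of any AI Assistant API.

```python
# Hypothetical connectivity check for a local LLM server.
# The endpoint URL below is an assumption (Ollama's default port),
# not something defined by AI Assistant itself.
import urllib.request
import urllib.error


def is_provider_reachable(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if anything answers at the given local host URL."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout):
            return True
    except urllib.error.HTTPError:
        # The server responded, even if with an error status code,
        # so something is listening at this address.
        return True
    except (urllib.error.URLError, OSError):
        # Connection refused, timed out, or host unreachable.
        return False


# Example: check Ollama's default local address (assumed).
print(is_provider_reachable("http://localhost:11434"))
```

If this returns False, start your local model server first; Test Connection in the IDE will fail for the same reason otherwise.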
Once you have configured your third-party AI provider, specify the local models to be used for core and lightweight features in the Local models section.
Core features – this model will be used for in-editor code generation, commit message generation, chat responses, and other core features.
Instant helpers – this model will be used for lightweight features, such as name suggestion.
Enable the Offline mode setting if you want to switch to offline mode right away.
Click Save Changes.
Once you have finished the setup, you can toggle offline mode on and off at any time:
Click the JetBrains AI widget located in the toolbar in the window header.
Hover over the Offline Mode option and click Enable or Disable.