AI Assistant Help

Supported LLMs

AI Assistant supports a variety of advanced cloud-based LLMs, as well as the option to use locally hosted models. This flexibility allows you to choose the most suitable model for your specific task. For example, you might want to use large models for complex codebase-related tasks, compact models for quick responses, or local models if you prefer to keep your data private.

Cloud-based models

The table below lists models available for selection in AI Assistant:

Model                        | Capabilities | Model context window
Anthropic Claude 4.5 Sonnet  |              | 200k
Anthropic Claude 4.5 Haiku   |              | 200k
Anthropic Claude 4.1 Opus    |              | 200k
Anthropic Claude 4 Sonnet    |              | 200k
Anthropic Claude 3.7 Sonnet  |              | 200k
Anthropic Claude 3.5 Sonnet  |              | 200k
Anthropic Claude 3.5 Haiku   |              | 200k
Google Gemini 2.5 Pro        |              | 1M
Google Gemini 2.5 Flash      |              | 1M
Google Gemini 2.5 Flash-Lite |              | 1M
Google Gemini 2.0 Flash      |              | 1M
OpenAI GPT-5                 |              | 400k [1]
OpenAI GPT-5 mini            |              | 400k [1]
OpenAI GPT-5 nano            |              | 400k [1]
OpenAI GPT-4.1               |              | 1M
OpenAI GPT-4.1 mini          |              | 1M
OpenAI GPT-4.1 nano          |              | 1M
OpenAI GPT-4o                |              | 128k
OpenAI o1                    |              | 200k
OpenAI o3                    |              | 200k
OpenAI o3-mini               |              | 200k
OpenAI o4-mini               |              | 200k

Supported models history

The following table lists AI models that have been available in AI Assistant, along with the plugin versions in which they were introduced or removed.

Model                        | Added in version   | Removed in version
Anthropic Claude 4.5 Sonnet  | 2025.2.x           |
Anthropic Claude 4.5 Haiku   | 2025.2.x           |
Anthropic Claude 4.1 Opus    | 2025.2.x           |
Anthropic Claude 4 Sonnet    | 251.26094.80.x     |
Anthropic Claude 3.7 Sonnet  | 243.23654.270      |
Anthropic Claude 3.5 Sonnet  | 243.23654.270      |
Anthropic Claude 3.5 Haiku   | 243.23654.270      |
Google Gemini 2.5 Pro        | 251.23774.42.x     |
Google Gemini 2.5 Flash      | 251.23774.42.28.x  |
Google Gemini 2.5 Flash-Lite | 251.26094.80.19    |
Google Gemini 2.0 Flash      | 243.23654.270      |
Google Gemini 1.5 Pro        | 2024.3             | 251.23774.42.28.x
Google Gemini 1.5 Flash      | 2024.3             | 251.23774.42.28.x
OpenAI GPT-5                 | 2025.2             |
OpenAI GPT-5 mini            | 2025.2             |
OpenAI GPT-5 nano            | 2025.2             |
OpenAI GPT-4.1               | 251.23774.42.28.x  |
OpenAI GPT-4.1 mini          | 251.23774.42.28.x  |
OpenAI GPT-4.1 nano          | 251.23774.42.28.x  |
OpenAI GPT-4o                | 2024.2             |
OpenAI o4-mini               | 251.23774.42.28.x  |
OpenAI o3                    | 251.23774.42.28.x  |
OpenAI o3-mini               | 243.23654.270      |
OpenAI o1                    | 243.23654.270      |
OpenAI o1-mini               | 243.23654.270      | 251.23774.42.28.x

Local models

AI Assistant supports a selection of models available through Ollama, LM Studio, and other OpenAI-compatible endpoints such as llama.cpp or LiteLLM. These models run entirely on your machine, providing AI capabilities without the need for cloud access.
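If you want to verify that a local server actually exposes the OpenAI-compatible API before pointing AI Assistant at it, a quick request to its /v1/models endpoint is usually enough. The sketch below is only an illustration, not part of AI Assistant itself; it assumes Ollama's default address (http://localhost:11434/v1), while LM Studio typically listens on port 1234 and llama.cpp's server on 8080, so adjust the base URL for your setup.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Base URL of the local OpenAI-compatible server (assumption: Ollama's default).
// LM Studio usually serves on http://localhost:1234/v1 and
// llama.cpp's server on http://localhost:8080/v1 -- adjust for your setup.
const val BASE_URL = "http://localhost:11434/v1"

fun main() {
    val client = HttpClient.newHttpClient()

    // GET /models is part of the OpenAI-compatible API surface and
    // returns the models the local server currently exposes.
    val request = HttpRequest.newBuilder()
        .uri(URI.create("$BASE_URL/models"))
        .GET()
        .build()

    val response = client.send(request, HttpResponse.BodyHandlers.ofString())
    println("HTTP ${response.statusCode()}")
    println(response.body()) // JSON list of model ids, e.g. {"object":"list","data":[...]}
}
```

A successful response confirms that the endpoint speaks the OpenAI-compatible API; the same base URL is then typically what you enter when configuring the local provider connection.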

The default model context window for local models is set to 64 000 tokens. If needed, you can adjust this value in the settings.

For more information about setting up local models, refer to Use local models.

Last modified: 23 October 2025