Read the Product data collection notice for more details.
Only if you explicitly allow it. If you choose to share data with us, this data is used exclusively to improve JetBrains tools and train our own models, like Mellum, and it is never shared with third parties.
For companies that are unwilling or, for legal reasons, unable to opt in to the program, detailed data collection is disabled by default, and their admins remain in full control.
Admin settings in JetBrains Accounts:
Levels of protection:
For individuals using JetBrains IDEs with commercial licenses, free trials, free community licenses, or EAP builds who do not explicitly consent to the new data collection model, we do not collect detailed code-related data.
If you’re using a non-commercial license, detailed code-related data collection is enabled by default. You can disable it in the IDE settings via Settings | Appearance & Behavior | System Settings | Data Sharing.
You can configure your data sharing settings separately for each JetBrains IDE and each machine you work on, with further customized configuration possible for non-commercial and other types of licenses. Note, however, that organization-level data sharing settings take precedence over user-level settings.
To protect your privacy, we don’t link detailed data sharing settings to your JetBrains Account, so you need to adjust preferences for each device and each JetBrains product you use.
Users with a non-commercial license who opted out of detailed data sharing on one machine will need to opt out again on any new machines they add to their network. JetBrains displays clear and timely notifications of this requirement when additional machines are added.
JetBrains is also considering implementing project-level data sharing settings.
Yes. Any data you share will be handled responsibly, in full compliance with all EU data protection laws. This data is used exclusively to improve JetBrains tools and train our own models, like Mellum, and it is never exposed to third parties.
No. We will never sell your data or share it with third parties. It is used only for product analytics, model evaluation, and training our own models.
Your contributions directly improve the tools you use every day. Data helps us detect unsafe code, reduce false positives, improve completions, and make AI features adapt better to real-world workflows.
Anonymous telemetry tells us how features are used, but not how AI behaves on real tasks. Detailed data (such as edit history, along with AI prompts and responses) shows where the AI succeeds or fails, which is critical for making it more reliable for professional development.
We use filtering, anonymization, and strict access controls to avoid using sensitive information for LLM training. Our processes are continuously audited, and we treat your code as intellectual property that must remain protected.