AI Skills · April 23, 2026 · 4 min read

Your Jira Data Is Training Atlassian's AI — Unless You're Enterprise Tier

Atlassian will start training AI on Jira and Confluence data in August 2026. Most plans are opted in by default. Here's what that means for your team and what to actually do about it.

By Forge Team

Starting August 2026, Atlassian will train its AI models on your Jira tickets and Confluence pages. If your team runs on a Free, Standard, or Premium plan, you are opted in by default. Full opt-out requires Enterprise tier, which costs more than most teams will pay just for the control it provides.

What Atlassian announced

On April 17, Atlassian confirmed that AI training on customer data will begin in August 2026 across Jira, Confluence, and connected products. Free and Standard users have no opt-out mechanism. Premium users get limited controls. Enterprise customers can fully disable it. The announcement surfaced on Hacker News on April 19 and reached 530 points — a signal that it landed badly with people who actually read terms of service, even if most business users have not seen it yet.

This is not a data breach. Atlassian is not selling your information. But the training relationship means your sprint planning notes, product specs, engineering decisions, and internal documentation become part of the corpus that shapes how Atlassian's AI features respond — for your team and for everyone else on the platform.

What to think about Monday

Atlassian is a symptom of a wider pattern. Many of the SaaS tools teams use for work have added AI features in the last two years. Most of those features are opt-in. The training data policies behind them are often opt-out — or unavailable below certain plan tiers.

Teams rarely audit the tools they use daily for this. They click through updated terms during a routine software prompt and move on. The useful skill is not blanket suspicion of AI products. It is knowing what you have agreed to and making a deliberate call about whether it works for your team and your clients.

What it looks like when you haven't checked

A product manager at a 35-person SaaS startup tracks everything in Jira: sprint planning, bug backlogs, customer feature requests. Some of those feature requests come from enterprise clients under NDAs. The tickets are detailed — they name clients, describe workflows, quantify business problems. Nobody on the product team would share that documentation externally without clearing it with legal first.

Under Atlassian's updated policy, that backlog becomes training data in August unless the team moves to Enterprise tier or migrates to a different tool. The question is not whether to stop using Jira. It is whether the team knows this is happening, has decided it is acceptable, and has checked whether any client agreements create a conflict. A twenty-minute review with whoever owns data agreements at the company is what this actually requires.

Run a permissions audit on the AI tools your team uses.

A different version of the same problem

An HR director at a 180-person professional services firm uses Confluence to store the organization's policies, interview scorecards, performance review templates, and compensation bands. None of this is illegal to store there. Most of it is sensitive enough that she would not want it appearing in training data that improves Atlassian's AI features across its entire customer base.

Her immediate question should not be whether Atlassian will misuse the data. It should be whether her company's data governance policy covers this situation — and whether employees whose information appears in those documents would consider this within scope of what they agreed to when they joined. The answer to both may well be yes. But discovering the answer in September, after training has already started, is the wrong time.

Assess the data risk before your tools share what you haven't reviewed.

The actual decision

List the tools your team uses for work. Check whether each trains AI on your data and what opt-out controls exist at your current plan tier. Then decide whether the cost of upgrading, migrating, or restricting what you document in those tools is worth the control it buys you.
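That audit can be as simple as a shared spreadsheet, but the logic is worth making explicit. The sketch below models one row per tool and flags anything that trains on your data with no opt-out at your current tier. The fields, tool names, and plan details are illustrative assumptions, not vendor facts — verify each against the vendor's actual terms. (The Jira Standard row reflects the policy described above: training on by default, no opt-out below Enterprise.)

```python
from dataclasses import dataclass

@dataclass
class ToolPolicy:
    # Illustrative fields -- fill these in from each vendor's published terms.
    name: str
    trains_on_data: bool    # does the vendor train AI models on your content?
    opt_out_available: bool # can you opt out at your current plan tier?
    opt_out_tier: str       # lowest tier where a full opt-out exists

def needs_review(tool: ToolPolicy) -> bool:
    """Flag tools that train on your data and offer no opt-out at your tier."""
    return tool.trains_on_data and not tool.opt_out_available

# Hypothetical inventory for a small team; the wiki entry is invented.
inventory = [
    ToolPolicy("Jira (Standard)", trains_on_data=True,
               opt_out_available=False, opt_out_tier="Enterprise"),
    ToolPolicy("Self-hosted wiki", trains_on_data=False,
               opt_out_available=True, opt_out_tier="n/a"),
]

flagged = [t.name for t in inventory if needs_review(t)]
print(flagged)  # the tools to escalate to legal or the data-agreement owner
```

Anything the filter flags is a conversation to have before August, not a tool to abandon on the spot.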

For most teams, the current setup will be acceptable once they understand it — but that is a different conclusion from not knowing the setup exists.
