Granola's "private by default" notes are visible to anyone with a link and used to train the company's AI unless you dig through settings to opt out.
The Summary
- Granola, an AI meeting note-taker, defaults to link-shareable notes and uses your content for AI training unless you manually opt out
- The gap between "private by default" marketing and the actual default settings exposes how loosely agent-economy companies are defining privacy
- Users storing meeting notes with clients, internal strategy, or competitive intel may have exposed sensitive data without realizing it
The Signal
The Granola privacy gap reveals the fundamental tension in the agent economy: these tools only get smarter by eating your data, yet most users assume "private" means actually private. Granola's model is the textbook growth-stage AI playbook: integrate with the calendar, capture audio, generate structured output, then use that output to train better models. The value loop is clean. But the disclosure loop is murky.
"Private by default" in Granola's case means notes live in your account, not that they're access-controlled. Anyone with a link can view them. That's not private in any traditional sense. It's unlisted. The distinction matters because unlisted content can leak through forwarded links, compromised email accounts, or simple human error. If you're taking notes on customer calls, M&A discussions, or product roadmaps, unlisted is nowhere near private enough.
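The unlisted-versus-private distinction comes down to what the server checks before serving a note. A minimal sketch (purely illustrative, assuming nothing about Granola's actual implementation) makes the gap concrete: an unlisted note treats possession of the link as the only credential, while a genuinely private note also requires the caller to be on an access list.

```python
import secrets

# In-memory store: note_id -> {"body": str, "acl": set of allowed users}.
# All names here are hypothetical, for illustration only.
notes = {}

def create_note(body, allowed_users):
    note_id = secrets.token_urlsafe(16)  # unguessable link token
    notes[note_id] = {"body": body, "acl": set(allowed_users)}
    return note_id

def fetch_unlisted(note_id):
    # "Anyone with a link": holding the URL is the only credential.
    # A forwarded email or compromised inbox leaks full access.
    note = notes.get(note_id)
    return note["body"] if note else None

def fetch_private(note_id, user):
    # Access-controlled: the link alone is not enough; the caller
    # must also appear on the note's access list.
    note = notes.get(note_id)
    if note and user in note["acl"]:
        return note["body"]
    return None
```

The failure mode follows directly: if a link to an unlisted note is forwarded, `fetch_unlisted` hands over the content to whoever clicks it, while `fetch_private` still refuses anyone not on the ACL.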
The AI training opt-out is buried in settings. This is the pattern now: companies know users want privacy, so they claim it in marketing, then configure defaults to maximize data collection. Granola uses your notes for internal AI training to improve their models. That's reasonable if disclosed clearly. But when the disclosure is friction and the data collection is frictionless, the power dynamic is obvious.
This is bigger than one note-taking app. Every AI agent touching business workflows faces this trade-off. Better models need training data. Users need privacy. The companies threading this needle honestly, with opt-in defaults and clear disclosure, will build trust. The ones hiding behind "private by default" while defaulting to data collection will eventually face backlash or regulation or both.
The Implication
If you use Granola, go to your settings now and change the defaults. If you build AI tools, understand that user trust is a moat, and erosion happens fast. The agent economy runs on data, but it can't run on betrayed expectations. Companies that treat "private" as marketing fluff rather than an engineering requirement are building on sand. Watch for the backlash here. It's coming.
Source: The Verge AI