Google's Gemini just got the feature ChatGPT shipped 18 months ago, and that tells you everything about who's building the agent layer and who's playing catch-up.
The Summary
- Google launched "notebooks" for Gemini, letting users organize files, conversations, and instructions in project-specific containers that provide context to the AI
- This is functionally identical to ChatGPT's Projects feature from 2024, down to the core value proposition
- The real tell: Google frames these as "personal knowledge bases shared across Google products," which sounds like an integration strategy they're still figuring out
The Signal
Google positioned notebooks as persistent context containers where Gemini can reference files, past conversations, and custom instructions while you work. The pitch is solid: project-based AI context actually matters when you're trying to build anything that requires memory across sessions.
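For intuition, a project-scoped context container can be sketched as a small data structure that bundles instructions, files, and conversation history into one prompt context. This is an illustrative sketch only; the `Notebook` class and its fields are hypothetical, not Google's or OpenAI's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Notebook:
    """Hypothetical project-scoped context container (illustrative only)."""
    name: str
    instructions: str = ""
    files: dict = field(default_factory=dict)     # filename -> content
    history: list = field(default_factory=list)   # (role, message) pairs

    def add_file(self, filename, content):
        self.files[filename] = content

    def record(self, role, message):
        self.history.append((role, message))

    def build_context(self):
        """Assemble everything the model should see for this project."""
        parts = [f"# Project: {self.name}", f"Instructions: {self.instructions}"]
        parts += [f"## File: {name}\n{content}" for name, content in self.files.items()]
        parts += [f"{role}: {message}" for role, message in self.history]
        return "\n\n".join(parts)

# The container persists across sessions, so each new request starts
# with the accumulated project state instead of a blank slate.
nb = Notebook("launch-plan", instructions="Answer tersely.")
nb.add_file("notes.md", "Ship Friday.")
nb.record("user", "When do we ship?")
print(nb.build_context())
```

The point of the sketch is the shape, not the details: whoever owns this container owns what the model sees, which is why the feature matters strategically.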
But here's what matters: ChatGPT shipped Projects in 2024. OpenAI spent the last year iterating on persistent context while Google was still deciding whether Gemini should be a product or a brand strategy. This isn't about one feature. It's about velocity.
The only interesting angle is Google's hint about syncing notebooks "across Google products." If that means your Gemini notebook pulls context from Docs, Sheets, and Drive without you manually feeding it, that's genuinely useful infrastructure, and it plays to Google's home-field advantage. But the announcement doesn't say that. It says "starting in Gemini," which reads like they haven't built the rest yet.
The agent economy runs on context. Whoever builds the best context layer wins the orchestration layer. Right now, that race isn't close.
The Implication
If you're building on AI infrastructure, watch what ships, not what gets announced. Google has distribution and data advantages that could matter once they figure out product velocity. Until then, the builders shipping agents that remember, learn, and act across sessions aren't waiting for Google to catch up. They're using what works today.
Source: The Verge AI