The frictionless AI future everyone's building toward might be the thing that breaks us.
The Summary
- University of Toronto psychologists argue in Communications Psychology that AI making tasks too easy strips out the cognitive struggle that actually makes us smarter and gives work meaning.
- "Desirable difficulties" in learning, work friction, relationship negotiation: all the hard parts we're automating away are where humans actually develop skills and find purpose.
- The piece asks what happens when AI prioritizes perfect outputs over the messy, effortful process that builds competence and connection.
The Signal
This isn't another hand-wringing piece about AI taking jobs. It's sharper than that. Emily Zohar and her co-authors at the University of Toronto are pointing at something most people building AI tools haven't seriously considered: what if the friction we're removing is load-bearing?
The paper centers on "desirable difficulties," a concept from cognitive psychology showing that struggle during learning actually strengthens memory and understanding. When you wrestle with a problem, make mistakes, iterate, and eventually crack it, you don't just get the answer. You build mental models that transfer to new situations. You develop judgment. Remove that struggle with instant AI-generated solutions, and you get the output without the formation.
This maps directly onto the agent economy we're racing to build. Every productivity tool promises to "handle the grunt work" so humans can "focus on strategy." But what if the grunt work is where junior analysts learn to spot patterns? What if the tedious back-and-forth of relationship building is where trust actually forms? The authors worry we're designing systems that optimize for outcomes while gutting the processes that make humans capable and work meaningful.
The timing matters. We're at the inflection point where AI agents can genuinely produce polished work products, hold extended conversations, and write functional code. The technology finally matches the promise. But Zohar's crew is asking whether "frictionless" is even the right design goal. They're not arguing against AI. They're arguing for AI that preserves what psychologists call "effortful engagement": the cognitive load that isn't waste but weight training for the brain.
The Implication
If you're building AI tools or deploying them at scale, the question isn't just "can this automate the task?" It's "should this specific friction be removed?" Not all difficulty is desirable, but some clearly is. The challenge is designing systems that eliminate genuine drudgery while preserving the struggle that builds competence. Watch for this lens to reshape product design as the psychological costs of frictionless AI become harder to ignore.
Source: IEEE Spectrum AI