The CEO of Grammarly's parent company sat down for an interview about the time his product cloned journalists without asking, and the conversation got uncomfortable.

The Summary

  • Superhuman (formerly Grammarly) shipped an "Expert Review" feature that cloned journalists' personas for AI writing suggestions without permission, sparking a class action lawsuit
  • The company killed the feature after backlash, but CEO Shishir Mehrotra still showed up to defend the thinking behind it
  • This is what happens when AI companies treat human expertise as training data instead of asking first

The Signal

Grammarly launched Expert Review in August 2024, letting users get writing feedback from AI versions of real journalists and writers. The Verge, Julia Angwin, and others discovered their names and expertise had been productized without consent. Angwin filed a class action lawsuit. Grammarly responded with an email opt-out, then pulled the feature entirely.

What makes this worth watching: this wasn't some scrappy startup that didn't know better. Grammarly is a mature product with millions of users. Mehrotra ran product at YouTube and sits on Spotify's board. These are people who understand platform dynamics, creator economics, and IP rights. They still shipped a feature that treated human expertise as an extractable resource.

The interviewer notes the conversation "got tense" and that they "disagree about how extractive AI feels for people." That phrase, "extractive AI," is the whole game right now. Every AI company is making a bet about what it can take without asking. Some are scraping the open web. Some are cloning voices. Grammarly cloned professional reputations.

The pattern: ship first, apologize later, claim you're learning. But when you're building products that impersonate real people, "learning" means using someone else's career as your beta test.

The Implication

If you're a journalist, writer, or knowledge worker whose expertise is legible to AI, expect more of this. Companies will clone your style, your analysis, and your reputation, and call it innovation. The legal system is catching up (see Angwin's lawsuit), but until then, the default posture is extraction. Watch for opt-out schemes that put the burden on you to protect what's yours. The better companies will ask first. Most won't.


Source: The Verge AI