Women are landing modeling gigs that pay them to take 100 video calls a day, not knowing their faces are being deepfaked into romance scams in real time.

The Signal

Telegram channels are packed with job listings for "AI face models." The work seems straightforward: show up on video calls, smile, nod, follow a script. The pay varies but the volume is high, sometimes 100 calls per day. What these women don't always realize is they're the human anchor for AI-powered scams. Scammers record their faces and voices during legitimate-seeming calls, then feed that data into deepfake systems that can mimic them in real time on subsequent calls with marks. The model thinks she's doing customer service or promotional work. The scammer is using her face to run pig butchering schemes, romance fraud, or investment cons.

WIRED found dozens of these listings across Telegram. The jobs target women specifically because romance scams convert better with female faces. The models are often from countries where this kind of remote work pays well relative to local wages. They're not always told what their likeness will be used for after the initial calls. Some listings are explicit about the AI component. Others bury it or leave it vague.

This isn't theoretical. The infrastructure is live and hiring. The scams are running now. We've crossed from "AI could enable new fraud" to "AI fraud has a labor market with job postings and recruitment funnels."

The Implication

If you're hiring remote workers for video-based roles, you need to audit what happens to their biometric data. If you're building agent tools that use video or voice, assume bad actors are three steps ahead of your use case. The tell here is volume: 100 calls a day isn't customer service, it's data collection. Watch for hiring patterns that prioritize appearance and availability over skills. That's the shape of this particular con.


Sources: Wired AI