Robinhood tried to tokenize private equity without asking the companies first, and now Vlad Tenev has to explain why that seemed like a good idea.

The Signal

A year ago, Robinhood announced plans to let retail traders buy tokenized shares of private companies like OpenAI and SpaceX. The pitch was pure Web3 promise: democratize access to hot pre-IPO deals that VCs have locked up for decades. But something went sideways between the press release and reality. The companies themselves pushed back hard, saying they never agreed to participate in this tokenization scheme.

That's the fascinating part. Robinhood apparently believed it could create synthetic exposure to private equity without getting buy-in from the actual companies. This isn't like creating a derivative or an ETF that tracks public securities. Private-company shares carry transfer restrictions baked into shareholder agreements and bylaws. They're private precisely because the founders and boards want control over who owns pieces of the company.

Now Tenev is back on Bloomberg explaining "where things stand," which usually means accounting for why things didn't go as planned. The company is also pivoting into prediction markets, the other hot retail trading category. Kalshi and Polymarket proved Americans will bet on literally anything if you give them a slick interface and call it "forecasting."

What Robinhood actually built here is a case study in Web3's central tension. You can't tokenize ownership without the owner's permission. The technology is easy. The legal and social infrastructure is not. Every RWA tokenization project hits this wall eventually.

The Implication

Watch how Robinhood threads this needle going forward. If it can persuade private companies to tokenize voluntarily, that's a real unlock for retail access to pre-IPO deals. If it can't, this whole effort becomes another example of a crypto solution in search of a problem. The prediction-markets play is safer, legally speaking, but also more crowded. Either way, Robinhood is testing the actual boundaries of what tokenization can do when legacy power structures say no.


Source: Bloomberg Tech