Lisa Huang started building the AI assistant for Meta Ray-Ban smart glasses in 2019. She had to convince the team that the AI assistant would become the most important feature on the glasses. Not everyone agreed.
That zero-to-one process surfaced constraints that pure software PMs never face. Weight. Battery life. Privacy. Bystander concerns about a camera on someone's face. And the fact that Luxottica, a fashion company, doesn't operate like a Silicon Valley engineering team. The amount of engineering complexity packed into something that still needs to look like a pair of sunglasses is staggering.
The biggest technical question: cloud or on-device processing?
Cloud is the default today. But Lisa made a prediction on this episode that I think is worth paying attention to. She believes the vast majority of AI for AR will eventually run on-device. Her reasoning: once you're wearing a device on your face all day, capturing what you see and hear, people are going to want that data to stay local. And as models get smaller and more efficient, the technical barriers keep dropping.
This maps to a broader pattern playing out across AI hardware. Apple is investing heavily in on-device models. The new wave of AI phones is pushing more processing to the edge. Privacy is becoming a product feature, not just a compliance checkbox.
The lesson Lisa drew for any PM building AI features, in any context: deeply understand the technology, but don't fall in love with it. The best products live at the intersection of what the user actually needs and what the technology can reliably do today. Build fast. See what users do. Update your assumptions. Repeat.
