Right now the prevailing mindset is that models are changing so fast, and are so large, that the only way to rely on them is by round-tripping every request to the hyperscaler.
This is the same dynamic that played out with web servers. HTML was changing so fast that apps built on it improved faster than “client code” could ever be distributed through traditional channels. The problem was that bandwidth was not increasing fast enough, so the constraint forced more compute to happen on servers. This in turn slowed the advance of “web UI” for end users. Today’s chat interface plays the analogous role: it is the most effective way we currently have to shuttle information back and forth to a model running on a hyperscaler.
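To make that round trip concrete, here is a minimal sketch of the pattern as it looks from the client, with the entire conversation crossing the network on every turn. The endpoint URL and response shape are hypothetical placeholders, not any specific provider’s API.

```typescript
// Minimal sketch of the chat round-trip pattern: the model runs on the
// hyperscaler, and the client only shuttles text back and forth.
// NOTE: the endpoint URL and response field are hypothetical placeholders.

type Message = { role: "user" | "assistant"; content: string };

async function chatTurn(history: Message[]): Promise<string> {
  const res = await fetch("https://models.example.com/v1/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // The entire conversation crosses the wire on every turn.
    body: JSON.stringify({ messages: history }),
  });
  const data = await res.json();
  return data.reply; // hypothetical response field
}
```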
Since natural language is notoriously imprecise, and conversational back-and-forth is a frustrating way to get work done, it seems natural that more structured interfaces to models will emerge. And with that, the “runtime” for using models will gradually migrate on-device, where it can be faster and more efficient.
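As one hedged sketch of what a more structured interface could look like: instead of free-form chat, the application calls the model through a typed function whose output is constrained to a machine-checkable schema. The endpoint, the `response_schema` parameter, and the field names here are all assumptions for illustration, not an existing API.

```typescript
// Sketch of a structured interface to a model: typed in, typed out.
// The endpoint and "response_schema" parameter are hypothetical.

interface ExtractedInvoice {
  vendor: string;
  totalCents: number;
  dueDate: string; // ISO 8601
}

async function extractInvoice(rawText: string): Promise<ExtractedInvoice> {
  const res = await fetch("https://models.example.com/v1/extract", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      input: rawText,
      // Constrain the model's output to a machine-checkable shape
      // rather than free-form prose.
      response_schema: {
        type: "object",
        properties: {
          vendor: { type: "string" },
          totalCents: { type: "integer" },
          dueDate: { type: "string" },
        },
        required: ["vendor", "totalCents", "dueDate"],
      },
    }),
  });
  return (await res.json()) as ExtractedInvoice;
}
```

The design point is that the precision lives in the schema rather than in the prompt, so a program, not a human, can consume the result.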
This is why, when you use a “web app” today, even though it still round-trips a lot, it also makes use of a very large “edge” runtime contained in the browser, plus the browser’s cache of executable code.
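That browser mechanism can be made concrete with a service worker, which is how web apps actually keep executable code cached at the edge: after the first visit the app shell loads locally, and only data needs the round trip. This is a minimal sketch; the cache name and file list are placeholders.

```typescript
/// <reference lib="webworker" />
// sw.ts — a minimal service worker: the browser-side "edge runtime"
// keeps executable code in a local cache so only data needs the round trip.
declare const self: ServiceWorkerGlobalScope;

const CACHE = "app-shell-v1"; // placeholder cache name
const SHELL = ["/", "/app.js", "/app.css"]; // placeholder file list

self.addEventListener("install", (event: ExtendableEvent) => {
  // Pre-cache the app's executable code on first visit.
  event.waitUntil(caches.open(CACHE).then((c) => c.addAll(SHELL)));
});

self.addEventListener("fetch", (event: FetchEvent) => {
  // Serve from the local cache when possible; fall back to the network.
  event.respondWith(
    caches.match(event.request).then((hit) => hit ?? fetch(event.request))
  );
});
```

The registration side is a one-liner (`navigator.serviceWorker.register("/sw.js")`), after which the code above intercepts every fetch and serves cached code without touching the server.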
