One GPT-5.4 rumor is that the model can persist state.
Jeff Dean mentioned this on his @latentspacepod appearance, so it's clearly something the AI labs are thinking about.
I'd bet there's a good chance they've discovered how to effectively integrate State-Space Models with Transformers at scale.
SSMs are designed to carry a hidden state forward with every computational step, and scale linearly rather than quadratically like Transformers.
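To make that concrete, here's a toy sketch of the recurrence an SSM layer computes: a hidden state is updated once per token, so the cost grows linearly with sequence length instead of quadratically. The matrices and shapes here are illustrative assumptions, not any lab's actual architecture.

```python
import numpy as np

def ssm_scan(A, B, C, xs):
    """Minimal state-space recurrence:
        h_t = A @ h_{t-1} + B @ x_t   (state carried forward each step)
        y_t = C @ h_t
    One fixed-cost update per token -> O(T) in sequence length T,
    versus a Transformer's O(T^2) attention.
    """
    h = np.zeros(A.shape[0])
    ys = []
    for x in xs:          # one step per token, constant work per step
        h = A @ h + B @ x # hidden state persists across the whole sequence
        ys.append(C @ h)
    return np.array(ys)

# Tiny example with made-up matrices (2-dim state, 1-dim input/output)
A = 0.5 * np.eye(2)
B = np.ones((2, 1))
C = np.ones((1, 2))
xs = np.ones((4, 1))      # a "sequence" of 4 tokens
ys = ssm_scan(A, B, C, xs)
```

The key property is that `h` is the only thing carried between steps, so the state can in principle persist indefinitely, which is why SSM hybrids come up in discussions of long-term memory.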
This also backs up the rumor that GPT-5.4 will have a 2 million token context window.
Persistent state would essentially mean that AI models go from being Guy Pearce's character in "Memento" to Dustin Hoffman's character in "Rain Man" overnight.
It would give AI models true long-term memory.
It would be a genuinely massive breakthrough.
