Andrew Ng just revealed why the AI companies throwing the most compute at the problem are going to lose.
The winner of the intelligence race won’t use the most compute.
They’ll waste the least.
Ng: “Most of your high-dimensional data lies on a lower-dimensional subspace. It’s just a fact of life.”
Here’s what that means in practice.
You have a 10,000-dimensional dataset.
Every dimension dragged through every calculation.
Every training cycle hauling dead weight the model will never use.
Ng: “You’re carrying around these 10,000-dimensional examples throughout your whole training process.”
That bloat isn’t just inefficient.
It’s a tax on every computation you run.
Memory bandwidth. Network bandwidth. Computational speed.
All of it eaten by dimensions that contribute nothing to intelligence.
They contribute noise.
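To make that tax concrete, here's a rough back-of-the-envelope sketch. The dataset size (1 million float32 examples) is my assumption, purely for illustration:

```python
# Hypothetical numbers, for illustration only:
# 1M float32 examples at 10,000 dims vs. a 100-dim subspace.
n_examples = 1_000_000
bytes_per_float = 4  # float32

full = n_examples * 10_000 * bytes_per_float   # ~40 GB
reduced = n_examples * 100 * bytes_per_float   # ~0.4 GB

print(f"full: {full / 1e9:.1f} GB, reduced: {reduced / 1e9:.1f} GB")
# Every epoch drags that full 40 GB through memory,
# across the network, and into every matrix multiply.
```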
The insight that separates the architects from the arms race: that 10,000-dimensional dataset is almost entirely captured by a much smaller subspace.
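Here's a minimal sketch of that idea using PCA (scikit-learn is my choice here, not something Ng specified). The synthetic data is constructed so that ~50 latent directions explain nearly all the variance, standing in for the thread's 10,000-dimensional dataset:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic stand-in: 2,000 examples in 10,000 dims
# that secretly live near a 50-dimensional subspace.
latent = rng.normal(size=(2000, 50))
projection = rng.normal(size=(50, 10_000))
X = latent @ projection + 0.01 * rng.normal(size=(2000, 10_000))

# Fit PCA and check how much variance the top components capture.
pca = PCA(n_components=100).fit(X)
explained = np.cumsum(pca.explained_variance_ratio_)
print(f"variance captured by 50 dims: {explained[49]:.3f}")  # ~1.000

# Train on the 50-dim projection instead of all 10,000 dims.
X_small = pca.transform(X)[:, :50]
```

Dropping from 10,000 dimensions to 50 cuts memory, bandwidth, and per-step compute by roughly 200x while keeping essentially all of the signal. That's the "waste the least" argument in code.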
...