The CEO of OpenAI just compared his own technology to a law of physics, and not in the way you might think. Sam Altman stood on stage in India and compared deep learning to finding a new element on the periodic table: the core ideas that make AI models so capable, he said, will eventually be simplified, widely known, and understood the same way we understand gravity or electromagnetism today.

He pointed to the scaling laws OpenAI published around seven years ago as the moment everything changed. There was a measurable, almost perfect correlation between the resources poured into a model and the intelligence that came out of it. He called that discovery hair-raising: the realization that intelligence could be manufactured on a predictable curve, like a law of physics.

And then he said the part that matters most: "Eventually this recipe will be well understood as a scientific principle." That means the hundred-billion-dollar moats being built right now around proprietary models are sitting on top of a truth that will eventually belong to everyone. You cannot patent the laws of physics or trademark a fundamental property of nature. When the science simplifies, and Altman says it will, open-source teams, sovereign governments, and garage startups will all be cooking from the same playbook.

At the same event he told India's prime minister that AI had gone from doing high school math to producing research-level mathematics and novel results in theoretical physics in a single year. The technology racing toward artificial general intelligence is not a trade secret guarded inside a San Francisco vault. It is a discovery about the universe itself, and discoveries, once made, spread.