Google has agreed to throttle power use at its AI data centers during grid stress events in parts of the U.S., marking one of the first major "demand-response" commitments of the AI era. As AI workloads surge, so does their appetite for electricity. This move hints at a new kind of balance: intelligence that scales responsibly with the grid that feeds it. The future of compute may hinge not just on how much power is available, but on knowing when to use less of it.
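To make "demand-response" concrete, here is a minimal sketch of the general idea: a scheduler that, when a utility signals a grid stress event, pauses deferrable work (such as batch training) while keeping latency-sensitive serving online. All names here (GridSignal, Job, plan_power) are illustrative assumptions, not a description of Google's actual system or any utility's API.

```python
from dataclasses import dataclass
from enum import Enum

class GridSignal(Enum):
    NORMAL = "normal"
    STRESS = "stress"   # hypothetical utility demand-response event

@dataclass
class Job:
    name: str
    power_kw: float
    deferrable: bool    # batch training can usually wait; serving cannot

def plan_power(jobs: list[Job], signal: GridSignal,
               curtail_fraction: float = 0.3) -> list[Job]:
    """Return the jobs to keep running under the current grid signal.

    Under NORMAL conditions everything runs. During a STRESS event,
    deferrable jobs are shed (largest consumers first) until total
    load drops by roughly `curtail_fraction`.
    """
    if signal is GridSignal.NORMAL:
        return jobs

    total = sum(j.power_kw for j in jobs)
    target = total * (1 - curtail_fraction)
    running, load = [], 0.0
    # Non-deferrable jobs sort first and are always kept; deferrable jobs
    # are admitted smallest-first, so the largest ones are shed when the
    # remaining budget runs out.
    for job in sorted(jobs, key=lambda j: (j.deferrable, j.power_kw)):
        if not job.deferrable or load + job.power_kw <= target:
            running.append(job)
            load += job.power_kw
    return running

if __name__ == "__main__":
    fleet = [
        Job("llm-serving", 400.0, deferrable=False),
        Job("training-run-a", 900.0, deferrable=True),
        Job("batch-eval", 200.0, deferrable=True),
    ]
    for j in plan_power(fleet, GridSignal.STRESS):
        print(f"keep running: {j.name} ({j.power_kw} kW)")
```

Run as-is, the sketch keeps the serving and small batch jobs and pauses the 900 kW training run, cutting load from 1,500 kW to 600 kW until the stress signal clears.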