With news like this, shouldn't there at least be a meaningful pop? And yet there's hardly any froth in the stock right now. $MU's latest AI server memory is a low-power (3x more energy-efficient), high-capacity 256GB part, co-developed with NVIDIA as next-generation main memory for AI server chips and aimed specifically at long-context applications. The AI memory market is shifting from competing on capacity to competing on energy efficiency, and Micron is hitting that pain point directly, opening a new high-margin growth curve outside of HBM. On top of that, there's the American-manufacturing angle.