Post-WW2 America was never an empire. You can keep repeating the phrase "American empire" over and over -- and many people do -- but repetition doesn't make it so. And calling it that only makes it harder to understand what actually happened.