Is this really what we're allowing the West to turn into? Who in their right mind thinks any of this is conducive to our societies? England, a great nation and the one-time global hegemon, a Christian country, looks like this? And the United States, the greatest country in the history of the world, the global empire, a Christian nation, is following suit. This has to stop across the board.