Over the past couple of days I vibe-coded a useless little tool, casually named slug2text; here's the link:

What it does is very simple: it losslessly compresses a large chunk of text into a single link. Save that link, and when you open it again the original text is automatically decompressed and restored. The compression ratio depends on the size and redundancy of the text, but in my tests it handled text the length of a thesis without any issues. It is limited by the maximum URL length, though, so you can't fit too much content.

I built it mainly out of two curiosities:

1. How extreme a compression scheme can GPT-5.2-Pro come up with?
2. How good is GPT-5.2's programming ability in the Codex environment?

Conclusion: I can't assess the compression rate myself, but according to its own evaluation it seems quite good. The engineering side gave me no trouble at all; the code Codex produced compiled successfully on the first try.
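For the curious, here is a minimal sketch of how a text-to-link tool like this can work in principle: compress the text, base64url-encode the bytes, and carry them in the URL fragment, then reverse the steps when the link is opened. The function names and the deflate-plus-base64url scheme below are my own assumptions for illustration, not slug2text's actual implementation.

```typescript
// Sketch only: compress text into a URL-safe slug and back.
// Assumes a modern browser or Node 18+ (CompressionStream, btoa/atob available).

async function encodeToSlug(text: string): Promise<string> {
  const bytes = new TextEncoder().encode(text);
  // Compress with the built-in CompressionStream (raw deflate, no headers).
  const compressed = new Uint8Array(
    await new Response(
      new Blob([bytes]).stream().pipeThrough(new CompressionStream("deflate-raw"))
    ).arrayBuffer()
  );
  // base64url-encode so the payload is safe inside a URL fragment.
  let binary = "";
  compressed.forEach((b) => (binary += String.fromCharCode(b)));
  return btoa(binary).replace(/\+/g, "-").replace(/\//g, "_").replace(/=+$/, "");
}

async function decodeFromSlug(slug: string): Promise<string> {
  // Undo base64url, then decompress back to the original text.
  const binary = atob(slug.replace(/-/g, "+").replace(/_/g, "/"));
  const bytes = Uint8Array.from(binary, (c) => c.charCodeAt(0));
  const decompressed = await new Response(
    new Blob([bytes]).stream().pipeThrough(new DecompressionStream("deflate-raw"))
  ).arrayBuffer();
  return new TextDecoder().decode(decompressed);
}

// Usage: the slug becomes the shareable part of the link, e.g. https://example.com/#<slug>
// encodeToSlug("a very long piece of text...").then((slug) => console.log(slug.length));
```

One nice property of putting the payload after the `#` is that the text never needs to reach a server; the practical ceiling is still the URL length limit mentioned above.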