Thrilled to share that Alibaba has 146 papers accepted at NeurIPS 2025, one of the highest counts among tech companies, spanning model training, datasets, foundational research, and inference optimization! 🚀 Our award-winning paper, "Gated Attention for Large Language Models: Non-linearity, Sparsity, and Attention-Sink-Free", is the first to systematically explore how attention gating affects large model performance. Read more: #AlibabaAI
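
For anyone curious what attention gating looks like in practice, here is a minimal PyTorch sketch of a self-attention block with a learned sigmoid gate applied to the attention output. The module name GatedSelfAttention, the layer sizes, and the exact gating placement are illustrative assumptions for this sketch, not the paper's published implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedSelfAttention(nn.Module):
    """Illustrative sketch: a learned sigmoid gate, conditioned on the layer
    input, modulates the attention output elementwise before the output
    projection. Hypothetical minimal example, not the paper's exact design."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.head_dim = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.gate = nn.Linear(d_model, d_model)  # gate logits, one per output element
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # reshape to (batch, heads, seq, head_dim) for scaled dot-product attention
        q, k, v = (z.view(b, t, self.n_heads, self.head_dim).transpose(1, 2)
                   for z in (q, k, v))
        attn = F.scaled_dot_product_attention(q, k, v, is_causal=True)
        attn = attn.transpose(1, 2).reshape(b, t, d)
        # sigmoid gate computed from the layer input, applied to the attention output
        g = torch.sigmoid(self.gate(x))
        return self.out(g * attn)

# usage: a toy forward pass
x = torch.randn(2, 16, 64)
y = GatedSelfAttention(d_model=64, n_heads=4)(x)
print(y.shape)  # torch.Size([2, 16, 64])
```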