Alibaba Cloud’s Qwen family of AI models has captured more than 50 percent of global open-source model downloads as of March 2026, reaching nearly one billion cumulative downloads on the Hugging Face developer platform. The milestone, reported by U.S.-based research newsletter Interconnects AI, places Qwen far ahead of rivals including Meta’s Llama and China’s own DeepSeek.
A Dominance Built on Small Models
The research attributed Qwen’s commanding position to the widespread adoption of its smaller models — those under 10 billion parameters — which developers can customize and deploy freely at low cost. In February 2026 alone, Qwen generated 153.6 million downloads, more than double the combined total of the next eight major players.
The shift toward Qwen’s dominance began in September 2024 with the release of Qwen 2.5, after which Chinese models began consistently leading global open-source download charts. Since Alibaba first open-sourced Qwen in 2023 — becoming the first major Chinese technology company to publicly release a homegrown large language model — the family has expanded to nearly 400 models supporting 119 languages and dialects, with more than 200,000 derivative models created by the global developer community.
A Pivot Toward Revenue
Even as Qwen’s open-source footprint continues to grow, Alibaba has signaled a shift in priorities. The company has consolidated its AI units under a new group called Alibaba Token Hub, raised cloud service prices by as much as 34 percent, and begun releasing closed, API-only models whose weights are not publicly available. CEO Eddie Wu has confirmed that Qwen’s consumer interface has surpassed 300 million monthly active users, raising the stakes of this transition considerably.
With AI roles now accounting for more than 80 percent of Alibaba’s open job positions, the company is clearly placing its long-term bets on turning its AI stack into a recurring revenue engine — even as it works to preserve the developer trust that brought it this far.