Alibaba released its new reasoning model QwQ-32B on March 6 and made it open source. The model has 32 billion parameters, far fewer than DeepSeek-R1's 671 billion (with 37 billion active), yet it matches R1's performance on a range of benchmarks. QwQ-32B outperformed OpenAI's o1-mini and the distilled versions of DeepSeek-R1 in math and coding tests, and also scored higher on some evaluations, such as LiveBench and IFEval. The model integrates agent capabilities to support critical thinking and adaptive reasoning. Because it requires less computing power, QwQ-32B can be deployed on consumer-grade hardware. The release is in line with Alibaba's AI strategy, which includes significant investment in cloud and AI infrastructure. Alibaba's US shares rose 8.61% to $141.03, while its Hong Kong shares rose over 7%. [Jiemian (in Chinese)]