On March 6, 2025, Alibaba released and open-sourced its new reasoning model, QwQ-32B, featuring 32 billion parameters. Despite being significantly smaller than DeepSeek-R1, which has 671 billion parameters (with 37 billion active), QwQ-32B matches its performance on various benchmarks. QwQ-32B excelled in math and coding tests, outperforming OpenAI’s o1-mini and the distilled versions of DeepSeek-R1, and it scored higher than DeepSeek-R1 on some evaluations, including LiveBench and IFEval. The model leverages reinforcement learning and integrates agent capabilities for critical thinking and adaptive reasoning. Notably, QwQ-32B requires far less computational power, making it deployable on consumer-grade hardware. The release aligns with Alibaba’s AI strategy, which includes significant investments in cloud and AI infrastructure. Following the release, Alibaba’s US stock rose 8.61% to $141.03, and its Hong Kong shares were up over 7%. [Jiemian, in Chinese]