TapTechNews, August 9th news: Alibaba's Tongyi Qianwen Qwen2 open-source family has a new member, Qwen2-Math, available in three sizes: 1.5 billion, 7 billion, and 72 billion parameters.
Qwen2-Math is a series of language models built on the Qwen2 LLM and specialized for solving math problems. Its math ability significantly exceeds that of open-source models and even surpasses closed-source models such as GPT-4o. The team hopes it will help the scientific community tackle advanced math problems that require complex multi-step logical reasoning.
The team evaluated Qwen2-Math on a series of math benchmarks. On the MATH benchmark, its largest math-specific model, Qwen2-Math-72B-Instruct, outperforms the most advanced models, including GPT-4o, Claude-3.5-Sonnet, Gemini-1.5-Pro, and Llama-3.1-405B.
TapTechNews learned from the report that the Qwen2-Math series focuses on math ability and currently supports only English. The team plans to release a bilingual model supporting English and Chinese, and to develop multilingual models after that.