This is an OpenAI news story, published by Yahoo Finance (Canada), that relates primarily to Qwen2-Math news.
Alibaba · South China Morning Post
87% Informative
Alibaba Group Holding has launched Qwen2-Math, a family of maths-specific large language models (LLMs) that the e-commerce giant claims can outperform OpenAI's GPT-4o in that field.
The new models still have some limitations owing to their "English-only support". The team plans to release bilingual models shortly, with multilingual LLMs also in the development pipeline.
VR Score: 92
Informative language: 95
Neutral language: 11
Article tone: formal
Language: English
Language complexity: 58
Offensive language: not offensive
Hate speech: not hateful
Attention-grabbing headline: not detected
Known propaganda techniques: not detected
Time-value: medium-lived
External references: 20
Source diversity: 4
Affiliate links: no affiliate links