This is an Alibaba news story, published by the South China Morning Post, relating primarily to Zhou Chang.
Informative: 74%
Zhou Chang, who worked on the Tongyi Qianwen large language models (LLMs), has decided to leave Alibaba's cloud computing unit after seven years at the firm.
Zhou played a leading role in the development of the Tongyi Qianwen LLMs released last year.
He was also a member of the team behind the multimodal AI model M6, released by Alibaba in 2021.
VR Score: 79
Informative language: 81
Neutral language: 67
Article tone: formal
Language: English
Language complexity: 57
Offensive language: not offensive
Hate speech: not hateful
Attention-grabbing headline: not detected
Known propaganda techniques: not detected
Time-value: short-lived
External references: no external sources
Source diversity: no sources
Affiliate links: no affiliate links