Moore Threads and Teacher AI Complete Large Model Training

According to TapTechNews on June 14, Moore Threads announced via its official WeChat account that it and Teacher AI, the all-subject educational AI large model, have jointly completed the training and testing of the large model.

Teacher AI was founded in 2020; its core model team comes from Tsinghua University, and the company is headquartered in Haidian District, Beijing. Since the closed beta of its all-subject educational large model opened, Teacher AI has gained over 25,000 users, covering more than 30 subjects and over 2,000 textbooks.

Moore Threads stated that, relying on its KUAE thousand-card intelligent computing cluster, Teacher AI completed high-intensity training and testing of its 7-billion-parameter large model. The full training run took one week, and training efficiency met expectations. The two sides will also carry out adaptation work for large model inference, to better handle the high-frequency inference demands brought by the knowledge updates and content upgrades of the educational AI large model.

According to a previous report by TapTechNews, Moore Threads announced on May 27 that, together with Wuxin Qiong, it had completed training of the 3-billion-parameter large model MT-infini-3B on a domestic full-featured GPU thousand-card cluster, achieving the industry's first end-to-end AI large model training on a domestic GPU.
