South Korea's First Open-Source AI Model EXAONE 3.0 Launched by LG AI Research

TapTechNews August 12th news: LG AI Research announced on August 7th the launch of EXAONE 3.0, South Korea's first open-source AI model, marking the country's entry into a global AI field dominated by American tech giants and emerging players in China and the Middle East.

The open-source EXAONE 3.0 is a bilingual English-Korean model based on a decoder-only Transformer architecture, with 7.8 billion parameters and trained on 8 trillion tokens.


LG's press release stated: "Among the EXAONE 3.0 language model lineup built for various purposes, the 7.8-billion-parameter instruction-tuned model is being open-sourced in advance so that it can be used for research. We hope the release of this model helps domestic and foreign AI researchers conduct more meaningful research and moves the AI ecosystem a step forward."

Official testing shows the model's English proficiency reaches the world's top tier, ranking first in average score on real-world use cases and surpassing models such as Llama 3.0 8B and Gemma 2 9B. EXAONE 3.0 also ranks first in average score for mathematics and coding, and its reasoning ability is relatively strong.


For Korean, EXAONE 3.0 ranks first in average score on both real-world use cases and individual benchmarks.


LG claims that, compared with the previous generation, EXAONE 3.0 reduces inference time by 56%, memory usage by 35%, and operating cost by 72%; compared with the first-released EXAONE 1.0, it operates at just 6% of the cost.


The model has been trained on 60 million items of specialized data covering patents, code, mathematics, and chemistry, with plans to expand this to 100 million items across various fields by the end of the year.

TapTechNews attaches the EXAONE 3.0 model link:

https://huggingface.co/LGAI-EXAONE/EXAONE-3.0-7.8B-Instruct
