
China Trained a 1-Trillion-Parameter LLM Using Only Domestic Chips

“China Telecom, one of the largest wireless carriers in mainland China, says that it has developed two large language models (LLMs) relying solely on domestically manufactured AI chips…” reports Tom’s Hardware.
“If the information is accurate, this is a crucial milestone in China’s attempt at becoming independent of other countries for its semiconductor needs, especially as the U.S. is increasingly tightening and banning the supply of the latest, highest-end chips to Beijing in the U.S.-China chip war.”
Huawei, which has largely been banned in the U.S. and allied countries, is one of the leaders in China’s local chip industry… If China Telecom’s LLMs were indeed trained using Huawei chips alone, that would be a massive success for Huawei and the Chinese government.

The project’s GitHub page “contains a hint about how China Telecom may have trained the model,” reports the Register, “in a mention of compatibility with the ‘Ascend Atlas 800T A2 training server’ — a Huawei product listed as supporting the Kunpeng 920 7265 or Kunpeng 920 5250 processors, respectively running 64 cores at 3.0GHz and 48 cores at 2.6GHz. Huawei builds those processors using the Arm 8.2 architecture and bills them as produced with a 7nm process.”

According to the South China Morning Post, China Telecom says the unnamed model has 1 trillion parameters, while the TeleChat2-115B model has over 100 billion parameters.
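For readers who want to poke at the released checkpoints, the sketch below shows how an open model like TeleChat2-115B is typically loaded with the Hugging Face transformers API. This is a minimal illustration, not the project's documented workflow; the repository id used here is an assumption, so check the project's GitHub page for the actual release location and instructions.

```python
# Minimal sketch: loading a published TeleChat2 checkpoint with Hugging Face transformers.
# The repo id below is an assumption for illustration; verify it against the project's release.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Tele-AI/TeleChat2-115B"  # hypothetical repository id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,  # custom modeling code shipped with the checkpoint
    torch_dtype="auto",      # use the dtype stored in the checkpoint
    device_map="auto",       # shard the >100B-parameter model across available devices
)

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```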

Thanks to long-time Slashdot reader hackingbear for sharing the news.
