Huawei announced on Monday that it will open-source two of its Pangu models and its model inference technologies based on Ascend. The firm said it launched the initiative to help build its AI ecosystem and expand overseas.
The Chinese tech giant’s open-sourcing move aligns with other Chinese AI players pursuing an open-source development strategy. On Monday, fellow Chinese AI company Baidu also open-sourced its Ernie large language model series.
The firm revealed the open-sourcing of its Pangu dense model with 7 billion parameters, the Pangu Pro MoE (Mixture of Experts) model with 72 billion parameters, and its model inference technology based on Ascend, which serves as the platform for its AI infrastructure.
The weights of the Pangu Pro MoE 72B model and the large-scale MoE model inference code based on Ascend have been open-sourced, while the inference code for the Pangu 7B will be released on the open-source platform soon.
Paul Triolo, partner and senior vice president for China at advisory firm DGA-Albright Stonebridge Group, noted that in recent years, Huawei has transformed from a competent private sector telecommunications company into a tech giant straddling the entire AI hardware and software stack.
Huawei described the open-source initiative as another key step in its Ascend ecosystem strategy, arguing that it would help speed up the adoption of artificial intelligence across multiple industries globally.
According to the tech company, its Ascend ecosystem refers to AI products developed around its Ascend AI chip series. The chips are widely considered to be China’s leading competitor to AI products from Nvidia, the U.S. tech giant that is restricted from selling its most advanced products to China.
Lian Jye Su, chief analyst at Omdia, argued that open-sourcing Pangu allows developers and organizations to test the models and customize them for their own needs. He also believes the initiative will encourage the use of other Huawei products.
Su added that coupling Huawei’s Pangu models with its AI chips and related products gives the firm an advantage in optimizing its AI solutions and applications. Huawei has focused on specialized AI models for sectors such as government, finance, and manufacturing, while competitors like Baidu offer LLMs with broader, general-purpose capabilities.
“Huawei is not as strong as companies like DeepSeek and Baidu at the overall software level – but it doesn’t need to be. Its objective is to ultimately use open source products to drive hardware sales, which is a completely different model from others. It also collaborates with DeepSeek, Baidu, and others and will continue to do so.”
-Marc Einstein, Research Director at Counterpoint Research.
Ray Wang, principal analyst at Constellation Research, compared Huawei’s chip-to-model strategy to that of Google, which also develops its own AI chips and AI models, such as its open-source Gemma models.
Huawei’s latest announcement also fits its international ambitions, as the company has been gradually expanding into new overseas markets alongside firms like Zhipu AI. The Chinese tech company also invited developers, corporate partners, and researchers worldwide to download and use its open-source products, and to provide feedback to help improve them.
Einstein believes Huawei’s open-source push will resonate in developing countries, where customers are more price-sensitive, as has been the case with its other products. Huawei also wants to bring its latest AI data center solutions to new countries as part of its global strategy.
The firm announced its strategic roadmap for advanced data centers in Uzbekistan on May 20, designed to power the country’s artificial intelligence initiatives. Alex Xing, president of Huawei Digital Power Middle East and Central Asia, said the company’s framework for next-generation AI infrastructure will support the country’s $1.5 billion AI development strategy.
Uzbekistan’s Artificial Intelligence Development Strategy 2030, signed in October 2024, lays out the country’s plan to develop $1.5 billion worth of AI-based software and services. According to the tech giant, the plan also calls for building high-performance server infrastructure designed for big data processing.
According to IDC, the initiative is projected to exceed $200 billion by 2028, with data center IT power capacity in Asia/Pacific projected to reach 94.4 gigawatts that year. Xing highlighted three major challenges facing AI data centers: higher reliability, faster rollout, and greater energy demand. He also emphasized that the AI era demands 10 times more power, 10 times more storage, and 10 times more efficiency.