At Huawei Connect 2025, open-source AI development emerged as a pivotal focus, with Huawei unveiling detailed plans, backed by technical specifics, to release its entire Ascend AI software ecosystem to the public by the end of the year.
Addressing Developer Challenges Head-On
Eric Xu, Huawei’s Deputy Chairman and Rotating Chairman, began his keynote with a candid reflection on the difficulties developers have encountered with the Ascend infrastructure. Highlighting the impact of the DeepSeek-R1 launch earlier this year, Xu remarked, “From January through April, our AI research teams collaborated intensively to ensure that the inference performance of our Ascend 910B and 910C chips met evolving customer demands.”
He further acknowledged ongoing feedback from users, stating, “Our clients have shared numerous concerns and expectations regarding Ascend, along with valuable suggestions.” This openness about past hurdles sets the stage for Huawei’s comprehensive open-source commitments announced at the August 5 Ascend Computing Industry Development Summit and reiterated at Huawei Connect.
For developers who have faced issues related to documentation gaps, ecosystem maturity, or usability, this transparent admission signals Huawei’s recognition of the divide between Ascend’s technical potential and its practical application. The forthcoming open-source strategy is clearly designed to bridge these gaps by fostering community involvement, transparency, and collaborative enhancement.
Deep Dive into CANN: Compiler and Virtual Instruction Set
Central to Huawei’s open-source pledge is CANN (Compute Architecture for Neural Networks), the critical middleware that connects AI frameworks with Ascend hardware. Xu detailed, “We will open interfaces for the compiler and virtual instruction set within CANN, while fully open-sourcing other software components.”
This nuanced approach differentiates between fully open-source elements and those where Huawei will expose interfaces but retain proprietary implementations. The compiler and virtual instruction set, the key layers that translate high-level AI code into hardware-executable instructions, will have open interfaces. This transparency allows developers to understand and optimize compilation processes for Ascend chips, even if the compiler’s core remains partially closed.
Such openness is vital for performance tuning, especially in latency-sensitive applications or scenarios demanding peak hardware efficiency. While full open-source access would permit developers to modify or replace the compiler, Huawei’s strategy balances transparency with protection of proprietary technology.
The timeline is firm: “By December 31, 2025, we will open source and provide open access to CANN based on the current Ascend 910B/910C architecture.” This clarifies that the release targets existing hardware rather than upcoming chip designs.
Mind Series: Empowering Developers with Open Toolchains
Beyond CANN, Huawei has pledged to fully open-source the Mind series application enablement kits and toolchains by the end of 2025. These tools encompass the SDKs, libraries, debugging utilities, profilers, and other resources developers rely on daily to build AI applications.
Unlike the tiered openness of CANN, the Mind series will be entirely open-source, allowing developers to inspect, modify, and extend the toolchain. This could lead to enhanced debugging capabilities, optimized libraries tailored to specific workloads, and more user-friendly interfaces, all driven by community innovation rather than vendor-only updates.
However, specifics about which tools are included, supported programming languages, and the depth of documentation remain to be clarified. Developers will need to evaluate the completeness and usability of these resources once they become available.
OpenPangu Foundation Models: Huawei’s Entry into Open AI Models
Huawei also committed to open-sourcing its OpenPangu foundation models, positioning itself alongside other open-source AI model initiatives such as Meta’s LLaMA and Mistral AI. This move invites community collaboration in developing and refining large-scale AI models.
Details on OpenPangu’s architecture, parameter size, training datasets, and licensing terms have yet to be disclosed. Critical questions remain about commercial usage rights, potential biases, fine-tuning capabilities, and redistribution policies. These factors will heavily influence the models’ adoption and utility.
Open-source foundation models offer developers a valuable starting point for domain-specific AI applications without the prohibitive costs of training from scratch. The quality, flexibility, and documentation accompanying OpenPangu will determine its competitiveness in the growing open AI ecosystem.
Seamless Operating System Integration
One notable practical advancement announced at Huawei Connect 2025 is the full open-sourcing of the UB OS Component, which manages SuperPod interconnects at the OS level. This component can be integrated into upstream open-source operating systems like openEuler, offering significant flexibility.
Users can choose to incorporate parts or the entirety of the UB OS Component into their existing Linux distributions, such as Ubuntu or Red Hat Enterprise Linux, either as embedded modules or plug-ins. This modular design reduces the need for migrating to Huawei-specific OS environments, lowering barriers to adoption.
However, this flexibility places the onus of maintenance, testing, and updates on the integrating organizations. Huawei provides the component as open-source code rather than a fully supported product for arbitrary Linux distros, making it best suited for teams with strong Linux expertise.
Framework Compatibility: Supporting Familiar AI Tools
Huawei’s strategy emphasizes compatibility with widely used AI frameworks to ease developer transition. The company has prioritized support for open-source communities such as PyTorch and vLLM, enabling developers to innovate without abandoning familiar tools.
PyTorch compatibility is particularly crucial given its dominance in AI research and production. If developers can run standard PyTorch code efficiently on Ascend hardware with minimal modifications, it significantly lowers the barrier to experimentation and deployment.
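For context, PyTorch support on Ascend hardware is today typically delivered through Huawei's `torch_npu` adapter package, which registers an "npu" device type with PyTorch. The sketch below is a minimal illustration, not an announced mechanism, of how a script might detect that adapter and fall back to CPU when it is absent:

```python
import importlib.util

def select_device() -> str:
    """Prefer Huawei's Ascend NPU backend when the torch_npu
    adapter package is installed; otherwise fall back to CPU."""
    if importlib.util.find_spec("torch_npu") is not None:
        # torch_npu registers an "npu" device type with PyTorch, so
        # much existing code can run by swapping the device string.
        return "npu"
    return "cpu"

device = select_device()
print(device)  # "npu" on Ascend machines with the adapter installed, else "cpu"
```

The appeal of this pattern is that model code keeps a single device string, which is exactly the low-friction migration path the announcement implies developers should expect.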
The integration with vLLM targets optimized inference for large language models, addressing a high-demand use case as organizations increasingly deploy LLM-based applications where inference speed and cost are critical.
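vLLM serves models through an OpenAI-compatible HTTP API, so one plausible consequence of the integration is that LLMs hosted on Ascend hardware could be queried exactly as on other backends. The sketch below builds such a request payload; the endpoint, host, and model name are placeholders for illustration, not details Huawei has disclosed:

```python
import json

# Hypothetical local vLLM server; the /v1/completions path follows
# vLLM's OpenAI-compatible API, but host and port are assumptions.
ENDPOINT = "http://localhost:8000/v1/completions"

payload = {
    "model": "openPangu-placeholder",  # assumed model id, not announced
    "prompt": "Summarize the Ascend open-source roadmap in one sentence.",
    "max_tokens": 64,
    "temperature": 0.2,
}

# Serialize the request body that would be POSTed to the server.
body = json.dumps(payload)
print(body)
```

Because the client side is just standard HTTP plus JSON, the hardware backend becomes invisible to application code, which is precisely why inference-layer compatibility matters for adoption.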
Nonetheless, the depth and completeness of these integrations remain unclear. Partial compatibility or performance limitations could hinder adoption, making the quality of framework support a key factor in Huawei’s platform success.
Countdown to December 31, 2025: What to Expect
With just a few months until the December 31 deadline, Huawei appears to be well underway with preparations for the open-source release, including cleaning codebases, drafting documentation, finalizing licenses, and setting up repository infrastructure.
The initial quality of the release will heavily influence community engagement. Open-source projects lacking thorough documentation, practical examples, or mature tooling often struggle to attract contributors despite strong technical foundations.
Developers will require comprehensive learning materials and clear progression paths from basic examples to production-ready deployments. The December launch marks a starting point rather than a conclusion.
Long-term success depends on sustained investment in community management, issue resolution, pull request handling, documentation upkeep, and roadmap planning. Huawei’s commitment to ongoing support will determine whether Ascend’s open-source ecosystem flourishes or stagnates.
Unresolved Questions and Governance Considerations
Despite detailed commitments, several critical aspects remain unspecified. The choice of open-source licenses will shape how the software can be used commercially and whether derivative works must also be open-sourced.
Permissive licenses like Apache 2.0 or MIT encourage broad commercial use and proprietary derivatives, while copyleft licenses such as GPL impose stricter sharing requirements. Huawei has yet to disclose which licenses will govern the December releases.
Governance structures are also unclear. Key questions include whether an independent foundation will oversee development, if external maintainers will have commit rights, how feature prioritization will be handled, and whether community contributions will be transparently managed.
These governance factors often determine whether open-source projects attract vibrant external participation or remain vendor-controlled with limited community influence.
Preparing for Evaluation: The Next Six Months
Developers and organizations interested in Huawei’s open-source AI platform have a critical window in the months leading up to the release to prepare. This includes assessing workload compatibility with Ascend hardware and readying teams for potential adoption.
The December 31 release will provide tangible resources, including codebases, documentation, examples, and toolchains, for hands-on evaluation. Subsequent months will reveal community engagement levels through issue reporting, contributions, and ecosystem development.
By mid-2026, clearer patterns should emerge regarding the platform’s viability as a community-driven project versus a primarily vendor-led initiative. This six-month period will be crucial for deciding whether to commit significant time and resources to Huawei’s open-source AI ecosystem.