Ant Group has developed AI model-training techniques using Chinese-made chips, cutting training costs by roughly 20%. The fintech giant used chips from affiliate Alibaba and from Huawei, together with a machine-learning approach called Mixture of Experts (MoE), which splits a model into specialized sub-networks ("experts") and activates only a subset of them for each input, reducing the compute needed per token. MoE models are also used by Google, China's DeepSeek, and other companies. Ant's training results are reportedly comparable to those achieved with Nvidia's H800 chips, though the company still uses Nvidia hardware alongside alternatives from AMD and domestic chipmakers; its goal is to reduce reliance on high-end GPUs. That stance contrasts with Nvidia CEO Jensen Huang's view that AI advancement requires ever more powerful chips rather than cost-cutting: Huang advocates larger GPUs with more processing cores as a way to drive revenue growth. [Bloomberg]
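To illustrate why MoE reduces compute, here is a minimal sketch of top-k expert routing in NumPy. This is a toy example of the general technique, not Ant's (or any vendor's) implementation; all dimensions, weights, and function names are invented for illustration. The key point: although four experts exist, each token only runs through two of them, so per-token compute scales with k rather than with the total number of experts.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
d, n_experts, k = 8, 4, 2  # hidden dim, total experts, experts active per token

# Each "expert" is a small linear layer; the gate scores experts per token.
experts = [rng.standard_normal((d, d)) * 0.1 for _ in range(n_experts)]
gate_w = rng.standard_normal((d, n_experts)) * 0.1

def moe_forward(x):
    scores = softmax(x @ gate_w)                  # gating probabilities per token
    top_k = np.argsort(scores, axis=-1)[:, -k:]   # indices of the k best experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):                   # route each token separately
        weights = scores[t, top_k[t]]
        weights = weights / weights.sum()         # renormalize over chosen experts
        for w, e_idx in zip(weights, top_k[t]):
            out[t] += w * (x[t] @ experts[e_idx])  # weighted sum of expert outputs
    return out

tokens = rng.standard_normal((3, d))  # a batch of 3 toy "tokens"
y = moe_forward(tokens)
print(y.shape)  # (3, 8)
```

Production MoE systems (as in DeepSeek's or Google's models) apply the same gating idea inside transformer layers, with learned gates and load-balancing losses; the sparsity is what lets training run efficiently on less powerful hardware.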