Ant Group Leverages Domestic Chips to Slash AI Training Expenses, Challenging Nvidia

According to a Bloomberg report citing sources familiar with the matter, Ant Group Co. has turned to Chinese-made semiconductors to cut its AI training costs by 20%. Using chips from Alibaba Group and Huawei Technologies Co., the firm trained models with the Mixture of Experts (MoE) approach and achieved results comparable to those from Nvidia Corp.'s H800 chips, the sources said.
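MoE models cut training cost by routing each token to only a few specialist sub-networks ("experts") instead of activating the whole model, which reduces the compute needed per step and makes less powerful chips viable. The snippet below is a minimal, illustrative top-2 gated MoE layer in PyTorch; it is not Ant's implementation, and all module names, dimensions, and expert counts are assumptions chosen for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Minimal top-k gated Mixture-of-Experts layer (illustrative sketch only)."""

    def __init__(self, d_model=512, d_hidden=2048, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        ])
        # The router scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts)

    def forward(self, x):                      # x: (batch, seq, d_model)
        scores = self.router(x)                # (batch, seq, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)   # normalize over the chosen experts
        out = torch.zeros_like(x)
        # Only the selected experts run for each token, so per-token compute is
        # a fraction of what a dense layer of equal total capacity would need.
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., slot] == e     # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return out

# Example: 4 sequences of 16 tokens, each token a 512-dimensional vector.
layer = MoELayer()
tokens = torch.randn(4, 16, 512)
print(layer(tokens).shape)  # torch.Size([4, 16, 512])
```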

While it still uses Nvidia chips for some AI development, the Jack Ma-backed company is increasingly relying on alternatives, including processors from Advanced Micro Devices Inc. (AMD) and Chinese-made chips, for its latest models. The shift reflects the broader race between Chinese and U.S. firms in AI, which has intensified as Chinese companies pursue lower-cost training methods that rival the expensive, compute-heavy approach of OpenAI and Alphabet Inc.'s Google.

Ant has published research claiming its AI models at times outperform those of Meta Platforms Inc. on certain benchmarks, though Bloomberg News has not independently verified the claims.

The company's move underscores efforts by Chinese firms to rely on local semiconductor alternatives as U.S. export restrictions bar them from buying advanced Nvidia processors such as the H800.

Training large models typically requires high-performance GPUs such as Nvidia's, and the cost has put such work out of reach for many smaller firms. Ant's strategy aims to build large models without the most expensive GPUs, challenging Nvidia's business of selling ever more powerful processors at a premium.

Nvidia CEO Jensen Huang, for his part, argues that even as models become more efficient, demand for computation will keep growing, and that companies will need more advanced chips rather than cheaper ones.

That philosophy underpins Nvidia's focus on boosting its GPUs' processing power, transistor counts, and memory capacity, in contrast with Ant's goal of lowering AI development costs while keeping model performance competitive.