"Nvidia's still a buy": As China's ChatGPT moment rattles tech sector

The 17% plunge in Nvidia shares overnight is 'wildly overdone' as it changes almost nothing about the fundamental growth outlook for the chipmaker, according to Theo Maas, portfolio manager at Northcape Capital.
The tech investing guru said China's Deepseek is a large language model (LLM) whose breakthrough sits in the training phase of artificial intelligence (where a model learns patterns from vast amounts of existing data), but the long-term growth market for AI remains the inference phase, where trained models apply those patterns to produce predictions and answers.
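As a rough illustration of the distinction (a toy model in Python, not anything from Deepseek's or Nvidia's actual stack), training is the one-off, compute-heavy step where a model is fitted to data, while inference is the per-query step that runs every time a user asks something:

```python
# Toy illustration of the two AI phases discussed above, using a tiny
# linear model in numpy. This is not Deepseek's or Nvidia's code; it is
# only meant to show where the compute goes in each phase.
import numpy as np

# --- Training phase: the model learns patterns from existing data ---
X = np.array([[1.0], [2.0], [3.0], [4.0]])        # historical inputs
y = np.array([2.1, 4.0, 6.2, 7.9])                # historical outcomes
X_b = np.hstack([X, np.ones_like(X)])             # add a bias column
weights, *_ = np.linalg.lstsq(X_b, y, rcond=None) # the expensive "learning" step

# --- Inference phase: the trained model answers new queries ---
new_input = np.array([[5.0, 1.0]])                # an unseen input (plus bias)
prediction = new_input @ weights                  # cheap per query, but runs constantly
print(f"prediction for input 5.0: {prediction[0]:.2f}")
```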
"My biggest point would be keep your eyes on the prize more than anything else," said Mr Maas. "The biggest game in town that these guys (mega-cap tech companies and US chip suppliers) are going after is still inference AI over the next five to 10 years."
Mr Maas said Northcape Capital still owns Nvidia shares and is bullish on its outlook. However, panicked investors wiped $US600 billion from Nvidia's valuation on Monday on worries that Deepseek will lower future demand for its graphics processing units (GPUs).

What does Deepseek change for investors?
Deepseek claims it has built its LLM for less than $US6 million using around 2000 GPUs (versus ~100,000 for a US LLM), with lower memory requirements and application programming interface (API) costs around 95 per cent below those of its large US rivals.
The open-source code reveals it slashed the cost and computing power needed by letting parts of its model sit idle when they are not required to answer a query (see the sketch after the table below).
AI model metric | US version (estimated) | Deepseek (claimed) |
Training costs | >$US100 million | $US6 million |
GPUs required | >100,000 | ~2,000 |
Active parameters | 1.8 trillion | 671 billion |
API costs (relative) | 100% | 5% |
Source: Deepseek, LLM programs
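The "active parameters" row in the table reflects that design: only part of the model is activated for any one query. Below is a minimal sketch of the sparse-routing idea, often called a mixture-of-experts scheme; the expert count, sizes and scoring rule are invented for illustration and are not Deepseek's actual architecture.

```python
# Minimal sketch of sparse "mixture-of-experts" routing: only the top-k
# experts are activated per query, so most parameters sit idle.
# Expert count, sizes and the scoring rule are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
NUM_EXPERTS, TOP_K, DIM = 8, 2, 16

# Each "expert" is just a small weight matrix here.
experts = [rng.standard_normal((DIM, DIM)) for _ in range(NUM_EXPERTS)]
router = rng.standard_normal((DIM, NUM_EXPERTS))    # scores experts per input

def forward(x: np.ndarray) -> np.ndarray:
    scores = x @ router                             # how relevant is each expert?
    top = np.argsort(scores)[-TOP_K:]               # pick only the top-k experts
    weights = np.exp(scores[top]) / np.exp(scores[top]).sum()
    # Only TOP_K of NUM_EXPERTS weight matrices are touched; the rest stay "asleep".
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

query = rng.standard_normal(DIM)
print(forward(query).shape)   # (16,), computed with 2 of 8 experts active
```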
However, Mr Maas doubted Deepseek's claim that it built the model for just $US6 million, and he insisted the Chinese model does little to dent the outlook for rising demand for Nvidia's GPUs to power next-generation inference AI.
"Training has been a sizeable start of the AI phenomena, but five or 10 years down the road Deepseek will be a small start-up in the bigger picture," Mr Mass said. "Nvidia has said 40% of their revenue is already related to infererence AI so they're happy to see some competition and optimisation in these models. So I'm still bullish on Nvidia."
Both broker RBC Capital Markets and Mr Maas pointed to the likelihood that Deepseek has copied some of the open-source code available in Meta's Llama LLM and effectively piggybacked off the hundreds of billions of dollars of capital investment made by Silicon Valley's AI operators.
"It's called a distillation model and takes someone else's capex and opex and plays with it," Mr Mass said of Deepseek. "That's the way to understand it, so it doesn't take us further down the road to next-generation artificial general intelligence (AGI). The point where AI becomes so smart it outthinks us."
Chip makers led losses in the US overnight, with Advanced Micro Devices (NASDAQ: AMD) falling 6.7% and Broadcom (NASDAQ: AVGO) plunging 17.4% on fears that less computing power will be required to run AI systems.
RBC said this potential development is a positive for large software providers, as their gross profit margins could lift if they have to pay less for the vast amounts of computing power required to host their software-as-a-service products online.
RBC pointed to how the likes of Adobe (NASDAQ: ADBE), Crowdstrike (NASDAQ: CRWD), Datadog (NASDAQ: DDOG) and Workday (NASDAQ: WDAY) climbed through Wall Street's trading day as investors rushed to work out the impacts of cheaper computing power.
Australia's largest tech companies, such as Canva, Wisetech, Xero and Pro Medicus, all use a software-as-a-service model and could also accrue small gross margin benefits from cheaper computing costs.
"Driving down the cost of LLM usage (training, reasoning, and inferencing) is likely a positive development for software companies that are not tied to a specific LLM vendor," said RBC.
"Furthermore, lower cost APIs mean application software vendors can price their GenAI offerings meaningfully cheaper and drive greater volume, which may also mean less of a gross margin headwind."