
Nvidia Corp. (NASDAQ: NVDA) CEO Jensen Huang said Wednesday that next-generation artificial intelligence models need “100 times more compute” than earlier versions, citing new reasoning approaches that process information step by step.
What Happened: “The amount of computation necessary to do that reasoning process is 100 times more than what we used to do,” Huang told CNBC following the company’s record-breaking fourth-quarter earnings report.
The chipmaker reported $39.3 billion in revenue for the quarter, up 78% year-over-year and exceeding analyst expectations of $38.05 billion. Data center revenue, which includes Nvidia’s market-leading GPUs for AI workloads, surged 93% to $35.6 billion, now representing over 90% of total revenue.
Huang specifically cited DeepSeek’s R1, OpenAI’s GPT-4 and xAI’s Grok 3 as models that use reasoning processes requiring substantially more computing power. The comments come despite a 17% drop in Nvidia’s stock in January, triggered by concerns that DeepSeek had found ways to achieve better AI performance at lower infrastructure cost.
“DeepSeek was fantastic,” Huang said. “It was fantastic because it open-sourced a reasoning model that’s absolutely world class.”
Why It Matters: The company has faced challenges in China due to export restrictions, with Huang noting that Nvidia’s percentage of revenue from China has fallen by about half. He added that Nvidia’s GB200 chip, sold in the United States, can generate AI content 60 times faster than versions sold to China under export controls.
Looking ahead, Nvidia projects first-quarter revenue of $43 billion, exceeding analyst expectations of $41.75 billion. Huang emphasized strong demand for the company’s new Blackwell AI supercomputers, which he said achieved “billions of dollars in sales in its first quarter.”
“Demand for Blackwell is amazing as reasoning AI adds another scaling law – increasing compute for training makes models smarter and increasing compute for long thinking makes the answer smarter,” Huang said.