Nvidia stock (NVDA) is on pace for a comfortable triple-digit percentage gain again in 2024, after a nearly 240% surge in 2023.
This year’s rise is due in no small part to a product that wasn’t even shipped until the last quarter of the year — Blackwell.
It’s the largest GPU (graphics processing unit) ever built, created by connecting two dies via a high-bandwidth interface (HBI). In layperson’s terms, that translates to a lot of computing power at high efficiency, which is why it’s been in such demand from the so-called hyperscalers — companies like Alphabet (GOOG) and Microsoft (MSFT) that are building out huge data centers to power large language models (LLMs).
That hot demand for Blackwell, along with the 170% surge in Nvidia shares, is what led Yahoo Finance to name the chip its 2024 Product of the Year.
“Technically, Blackwell is a beast,” Matt Kimball, an analyst at Moor Insights & Strategy, told Yahoo Finance in an email. “Blackwell is an exponential leap forward as it is a dual GPU on a single chip with faster connectivity, a larger pool of high bandwidth memory (HBM3E), and the introduction of NVIDIA’s decompression engine to make data processing much faster (up to 6x relative to Hopper).”
First announced in March, Blackwell is in the right place at the right time. Data centers that power generative AI and LLMs are currently occupied with ingesting data and training those models — work that Blackwell and its predecessor, Hopper, are well-suited for.
Blackwell represents an enormous advancement in power. It contains 208 billion transistors, more than two and a half times the number in Hopper.
Its GB200 NVL72 server, which combines 72 Blackwell GPUs with 36 Grace CPUs, delivers up to a 30x performance increase over the same number of Hopper GPUs on LLM inference workloads, while using up to 25 times less energy.
“For us to leapfrog ourselves by an order of magnitude is pretty unheard of,” Dion Harris, director of accelerated data center, HPC (high-performance computing), and AI at Nvidia, said in a phone interview. “We were limited by physics, but we recognized that innovation with the HBI would allow us to extend the die-level communication and compute.”
US business spending on generative AI has sextupled in a year, going from $2.3 billion in 2023 to $13.8 billion in 2024, according to Menlo Ventures. And the trend shows no sign of slowing, as major companies from banking and retail to tech and hospitality race to introduce advanced chatbots and assistants to their customers.