Nvidia dominated 2024 big-time. Next year? Plenty of challenges.
December 24, 2024
Nvidia (NVDA) has had the kind of year most companies can only dream of.
Its revenue and stock price soared thanks to prescient investments in artificial intelligence technologies that are paying off handsomely on the back of the generative AI wave.
That’s not all. It’s repeatedly swapped places with Apple (AAPL) as the largest publicly traded company in the world by market cap, topping the $3 trillion mark. CEO Jensen Huang has become one of the most in-demand executives in Silicon Valley, meeting with everyone from fellow tech luminaries to world leaders and then some.
And there’s more to come. The company is ramping up production of its high-powered Blackwell chip for AI applications and expects to ship several billion dollars’ worth of the hardware in the fourth quarter alone, with far more expected throughout the year ahead.
“Nvidia really has the [hardware and software] for the AI computing era,” Futurum Group CEO Daniel Newman told Yahoo Finance. “It’s all connected inside the [server] rack, outside the [server] rack, and then the software is very well … liked within the developer communities.”
But the competition isn’t sitting idly by.
Companies like AMD (AMD) are angling to poach Nvidia’s customers and slice into its estimated 80% to 90% market share. Even Nvidia’s own customers are working on chips meant to cut down on their reliance on the graphics giant’s semiconductors.
And Wall Street is getting on board.
Shares of Broadcom (AVGO), which works with companies like Google (GOOG, GOOGL) to design AI chips, are up 113% year to date and rocketed 44% in just the last month after CEO Hock Tan said AI could represent a $60 billion to $90 billion opportunity for the company in 2027 alone.
Still, taking on Nvidia will be a tough task for any company. And dethroning it as the AI king, at least in 2025, will be all but impossible.
Nvidia grabbed a first-mover advantage in the AI market on the back of early investments in AI software that opened up its graphics chips for use as high-powered processors. And it has managed to hold onto that lead thanks to continued advances in its hardware, as well as its CUDA software, which allows developers to build apps for its chips.
Because of that, so-called hyperscalers, the massive cloud computing providers including Microsoft (MSFT), Alphabet’s Google, Amazon (AMZN), Meta (META), and others, continue to plow cash into buying up as many Nvidia chips as possible. In its most recent quarter, Nvidia reported total revenue of $35.1 billion. Of that, $30.8 billion, or 87%, came from its data center business.
“Everybody wants to build and train these huge models, and the most efficient way to do it is with CUDA software and Nvidia hardware,” TECHnalysis Research president and chief analyst Bob O’Donnell told Yahoo Finance.
Nvidia is expected to continue to power the bulk of the AI industry in 2025 as well. The company’s Blackwell chip, the successor to its popular Hopper line of processors needed to power AI applications, is in production — and its customers, like Amazon, are already adding new cooling capabilities to their data centers to handle the immense heat the processors generate.
“I don’t know what the current backlog [for Nvidia’s chips is], but if it’s not a year, it’s close to a year,” O’Donnell said. “So, they’re pretty much sold out for most of everything they’re probably going to make next year already.”
With hyperscalers calling for capital expenditures in 2025 at or above 2024 levels, you can expect a sizable chunk of that spending to go toward Blackwell chips.
While Nvidia will retain control of the AI crown, there’s no shortage of challengers looking to take its throne. AMD and Intel (INTC) are the top contenders among chipmakers, and both have products on the market. AMD’s MI300X line of chips is designed to tackle Nvidia’s H100 Hopper chips, while Intel has its Gaudi 3 processor.
AMD is better positioned to steal market share from Nvidia, though, as Intel continues to struggle amid its turnaround efforts and hunt for a new CEO. But even AMD is having a difficult time cracking Nvidia’s lead.
“What AMD needs to do is make software really usable, build the systems where there’s more demand … with developers, and ultimately, that could create more sell through,” Newman said. “Because these cloud providers are going to sell what their customers ask for.”
It’s not just AMD and Intel, though. Nvidia’s customers are increasingly developing and pushing their own AI chips. Google has its tensor processing units (TPUs), designed with Broadcom, while Amazon has its Trainium 2 processor and Microsoft has its Maia 100 accelerator.
There’s also concern that a shift toward inferencing AI models will reduce the need for high-powered Nvidia chips.
Tech companies build AI models by feeding them huge amounts of data, a step known as training, which requires incredibly powerful chips and lots of energy. Inferencing, or actually putting those trained models to work, is less resource- and power-intensive. As inferencing becomes a larger share of AI workloads, the thinking goes, companies won’t need to purchase as many Nvidia chips.
Huang has said he is prepared for this, explaining at various events that Nvidia’s chips are just as good at inferencing as they are at training.
Even if Nvidia’s market share slides, it doesn’t necessarily mean its business will be doing any worse than before.
“This is definitely a case of raising all boats,” Newman said. “So even with much stronger competition, which I think they certainly will have, that doesn’t mean they’re going to fail. This is people building a bigger pie.”
Email Daniel Howley at dhowley@yahoofinance.com. Follow him on Twitter at @DanielHowley.