While Nvidia's customers are ramping up internal AI chip development efforts, investors need to look at the broader picture.
Nvidia (NVDA -10.01%) is benefiting greatly from the surge in demand for artificial intelligence (AI) applications that began in late 2022. Cloud computing giants are lining up to use the company's data center graphics processing units (GPUs) to train and power large language models (LLMs).
For Meta Platforms (META -4.13%), Microsoft, Amazon, and many others, Nvidia is the go-to provider of AI chips. Notably, these tech giants are willing to wait up to a year between order and delivery to source Nvidia chips, and they pay top dollar for them, leaving rival chipmakers Intel and Advanced Micro Devices (AMD -5.44%) far behind. By some estimates, Nvidia controls a whopping 95% of the AI chip market.
As a result, Nvidia's revenue and earnings have grown rapidly. But some customers are making a concerted effort to reduce their reliance on the company's chips.
In-house production of AI chips
Nvidia's success in the AI GPU market is thanks to the A100 processor it launched in 2020. The graphics chip specialist built this GPU for high-performance computing applications and manufactured it on a 7-nanometer (nm) process node. OpenAI reportedly deployed thousands of A100 chips to train ChatGPT.
Interestingly, near the end of 2021, rival AMD began offering a competing data center accelerator, the MI250X, built on a 6nm process node. However, third-party estimates indicate that the A100 still outperformed AMD's newer chip on LLM training tasks.
And in 2022, Nvidia upped its game with the H100 processor, built on a custom 5nm process. The company packed 80 billion transistors into the chip, compared to 54 billion for the A100, and the H100 turned out to be significantly more powerful than its predecessor. AMD, on the other hand, took until the end of 2023 to launch its next competing chip, the MI300.
This explains why Nvidia's H100 was in such high demand last year, driving the company's data center revenue to $47.5 billion in fiscal 2024, up from $15 billion the previous year. Meta was just one of the big buyers, reportedly paying Nvidia billions of dollars for its H100 purchases.
However, the lack of viable alternatives to the H100, its high price, and its limited availability explain why some of Nvidia's top customers have launched in-house AI chip development efforts to reduce their dependence on the chipmaker. For example, Meta Platforms recently announced the second generation of its proprietary AI chip, built on a 5nm process node.
According to Meta, the new chip "more than doubles the compute and memory bandwidth of previous solutions while maintaining tight integration with workloads" and is designed to efficiently serve the company's ranking and recommendation models.
Additionally, Meta plans to continue its in-house chip development program, which is aimed at reducing the operating and development costs of its AI servers.
Something similar is happening at Microsoft. The tech giant unveiled two custom AI chips toward the end of 2023, one of which is a 5nm AI accelerator called Maia 100. The chip packs 105 billion transistors and is capable of both LLM training and inference.
Amazon is also pursuing the path of developing its own AI chips. The company announced its latest product, Trainium2, in November, claiming it is four times more powerful than its predecessor; Amazon Web Services customers have the option of using these chips to train their AI models. Meanwhile, Alphabet has jumped on the bandwagon with its newly unveiled Axion custom processor.
Given that Meta, Microsoft, Google, and Amazon were among the top buyers of H100 processors last year, there is no doubt that their push to develop their own chips poses a threat to the semiconductor giant's bottom line.
However, investors need to focus on the big picture
While it's true that Nvidia's customers want to reduce their dependence on the company, the fact remains that they are expected to keep purchasing its powerful GPUs. For example, when Nvidia announced its next-generation Blackwell AI GPUs last month, all of the companies mentioned above said they would adopt the new chips when they become available.
That's not surprising. Nvidia's upcoming GPUs are expected to be significantly more powerful, allowing customers to train even larger LLMs. The chipmaker claims that Blackwell GPUs can run LLMs "at up to 25x less cost and energy consumption" than the previous generation. Given that the new GPUs are likely to be priced competitively against the H100, Nvidia's customers should be able to get more out of their AI hardware investments by adopting Blackwell processors.
As a result, demand for Nvidia's AI chips is likely to remain strong. Another reason Nvidia can remain the dominant player in the AI chip market is its grip on the supply chain. Nvidia's customers and rivals alike rely on foundry giant TSMC, yet Nvidia reportedly consumes 60% of TSMC's advanced chip packaging capacity.
Of course, TSMC is looking to increase production capacity to meet demand from Nvidia and other customers. But given Nvidia's huge lead in the AI chip market, the GPU specialist is likely to secure the largest share of the foundry's additional capacity.
Therefore, even as other big tech companies press on with their chip development efforts, Nvidia is likely to remain the top player in AI chips for quite some time. Japanese investment bank Mizuho estimates that Nvidia could sell $280 billion worth of AI chips in 2027, in an overall market expected to reach $400 billion. In other words, Mizuho expects Nvidia's share of the AI chip market to decline over the next three years, but its data center revenue to increase significantly.
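A quick back-of-the-envelope calculation makes the Mizuho projection concrete. The figures below come straight from this article (the variable names are mine); note the implied 2027 share is still far above any competitor's today:

```python
# Back-of-the-envelope check of the Mizuho projection cited above.
# All dollar figures are in billions and come from the article.

nvidia_2027_sales = 280    # Mizuho's Nvidia AI chip sales estimate for 2027 ($B)
market_2027_size = 400     # Mizuho's total AI chip market estimate for 2027 ($B)
current_share = 0.95       # Nvidia's estimated current share of the market

implied_2027_share = nvidia_2027_sales / market_2027_size
print(f"Implied 2027 market share: {implied_2027_share:.0%}")            # 70%
print(f"Implied share decline: {current_share - implied_2027_share:.0%} points")  # 25% points
```

So even in a scenario where Nvidia cedes roughly a quarter of the market, $280 billion in 2027 sales would still dwarf its fiscal 2024 data center revenue of $47.5 billion.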
So even if the company loses some market share, Nvidia's data center revenue is likely to keep growing at a healthy pace thanks to the long-term growth opportunity in AI chips. Investors therefore don't need to worry too much about their customers' chip development moves. Rather, they can view the recent share price decline as an opportunity to buy more stock, considering the impressive catalysts the company is riding.
Alphabet executive Suzanne Frey is a member of The Motley Fool's board of directors. John Mackey, former CEO of Amazon subsidiary Whole Foods Market, is a member of The Motley Fool's board of directors. Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool's board of directors. Harsh Chauhan has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, Amazon, Meta Platforms, Microsoft, Nvidia, and Taiwan Semiconductor Manufacturing. The Motley Fool recommends Intel and recommends the following options: long January 2025 $45 calls on Intel, long January 2026 $395 calls on Microsoft, short January 2026 $405 calls on Microsoft, and short May 2024 $47 calls on Intel. The Motley Fool has a disclosure policy.