
Quick view: Insights from NVIDIA’s 2Q 2025 earnings

Portfolio Manager Richard Clode shares the key findings for investors from NVIDIA’s latest earnings results, which reflect continued solid growth for generative AI offerings.

Richard Clode, CFA

Portfolio Manager


29 Aug 2024
5 minute read

Key takeaways:

  • While repeating the stellar revenue growth of previous quarters is challenging, NVIDIA remains in the early innings of around US$1 trillion worth of data centre infrastructure spending required to advance the development of generative AI.
  • Blackwell delivery concerns were allayed, with the company confirming it is on track to deliver several billion dollars’ worth of its most advanced AI chips in the quarter ending January.
  • NVIDIA’s competitive moat and pricing power have proved strong, evidenced by increasing sovereign demand, a significant rebound in China sales and a broadening out of the customer base beyond Big Tech.

NVIDIA continues to believe it is in the relatively early innings of a major refresh of around US$1 trillion worth of data centre infrastructure that needs to be accelerated by GPUs for modern data processing, notably generative AI. The company is about to embark on its first major new product cycle of the post-ChatGPT era with the new Blackwell chip, scheduled for mass production in the fourth quarter of 2024; new product cycles have historically been a key driver of NVIDIA’s growth and earnings potential.

Confirmation of only a short delay to Blackwell was therefore a key takeaway for investors. NVIDIA’s competitive moat in AI training so far remains untouched. But as gen AI applications are scaled out, infrastructure spending will pivot to inferencing, where barriers to entry are typically lower. This will create opportunities for other companies to emerge as challengers to NVIDIA.

Meanwhile, the law of large numbers makes it more challenging for NVIDIA to maintain its stellar revenue growth rates (and exceed market expectations), given the quantum of additional absolute dollars of capital spending needed each year. We believe the next generation infrastructure theme remains alive and well, as evidenced by the capital spending commentary from NVIDIA’s key Big Tech customers during the recent results season. Looking ahead, investment opportunities are likely to broaden out beyond NVIDIA from 2025.

Key insights from NVIDIA’s 2Q fiscal year 2025 results:  

  • Blackwell delivery delay fears allayed

The key concern heading into the latest results announcement was the impact of some initial teething issues with the next-generation Blackwell platform. Given the accelerated roadmap of an annual cadence of new product delivery, these initial issues were to be expected. Management confirmed a mask redesign to improve yields but no functional issues, with the product sampling to customers today. NVIDIA appears to be on track to deliver several billion dollars’ worth of Blackwell chips in the quarter ending January. Blackwell is key to driving NVIDIA’s growth over the next two years, so confirmation of only a minor delay was welcome.

As the first new architecture to launch in the post-ChatGPT era, Blackwell should meet the challenges of next-generation large language models requiring 10-20x the compute power. It should also enable NVIDIA to capture more content in the data centre by offering a full-rack solution encompassing GPU, CPU, memory, networking and software. Having all those pieces of the puzzle under one roof remains NVIDIA’s competitive moat, particularly in AI training, facilitating 3-5x the AI throughput in the same data centre floorspace, an increasingly scarce commodity. Blackwell is an AI infrastructure platform, not just a GPU.

  • The enterprise AI wave has begun

Major deployments to date have been to the hyperscalers and internet companies. But NVIDIA now sees enterprises across verticals and geographies deploying AI infrastructure, which remains an early innings opportunity. Automotive companies training next-generation autonomous driving models, as well as healthcare companies, were cited as key verticals each spending several billion dollars with NVIDIA; the company now counts most of the Fortune 100 as customers.

  • Increasing sovereign demand

NVIDIA raised its revenue forecast from sovereigns (countries as customers) and related entities to low double-digit billions of dollars, from high single-digit billions previously. This is a new customer base that did not exist in the internet era. Countries are now realising that their data is a natural and national resource; they need to apply AI and build their own AI infrastructure so that they can develop their own digital intelligence.

  • China comeback

Historically, China comprised 20-25% of NVIDIA’s data centre sales, but that collapsed following US semiconductor export restrictions. The recent launch of a compliant new chip, the H20, led to a significant rebound in China sales, albeit still only back to around 12% of sales. In a recent discussion with a major Chinese internet company, we learned that while the H20 is not necessarily a better chip than local alternatives, when thousands of them are used to train an AI model, NVIDIA’s proprietary networking really comes into its own, driving far superior system performance.

  • Mild normalisation of margins

NVIDIA’s gross margin peaked at almost 80% in the quarter ending April, and management had guided for a normalisation from that level. Even with a current new product, the H200, that essentially resells extra high-bandwidth memory, and heading into the ramp-up of a low-yielding, complex major new product like Blackwell, the company has still been able to maintain a mid-70% gross margin and a mid-60% operating margin. We think this shows that NVIDIA’s competitive moat and pricing power are likely to remain strong.

  • New US$50bn share buyback

NVIDIA generated US$13.5bn in free cash flow last quarter, and looks on track to generate over US$100bn in annual free cash flow within the next couple of years. Against that backdrop, the company’s board authorised a new US$50bn share buyback. This is reminiscent of other highly cash-generative Big Tech companies that have delivered sizeable shareholder returns while also investing heavily in innovation, preserving their virtuous growth flywheel.

*Source of NVIDIA earnings information and announcements: https://nvidianews.nvidia.com/news/nvidia-announces-financial-results-for-second-quarter-fiscal-2025

Glossary

AI inferencing: refers to artificial intelligence processing. Whereas machine learning and deep learning refer to training neural networks, AI inference applies knowledge from a trained neural network model and uses it to infer a result.

CPU: the central processing unit is the control centre that runs the machine’s operating system and apps by interpreting, processing and executing instructions from hardware and software programmes.

Free cash flow: cash that a company generates after allowing for day-to-day running expenses and capital expenditure. It can then use the cash to make purchases, pay dividends or reduce debt.
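As a simple illustration of the calculation, using hypothetical round numbers rather than NVIDIA’s reported figures:

$$\text{free cash flow} = \text{operating cash flow} - \text{capital expenditure}, \qquad \text{e.g. } \$10\text{bn} - \$2\text{bn} = \$8\text{bn}$$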

Gen AI: generative AI refers to deep-learning models that train on large volumes of raw data to generate ‘new content’ including text, images, audio and video.

GPU: a graphics processing unit performs complex mathematical and geometric calculations that are necessary for graphics rendering and are also used in gaming, content creation and machine learning.

Hyperscalers: companies that provide infrastructure for cloud, networking, and internet services at scale. Examples include Google, Microsoft, Facebook, Alibaba, and Amazon Web Services (AWS).

Large language model: a specialised type of artificial intelligence that has been trained on vast amounts of text to understand existing content and generate original content.

Next generation infrastructure theme: the proliferation of digital devices, AI, and broader technology adoption is driving an exponential leap in compute power and storage requirements. This increases demand for low carbon, low latency, high reliability and secure infrastructure, addressed by more flexible and efficient cloud architecture, edge compute, 5G infrastructure and data security.

Share buyback: a company buying back its own shares from the market, thereby reducing the number of shares in circulation, with a consequent increase in the value of each remaining share. It increases the stake that existing shareholders have in the company, including the amount due from any future dividend payments. Buybacks typically signal the company’s optimism about the future and a possible undervaluation of the company’s equity.
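As a simple worked example with hypothetical numbers: suppose a company has 1,000 shares outstanding and an investor holds 50 of them. If the company buys back and cancels 100 shares, the investor’s stake rises without them buying a single additional share:

$$\frac{50}{1{,}000} = 5.0\% \quad \longrightarrow \quad \frac{50}{1{,}000 - 100} = \frac{50}{900} \approx 5.6\%$$

Any future earnings and dividends are likewise divided across fewer shares.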

There is no guarantee that past trends will continue, or forecasts will be realised.
 

These are the views of the author at the time of publication and may differ from the views of other individuals/teams at Janus Henderson Investors. References made to individual securities do not constitute a recommendation to buy, sell or hold any security, investment strategy or market sector, and should not be assumed to be profitable. Janus Henderson Investors, its affiliated advisor, or its employees, may have a position in the securities mentioned.

 

Past performance does not predict future returns. The value of an investment and the income from it can fall as well as rise and you may not get back the amount originally invested.

 

The information in this article does not qualify as an investment recommendation.

 

Marketing Communication.

 
