
Nvidia Delivers Another Blowout Quarter

November 22, 2023

Nvidia’s Q3 results were amazing. Has there ever been a company of this size in history growing so rapidly?

Q3 was another record quarter. Revenue of $18.1bn was up 34pc sequentially and up more than 200pc year on year and well above our outlook of $16bn. Starting with data centre, the continued ramp of the NVIDIA HGX platform based on our Hopper Tensor Core GPU architecture, along with InfiniBand end-to-end networking drove record revenue of $14.5bn, up 41pc sequentially and up 279pc year on year.

NVIDIA HGX with InfiniBand together are essentially the reference architecture for AI supercomputers and data centre infrastructures. Some of the most exciting generative AI applications are built and run on NVIDIA, including Adobe Firefly, ChatGPT, Microsoft 365 Copilot, CoAssist, Now Assist with ServiceNow, and Zoom AI Companion. Our data centre compute revenue quadrupled from last year and networking revenue nearly tripled. Investment in infrastructure for training and inferencing large language models, deep learning recommender systems, and generative AI applications is fueling strong broad-based demand for NVIDIA accelerated computing.

Colette Kress, CFO, Nvidia, Q3 FY2024, 21 November 2023

Total Q3 revenue more than tripled to $18.1bn versus expectations of $16bn. The shares fell and are still falling as I write. There could be two reasons for this. Optimists might have been expecting a totally insane revenue figure of $20bn plus. And since every man and his dog knew the figures were going to be outstanding, there was almost certainly a large short-term bull position, which means plenty of sellers once the results were out.

My guess is that when the selling dries up, the shares will start to move ahead. There is nothing in the results to change the view that Nvidia is THE generative AI stock.

Innovation proceeds apace and the market for AI is growing all the time.

The enterprise wave of AI adoption is now beginning. Enterprise software companies such as Adobe, Databricks, Snowflake, and ServiceNow are adding AI copilots and assistants to their platforms. And broader enterprises are developing custom AI for vertical industry applications such as Tesla and autonomous driving. Cloud service providers drove roughly the other half of our data centre revenue in the quarter.

Demand was strong from all hyperscale CSPs [cloud service providers] as well as from a broadening set of GPU-specialized CSPs globally that are rapidly growing to address the new market opportunities in AI. NVIDIA H100 Tensor Core GPU instances are now generally available in virtually every cloud, with instances in high demand. We have significantly increased supply every quarter this year to meet strong demand and expect to continue to do so next year. We will also have a broader and faster product launch cadence to meet a growing and diverse set of AI opportunities.

Colette Kress, CFO, Nvidia, Q3 FY2024, 21 November 2023

If, as many believe, the whole AI and Generative AI boom is just beginning, Nvidia is operating in a world of massive opportunity.

A wild card is China.

Toward the end of the quarter, the U.S. government announced a new set of export control regulations for China and other markets, including Vietnam and certain countries in the Middle East. These regulations require licenses for the export of a number of our products, including our Hopper and Ampere 100 and 800 series and several others. Our sales to China and other affected destinations derived from products that are now subject to licensing requirements have consistently contributed approximately 20pc to 25pc of data centre revenue over the past few quarters.

We expect that our sales to these destinations will decline significantly in the fourth quarter, though we believe this will be more than offset by strong growth in other regions. The U.S. government designed the regulation to allow the U.S. industry to provide data centre compute products to markets worldwide, including China.

Continuing to compete worldwide, as the regulations encourage, promotes U.S. technology leadership, spurs economic growth, and supports U.S. jobs. For the highest performance levels, the government requires licenses.

For lower performance levels, the government requires a streamlined prior notification process. And for products at even lower performance levels, the government does not require any notice at all. Following the government's clear guidelines, we are working to expand our data centre product portfolio to offer compliant solutions for each regulatory category, including products for which the U.S. government does not wish to have advance notice before each shipment.

We are working with some customers in China and the Middle East to pursue licenses from the U.S. government. It is too early to know whether these will be granted for any significant amount of revenue. Many countries are awakening to the need to invest in sovereign AI infrastructure to support economic growth and industrial innovation.

Colette Kress, CFO, Nvidia, Q3 FY2024, 21 November 2023

How can anyone feel negative about the prospects for this business?

From a product perspective, the vast majority of revenue in Q3 was driven by the NVIDIA HGX platform based on our Hopper GPU architecture with lower contribution from the prior generation Ampere GPU architecture. The new L40S GPU built for industry-standard servers began to ship, supporting training and inference workloads across a variety of customers. This was also the first revenue quarter of our GH200 Grace Hopper Superchip, which combines our ARM-based Grace CPU with a Hopper GPU. Grace and Grace Hopper are ramping into a new multibillion-dollar product line.

Colette Kress, CFO, Nvidia, Q3 FY2024, 21 November 2023

It is insane what is happening.

All in, we estimate that the combined AI compute capacity of all the supercomputers built on Grace Hopper across the U.S., Europe, and Japan next year will exceed 200 exaflops [an exaflop is a measure of performance for a supercomputer that can calculate at least 10^18, or one quintillion, floating point operations per second; the exa- prefix means a quintillion, that is a billion billion, or one followed by 18 zeros] with more wins to come. It is contributing significantly to our data centre demand as AI is now in full production for deep learning recommenders, chatbots, copilots, and text-to-image generation. And this is just the beginning.

Colette Kress, CFO, Nvidia, Q3 FY2024, 21 November 2023
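
To put that figure in perspective, here is a minimal, purely illustrative Python sketch (my own, not from the call) of the unit arithmetic: one exaflop is 10^18 floating point operations per second, so 200 exaflops works out to 2 x 10^20 operations per second.

```python
# Illustrative only: converting the 200-exaflop figure quoted above into raw FLOPS.
# 1 exaflop = 10**18 floating point operations per second.

EXAFLOP = 10 ** 18  # one quintillion operations per second

combined_capacity_exaflops = 200  # combined Grace Hopper supercomputer estimate quoted above
combined_flops = combined_capacity_exaflops * EXAFLOP

print(f"{combined_capacity_exaflops} exaflops = {combined_flops:.1e} operations per second")
# 200 exaflops = 2.0e+20 operations per second
```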

There are so many areas driving explosive growth.

Efficient scaling is a key requirement in generative AI because LLMs [large language models] are growing by an order of magnitude every year. Microsoft Azure achieved similar results on the nearly identical cluster, demonstrating the efficiency of NVIDIA AI in public cloud deployments. Networking now exceeds a $10bn annualized revenue run rate. Strong growth was driven by exceptional demand for InfiniBand, which grew fivefold year on year.

InfiniBand is critical to gain the scale and performance needed for training LLMs. Microsoft made this very point last week, highlighting that Azure uses over 29,000 miles of InfiniBand cabling, enough to circle the globe. We are expanding NVIDIA networking into the Ethernet space. Our new Spectrum-X end-to-end Ethernet offering with technologies purpose-built for AI will be available in Q1 next year with support from leading OEMs, including Dell, HP, and Lenovo.

Colette Kress, CFO, Nvidia, Q3 FY2024, 21 November 2023
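
As a quick back-of-envelope sanity check on those networking figures (my own illustrative sketch, not something from the call): a $10bn annualized run rate implies quarterly networking revenue of roughly $2.5bn, and 29,000 miles of cabling does indeed exceed the Earth's equatorial circumference of about 24,901 miles.

```python
# Illustrative back-of-envelope checks on the networking figures quoted above.

annualized_run_rate_bn = 10.0  # "networking now exceeds a $10bn annualized revenue run rate"
implied_quarterly_revenue_bn = annualized_run_rate_bn / 4
print(f"Implied quarterly networking revenue: ~${implied_quarterly_revenue_bn:.1f}bn")

azure_infiniband_miles = 29_000     # InfiniBand cabling figure cited by Microsoft
earth_circumference_miles = 24_901  # Earth's equatorial circumference
print(f"Enough to circle the globe: {azure_infiniband_miles > earth_circumference_miles}")
```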

Jensen Huang, CEO and co-founder of Nvidia, believes that something massive is happening.

These new data centres run very few applications, if not one application, used by basically one tenant. And it processes data. It trains models and it generates tokens, it generates AI. And we call these new data centres AI factories.

We’re seeing AI factories being built out everywhere in just about every country. And so, if you look at where we are in the expansion, the transition into this new computing approach, the first wave you saw with large language model start-ups, generative AI start-ups, and consumer Internet companies. And we’re in the process of ramping that.

You see that we’re starting to partner with enterprise software companies who would like to build chatbots and copilots and assistants to augment the tools that they have on their platforms. You’re seeing GPU-specialized CSPs cropping up all over the world, and they’re dedicated to doing really one thing, which is processing AI. You’re seeing sovereign AI infrastructures, people, countries that now recognize that they have to utilize their own data, keep their own data, keep their own culture, process that data, and develop their own AI.

You see that in India. About a year ago in Sweden. You're seeing it in Japan. Last week, a big announcement in France. But the number of sovereign AI clouds that are being built is really quite significant. And my guess is that almost every major region will have, and surely, every major country will have their own AI cloud.

And so, I think you’re seeing just new developments as the generative AI wave propagates through every industry, every company, every region. And so, we’re at the beginning of this inflection, this computing transition.

Jensen Huang, CEO, Nvidia, Q3 FY2024, 21 November 2023

I am not sure, in some seven decades of looking at companies, that I have ever seen one with quite as many exciting things going on as Nvidia. It just may be the most exciting company of all time, and that is against some impressive competition, given what is going on with the other members of the Magnificent Seven.

Strategy – Fill Your Boots With Nvidia Shares

Five hundred dollars is becoming a key resistance level for Nvidia. The stock first topped $500 on 4 August and is clearly hitting selling whenever it reaches that level. This is not too surprising. Just over a year ago the shares bottomed out at a little over $100, so they have almost quintupled since then. There must be loads of loose holders easily panicked out of the shares, and with the great piece of good news now delivered, short-term traders will feel there is nothing left to look forward to in the near term.
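
For the arithmetic behind "almost quintupled", a small illustrative sketch; the exact low is an assumption here, taken as roughly $108 against the $483.50 price quoted in the recommendation below.

```python
# Illustrative only: the return multiple implied by the price move described above.
# The low of "a little over $100" is assumed to be roughly $108; it is not an exact figure.

assumed_low = 108.0      # assumption: approximate level of last year's bottom
current_price = 483.50   # price quoted in the share recommendation below

multiple = current_price / assumed_low
print(f"Shares are up roughly {multiple:.1f}x from the low")  # ~4.5x, i.e. "almost quintupled"
```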

Fair enough, but if you are a long-term investor and you want to own shares in one of the most exciting businesses on the planet at a reasonable price, now is a good time to buy.

As already noted, I expect the selling to dry up and then the shares will resume their climb driven by all the exciting things happening globally with Generative AI and all the exciting things happening at Nvidia. Even in the short term we could see analysts setting more exciting targets for the shares, well above $600.

Listen to Jensen again.

NVIDIA is essentially an AI foundry. NVIDIA GPUs, CPUs, networking, AI foundry services, and NVIDIA AI Enterprise software are all growth engines in full throttle.

Jensen Huang, CEO, Nvidia, Q3 FY2024, 21 November 2023

How WOW is that!

Share Recommendations

Nvidia. NVDA Buy @ $483.50
