
NVIDIA Q3 FY26 Earnings Review: Could it Get Any Better?
Nvidia Q3 FY26 Earnings Review: The $57 Billion Mic Drop
If you were worried about the AI bubble popping, worry less after today. Nvidia’s Q3 FY2026 earnings report wasn’t just a beat; it was a statement that the AI boom is far from over. With a fantastic $57 billion in revenue, up 62% year-over-year and a ridiculous 22% sequentially, the company is once again defying gravity (I hope investors are happy), physics, and perhaps even the expectations of the most bullish analysts on Wall Street.
While the rest of the market frets about ROI and capital expenditure sustainability, Nvidia is busy selling every single chip it can manufacture and wishing it could make more. The message from this call is clear: they are not just building chips; they are modernizing the entire world’s computing infrastructure.
The Good: Everything is Off the Charts
To put it mildly, Nvidia’s core business is on fire.
Data Center Dominance The Data Center segment revenue hit a record $51.2 billion, up 66% from a year ago and 25% sequentially. This isn’t just growth; it’s a transformation. This segment alone is now larger than the entire company was just a quarter ago. The growth is being driven by what CEO Jensen Huang calls three platform shifts: accelerated computing, powerful AI models, and agentic applications.
Blackwell is a Beast If you thought the Hopper architecture was big, the demand for Blackwell is, in Jensen’s words, “off the charts.” The company confirmed that Blackwell Ultra is now their leading architecture across all customer categories. Perhaps the most bullish signal for investors? The clouds are sold out. Every major cloud provider is racing to deploy Blackwell, and the demand continues to exceed supply.
Networking Explosion Often overlooked, Nvidia’s networking business is now a juggernaut in its own right. Revenue here hit $8.2 billion, up a massive 162% year-over-year. This growth is fueled by the need for massive scale-up clusters. As AI models get larger, the plumbing that connects thousands of GPUs becomes just as critical as the GPUs themselves. Nvidia’s NVLink and InfiniBand technologies are becoming the industry standard for these massive AI factories.
Margins Holding Strong Despite the complexity of ramping up new products, Nvidia maintained impressive profitability. GAAP gross margins landed at 73.4%, and non-GAAP at 73.6%. While this is slightly down year-over-year, it exceeded their own outlook, proving they can maintain pricing power even while navigating complex supply chain ramps. These margins flow straight down the income statement, allowing the company to generate a ton of free cash flow that it can invest in its own growth as well as in other businesses, as with the recent Anthropic deal.
The Flurry of Deals: Nvidia is Everywhere
One of the most striking aspects of this quarter was the sheer volume of strategic partnerships and deals announced. Nvidia is aggressively embedding itself into every corner of the tech ecosystem, ensuring that whether you are building a robot, a chatbot, or a self-driving car, you are doing it on CUDA. The interesting thing is that these partnerships are often circular: Nvidia invests in a business, takes an ownership stake, and that business then buys compute from one of the many hyperscalers, who in turn buy Nvidia chips themselves. It might seem a bit fishy at times, but it certainly benefits Nvidia. And it isn’t limited to tech companies working on AI; this technology has value in fields such as drug discovery as well.
The AI Model Builders Nvidia solidified its dominance with the smartest companies in the room.
- OpenAI: Nvidia is working on a strategic partnership to help them build and deploy at least 10 gigawatts of AI data centers. Jensen also hinted at the opportunity to invest in the company, solidifying a relationship that began when he hand-delivered the first DGX to them back in 2016.
- Anthropic: For the first time, Anthropic is officially adopting Nvidia infrastructure. This is a massive win, as they are establishing a deep technology partnership to optimize Anthropic’s models for CUDA.
- xAI & Musk: The call highlighted the massive Colossus cluster and a new partnership where xAI and Humain will jointly develop a network of world-class GPU data centers.
Traditional Tech & Enterprise It’s not just about the hyperscalers.
- Fujitsu & Intel: In a move to bridge ecosystems, Nvidia announced collaborations with both Fujitsu and Intel to integrate their CPUs with Nvidia GPUs via NVLink Fusion. This ensures that legacy architectures can still play nice with Nvidia’s acceleration.
- Uber: On the physical AI front, Nvidia is partnering with Uber to scale the world’s largest Level 4 ready autonomous fleet using the new Nvidia Hyperion reference architecture.
- AWS & Humain: A newly expanded partnership involving the deployment of up to 150,000 AI accelerators.
That’s not the end but I have to cut it off somewhere.
The Not So Perfect: Finding Flaws in a Diamond
No earnings report is perfect, and there are a few areas that bears might latch onto, though they feel like nitpicks in the grand scheme of things.
Gaming is Flat While the Data Center business rockets to the moon, the segment that started it all—Gaming—is effectively treading water. Revenue was $4.3 billion, up 30% year-over-year but down 1% sequentially. The company attributed this to channel inventories normalizing ahead of the holiday season. It’s clear that Nvidia is now an AI Data Center company that also happens to sell gaming cards. Plus, complaining about a sequential decrease despite 30% Y/Y growth might be just looking for negatives here.
Supply Constraints The problem with having demand that is off the charts is that you actually have to build the things. CFO Colette Kress admitted that supply availability for networking products varied during the quarter. Furthermore, as they ramp Blackwell, they are ordering to secure long lead-time components, which drove inventory up to $19.8 billion. The constraint isn’t demand; it’s physics and manufacturing capacity. That limits growth in the near term, but it also suggests this isn’t the end of the road for NVDA. There is plenty of runway ahead, especially with new deals worth tens to hundreds of billions of dollars announced seemingly every other week. Nobody in this space wants to be left behind, and at least for now and the foreseeable future, that means partnering with NVDA and buying their chips.
Gross Margin Compression While margins beat guidance, they are down year-over-year (from 74.6% to 73.4%). This is due to the transition from the mature Hopper architecture to the new, more complex Blackwell systems. As the product mix shifts to these newer, more expensive-to-build systems, margins are seeing slight pressure, though the company expects to hold them in the mid-70s next year.
The China Situation H20 sales (their China-specific chip) were insignificant in the quarter. Jensen expressed disappointment that they cannot ship competitive products to China due to geopolitical restrictions. For now, they are effectively assuming zero revenue from China for data center compute in their forward guidance. If that changes, there’s upside; if not, then hey, I guess I’ll take 22% sequential growth and be happy about it.
Highlights from the Q&A
The analyst Q&A session is often where the real interesting data is found, and this call was no exception. Jensen Huang and Colette Kress fielded questions that ranged from supply chains to the laws of physics.
On Visibility and the $500 Billion Number When asked if the previously touted $500 billion in Blackwell and Rubin revenue visibility was still accurate, Colette Kress didn’t blink. She confirmed they are on track for that forecast and, perhaps more importantly, noted that the number will likely grow. She cited new deals, like the one with Anthropic, as net new opportunities that weren’t even in that original forecast.
On Scaling Laws Jensen gave a masterclass on why the demand for compute isn’t slowing down. He explained that we are seeing three scaling laws happening simultaneously:
- Pre-training: Making models smarter by training them on more data.
- Post-training: Using compute to refine models (Reinforcement Learning from Human Feedback).
- Inference: This is the big one. Newer reasoning models think before they answer. They run simulations and chain-of-thought processes during inference. This means inference is no longer a lightweight task; it requires massive computation. As Jensen put it, “How could thinking be easy?”
On Competition and ASICs An analyst asked if customers might switch to custom ASICs (chips designed specifically for one task). Jensen’s rebuttal was a five-point defense of Nvidia’s moat. He argued that because Nvidia accelerates every phase of AI (pre-training, post-training, and inference) and runs every model (from dense to mixture-of-experts), it offers a versatility that a specialized ASIC simply cannot match. He also noted that you’re not competing against chips; you’re competing against teams, implying that Nvidia’s engineering velocity is unmatched. NVDA has proven to be the leader in this space and they don’t seem to have any clear competitors at the moment.
On Margins When pressed about the costs of these complex new systems, Colette reassured investors that despite rising input costs, they are working to hold gross margins in the mid-70s for the next fiscal year.
Valuation and The Future: $4 Trillion and Beyond?
So, where does Nvidia go from here?
The company is banking on a future where the entire world’s computing infrastructure is rebuilt. Jensen described a vision where $1 trillion of traditional data centers (CPU-based) will be modernized into accelerated computing factories (GPU-based) over the next few years.
But the real kicker is the agentic AI revolution. We are moving from AI that just retrieves information to AI that performs tasks. This requires a level of compute that makes today’s infrastructure look like a pocket calculator.
This type of earnings report should show investors that AI is far from dying and, if anything, might just be getting started. That should be good news for certain stocks that recently corrected on fears of overvaluation and bubbles bursting. It’s quite possible that the market has gotten ahead of itself in certain areas and needs to re-price things a little, but long term there’s plenty of room to find winners if that were to happen.
Still, in a gold rush, you’re better off being the company selling the shovels than being a miner. While some miners may strike it rich in the long run, the ones selling shovels (especially with near-50% free cash flow margins) will do well no matter what, and that’s NVDA.
The Metrics that Matter:
- Visibility: The company has visibility into $500 billion of revenue from Blackwell and Rubin through the end of calendar 2026.
- Guidance: For Q4, they expect revenue to hit $65 billion (plus or minus 2%). That implies sequential growth of 14% on top of an already massive base and is above what analysts were expecting.
- The Cycle: With the Rubin platform (the successor to Blackwell) already on track for 2026, Nvidia is proving it can execute an annual product cadence, effectively rendering competitors obsolete before they even launch.
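As a quick sanity check, the guidance math above works out as follows (a back-of-the-envelope sketch using only the round figures quoted in this article):

```python
# Back-of-the-envelope check of the Q4 guidance math quoted above.
q3_revenue = 57.0   # Q3 FY26 revenue, in $B
q4_guidance = 65.0  # Q4 revenue guidance midpoint, in $B

implied_sequential_growth = (q4_guidance / q3_revenue - 1) * 100
print(f"Implied sequential growth: {implied_sequential_growth:.0f}%")
# prints "Implied sequential growth: 14%"
```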
The Verdict Nvidia isn’t priced like a traditional hardware company because it isn’t one. It is the utility company for the AI age. While risks regarding supply chains and geopolitics remain, the company’s execution is nearly flawless. They are capitalizing on three simultaneous platform shifts—accelerated computing, generative AI, and agentic AI—all while maintaining margins that would make a software company jealous.
While a $4.4T market cap seems expensive, this is a company with fantastic margins that will likely generate $100B in free cash flow this fiscal year, with a realistic potential to grow that number 50% next year. That means you’re paying roughly a 3.5% forward free cash flow yield, which, while not cheap, isn’t too bad for a company growing its top line so quickly, with quite a lot of runway ahead if the AI revolution is really as impressive as many of these businesses think it could be.
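That forward free cash flow yield can be reproduced with the article’s round numbers (a rough sketch; the inputs are estimates, not reported figures):

```python
# Rough forward FCF yield using the article's round numbers (estimates).
market_cap = 4400.0    # ~$4.4T market cap, expressed in $B
fcf_this_year = 100.0  # estimated current-year free cash flow, $B
assumed_growth = 0.50  # assumed ~50% FCF growth next year

forward_fcf = fcf_this_year * (1 + assumed_growth)  # $150B
forward_yield = forward_fcf / market_cap * 100      # in percent
print(f"Forward FCF yield: {forward_yield:.1f}%")
# prints "Forward FCF yield: 3.4%"
```

The result (~3.4%) lines up with the roughly 3.5% cited above; the small gap comes from rounding the inputs.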
Still, most investors who hold an S&P 500 or Nasdaq index fund already have substantial NVDA exposure given its massive growth in recent years. NVDA and some of the other AI names have been behind the great performance of those indexes lately. NVDA makes up around 8% of the S&P 500 and almost 10% of the Nasdaq, so weighting it more heavily beyond that might be a risky bet for investors.
After all, the company is already north of $4.4T, so every 10% bump adds a Mastercard, Netflix, or Costco to its valuation. With how it’s growing and the business it’s in, I don’t expect $4.4T to be the end, but I’m happy to just get my exposure via index funds, where it’s usually the largest holding.
However, if you believe that AI is the next industrial revolution, Nvidia’s earnings report just confirmed that they are the ones selling the steam engines, and for now they show no sign of stopping. It also suggests that other companies in the semiconductor ecosystem, like TSM, ASML, and AMAT, will continue to benefit as well, but NVDA is clearly best of breed for now unless one of the few competing chips closes the gap.
Disclosure: This is not investment advice. Please talk to a qualified financial professional before making any investment decisions.


