Colette Kress Reveals What's Driving Nvidia's Next Growth Phase Beyond GPUs

When Nvidia’s CFO Colette Kress took the stage at CES 2026, she disclosed a striking metric that reframes how investors should think about the chip giant’s future: nearly 90% of customers who deploy Nvidia AI systems are also purchasing networking products. This attachment rate signals a fundamental shift in how the company is monetizing its position at the center of artificial intelligence infrastructure.

Nvidia has long dominated the market for data center GPUs, but the company’s expansion into networking hardware is now revealing itself as potentially more lucrative than anyone anticipated. The numbers tell the story. In the third quarter of fiscal 2026 alone, Nvidia generated $8.2 billion in networking revenue—a 162% surge compared to the same quarter in the previous year. This category encompasses NVLink (which links GPUs together), InfiniBand switches, and the Spectrum-X Ethernet networking platform. The company confirmed that industry giants including Meta, Microsoft, Oracle, and xAI are all constructing massive AI data centers specifically designed around Nvidia’s Spectrum-X Ethernet switches.
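
As a quick sanity check using only the figures quoted above, a 162% year-over-year increase to $8.2 billion implies a networking base of roughly $3.1 billion in the year-ago quarter:

\[
\text{year-ago quarter} \approx \frac{\$8.2\ \text{billion}}{1 + 1.62} \approx \$3.1\ \text{billion}
\]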

Kress’s CES 2026 Disclosure: The 90% Networking Attachment Rate

Colette Kress’s revelation about the networking attachment rate carries significant weight. She emphasized that even clients building their own custom AI chips for portions of their deployments continue to rely on Nvidia’s networking infrastructure. This suggests that networking has become a de facto standard—not an optional add-on, but a core component of AI data center architecture.

This finding challenges the conventional wisdom that Nvidia’s value proposition rests entirely on GPU dominance. If customers feel compelled to integrate Nvidia networking regardless of their GPU sourcing strategy, it demonstrates that the company has built a defensible moat in an adjacent market. The 90% figure essentially means that Nvidia’s networking business has become stickier and more deeply embedded in customer deployments than previously understood.

Why AI Networking Demands Are Fundamentally Different

The networking requirements for AI data centers operate on an entirely different scale than traditional cloud infrastructure. For AI training workloads, data throughput between GPUs must remain sufficiently high to prevent compute resources from sitting idle. Even brief delays in data movement can cascade into significant efficiency losses across thousands of GPUs. Similarly, AI inference tasks depend critically on rapid data flow to deliver timely predictions.
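
To make that idle-time point concrete, consider a rough, purely illustrative calculation (the numbers are hypothetical, not drawn from any specific deployment): suppose each training step requires exchanging 40 GB of gradient data over a link sustaining 100 GB/s (roughly what an 800 Gb/s port delivers), and the transfer cannot be overlapped with compute. The communication alone then costs

\[
t_{\text{comm}} \approx \frac{40\ \text{GB}}{100\ \text{GB/s}} = 0.4\ \text{s per step},
\]

so a step with one second of compute spends nearly 30% of its wall-clock time waiting on the network, a penalty repeated across every GPU in the cluster.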

These demands have elevated Nvidia into a formidable position within the Ethernet switching market, particularly in the ultra-high-speed segment. According to IDC data, revenues for 800GbE switches nearly doubled sequentially in the third quarter of 2025. Nvidia now commands 11.6% of the data center Ethernet switch market—a solid third-place position behind Arista Networks and Cisco Systems, but achieved with remarkable speed.

Rubin Platform: Accelerating the Networking Revenue Trajectory

Nvidia’s strategic reveal of the Rubin platform at CES represents a deliberate pivot toward selling fully integrated rack-scale AI systems rather than GPUs and networking components as separate products. The Vera Rubin NVL72, in particular, bundles GPUs, CPUs, and multiple networking technologies into a cohesive 72-GPU configuration designed for deployment by major AI cloud providers throughout 2026.

The platform features the new Spectrum-6 series of Ethernet switches, incorporating ports capable of 800 Gb/s connectivity and delivering up to 102.4 Tb/s of aggregate switching capacity. By packaging networking as an integral part of rack-scale solutions, Nvidia is essentially ensuring that customers cannot easily separate the networking component from their purchasing decision. This architectural approach has the potential to drive networking revenue substantially higher throughout 2026 and beyond.
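
A back-of-the-envelope check on those Spectrum-6 figures, assuming the 102.4 Tb/s is quoted as the one-directional sum of port bandwidths:

\[
\frac{102.4\ \text{Tb/s}}{800\ \text{Gb/s per port}} = 128\ \text{ports per switch}
\]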

The shift toward rack-scale systems represents a crucial evolution in Nvidia’s business model. Rather than relying on customers to assemble infrastructure piecemeal, Nvidia now offers an end-to-end solution where networking adoption becomes nearly automatic. Colette Kress’s 90% attachment rate validates that this strategy is working precisely as intended.

Market Opportunity: From $14.9 Billion to $46.8 Billion

MarketsandMarkets projects that the global AI networking market will expand from $14.9 billion in 2025 to $46.8 billion by 2029, representing a compound annual growth rate of 33.8%. This trajectory suggests that even if Nvidia loses competitive ground in the accelerator market to emerging rivals, the networking segment offers a parallel growth engine with substantial runway.
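
The quoted growth rate can be roughly verified from the two endpoints, treating 2025 to 2029 as four compounding periods:

\[
\text{CAGR} = \left(\frac{46.8}{14.9}\right)^{1/4} - 1 \approx 0.33,
\]

or about 33% per year, in line with the cited 33.8% (the small gap presumably reflects rounding or a slightly different base period in the original forecast).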

The sheer scale of investment flowing into AI infrastructure further reinforces this opportunity. McKinsey estimates that by 2030, global capital spending on data centers will need to reach nearly $7 trillion to meet demand. A disproportionate share of that spending is likely to flow directly to Nvidia, particularly given the company’s integration of networking into its rack-scale solutions.

Investment Outlook: Opportunity Tempered by Execution Risk

Nvidia’s AI infrastructure dominance and expanding networking business create a compelling bull case. However, investors must contend with genuine uncertainties. Predicting AI demand four years forward remains inherently speculative, and the industry’s investment trajectory could exceed what actual AI adoption supports. History shows that capital-intensive infrastructure buildouts frequently overshoot demand—and if that occurs with AI data centers, Nvidia’s revenues could face sudden headwinds.

The company’s networking expansion, anchored by Colette Kress’s disclosed 90% attachment rate and driven forward by Rubin’s architectural integration, positions Nvidia well for sustained growth. Yet the broader tech landscape continues evolving rapidly. What appears certain today could face disruption through technological breakthroughs, competitive pressure, or shifting customer preferences tomorrow.

For investors considering exposure to AI infrastructure, Nvidia remains central to that thesis. But success in this space requires accepting that even dominant market positions can be challenged. The company’s ability to maintain its networking momentum, keep customers within its ecosystem through integrated solutions, and navigate competitive threats will ultimately determine whether this growth engine delivers the outsized returns that current valuations suggest.
