
Nvidia Just Made a $500 Billion Power Move—And SoftBank Walked Away

Nvidia × TSMC: Powering AI Futures

In just a few days, two seemingly opposite headlines around Nvidia sent a clear signal: the AI arms race is entering a new phase. On one side, Nvidia CEO Jensen Huang flew to Taiwan for a face-to-face with TSMC brass. The result? Reports suggest Nvidia could boost monthly production of advanced 3-nanometer chips by 50%. That’s a major bet on rising AI infrastructure demand, especially with its new GB300 and Rubin platforms set to drive a new generation of rack-scale AI computing.

On the flip side, SoftBank, an early Nvidia backer and AI evangelist, sold its entire stake in the chipmaker for $5.83 billion. That might sound like a bearish move, but SoftBank isn’t leaving the AI party. Instead, it’s doubling down on its own ventures—like the half-trillion-dollar “Stargate” data center project in the U.S.—which may still rely heavily on Nvidia’s silicon.

These moves aren’t contradictory—they’re revealing. Let’s unpack four ways these strategic shifts are shaping the next leg of the global AI boom.

Nvidia’s Foundry First-Mover Advantage With TSMC Changes The AI Supply Chain Math

When Jensen Huang visited TSMC in Taiwan this month, it wasn’t just a routine supplier check-in. Nvidia reportedly asked the chip foundry giant to increase monthly 3-nanometer wafer output by 50%. That’s a massive ask, and it shows just how aggressively Nvidia is positioning itself ahead of what it sees as an impending surge in demand for its next-generation Blackwell and Rubin platforms.

This comes as Nvidia’s current AI factory systems, namely the GB200 and GB300, are in a steep production ramp. According to its latest earnings call, Nvidia is already producing 1,000 full AI racks per week, a figure that’s expected to grow further in Q3. These systems aren’t just powerful; they’re architectural shifts. Nvidia’s new NVLink 72 setup turns an entire rack into a single computing unit, capable of powering reasoning-based agentic AI. That’s a step-change in performance, not an incremental tweak.

By locking in deeper capacity with TSMC, Nvidia isn’t just securing more chips; it’s effectively building a supply-side moat. Lead times from wafer to finished AI rack can run a full year, and in a world where every hyperscaler and sovereign government wants its own “AI factory,” TSMC access becomes a competitive edge. While others scramble to develop alternatives, Nvidia is making sure it can out-ship, out-scale, and out-innovate. That’s a power play.

SoftBank’s Exit Isn’t A Signal Of Weakness—It’s A Reallocation Of Firepower

SoftBank’s decision to dump all 32.1 million of its Nvidia shares for $5.83 billion in October turned heads. After all, this is the same Vision Fund that once held a $4 billion stake in Nvidia back in 2017 and reaped the rewards when the AI boom took off in 2023. So why cash out now?

The answer lies not in doubt—but in deployment. SoftBank isn’t retreating from AI; it’s pivoting to build in AI. Its newly announced $500 billion Stargate project is a sprawling plan to erect massive AI data centers in the U.S., which will inevitably require the kind of GPU horsepower Nvidia provides. This isn’t a break-up. It’s a restructuring.

What makes this shift important is that it signals a change in how AI infrastructure is financed. Instead of holding Nvidia stock, SoftBank wants to own the value chain downstream, from chips to full-stack AI services. That’s a more control-oriented, vertically integrated approach, akin to Amazon’s Trainium chips or Google’s TPUs, which are built for their own platforms rather than sold as merchant silicon.

For investors, the move underscores how different players in the AI ecosystem are carving out their lanes. Nvidia wants to be the arms dealer; SoftBank wants to be the empire builder. Both are playing offense—but in different games.

Wall Street’s Reaction Highlights A New Phase Of AI Sector Maturity

Interestingly, the market barely flinched after SoftBank’s exit. Nvidia stock even rose 3.5% following its TSMC meeting, reversing a 9% five-day slide. That’s telling. In prior years, a major stakeholder selling out would’ve triggered alarm bells. Now? Investors seem more focused on supply-chain news and forward earnings than shareholder turnover.

Part of that stems from Nvidia’s own visibility into 2026 and beyond. On its recent earnings call, management said it sees $3 trillion to $4 trillion in global AI infrastructure build-outs by the end of the decade, and it believes Nvidia can capture a large share of that, especially given how its rack-scale platforms are now foundational to AI data centers.

What also buoyed investor sentiment was the company’s detailed roadmap: Rubin, the next-gen platform following Blackwell, is already in fabrication at TSMC. The company is sticking to its one-year cadence for major architecture upgrades, a pace rivals like AMD and Google may find hard to match. That kind of consistency breeds confidence, especially when paired with Nvidia’s now industry-standard CUDA software stack.

Wall Street analysts have taken note. Citi raised its price target to $220. Melius Research still sees upside to $300. Morningstar bumped its fair value to $225 and sees 45% CAGR through fiscal 2028. Markets may be volatile, but Nvidia’s messaging is anything but.
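
For readers less familiar with the shorthand, a compound annual growth rate (CAGR) is simply the geometric average growth per year between a starting and an ending value. The snippet below is a minimal illustration with made-up placeholder numbers, not Morningstar’s model or any published estimate; it only shows the arithmetic behind a figure like 45%.

```python
# Illustrative CAGR calculation. The inputs are placeholder values,
# not any company's actual or forecast financials.

def cagr(start_value: float, end_value: float, years: float) -> float:
    """Compound annual growth rate between two values over a span of years."""
    return (end_value / start_value) ** (1 / years) - 1

# A hypothetical metric growing from 100 to 305 over three years
# works out to roughly 45% per year.
print(f"{cagr(100, 305, 3):.1%}")  # ~45.0%
```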

The Strategic Tension Between Openness & Control Will Define The Next Wave

What connects the TSMC ramp-up and SoftBank’s exit is a subtle but powerful theme: control versus openness. Nvidia, with its proprietary CUDA software and tightly integrated hardware, is winning by building the most comprehensive, vertically integrated AI stack. From networking to chips to software, it’s all Nvidia—or it’s not optimal.

But the world’s hyperscalers and enterprise giants don’t love vendor lock-in. Amazon has Trainium and Inferentia. Google has TPUs. Microsoft is working with AMD. Everyone wants an alternative—not necessarily to replace Nvidia, but to balance it.

SoftBank’s strategy mirrors this. By selling Nvidia and channeling capital into its own infrastructure efforts, it gains control over design, deployment, and monetization. It’s also a hedge—if supply from Nvidia ever tightens or geopolitics shift (as they did with H20 export restrictions to China), SoftBank won’t be left out in the cold.

Nvidia, for its part, is countering this by making its platform more indispensable, not less. It’s expanding the use cases—from data centers to robotics (with Jetson Thor), from quantum simulations to sovereign AI buildouts. Its systems are showing up in everything from EU-funded supercomputers to Disney’s creative pipelines.

The open-versus-integrated tension isn’t new. But in AI, where switching costs are high and performance gaps are widening, it may be the defining battle of the decade.

Final Thoughts: A Sector In Flux, A Giant Still Expensive

Nvidia’s deepening ties with TSMC and SoftBank’s complete exit from Nvidia stock might look like two diverging paths—but they actually reflect the same reality: the AI boom is evolving. No longer just a race to build chips, it’s now a full-blown ecosystem war, where control of software, infrastructure, and capital flows will determine who dominates.

Investors should weigh this strategic complexity against valuation. Nvidia currently trades at 56.70x LTM P/E, 48.75x LTM EV/EBITDA, and 29.28x LTM P/S, based on data as of November 10, 2025. While those multiples are off the peak levels from earlier this year, they remain high, reflecting robust expectations.
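
Those multiples are straightforward to reproduce if you want to sanity-check them against other data sources. The sketch below uses hypothetical placeholder inputs rather than Nvidia’s reported figures; it simply shows how LTM P/E, EV/EBITDA, and P/S fall out of market cap, net debt, and trailing-twelve-month results.

```python
# Minimal sketch of how trailing (LTM) valuation multiples are derived.
# All inputs are hypothetical placeholders, not Nvidia's actual financials.

def ltm_multiples(market_cap, total_debt, cash, net_income, ebitda, revenue):
    """Return P/E, EV/EBITDA, and P/S from trailing-twelve-month inputs."""
    enterprise_value = market_cap + total_debt - cash
    return {
        "P/E": market_cap / net_income,
        "EV/EBITDA": enterprise_value / ebitda,
        "P/S": market_cap / revenue,
    }

# Example with made-up figures (in $ billions):
print(ltm_multiples(market_cap=4500, total_debt=10, cash=60,
                    net_income=80, ebitda=92, revenue=154))
```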

Will Nvidia grow into that valuation? Possibly. But as SoftBank’s strategic redeployment and Nvidia’s chip ramp illustrate, even the giants of this space are shifting playbooks. Investors, builders, and regulators will all have to adapt.

And as for Nvidia? It’s still building like the boom isn’t over—but it’s not the only one playing to win.

Disclaimer: We do not hold any positions in the above stock(s). Read our full disclaimer here.
