"Selling Shovels": Jensen Huang Depicts AI's "Five-Layer Cake", but Who Will Pay for the Trillion-Dollar Feast?
Everyday Observer commentator: Zhao Linan
Editors: He Xiaotao, Chen Junjie, Du Bo | Proofreader: Duan Lian
On March 10, NVIDIA CEO Jensen Huang published a bylined article titled "AI Is a Five-Layer Cake," declaring to the world that artificial intelligence (AI) will inevitably become the infrastructure of the modern age.
In this ambitious piece, Huang argued that AI has become essential infrastructure, much like electricity and the internet. To support that sweeping narrative, he proposed a vivid "five-layer cake" of AI architecture: energy, chips, infrastructure, models, and applications.
Huang's aim is to give this trillion-dollar AI build-out a logically self-consistent industrial paradigm.
As the largest "shovel seller" in this global tech "gold rush," NVIDIA's core interest naturally lies in keeping enthusiasm for AI investment alive. But viewed through the cold lens of finance and business logic, the top of this trillion-dollar "five-layer cake" still lacks enough sufficiently deep-pocketed "payers."
Huang's ultimate narrative as a "shovel seller" is meant to keep the AI gold rush going. In his projection, the computing paradigm has fundamentally shifted: humanity is moving from the era of "pre-packaged software" to an AI-led era of "real-time intelligence," which demands a complete overhaul of the computing architecture. In his framing, every successful application pulls demand down through each layer beneath it (models, infrastructure, chips, energy), all the way to the power equipment that keeps it running.
This logic sounds airtight, but it sidesteps a critical question. The bottom three layers (energy, chips, and infrastructure) are extremely asset-heavy, demanding massive real capital investment, while the application layer at the top is the building's only exit for earning revenue and free cash flow from the outside world (ordinary consumers and non-AI businesses). For the narrative to work, the real economic value created by top-layer applications must cover the enormous depreciation, energy, and R&D costs of the four layers below. So far, the market has not provided a convincing answer.
Among today's AI giants there is a prosperity of "stepping on one's own feet to lift oneself up", a self-bootstrapping loop. If the application layer cannot generate enough external revenue, where does the money fueling the current global AI frenzy come from? The answer hides in the financial statements of Silicon Valley's giants.
The current AI boom largely runs as a deeply interlocked internal financial cycle. Take Microsoft's partnership with OpenAI: Microsoft invests heavily in OpenAI, and a significant portion of that investment comes back as compute credits for Microsoft's cloud services. To train ever-larger models, OpenAI must burn through those credits, a considerable part of which is booked as revenue for Microsoft's cloud business. Armed with this impressive financial report, Microsoft shows Wall Street that its AI strategy is succeeding and earns a higher valuation; flush with funds, it then buys more GPUs from NVIDIA, expands its AI server fleet, and begins the cycle with OpenAI anew.
On the financial statements, this closed loop amounts to spending the same money twice. Is this infrastructure frenzy driven by unstoppable real demand for AI applications across industries, or by an arms race among giants scrambling to stake out positions?
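The accounting effect of the loop described above can be made concrete with a toy calculation. All the figures below are invented purely for illustration; they are not actual Microsoft, OpenAI, or NVIDIA financials, and the variable names are hypothetical:

```python
# Toy model of the circular financing loop described above.
# All numbers are hypothetical and chosen only to illustrate the mechanism.

investment = 10.0      # cloud vendor invests $10B in a model lab
credit_ratio = 0.8     # assume 80% of that returns as compute credits

# The lab spends its credits on the vendor's cloud, and the vendor
# books that spend as cloud revenue, even though the cash originally
# came from the vendor's own investment.
cloud_revenue = investment * credit_ratio

# No outside customer has paid anything yet.
external_cash_in = 0.0

# Net cash entering the loop from the outside world.
net_new_cash = external_cash_in + cloud_revenue - investment

print(f"Reported cloud revenue: ${cloud_revenue:.1f}B")
print(f"Net external cash entering the loop: ${net_new_cash:.1f}B")
```

The point of the sketch: reported revenue can grow even while the loop as a whole consumes cash, which is exactly why the impressiveness of the financial report says little about external demand.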
The litmus test for an AI bubble is whether the application layer can generate sufficient cash flow. Whether the AI industry ultimately escapes a bubble burst depends neither on how high the underlying computing power is stacked nor on how large the model parameters grow; all the pressure ultimately lands on the application layer.
Autonomous driving, humanoid robots, drug-discovery platforms: these visions of the future are beautiful indeed, but a wide commercialization gap separates them from consumers and businesses actually paying out trillions of dollars. If the costs AI applications save, or the extra profits they create, cannot cover the computing bills owed to cloud vendors, and if ordinary consumers will not pay monthly subscriptions of tens of dollars for assorted AI assistants, then this trillion-dollar infrastructure may become a superhighway with no traffic.
As Huang stated at the end of his article, the direction is clear: AI is becoming the infrastructure of the modern world. I do not deny AI's long-term revolutionary significance, but in the short term, as the "shovel sellers" rake in profits and keep throwing out new narratives to sustain capital confidence, investors must stay clear-eyed: until enough real users willing and able to pay for the whole industry chain emerge, this enticing "five-layer cake" remains, for now, a capital feast for a select few.