Jensen Huang · 1.8 · 18 ideas

CEO, NVIDIA
After 1 day:   N/A · 11/15 min ideas
After 1 week:  N/A · 11/15 min ideas
After 1 month: N/A · 6/15 min ideas
0 winning  /  6 losing  ·  6 positions (30d)
Net: -11.5%
Recent positions
Ticker  Dir   Entry    P&L  Date
NVDA    LONG  $178.76       Mar 19
By sector
Stock: 16 ideas, -11.5%
Private: 2 ideas
Top tickers (by frequency)
NVDA 7 ideas · 0% W, -10.7%
MSFT 2 ideas · 0% W, -10.6%
AMZN 1 idea
TSLA 1 idea · 0% W, -13.0%
META 1 idea
Best and worst calls
Huang argues that despite a higher upfront cost for Nvidia's inference factory (~$50B vs. ~$30-40B for alternatives), it generates the lowest-cost tokens due to 10x better throughput. The chip cost difference is a small portion of the total data center cost (land, power, shell, networking, storage, CPUs). The true economic metric for AI infrastructure is the cost per unit of work (token), not the price of individual components. Nvidia's full-stack, system-level optimization and architectural velocity deliver superior throughput and efficiency. This efficiency advantage defends and expands Nvidia's market share against custom ASIC competitors, as customers prioritize total cost of ownership and performance over upfront chip price. The risk is that competitors achieve a comparable or superior architectural leap, collapsing Nvidia's throughput advantage and making its system-level integration less unique.
NVDA All-In Podcast Mar 19, 18:27
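The cost-per-token arithmetic in the idea above can be sketched under assumed numbers. Only the ~$50B vs. ~$30-40B capex and the 10x throughput figures come from the commentary; the baseline lifetime token output is a hypothetical placeholder chosen for illustration.

```python
# Illustrative sketch of Huang's cost-per-token argument, not NVIDIA data.
# Capex figures (~$50B vs. ~$35B midpoint) and the 10x throughput multiple
# are from the commentary; BASELINE_TOKENS is an assumed placeholder.

def cost_per_token(capex_usd: float, lifetime_tokens: float) -> float:
    """Capex amortized over every token the system produces in its lifetime."""
    return capex_usd / lifetime_tokens

BASELINE_TOKENS = 1e15  # hypothetical lifetime output of the alternative system

nvidia_cpt = cost_per_token(50e9, 10 * BASELINE_TOKENS)  # ~$50B capex, 10x throughput
asic_cpt = cost_per_token(35e9, BASELINE_TOKENS)         # ~$30-40B midpoint, 1x

# Higher upfront cost, yet roughly 7x cheaper per token under these assumptions.
print(nvidia_cpt < asic_cpt)
```

Under any baseline, the 10x throughput multiple outweighs the ~1.4x capex premium, which is the crux of the thesis: cost per unit of work, not sticker price, decides the comparison.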
CEO, NVIDIA
Jensen Huang explicitly stated that NVIDIA's growth is accelerating and that the company has high-confidence visibility of over $1 trillion for its Blackwell and Rubin products by the end of 2027. This visibility reflects strong and expanding demand for NVIDIA's core AI and inference technologies, which are central to the company's revenue and earnings growth. LONG on NVIDIA due to the clear acceleration in growth, substantial future order visibility, and strategic positioning at the AI inflection point. Execution risks such as supply chain disruptions, technological shifts, or a slowdown in AI adoption could impair the realized growth.
NVDA CNBC Mar 17, 16:15
CEO, NVIDIA
"We're the only platform in the world today that runs every single domain of A.I.... That allows us to be the lowest cost, the highest confidence platform... That infrastructure investment you could make on NVIDIA, you could make with complete confidence." If NVIDIA is seen as the only complete, lowest-cost, highest-confidence platform for a $1 trillion+ market, it will crowd out investment in competing architectures. Customers making billion-dollar, long-term infrastructure bets will rationally consolidate on the platform they deem safest. This creates a severe headwind for competitors like AMD and Intel, who are left fighting for scraps or specific niches. This is a SHORT thesis (or AVOID thesis for longs) against NVIDIA's primary US-listed competitors, as Huang's commentary suggests a winner-take-most dynamic where NVIDIA captures the overwhelming majority of high-value infrastructure spending due to perceived platform superiority and lower total cost of ownership. Risks: a competitor could achieve a breakthrough in performance or efficiency; NVIDIA could face regulatory action that levels the playing field; or the total AI market could grow so large that even a smaller slice is highly profitable for AMD/INTC.
INTC AMD Bloomberg Markets Mar 16, 19:56
CEO, NVIDIA
"I see through 2027. At least $1 trillion... I am certain computing demand will be much higher than that... We're the only platform in the world today that runs every single domain of A.I... That infrastructure investment you could make on NVIDIA, you could make with complete confidence." The CEO is making an explicit, multi-year, quantitative revenue forecast ($1T+) based on his unique insight into industry demand and NVIDIA's unmatched technical position. This projection, if taken seriously, implies a massive multi-year growth runway and a durable competitive moat, justifying a long-term equity position. This is a LONG thesis based on the CEO's specific, high-confidence financial projection and his claim of NVIDIA's unique and "fungible" platform dominance. The $1T projection is an extraordinarily high target; failure to meet it could lead to significant multiple contraction. Competition from in-house silicon (e.g., Google's TPU, Amazon's Trainium) and external rivals (AMD, INTC) could erode market share. The projection depends on sustained, explosive AI investment which may slow.
NVDA Bloomberg Markets Mar 16, 19:56
CEO, NVIDIA
Nvidia reported 75% revenue growth in its core data center business with 75% gross margins. Huang states AI is in its "third inflection" (Agentic AI), creating demand that is "off the charts" and broad-based across industries, not just hyperscalers. Skeptics argue the "law of large numbers" will cap growth, but Huang's "Agentic AI" thesis implies a total replacement of the software stack, justifying continued exponential capex. The valuation (PE ratio) is noted to be lower than that of other chip peers despite superior growth. LONG. The bottleneck is no longer chip supply but infrastructure, and demand remains uncapped thanks to the shift to agentic systems. Risks: hyperscaler capex exhaustion or regulatory intervention.
NVDA CNBC Feb 26, 18:51
CEO, NVIDIA
Jensen mentions that "OpenAI's Codex" and "Claude Code" are doing "incredibly well in companies all over the world in software programming." Microsoft (via GitHub Copilot and its OpenAI stake) is the primary commercializer of OpenAI's Codex. Jensen confirms this specific use case (coding agents) is scaling globally in the enterprise, which directly drives MSFT's Azure and Copilot revenue. LONG. Confirms product-market fit for MSFT's AI coding tools. Risks: enterprise churn or competition from open-source coding models.
MSFT CNBC Feb 26, 14:34
CEO, NVIDIA
Jensen states AI is in a "third inflection point" driven by "Agentic AI" (agents that reason and act). He notes, "The amount of computing demand is off the charts" because these agents require significantly more processing power than simple chatbots. The market fears "peak AI spend," but Jensen argues the shift to *Agentic* workflows creates a new, higher baseline for compute intensity. If agents are now "doing work" rather than just answering questions, the inference cost per task increases, sustaining demand for NVDA's Blackwell and Rubin chips. LONG. The fundamental demand driver (Agentic AI) validates the rich valuation and continued growth. Regulatory crackdowns on AI agents or supply chain bottlenecks for new chips (Rubin).
NVDA CNBC Feb 26, 14:34
CEO, NVIDIA
Jensen explicitly highlights "Our big partnership with Lily [Eli Lilly]" and notes that "Scientific computing is being completely revolutionized by artificial intelligence." While most investors focus on AI for tech/coding, Jensen identifies BioTech/Pharma as a major growth vertical. LLY is using NVDA's platform to accelerate drug discovery. This validates LLY not just as a GLP-1 play, but as a tech-enabled pharma leader. LONG. LLY is the specific winner named in the "AI for Science" vertical. Clinical trial failures unrelated to AI efficiency.
LLY CNBC Feb 26, 14:34
CEO, NVIDIA
Jensen states, "Car companies that are building autonomous vehicles... The robotaxi era is coming. And so there's a whole bunch of computing being built for that." Jensen confirms that massive compute clusters are currently being built specifically for AVs. This implies the technology is nearing deployment maturity. TSLA (Cybercab/FSD) and GOOGL (Waymo) are the leaders in this space. If NVDA is selling them the compute *now*, the rollout is imminent. LONG. A direct bet on the "Robotaxi era" Jensen predicts. Regulatory hurdles for Level 4/5 autonomy or safety accidents.
TSLA GOOGL CNBC Feb 26, 14:34
CEO, NVIDIA
Despite market fears over spending, Huang argues that demand is "sky high" and the spending is appropriate. We are in the "largest infrastructure buildout in human history." AI is not just a feature; it is fundamentally changing how computing works (from search to shopping to movie recommendations). Therefore, the massive Capex spending by Nvidia's customers is a requirement, not a mistake. Huang cites a fundamental shift in computing architecture as the driver for demand. If the hyperscalers (AMZN, MSFT, etc.) pull back on spending due to shareholder pressure, Nvidia's revenue would take a direct hit.
NVDA CNBC Feb 06, 19:35
CEO, NVIDIA
Both companies are described as having crossed an inflection point where AI is no longer just "curious" but "super useful" and profitable. These companies are currently compute-constrained. If they had twice the hardware, their revenue would quadruple. They are generating "profitable tokens," meaning the cost to produce the AI output is lower than the value they sell it for. Described as "$20 billion run rate companies" with accelerating growth and profitable revenues. These are private assets (hard to access) and face intense competition from open-source models.
OPENAI ANTHROPIC CNBC Feb 06, 18:46
CEO, NVIDIA
The guest investor defends the massive CapEx spending ($660B combined) by comparing it to Amazon investing in AWS in 2008. Jensen points to Meta specifically, noting its earnings have already moved because AI improved its ad targeting and recommendations. The market views the high spending as "burning cash," but the speakers view it as "digging a gold mine": you must spend upfront to extract the gold (intelligence/tokens). Once built, these platforms will generate significantly higher cash flows, much as AWS became a profit engine for Amazon. Evidence cited: Meta's earnings growth driven by AI recommender systems; AWS currently generating $30B/year in profit from investments made starting in 2008. The risk is that the "gold mine" proves empty, i.e., AI applications fail to generate the revenue needed to justify the upfront cost.
MSFT CNBC Feb 06, 18:46
CEO, NVIDIA
Jensen Huang (CEO, NVIDIA) | 18 trade ideas tracked | NVDA, MSFT, AMZN, TSLA, META | YouTube | Buzzberg