ChatGPT – The Super Assistant Era | BG2 Guest Interview

Watch on YouTube ↗  |  March 15, 2026 at 16:25  |  1:03:41  |  BG2 Pod

Summary

  • ChatGPT has reached 900 million weekly active users (WAU), representing roughly 10% of the global population, with highly unusual "smiling" retention curves indicating users return more frequently as they learn to delegate tasks.
  • The AI industry is rapidly shifting from horizontal chatbots to proactive, agentic models capable of executing long-horizon tasks (e.g., coding, booking flights, professional services).
  • Compute (GPUs) remains a zero-sum bottleneck; as models shift to test-time compute (reasoning), token consumption and GPU demand per user are scaling exponentially even as intelligence costs drop.
  • The next massive startup opportunity lies in AI-driven professional services—companies that embed deeply into enterprise workflows to deliver outcomes rather than just software tools.
Trade Ideas
Brad Gerstner (CEO, Altimeter Capital) · 34:06

  • Quote: "Apple, Reliance with Gemini, those are two big user bases, right? A lot of India, a lot of the iOS users... partnerships are a great way to bring two products together and expose something like ChatGPT."
  • Thesis: Foundational model builders are fighting a capital-intensive war, yet all of them desperately need distribution to reach the next billion users. Apple owns the most valuable distribution network in the world (iOS) and can extract massive value by integrating these models into its ecosystem, driving a hardware supercycle without bearing the foundational training capex itself.
  • Direction: LONG. Apple is perfectly positioned to monetize AI at the edge through hardware upgrades and platform integration fees.
  • Risks: Regulatory scrutiny of default search/AI partnerships, or delays in shipping Apple Intelligence features compelling enough to actually drive consumer hardware upgrades.
Nick Turley (Product Leader, OpenAI) · 39:46

  • Quote: "GPUs are zero sum, and if you don't have more GPUs you really have to figure out how do you make very, very hard trades... demand keeps going up even as prices go down. When you just look at token consumption per user... you see a lot of very GPU-hungry workflows."
  • Thesis: As AI models evolve from simple text generation to reasoning (test-time compute) and autonomous agents, the compute required per user query scales exponentially. This guarantees sustained, insatiable demand for the underlying silicon designed by Nvidia and manufactured by TSMC, regardless of which software layer ultimately wins the consumer war.
  • Direction: LONG. Compute remains the fundamental bottleneck and the most valuable, zero-sum resource in the AI economy.
  • Risks: Geopolitical tensions affecting Taiwan (TSMC), or a sudden breakthrough in algorithmic efficiency that drastically reduces hardware compute requirements.
Nick Turley (Product Leader, OpenAI) · 53:24

  • Quote: "I'm really excited about these companies that are going into companies and getting extremely hands-on and doing effectively professional services with AI, because we've saturated all the emails and you need to get proximate to the problems."
  • Thesis: The initial wave of AI adoption was horizontal (basic chatbots and email drafting). The next wave of massive value creation is vertical and outcome-based: companies like Palantir and ServiceNow that embed AI directly into complex corporate workflows, acting as automated professional services, will capture the next enterprise TAM.
  • Direction: LONG. Enterprise AI integration is moving from basic API calls to deep, hands-on operational execution, heavily favoring established platforms with deep enterprise access.
  • Risks: Enterprise sales cycles are notoriously long, and internal IT departments may attempt to build these integrations in-house using open-source models.
Nick Turley (Product Leader, OpenAI) · 54:27

  • Quote: "Credit where credit is due, I think NotebookLM is awesome and differentiated and helps me learn new stuff. I think it's great."
  • Thesis: When a leading product executive at OpenAI explicitly praises a competitor's product as "awesome and differentiated," it signals that Alphabet is innovating successfully at the application layer. Google's ability to turn its massive data ecosystem into unique, sticky AI tools shows it is not being wholly disrupted by OpenAI and remains a formidable builder.
  • Direction: LONG. Alphabet retains massive distribution and is proving capable of building highly differentiated AI applications that even industry rivals admire.
  • Risks: Cannibalization of traditional high-margin search ad revenue by higher-compute, direct-answer AI queries.
This BG2 Pod episode, published March 15, 2026, features Brad Gerstner and Nick Turley discussing AAPL, NVDA, TSM, PLTR, NOW, and GOOGL. Four trade ideas were extracted by AI with direction and confidence scoring.