AI’s new power brokers: Ramp’s chief economist and the 24-yr-old taking on Big AI

Watch on YouTube ↗  |  March 12, 2026 at 20:35  |  36:17  |  CNBC

Summary

  • Anthropic's Claude is rapidly taking enterprise market share from OpenAI, with 1 in 4 businesses on Ramp now paying for it, up from 1 in 25 a year ago.
  • OpenAI experienced its worst month ever in Ramp's index (down 1.5%), as 70% of head-to-head enterprise matchups now go to Anthropic.
  • The Pentagon's attempt to blacklist Anthropic inadvertently acted as a massive PR campaign, doubling the company's paid subscribers.
  • Private-equity firms are adopting AI faster than non-PE-backed companies, using top-down mandates to drive productivity in legacy sectors like retail and healthcare.
  • The next frontier of AI is mathematical reasoning and formal code verification, which requires massive, continuous inference compute, driving hundreds of millions in startup funding directly toward cloud and hardware providers.
Trade Ideas
Ara Karazian, Chief Economist, Ramp (2:25)
"Most of the money is now going to Anthropic... 70% of head-to-head matchups against OpenAI versus Anthropic now go to Anthropic." Anthropic is a private company, but Amazon and Google are its primary strategic investors and cloud providers. As Anthropic aggressively captures the enterprise market and displaces OpenAI as the default choice, AMZN and GOOGL stand to capture the underlying cloud compute revenue and ecosystem lock-in. LONG. The enterprise shift toward Claude directly benefits the hyperscalers backing Anthropic. Risk: OpenAI releases a highly anticipated next-generation model (e.g., GPT-5) that rapidly reclaims the performance crown and enterprise preference.
Ara Karazian, Chief Economist, Ramp (13:10)
"OpenAI had its worst month ever in your index, down 1.5%... most of it is driven by new businesses that are buying AI for the first time now going for Anthropic." Microsoft's AI premium and enterprise narrative are heavily tied to OpenAI's dominance. If OpenAI is consistently losing new enterprise deployments to Anthropic, Microsoft's Copilot and Azure OpenAI services may face stronger-than-expected competitive headwinds and pricing pressure. WATCH. The data shows a clear inflection point away from OpenAI as the default enterprise choice, which warrants caution on MSFT's near-term AI growth metrics. Risk: Microsoft's distribution advantage through Office 365 and Windows is so deeply entrenched that underlying model preference doesn't dent its top-line revenue.
Ara Karazian, Chief Economist, Ramp (14:59)
"VC-backed companies are more likely to use AI than PE-backed companies, which are more likely to use AI than all other companies... A PE firm might be working with a large retail chain. It might be working with a large hospital network." Large private-equity firms like Blackstone (reportedly in talks to tie up with Anthropic) have the board seats and top-down authority to mandate AI adoption across massive portfolios of non-tech companies. This gives PE firms a unique operational moat to drive productivity gains, margin expansion, and higher exit valuations compared with standard public companies. LONG. PE firms that successfully force AI integration across legacy industries will generate outsized alpha. Risk: AI integration in legacy businesses (hospitals, retail) faces severe regulatory, data-privacy, and cultural hurdles that could delay or destroy the expected ROI.
Karina Hong, Founder, Axiom (25:16)
"We're also, you know, scaling inference... scaling inference almost has no wall... I think there will be compute costs." The next wave of AI development is moving beyond simple chatbots into formal mathematical verification and agentic reasoning. These processes require models to "think" step by step, which demands massive, continuous inference compute. Startups are raising hundreds of millions of dollars specifically to cover these compute costs, directly benefiting Nvidia's hardware dominance. LONG. The shift from pre-training to inference-heavy reasoning models creates a virtually limitless ceiling for GPU compute demand. Risk: hyperscalers successfully develop and deploy custom silicon (ASICs) that offloads inference workloads from Nvidia GPUs at significantly lower cost.

This CNBC video, published March 12, 2026, features Ara Karazian and Karina Hong discussing AMZN, GOOGL, MSFT, BX, and NVDA. Four trade ideas were extracted by AI with direction and confidence scoring.
