Summary
The hosts discuss Cerebras' upcoming IPO, focusing on its unique wafer-scale chip architecture that delivers dramatically faster AI inference than NVIDIA GPUs. They highlight the strong demand evidenced by 20x oversubscription and an upsized $4.8B raise, the partnership with OpenAI, and distribution via AWS. Both Josh and Ejaaz express bullish views on investing in the IPO, while acknowledging risks such as the high valuation and OpenAI's own chip development.
- Cerebras is going public with a $4.8B IPO, revised up from $3.5B due to 20x oversubscription.
- The company uses a wafer-scale chip with 1.2 trillion transistors and on-chip SRAM, enabling inference up to 20x faster than NVIDIA's Blackwell on some models.
- OpenAI invested $10B and uses Cerebras chips for its Codex Spark model, proving the technology at scale.
- Cerebras has distribution through AWS Bedrock and a partnership with OpenAI.
- The hosts see this as the first major AI hardware IPO and a potential threat to NVIDIA's monopoly.
- Both Josh and Ejaaz plan to buy the stock, viewing it as a long-term investment despite a high price-to-revenue multiple.
- Risks include OpenAI building its own inference chip and the memory ceiling of the current Cerebras architecture.
- The broader AI IPO pipeline includes SpaceX, OpenAI, Anthropic, and others, potentially absorbing significant market liquidity.