Open those sleepy eyes. I’ve been watching Micron for a while, and the more I dig into it, the more it feels like the market is still pricing this company like it’s the same old cyclical memory stock from 2018. Meanwhile, the entire AI industry is about to consume absurd amounts of memory for the next decade.
Everyone wants exposure to AI through GPUs. NVDA, AMD, accelerators, whatever. But GPUs don’t actually work without huge amounts of memory bandwidth feeding them data. Training large models and running inference requires insane amounts of DRAM and high bandwidth memory. That’s the part of the stack people seem to forget about, and Micron sits right in the middle of it.
What really caught my attention was how strong the actual numbers have already become. Micron just reported about $13.6B in revenue, up from roughly $8.7B a year ago, and earnings of about $4.78 per share compared with $1.79 last year. Operating cash flow for the quarter was over $8B. Those are not numbers from a company that’s waiting for AI demand to show up; the demand is already here and it’s ramping fast.
Then management dropped guidance for the next quarter and it was even more aggressive. They’re expecting roughly $18.7B in revenue and around $8.40 in EPS. At current prices that puts the stock somewhere around 12x that annualized earnings run rate, which honestly doesn’t seem that crazy for a company tied directly to the biggest infrastructure buildout in tech right now.
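For anyone who wants to see where the ~12x comes from, here’s a back-of-envelope sketch. It assumes the guided $8.40 quarterly EPS simply repeats for four quarters and uses the ~$388 share price mentioned later in the post — both are the post’s numbers, not a forecast.

```python
# Back-of-envelope run-rate multiple: annualize the guided quarterly
# EPS, then divide the share price by it. Inputs are from the post.
quarterly_eps = 8.40
share_price = 388.0

annualized_eps = quarterly_eps * 4          # $33.60 run-rate EPS
run_rate_pe = share_price / annualized_eps  # price / annualized EPS

print(f"annualized EPS: ${annualized_eps:.2f}")
print(f"run-rate P/E:   {run_rate_pe:.1f}x")  # ~11.5x, i.e. roughly 12x
```

Obviously annualizing a single guided quarter is crude — memory earnings are lumpy — but it shows the multiple isn’t pulled out of thin air.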
The part that makes the story even more interesting is high bandwidth memory. HBM is the specialized memory that sits next to AI GPUs and feeds them data fast enough to run large models. Without it, the GPUs basically choke on data throughput. Micron said they’ve already sold out their entire HBM supply for 2026, including next generation HBM4. When a company is literally booked out years ahead in one of the most important components in AI infrastructure, that’s worth paying attention to.
The industry projections around this market are also pretty wild. Estimates have the HBM market growing from around $35B in 2025 to about $100B by 2028, which is roughly a 40% annual growth rate. That’s before even considering how fast data center demand could accelerate if AI adoption continues at the current pace.
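The ~40% figure is just the compound annual growth rate implied by those two endpoints. A quick check, using the $35B (2025) and $100B (2028) estimates from above:

```python
# Implied CAGR from $35B in 2025 to $100B in 2028: three years of
# compounding between the two endpoints quoted in the post.
start, end, years = 35.0, 100.0, 3

cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")  # ~41.9%, so "roughly 40%" checks out
```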
At the same time supply across the memory industry is still tight. Micron has said they still can’t meet all customer demand, and pricing is already moving up. Some industry estimates suggest DRAM prices have jumped close to 90% quarter over quarter, with NAND prices up more than 50%. When memory pricing moves like that, the companies producing it tend to see big margin expansion.
They’re also pushing new products designed specifically for AI data centers. Micron recently announced a 256GB SOCAMM memory module built for high performance computing and AI workloads. The claims were pretty impressive: significantly lower power usage, a smaller physical footprint, and meaningfully faster performance for large language model inference. In an environment where data centers are constantly fighting power limits, improvements like that matter.
So the obvious question is: if the fundamentals look this strong, why does the stock still get hit sometimes?
Part of it is just history. Memory stocks have burned investors for decades because the industry used to be brutally cyclical. Every time pricing peaked, supply eventually flooded the market and margins collapsed. A lot of investors still assume that pattern will repeat no matter what.
The other reason is macro noise. Semiconductor stocks are extremely sensitive to sentiment. War headlines, interest rate fears, AI rotations, analyst upgrades or downgrades, all of that can move the stock in the short term even if the long-term demand story hasn’t changed.
To be fair, the bear case isn’t crazy either. Memory is still cyclical to some degree. If AI spending slows down, or if hyperscalers pause their infrastructure buildouts, Micron would definitely feel that. Expectations going into earnings are also pretty high, so even strong results could cause volatility if investors decide growth is peaking.
But when I zoom out and look at the bigger picture, the demand side of the equation is hard to ignore. AI infrastructure is scaling rapidly and every single GPU cluster needs massive amounts of memory bandwidth to function. As models get bigger and inference workloads expand, the amount of memory required per system keeps increasing.
**So in a weird way, Micron becomes a sort of second-order AI play. If NVIDIA sells more GPUs, memory demand rises. If hyperscalers build more AI data centers, memory demand rises. If large language models continue getting bigger, memory demand rises again.**
The market still tends to treat Micron like a boom and bust commodity stock, but the environment around it is starting to look very different. Revenue growth is accelerating, margins are expanding, HBM supply is already locked in years ahead, and pricing across the industry is improving.
Maybe the market is right and memory cycles will always come back to bite investors eventually. But if AI infrastructure spending keeps ramping the way it has been, Micron might be sitting in one of the most important bottlenecks in the entire stack.
Not financial advice. I just like the company selling the memory that all these trillion dollar AI models need to run.
My $MU Position as of today:
I’m currently holding 506 shares of Micron with an average cost of $410.91. At the current price around $388, my equity position is worth roughly $196k, making up about 78% of my portfolio.
I’m also positioned with $17k in options exposure, focused on upside over the next few months:
**April 17 Calls**
- 2x $450 calls
- 2x $420 calls
- 1x $400 call

**July 17 Position**
- 2x $400 / $460 call debit spreads
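If you want to sanity-check the position math above, it ties out. This uses the share count, ~$388 price, and 78% portfolio weighting stated in the post:

```python
# Position math check: equity value of the shares and the total
# portfolio size implied by the 78% weighting. All inputs from the post.
shares = 506
price = 388.0
portfolio_weight = 0.78

equity_value = shares * price                        # ≈ $196k
implied_portfolio = equity_value / portfolio_weight  # ≈ $252k total

print(f"equity value:      ${equity_value:,.0f}")
print(f"implied portfolio: ${implied_portfolio:,.0f}")
```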
TL;DR: Micron becomes a sort of second-order AI play. If NVIDIA sells more GPUs, memory demand rises. If hyperscalers build more AI data centers, memory demand rises. If large language models continue getting bigger, memory demand rises again.