MEMORY LANE: AI’S ROAD LESS TRAVELED
(MU), (NVDA)
Did you know that a single advanced AI model training run devours more memory than 10,000 high-end smartphones combined?
While Wall Street is obsessing over NVIDIA's (NVDA) chips, it's completely ignoring the memory manufacturers supplying the digital equivalent of rocket fuel that makes these silicon beasts actually work.
I was slapped in the face with this reality last week when an old Stanford buddy dragged me to his AI startup in San Francisco. Their modest-looking server rack (cost: a heart-stopping $2.3 million) was crammed with hundreds of terabytes of high-bandwidth memory, most sporting the Micron (MU) logo.
“Without these memory chips,” my friend’s CTO confessed after his third bourbon, “our fancy AI models would be about as useful as a Ferrari without gas.”
Yet the market is treating memory stocks like they’re selling typewriter ribbons in the iPhone era.
NVIDIA has morphed into a $3 trillion behemoth while Micron bounces around like a ping-pong ball between $70 and $125.
It’s currently loitering at $98, suffering from the same market schizophrenia that drives investors to pay 120x earnings for money-losing AI startups while ignoring the very companies providing their lifeblood.
Let me be blunt: This disconnect is creating one of the juiciest AI investment setups of 2025, assuming you have the patience to wait for the right moment to pounce.
First, the raw numbers – then I'll tell you why they're about to get a lot more interesting. Micron's last quarterly report showed EPS of $1.56, handily beating estimates.
Looking ahead to June 25th’s report, analysts expect $1.59 in EPS and 9.6% sequential revenue growth to $8.83 billion. Not bad for a supposedly boring memory maker.
Now, I’m not going to sugarcoat the obvious – bottom-line growth is crawling at 1.9% while the top line gallops at 9.6%.
Gross margins sit at 36.79%, well south of their 47.28% glory days in 2021-2022. The bean counters are fretting about Idaho fab startup costs and NAND underutilization. Yawn.
But here’s what’s making me salivate: Memory is traditionally the semiconductor industry’s most schizophrenic sector, but AI is creating a once-in-a-generation structural shift that’s about to blow up the traditional cycle.
High-bandwidth memory used in AI accelerators sells for a mouth-watering 5-10x premium over conventional memory, and demand is going absolutely ballistic.
Meanwhile, options traders are apparently smoking something potent.
Micron’s implied volatility is lounging at just 46.7 – in the lowly 44th percentile of its range – despite the stock bouncing around like a kangaroo on amphetamines.
The IV is 17.5% below the 20-day historical volatility of 56.6, which is market-speak for “options are dirt cheap right now.”
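For readers who want to check that "dirt cheap" claim, here's a quick back-of-the-envelope sketch using only the figures quoted above (the 46.7 implied volatility and 56.6 historical volatility are the article's numbers, not live market data):

```python
# Sanity-check the gap between implied and realized volatility cited above.
iv = 46.7   # implied volatility, annualized % (from the article)
hv = 56.6   # 20-day historical volatility, annualized % (from the article)

# How far option prices sit below what the stock has actually been doing
discount = (hv - iv) / hv * 100
print(f"IV is {discount:.1f}% below 20-day HV")  # IV is 17.5% below 20-day HV
```

When implied volatility trades well under realized volatility, option sellers are charging less than recent price swings would justify – the crux of the "options are dirt cheap" argument.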
And there’s another catalyst that Wall Street’s algorithm-sniffing geniuses haven’t properly digested.
The Trump administration just rescinded Biden’s AI diffusion rule on May 13th, potentially blowing open the doors for U.S.-made AI hardware.
While pencil-pushers debate the policy implications, I’m thinking about the tidal wave of memory orders this could unleash as global AI development accelerates.
The growth projections would make even the most jaded venture capitalist drool. Fiscal 2025 consensus shows EPS of $6.99 – a face-melting 437.56% year-over-year explosion.
For 2026, some analysts are predicting up to $15.75 per share. With a forward P/E of 14x, Micron is practically on the clearance rack compared to other AI darlings.
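The valuation math is easy to verify from the numbers already on the table – the $98 share price, the $6.99 fiscal-2025 consensus EPS, and the quoted 437.56% growth rate (all from this article, not independently sourced):

```python
# Check the article's valuation arithmetic.
price = 98.00        # current share price
eps_fy25 = 6.99      # fiscal-2025 consensus EPS
growth_yoy = 437.56  # quoted year-over-year EPS growth, %

# Back out the prior-year EPS implied by the growth figure
implied_prior_eps = eps_fy25 / (1 + growth_yoy / 100)
forward_pe = price / eps_fy25

print(f"implied prior-year EPS: ${implied_prior_eps:.2f}")  # ~$1.30
print(f"forward P/E: {forward_pe:.1f}x")                    # ~14.0x
```

Both figures line up: $6.99 on a ~$1.30 base is a ~437% jump, and $98 divided by $6.99 lands right at the 14x forward multiple.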
So what’s my advice? Resist the urge to back up the truck immediately.
For the options junkies among you, the June 27th $96 calls at $5.90 look tempting – that’s leveraged upside for just 6% of the share price.
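If you're eyeing that trade, it's worth running the cost and breakeven math yourself – this sketch uses the strike and premium quoted above and ignores commissions and assignment mechanics:

```python
# Cost and breakeven for the June 27th $96 calls quoted above.
spot = 98.00     # share price
strike = 96.00   # call strike
premium = 5.90   # quoted option premium

cost_pct = premium / spot * 100   # capital at risk vs. buying shares outright
breakeven = strike + premium      # price the stock must reach by expiry
move_needed = (breakeven / spot - 1) * 100

print(f"premium is {cost_pct:.1f}% of the share price")      # ~6.0%
print(f"breakeven ${breakeven:.2f}, a +{move_needed:.1f}% move")  # $101.90, ~+4.0%
```

So the "6% of the share price" framing checks out, with the caveat that the stock needs roughly a 4% move by expiry just to break even.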
Current shareholders might consider some cheap put protection, given the market’s bizarre underpricing of potential volatility.
But the real money will be made by those with the discipline to wait for the inevitable sector-wide freakout that sends Micron spiraling toward $70-80. That’s when you strike.
I’ve seen this movie before – three times since 2018 – and the ending is always the same: massive gains for those who buy when others are panicking.
My buddy’s bourbon-loosened CTO made a prediction that kept me awake that night: “In five years, we’ll need 50 times more memory capacity for AI than what exists today.”
When I ran those numbers, they were so preposterous I had to check them twice. But every major industry projection points to the same conclusion.
Remember those 10,000 smartphones worth of memory I mentioned? That’s for today’s pedestrian AI models.
Tomorrow’s models will make these look like pocket calculators, creating an insatiable memory appetite that only a handful of companies can satisfy.
When margins inevitably recover and Micron’s cycle turns positive – as it always does – the stock will go vertical faster than you can say “I should have listened to John.”
The real money in gold rushes was never made selling picks and shovels – it was made selling the dynamite. In the AI gold rush, memory isn’t the shovel – it’s the explosive that makes the whole operation possible.
Just make sure you’re buying it when others are too terrified to light the fuse.