A New Trend in AI is Emerging: Efficiency
Most Important Takeaway
The current pace of AI infrastructure buildout is unsustainable due to massive power shortfalls — 30-50% of data centers planned for 2026 will be delayed because of power constraints, and Morgan Stanley projects a 44 gigawatt shortfall through 2028. AI model and memory efficiency breakthroughs (like Google’s TurboQuant, which could reduce LLM memory needs by 6x) are not just nice-to-haves but absolute necessities for the AI buildout to proceed anywhere close to plan.
Summary
Stocks and Investments Mentioned
- Meta Platforms: Lost two court cases related to social media addiction and misleading safety claims. Faces potential legislative risk if social media is regulated like tobacco. The hosts noted the fines ($3 million split with Alphabet) are immaterial, but the legal precedent could matter long-term.
- Alphabet (Google/YouTube): Also found liable in the social media mental health case. Separately, announced TurboQuant memory compression research that could reduce LLM memory requirements by 6x.
- ARM Holdings: Pivoting its business model to design and sell its own CPUs (fabless, like Nvidia/AMD) rather than just licensing designs. Meta will be the first customer for ARM’s custom silicon.
- Micron and SanDisk: Sold off on the Google TurboQuant news, but the hosts noted Micron is still up 300%+ over the past year and the memory compression only addresses a fraction of total memory needs (key-value cache specifically, not all memory).
- Nvidia and AMD: Referenced as comparisons for ARM’s new fabless chip manufacturing model.
- S&P 500 / broad market index funds: Discussed in the context of dollar-cost averaging vs. lump sum investing.
Actionable Insights
- Do not panic-sell memory stocks on efficiency headlines. Google’s TurboQuant only reduces a fraction of memory requirements (the key-value cache), not total memory needs. Even at one-sixth current usage, demand for memory chips will remain enormous as AI scales. The broader memory supply imbalance is likely to continue.
- Watch AI infrastructure plays through the lens of efficiency, not just scale. The sheer power and resource demands of current AI plans are unsustainable. Companies that solve efficiency problems (ARM with lower-power chips, Google with model compression) may become increasingly important. This is a natural evolution of the technology.
- Monitor social media regulation risk but do not overreact. The tobacco stock comparison is instructive — even decades of litigation and regulation did not destroy tobacco as an investment. The current fines are trivial. However, watch for precedent-setting rulings on appeal and potential legislation banning social media for minors, which could have a more material impact.
- Set up automatic/scheduled investments rather than waiting to buy the dip. A Fidelity study of $5,000 annual investments from 1980-2023 showed: investing on January 1st each year yielded $5.1M, monthly dollar-cost averaging yielded $4.8M, perfectly timing the market bottom each year yielded $5.6M, and picking the worst day each year still yielded $4.2M. The upside of perfect timing (10% better) is not worth the downside risk (18% worse). A hybrid approach — invest 80-90% on a schedule and keep 10-20% as opportunistic cash — is a practical middle ground.
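The 10% and 18% figures fall out of the study's dollar amounts directly; a quick sketch of that arithmetic (using only the figures quoted above, in millions):

```python
# Comparing the cited Fidelity study outcomes (all amounts in $ millions).
jan_first = 5.1       # invest $5,000 every January 1st, 1980-2023
monthly_dca = 4.8     # monthly dollar-cost averaging
perfect_timing = 5.6  # buy the exact market bottom each year
worst_timing = 4.2    # buy the exact market peak (worst day) each year

upside = (perfect_timing / jan_first - 1) * 100   # gain vs. Jan-1 investing
downside = (1 - worst_timing / jan_first) * 100   # shortfall vs. Jan-1 investing

print(f"Perfect timing beats Jan-1 investing by {upside:.0f}%")   # ~10%
print(f"Worst timing trails Jan-1 investing by {downside:.0f}%")  # ~18%
```

Note that even the worst-case gap (18%) is small next to the cost of sitting in cash waiting for a dip that never comes.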
- Use dollar-cost averaging to build positions in individual stocks. Rather than investing a lump sum, spread purchases over several months. This mathematically forces you to buy more shares when prices are lower, removing emotion from the equation.
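The "buy more shares when prices are lower" effect can be seen with a toy example (the prices below are hypothetical, not from the episode): a fixed dollar budget per month makes your average cost per share the harmonic mean of the prices paid, which is at or below their simple average.

```python
# Hypothetical illustration of dollar-cost averaging (DCA).
monthly_budget = 500.0
prices = [50.0, 40.0, 25.0, 50.0]  # hypothetical monthly share prices

shares = [monthly_budget / p for p in prices]  # fixed $ buys more shares when cheap
total_shares = sum(shares)
avg_cost = (monthly_budget * len(prices)) / total_shares  # harmonic mean of prices
avg_price = sum(prices) / len(prices)                     # arithmetic mean of prices

print(f"Average cost per share: ${avg_cost:.2f}")   # $38.10
print(f"Average market price:   ${avg_price:.2f}")  # $41.25
```

The gap between the two averages widens as prices get more volatile, which is exactly when removing emotion from the decision matters most.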
Chapter Summaries
Social Media Litigation and Its Investment Implications
Meta and Alphabet were found liable in court cases related to social media addiction and misleading safety statements. The hosts discuss whether this represents a “tobacco moment” for social media companies. They conclude the fines are immaterial but the legal precedents could lead to legislative action. However, they note it is too early to draw the tobacco parallel and that even tobacco stocks performed well despite decades of regulation.
AI Efficiency Breakthroughs: ARM and Google
ARM announced it will begin designing and selling its own CPUs (fabless model) with Meta as its first customer, marking a significant business model pivot. Google announced TurboQuant, a memory compression method that could reduce LLM memory needs by 6x. The hosts argue these efficiency gains are not just improvements but necessities, given that 30-50% of planned data centers face delays due to power shortfalls and the current AI buildout trajectory is unsustainable.
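To see why KV-cache compression matters but does not erase memory demand, here is a back-of-envelope sketch. All model dimensions below are hypothetical round numbers for illustration, not TurboQuant's actual parameters, and the 6x factor is simply the figure quoted above; model weights and activations sit outside this calculation entirely.

```python
# Hypothetical back-of-envelope estimate of key-value (KV) cache memory.
layers = 80           # hypothetical transformer layer count
kv_heads = 8          # hypothetical number of KV attention heads
head_dim = 128        # hypothetical dimension per head
bytes_per_value = 2   # 16-bit (fp16/bf16) storage
context_tokens = 128_000

# Keys and values (the factor of 2), per layer, per token.
kv_bytes_per_token = 2 * layers * kv_heads * head_dim * bytes_per_value
kv_cache_gb = kv_bytes_per_token * context_tokens / 1e9

print(f"KV cache at full context: {kv_cache_gb:.0f} GB")          # ~42 GB
print(f"After a hypothetical 6x compression: {kv_cache_gb/6:.1f} GB")
```

Even a 6x reduction leaves tens of gigabytes of weights and other state per model replica, which is why the hosts argue the sell-off in memory names was an overreaction.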
Mailbag: Automatic Investing vs. Buying the Dip
A listener asks whether to set up automatic investments or wait for market dips. The hosts strongly advocate for scheduled investing, citing a Fidelity study showing that even the worst possible market timing over 40+ years still produced $4.2M from roughly $220K in total contributions ($5,000 per year, 1980-2023). They recommend a hybrid approach: invest the majority on a regular schedule while keeping a small cash reserve (10-20%) for opportunistic purchases during significant drops.