Google has developed a new compression algorithm that could sharply reduce the memory needed to run AI models.
If this breakthrough performs as advertised, it could drastically reduce the amount of memory chips needed to support AI.
The impact of artificial intelligence (AI) has been far-reaching, changing the tech landscape as we know it. These next-generation algorithms are promising to streamline time-consuming tasks, resulting in significant increases in productivity. The scramble to adopt AI has shifted the demand for data centers into high gear, and foundries are churning out AI-capable semiconductors as fast as they can make them. Memory chips, including high-bandwidth memory (HBM), DRAM, and NAND, play a critical role in AI processing, yet despite the ramp in production, they remain in short supply and command premium prices.
Alphabet's (NASDAQ: GOOGL) (NASDAQ: GOOG) Google just announced a significant breakthrough in compression technology that could make AI models more efficient, thereby reducing the need for some of these scarce memory chips. That could be bad news for Micron Technology (NASDAQ: MU) and Sandisk Corporation (NASDAQ: SNDK).
In a blog post published last week, Google announced that its scientists had developed an AI memory-compression algorithm, dubbed TurboQuant. "We introduce a set of advanced, theoretically grounded quantization algorithms that enable massive compression for large language models and vector search engines," the scientists wrote in the accompanying research paper.
Perhaps the most stunning claim is that integrating the algorithm reduces memory usage "by at least 6x and delivers up to 8x speedup, all with zero accuracy loss, redefining AI efficiency." Put another way, a 6x compression means a model would need only about one-sixth as much memory, roughly an 83% reduction in the memory capacity required.
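The 83% figure follows directly from the 6x claim: using one-sixth the memory means forgoing roughly five-sixths of it. A minimal sketch of that arithmetic (the bit-width comparison at the end is a hypothetical illustration of how quantization shrinks models, not a figure from Google's paper):

```python
# Illustrative arithmetic only: how a compression factor translates into
# reduced memory demand. The bit-width example is hypothetical.

def memory_reduction(compression_factor: float) -> float:
    """Fraction of memory no longer needed after compression."""
    return 1 - 1 / compression_factor

# Google's claim of "at least 6x" compression:
print(f"{memory_reduction(6):.0%}")  # prints "83%"

# For context: quantizing weights stored as 16-bit floats down to
# 4-bit integers would, on its own, shrink a model's footprint 4x
# before any further encoding tricks.
bits_before, bits_after = 16, 4
print(f"{bits_before / bits_after:.0f}x from bit-width alone")  # prints "4x"
```

The same formula shows why the claimed range matters: 6x compression eliminates about 83% of the memory needed, while 8x would eliminate 87.5%.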
In a post on X, Cloudflare co-founder and CEO Matthew Prince said, "This is Google's DeepSeek." He went on to say there was "so much more room to optimize AI inference for speed, memory usage, power consumption, and multi-tenant utilization."
An early review of the process suggests that not all memory will be affected equally. NAND flash memory will feel the brunt of the impact, while DRAM and HBM will be largely unaffected. That means the news is worse for Sandisk, as nearly all the company's revenue comes from NAND. Micron has much lower exposure to the NAND market, with roughly 21% of its revenue from flash memory in Q2.
Severe shortages of memory chips have driven prices higher, something Micron CFO Mark Murphy addressed during the company's Q2 earnings report earlier this month. "DRAM ... prices increased in the mid-sixties percentage range, driven by tight industry conditions ... NAND ... prices increased in the high-seventies percentage range, driven by tight NAND industry conditions."
If Google's new algorithm performs as promised, it would reduce the need for some forms of memory. Decreasing demand would ultimately lower prices, taking a toll on Micron's and Sandisk's sales.
The news isn't all bad, though. Some commentators suggest that falling memory prices could actually increase demand: if cheaper memory makes AI less expensive to deploy, more businesses may adopt it, ultimately driving memory usage and demand higher.
Only time will tell how this all shakes out.
Developments in the field of AI support a few inescapable conclusions. First, a well-balanced portfolio remains as important as ever in helping insulate investors from volatile share price movements. Second, those who make portfolio decisions based on knee-jerk reactions to fast-moving developments may miss the bigger picture, which could ultimately be a costly mistake.
Investors will want to keep an eye out for future developments.
Danny Vena, CPA has positions in Alphabet and Cloudflare. The Motley Fool has positions in and recommends Alphabet, Cloudflare, and Micron Technology. The Motley Fool has a disclosure policy.