Galliano is right. In his book La máquina ingobernable (The Ungovernable Machine), and with the intention of applying a certain rationality to how we use and consume the internet, Alejandro Galliano argues that human beings will eventually have to face a reality: the internet –no matter how infinite it may seem– is finite, and its resources are limited. That is: the internet's current infrastructure has a physical ceiling, even if it doesn't look like it –connection speed, data-processing volume, bandwidth, storage capacity. In short, the general parameters that define the network's physical side, and its limits.
Internet Scarcity: The Galliano Thesis
Galliano's solution, in line with his Marxist outlook, is to limit pointless internet use: ration it, impose quotas. Even though that part of the argument never quite convinces me, there is something very real in the idea of "internet scarcity". The internet's infrastructure does have a physical limit –but it also has an economic one: at what point does a company that needs unlimited resources to function stop being profitable? The symptom Galliano tried to predict as a potential excess demand for the internet (data) now shows up, instead, as excess demand for compute power. Let's see.
Anyone who tried to buy a graphics card during Ethereum's last mining rush knows what I'm talking about. During the later stretch of the pandemic, mining Ethereum with consumer GPUs became profitable for many, which created a bubble: GPU prices shot up, and the compute power allocated to Ethereum grew exponentially. When Ethereum abandoned proof of work and switched to proof of stake, GPU mining stopped being profitable and demand collapsed. The cycle looks ready to repeat: a demand spike, scarcity, and sky-high prices. But this time it's not graphics cards (GPUs). It's RAM.
Galliano not only made this point in his book; he also talked about it on an episode of Sherpas, the podcast by El gato y la caja, and in at least two of the three talks we shared over the year. My only disagreement is this: under the current mode of production, bottlenecks caused by excess demand aren't solved through rationing –they're solved through price increases. Which, sure, is ultimately a form of rationing too (a market one). The problem is what always ends up happening under the current capitalist model: whoever has access to more resources ends up taking the most.
RAM for Everyone (Who Bought Ahead of Time)
The RAM shortage and the sustained rise in prices are not a market anomaly or a passing supply-chain accident. They're the direct consequence of an industrial reshuffle. AI –especially LLMs– introduced a workload that prioritizes memory over almost any other resource. Training and inference move massive volumes of data constantly, demanding extreme bandwidth and near-permanent availability. In that context, memory stopped being a cheap commodity and became a strategic input.
LLMs are memory-bound systems: their performance depends less on raw compute than on the ability to move large amounts of data as fast as possible. That technical detail is key to understanding why the bottleneck isn't in CPUs –not even in GPUs– but in the memory that feeds them.
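To make that concrete, here's a back-of-envelope sketch in Python. Everything in it is an illustrative assumption, not a measurement: a hypothetical 70-billion-parameter model in FP16, rough ballpark figures for a current accelerator, and the simplification that generating one token means reading every weight once.

```python
# Back-of-envelope: why single-stream LLM decoding is memory-bound.
# Every figure below is an illustrative assumption, not a benchmark.

params = 70e9           # hypothetical model size: 70B parameters
bytes_per_param = 2     # FP16/BF16 weights

# Generating one token at batch size 1 is dominated by matrix-vector
# products: roughly 2 FLOPs per parameter, and every weight byte has to
# be read from memory at least once.
flops_per_token = 2 * params
bytes_per_token = params * bytes_per_param

arithmetic_intensity = flops_per_token / bytes_per_token    # ~1 FLOP per byte

# Rough ballpark for a current AI accelerator:
peak_compute = 1.0e15       # ~1 PFLOP/s of FP16 math
peak_bandwidth = 3.35e12    # ~3.35 TB/s of HBM bandwidth

machine_balance = peak_compute / peak_bandwidth  # ~300 FLOPs per byte read

print(f"arithmetic intensity: {arithmetic_intensity:.1f} FLOP/byte")
print(f"machine balance:      {machine_balance:.0f} FLOP/byte")
# intensity << balance: the chip spends its time waiting on memory, not on math.
```

Roughly one floating-point operation per byte moved, on hardware that can do hundreds of operations for every byte it reads: the math units spend most of their time idle, waiting for memory to feed them.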

Memory as the New Star of Home Computing
Within the DRAM family –the basis of modern memory– different types coexist, with very different roles. DDR5 is classic system memory, optimized for capacity and reasonable latency, meant for PCs, traditional servers, and general use. VRAM, in the form of GDDR, prioritizes bandwidth to feed GPUs, sacrificing latency and scalable capacity. And then there's HBM, High Bandwidth Memory, which isn't simply "faster RAM" but a different architecture: vertically stacked chips, ultra-wide buses, and direct integration with high-performance accelerators.
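The difference is easier to see with some quick arithmetic: peak bandwidth is roughly bus width times per-pin transfer rate. The configurations below are typical, illustrative examples of each family (a dual-channel DDR5-6400 desktop, a 256-bit GDDR6 card at 16 Gbps, a single 1024-bit HBM3 stack at 6.4 Gbps), not a survey of the market.

```python
# Theoretical peak bandwidth = (bus width in bytes) x (per-pin transfer rate).
# The configurations are common, illustrative examples of each memory family.

def peak_bandwidth_gbs(bus_width_bits: int, transfer_rate_gtps: float) -> float:
    """Peak bandwidth in GB/s given bus width (bits) and data rate (GT/s)."""
    return (bus_width_bits / 8) * transfer_rate_gtps

configs = {
    "DDR5-6400, dual channel (2 x 64-bit)": (128, 6.4),
    "GDDR6 at 16 Gbps, 256-bit card": (256, 16.0),
    "HBM3 stack, 1024-bit at 6.4 Gbps": (1024, 6.4),
}

for name, (width, rate) in configs.items():
    print(f"{name}: ~{peak_bandwidth_gbs(width, rate):.0f} GB/s")

# DDR5 dual channel : ~102 GB/s
# GDDR6 256-bit     : ~512 GB/s
# one HBM3 stack    : ~819 GB/s, and accelerators mount several stacks
```

A single HBM3 stack moves roughly eight times what a desktop's dual-channel DDR5 can, and the point of the architecture is precisely that accelerators mount several stacks side by side.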
HBM is the memory modern AI needs. Not as a preference, but as a technical requirement. Training and serving giant models without HBM is, in practice, unworkable. The problem is that HBM and DDR5 compete for the same fabs, the same industrial processes, and the same capital. When making HBM yields far higher margins –and data-center customers pay with very few limits– the manufacturers' decision is obvious: prioritize HBM and push consumer memory to the side.
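A crude way to see why: if serving a model means streaming all of its weights out of memory for every generated token, then memory bandwidth puts a hard ceiling on tokens per second. The sketch below reuses the same illustrative figures as above (a hypothetical 70-billion-parameter model in FP16) and deliberately ignores KV caches, batching, and multi-GPU tricks.

```python
# Crude ceiling on single-stream decoding speed: if every generated token
# requires streaming all weights from memory once, then
#     tokens/s <= memory bandwidth / model size in bytes.
# Deliberately ignores KV caches, batching and multi-device parallelism.

model_bytes = 70e9 * 2    # hypothetical 70B-parameter model in FP16

memories = {
    "DDR5 dual channel (~100 GB/s)": 100e9,
    "GDDR6 256-bit (~512 GB/s)": 512e9,
    "HBM3, multi-stack (~3.35 TB/s)": 3.35e12,
}

for name, bandwidth in memories.items():
    print(f"{name}: at most {bandwidth / model_bytes:.1f} tokens/s")

# ~0.7 tokens/s on DDR5, ~3.7 on GDDR6, ~24 on HBM3: the same model goes
# from unusable to serviceable purely on memory bandwidth.
```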
The result is a crowding-out effect. RAM isn't scarce because it can't be produced, but because it has lost strategic priority. PCs, consoles, and home hardware become collateral damage in an industry that reorganizes its output around AI. The current scarcity isn't a bug in the system; it's a sign of saturation. It marks the point where a single class of workload dominates industrial planning and resets prices for everyone else. Galliano is right.
The Impact Is Inevitable
According to Reuters, the sector's three giants –Samsung, SK hynix, and Micron Technology– are considering changes to their production and restructuring their businesses. It's obvious that, given the need for compute –and the constant injection of speculative capital they can rely on– AI companies are in a dominant market position to bend production priorities in their favor. If these companies follow through on the shift toward prioritizing HBM, the consumer RAM market will be hell. Reports of price hikes are already constant. Speculative moves are starting too: suppliers want to get ahead of a high-price market, buying in bulk and amplifying the scarcity.
Every computer ordinary people use has RAM. A phone, a console, a laptop. Are we heading toward a market with less memory than in recent years? Will consumer computing prices skyrocket? These are some of the scenarios we've been anticipating at 421, to avoid walking into this blind. Luis Paz's praise of desktop computers now trades at 10x what it was worth when it was published. Same thing with Soldán’s idea of "compute oligarchs" –and every time I read that, I laugh, because I picture a Kingpin-style mafia guy surrounded by RAM sticks and graphics cards.
On top of this, I'll add two more questions I picked up from reading a bunch of American users on X. If this crisis drags on for more than a couple of years, as everything seems to indicate, are we headed toward a model where compute power is concentrated in large data centers and nobody owns anything like a real computer at home anymore –instead renting that processing via streaming? And are we headed toward a rental model like the one people tried to build a few years ago with those failed "cloud services"?
As one of those tweets puts it: if a PS3 could run something on the level of GTA V with just 512 MB of memory, and now a single Google Chrome tab eats 1.2 GB, what the hell happened to optimization? As long as RAM was a relatively unglamorous, cheap component, nobody seemed very interested in optimizing its use. Will that trend change? Will we go back to optimizing software performance because of a new scarcity?
Either way, it sure looks like 2026 is bringing more expensive –and less powerful– computers for the average user. The compute market is getting more complicated by the day, crises are recurring on shorter cycles, and the physical limits of a connected society are now in plain sight. Galliano is right.