Research Notes

Why AI Is Running Into Silver
Published on January 26, 2026
By Jordi Visser

Memory, Electricity, and the Physics of Exponential Growth

Every period of exponential innovation carries a familiar illusion: that progress is limited primarily by imagination, software, or intelligence. In the early phases of a technological wave, this illusion often holds. Capability compounds rapidly, costs fall, and constraints appear to dissolve. History shows, however, that exponential systems do not escape physics. They collide with it. When that collision occurs, progress does not slow because ideas run out, but because bottlenecks emerge in places that were previously invisible. This is why my 2026 outlook paper was titled "The Physical World Upgrade: Why AI's Next Chapter Demands a Hardware Renaissance."

Artificial intelligence has now reached that point. Much of the public debate around AI remains centered on models, algorithms, and compute. The conversation oscillates between optimism about intelligence gains and anxiety over job displacement, regulation, or software valuations. Markets ask when software bottoms will form or when certain data-center trades will work again. Meanwhile, engineers and governments are signaling a different concern entirely.

Two inputs, seemingly unrelated, have begun behaving in strikingly similar ways: high-bandwidth memory in the digital world, and silver in the physical one. Since September, both DRAM prices and silver prices have moved sharply higher. Both are essential to the physical-world upgrade required for AI's next phase. At Davos, Jensen Huang estimated that this global upgrade could reach $85 trillion over the next 15 years, far larger than the $5–8 trillion currently discussed for data centers alone. The real investment wave is not confined to training models or scaling centralized compute, but to rebuilding the infrastructure that allows intelligence to operate everywhere.
Constraint Migration: From Software to Hardware to Materials

AI does not just create abundance in software; it exposes scarcity in the physical world. DDR3, an older generation of DRAM, shows what happens when production exits too early. Silver shows what happens when strategic importance is misunderstood for too long. Both have experienced sharp repricing, supply stress, and rising strategic attention. Both are small in mass, expensive relative to alternatives, and essential to system functionality. Most importantly, both reveal the same structural truth: exponential systems collapse onto their narrowest constraint. To understand why silver now matters in the age of AI, it helps to begin with memory.

Memory as the First Hidden Bottleneck

In the first two years following the launch of ChatGPT, compute appeared to be the dominant constraint. Progress was measured through GPU shipments, capital expenditures, and the assumption that more arithmetic power would translate directly into more intelligence. As models scaled, a different reality emerged. Compute without memory stalls. Arithmetic units increasingly sat idle while waiting for data. Latency, bandwidth, and proximity became more important than raw processing power. Intelligence, it turned out, was not just about calculation; it was about recall, context, and state.

High-bandwidth memory became a choke point not because it was conceptually exciting, but because it sat at the boundary between potential and realization. Without sufficient memory bandwidth, AI systems could not utilize the compute they already had. The constraint was not intelligence itself, but access to intelligence. Silver will prove to be the electrical analog: the interface material that determines whether power infrastructure can keep pace with the intelligence it must serve. As this bottleneck became visible, memory shifted from an afterthought to a strategic asset.
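The "compute without memory stalls" point above is the familiar roofline argument: achievable throughput is capped by the lesser of peak arithmetic capability and memory bandwidth times a kernel's arithmetic intensity. A minimal sketch, with purely illustrative numbers (the accelerator figures below are assumptions, not any vendor's specs):

```python
# Roofline-style sketch: attainable throughput is the minimum of peak
# compute and (memory bandwidth x arithmetic intensity).
# All numbers are illustrative assumptions, not real hardware specs.

def attainable_tflops(peak_tflops, bandwidth_tbps, flops_per_byte):
    """Achievable TFLOP/s for a kernel with the given arithmetic intensity."""
    return min(peak_tflops, bandwidth_tbps * flops_per_byte)

peak = 1000.0  # hypothetical accelerator: 1000 TFLOP/s peak compute
bw = 4.0       # hypothetical HBM bandwidth: 4 TB/s

# Inference-style kernels often reuse each byte only a few times,
# leaving the arithmetic units mostly idle.
low = attainable_tflops(peak, bw, flops_per_byte=2)     # memory-bound
high = attainable_tflops(peak, bw, flops_per_byte=500)  # compute-bound

print(f"memory-bound kernel:  {low:.0f} TFLOP/s "
      f"({100 * low / peak:.1f}% of peak)")
print(f"compute-bound kernel: {high:.0f} TFLOP/s "
      f"({100 * high / peak:.1f}% of peak)")
```

In this toy configuration the memory-bound kernel reaches under 1% of peak compute, which is the sense in which more memory bandwidth, not more arithmetic, unlocks the compute already installed.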
Inference, the Edge, and Data Movement

This shift becomes even more pronounced as AI moves from centralized training toward distributed inference and edge deployment, away from the cloud and closer to the physical world it must sense, interpret, and act upon. Inference is fundamentally a data-movement problem. Performance depends less on raw computation and more on how quickly data can be retrieved, moved, and acted upon under tight energy and latency constraints.

Leaving the cloud changes the nature of the system. In centralized environments, latency can be masked and inefficiencies tolerated. At the edge, distance becomes unforgiving. Intelligence must operate in real time, with limited power budgets, minimal buffering, and near-zero tolerance for failure. Memory must be local, fast, and reliable. Power delivery must be precise. The margin for error collapses.

As inference spreads outward to robots, vehicles, industrial systems, sensors, and autonomous platforms, the system increasingly rewards architectures and materials that perform under continuous stress rather than at peak benchmarks. Reliability replaces raw throughput as the defining metric. As memory became strategic, so too did the physical materials that enable fast, reliable signal transmission. In that sense, silver is not merely an industrial input; it is a scaling constraint for embodied intelligence, linking digital cognition to the physical world it increasingly inhabits.

Electricity as the Physical Analog of Memory

The physical world is now undergoing a parallel transition, and its constraints are already visible. The global economy is not short of energy. Fossil fuels remain abundant. Renewable capacity continues to expand. From a purely volumetric perspective, there is no immediate energy scarcity. But AI systems do not run on energy in the abstract. They run on electricity delivered with extreme precision, density, and reliability, and that distinction is no longer academic.
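The claim that inference is a data-movement problem, and that its constraints are ultimately energy constraints, has a well-known quantitative dimension: fetching an operand from off-chip memory costs orders of magnitude more energy than computing on it. A back-of-envelope sketch using commonly cited ~45 nm estimates (Horowitz, ISSCC 2014), treated here as illustrative orders of magnitude rather than current process figures:

```python
# Rough energy cost of moving vs computing on a 32-bit value.
# Figures are commonly cited ~45 nm estimates (Horowitz, ISSCC 2014),
# used only for their relative scale, not as current-node specs.

ENERGY_PJ = {
    "32-bit float add":   0.9,   # arithmetic
    "32-bit SRAM read":   5.0,   # small on-chip memory
    "32-bit DRAM read": 640.0,   # off-chip memory
}

baseline = ENERGY_PJ["32-bit float add"]
for op, pj in ENERGY_PJ.items():
    print(f"{op:18s} {pj:7.1f} pJ  (~{pj / baseline:.0f}x a float add)")
```

On these figures a single off-chip read costs hundreds of times the energy of the arithmetic it feeds, which is why tight edge power budgets reward local, fast memory and short data paths rather than raw compute.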
Across the data-center ecosystem, shortages have already emerged in the infrastructure required to convert energy into usable electricity. Gas turbines face multi-year lead times. Transformers are constrained by manufacturing bottlenecks and grid backlogs. Switchgear, high-voltage components, and cooling systems are increasingly rate-limiting factors. In many regions, the constraint is no longer compute or capital, but whether power can be conditioned, delivered, and cooled fast enough to support new AI loads. Timing matters. Voltage stability matters. Heat dissipation matters. Conversion losses matter. Failure tolerance is approaching zero.

Electricity plays the same role in the physical world that memory plays in the digital one. It determines whether theoretical capacity can be turned into real performance. Energy without electricity is like compute without memory: abundant, but unusable. Silver sits precisely at this interface.

Silver as an Interface Material

Silver is not an energy commodity. It does not generate power. Instead, it appears at the narrow waist of electrical systems, where electrons are forced through small spaces under high-current, high-frequency, and high-reliability requirements. It is used in contacts, switches, connectors, solders, conductive pastes, power electronics, and photovoltaic metallization. These are not headline components. They are failure points.

What matters about failure points is not volume, but system behavior. As electrical systems scale, performance does not degrade smoothly; it fails discretely. Connectors trip. Switches arc. Data centers go offline. In exponential systems, failures propagate outward from interfaces. They do not average out; they cascade. This is why silver demand rises as systems scale: not because more silver is consumed in bulk, but because reliability margins collapse.

Synchronized S-Curves and Structural Scarcity

Solar power provides the clearest example.
Despite a 95% reduction in silver intensity per watt since 2010, from roughly 400 mg to just 13 mg for modern N-type cells, total solar consumption has quadrupled, from 50 million ounces to 232 mi