
The landscape of artificial intelligence is shifting from the experimental to the essential. While much of the initial industry focus centered on the massive compute power required for AI training, a new era has emerged: the era of production, or inference. As businesses transition from building models to delivering real-world AI services, infrastructure requirements are evolving rapidly. This article explores the changing customer landscape, specifically the rise of neoclouds and the pivot to inference, and outlines the four critical pillars to look for in a data center partner.
The demand for AI infrastructure is entering a significant transition phase driven by two primary factors: the emergence of specialized cloud providers and a fundamental change in how compute power is utilized.
The Rise of Neoclouds: While hyperscale clouds remain the largest investors, the market is being accelerated by neoclouds, the new generation of specialized cloud computing providers that seek purpose-built, high-density facilities tailored specifically for accelerated workloads. By 2030, neoclouds are forecast to account for nearly 10% of the market, while traditional on-premise data centers are expected to see their share shrink from 29% to just 6%.
From Training to Production: There is a growing distinction between AI training and AI inference. Training involves vast superclusters for one-off model development, but inference—the process of running those models to answer queries—is a recurring, incremental need. According to Structure Research, infrastructure for inference will grow at a staggering 79% CAGR through 2030, far outpacing training's 25%. By the end of this decade, 80% of all AI infrastructure will be dedicated to inference.
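To see how quickly those cited growth rates diverge, the compounding can be sketched in a few lines. This is a minimal illustration of the arithmetic only; the starting value and year span are arbitrary placeholders, not market data, and the 79% and 25% figures are the CAGRs cited above from Structure Research.

```python
# Illustrative compounding of the cited CAGRs (79% inference, 25% training).
# The base value of 1.0 and the 6-year horizon are placeholders, not market data.
def compound(start: float, cagr: float, years: int) -> float:
    """Project a value forward at a constant annual growth rate."""
    return start * (1 + cagr) ** years

years = 6  # e.g. roughly the span to 2030
inference = compound(1.0, 0.79, years)
training = compound(1.0, 0.25, years)
print(f"Over {years} years: inference grows ~{inference:.0f}x, training ~{training:.0f}x")
```

Run on these assumptions, inference capacity multiplies by roughly 33x while training multiplies by under 4x, which is why inference comes to dominate the installed base by the end of the decade.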
This will dictate a different type of data center footprint.
The growth of neoclouds and the rise of inference come on top of existing market trends. Together with ongoing enterprise and cloud growth, they are redefining the critical capabilities and business models of data center infrastructure providers.
Here are four critical capabilities to look for in a data center partner:
Unlike training, which can often be conducted in remote locations, inference requires local infrastructure to minimize latency and share data effectively. This demands a distributed global footprint. In the current market, power has surpassed land as the primary constraint for site selection. As mature hubs become power-constrained, geographically distributed partners become essential. Iron Mountain Data Centers, for instance, has committed to doubling its development pipeline to 2.6 GW over the coming years, and is securing capacity not just in established hubs like Northern Virginia, Miami and Chicago, but also in high-growth emerging hubs like Spain and India.
GPUs require approximately ten times the power density of traditional CPUs. This is driving the move to liquid cooling, which can remove heat 3,000 times more efficiently than air. Accelerated compute facilities must support a mix of Active Rear Door Heat Exchangers and Direct-to-Chip Liquid Cooling to handle densities ranging from 50kW to 200kW per rack. There is also a new level of sophistication required for power design and management. Data center operators should offer Behind the Meter (BtM) capabilities and Battery Energy Storage Systems (BESS) to manage grid constraints and enable load shifting.
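The jump from the low to the high end of those rack densities changes facility power requirements dramatically, which is a back-of-envelope calculation worth making explicit. The rack count below is illustrative; only the 50 kW and 200 kW per-rack figures come from the ranges above.

```python
# Back-of-envelope sketch: total IT load for a hall of high-density racks.
# The 100-rack hall is an illustrative assumption; the 50-200 kW densities
# are the per-rack range cited for accelerated compute facilities.
def hall_load_kw(racks: int, kw_per_rack: float) -> float:
    """Total IT power draw (kW) for a uniform rack deployment."""
    return racks * kw_per_rack

low = hall_load_kw(100, 50)    # 100 racks at 50 kW each
high = hall_load_kw(100, 200)  # the same hall at 200 kW each
print(f"Same 100-rack hall: {low / 1000:.0f} MW to {high / 1000:.0f} MW IT load")
```

The same physical hall can swing from 5 MW to 20 MW of IT load depending on rack density, before cooling overhead, which is why power design, Behind the Meter capability and battery storage now matter as much as floor space.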
As infrastructure scales and AI becomes politicised, mitigating climate impact is non-negotiable. The new Greenhouse Gas Protocol stipulates that clean energy claims must be local and simultaneous with use. Forward-thinking partners are moving beyond simple offsetting to 24/7 carbon-free energy tracking. For example, IMDC has committed to 100% clean power tracked hour-by-hour by 2040. Additionally, look for construction standards like BREEAM, which assess the entire lifecycle of a build, from water use to ecological impact.
In a market where compute power doubles every few months, flexibility is the ultimate asset.
Data center operators must be prepared for "last-minute" specification changes and creative site selection. A standards-based approach using tried-and-tested blueprints allows for quick customization while maintaining operational reliability, but sometimes there is no standardized solution, and one-off designs need to be undertaken for customers. Flexibility with unremitting customer focus will mark out the most successful operators.
You can find more detail on these key considerations for next-generation infrastructure buyers in the new IMDC ebook, “AI inferencing infrastructure: 4 things to look for”.
Iron Mountain Data Centers delivers secure colocation solutions for cloud & AI infrastructure across 30+ locations with industry-leading sustainability.
This Iron Mountain Data Centers (IMDC) ebook gives a quick overview of evolving customer needs and sets out the key things to look for in a data center partner as you build out accelerated inferencing infrastructure to deliver new AI services.
Contact a data center team member today!