The Race to Build the AI Data Center Factory
Antonio Neri, President and CEO of HPE, with Hiromichi Matsuda, President and CEO of KDDI, announcing their joint plan to launch the Osaka Sakai Data Center by early 2026 – a next-gen, NVIDIA-powered AI hub built for high-density workloads and liquid cooling innovation.
The global data center industry is in the midst of a structural shift, driven by the explosive growth of artificial intelligence. As hyperscalers and enterprises race to deploy GPU-powered AI models, a new type of facility is emerging – purpose-built “AI factories” designed to handle the extreme power and cooling needs of modern accelerated computing.
Traditional air-cooled, low-density facilities are giving way to high-density, liquid-cooled infrastructure, particularly across Asia-Pacific, where demand for AI-ready capacity is accelerating.
Building AI-Ready Data Centers for the GPU Era
In Australia, data center and colocation provider NEXTDC has taken a leading role in building facilities that can accommodate the latest NVIDIA DGX systems. As a certified partner in the NVIDIA DGX-Ready Data Center program, NEXTDC operates “GPU-optimized halls” engineered for workloads that far exceed the 15–20 kW per rack limits of conventional designs.
These halls support densities of 60 kW per rack or more, with liquid cooling options reaching 130 kW per rack, allowing customers to deploy large-scale NVIDIA H100/H200 GPU clusters without overhauling their own infrastructure.
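To put those densities in perspective, here is a rough sizing sketch. It assumes the published ~10.2 kW maximum draw of an NVIDIA DGX H100 node and ignores switches, storage and power headroom, so the counts are illustrative only, not deployment guidance from NEXTDC.

```python
# Rough sizing sketch: how many DGX-class nodes fit within the rack
# power budgets quoted above. The ~10.2 kW per-node figure is the
# published maximum for an NVIDIA DGX H100 system; real deployments
# also budget for switches, storage and headroom, so these counts
# are illustrative only.

DGX_H100_KW = 10.2  # approximate maximum draw of one DGX H100 node

rack_budgets_kw = {
    "conventional air-cooled rack": 20,   # upper end of the 15-20 kW range
    "NEXTDC GPU-optimized hall": 60,      # density cited above
    "liquid-cooled rack": 130,            # liquid-cooling ceiling cited above
}

for label, rack_kw in rack_budgets_kw.items():
    nodes = int(rack_kw // DGX_H100_KW)
    print(f"{label}: {rack_kw} kW budget -> ~{nodes} DGX H100 node(s)")
```

On these assumptions, a conventional air-cooled rack tops out at a single DGX-class node, while a 130 kW liquid-cooled rack can host roughly a dozen, which is why rack density, not just total facility power, is what defines an AI-ready hall.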
The company’s AXON interconnect fabric also supports low-latency, multi-region training and federated learning – capabilities that matter for AI models trained across distributed sites. By enabling an operational expenditure (OpEx) model for high-density AI workloads, NEXTDC lowers barriers for organizations looking to innovate quickly in the GPU era.
Liquid Cooling Gains Ground in APAC
In Japan, KDDI Corporation is testing immersion cooling as a practical approach for AI workloads. Working with Mitsubishi Heavy Industries (MHI) and NEC Networks & System Integration Corporation (NESIC), KDDI conducted a demonstration using single-phase immersion cooling in a compact container setup.
The test achieved a power usage effectiveness (PUE) of 1.05 and cut cooling energy use by 94% compared with air-cooled designs. Immersion cooling eliminates the need for large air conditioning systems and improves energy efficiency, making it an attractive option for AI infrastructure.
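PUE is simply total facility energy divided by the energy delivered to IT equipment, so the headline figures are easy to sanity-check. The sketch below uses an assumed 1,000 kWh/day IT load and an assumed air-cooled baseline PUE of 1.8; neither value comes from the KDDI trial, but on those assumptions a drop to PUE 1.05 removes roughly 94% of the non-IT energy overhead, in line with the cooling-energy reduction reported above if cooling dominates that overhead.

```python
# Power usage effectiveness (PUE) = total facility energy / IT equipment energy.
# A PUE of 1.05 means only ~5% of the facility's energy goes to everything
# other than the IT load. The IT load and the 1.8 air-cooled baseline below
# are assumed example values, not figures from the KDDI demonstration.

def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Return power usage effectiveness for a given energy breakdown."""
    return total_facility_kwh / it_kwh

it_load_kwh = 1_000.0                      # assumed daily IT energy use
immersion_total_kwh = it_load_kwh * 1.05   # total implied by PUE 1.05
air_cooled_total_kwh = it_load_kwh * 1.80  # total implied by assumed PUE 1.80

immersion_overhead = immersion_total_kwh - it_load_kwh    # 50 kWh of non-IT energy
air_cooled_overhead = air_cooled_total_kwh - it_load_kwh  # 800 kWh of non-IT energy

print(f"Immersion PUE:       {pue(immersion_total_kwh, it_load_kwh):.2f}")
print(f"Overhead reduction:  {1 - immersion_overhead / air_cooled_overhead:.0%}")
```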
KDDI’s system maintained Tier IV-level stability throughout the trial and is now being developed with HPE for full commercial deployment by early 2026. For a region balancing rapid AI growth with sustainability targets, such developments highlight a practical path toward energy-efficient, AI-ready data centers.