Sumit Mukhija, CEO of DCI Data Centers, will discuss AI’s effect on digital infrastructure at the Melbourne Cloud and Datacenter Convention on April 3rd. He brings more than 30 years of experience from companies including STT GDC India, Tata Communications, Microsoft, and Cisco; Brookfield appointed him to a leadership role at DCI in Sydney.
Mukhija sees a core ‘essence’ shared by data centers globally, with differences driven by local factors such as land availability, approval processes, and vendor ecosystems. He observes that Australia’s strict construction standards contrast with its reliance on imported components like chillers and UPS systems, unlike regions where manufacturers produce locally and can deliver faster.
DCI is growing in Australia, New Zealand, and South Korea with a focused strategy. Mukhija stresses the company’s dedication to solving customer problems rather than expanding broadly. Over the next 12 to 18 months, DCI aims to target key markets where it can make a significant impact.
AI and Data Centers
AI has been around for decades in forms such as automation, analytics, and robotics. The current wave, powered by deep learning and models with billions of parameters, is drawing attention across industries. The trend resembles the early days of cloud computing, when CIOs sought advice before expanding their cloud usage.
The adoption of AI is changing. Companies are now working out how to support far more powerful equipment in their data centers, and AI is expected to go mainstream more quickly than cloud computing did. The real shift will come when inferencing becomes as important as training and models are deployed at the edge for practical applications, a change Mukhija estimates will occur within the next two to three years.
The use of AI is also influencing data center decisions. Businesses must decide where to locate the infrastructure that supports AI, and rising power densities create cooling challenges. The most common approach today is direct-to-chip liquid cooling, a hybrid of liquid and air cooling. These changing requirements complicate deployment choices among edge locations, existing cloud regions, and dedicated AI/ML zones. The decision depends on whether the workload is training or inferencing, and whether it should be centralized or sit closer to the data source.
Computer and Facility Evolution
Data center deployment has always evolved alongside computer advancements. From centralized mainframes to distributed servers and hyper-converged private clouds, infrastructure has adapted to shifting computing models. The emergence of AI-ready infrastructure further accelerates these changes, particularly in terms of power consumption.
Processor capabilities have advanced exponentially. CPUs now typically draw between 200 and 300 watts, with some reaching 350 watts. AI workloads, which run mainly on GPUs, demand even more, with some GPUs designed to draw up to 1,000 watts each. Power levels like these were once rare but are now common.
Ten years ago, data center racks typically drew 3-5 kW of power. Today even basic enterprise applications require around 10 kW per rack, while AI-focused infrastructure runs at 80-120 kW per rack. Such high-density deployments necessitate a fundamental shift in power and cooling strategies.
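To put those per-rack figures in facility terms, here is a minimal sketch. The per-rack densities come from the article; the 1.4 PUE overhead and the 100-rack facility size are hypothetical assumptions for illustration only.

```python
# Illustrative scaling of the rack densities cited above.
# Per-rack draws (kW) are from the article; PUE and rack count are assumed.

DENSITIES_KW = {
    "legacy (10 yrs ago)": 4,    # midpoint of the 3-5 kW range
    "enterprise today": 10,
    "AI-focused": 100,           # midpoint of the 80-120 kW range
}

def facility_load_kw(racks: int, per_rack_kw: float, pue: float = 1.4) -> float:
    """Total facility draw: IT load scaled by an assumed PUE of 1.4."""
    return racks * per_rack_kw * pue

for label, kw in DENSITIES_KW.items():
    # A hypothetical 100-rack hall at each density
    print(f"{label:>20}: 100 racks -> {facility_load_kw(100, kw):,.0f} kW")
```

The point the numbers make: moving the same 100-rack hall from legacy to AI density multiplies the facility's power draw by roughly 25x, which is why power and cooling strategy has to change, not just scale.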
Strategic and Financial Implications
Higher power densities mean higher initial costs. Mukhija estimates that AI-driven data center infrastructure requires 30-40% more upfront investment, along with higher ongoing operational costs. However, he believes total cost of ownership will decrease due to efficiency gains, such as lower chiller usage and improved energy consumption, resulting in a reduced PUE (Power Usage Effectiveness).
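PUE itself is a simple ratio: total facility energy divided by the energy delivered to IT equipment, with 1.0 as the theoretical ideal. The sketch below uses hypothetical energy figures (not DCI's) to show how reduced chiller usage lowers the ratio.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy; 1.0 is ideal."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical numbers: for the same 1,000 kWh of IT load, liquid
# cooling trims the cooling overhead and so lowers the PUE.
air_cooled = pue(total_facility_kwh=1_500, it_equipment_kwh=1_000)     # 1.5
liquid_cooled = pue(total_facility_kwh=1_200, it_equipment_kwh=1_000)  # 1.2
print(f"air-cooled: {air_cooled:.2f}, liquid-cooled: {liquid_cooled:.2f}")
```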
The shift also requires hiring specialized personnel, particularly for liquid cooling systems, which introduce additional plumbing, sensors, and flow meters. This is more than a technical change; it is a strategic shift that requires close collaboration between operators and customers. Security concerns are heightened as liquid cooling brings water directly into data halls, increasing technician foot traffic. Mukhija emphasizes the need to revise contractual agreements to address these new risks.
AI in Data Center Operations
While much discussion focuses on how AI will affect data center infrastructure, Mukhija points out its potential to improve data center operations. Automation has always been important for controlling temperature and making cooling systems more efficient, but AI can go further by analyzing large volumes of sensor data to optimize performance in real time.
Traditionally, Building Management Systems (BMS) primarily monitored facilities. With AI, operators can correlate alarms in real time, predict when maintenance is needed, and enhance security measures. AI-powered analysis can prevent system breakdowns, identify the root causes of problems, and optimize energy use on the spot.
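As a flavor of the sensor-data analysis described above, here is a deliberately simple statistical sketch, not DCI's actual system: it flags sensor readings that deviate sharply from their recent baseline, the kind of signal a predictive-maintenance pipeline would act on. The temperature trace and thresholds are hypothetical.

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=10, threshold=3.0):
    """Flag readings more than `threshold` standard deviations from the
    trailing window's mean (a toy stand-in for AI-driven analysis)."""
    flags = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            flags.append(i)
    return flags

# Hypothetical inlet-temperature trace (deg C) with a spike at index 12.
temps = [22.0, 22.1, 21.9, 22.0, 22.2, 22.1, 22.0,
         21.9, 22.1, 22.0, 22.1, 22.0, 27.5]
print(flag_anomalies(temps))  # -> [12]
```

A real deployment would replace the z-score test with learned models over many correlated sensors, but the workflow is the same: establish a baseline, detect deviation, raise an alarm before the deviation becomes a failure.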
Newer data centers are already using AI in their management processes, though this development receives less attention. Mukhija believes that over time AI will play an even bigger role in making operations more efficient and, ultimately, in transforming how data centers are managed.