
Revolutionizing Data Centers: The Transformative Power of AI

Published July 24, 2024

AI Seeing Strong, Sustained Growth

The evolution of AI applications, particularly generative AI, has been rapid in the past couple of years and shows no sign of stopping. We are at the dawn of a future where most aspects of life and business are likely to be impacted by AI in some way. Indeed, a June article in Forbes notes that the AI market is estimated to grow from US$86.9 billion in revenue in 2022 to a staggering US$407 billion by 2027.

With new AI applications being rolled out each year, this in turn fuels strong demand for digital infrastructure, including here in Asia. Data centers are required to support both the training of AI models such as large language models (LLMs) and inference deployments, where a trained AI model applies its training to live data, e.g. when you send a request to ChatGPT. The AI evolution presents both challenges and opportunities for the data center industry; this blog looks at some of the main impacts and how Digital Edge is positioned to meet them.

1. Increased demand on an unprecedented scale

We are seeing a rapid acceleration in the scale of AI deployments, with customers talking about 50MW-100MW capacity deployments and beyond, often much larger than what we have historically seen for cloud computing. For training AI models, location is less important, so investments will go to areas where large plots of inexpensive land are available for sizable hyperscale campuses. On the other hand, for AI inference, latency is a very important factor, so in parallel we are seeing growing demand for smaller edge data centers in central locations. Many generative AI applications are real-time in nature, so having the compute power as close as possible to the end user is crucial.

This means the growth of AI is likely to be a key demand driver for the data center industry; Structure Research estimates that by 2026 the global data center market will be worth more than US$100 billion, with Asia accounting for half of this. Meanwhile, many commentators believe this is a conservative estimate given the unprecedented scale of AI, with the Asian AI market already projected to be worth US$81.9 billion this year. In particular, South and Southeast Asia, including India, Indonesia and Malaysia, have the potential to become AI hubs, attractive for hyperscale deployments thanks to relatively inexpensive land and power, favorable conditions for importing GPU chipsets, and existing and planned subsea cable infrastructure.

2. The power consumption conundrum

Perhaps the most significant challenge that AI represents for the data center industry is around power consumption. Both training and inference workloads rely on specialized AI acceleration hardware to handle the massive parallel processing and intensive calculations needed: TPUs (Tensor Processing Units) and GPUs (Graphics Processing Units). Compared to CPUs (Central Processing Units), these excel at parallel processing but are very power hungry.

An illuminating example is the power consumption of a standard NVIDIA AI GPU. Just a few years ago this hardware consumed around 300W, while the current NVIDIA H100 GPU consumes up to 700W. NVIDIA's upcoming Blackwell range is even more power intensive, with the B200 having a peak consumption of 1,200W and the GB200 (combining two B200 GPUs and one Grace CPU) expected to consume up to a whopping 2,700W. Finding a consistent and reliable source of power for these deployments across data centers in the region, while adhering to ever-increasing pressures around sustainability and ESG requirements, is the big question on everyone's mind.
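To put these per-GPU figures in context, here is a quick back-of-the-envelope estimate of rack-level power draw. The server configuration (eight GPUs per server, four servers per rack) and the 1.5x overhead multiplier are illustrative assumptions, not vendor specifications:

```python
def rack_kw(gpu_watts, gpus_per_server=8, servers_per_rack=4, overhead=1.5):
    """Rough rack power estimate in kW. 'overhead' is an assumed
    multiplier covering CPUs, memory, networking and fans."""
    return gpu_watts * gpus_per_server * servers_per_rack * overhead / 1000

# Per-GPU figures quoted above: ~300W a few years ago, 700W (H100), 1,200W (B200)
for label, watts in [("prior generation", 300), ("H100", 700), ("B200", 1200)]:
    print(f"{label}: ~{rack_kw(watts):.1f}kW per rack")
```

Even under these simplified assumptions, a single rack of B200-class hardware draws several times the power of a typical enterprise rack, which is why power sourcing and cooling dominate the conversation.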

3. Liquid cooling for higher power density

With increases in power consumption come related challenges in terms of the higher power density of data center deployments. According to JLL's Data Centers 2024 Global Outlook, average rack density today stands at 12.8kW for enterprise and colocation data centers and 36.1kW for hyperscale data centers. However, this is forecast to increase to 17.2kW and 48.7kW respectively in 2026, an increase of around 35% in just two years. This is a markedly faster pace of increase than we saw before AI entered the scene.
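As a quick sanity check, the roughly 35% two-year increase follows directly from the ratios of the forecast and current densities (using 17.2kW as the 2026 enterprise/colocation figure):

```python
# Rack density figures (kW): today's average vs. the 2026 forecast,
# as quoted from JLL's Data Centers 2024 Global Outlook.
densities = {
    "enterprise/colocation": (12.8, 17.2),
    "hyperscale": (36.1, 48.7),
}

for segment, (today, forecast_2026) in densities.items():
    growth = (forecast_2026 / today - 1) * 100
    print(f"{segment}: {growth:.0f}% increase by 2026")
```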

This requires a rapid shift to new cooling methods such as direct-to-chip and immersion liquid cooling. In other words, as Clifford Chance's 2024 Data Centre Outlook puts it, something that was almost 'futuristic' a few years ago will soon be a 'must-have' for new data center builds. This of course brings added challenges, such as installation complications, site constraints and floor loads, requiring data center operators to rethink data center design and construction with a renewed focus on enhanced water and energy efficiency.

4. The Importance of connectivity

Finally, it’s important to remember that generative AI tools require access to significant data sets to train and continuously improve their capabilities following initial deployment. This makes both low latency and rich connectivity important factors for AI inference deployments. In a regional context, this will favor forward-looking data center providers that can offer edge facilities in key metros and locations across Asia, with rich interconnectivity options.

Therefore, we expect to see continued investment in the connectivity elements that sit alongside the data center itself. For example, Digital Edge recently increased its investment in Indonet, Indonesia’s first commercial ISP, to further leverage its dark fiber, cloud and network service assets for the benefit of our colocation customers in the country.

 

How is Digital Edge Positioned for the AI Evolution?

As a young company, Digital Edge is well placed to adapt to meet the ever-evolving demands of AI.

On the issue of increasing demand for the right data center capacity in the right locations, we are well positioned with our extensive platform across Asia. As noted earlier, the real-time nature of many generative AI applications means that they need to be deployed at the edge, close to the user, in major urban locations, which mirrors our strategy of modern edge colocation data centers located in major metros across Asia.

We have an existing network of 17 data centers in six countries, including our newest 23MW EDGE2 facility, opened in Jakarta earlier this year, and our 36MW SEL2 facility in Seoul, which will open in the autumn. We also have significant capacity in the pipeline, including a 300MW campus in Navi Mumbai, ideal for hyperscale and AI deployments.

Regarding power consumption and the sustainability challenge it poses to data center operators, Digital Edge has a clear ESG strategy to address this, including targets around constructing highly energy efficient ‘green’ data centers, which we covered in another blog post earlier this year. We also recently announced a new partnership to deliver renewable energy solutions for our data centers across Asia Pacific to ensure the power we do use at our facilities has minimal carbon impact.

Turning to the challenge of increasing power density brought about by the rise of AI applications, Digital Edge has a highly experienced engineering team that is at the forefront of innovation in this area. Our two latest sites to open, NARRA1 in Manila and EDGE2 in Jakarta, are both AI ready, able to support higher power density deployments where needed. We have already deployed cutting-edge liquid cooling technology from Nortek Data Center Cooling at both these sites, and are actively exploring other new advancements such as direct-to-chip cooling.

Finally, on the importance of connectivity for AI inference deployments, Digital Edge is well positioned as we have a strategic focus on offering rich interconnectivity solutions within our data centers, including Cross Connect, Cross Link and our Edge Peering Internet Exchange (EPIX), as covered in this earlier blog post. In addition, our new TYO7 facility, opening in Tokyo next year, is designed as an interconnect-focused project, bringing highly sought-after colocation capacity to the heart of downtown Tokyo.

I hope this blog gives an overview of the many ways in which AI is impacting the data center industry. We will explore some of these aspects in more detail in future blog posts, but for now, if you’re interested in learning more about any of our facilities and our AI capabilities, please reach out to us at sales@digitaledgedc.com.

Larry Tam
Senior Vice President, GTM