Digital Edge CEO outlines strategy for meeting AI demands
The AI revolution has increased demand for data centres, especially with the advent of generative AI. However, with AI still considered by many to be in its infancy, data centre operators will have to keep adjusting their business strategies as the technology evolves.
According to Samuel Lee, Chief Executive Officer of data centre operator Digital Edge, flexibility is not an issue. For him, being a young player in the market is actually an advantage, given the rapidly evolving nature of AI.
“Since Digital Edge was established in 2020, there has been a step change in the size and scale of IT deployments around the world. From the expansion of global cloud services to the rise of AI, the implications for the data centre industry have been wide ranging. As a young company, we are well positioned to adapt our business to meet this demand and are actively investing in new technologies to future-proof our facilities for the AI era,” he said.
The current catchphrase for AI data centre infrastructure seems to be “the bigger, the better,” Lee observed.
“Only a few years ago we were talking about 10-20MW deployments; today we’re talking about 100MW+ and in North America some operators are even building up to 1GW campuses. Scalability is particularly key as AI applications often need to ramp up deployments quickly,” he said.
Due to these developments, Digital Edge is focusing on large-scale campuses, like its site in Incheon, which has a 100+MW capacity, and its site in Mumbai, which has a 300+MW capacity. The company currently has 17 data centres spread across six countries: Japan, China, India, the Philippines, South Korea, and Indonesia.
Even with these projects underway, Lee admitted that access to power is a recurring challenge in the data centre industry. According to him, simply training an AI chatbot can use as much electricity as a neighbourhood consumes in a year.
“Finding a consistent and reliable source of power across the APAC region while also meeting sustainability targets is not easy. Data centre providers are increasingly looking at onsite electricity generation or forming partnerships with renewable energy developers to address this challenge,” he shared.
To this end, the company partnered with Peak Energy to pursue renewable energy production projects, which will help power the data centres.
Additionally, data centre operators need to be able to cater to much higher power rack densities, Lee noted. Citing a recent report from JLL, the CEO said that average rack density for hyperscale data centres is forecast to increase 35 percent in the next two years, with AI GPUs largely fuelling this increase.
“This poses challenges for operators around data centre cooling, with new technologies in this space highly costly and constantly evolving. We are actively exploring new advancements in this area, and our recently launched facility in Seoul has been designed to be AI-ready, able to support a cabinet density of up to 130kW when a liquid cooling solution is used,” he said.
Aside from power, one of the most important considerations for data centre operators is land area. In Singapore, for example, finding available land is difficult, given the size of the island nation. Therefore, companies like Digital Edge are looking at neighbouring countries to build their infrastructure.
“For AI training models, a central location is less important, meaning investments will go to those geographies that offer sizable land banks and cheap power. As a result, we are already starting to see AI data centre clusters emerge in places such as Navi Mumbai in India and Johor in Malaysia,” Lee said.
Meanwhile, for AI inference deployments, whereby the trained AI tool generates information from live data, low latency is crucial.
“Many generative AI applications are real-time in nature, so having the compute power as close as possible to the end user is crucial. That’s why in parallel, we are seeing growing demand for smaller edge data centres in central locations,” the CEO added.
Strategy-wise, Digital Edge has diversified its data centre assets, from edge data centres such as its recently launched 23MW facility in downtown Jakarta, to larger campuses such as the 300+MW development in India, which is better suited to training AI workloads such as large language models.
Because data centres are power-intensive, operators are constantly searching for ways to save energy. One example of this is the industry’s shift from air to liquid cooling solutions.
Digital Edge, for its part, has deployed Nortek’s StatePoint Liquid Cooling technology, and has also developed a single closed facility water system design coupled with a hybrid cooling tower.
“These innovations not only enable us to optimise the performance of our data centres for AI applications, but also reduce both water and power consumption to help us achieve our ESG goals,” Lee said.
Meanwhile, the use of lithium-ion batteries in data centres has come under scrutiny following a spate of high-profile fire incidents. In response, Digital Edge’s engineering team has partnered with energy storage developer Donghwa ES to develop the Hybrid Super Capacitor (HSC). According to Lee, the HSC technology not only significantly reduces the risk of fire, but also offers other advantages such as minimal maintenance needs, a significantly wider operating temperature range, and the ability to recharge in minutes instead of hours.
“This technology has the potential to revolutionise data centre ancillary power generation for AI workloads and we are excited to explore this further in 2025,” he noted.
Looking ahead, Lee anticipates that the demand for data centre infrastructure in Asia-Pacific will only continue to grow, especially with data privacy laws requiring data to be stored onshore.
“This, coupled with customer demand for lower latency, means we will continue to see requirements for edge data centres in key metros across the region,” the CEO explained.
As for AI, Lee projects that it will grow hand-in-hand with cloud computing: “That’s why we’re also seeing a significant increase in the hyperscale demand that goes alongside these AI deployments; from cloud providers to content delivery networks and OTT media platforms.”
In conclusion, there is no “one-size-fits-all” approach to AI infrastructure, because each customer has its own unique requirements.
“As a result, data centre operators are increasingly having to [integrate] flexibility into their facility design to ensure they can customise product solutions to each deployment. This includes maintaining white space and common infrastructure in your building footprint to provide the flexibility to adapt the design to new technologies as they emerge,” Lee said.