Data Center Technology Evolves to Catch Up With AI

Artificial intelligence helped spark the boom in data centers, and now it's helping run more of them remotely

Generative artificial intelligence is causing the demand for data centers to explode, bringing with it a host of operational and energy challenges. Proptech firms are trying to come up with solutions — including using GenAI itself to remotely control data center operations. 

The demand is so great — and data centers’ energy consumption so voracious — that the facilities once measured in square feet or by the number of racks they could accommodate are now measured by their power usage, said Peter Hannaford, CEO of EdgeNebula, a London-based provider of sustainable cloud- and AI-driven technology for data centers.

“They’re measured by megawatts and increasingly in gigawatts,” said Hannaford in an email. “So the large data centers are now being built where the power is. Proptech needs to be ahead of the game or to be prepared to accommodate on-site power generation — and eventually storage when the tech is available.”

Citigroup estimated last May that data centers’ market size was 33 gigawatts in 2023, and predicted compound annual growth of 17 percent to 100 gigawatts in 2030.
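
Those figures are internally consistent: 33 gigawatts compounding at 17 percent a year over the seven years from 2023 to 2030 works out to 33 × (1.17)^7 ≈ 33 × 3.0 ≈ 99 gigawatts, which rounds to Citigroup’s 100-gigawatt projection.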

Despite fears that China’s DeepSeek would negatively impact U.S. data center growth, the opposite is happening. Data center technology spending zoomed 34 percent in 2024 to $282 billion, according to Synergy Research Group, and investments announced in the first month of 2025 alone topped half a trillion dollars.

Among the big cloud-computing AI players, Amazon’s AWS announced Jan. 7 that it was investing $11 billion in its Georgia facility, after saying last year that it would spend $100 billion over the next decade across various locations. In addition, Google Cloud, Google’s cloud computing business, said last year that it had a $37 billion annual run rate, while Meta announced that it is investing $65 billion in 2025 alone.

Staggering numbers.

However, time to market is a major issue in building data centers, which often require two years from start to switch-on, said Hannaford. Like other companies, EdgeNebula is building smaller, 500-kilowatt data centers for itself to take advantage of the many locations in and around metro areas where power is available and customer-rich real estate is abundant. He advises other proptech firms to readjust their portfolios accordingly.

While data centers are growing in number and complexity, fewer facility personnel will be required, whether on-site or remote, Hannaford said. In other words, the day is fast approaching when data centers, even those far from population hubs, could be operated almost entirely remotely.

“There is a popular misconception that building these massive new data centers will result in new jobs being created,” he said. “This is only true during the construction phase. Once built, there are far fewer people employed in a data center than in a commercial office block. Data centers are built for machines, not people.”

The growing need to control data centers remotely stems chiefly from their energy consumption, but also from security requirements, he continued.

“If the data center occupies part of a larger building, then interfaces with building management systems will also be needed,” Hannaford said. “If the data center is part of a bigger national or international network, then fiber connectivity between data centers will be needed.

“With pressure on operators to reduce their carbon footprint coming from governments and certainly customers, remote monitoring of efficiency, efficient power usage, and efficient cooling systems is essential. As well as efficiency, proptech’s main role will be to deliver resilience — as close to 100 percent uptime as possible. So, remote control systems are needed to switch to fallback systems, including power, cooling and connectivity in cases of failure.”
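
Hannaford does not describe an implementation, but the pattern he outlines, remote health checks that trigger a cutover to standby power, cooling or connectivity, is straightforward to sketch. Everything below, from the subsystem names to check_health(), is hypothetical and simulated so the snippet runs on its own:

```python
import random
import time

# Hypothetical sketch of remote failover: poll each critical subsystem and
# switch to its standby on failure. A real deployment would query building
# management or DCIM telemetry; here the health probe is simulated.

STANDBYS = {"power": "ups-b", "cooling": "chiller-b", "connectivity": "fiber-b"}

def check_health(subsystem: str) -> bool:
    """Simulated probe; a real one would read sensors or call a monitoring API."""
    return random.random() > 0.05        # each check fails 5% of the time

def switch_to_standby(subsystem: str, standby: str) -> None:
    print(f"[alert] {subsystem} failed; cutting over to standby '{standby}'")

if __name__ == "__main__":
    for _ in range(20):                  # short demo; a real watchdog runs forever
        for subsystem, standby in STANDBYS.items():
            if not check_health(subsystem):
                switch_to_standby(subsystem, standby)
        time.sleep(0.05)
```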

What proptech can add to support data center mega-growth is “the trillion-dollar question,” Ajey Kaushal, a principal at JLL Spark Global Ventures, said in an email. “While proptech plays a substantial role here, it does so in concert with many other verticals. It offers solutions that help address this problem both inside and outside the box.” 

Some examples of inside-the-box innovations, according to Kaushal, are: optimizing what’s called the “power usage effectiveness” of a data center with companies such as Fluix and Coolgradient; “load shifting” to ensure critical systems receive power during peak demand via companies such as JCI OpenBlue and BrainBox AI; and actively monitoring power quality through Phaidra, EkkoSense and similar firms.
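
Power usage effectiveness, or PUE, is the industry’s standard efficiency ratio: total facility power divided by the power actually reaching the IT equipment, with 1.0 the theoretical ideal. A minimal sketch of the calculation (the numbers are illustrative and unconnected to the firms above):

```python
def power_usage_effectiveness(total_facility_kw: float, it_load_kw: float) -> float:
    """PUE = total facility power / IT equipment power; 1.0 is the ideal."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# Illustrative numbers: a 500 kW IT load drawing 750 kW at the utility meter,
# with cooling, power conversion and lighting making up the 250 kW overhead.
print(round(power_usage_effectiveness(750.0, 500.0), 2))  # 1.5
```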

As for outside the box, Kaushal cited proptech solutions from firms such as Paces, which helps identify land parcels conducive for data center development. Other solutions include processes and materials to build more quickly and more efficiently via companies such as Verrus and Flexnode; management of site power generation and storage infrastructure through firms like EdgeCentres; and help in capital planning for increased power generation or data center expansion such as through Paces and Pearl Street. (JLL Spark is not invested in these firms.) 

Although its platform is not limited to data centers, proptech startup Phaidra gives such facilities an AI platform that includes remote controls, said Jim Gao, the Seattle-based company’s co-founder and CEO. Phaidra provides a closed-loop AI control service that helps operators maintain plant stability, energy efficiency and sustainability in mission-critical facilities of all kinds, he explained.

“We use a type of AI known as reinforcement learning, which is one of the four major branches of machine learning,” said Gao. “It is also the only type of machine learning that learns by doing, like humans. It learns the same way by having software-defined agents that take actions within their environment. In our case, they’ll take actions like changing variable frequency drive pump speeds, or turning chillers on and off. And they learn from their actions to get better at managing the facility, which means more reliable, more safe and more energy efficient.”
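
Gao’s description maps onto the textbook reinforcement learning loop: observe the plant, act, receive a reward, update. The sketch below is a deliberately toy version of that pattern, tabular Q-learning against an invented one-variable cooling model, not Phaidra’s system; every constant, dynamic and name is hypothetical.

```python
import random

ACTIONS = [0, 1, 2]             # pump speed setting: low / medium / high
TEMPS = list(range(18, 29))     # discretized supply-air temperature, deg C

def plant_step(temp: int, action: int) -> tuple[int, float]:
    """Invented plant model: faster pumps cool more but draw more energy."""
    heat = random.choice([0, 1])                 # stochastic IT heat load
    cooling = action - 1                         # -1, 0 or +1 deg C removed
    new_temp = min(max(temp + heat - cooling, TEMPS[0]), TEMPS[-1])
    energy_cost = 0.5 * action                   # energy rises with pump speed
    band_penalty = 0.0 if 20 <= new_temp <= 25 else 5.0  # stay in the safe band
    return new_temp, -(energy_cost + band_penalty)       # reward = negative cost

# Q-table: expected long-run reward of each action in each temperature state.
q = {(t, a): 0.0 for t in TEMPS for a in ACTIONS}
alpha, gamma, epsilon = 0.1, 0.9, 0.1   # learning rate, discount, exploration

temp = 22
for _ in range(50_000):                          # "learning by doing"
    if random.random() < epsilon:                # explore occasionally...
        action = random.choice(ACTIONS)
    else:                                        # ...otherwise act greedily
        action = max(ACTIONS, key=lambda a: q[(temp, a)])
    new_temp, reward = plant_step(temp, action)
    best_next = max(q[(new_temp, a)] for a in ACTIONS)
    q[(temp, action)] += alpha * (reward + gamma * best_next - q[(temp, action)])
    temp = new_temp

# The learned policy: run the pumps harder only as the room warms up.
for t in TEMPS:
    print(t, "->", ["low", "medium", "high"][max(ACTIONS, key=lambda a: q[(t, a)])])
```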

A mechanical engineer, Gao picked up his AI and data center expertise at Google, where he worked as a data center engineer and delivered energy efficiency gains of more than 30 percent. In 2019, he left Google with former Google DeepMind lead engineer Veda Panneershelvam, now Phaidra’s chief technology officer, and Katie Hoffman, formerly of Trane Technologies and now Phaidra’s chief operating officer.

Phaidra closed a Series A round of approximately $60 million in 2024 and now has nearly 100 employees, with customers in the U.S. as well as Australia, Canada, Europe, Singapore and the U.K., according to the company.

“The primary value is energy savings and reliability improvements through AI control,” said Gao, who declined to name clients or say how many companies Phaidra works with. “We work with a lot of the largest data center co-location companies in the world, and increasingly with large tech companies, primarily in the U.S., although we are active with companies in the E.U., as well as Southeast Asia, with a very heavy emphasis on Singapore in that region.”

Data centers’ exponential increase in technical complexity might not be as obvious as their energy needs, said Gao.

“Data centers are becoming ever larger. There’s more and more machines going into them,” he said. “There’s more and more complex workloads going into them where, instead of very stable [central processing units], you’re seeing high-density CPUs with liquid cooling at peak workload. So you have this order of magnitude [more] complex facilities.

“And this is where AI really shines, because AI is good at analyzing and managing massive complexity, as long as it’s underpinned by data, of course.”

Philip Russo can be reached at prusso@commercialobserver.com.