2024 Trends in Data Center Design

April 19, 2024
There was a time, not too long ago, when the idea of a data center was a futuristic fantasy. Now the future has arrived, and our use of digital technology, and our need to store and process data, grows exponentially. Bitcoin mining and AI are only making the surge swell. More data means more servers, and more servers mean more data centers. “Demand for data centers is at an all-time high, and it’s been trending that way for a few years now,” says David Fanning, Critical Facilities Leader, Senior Associate, Gensler, Denver.
 
Where companies once owned and operated their own small data centers, there’s a big shift underway. “It’s becoming more consolidated into fewer entities owning much larger areas of the internet landscape,” says Tom Widawsky, Principal, Tech, HDR, New York City.
 

More colocation and hyperscale. Less enterprise.

While enterprise data centers were once the most prevalent data center design projects, their numbers are declining. Many major hospitals, banks, and corporations that once operated their own data centers to keep content on premises have opted to use colocation data centers instead. “Colocation providers are those that actually build data centers for others to occupy,” explains Widawsky.

There are two types of colocation. In retail colocation, the provider builds out and operates the facility, and customers install their IT equipment in the provider’s space. The other option gives the customer more control. “Wholesale is where a developer-type data center operator builds a building as a shell. Maybe it's got power to it. Maybe it has some base infrastructure, but essentially if I'm a customer, I then come in and dictate what the fit-out of the mechanical/electrical systems are going to be. It’s build to suit for me,” shares Jackson Metcalf, Critical Facilities Leader, Industrial & Logistics Leader, Senior Associate, Gensler, Chicago.
 
The other main type of data center is hyperscale. “Hyperscale is typically defined by the big five: Amazon, Apple, Google, Meta, and Microsoft,” says Widawsky. As the name suggests, these are usually massive facilities, often operated by the cloud providers themselves. Both hyperscale and colocation operators, according to Widawsky, are investing in and creating prototype designs. “They're trying to do something that's very repeatable, that's very standardized, and very consistent as far as procurement and size expectations go,” he adds.
 

As density increases, so does the heat.

Data centers are becoming denser, and mechanical systems must keep pace. “Picture your old computer tower at home. There's a fan in there that removes the heat, the harder it works,” explains Fanning. Data centers have rows and rows of servers with fans to move the air and vent the heat that’s generated, an arrangement known as hot-aisle containment.
 
 
“Historically, you would have a density in each rack of servers that typically hovered around three to five kilowatts per rack. What we're seeing now is the advance of these servers. They are ten- to twenty-times more dense. So, [it’s gone] from three kilowatts a rack, up to 30 to 50 and in some cases even higher,” he adds. As heat loads in data centers rise, new means of cooling the equipment are being explored, such as direct-to-chip liquid cooling or immersion cooling, which submerges the server equipment in dielectric liquid to dissipate heat. “The cooling technology is getting closer and closer to the equipment that it’s cooling,” says Fanning. “With the advent of in-rack or direct-to-chip cooling, you’re capturing that heat more efficiently, as opposed to just removing it via air through the facility.”
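To see why those densities force new cooling approaches, consider a rough back-of-the-envelope sketch of the airflow a single rack would need if cooled by air alone. The air properties and the roughly 20 °F temperature rise across the rack are assumptions for illustration, not figures from Fanning:

```python
# Rough sketch: airflow needed to remove a rack's heat with air alone.
# All values are illustrative assumptions, not figures from the article:
# air density ~1.2 kg/m^3, specific heat ~1005 J/(kg*K), and a ~11 K
# (~20 F) temperature rise across the rack.

AIR_DENSITY = 1.2   # kg/m^3
AIR_CP = 1005.0     # J/(kg*K)
DELTA_T = 11.0      # K, assumed air temperature rise across the rack

def airflow_cfm(rack_kw: float) -> float:
    """Airflow (cubic feet per minute) to carry off rack_kw, via Q = m_dot * cp * dT."""
    m_dot = rack_kw * 1000.0 / (AIR_CP * DELTA_T)  # kg/s of air required
    m3_per_s = m_dot / AIR_DENSITY                 # convert to volumetric flow
    return m3_per_s * 2118.88                      # m^3/s -> CFM

for kw in (3, 5, 30, 50):
    print(f"{kw:>2} kW rack -> ~{airflow_cfm(kw):,.0f} CFM")
# Roughly 480 CFM at 3 kW but about 8,000 CFM at 50 kW: moving that much
# air through one rack is why cooling is shifting toward liquid.
```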
 
Data center developers are using a variety of different cooling approaches as testbeds, notes Metcalf, to explore which works best. “It’s an exciting time for us on the design side, because we are getting to explore these in their early developmental stage,” he notes. These new technologies also impact the physical footprint of the data center. “Because you’re not removing hot air, you need less, what we would call, white space,” adds Fanning.
 
Cooling technologies aren’t the only advancements being made, though. “The industry, as the need for efficiency increases, has pushed OEM manufacturers that build these servers to figure out a way to run their servers hotter because that will reduce our cooling requirements,” notes Widawsky. With servers operating at a higher temperature and alternative cooling technologies being deployed, energy efficiency is more easily achieved.
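One way to see why hotter-running servers ease the cooling burden is an idealized Carnot-limit model: the warmer the coolant the servers can tolerate, the less thermodynamic “lift” the cooling plant must perform. The temperatures in this sketch are assumptions for illustration, not figures from the article:

```python
# Idealized illustration (Carnot limit, not a real chiller model) of why
# servers that tolerate higher temperatures cut cooling energy.

def carnot_cop(t_loop_c: float, t_reject_c: float) -> float:
    """Best-case coefficient of performance for cooling a loop at t_loop_c
    while rejecting heat at t_reject_c (both Celsius)."""
    t_cold_k = t_loop_c + 273.15
    t_hot_k = t_reject_c + 273.15
    return t_cold_k / (t_hot_k - t_cold_k)

# Assumed temperatures: rejecting heat to 35 C outdoor air from an
# 18 C coolant loop (traditional) vs. a 30 C loop (hotter-running servers).
for loop_c in (18, 30):
    print(f"{loop_c} C loop -> ideal COP ~{carnot_cop(loop_c, 35):.0f}")
# The warmer loop more than triples the ideal COP, meaning far less
# compressor work per kilowatt of IT load, before real-world losses.
```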

Water management drives decisions.

Climate change and the increase in global population are making water a more universal consideration for architecture projects, including data centers. Widawsky explains: “In the very beginning, when we were looking so hard at energy consumption, the quick solution was evaporative cooling, which became very popular. But as the industry grew and these data centers became larger, water consumption became millions of gallons a day and the industry immediately recognized that that is not a sustainable long-term path.” With a real push toward smarter use of resources, data centers have pivoted toward closed-loop systems, where water is reused and recirculated. “Water's a big deal when it comes to data centers, and alternative means of providing cooling are definitely where I think things are headed,” adds Metcalf.
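The scale Widawsky describes is easy to sanity-check with the physics of evaporation. The sketch below assumes, purely for illustration, a 100-megawatt campus rejecting all of its heat by evaporating water:

```python
# Back-of-the-envelope check on evaporative cooling's water appetite.
# Assumptions (illustrative only, not from the article): 100 MW of heat
# rejected entirely by evaporation; latent heat ~2.26 MJ/kg of water.

HEAT_W = 100e6              # assumed campus heat load, watts
LATENT_HEAT = 2.26e6        # J per kg of water evaporated
GALLONS_PER_KG = 0.264172   # 1 kg of water is about 1 liter
SECONDS_PER_DAY = 86400

kg_per_s = HEAT_W / LATENT_HEAT                             # ~44 kg/s evaporated
gal_per_day = kg_per_s * SECONDS_PER_DAY * GALLONS_PER_KG
print(f"~{gal_per_day / 1e6:.1f} million gallons per day")  # ~1.0
# Even this idealized case evaporates about a million gallons a day,
# which is why large campuses pivoted to closed-loop, recirculating designs.
```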
 
 

Power must be considered early.

Data centers, especially hyperscale, consume a tremendous amount of power. Developers need to know that the existing grid can supply enough power to the data center. “I’ve heard that substation demand is on a five-to-seven-year backlog,” Metcalf says. “That is a major driver in where a data center developer or operator is going to consider building.”
 
The power demands of the new facility need to be considered during initial planning, but it’s equally important to ask whether additional power can be supplied should the site need further development in the future. Widawsky warns: “All of that planning needs to be thought about when you first purchase the site and not later on. You're certainly not doing any major excavation work once you have an active region already operating.”
 
Because of delays in the availability of power, more and more developers are looking at onsite power generation in a variety of forms, such as hydrogen fuel cells. “They're only being adopted in smaller situations today, simply because none of these systems are designed to handle the scale of power that these sites have increased to. Where previous sites may have been 12-to-24-megawatt facilities, today these sites are going to 100 megawatts. At that size, it really strains what some of these alternative energy systems or onsite power systems can provide,” explains Fanning.
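To make Fanning’s scale point concrete, the sketch below counts how many generation modules an onsite system would need as capacity grows. The roughly one-megawatt module rating is an assumption for illustration (fuel-cell ratings vary widely by vendor), not a figure from the article:

```python
import math

# Rough sketch of how onsite generation has to scale with site capacity.
# MODULE_MW is an illustrative assumption; hydrogen fuel-cell module
# ratings vary widely by vendor and are not given in the article.

MODULE_MW = 1.0  # assumed rating of one onsite generation module

for site_mw in (12, 24, 100):
    modules = math.ceil(site_mw / MODULE_MW) + 1  # +1 spare for redundancy
    print(f"{site_mw:>3} MW site -> ~{modules} modules (incl. one spare)")
# A 100 MW site means coordinating on the order of a hundred modules,
# plus fuel storage and delivery: the strain Fanning describes.
```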
 

Data centers are moving closer to cities and becoming multi-story.

The rural data center is not going away anytime soon. “Most of them are located on the outskirts simply because the land is cheaper, the power is usually cheaper, and honestly, once the first data center provider goes there and improves the infrastructure for network and cabling, you'll start seeing these hub areas where several places will develop,” explains Widawsky.
 
There is some evidence, however, that data centers – especially colocation providers – are moving closer to cities. “You'll see those big campuses out in the nether regions. Those are the primary areas where they do a lot of their work, but they also need smaller infill sites closer to urban areas in order to make sure that they have a streamlined network system for end-to-end minimal latency issues,” he says. 
 
Multi-story projects (historically less common in the U.S. than in Europe and Asia) may become more common. “If you can get farm field real estate prices and 200 acres, it's really affordable to build big single-story data centers,” explains Metcalf. “There's not the land in surrounding suburbs where you can build a monster campus, especially single story. There are still a few sites, but generally speaking, it's not abundant. And so the need for multi-story [data centers] has always existed.”
 

Aesthetics matter.

As data centers move closer to urban/suburban areas, design becomes more important – to be a better neighbor, fight NIMBY resistance, and meet local requirements. “High-income suburbs do not want an industrial shed. They want the tax revenue. They want the high-paying jobs. There are so many benefits to having a data center in your community. But they don't necessarily want to look at what would get built if there were no regulations to make them be better,” says Metcalf.
 
An increasing number of urban/suburban data centers are paying attention to façades, landscaping, and tastefully imbuing the design with local and regional flavor. Colocation data center providers especially are paying attention to design. “They need to have a marketing capability because they're trying to attract customers to their locations,” explains Widawsky. “It's a competitive market.”
 
The same is true for the need to attract and retain talent. “There is such a talent shortage of people who really have experience in this,” says Metcalf. “Building a low-quality workplace is a short-sighted decision. When you're spending hundreds to a thousand dollars per square foot on the portion of the building with the computing and electrical gear and all that, interior workplace is like a rounding error for one of these buildings. People want to be proud of where they work.”
 
 

Build now for the future.

“You generally try to build a building for at least a 20-year lifespan. But with data centers, you're building that building for something that has a general lifespan of 3 to 5 years,” notes Widawsky. Short of being psychic, it is hard to predict what the future of data center design will look like. Architects who do these projects regularly, however, are keen to understand both current practicalities and coming advancements, and how a project built today might accommodate the technology of tomorrow.
 
Fanning shares the influence AI is having: “The design parameters continue to change as people understand how this new technology is deployed, and what these new densities really mean in terms of energy usage, airflow, etc. What we're seeing now is not a neat and clean, 'We need a hyperscale data center or an AI data center,’ type of decision. Instead, it’s ‘We need a hybrid data center that accommodates both.’ It's requiring far more innovation than we've seen before.”
 
One of HDR’s hyperscale clients preemptively planned space for cooling distribution units (CDUs) at the ends of the rows. “Even though the current model for their chips and their servers today weren't requiring water to the chip, they knew that they were going to be implementing that in the next three to five years,” says Widawsky. Planning an additional two to three feet at the end of the server rows added about 10,000 square feet to the footprint, but the facility will be ready to accommodate direct-to-chip or immersion cooling when the tenants decide that's the direction they want to go.
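The arithmetic behind that set-aside is simple to sketch. The row count and row width below are hypothetical values chosen to illustrate the scale; only the 2-to-3-foot allowance comes from the article:

```python
# Sanity check on the CDU set-aside: how a few feet per row end adds up.
# ROWS and ROW_WIDTH_FT are hypothetical assumptions for illustration;
# only the 2-3 ft end-of-row allowance comes from the article.

ROWS = 800            # assumed number of server rows across the facility
ROW_WIDTH_FT = 5.0    # assumed width of one row end (rack plus clearance)
END_DEPTH_FT = 2.5    # extra depth reserved at each row end (midpoint of 2-3 ft)

added_sqft = ROWS * ROW_WIDTH_FT * END_DEPTH_FT
print(f"~{added_sqft:,.0f} sq ft reserved for future CDUs")  # ~10,000
# A small per-row allowance compounds quickly at hyperscale, which is
# why it has to be in the plan when the site is first laid out.
```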
 
The world of data center design is changing quickly, working to incorporate advancements in technology and meet skyrocketing demand while preparing buildings for the unknowns of how technology will advance and demand will grow. Fittingly, the design teams creating data centers today are thinking just as much about tomorrow.
