A project to develop an advanced data center, serving as a high-density computing and mission-critical disaster recovery facility, gave a prominent health care enterprise the opportunity to create a truly unique and highly sustainable structure.
The client’s goal was to design the most efficient 3.75 MW data center possible, leveraging passive cooling and ventilation strategies to the greatest extent (a “chiller-less” design). To push operational efficiency even further, the critical decision was made to operate the facility beyond ASHRAE’s recommended humidity range, and even beyond its allowable range.
The project’s sustainability strategies capitalize on the temperate climate of its Northwest U.S. location. The local climate, coupled with an intelligent and innovative design, allows 100% unconditioned outside air to be used for cooling 97% of the year; during the remaining 3%, an advanced evaporative cooling system achieves the required environmental conditions. Further, server exhaust mixed with outside air is used for heating during the winter.
The success of the facility’s combined “smart” design and passive cooling and heating strategies is validated by annual utility savings to the owner of $500,000 compared to a chiller-based design. Another metric supporting the facility’s energy efficiency is its Power Usage Effectiveness (PUE) rating of roughly 1.13, considerably better than the PUE ratings achieved by comparable data center facilities. (PUE, an industry-standard measure of data center efficiency, is calculated by dividing a facility’s total electrical load by its IT load.)
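The PUE arithmetic above can be sketched in a few lines. The load figures below are illustrative assumptions chosen to reproduce a ratio near 1.13, not measured values from this facility:

```python
def pue(total_facility_load_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility load divided by IT load."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_load_kw / it_load_kw

# Example: a 3,750 kW IT load with ~490 kW of facility overhead
# (fans, lighting, electrical losses) yields a PUE near 1.13.
print(round(pue(4240.0, 3750.0), 2))  # 1.13
```

A PUE of 1.0 would mean every watt entering the facility reaches the IT equipment; typical chiller-based data centers run well above this facility's roughly 1.13.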
A highly analytical design process was applied to bring the intrinsic strengths of geodesic domes to this data center scenario. These practical advantages include a radial layout of servers that optimizes fiber distribution, shortens cable runs, and improves maintenance access; uniform distribution for air intake; and an open interior volume that promotes thermal buoyancy and efficient dissipation of hot air.
The variety of sustainability approaches successfully applied to this project exemplifies the powerful potential of combining an owner commitment to sustainability with the architectural capability to help realize that commitment in a spirit of innovation.
Why the circular/radial plan?
Critical to the client’s vision for the new data center was the radial layout of the server IT Rooms. Informed by the distinct challenges of the client’s existing 500 kW facility, the radial layout provides optimal fiber distribution because cabling runs directly from the IT Hub to each of the IT Rooms. In a conventional linear data center layout, fiber cabling must overlap to reach the furthest row of servers, burying cable in the fiber trays until it becomes completely inaccessible. The radial layout implies a circular plan, which has a few other key advantages over an orthogonal layout, including:
- Uniform distribution for air intake (mitigates shadow effect for intake on the wind’s leeward side).
- Inherent structural dependability and independence of the geodesic dome enclosure.
- Interior open volume of the dome used for thermal buoyancy and dissipation of hot air.
From a floor-plan footprint perspective, the radial layout is arguably less efficient than a traditional orthogonal layout; however, the enlarged cold aisles it produces provide an optimal configuration for the fanwall air intake system. Each wedge-shaped cold aisle allows a larger fanwall assembly to funnel air directly and unobstructed toward the server racks. And, at roughly $66/sq. ft. of surface area, the dome and gravity vent enclosure will prove cost-competitive with comparable orthogonal stick-built structures.
Mechanical System
The data center was designed to specifically take advantage of the moderate weather conditions in the Pacific Northwest, thereby allowing for the removal of a mechanical cooling system. The result is an advanced cooling methodology using air-side economization coupled with evaporative cooling to provide the environmental conditions required at the server intake.
Data Center Environment: The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) Technical Committee 9.9 (TC 9.9) has developed thermal guidelines for data processing environments. These guidelines establish minimum and maximum temperature and humidity parameters within data centers, divided into two ranges: Recommended and Allowable.
Data Center Mechanical System: To take advantage of the location’s favorable climate, an air-side economizer system was selected for the data center. This type of system uses air as the principal medium for both cooling and heating: there is no mechanical cooling, and heating is harvested by recycling the heat rejected from the servers. It should be noted that, with no mechanical cooling system, the air management system has no dehumidification capability. However, a review of historical weather data indicates little chance that the humidity level in the data center will exceed the client’s expanded RH range.
The system operates as follows: ambient air is drawn into the data center by supply fans and is then discharged into the cold aisles where the servers use the air for cooling purposes. The servers discharge the air into hot aisles that are open to the dome plenum, and the plenum becomes slightly pressurized as a result. In addition, some of the supply air will leak into the plenum further pressurizing it. The air is then relieved from the plenum through relief vents in the dome roof or returned to the inlet of the supply fans.
There are three modes of operation for the mechanical system, depending on ambient conditions:
- Recirculation Mode, when conditions are below the minimum recommended temperature. The system uses rejected server heat to temper the incoming air, mixing air from the hot aisles with outside air by modulating the return-air and outside-air dampers to maintain the minimum supply air temperature.
- Free-cooling Mode, when the ambient air temperature is within the recommended range and provides all the cooling the data center requires.
- Evaporative Cooling Mode, when the ambient air temperature exceeds the maximum limit and evaporative cooling is necessary.
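The mode selection described above can be sketched as a simple control function. The temperature set points below are hypothetical placeholders; the actual limits follow the client’s expanded ASHRAE TC 9.9 envelope, not these exact numbers:

```python
from enum import Enum

class Mode(Enum):
    RECIRCULATION = "recirculation"  # below minimum supply temperature
    FREE_COOLING = "free-cooling"    # within the recommended range
    EVAPORATIVE = "evaporative"      # above maximum supply temperature

# Hypothetical set points for illustration only (degrees Celsius).
T_MIN_C = 18.0
T_MAX_C = 27.0

def select_mode(ambient_temp_c: float) -> Mode:
    """Choose the operating mode from the ambient air temperature."""
    if ambient_temp_c < T_MIN_C:
        # Mix hot-aisle return air with outside air to warm the supply.
        return Mode.RECIRCULATION
    if ambient_temp_c > T_MAX_C:
        # Add evaporative cooling to bring supply air back into range.
        return Mode.EVAPORATIVE
    # Outside air alone satisfies the cooling load.
    return Mode.FREE_COOLING

print(select_mode(5.0).value)   # recirculation
print(select_mode(22.0).value)  # free-cooling
print(select_mode(32.0).value)  # evaporative
```

In practice the dampers and evaporative media would modulate continuously rather than switch discretely, but the three-way decision captures the control strategy the design relies on.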
PROJECT INVOLVEMENT. ARCHITECTURAL LEAD: PROGRAMMING, CONCEPTS
BUILDING AREA. 22,360 SF
PROJECT COST. EST. $21,000,000
CLIENT. CONFIDENTIAL
LOCATION. PACIFIC NW
ARCHITECTS/ENGINEERS. CH2MHILL
PROJECT TEAM. JOHN HISER PROJECT MANAGER, BOB KIRKENDALL DATA CENTER SR. TECHNOLOGIST, TYSON GILLARD ARCHITECTURAL LEAD, DALE HEBERLING ELECTRICAL, MIKE DRAGON MECHANICAL, THOM SPEAR STRUCTURAL, KEITH KIBBEE CFD MODELING, ADAM MAY DESIGNER, TONY NEAL VISUALIZATION