
ASHRAE TC 9.9

2011 Thermal Guidelines for Liquid Cooled Data Processing Environments

Whitepaper prepared by ASHRAE Technical Committee (TC) 9.9, Mission Critical Facilities, Technology Spaces, and Electronic Equipment

© 2011, American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc. All rights reserved. This publication may not be reproduced in whole or in part; may not be distributed in paper or digital form; and may not be posted in any form on the Internet without ASHRAE’s expressed written permission. Inquiries for use should be directed to [email protected]


Executive Summary

ASHRAE TC 9.9 created the first edition of the “Thermal Guidelines for Data Processing

Environments” in 2004. Prior to that the environmental parameters necessary to operate data centers

were anecdotal or specific to an IT manufacturer. In 2008 with the second edition of the Thermal

Guidelines, ASHRAE TC 9.9 expanded the environmental range for data centers to enable increased

economizer usage at an increasing number of locations throughout the world.

TC 9.9 has recently published (May, 2011) a further update described in a whitepaper available on TC

9.9’s website. The whitepaper documents expanded data center environmental guidelines by adding two

more envelopes that are wider in temperature and humidity.

However, these guidelines are for air-cooled IT equipment and do not address the water temperatures provided by facilities to support liquid cooled equipment (here, liquid cooled IT equipment refers to equipment cooled by any liquid within the design control of the IT manufacturer, such as water, refrigerant, or a dielectric fluid).

The TC 9.9 committee did publish “Liquid Cooling Guidelines for Datacom Equipment Centers” in 2006, but that book focused mostly on the design options for liquid cooled equipment and did not address the various facility water temperature ranges possible for supporting liquid cooled equipment.

This document describes classes for the temperature ranges of the facility supply of water to liquid

cooled IT equipment. In addition, this document reinforces some of the information provided in the

Liquid Cooling Guidelines book on the interface between the IT equipment and infrastructure in support

of the liquid cooled IT equipment.

Since the classes cover a wide range of facility water temperatures supplied to the IT equipment, a brief

description is provided for the possible infrastructure equipment that could be used between the liquid

cooled IT equipment and the outside environment.

At the time of the first air cooling Thermal Guidelines the most important goal was to create a common

set of environmental guidelines for the IT equipment design. Although computing efficiency was

important, performance and availability took precedence when creating the temperature and humidity

limits. Progressing through the first decade of the 21st century, increased emphasis has been placed on

energy efficiency.

Power usage effectiveness (PUE) has become the new metric for measuring data center efficiency, providing a measurable way to see the effect of data center design and operation on that efficiency.

More recently the use of the waste energy has become an important consideration for some data center

operators.

With these three focus areas of performance, energy efficiency and use of the waste energy, several

ranges of facility supply water temperatures have been recommended to accommodate the business and

technical requirements of the data center operator.


Introduction

The global interest in expanding the temperature and humidity ranges for air cooled IT equipment

continues to increase, driven by the desire for achieving higher data center operating efficiency and

lower total cost of ownership (TCO).

For these same reasons, liquid cooling of IT equipment can provide high performance and high energy efficiency at power densities beyond those practical for air cooled equipment, while also enabling use of waste heat when the facility supply water temperatures are high enough. This document specifies the environmental classes for the temperature of water supplied to IT equipment.

These environmental guidelines / classes are really the domain and expertise of IT OEMs. TC 9.9 has

demonstrated the ability to unify the commercial IT manufacturers and improve overall performance

including energy efficiency for the industry.

By creating these new facility water cooling classes and NOT mandating the use of any one of them, TC 9.9 enables server manufacturers to develop products for the classes that match their customers’ needs and requirements.

Developing these new classes among the commercial IT manufacturers, in consultation with the Energy Efficient High Performance Computing (EE HPC) Working Group (WG), should produce better results; sharing some critical data among these parties has in the past yielded broader environmental specifications than would otherwise have been achieved.


IT Equipment Liquid Cooling

The increasing heat density of modern electronics is stretching the ability of air to adequately cool the electronic components within servers as well as the Datacom facilities that house these servers. To meet this challenge, direct water or refrigerant cooling at the rack level (for this document, a rack is a four-post enclosure with a common footprint of 0.6 x 1.2 m) or at the board level is now being deployed.

The ability of water and refrigerant to carry much larger amounts of heat per volume or mass also offers

tremendous advantages. The heat from these liquid cooling units is in turn rejected to the outdoor

environment by using either air or water to transfer the heat out of the building. Because of the

operating temperatures involved with liquid cooling solutions, water-side economization fits in well.

Liquid cooling can also have advantages in terms of lower noise levels and close control of electronics

temperatures. However, some remain concerned about the risk of leaks when liquid is brought into electronic equipment. This concern is heightened because electronic components are upgraded on a routine basis, which requires the liquid carrying lines to be disconnected and reconnected.

To overcome this concern, IT OEM designers sometimes utilize a non-conductive liquid, such as a

refrigerant or a dielectric fluid in the cooling loop for the IT equipment.

In the past, high performance mainframes were often water-cooled with the internal piping supplied by

the IT OEM. Components are becoming available today that have similar factory installed and leak

tested piping that can accept the water from the mechanical cooling system, which may also employ a

water-side economizer.

Increased standardization of liquid cooled designs for connection methods and locations will also help

expand their use by minimizing piping concerns and allow for interchangeability of diverse liquid

cooled IT products.

The choice to move to liquid cooling could come at different times in the life of the data center. There

are three main times when the decision between air and liquid cooling must be made. These will be

briefly discussed. Water’s thermal properties were described earlier as being superior to those of air. This is certainly the case, but it does not mean that liquid cooling is invariably more efficient than air cooling. Both can be very efficient or very inefficient, and the outcome generally has more to do with the design and application than with the cooling fluid. There are modern air-cooled data centers with air economizers being built that

are far more efficient than many liquid cooled systems. In fact the choice of liquid cooled versus air-

cooled generally has more to do with other factors than efficiency.

A. New Construction

In the case of a new data center the cooling architect must consider a number of factors. First is the

workload in the data center. Second, the space available and location specific issues can affect the

choice.


Finally, the local climate comes into play. If the data center will have an economizer and the climate is

best suited to air-side economizers (mild temperatures and moderate humidity) then an air-cooled data

center could make the most sense. Conversely, if the climate is primarily dry, a water side economizer,

with the cooling fluid conveyed to the racks (or the coolant distribution unit - CDU) could be ideal.

Liquid cooling more readily enables the reuse of waste heat. If a project is adequately planned from the

beginning, reusing the waste energy from the data center could reduce the site or campus energy use. In

this case liquid cooling is the obvious choice as the heat in the liquid can most easily be transferred to

other locations.

Also, the closer the liquid is to the components, the higher the quality of the heat that will be recovered and be available for alternate uses.

B. Expansions

Another common application for liquid cooling would be adding or upgrading equipment in an existing

data center. Existing data centers often do not have large raised floor heights or the raised floor plenum

is full of obstructions such as cabling.

If a new rack of IT equipment is to be installed that is of higher density than the existing raised floor air-

cooling can support, liquid cooling can be the ideal solution. Current typical air cooled rack power

densities can range from 6 kW to 30 kW.

In many cases rack powers of 30 kW are well beyond what legacy air cooling can handle. Liquid cooling to a rack, rear-door heat exchanger, or other localized liquid cooling system can make these higher density racks nearly room neutral by cooling the exhaust air back down to room temperature levels.

C. High Density and HPC

Because of the energy densities found in many high performance computing (HPC) applications, liquid

cooling can be a very appropriate technology. One of the main cost and performance drivers for HPC is

the node-to-node interconnect. Because of this, HPC typically is driven towards higher power density

than a typical enterprise or internet data center.

Racks of 30 kW are typical, with densities extending as high as 80 to 120 kW. Without some implementation of liquid cooling these higher powers would be very difficult, if not impossible, to cool.

The advantages of liquid cooling increase as the load densities increase. More details on the subject of

liquid cooling can be found in “Liquid Cooling Guidelines for Datacom Equipment Centers”, part of the

ASHRAE Datacom Series.

Several implementations of liquid cooling could be deployed, such as the coolant removing a large

percentage of the waste heat via a rear door heat exchanger or a heat exchanger located above the rack.


Another implementation would include a totally enclosed rack that uses air as the working fluid and an

air-to-liquid heat exchanger. Another would be with the coolant passing through cold plates attached to

processor modules within the rack. The CDU can be external to the datacom rack as shown in Figure 1

below or within the datacom rack as shown in Figure 2.

Figures 1 and 2 show the interfaces for a liquid cooled rack with remote heat rejection. The interface is

located at the boundary of the facility water system loop and does not impact the datacom equipment cooling system loops, which will be controlled and managed by the cooling equipment and datacom manufacturers.

However, the definition of the interface at the loop affects both the datacom equipment manufacturers

and the facility operator where the datacom equipment is housed. For that reason all the parameters that

are key to this interface will be described in detail herein.

The Liquid Cooling Guidelines book described the various liquid cooling loops that could exist within a data center and its supporting infrastructure. These liquid loops are shown in Figure 3. As seen from Figure 3, the water guidelines discussed in this document apply at the chilled water system (CHWS) loop. If chillers are not installed, then the guidelines apply to the condenser water system (CWS) loop.

Although not specifically noted, a building level CDU may be more appropriate where there are a large

number of racks connected to liquid cooling. In this case the location of the interface is defined the same as in Figure 1, but the CDU would be a building level unit rather than a modular unit.

Figure 1: Combination air- and liquid-cooled rack or cabinet with external CDU


Figure 2: Combination air- and liquid-cooled rack or cabinet with internal CDU

Figure 3: Liquid Cooling Systems / Loops within a Data Center


Facility Water Supply Characteristics for IT Equipment

The facility water is anticipated to support any liquid cooled IT equipment using water, water plus

additives, refrigerants, or dielectrics. The following sections focus on these applications.

A. 2011 ASHRAE Facility Supply Water Temperature Classes for IT Equipment

A.1 Liquid Cooling Environmental Class Definitions

Compliance with a particular environmental class requires full operation of the equipment within the

class specified, based on non-failure conditions. IT equipment designed for each class requires

different design points for the cooling components (cold plates, thermal interface materials, liquid flow

rates, piping sizes, etc.) utilized within the IT equipment.

For IT designs that meet the higher supply temperatures as referenced by the ASHRAE classes in the

table below, enhanced thermal designs will be required to maintain the liquid cooled components within

the desired temperature limits. Generally, the higher the supply water temperature, the higher the cost of

the cooling solutions.

Class W1/W2: Typically a data center that is traditionally cooled using chillers and a cooling tower

but with an optional water-side economizer to improve energy efficiency, depending on the

location of the data center. See Figure 3a below.

Class W3: For most locations these data centers may be operated without chillers. Some locations

will still require chillers. See Figure 3a below.

Class W4: To take advantage of energy efficiency and reduce capital expense, these data centers are

operated without chillers. See Figure 3b below.

Class W5: To take advantage of energy efficiency, reduce capital expense with chiller-less operation

and also make use of the waste energy, the water temperature is high enough to make use of the

water exiting the IT equipment for heating local buildings. See Figure 3c below.

Table 1: 2011 ASHRAE Liquid Cooled Guidelines (I-P version in Appendix A)


The Facility Supply Water Temperatures specified in the above table are requirements to be met by the

IT equipment for the specific class of hardware manufactured. For the data center operator, the use of

the full range of temperatures within the class may not be required or even desirable given the specific

data center infrastructure design.

There is currently no widespread availability of IT equipment in classes W3-W5. Future product availability in these classes will be based upon market demand. It is anticipated that future

designs in these classes may involve trade-offs between IT cost and performance. At the same time

these classes would allow lower cost data center infrastructure in some locations. The choice of IT

liquid cooling class should involve a TCO evaluation of the combined infrastructure and IT capital and

operational costs.

Figure 3a,b,c: Class W1 / W2 / W3, Class W4, Class W5

B. Condensation Considerations

Liquid cooling classes W1, W2, and W3 allow the water supplied to the IT equipment to be as low as

2°C (35°F) which is below the ASHRAE allowable room dew point guideline of 17°C (63°F) for Class

1 Enterprise Datacom Centers (Thermal Guidelines for Data Processing Environments, 2nd Edition,

ASHRAE, 2008).

Electronic equipment manufacturers are aware of this and are taking it into account in their designs. Correspondingly, data center relative humidity and dew point should be managed according to the ASHRAE 2011 Thermal Guidelines for Data Processing Environments.


If low fluid operating temperatures are expected, careful consideration must be given to condensation. It is suggested that a CDU (as shown in Figures 1 and 2) with a heat exchanger be employed to raise the coolant temperature to at least 18°C (64.4°F) to eliminate condensation issues, or that the water supply temperature be adjustable and set 2°C (3.6°F) or more above the dew point of the data center space.
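As a rough illustration of the adjustable-setpoint approach, the sketch below estimates the room dew point with the Magnus approximation and checks whether a proposed supply water temperature keeps the suggested 2°C margin. The function names and the Magnus constants are illustrative assumptions, not part of the ASHRAE guidance.

```python
import math

def dew_point_c(dry_bulb_c: float, rh_percent: float) -> float:
    """Approximate dew point (deg C) from dry-bulb temperature and relative
    humidity using the Magnus formula (constants are a common approximation)."""
    a, b = 17.62, 243.12
    gamma = math.log(rh_percent / 100.0) + a * dry_bulb_c / (b + dry_bulb_c)
    return b * gamma / (a - gamma)

def condensation_risk(supply_water_c: float, room_temp_c: float,
                      room_rh_percent: float, margin_c: float = 2.0) -> bool:
    """Return True if the supply water temperature is below the room dew point
    plus the suggested margin, i.e., condensation is a risk."""
    dp = dew_point_c(room_temp_c, room_rh_percent)
    return supply_water_c < dp + margin_c

# Example: a 24 degC room at 50% RH has a dew point near 13 degC, so a 10 degC
# supply would risk condensation while an 18 degC supply would not.
print(condensation_risk(10.0, 24.0, 50.0))   # True  -> insulate or raise setpoint
print(condensation_risk(18.0, 24.0, 50.0))   # False
```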

C. Operational Characteristics

For classes W1 and W2 the Datacom equipment should accommodate chilled water supply temperatures that may be set by a campus wide operational requirement. The supply temperature may also represent the optimum balance between the lower operational cost of higher temperature chilled water systems and the lower capital cost of low temperature chilled water systems.

Consideration of condensation prevention is a must. In the chilled water loop, insulation will typically

be required. In connecting loops, condensation control is typically provided by an operational

temperature above the dew point.

The chilled water supply temperature measured at the inlet of the Datacom equipment or the CDU should not change at a rate exceeding 3°C (5.4°F) per 5-minute cycle. Meeting this may require that the cooling infrastructure be powered by a UPS electrical system.

The water pressure supplied by the facility water loop at the interface to the liquid cooled IT equipment should not exceed 100 psig (690 kPa).
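The supply-side limits quoted above (a rate of change of no more than 3°C per 5-minute cycle and a supply pressure of no more than 100 psig / 690 kPa) lend themselves to simple monitoring checks. The sketch below is a minimal, hypothetical example of such checks; the thresholds come from this section, while the function names and data layout are assumptions.

```python
def supply_temp_rate_ok(temps_c, interval_min=5.0, max_delta_c=3.0):
    """Check that consecutive supply-temperature samples, taken interval_min
    minutes apart, never change by more than max_delta_c per 5-minute cycle."""
    limit = max_delta_c * (interval_min / 5.0)
    return all(abs(b - a) <= limit for a, b in zip(temps_c, temps_c[1:]))

def supply_pressure_ok(pressure_kpa, max_kpa=690.0):
    """Check the facility water pressure at the IT interface against 690 kPa (100 psig)."""
    return pressure_kpa <= max_kpa

# Example: a 4 degC jump between two 5-minute samples violates the guideline.
print(supply_temp_rate_ok([15.0, 16.5, 20.5]))  # False
print(supply_pressure_ok(520.0))                # True
```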

The chilled water flow-rate requirements and pressure-drop values of the Datacom equipment vary

depending on the chilled water supply temperature and percentage of treatment (antifreeze, corrosion

inhibitors, and so on) in the water.

Manufacturers will typically provide configuration specific flow rate and pressure differential

requirements that are based on a given chilled water supply temperature and rack heat dissipation to the

water.

For classes W3, W4 and W5, the infrastructure will probably be specific to the data center and therefore

the water temperature supplied to the water cooled IT equipment will depend on the climate zone and

will vary throughout the year.

In these classes the data center may be required to run without a chiller installed, so it is critical to understand the limits of the water cooled IT equipment and its integration with the infrastructure designed to support it. This is important so that extremes in temperature and humidity still allow uninterrupted operation of the data center and the liquid cooled IT equipment.

The temperature of the water for classes W3 and W4 will depend on the cooling tower design, the heat

exchanger between the cooling tower and the secondary water loop, the design of the secondary water


loop to the IT equipment and the local climate. To accommodate a large geographic region the range of

water temperatures was chosen from 2°C to 45°C (35°F to 113°F).

For class W5, the infrastructure will be such that the waste heat from the warm water can be re-directed

to nearby buildings. Accommodating water temperatures nearer the upper end of the temperature range will be most important to those applications where retrieving a large amount of waste energy is critical. The water supply temperatures for this class are specified only as greater than 45°C (113°F), since the appropriate temperature depends on many parameters such as the climate zone, building heating requirements, distance between the data center and adjacent buildings, etc. Of course, the components within the IT equipment still need to be kept within their temperature limits while using this hotter water as the heat sink.

In many cases the hotter water heat sink temperature will be a challenge to the IT equipment thermal designer. Even at the much lower temperatures of the W3 and W4 classes, there may be opportunities for heat recovery for building use, depending upon the configuration and design specifications of the systems to which the waste heat would be supplied.

D. Water Flow Rates / Pressure

Water flow rates are shown in Figure 4 for given heat loads and given temperature differences. Temperature differences typically fall between 5°C and 10°C (9°F to 18°F). The minimum facility pressure differential (drop) should not be lower than 0.4 bar (40 kPa).
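The relationship behind Figure 4 is the sensible heat equation, Q = m_dot x cp x delta T. The sketch below computes the required water flow for a given rack heat load and loop temperature rise; the fluid properties used are nominal values for water and the function name is an illustrative assumption, so the numbers would need adjustment for glycol or other treatment.

```python
def water_flow_lpm(heat_load_kw: float, delta_t_c: float,
                   cp_kj_per_kg_k: float = 4.18,
                   density_kg_per_l: float = 0.998) -> float:
    """Required water flow (litres per minute) to remove heat_load_kw with a
    loop temperature rise of delta_t_c, from Q = m_dot * cp * deltaT."""
    mass_flow_kg_s = heat_load_kw / (cp_kj_per_kg_k * delta_t_c)
    return mass_flow_kg_s / density_kg_per_l * 60.0

# Example: a 30 kW rack with a 6 degC rise needs roughly 72 L/min of water.
print(round(water_flow_lpm(30.0, 6.0), 1))
```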


Figure 4: Typical Water Flow Rates for Constant Heat Load

E. Velocity Limits

The velocity of the water in the piping supplied to the IT equipment must be controlled to ensure that

mechanical integrity is maintained over the life of the system. Velocities that are too high can lead to

erosion, sound / vibration, water hammer and air entrainment.

Particulate-free water will cause less velocity-related damage to the tubes and associated hardware. Table 2 provides guidance on maximum water velocities in pipes for systems that operate over 8,000 hours per year. Water velocities in flexible tubing should be maintained below 1.5 m/s (5 ft/s).

Table 2: Maximum Velocity Requirements

Pipe Size                          Maximum Velocity (ft/s)    Maximum Velocity (m/s)
> 3 inches (75 mm)                 7                          2.1
1.5 to 3 inches (38 to 75 mm)      6                          1.8
< 1 inch (25 mm)                   5                          1.5
All flexible tubing                5                          1.5
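Given a flow rate and a pipe inner diameter, the corresponding mean velocity can be checked against the Table 2 limits. The helper below is a minimal sketch of that arithmetic; the limit values mirror Table 2, while the dictionary keys and function names are illustrative assumptions.

```python
import math

# Table 2 limits, keyed by pipe size category (m/s).
MAX_VELOCITY_M_S = {
    "over_3in": 2.1,
    "1p5_to_3in": 1.8,
    "under_1in": 1.5,
    "flexible": 1.5,
}

def velocity_m_s(flow_lpm: float, inner_diameter_mm: float) -> float:
    """Mean water velocity for a given flow (L/min) and pipe inner diameter (mm)."""
    area_m2 = math.pi * (inner_diameter_mm / 1000.0) ** 2 / 4.0
    flow_m3_s = flow_lpm / 1000.0 / 60.0
    return flow_m3_s / area_m2

# Example: 72 L/min in a 25 mm bore pipe is about 2.4 m/s, above the
# 1.5 m/s limit for small pipes, so a larger pipe should be selected.
v = velocity_m_s(72.0, 25.0)
print(round(v, 2), v <= MAX_VELOCITY_M_S["under_1in"])
```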


F. Water Quality

Table 3 identifies the water quality requirements that are necessary to operate the liquid cooled system (chilled water system loop – see Figure 3). The reader is encouraged to reference Chapter 49 of the 2011 ASHRAE HVAC Applications Handbook. This chapter, titled “Water Treatment,” provides a more in-depth discussion of the mechanisms and chemistries involved.

Table 3: Water Quality Specifications Supplied to IT Equipment

Parameter                        Recommended Limits
pH                               7 to 9
Corrosion Inhibitor(s)           Required
Sulfides                         < 10 ppm
Sulfate                          < 100 ppm
Chloride                         < 50 ppm
Bacteria                         < 1000 CFUs/ml
Total Hardness (as CaCO3)        < 200 ppm
Residue After Evaporation        < 500 ppm
Turbidity                        < 20 NTU (Nephelometric)
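Table 3 is easy to encode as a set of recommended limits so that periodic water analyses can be screened automatically. The dictionary and checker below are a minimal sketch of that idea; the limits are those of Table 3, while the field names and function name are assumptions.

```python
# Recommended upper limits from Table 3.
WATER_QUALITY_LIMITS = {
    "sulfides_ppm": 10,
    "sulfate_ppm": 100,
    "chloride_ppm": 50,
    "bacteria_cfu_per_ml": 1000,
    "total_hardness_ppm_caco3": 200,
    "residue_after_evaporation_ppm": 500,
    "turbidity_ntu": 20,
}

def check_water_sample(sample: dict) -> list:
    """Return a list of out-of-spec findings for one water analysis."""
    findings = []
    if not (7.0 <= sample.get("ph", 0.0) <= 9.0):
        findings.append("pH outside 7 to 9")
    if not sample.get("corrosion_inhibitor_present", False):
        findings.append("corrosion inhibitor missing")
    for key, limit in WATER_QUALITY_LIMITS.items():
        if sample.get(key, 0) >= limit:
            findings.append(f"{key} at or above {limit}")
    return findings

# Example: a slightly hard, untreated sample fails two checks.
sample = {"ph": 8.1, "corrosion_inhibitor_present": False,
          "total_hardness_ppm_caco3": 230}
print(check_water_sample(sample))
```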

The most common problems in cooling systems are the result of one or more of the following causes:

F.1 Corrosion

There are various forms of corrosion: uniform corrosion, galvanic corrosion, crevice corrosion, pitting

corrosion, environmentally induced cracking, hydrogen damage, inter-granular corrosion, de-alloying

and erosion corrosion. Uniform corrosion removes more metal than other forms of corrosion, but pitting

corrosion is more insidious and difficult to predict and control (Ref: Denny A. Jones, “Principles and

Prevention of Corrosion”, 2nd Edition, Prentice Hall, 1996).

In typical cooling systems with wetted materials such as copper and aluminum alloys, steels and

stainless steels, aluminum is clearly the most prone to pitting corrosion and steel is the most prone to

uniform corrosion. In cooling systems without adequate water chemistry control, steel will uniformly

corrode and copper and aluminum will also pit.

Steel requires treated water to prevent uniform corrosion. A small fraction of the copper water-carrying

tubing will fail in untreated water due to pitting with a mean time to failure of about 2 years (Reference:

P. Singh, private communication). Aluminum is not recommended in cooling systems unless the water

chemistry, including aluminum-specific corrosion inhibitors, is under very stringent control.

Stainless steels will generally not pit or uniformly corrode in reasonably controlled waters free of sulfur-

reducing bacteria. Stainless steels do require some dissolved oxygen in the water for their surface

passivation.


The selection of the cooling loop materials and manufacturing processes is important, as illustrated here by some examples: tube and pipe surfaces, especially copper tube surfaces, should be free of contamination such as carbon films (a residue from the tube drawing operations) to reduce the incidence of pitting corrosion. Stainless steel hardware must not be sensitized and must be properly passivated.

Sensitized stainless steel hardware may suffer inter-granular corrosion. Unpassivated stainless steel

suffers superficial corrosion that may contaminate the water. Aluminum is not recommended as a wetted

material in the cooling loop, but if its use is found necessary, a more corrosion resistant version may be selected, including Al-clad alloys, and an aluminum-specific corrosion inhibitor must be added.

It is recommended that the corrosivity of the cooling water towards the alloys in the system be checked

periodically. While uniform corrosion can be readily measured, pitting corrosion testing requires a more

sophisticated electrochemical approach that few laboratories are equipped to conduct. (Ref: P. Singh, et.

al., “Potentiodynamic Polarization Measurements for Predicting Pitting of Copper in Cooling Waters”

Paper 212, Corrosion 92, The NACE Annual Conference and Corrosion Show, Nashville, 1992).

pH is an important water chemistry variable. Pourbaix diagrams for metals indicate that metals corrode the least around the neutral pH range, some a little above pH = 7 and some a little below. Corrosion is also driven by high levels of chlorides, sulfides, and sulfates in the water, but one cannot make

reliable predictions of corrosion rates from the water chemistry, except under very extreme water-

chemistry conditions.

F.2 Fouling: Insoluble Particulate Matter in Water

Insoluble particulate matter settles at low flow velocities or adheres to hot or slime-covered surfaces and

results in heat-insulating deposits and higher pressure drops in the loop. Deposits can consist of silt,

iron rust, naturally occurring organic matter, particulate matter scrubbed from the air, deposition of

chemical additives due to poor control, etc. Fouling is related to the amount of particulate matter or total

suspended solids in the fluid.

A full loop filtration system is not typically needed if the make-up water is of good quality. A side

stream filtration system may provide adequate solids removal at a smaller capital cost. The operational

aspect of filter monitoring and change out frequency must be considered and a specific maintenance

program established.

F.3 Scale: Precipitation of Salts Directly on Metal Surfaces

Scale is a dense layer of adherent salt precipitated on surfaces as a result of the concentrations of the salts

exceeding their solubility limits. Higher temperatures promote scale formation by lowering the salts’

solubility limits. Scale typically consists of calcium carbonate and magnesium carbonate.

Hard waters, high in dissolved calcium and magnesium cations, are prone to scale formation on hotter

surfaces when the water pH is high. Soft waters, low in these dissolved ions, are less prone to scale


formation. Hard waters are generally less corrosive because the scale formed on metal surfaces retards

the diffusion of oxygen to the cathodic areas.

In cooling systems, closed to air, from which water is not allowed to evaporate, scale formation is

generally not an issue. Evaporation and subsequent concentrating of the chemistry can occur in vented

expansion tanks, as well as through fittings and elastomers (gaskets, etc.) in the system.

If carbon dioxide from the air is allowed to dissolve in the water, the reduced propensity to scale

formation will leave the metal surfaces less protected from the cathodic half-cell reaction, thus,

increasing the metal corrosion rate. In cooling loops closed to air, corrosion inhibitors must be added

and their concentration routinely maintained over the life of the system.

F.4 Microbiologically Induced Corrosion (MIC): Corrosion due to Bacteria, Fungi and Algae

Carbon steels, stainless steels, and alloys of copper and aluminum may suffer microbiologically induced

corrosion, especially in stagnant waters with pH from 4 to 9 in the temperature range 10°C to 50°C

(50°F to 122°F). Even though there is no recorded incident of MIC in computer closed-loop cooling waters,

precautions must be taken to avoid bacteria in the water.

Slime and deposit formations are a characteristic of MIC. Slime consists of accumulated micro-

organisms and their secretions. Once MIC has begun, biocide treatment may not be effective because

organisms sheltered beneath the deposits may be out of reach of the injected biocide.

It is best to assemble the cooling loop hardware with minimal bacteria contamination and to treat the

water with a suitable biocide the first time the system is filled with water, followed by biocide injection

well before the bacteria content gets to 1000 CFU/ml.

Bacteria can greatly increase the risk of pitting. Pitting can occur at weld joints and high stress locations.

Aluminum corrosion can be accelerated by microorganisms in neutral pH water.

Copper, a known toxin to bacteria, can be attacked by some types of bacteria having a high tolerance for

cupric ions.

Slime formations induced by aerobic bacteria on stainless steels can be initiation sites for pitting

corrosion. MIC on stainless steels often occurs at weldments, directly on the weld metal, or in the heat

affected zones on either side of the weld.

F.5 Other Considerations

Suspended solids and turbidity can be an indication that corrosion products and other contaminants are collecting in the system. Excessive amounts may indicate corrosion, removal of old corrosion products by a chemical treatment program, or the contamination of the loop by another water source. Suspended

solids at high velocity can abrade equipment.

Settled suspended matter of all types can contribute to pitting corrosion (deposit attack). Similarly there

may be ions present that may also cause these same issues. Some examples are:


- The presence of copper ions may be an indication of increased copper corrosion and the need for a higher level of copper corrosion inhibitor.

- Excessive iron rust is an indication that corrosion has increased, existing corrosion products have been released by chemical treatment, piping has been added to the secondary loop, or the iron content has increased in the replacement water.

- Where water softening equipment is deployed, a total hardness of 10 ppm or greater indicates that the hardness is bypassing the softener, that the softener regeneration is improper, or that contamination from another system such as a cooling tower or city water is present.

- The presence of sulfates is often an indication of a process or water tower leak into the IT liquid loop (TCS loop – see Figure 3). High sulfates contribute to increased corrosion because of their high conductivity.


Connecting to Water Cooled IT Equipment

A. Wetted Materials

The materials in the cooling loop (CHWS loop or CHS – see Figure 3) hardware that come in contact

with water should be restricted to the following list:

Copper Alloys: 122, 220, 230, 314, 360, 377, 521, 706, 836, 952

Stainless Steel

o Low-carbon 300 series stainless steels are preferred.

o Heat-treated 400 series stainless steels may be used for high mechanical stress applications.

o Carbon steels are not recommended unless steel-specific corrosion inhibitor is added and its

concentrations maintained.

Polymer/Elastomer (verify materials meet local flammability and code requirements)

o Acrylonitrile Butadiene Rubber (NBR)

o Ethylene Propylene Diene Monomer (EPDM)

o Polyester Sealant (anaerobic)

o Polytetrafluoroethylene (PTFE)

o Polypropylene

o Polyethylene

Solder/Braze

o Soldering is not recommended because solder joint reliability is poor due to the relatively high

porosity in solder joints. Brazing is the recommended method for joining water carrying copper

hardware. Neither brazing nor soldering should be used for joining steels or stainless steels.

Solder Alloys: Lead-free alloys containing copper, silver and tin.

Solder Flux: A flux suitable for the lead-free solder alloy should be used. The post-soldering step

must include thorough cleaning to remove all flux residue.

Braze Filler Metal: BCuP or BAg

Braze Flux: AWS Type 3A.

o The post-brazing step must include thorough cleaning to remove all flux residue.

B. Connections

Datacom equipment racks can be connected to the water systems by either a hard pipe fitting or a quick

disconnect attached to OEM flexible hoses. The quick disconnect method is very popular among

datacom equipment manufacturers. Each method has its own advantages and disadvantages which will

be discussed further.


Using the hard pipe method, connections between the facility chilled water system and datacom

equipment rack can be flanged, screwed, threaded or soldered depending on the pipe materials used on

both sides of the interface.

However, the fitting and pipe materials must be compatible. The type of pipe material and hard pipe

connections will vary by the end user or customer. The end user will have to clarify their requirements

or standards to the OEM and design engineer for a successful project. Hard pipe connections will

require vibration isolation.

Most datacom equipment manufacturers requiring connection to the water supply or CDU generally

provide flexible hoses containing a poppet-style fluid coupling for facilities connection. A poppet-style

quick disconnect is an “Industrial Interchange” fluid coupler conforming to ISO 7241-1 Series B

standards. Brass or stainless steel couplings are most commonly used today and must be compatible

with the connecting pipe material.

If rack loads and flows are excessive, it is recommended that a duplicate set of supply and return lines or

hoses be deployed to enhance the fluid delivery capacity of the rack. The IT OEM design engineer will

have to determine if this is necessary during the design.

One of the main disadvantages of the quick disconnect is that it has a very large pressure drop, or loss,

associated with it. This pressure loss must be accounted for in all pipe sizing and pump selection

procedures. The facilities infrastructure designer must consult with the coupling OEM for exact

pressure losses for a specific project or system.

An alternate form of quick connect without a poppet and much lower pressure drop for the same flow is

provided by several manufacturers. These non-poppet style quick connects generally have ball valves

integrated within the unit to shut off the water before disconnecting.

Finally, the interface must be properly insulated to prevent condensation. Insulation material should be

the same before and after the interface.

The supply and return piping before and after the interface should be properly labeled. This is critical to

prevent human error by accidentally switching the two lines. When using quick disconnects, it is

suggested to mix the sockets and plugs between supply and return lines to key them against cross-

connection at rack installation.


Infrastructure Heat Rejection Devices

In most climates, datacom cooling requirements can be satisfied during part of the year using cooler

outdoor ambient temperatures to either supplement or replace the use of refrigeration equipment. This

process is known as an “economizer” or “free cooling” cycle.

Water-side economizers utilize an atmospheric heat rejection device, usually a dry cooler or cooling

tower, in order to provide a source of cool fluid for a separate economizer coil inside the CRAC unit.

The economizer cycle supplements the refrigeration system during integrated operation. This is

accomplished by pre-cooling the air with the economizer coil, which allows the capacity of the

refrigeration system to be reduced by shutting off compressors or using variable speed compressors.

For liquid cooled IT equipment a water side economizer is appropriate. It is a system by which the

supply air of a cooling system is cooled indirectly with water that is itself cooled by heat or mass

transfer to the environment without the use of mechanical cooling. Two water side economizer designs

can be applied:

Direct exchange – where condenser water can mix directly with chilled water

Indirect exchange – where a heat exchanger is used to separate condenser water and chilled water

loops

The condenser cooling system can be either open or closed loop. A variety of heat rejection devices are

available for this duty, including open-circuit cooling towers, closed-circuit cooling towers, hybrid wet /

dry cooling towers, dry coolers, and combinations of the above.

The evaporative systems can provide lower system energy usage along with lower design condenser

temperatures than dry systems, but they also require the use of water and an associated water treatment program, and the issue of freeze protection must be dealt with in cold climates.

Evaporative water-cooled systems can also operate at cooler temperatures, reducing chiller maintenance

costs and extending the life of the mechanical equipment (note that the expected lifetime for an air-

cooled chiller is 12 to 15 years while water-cooled alternatives offer 23 to 27 year lifetimes, per the “ASHRAE Equipment Lifetimes” section in the ASHRAE Handbook).

Cooling towers are often used for larger Datacom facilities due to the energy savings and the lower

condenser temperatures that are achievable. Open-circuit cooling towers are typically used on water-

cooled chiller systems due to their relatively low first cost and low fan energy.

Cooling towers cool water to within a few degrees of the entering air’s wet-bulb temperature, which can

be considerably lower than the entering air dry-bulb. As a result, the volume of air required to be moved

through an open-circuit cooling tower system for the same heat rejection is considerably reduced,

resulting in lower fan horsepower requirements as well as decreased sound levels. Cooling towers can

also offer space and weight advantages over air-cooled alternatives.


A closed circuit can also be used for the condenser loop. This has the advantage of maintaining

condenser performance over time by significantly reducing the fouling potential as compared to an open

loop tower. To close the loop, either a closed-circuit cooling tower or a combination of an open-circuit

cooling tower and a heat exchanger may be used.

A closed-circuit cooling tower utilizes a wetted coil inside the tower to isolate the closed loop from the

open loop spray water. Because the closed loop design introduces another stage of heat transfer, closed-

circuit cooling towers are typically larger and require more fan horsepower compared to an open-circuit

cooling tower for the same thermal duty.

These disadvantages must be offset by the advantages of the closed loop design, such as the significant

reductions in both condenser fouling and condenser bundle maintenance. Rather than compare closed-

circuit cooling towers to open-circuit cooling towers, the designer should compare them with a heat

exchanger / open-circuit cooling tower combination, which serves the same purpose of isolating the

condenser fluid from outside contamination.

Another advantage of closed-circuit cooling towers is that they can also operate in a dry mode and reject

heat sensibly in cooler months. This characteristic can be used to avoid wet operation in the coldest

months of the year, to conserve water, or as an emergency mode of operation in the event of a loss of

water to the facility. These benefits can be offset by the higher fan energy usage resulting from

operating in the dry rather than the wet mode.

Dry coolers can also be used either alone or in conjunction with evaporative heat rejection equipment on

closed loop condenser systems in order to reduce water consumption and provide redundancy. Dry

coolers typically consist of a finned tubular heat exchanger with fans to move the air across the heat

transfer surface. Hybrid closed-circuit cooling towers, which combine evaporative and sensible heat rejection surfaces in one unit, are also an option.

All the heat exchange processes introduce some inefficiencies due to the required temperature difference

across the heat exchanger whether a cooling tower, plate-and-frame heat exchanger, or dry cooler. This

temperature difference is referred to as the “approach temperature”.

For example, the approach is the temperature of the cooling tower return water entering the heat exchanger minus the temperature of the water in the chilled water loop leaving the heat exchanger. In general, the closer (smaller) the specified approach is, the larger the heat exchanger. Typical design approach temperatures for various heat exchangers can be found in the table below.

Table 4: Approach Temperatures
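The approach-temperature bookkeeping described above is simple arithmetic: the facility supply temperature that an economizer can deliver is the outdoor sink temperature (wet bulb for evaporative equipment, dry bulb for dry coolers) plus the sum of the approach temperatures of each heat exchange step in series. The sketch below uses assumed, illustrative approach values; actual values come from equipment selections such as those summarized in Table 4.

```python
def economizer_supply_temp_c(outdoor_sink_c: float, approaches_c: list) -> float:
    """Facility water supply temperature achievable without mechanical cooling:
    the outdoor sink temperature (wet bulb for cooling towers, dry bulb for dry
    coolers) plus the approach of every heat exchange step in series."""
    return outdoor_sink_c + sum(approaches_c)

# Example with assumed approaches: a cooling tower 4 C above wet bulb, plus a
# plate-and-frame heat exchanger at 2 C, gives the chilled-loop supply temperature.
wet_bulb_c = 26.5
print(economizer_supply_temp_c(wet_bulb_c, [4.0, 2.0]))  # 32.5 C
```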


The approach temperature that is available from a particular piece of heat rejection equipment is largely

determined by heat exchange surface area and the mass of air that is moved across this surface area.

Ultimately, the actual approach temperature utilized in the design will be based on computer simulation

to balance capital expenditure (heat exchange surface area) and fan energy (mass of air used for cooling)

while still realizing an attractive Return on Investment (ROI).

Even though it has less of an impact on the ROI, the fluid transport systems (piping and pumping)

should also be included in this simulation since overall system flow is related to the cooling delta T and

the other component of the pump selection (pressure) is determined by the required differential pressure.

The pressure drop through piping, heat exchangers and the heat rejection equipment, along with the

“lift” required for “open” systems, all have an effect on the energy used by this system. The trade-off between pumping efficiency and flow / pressure stability should also be carefully evaluated at this phase of the project.

The approach temperatures seen at other heat exchangers in the system should also be considered when

determining the approach temperature required for the heat rejection equipment. A more effective

selection for the heat rejection equipment may be obtained if the approach temperatures in the remainder

of the system are carefully optimized.

Even if effective selections are made for the heat rejection and exchange equipment during the initial

phases of the project, degradation of the design approach temperatures for this equipment must be

considered. This is especially important for sensible heat rejection equipment and heat exchangers that

use water from an “open” system for cooling.

Finned heat exchange surfaces are prone to atmospheric fouling, especially in urban and industrial areas.

Finned surfaces are commonly found in Closed-circuit Cooling Towers, Finned Dry Coolers, Adiabatic

Finned Dry Coolers and Direct Evaporative Coolers.

This fouling may also be the result of “seasonal” conditions (cottonwood seeds or pollen clogging the

finned surfaces of sensible heat rejection equipment), or “long term” corrosion on surfaces in contact

with water or other fluids.

The amount of glycol that is added to a system for freeze protection also has a profound negative effect

on the ability of a fluid to exchange energy with the heat source and the heat rejection equipment.


References and Bibliography

ASHRAE book, Design Considerations for Datacom Equipment Centers, 2005.

ASHRAE book, Datacom Equipment Power Trends and Cooling Applications, 2005.

ASHRAE book, Liquid Cooling Design Considerations for Data and Communication Equipment

Centers, 2007.

ASHRAE book, Thermal Guidelines for Data Processing Environments, Second Edition, 2009.

ASHRAE TC9.9 White Paper, ASHRAE 2011 Thermal Guidelines for Data Processing

Environments, 2011, www.tc99.ashraetcs.org.


APPENDIX A. 2011 ASHRAE Facility Water Temperature Classes for IT

Equipment (I-P version)

A. Liquid Cooling Environmental Class Definitions

Compliance with a particular environmental class requires full operation of the equipment within the

class specified, based on non-failure conditions. IT equipment designed for each class results in a

different design point for the cooling components (cold plates, thermal interface materials, liquid flow

rates, piping sizes, etc.) utilized within the IT equipment.

For IT designs that meet the higher supply temperatures as referenced by the ASHRAE classes in the

table below, enhanced thermal designs will be required to maintain the liquid cooled components within

the desired temperature limits. Generally, the higher the supply water temperature, the higher the cost of

the cooling solutions.

Class W1/W2: Typically a data center that is traditionally cooled using chillers and a cooling tower

but with an optional water-side economizer to improve energy efficiency, depending on the

location of the data center. See Figure A-a below.

Class W3: For most locations these data centers may be operated without chillers. Some locations

will still require chillers. See Figure A-a below.

Class W4: To take advantage of energy efficiency and reduce capital expense, these data centers are

operated without chillers. See Figure A-b below.

Class W5: To take advantage of energy efficiency, reduce capital expense with chiller-less operation

and also make use of the waste energy, the water temperature is high enough to make use of the

water exiting the IT equipment for heating local buildings. See Figure A-c below.

Table A-1: 2011 ASHRAE Liquid Cooled Guidelines (SI Version in Main Body)

The Facility Supply Water Temperatures specified in the above table are requirements to be met by the

IT equipment for the specific class of hardware manufactured. For the data center operator, the use of


the full range of temperatures within the class may not be required or even desirable given the specific

data center infrastructure design.

There is currently no widespread availability of IT equipment in classes W3-W5. Future product availability in these classes will be based upon market demand. It is anticipated that future

designs in these classes may involve trade-offs between IT cost and performance. At the same time

these classes would allow lower cost data center infrastructure in some locations. The choice of IT

liquid cooling class should involve a TCO evaluation of the combined infrastructure and IT capital and

operational costs.

Figure A-a,b,c: Class W1 / W2 / W3, Class W4, Class W5


APPENDIX B. Energy Efficient High Performance Computing Working Group

The Energy Efficient High Performance Computing (EE HPC) Working Group (WG) is an ad-hoc

group, with logistics and organizational support from the DOE. The WG is open to all HPC

practitioners from the US National Labs and others involved or interested in HPC.

The WG efforts were focused on establishing the ideal water supply temperatures for the National Labs. The activity began with a survey of National Lab locations and their respective dry bulb (DB) and wet bulb (WB) temperature extremes.

The established goal was to define temperatures at which a data center at a lab location could operate for 99.6% of the annual hours. The 99.6% figure was chosen as a conservative design basis; all data came from the ASHRAE Weather Data Viewer, Version 4.0, 1999.

For water-side economizers, the WB temperature and assumed approach temperatures for typical

equipment were used. For dry-coolers, the DB temperature and assumed approach temperatures were

chosen. See Figure B-1 for the lab environmental conditions. Houston, Texas was included in the analysis; there is no National Lab there, but it was included because of the challenge of free cooling in hot and humid environments like Houston.


Figure B-1: National Lab Environmental Conditions

Figure B-2 shows the stack of approach temperatures, figuratively depicting the temperature rise from

ambient (cooling tower or dry cooler) through to the liquid provided to the IT equipment. The stack goes

from a Tcase value for a typical high volume CPU to the warmest ambient temperature targets as defined

in Figure B-1. The stacks show the temperature rise in each step of the respective thermal management

systems.

The intent is to demonstrate the cumulative temperature rise from the external heat sink to the case temperature of the CPU. The left stack is for a dry cooler, from the 99.5°F value found in the lower

chart in Figure B-1 through a Cooling Distribution Unit (CDU) and any system preheat (e.g. series

cooling inside the IT) and finally the temperature rise through to the case temperature. The right stack is

the same except that it uses a cooling tower with the 79.7°F wet bulb temperature from Figure B-1.


Figure B-2: Temperature Stack, Outdoor Environment to CPU
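Because the stack in Figure B-2 is additive, it can be reproduced by summing the temperature rise of each step from the outdoor design condition up to the CPU case. The sketch below does this with assumed per-step rises; only the 99.5°F dry bulb and 79.7°F wet bulb design points are taken from Figure B-1, and all other values, along with the function names, are illustrative assumptions.

```python
def f_to_c(temp_f: float) -> float:
    """Convert Fahrenheit to Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0

def case_temp_c(outdoor_design_c: float, step_rises_c: list) -> float:
    """Cumulative temperature from the outdoor design condition through each
    thermal-management step (heat rejection approach, CDU approach, preheat,
    cold plate rise) up to the CPU case temperature."""
    return outdoor_design_c + sum(step_rises_c)

# Assumed per-step rises (deg C), for illustration only.
dry_cooler_stack    = [6.0, 3.0, 4.0, 12.0]  # dry cooler, CDU, preheat, cold plate
cooling_tower_stack = [4.0, 3.0, 4.0, 12.0]  # tower, CDU, preheat, cold plate

print(round(case_temp_c(f_to_c(99.5), dry_cooler_stack), 1))     # from the DB design point
print(round(case_temp_c(f_to_c(79.7), cooling_tower_stack), 1))  # from the WB design point
```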

The above analysis has led the EE HPC WG to recommend the water temperature ranges shown in Table

B-3. Note these are the building-supplied water temperatures and not the IT loop water temperatures.

Table B-3: EE HPC WG Recommended Guidelines for Liquid Cooling Temperatures

