Data Centre Cooling Strategies: The Changing Landscape
Mark Deguara, Director Data Centre Solutions ANZ. E-mail: Mark.Deguara@emerson.com
Redefining ‘Huge’
State of the Data Centre, Emerson Network Power
Failure is Not An Option
State of the Data Centre, Emerson Network Power
Powerful Forces Are Driving Change in Data Centre Infrastructure
External forces changing the business climate, and business and technology forces pressing on the data centre:
– Virtualization / cloud
– Always-on availability expectations
– IT exerting greater influence than Facilities
– Higher density
– More with less
– Interdependent apps/functions
– Data centre and server consolidation
– Efficiency and green initiatives
– Data centres are using less power and cooling
– Data centre modularity
– Alternative power/cooling technologies
Today’s Data Centre Needs: Efficiency, Protection, Scalability, Simplification
– Efficiency ("Cheaper"): cost reduction, resource utilization, spending optimization
– Protection ("Stronger"): higher availability and SLA assurance, compliance and governance, accuracy
– Scalability ("Faster"): provisioning, permissioning, responsiveness
– Simplification ("Easier"): standardization, consistency, leveraging existing tools and skill sets
Need: a real-time, resilient data centre that makes the future of your business possible, with:
• the agility and flexibility to meet customer demands,
• the ability to adapt to rapid changes,
• high-efficiency operation, and
• compliance with requirements for risk management and protection.
6
Today’s Data Centre Needs Efficiency Protection Scalability Simplification
Cost reduction
Resource utilization
Spending
optimization
Higher availability, SLA assurance
Compliance and
governance
Accuracy
Provisioning
Permissioning
Responsiveness
Standardization
Consistency
Leverage existing tools and skill
sets
Cheaper Stronger Faster Easier
Need: a real-time, resilient data centre that makes the future of your business possible, with; • the agility and flexibility to meet customer demands, • adapt to rapid changes, • operate at high efficiency and • comply with requirements for risk management and protection
6
7
Advances in Data Centre Cooling over the Last Five Years
– Wider ASHRAE allowable range
– Hot and cold aisle containment
– Outside air economization
– EC/variable speed fans
– Variable capacity compressors
– Row-based cooling
– Intelligent cooling controls
Thermal Management Technology to Solve Customer Needs
– Efficiency / Economisation: higher return air and chilled water temperatures, variable capacity, aisle control, maximum economisation hours, new technology
– Efficient Capital / Modular / Speed: solutions that allow effective growth and use of capital in optimized building blocks, lowest maximum kW
– Solutions: engineered/optimized, ease of connectivity, custom to standard configurations
– Control / Intelligence: visibility and control from the unit and the aisle to the whole data centre
– Availability: systems and controls to measure, plus monitoring and service, to maintain the highest availability
Energy Efficiency, Lower Operating Costs
Energy consumption in a typical data centre (source: Energy Logic)
– Technologies and approaches to reduce operational expenditure (OPEX)
– Focus on operational efficiency of the data centre environment
– Simple assessments help identify energy consumption
Opportunity: reduce cooling to under 15% of consumption, or PUE < 1.20 (see the worked sketch below).
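As a rough illustration of that target, a minimal PUE sketch with hypothetical load figures (not values from the Energy Logic chart):

```python
# PUE = total facility power / IT equipment power.
# All load figures below are hypothetical, chosen to land under the
# "cooling < 15% of total, PUE < 1.20" opportunity named on the slide.

def pue(it_kw: float, cooling_kw: float, other_kw: float) -> float:
    """Power Usage Effectiveness of a facility."""
    return (it_kw + cooling_kw + other_kw) / it_kw

it_kw = 1000.0       # IT equipment load
cooling_kw = 120.0   # cooling plant
other_kw = 60.0      # power distribution losses, lighting, etc.

total_kw = it_kw + cooling_kw + other_kw
print(f"PUE = {pue(it_kw, cooling_kw, other_kw):.2f}")        # 1.18 < 1.20
print(f"Cooling share = {cooling_kw / total_kw:.0%}")          # ~10% < 15%
```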
The Data Centre World
ASHRAE (American Society of Heating, Refrigerating and Air-Conditioning Engineers) defines different working envelopes for temperature and humidity based upon installation and technology. The chart shows ASHRAE's typical representation of the pre-defined working envelopes.
Typical applications and their envelopes (see the envelope-check sketch after this list):
– Legacy DC with return control and high precision on humidity. Hardware: all servers. Temperature and humidity: 20°C – 25°C, 40% – 55% RH.
– Current DC with return or supply control and humidity control. Hardware: all servers. Temperature and humidity: 18°C – 27°C, 5.5°C dew point to 60% RH and 15°C dew point.
– Data centre with a focus on energy savings and enlarged limits on humidity. Hardware: enterprise servers, storage products. Temperature and humidity: 15°C – 32°C, 20% – 80% RH.
– Information and technology space or office. Hardware: volume servers, storage products, PCs, workstations. Temperature and humidity: 10°C – 35°C, 20% – 80% RH.
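A minimal sketch of checking a measured point against these envelopes, treating each as a simple temperature/RH rectangle. The envelope names are hypothetical labels, and the real "current" envelope also carries the dew-point limits quoted above, which need full psychrometrics and are omitted here:

```python
# Simplified temperature/RH envelopes from the slide (dew-point limits omitted).
ENVELOPES = {
    "legacy":      {"temp": (20.0, 25.0), "rh": (40.0, 55.0)},
    "current":     {"temp": (18.0, 27.0), "rh": (None, 60.0)},  # plus DP limits
    "energy_save": {"temp": (15.0, 32.0), "rh": (20.0, 80.0)},
    "office_it":   {"temp": (10.0, 35.0), "rh": (20.0, 80.0)},
}

def in_envelope(name: str, temp_c: float, rh_pct: float) -> bool:
    """True if the operating point sits inside the named rectangle."""
    env = ENVELOPES[name]
    t_lo, t_hi = env["temp"]
    rh_lo, rh_hi = env["rh"]
    ok_temp = t_lo <= temp_c <= t_hi
    ok_rh = (rh_lo is None or rh_pct >= rh_lo) and rh_pct <= rh_hi
    return ok_temp and ok_rh

print(in_envelope("legacy", 24.0, 50.0))   # True
print(in_envelope("legacy", 28.0, 50.0))   # False: too warm for a legacy DC
```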
Data Centre Cooling
[Diagram: matching cooling technology to building type – chilled water, direct expansion, evaporative cooling, and freecooling chiller with adiabatic pads; multi-story building vs. warehouse type]
How to Achieve Data Centre Thermal Management: 3 Main Cooling Technologies

Direct Expansion
– Configuration: indoor units with compressors connected to an outdoor condenser
– Cooling medium: refrigerant. Freecooling: pumped refrigerant, or indirect with water or fresh air
– Typical application: 10 kW to 500-700 kW data centres

Chilled Water
– Configuration: indoor units with heat exchanger, EC fans, valves and controls, connected to a freecooling chiller through large water pipes
– Cooling medium: water produced by a freecooling chiller. Freecooling: from the outdoor unit / fresh-air cooling
– Typical application: medium and large data centres, 0.1 to 5 MW per module

Air-to-Air Adiabatic
– Configuration: outdoor units connected at the perimeter of the building, recirculating the air from and to the data centre
– Cooling medium: fresh air plus vaporized water. Backup: either direct expansion or chilled water
– Typical application: warehouse-type data centres from 0.5 MW upwards
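A rough selection helper based only on the typical-application ranges above; the function name and the warehouse flag are illustrative assumptions, and real selection also weighs climate, water availability and building type:

```python
# Capacity-based sketch of the three technologies' typical application ranges.
def suggest_cooling(load_kw: float, warehouse: bool = False) -> str:
    if warehouse and load_kw >= 500:
        return "Air-to-air adiabatic (fresh air + evaporative, DX/CW backup)"
    if load_kw <= 700:
        return "Direct expansion (with pumped-refrigerant freecooling)"
    return "Chilled water from freecooling chillers (0.1 - 5 MW modules)"

for kw in (50, 400, 2500):
    print(f"{kw:>5} kW -> {suggest_cooling(kw)}")
```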
Direct Expansion with Pumped Economiser
– First-ever pumped refrigerant economizer mode, with operation down to a 1.05 pPUE
– Reliable, low-maintenance operation: no water usage or treatment; no outside air or damper maintenance; instant automatic economizer changeover
– iCOM control manages the system
– Larger building blocks of 80 to 150 kW for sites up to 3 MW
How Higher Return Air Helps Economisation
Annual free-cooling hours increase at higher return air conditions and lower unit loads. At 70% load and 30°C return air temperature, full economization is available at 10°C outdoor; Canberra reaches 90% of annual hours in full economization with no compressor operation.

Percentage of Annual Hours in Economisation Mode (at 80% load)
Location     30°C Return Air   35°C Return Air
Sydney       16%               54%
Melbourne    47%               80%
Canberra     61%               90%

[Chart: economisation capacity vs. outdoor ambient from -15°C to 38°C, at 24°C, 30°C, 35°C and 40°C return air]

A sketch of the hours-counting behind these percentages follows.
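A minimal sketch of how such percentages are derived: count the hours of a year whose outdoor temperature falls below the full-economisation threshold. The 10°C threshold at 30°C return air is quoted above; the synthetic Gaussian temperature series is only a stand-in for real hourly weather data:

```python
import random

def economisation_share(hourly_temps_c, threshold_c: float) -> float:
    """Fraction of annual hours in full economisation (no compressor running)."""
    return sum(1 for t in hourly_temps_c if t <= threshold_c) / len(hourly_temps_c)

random.seed(1)
temps = [random.gauss(13.0, 7.0) for _ in range(8760)]   # one synthetic year

print(f"Full economisation: {economisation_share(temps, 10.0):.0%} of hours")
```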
Climate Impact on Economisation
[Charts: annual economisation profiles for Melbourne, Sydney, Brisbane and Perth]
Economisation Needs Intelligent Controls
System components: Liebert DSE indoor unit, Liebert MC condenser, Liebert EconoPhase, Liebert iCOM.
Liebert DSE with iCOM benefits:
– Maximizes free-cooling hours through automatic economizer transition control
– Minimizes power consumption by optimizing component operation for the lowest total system power
– Provides built-in diagnostics to ensure proper system operation
– Provides an interface to Liebert data centre management systems
Monitored and controlled parameters: pump speed, refrigerant temperature, pump head, EconoPhase unit status, condenser fan speed, ambient temperature, head pressure, Liebert MC unit status, unit capacity, room temperature, evaporator fan speed, room humidity. A changeover sketch follows.
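The actual iCOM transition logic is proprietary; the following is only a hedged sketch of the general idea: switch to pumped-refrigerant economiser mode when the ambient falls far enough below the return air temperature, with hysteresis so the system does not oscillate around the changeover point. The approach and hysteresis values are assumptions:

```python
def next_mode(mode: str, ambient_c: float, return_air_c: float,
              approach_c: float = 20.0, hysteresis_c: float = 2.0) -> str:
    """Return 'economiser' or 'compressor' for the next control interval."""
    changeover = return_air_c - approach_c      # e.g. 30°C RAT -> 10°C ambient
    if mode == "compressor" and ambient_c <= changeover - hysteresis_c:
        return "economiser"
    if mode == "economiser" and ambient_c >= changeover + hysteresis_c:
        return "compressor"
    return mode                                  # inside the deadband: hold

mode = "compressor"
for ambient in (15.0, 9.0, 7.5, 11.0, 12.5):
    mode = next_mode(mode, ambient, return_air_c=30.0)
    print(f"ambient {ambient:>4}°C -> {mode}")
```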
Adiabatic FreeCooling Chillers
– Freecooling with adiabatic pads
– Adiabatic cooling
– 100% compressor back-up

High Efficiency and High Availability
Intelligent controls provide the most efficient operating mode for the chillers while also providing protection during extreme ambient temperatures (50°C) or loss of water. Operating modes, from low to high ambient temperature (see the selection sketch below):
– Freecooling
– Adiabatic freecooling
– Hybrid cooling
– Adiabatic mechanical cooling
– Safe mode during water shortages
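A hedged sketch of that mode ladder, choosing a mode from ambient temperature and water availability; all thresholds are illustrative assumptions, not Liebert control setpoints:

```python
def chiller_mode(ambient_c: float, water_ok: bool) -> str:
    """Pick an operating mode; threshold temperatures are assumptions."""
    if not water_ok:
        return "SAFE MODE (100% compressor back-up, no water)"
    if ambient_c < 15:
        return "FREECOOLING"
    if ambient_c < 25:
        return "ADIABATIC FREECOOLING"
    if ambient_c < 35:
        return "HYBRID COOLING (freecooling + compressors)"
    return "ADIABATIC MECHANICAL COOLING"   # up to 50°C extreme ambient

for t in (10, 20, 30, 45):
    print(f"{t}°C -> {chiller_mode(t, water_ok=True)}")
print(f"30°C, no water -> {chiller_mode(30, water_ok=False)}")
```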
Evaporative Cooling Applications
Liebert EVI Indirect Evaporative Air Handler

Season        Cooling Mode   Dry Bulb (°C)   Wet Bulb (°C)   Litres per Hour   Cooling PUE
Winter        Dry            5               -               -                 1.10
Spring/Fall   Wet            20              14              540               1.14
Summer        Wet w/ DX      35              24              675               1.21

– Capacities from 150 to 400+ kW with DX or CW trim
– PUE < 1.20 with no outside air
– Performance based on climate and operating temperatures
– Potential for a lower maximum kW
– Higher-Tier applications require on-site water storage or larger DX/CW trim
Indirect Evaporative Cooling – Where It Works
[Chart: operating regions for the dry heat exchanger, the wetted heat exchanger, and the wetted heat exchanger plus trim cooling, assuming 24°C supply and 35°C return air at 100% load]
A seasonal mode sketch follows.
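A hedged sketch of the seasonal mode choice implied by the EVI table above: dry heat exchange when the outdoor dry bulb is low enough, wetted (adiabatic) heat exchange when the wet bulb still reaches the supply target, and DX/CW trim otherwise. The approach temperatures are illustrative assumptions:

```python
def evi_mode(dry_bulb_c: float, wet_bulb_c: float,
             supply_c: float = 24.0, dry_approach_c: float = 8.0,
             wet_approach_c: float = 6.0) -> str:
    """Pick an indirect-evaporative operating mode for the hour."""
    if dry_bulb_c + dry_approach_c <= supply_c:
        return "dry heat exchanger"
    if wet_bulb_c + wet_approach_c <= supply_c:
        return "wetted heat exchanger"
    return "wetted heat exchanger + DX/CW trim"

# The three seasonal conditions from the table above:
for db, wb in ((5.0, 2.0), (20.0, 14.0), (35.0, 24.0)):
    print(f"{db}°C DB / {wb}°C WB -> {evi_mode(db, wb)}")
```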
Liebert EVD Direct Evaporative Air Handler
– Capacities from 150 to 600 kW with DX or CW trim
– PUE < 1.10, requiring outside air (actual performance based on climate)
– Potentially higher humidity and operating temperatures
– Lower maximum kW
– Typically used in lower-Tier applications
27
Direct Evaporative Cooling Customers Range – Where it works
Hyperscale / Aggressive
Operations
ASHRAE Recommended
Operations
27
Most data centres will implement a dew point limit
at or near the ASHRAE upper level of15oC
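A quick check against that 15°C limit, using the standard Magnus approximation for dew point (a general formula, not anything specific to this deck):

```python
import math

def dew_point_c(temp_c: float, rh_pct: float) -> float:
    """Dew point via the Magnus approximation (water, roughly -45..60°C)."""
    a, b = 17.62, 243.12
    gamma = math.log(rh_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# The warm, humid corner of the 18-27°C envelope already brushes the limit:
dp = dew_point_c(temp_c=27.0, rh_pct=50.0)
print(f"Dew point: {dp:.1f}°C -> {'OK' if dp <= 15.0 else 'above 15°C limit'}")
```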
Liebert XD – Primary or Supplemental Cooling Designed for High Density
– Localized cooling modules: cooling at the source, added on demand
– 100% sensible cooling
– Pumped refrigerant
– Works with any rack
New Liebert XDR, back of cabinet:
– Room neutral
– No fans (no electrical connections)
– Flexible: fits any rack
– Future economizer
– Energy saving of 50%+ (Liebert XDR)
Modular Constructed Data Centre?
Traditional "Stick-Build" (Design > Build > Install)
– Build = creating the physical structure
– Install = adding mechanical/electrical plant and IT eqpt.
Container Solution (Design > Manufacture > Install)
– Manufacture = fabricating the container in a factory, including mechanical/electrical plant
– Install = placing the container inside/outside a structure, connecting mechanical/electrical plant, adding IT eqpt.
Modular Constructed Data Centre, MCDC (Design > Manufacture Off-Site > Assemble On-Site)
– The MCDC becomes the building!
– Examples: Emerson Data Centre; NBN Australia Data Centre
DCIM – Data Centre Infrastructure Management: Site Manager
DCIM Evolution Roadmap (capability vs. value)

Basic – "What do I have?"
• Remote data access
• Ad hoc processes
• Limited tools
• Unclear roles
• Problems not understood

Reactive – "What is it doing?"
• Monitor / alerts
• Status / health views
• Little prioritization
• Tools are isolated
• Few standards
• Simple KPIs
• Problems understood

Proactive – "What should I do?"
• Real-time data
• Basic information models
• Policies defined
• Collaboration
• Holistic tools
• KPIs aligned to objectives
• Problems are anticipated

Optimized – "How do I improve?" (the real-time data centre)
• Predictive modeling
• Root cause analysis
• Automated policy and exception management
• Active identification of cost reduction
• Avoidance of problem situations

Autonomic – "Do it for me!" (the real-time, resilient data centre)
• Automated policies
• Auto-response to events
• Self-diagnosis / healing
• Tight integration to business processes
DCIM Intelligence Delivers Capacity Management, Thermal Visualization and More
– 3D model of the data centre showing the thermal conditions
– 3D rotational view of the data centre floor and inventory
– Thermal visualization of the front/rear of IT racks
Capacity stack and thermal visualization metrics:
– Electrical power consumption
– Airflow
– Gross cooling capacity (ex CW)
– Delivered net sensible capacity
– Remaining capacity
– Predictive diagnostics
Intelligent sensors provide insight to optimize efficiency and capacity. A remaining-capacity sketch follows.
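A minimal sketch of the remaining-capacity arithmetic in such a capacity stack; the record layout and unit names are hypothetical, not a DCIM data model:

```python
# Remaining cooling capacity per unit = delivered net sensible capacity
# minus the IT heat load it currently absorbs.
units = [
    {"name": "CRAC-01", "net_sensible_kw": 80.0, "it_load_kw": 55.0},
    {"name": "CRAC-02", "net_sensible_kw": 80.0, "it_load_kw": 71.0},
]

for u in units:
    remaining = u["net_sensible_kw"] - u["it_load_kw"]
    utilisation = u["it_load_kw"] / u["net_sensible_kw"]
    print(f"{u['name']}: {remaining:.0f} kW remaining ({utilisation:.0%} utilised)")
```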
Wireless Sensors Can Help
– Wireless sensors support dynamic IT environments and give flexible monitoring and control of the cooling systems
– Lower deployment costs compared to fixed sensors
– Measurements include temperature, humidity, pressure and power
– Ultra-low power consumption ensures long battery life (5+ years)
Architecture: wireless sensor devices, wireless gateway, control system. A consumption sketch follows.
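A minimal sketch of consuming such readings and alerting against the ASHRAE recommended inlet range from the earlier envelope slides; the record format and sensor names are hypothetical, not the actual gateway protocol:

```python
# Hypothetical readings as delivered by a gateway; not a real Liebert format.
READINGS = [
    {"sensor": "rack-12-front", "temp_c": 23.5, "rh_pct": 45.0},
    {"sensor": "rack-17-front", "temp_c": 29.1, "rh_pct": 38.0},
]

TEMP_RANGE = (18.0, 27.0)   # recommended inlet range from the earlier slides

for r in READINGS:
    lo, hi = TEMP_RANGE
    if not (lo <= r["temp_c"] <= hi):
        print(f"ALERT {r['sensor']}: {r['temp_c']}°C outside {lo}-{hi}°C")
```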
Thermal Management System Keys to a Successful Data Centre Design
Why: resiliency, OPEX efficiency, CAPEX cost, SLA.

Containment
– Drives improved reliability through controlled supply air temperatures to the rack
– Critical driver for higher return air temperatures to the cooling unit
– Higher return air temperatures increase unit capacities, lowering the unit count

Economization
– Reduced mechanical refrigeration energy
– Improved life (MTBF)
– Sustainable control of operating costs

Control
– Adjusts and optimizes the cooling system to the IT load
– Networking for optimization, fault tolerance and information sharing
– Maintains operating conditions (ASHRAE guidelines)

Intelligent Monitoring
– Means to verify optimal performance
– Anticipates / predicts critical issues that could impact reliability and costs
– Supports planning for change and growth

Scalable Design
– Time value of money; reduced waste of excess capacity
– Takes advantage of the latest technology as improvements come to market
– Speed of deployment
Emerson's Thermal Management Data Centre Product Portfolio
– Room/Perimeter Cooling: Liebert PCW, Liebert PDX Digital Scroll, Liebert PEX DX & CHW, Liebert DSE, Liebert DM, Liebert HPS
– Rack and In Row: Liebert CRV, Liebert XDV, Liebert XDH, Knürr DCD, Knürr DCL
– Evaporative Air Handlers/Room Cooling: Liebert EVI, Liebert EVD
– Building Free Cooling Chillers: Liebert HPC S, Liebert HPC M, Liebert HPC L, Liebert AFC