High Density Cooling


To 30kW and Beyond: High-Density Infrastructure Strategies that Improve Efficiencies and Cut Costs


Emerson Network Power: The global leader in enabling Business-Critical Continuity

© 2010 Emerson Network Power

Uninterruptible Power · Power Distribution · Surge Protection · Transfer Switching · DC Power · Precision Cooling · High Density Cooling · Racks · Rack Monitoring · Sensors and Controls · KVM · Real-Time Monitoring · Data Center Software


Emerson Network Power – An organization with established customers



Presentation topics
• Emerson Network Power overview
• “High Density Equals Lower Cost: High Density Design Strategies for Improving Efficiency and Performance,” Steve Madara, Vice President and General Manager, Liebert North America Precision Cooling, Emerson Network Power
• “Sandia National Laboratories’ Energy Efficient Red Sky Design,” David Martinez, Facilities Coordinator, Computing Infrastructure and Support Operations, Sandia National Laboratories
• Question and answer session



High Density Equals Lower Cost: High Density Design Strategies for Improving Efficiency and Performance
Steve Madara
Vice President and General Manager, Liebert North America Precision Cooling
Emerson Network Power



Agenda
• Industry trends and challenges
• Three strategies for enhancing cooling efficiency
• High density equals lower cost



Industry Trends and Challenges



Trends and challenges
• Server design: higher ΔT across the server, raising server leaving-air temperatures
[Diagram: traditional cooling unit]



Trends and challenges
• Regulatory
 – ASHRAE 90.1 Standard: 2010 revisions
 – Refrigerant legislation
 – Discussion around lower dew point limits
• Raising cold aisle temperatures
 – Intended to improve cooling capacity and efficiency
 – Monitor the impact on server fans



Spring 2010 Data Center Users’ Group survey
[Charts: average and maximum kW per rack, now and in two years. Average kW per rack: ~8 kW now, 10–12 kW expected in two years. Maximum kW per rack: >12 kW now, 14–16 kW expected in two years. Response bands ranged from 2 kW or less up to greater than 24 kW, plus “unsure.”]


Spring 2010 Data Center Users’ Group survey: Top data center issues
• Experienced hot spots: 40%
• Run out of power: 26%
• Experienced an outage: 23%
• Experienced a “water event”: 23%
• N/A, have not had any issues: 23%
• Run out of cooling: 18%
• Run out of floor space: 16%
• Excessive energy costs: 13%
• Other: 2%



Spring 2010 Data Center Users’ Group survey: Implemented or considered technologies
[Stacked-bar chart. Technologies surveyed: fluid economizer on chiller plant, fluid economizer using dry coolers, air economizer, cold aisle containment, containerized/modular data center, DC power, wireless monitoring, solar array, and rack-based cooling. Response categories: already implemented, plan to implement, still considering, considered but decided against, will not consider, and unsure.]




Impact of design trends on the cooling system
[Diagram: a 4 kW server with an 18 °F ΔT (60 °F in, 78 °F out) draws 70% of the CRAC airflow; the other 30% by-passes the servers at 60 °F. The CRAC sees ~72 °F return air, a 16 °F operating ΔT against a 21 °F design ΔT, and supplies ~58 °F air.]
• Legacy data center designs with poor airflow management resulted in significant by-pass air
 – A typical CRAC unit is designed for a ΔT of around 21 °F at 72 °F return air
 – Low-ΔT servers plus significant by-pass air lowered the actual CRAC ΔT, and with it the unit’s efficiency
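A minimal sketch of the by-pass mixing described above, using the slide’s legacy-example numbers. The mass-weighted mixing of the two air streams is the only physics assumed; duct and underfloor heat gains are ignored, which is why the slide’s 16 °F operating ΔT comes out a little higher than this estimate.

```python
# Sketch: why by-pass air drags down CRAC return temperature and dT.
# Values from the slide's legacy example; simple mass-weighted mixing
# of the by-pass and server-exhaust streams is assumed.

supply_f = 60.0        # cold-aisle supply temperature, deg F
server_dt_f = 18.0     # temperature rise across the server, deg F
bypass_frac = 0.30     # fraction of CRAC airflow by-passing the servers

server_exhaust_f = supply_f + server_dt_f              # 78 F hot aisle
return_f = (bypass_frac * supply_f
            + (1.0 - bypass_frac) * server_exhaust_f)  # mixed return air

print(f"return air ~{return_f:.1f} F, "
      f"CRAC dT ~{return_f - supply_f:.1f} F vs. a 21 F design dT")
# ~72.6 F return and ~12.6 F dT: the unit runs well below its design
# dT, so it delivers less capacity per CFM than intended.
```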


Impact of higher temperature-rise servers
[Diagram: an 8 kW server with a 35 °F ΔT (56 °F in, 91 °F out) draws only 60% of the CRAC airflow; the other 40% by-passes the servers at 56 °F, giving a 75 °F hot aisle, 75 °F return air, a 21 °F CRAC ΔT, and 54 °F supply air.]
• Even with improved best practices, newer servers can create challenges in existing data centers
 – A server ΔT greater than the CRAC unit’s ΔT capability means the CRAC must move more airflow than the servers do to meet the server kW load
 – The high temperatures exiting the server require even more careful management of the return air to the cooling unit
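A minimal sketch of the airflow mismatch, assuming the standard-air sensible heat relation Q[BTU/hr] ≈ 1.085 × CFM × ΔT[°F]; the load and temperature figures come from the slide.

```python
# Sketch: airflow required to carry a sensible load at a given dT,
# using Q[BTU/hr] ~= 1.085 * CFM * dT[F] for standard air.

BTU_PER_KWH = 3412.0   # BTU/hr per kW

def cfm_required(load_kw: float, delta_t_f: float) -> float:
    """Airflow (CFM) needed to remove load_kw at a delta_t_f rise."""
    return load_kw * BTU_PER_KWH / (1.085 * delta_t_f)

server_cfm = cfm_required(8.0, 35.0)  # high-dT server: ~719 CFM
crac_cfm = cfm_required(8.0, 21.0)    # CRAC at its 21 F design dT: ~1198 CFM

print(f"server {server_cfm:.0f} CFM vs. CRAC {crac_cfm:.0f} CFM "
      f"-> server draws {server_cfm / crac_cfm:.0%} of CRAC airflow")
# ~60%, so ~40% of the CRAC airflow becomes by-pass air, matching the
# slide's "CFM - 60%" and "by-pass air - 40%" figures.
```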


Introducing Efficiency Without Compromise™
Improving performance of the IT infrastructure and environment:
• Balancing high levels of availability and efficiency
• Adapting to IT changes for continuous optimization and design flexibility
• Delivering architectures from 10 to 60 kW/rack to minimize space and cost
• Expertise and support


Three Strategies for Enhancing Cooling Efficiency



Strategy 1: Getting the highest temperature to the cooling unit
• Higher return air temperatures increase the cooling unit’s capacity and efficiency
• Increases the CRAC unit ΔT
• Increases the sensible heat ratio (SHR)
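A minimal sketch of the SHR point: SHR is sensible cooling divided by total cooling, so keeping the coil above the return air dew point pushes it toward 1.0. The capacity splits below are illustrative assumptions.

```python
# Sketch: sensible heat ratio (SHR). Capacity splits are illustrative.

def shr(sensible_kw: float, latent_kw: float) -> float:
    """SHR = sensible cooling / (sensible + latent) cooling."""
    return sensible_kw / (sensible_kw + latent_kw)

# Cool, humid return air: part of the coil's capacity condenses moisture.
print(f"SHR with dehumidification: {shr(80.0, 20.0):.2f}")  # 0.80
# Warmer return air, coil above dew point: almost no latent work, so
# nearly all capacity removes server heat.
print(f"SHR with a near-dry coil:  {shr(98.0, 2.0):.2f}")   # 0.98
```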


Raising the temperature at the room level
[Diagram: without containment, hot air wrap-around occurs and limits the maximum return temperature; with containment, wrap-around is eliminated and server supply temperatures are controlled]
• Impacted by:
 – The length of the rack rows (long rows are difficult to predict)
 – Floor tile placement
 – Rack consistency and load
• Improved with:
 – Ducted returns
 – Local high density modules
• Containment can be at the room or zone level
• Supply air control provides consistency between containments
• Can be used in combination with high density modules or row-based cooling


Strategy 2: Providing the right cooling and airflow
• Efficiency is gained when:
 Server load (kW) = cooling unit capacity (kW)
 Server airflow = cooling unit airflow
• Challenges (see the sketch below)
 – Rising server ΔT results in higher by-pass air when units run at full airflow to meet the cooling load
 – Matching requires variable speed fans and control of the airflow independent of the cooling capacity
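A minimal sketch of airflow control decoupled from capacity control. The controller, setpoints, and unit rating are hypothetical; the point is that fan speed follows the servers’ airflow demand while capacity is trimmed separately.

```python
# Sketch: set fan speed from airflow demand, not from cooling capacity.
# Unit rating and loads are illustrative assumptions.

BTU_PER_KWH = 3412.0

def required_cfm(load_kw: float, server_dt_f: float) -> float:
    # Match the servers' own airflow (Q ~= 1.085 * CFM * dT) instead of
    # over-blowing at the cooling unit's design dT.
    return load_kw * BTU_PER_KWH / (1.085 * server_dt_f)

def fan_speed_fraction(load_kw: float, server_dt_f: float,
                       unit_max_cfm: float) -> float:
    return min(1.0, required_cfm(load_kw, server_dt_f) / unit_max_cfm)

# 80 kW of high-dT (35 F) servers on a unit rated for 12,000 CFM:
speed = fan_speed_fraction(80.0, 35.0, unit_max_cfm=12_000.0)
print(f"fan speed ~{speed:.0%} of maximum")   # ~60%
# Capacity (chilled-water valve or compressor staging) is then adjusted
# separately to hold the supply-air temperature setpoint.
```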


Strategy 3: Provide the most efficient heat rejection components
• Reduce cooling unit fan power
• Maximize use of economization mode to reduce compressor hours of operation (chiller/compressor)
• Improve condenser heat transfer and reduce power
• Direct cooling: eliminate fans and compressor hours
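A minimal sketch of the fan-power bullet: fan affinity laws say shaft power scales with the cube of speed, so variable speed fans that slow down to match demand save disproportionately. The 8.5 kW rating reuses the traditional-unit figure quoted later in this deck.

```python
# Sketch: fan affinity law -- power scales with the cube of fan speed.

def fan_power_kw(rated_kw: float, speed_fraction: float) -> float:
    """Shaft power at a given fraction of full fan speed."""
    return rated_kw * speed_fraction ** 3

rated = 8.5   # traditional unit: fan kW per 100 kW of cooling (slide)
for speed in (1.0, 0.8, 0.6):
    print(f"{speed:.0%} speed -> {fan_power_kw(rated, speed):.1f} kW")
# 80% speed needs ~51% of full power; 60% speed only ~22%.
```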


High Density Equals Lower Cost



High density equals lower cost
Adding 2,000 kW of IT at 20 kW/rack vs. 5 kW/rack:
• Smaller building for high density
• Fewer racks for high density
• Capital equipment costs more
• Equipment installation costs are higher
• High density cooling is more efficient

Cost difference, low density vs. high density (parentheses mean the high density design costs less):
 Building capital costs @ $250/sq. ft.:      ($1,875,000)
 Rack and PDU capital costs @ $2,500 each:   ($750,000)
 Cooling equipment capital costs:            $320,000
 Installation costs:                         $750,000
 Capital cost total:                         ($1,555,000)
 Cooling operating costs (1 yr):             ($420,480)
 Total net savings of a high density design: ($1,975,480)
 5 yr total net savings of high density:     ($3,657,400)
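A minimal sketch reproducing the slide’s cost roll-up; the line items are the slide’s own, with negative values standing in for the parenthesized savings.

```python
# Sketch: the slide's cost roll-up. Negative numbers mean the high
# density design costs less than the low density design.

capital_items = {
    "building @ $250/sq. ft.":     -1_875_000,
    "racks and PDUs @ $2,500 ea.":   -750_000,
    "cooling equipment":              320_000,
    "installation":                   750_000,
}
annual_cooling_opex = -420_480  # high density cooling runs cheaper

capital_total = sum(capital_items.values())
print(f"capital cost total: {capital_total:>12,}")                       # -1,555,000
print(f"1 yr net savings:   {capital_total + annual_cooling_opex:>12,}")
print(f"5 yr net savings:   {capital_total + 5 * annual_cooling_opex:>12,}")
# -1,975,480 after one year and -3,657,400 after five, matching the slide.
```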


Total cost of ownership benefits
Liebert XD vs. traditional cooling:
• Traditional cooling: fan power of 8.5 kW per 100 kW of cooling; average entering air temperature of 72–80 °F
• Liebert XD: fan power of 2 kW per 100 kW of cooling; average entering air temperature of 95–100 °F
[Chart: chiller capacity (kW) per kW of sensible heat load, broken into sensible, fan, and latent components, for a traditional CW CRAC, a CW enclosed rack, and refrigerant modules; y-axis 0.00–1.40]
• 65% less fan power
• Greater cooling coil effectiveness
• 100% sensible cooling
• 20% less chiller capacity required
• Overall 30% to 45% energy savings, yielding a payback as short as one year
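A minimal sketch of the “20% less chiller capacity” claim: the chiller must absorb the server heat plus the fan heat plus any latent load the coil picks up. The fan figures are the slide’s; the traditional unit’s latent fraction is an illustrative assumption.

```python
# Sketch: chiller capacity needed per kW of sensible (server) load.

def chiller_kw_per_kw_sensible(fan_kw_per_100kw: float,
                               latent_fraction: float) -> float:
    # server heat + fan heat + latent (dehumidification) load
    return 1.0 + fan_kw_per_100kw / 100.0 + latent_fraction

traditional = chiller_kw_per_kw_sensible(8.5, latent_fraction=0.15)  # assumed latent
xd_modules = chiller_kw_per_kw_sensible(2.0, latent_fraction=0.0)    # 100% sensible

print(f"traditional CW CRAC: {traditional:.2f} kW chiller per kW load")
print(f"refrigerant modules: {xd_modules:.2f} kW chiller per kW load")
print(f"reduction: {1 - xd_modules / traditional:.0%}")
# ~17% with these assumptions, broadly in line with the slide's ~20%.
```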


Solutions transforming high density to high efficiency
[Diagram: the Liebert XD family. Base infrastructure (160 kW): XDC or XDP pumping units, with future pumping units of larger capacity and N+1 capability. Standard cooling modules (10–35 kW): XDO20, XDV10, XDH20/32. New and future product configurations: XDR10/40 passive rear door (10–40 kW); XDS20 component cooling without server fans (20–40 kW); larger modules of 35–75 kW; and embedded cooling in supercomputers using microchannel intercoolers, capable of >100 kW.]


Rear door cooling
• Passive rear door cooling module
 – No cooling unit fans: server airflow only
 – Optimal containment system
 – Allows data centers to be designed in a “cascading” mode
• Simplifies the cooling requirements
 – Solves the problem for customers without hot and cold aisles
 – Room neutral: does not require airflow analysis
 – No electrical connections


[Diagrams: cascade effect and customer layout for rear door cooling]


Direct server cooling without server fans
• A cooling rack that uses Clustered Systems cooling technology to move heat directly from the server to the Liebert XD pumped refrigerant system; heat is never transferred to the air
• Provides cooling for 36 1U servers
• Available initially in 20 kW capacity, with a 40 kW rack in the future
Benefits:
• No cooling module fans or server fans
• 8% to 10% smaller IT power requirement
• Chill Off 2 testing at 80 °F fluid temperature resulted in an effective PUE < 1
• Next opportunity: XDP connection to a cooling tower without a chiller
[Diagram: Liebert XDS configuration, with cold plates over CPU 0 and CPU 1]
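A minimal sketch of how an “effective PUE” below 1 is possible. PUE is facility power over IT power; the slide’s comparison keeps the original fan-equipped IT load as the denominator, and the cooling overhead figure below is an illustrative assumption.

```python
# Sketch: "effective PUE" < 1 when server fans are eliminated.
# Baseline and cooling overhead are illustrative assumptions; the
# 8-10% fan share is the slide's figure.

baseline_it_kw = 100.0                            # IT load incl. server fans
it_without_fans_kw = baseline_it_kw * (1 - 0.09)  # fans removed: ~91 kW
cooling_kw = 6.0  # assumed pump + heat-rejection power at 80 F fluid,
                  # with no chiller in the loop

effective_pue = (it_without_fans_kw + cooling_kw) / baseline_it_kw
print(f"effective PUE: {effective_pue:.2f}")      # 0.97 -> below 1.0
```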


Energy consumption comparisons
[Chart: cooling energy consumption by approach; the direct-cooled configuration reaches an equivalent PUE < 1]


Industry evolving technology to improve data center cooling efficiency



Sandia National Laboratories’ Energy Efficient Red Sky Design
David J. Martinez
Facilities Coordinator, Corporate Computing Facilities
Sandia National Laboratories

Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000.


Why build Red Sky? What happened to consolidation?
• Addresses a critical national need for High Performance Computing (HPC)
• Replaces an aging HPC system



System comparison

 Thunderbird                 Red Sky
 140 racks (large)           36 racks (small)
 50 teraflops total          10 teraflops per rack
 ~13 kW/rack at full load    ~32 kW/rack at full load
 518 tons of cooling         328 tons of cooling
 12.7M gal water per yr      7.3M gal water per yr


Red Sky and new technologies
The system design implemented three new technologies for power and cooling:
• Modular power distribution units
• Liebert XDP refrigerant cooling units
• Glacier Doors
The facility had to be prepared…



Facility preparation
• 3.5 months
• Zero accidents
• 0.5 miles of copper
• 650 brazed connections
• 400 ft. of carbon steel
• 140 welded connections



[Photos: facility preparation]


New cooling solutions
• Glacier Doors: the first rack-mounted, refrigerant-based passive cooling system on the market
• XDP refrigerant units: pumping units that serve as an isolating interface between the building chilled water system and the pumped refrigerant (R-134a) circuit
 – Operates above the dew point
 – No compressor
 – Power is used to cool the computer, not to dehumidify
 – 100% sensible cooling at 0.13 kW per kW of cooling
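A minimal sketch of what “100% sensible at 0.13 kW per kW cooling” implies for the pumped refrigerant stage; the loop load is an illustrative assumption.

```python
# Sketch: XDP pumping-stage efficiency. Load figure is illustrative.

xdp_kw_per_kw = 0.13   # slide: XDP power per kW of heat removed
it_load_kw = 500.0     # assumed IT heat load on the XDP loop

pump_power_kw = xdp_kw_per_kw * it_load_kw
cop = 1.0 / xdp_kw_per_kw   # heat moved per unit of input power

print(f"XDP power for {it_load_kw:.0f} kW of load: {pump_power_kw:.0f} kW")
print(f"effective COP of the pumping stage: {cop:.1f}")   # ~7.7
# The loop runs above the dew point with no compressor, so none of this
# power is spent dehumidifying the room.
```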


The total cooling solution: how it works
• 90% of the heat load is removed by the Liebert XDP and Glacier Door combination
• Uses a laminar air flow concept
• Perforated tiles are only needed in the first row


Laminar air flow
[Diagram: laminar air flow across the Red Sky rows]


How it all comes together
[Diagram: the chiller plant supplies 45 °F water to the Liebert XDP units serving the Glacier Doors]


How it all comes together
[Diagram: April to Sept., the chiller plant (0.46 kW of power per ton of cooling) supplies the Liebert XDP units and plug fan CRAC units, with loop temperatures of roughly 52–61 °F; Oct. to March, a plate frame heat exchanger (0.2 kW of power per ton of cooling) takes over from the chiller]
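A minimal sketch of the seasonal split shown above, at Red Sky’s 328-ton load. Continuous full-load operation in each half of the year is an assumption for illustration.

```python
# Sketch: heat-rejection energy with a winter water-side economizer.
# 0.46 kW/ton (chiller) and 0.2 kW/ton (plate frame HX) are the
# slide's figures; constant full load is assumed.

TONS = 328.0
HALF_YEAR_HOURS = 365 * 24 / 2   # Apr-Sep vs. Oct-Mar

chiller_kwh = 0.46 * TONS * HALF_YEAR_HOURS      # summer: chiller plant
economizer_kwh = 0.20 * TONS * HALF_YEAR_HOURS   # winter: plate frame HX

saving = 1 - (chiller_kwh + economizer_kwh) / (2 * chiller_kwh)
print(f"chiller half-year:    {chiller_kwh:,.0f} kWh")
print(f"economizer half-year: {economizer_kwh:,.0f} kWh")
print(f"saving vs. running the chiller year-round: {saving:.0%}")  # ~28%
```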


Comparison of compute power and footprint
[Chart: compute power and footprint, Thunderbird vs. Red Sky]


Tons of cooling used
[Chart: tons of cooling used, Thunderbird (518 tons) vs. Red Sky (328 tons)]


Annual water consumption
[Chart: annual water consumption, Thunderbird (12.7M gal) vs. Red Sky (7.3M gal)]


Carbon footprint
[Chart: CO2E footprint. Thunderbird: 912 tonnes CO2E, 205 gha; Red Sky: 203 tonnes CO2E, 46 gha]


Chiller plant power consumption and cost
[Chart: 37% reduction. Thunderbird (518 tons of cooling): 15,954,483 kWh per year, $1,324,222 per year; Red Sky (328 tons of cooling): 10,102,452 kWh per year, $838,504 per year]


Energy consumption
[Chart: 77% reduction in cooling unit energy. Thunderbird (21 CRACs): 1,527,604 kWh per year, $126,791 per year; Red Sky (12 XDPs and 3 CRACs): 340,437 kWh per year, $28,256 per year]
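A minimal sketch checking the quoted reduction percentages against the slides’ own kWh figures:

```python
# Sketch: verify the quoted reductions from the charts' data points.

def reduction(before_kwh: float, after_kwh: float) -> float:
    return 1 - after_kwh / before_kwh

chiller_plant = reduction(15_954_483, 10_102_452)  # Thunderbird -> Red Sky
cooling_units = reduction(1_527_604, 340_437)      # 21 CRACs -> 12 XDPs + 3 CRACs

print(f"chiller plant energy: {chiller_plant:.0%} reduction")  # ~37%
print(f"cooling unit energy:  {cooling_units:.0%} reduction")  # ~78% (slide: 77%)
```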


Q&A
• Steve Madara, Vice President and General Manager, Liebert North America Precision Cooling, Emerson Network Power
• David J. Martinez, Facilities Coordinator, Corporate Computing Facilities, Sandia National Laboratories

Thank you for joining us!
• Look for more webcasts coming this fall!
• Follow @EmrsnNPDataCntr on Twitter


