Free Cooling in Data Centers: Limits and Best Practices
- Mar 9

Free cooling can dramatically reduce data center cooling energy—if you respect its limits.
In practice, “free cooling” (also called economizer cooling) means using favorable outdoor conditions to remove heat with minimal compressor or chiller runtime. This article explains the main technical constraints (temperature, humidity, air quality, water, controls, and reliability) and the best practices to design and operate free cooling safely—without compromising uptime.
Score Group supports organizations at the intersection of Energy, Digital, and New Tech, where efficiency embraces innovation, to improve performance and sustainability with solutions adapted to each need.
Data centers are under rising pressure: the IEA estimates data centres accounted for about 1.5% of global electricity consumption in 2024 (about 415 TWh), with growth expected to continue strongly. (hkdca.com) Free cooling is one of the most impactful levers because it targets a major non-IT load: cooling infrastructure.
For an overview of our approach to data center design and optimization, visit Score Group’s DataCenters expertise (Noor ITS) and Score Group – Conseil et Solutions Énergétiques et Digitales.
What “free cooling” really means in a data center
Definition: economization, not “zero energy” cooling
Free cooling is the use of outdoor conditions to remove heat with reduced mechanical refrigeration. It is “free” in the sense that you avoid compressor work for many hours, but you still consume energy for fans, pumps, controls, filtration, and sometimes humidification/dehumidification.
Why it matters: cooling savings can be major
Free cooling impact is highly site- and design-dependent, but documented case studies show large reductions in cooling energy. For example, a retrofit of the National Snow and Ice Data Center (NSIDC) reduced cooling energy by over 70% in summer and over 90% in cooler winter months using airside economization and indirect evaporative cooling (study published in 2012). (research-hub.nrel.gov)
Free cooling operating envelope: the limits you cannot ignore
1) IT inlet temperature and humidity constraints (ASHRAE guidance)
Your free cooling strategy must keep server inlet conditions within equipment specifications. ASHRAE’s TC 9.9 guidance is widely used to frame safe operating envelopes. In the ASHRAE TC 9.9 power-related white paper (2016), the recommended range for classes A1–A4 is 18–27°C dry bulb, with a dew point between −9°C and 15°C and a maximum of 60% relative humidity (non-condensing). (ashrae.org)
Reality check: many data halls include mixed IT generations; ASHRAE also notes that implementing economization can be challenging when you have different thermal specs and equipment vintages. (ashrae.org)
2) Outdoor humidity and the “hidden” dehumidification penalty
Outdoor air can be thermally “good” (cool enough) but still unusable because humidity is outside recommended/allowable limits. ASHRAE explicitly notes that humidity outside limits can preclude air-side economization even when outdoor temperature is favorable. (ashrae.org)
Best practice implication: if your concept depends on frequent dehumidification or humidification to make outdoor air “fit,” you may erase a significant part of expected savings.
3) Air quality, filtration, and corrosion risk
Direct air-side economizers introduce outdoor contaminants into the IT space. ASHRAE notes that, depending on local air quality, direct air-side economization can bring pollutants into the data center and may require additional filtration. (ashrae.org)
This is not only about dust: airborne contamination can accelerate corrosion and reduce reliability of electrical equipment; ASHRAE highlights how airborne contamination and humidity can contribute to corrosion risks in power spaces. (ashrae.org)
4) Water constraints (evaporative / adiabatic “boost”)
Evaporative or adiabatic systems can extend free cooling hours and reduce compressor use, but they introduce a new constraint: local water availability and acceptability.
Uptime Institute stresses that water impact is local and must be assessed against watershed constraints; it also notes economizers can often provide 30% to 80% of the cooling, depending on design and climate. (journal.uptimeinstitute.com)
They also give a concrete example: in cooler climates, a large 25 MW data center using a closed-loop adiabatic evaporative system may run on free cooling, with neither water use nor mechanical refrigeration, for 90% to 95% of the year, consuming only a small amount of water per MW of IT capacity. (journal.uptimeinstitute.com)
5) Reliability, redundancy, and operational complexity
Economizers add moving parts (dampers, louvers), sensors (temperature, RH/dew point), control sequences, and maintenance requirements. The limit is often not “can we cool?” but “can we cool predictably during transients, smoke events, dust storms, salt fog, or fast weather swings?”
Main free cooling architectures (and where each fits)
ASHRAE describes several forms of economization: air-side, water-side, and refrigerant economization, and notes that evaporative/adiabatic cooling can improve the efficiency of air- and water-side economization. (ashrae.org)
Comparison table: strategies, benefits, and practical limits
| Free cooling approach | How it works | Best use cases | Key limits / risks | Best-practice mitigations |
|---|---|---|---|---|
| Direct air-side economizer | Brings filtered outdoor air directly into the data hall; mixes with return air as needed | Cool/dry climates; facilities designed for robust filtration and controls | Outdoor pollutants; humidity outside IT envelope; fast swings; smoke events | High-performance filtration, air quality monitoring, humidity/dew point controls, smoke-mode lockout, tight sensor placement at server inlets |
| Indirect air-side economizer | Uses a heat exchanger (wheel/plate/heat pipes) so outdoor air cools return air without entering the IT space | Sites with moderate pollution risk; operators prioritizing IT air cleanliness | Heat exchanger adds approach temperature (less “free” than direct); maintenance; pressure drops | Commissioning of airflow paths, bypass modes, coil/exchanger cleaning plans, fan energy optimization |
| Water-side economizer | Outdoor air cools water/glycol (dry coolers/cooling towers) used in CRAHs/AHUs | Many enterprise/colocation sites seeking “safe” economizing without introducing outdoor air into the data hall | Water treatment (if towers), freezing protection, scaling, pump energy, approach temps | Variable-speed pumping, free-cooling coil sizing, waterside controls, water treatment & monitoring, freeze strategy |
| Refrigerant economizer (pumped refrigerant) | Uses favorable ambient to reject heat without full compressor work | Retrofits or sites needing packaged solutions; partial free cooling in many climates | Complex refrigeration circuits; serviceability; performance depends on ambient and design | Lifecycle maintenance planning, redundancy strategy, robust controls and monitoring |
| Evaporative/adiabatic assist | Uses water evaporation to reduce air or fluid temperature and extend economizer hours | Hot/dry climates; sites with acceptable water profile | Water availability, water quality, drift/legionella controls (where applicable), WUE impact | Closed-loop adiabatic, peak-shaving water strategy, WUE tracking, water-risk assessment per site |
Best practices to make free cooling work (without compromising uptime)
Start with measurable targets (PUE, WUE, and “hours on economizer”)
Define KPIs that can be verified post-commissioning:
PUE: ISO/IEC 30134-2 defines PUE as the ratio of data centre total energy consumption to IT equipment energy consumption, measured across the same period. (docbox.etsi.org)
WUE (if evaporative systems are involved): Microsoft summarizes WUE as liters per kWh and defines it as annual liters of water used for humidification and cooling divided by annual IT kWh. (datacenters.microsoft.com)
Economizer utilization: percentage of hours in full/partial free cooling, by season.
Example of public reporting: Google reports an average annual PUE of 1.09 in 2024 for its global fleet, and explains measurement boundaries and trailing twelve-month reporting. (datacenters.google)
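Both KPIs reduce to simple ratios over the same measurement period. As a minimal sketch, assuming hypothetical annual metered values (none of these figures come from the sources above):

```python
# Hypothetical annual metered values -- illustrative only.
total_facility_kwh = 5_450_000   # total data center energy (kWh)
it_equipment_kwh = 4_750_000     # IT equipment energy (kWh)
water_liters = 1_900_000         # water used for humidification and cooling (L)

# PUE per ISO/IEC 30134-2: total facility energy / IT energy, same period.
pue = total_facility_kwh / it_equipment_kwh

# WUE: annual liters of water for humidification and cooling / annual IT kWh.
wue = water_liters / it_equipment_kwh

print(f"PUE = {pue:.2f}")        # → PUE = 1.15
print(f"WUE = {wue:.2f} L/kWh")  # → WUE = 0.40 L/kWh
```

The point of keeping both ratios side by side is visibility: an evaporative assist can lower PUE while raising WUE, and reporting only one hides the trade.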
Use the right control logic: temperature and humidity (dew point) matter
Best practice is to control economizers using a psychrometric approach (temperature + humidity/enthalpy) aligned with the IT inlet envelope. ASHRAE notes that humidity outside limits can block air-side economization even when temperature is favorable. (ashrae.org)
Control to server inlet sensors (not only CRAH return or room average).
Implement ramp-rate limits and alarm thresholds to avoid rapid swings.
Define fail-safe modes (e.g., revert to mechanical cooling if sensor disagreement occurs).
Engineer air management first (containment before economizers)
Free cooling performs best when airflow is predictable:
Hot-aisle or cold-aisle containment to reduce bypass and recirculation.
Seal cable openings; manage floor grommets; control pressure differentials.
Match fan control to real load and temperature needs (variable-speed where possible).
Design filtration and air quality strategy as a first-class requirement
For direct air-side economizers, filtration is not optional. ASHRAE warns that direct air-side economization can introduce pollutants and may require additional filtration. (ashrae.org)
Best practices include:
Multi-stage filtration with pressure drop monitoring and replacement planning.
Outdoor air quality assessment (particulates and gaseous contaminants) as part of site selection and design.
“Smoke mode” and “pollution mode” sequences to close dampers and run mechanical cooling during events.
Plan water strategy explicitly (if using adiabatic/evaporative assist)
Water constraints are increasingly decisive. Uptime Institute recommends evaluating water use against the safe withdrawal rate of the watershed and prioritizing dry free cooling in water-stressed areas. (journal.uptimeinstitute.com)
Operational best practices:
Use water as trim cooling (peak shaving) rather than as a baseline dependency where feasible.
Track WUE and contextualize it by location and season (not a single annual number).
Implement water quality monitoring and drift control where relevant.
Commissioning and continuous verification (Cx + monitoring) are non-negotiable
Many economizer projects “work” on day one but drift over time due to sensor bias, fouling, control overrides, or operational changes. Practical steps:
Functional testing of each mode: full free cooling, partial, mechanical, failover.
Trend logs for: outdoor dry bulb, dew point, damper positions, fan speeds, pump speeds, coil approach temperatures.
Define “no manual override without expiry” governance to avoid permanent inefficiency.
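The "no manual override without expiry" rule can be enforced with a very small data structure, regardless of BMS vendor. A minimal sketch, with hypothetical names:

```python
from datetime import datetime, timedelta

class Override:
    """A manual control override that always carries an expiry time."""
    def __init__(self, point, value, duration_hours):
        self.point = point
        self.value = value
        self.expires_at = datetime.now() + timedelta(hours=duration_hours)

    def active(self):
        return datetime.now() < self.expires_at

# Hypothetical point name: force the economizer damper fully open for one shift.
ov = Override("economizer_damper_min_pos", 100, duration_hours=8)
if not ov.active():
    print(f"Override on {ov.point} expired; reverting to automatic control")
```

The design choice is that an override cannot be created without a duration, so a forgotten manual setpoint cannot silently become the new permanent operating mode.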
For guidance and structured approaches to energy-efficient data center design, the U.S. Department of Energy (FEMP) highlights updated best-practices resources (including a design best practices guide revised for 2024). (energy.gov)
How to estimate free cooling potential (a practical, defensible method)
Step 1: Pick your target IT inlet envelope
Start with your equipment specs and a governance decision (recommended vs allowable). ASHRAE’s recommended envelope of 18–27°C and dew point limits is often used as a baseline. (ashrae.org)
Step 2: Use hourly weather data and count “economizer hours”
Use TMY/EPW weather files or on-site measured data and classify each hour:
Full free cooling: outdoor conditions can meet inlet targets without mechanical cooling.
Partial free cooling: economizer provides some cooling, but mechanical assistance is required.
No economizer: mechanical cooling required.
ASHRAE notes that many worldwide locations can economize for as much as 50% of the hours in a year within the recommended 18–27°C range, and higher percentages may require wider allowable ranges. (ashrae.org)
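The hour-classification step above can be sketched as a simple pass over hourly records. The thresholds reuse the envelope cited earlier; the five sample hours are hypothetical stand-ins for the 8,760 rows a TMY/EPW file would supply:

```python
# Hypothetical hourly records: (dry_bulb_C, dew_point_C).
hours = [(4.0, -2.0), (16.0, 10.0), (24.0, 12.0), (30.0, 16.0), (21.0, 18.0)]

FULL_MAX_C, PARTIAL_MAX_C = 18.0, 27.0   # assumed thresholds from the envelope
DP_MIN_C, DP_MAX_C = -9.0, 15.0          # dew point limits

counts = {"full": 0, "partial": 0, "none": 0}
for dry_bulb, dew_point in hours:
    if not (DP_MIN_C <= dew_point <= DP_MAX_C):
        counts["none"] += 1   # humidity blocks economization outright
    elif dry_bulb <= FULL_MAX_C:
        counts["full"] += 1
    elif dry_bulb <= PARTIAL_MAX_C:
        counts["partial"] += 1
    else:
        counts["none"] += 1

print(counts)  # → {'full': 2, 'partial': 1, 'none': 2}
```

Note how the last sample hour (21°C but 18°C dew point) is counted as "no economizer" even though the temperature looks favorable, which is exactly the humidity penalty described above.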
Step 3: Convert hours to energy using a simple model (then refine)
If you do not yet have detailed simulation, a reliable first pass is:
Baseline cooling kWh (from metering or utility/billing model)
Expected compressor/chiller runtime reduction based on economizer hours and part-load curves
Added kWh for fans/pumps + filtration pressure drops
Added water (if evaporative) translated into WUE impact
Then refine with dynamic simulation and measured performance once deployed.
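The first-pass model above amounts to a few lines of arithmetic. All figures here are hypothetical placeholders; the partial-mode savings factor in particular must come from your equipment's part-load curves:

```python
# First-pass savings model -- all figures hypothetical, for illustration.
baseline_cooling_kwh = 2_000_000   # metered annual cooling energy
full_free_fraction = 0.45          # share of hours in full free cooling
partial_free_fraction = 0.25       # share of hours in partial free cooling
partial_savings_factor = 0.40      # assumed compressor reduction in partial mode
added_fan_pump_kwh = 150_000       # extra fan/pump/filtration energy

compressor_kwh_avoided = baseline_cooling_kwh * (
    full_free_fraction + partial_free_fraction * partial_savings_factor
)
net_savings_kwh = compressor_kwh_avoided - added_fan_pump_kwh

print(f"Avoided compressor energy: {compressor_kwh_avoided:,.0f} kWh")
print(f"Net savings: {net_savings_kwh:,.0f} kWh")
```

Keeping the added fan/pump energy as an explicit line item matters: it is the term that quietly erodes "free" cooling when filtration pressure drops or exchanger approach temperatures are underestimated.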
Concrete examples of free cooling in the real world
Case example: air-side economizer in operation most of the year
ENERGY STAR cites NetApp’s Global Dynamic Laboratory using air-side economizers to operate without a chilled water plant for more than 75% of the year (“full free cooling”), and using outside air for partial free cooling more than 98% of the time. (energystar.gov)
Case example: deep cooling-energy reduction via economization + indirect evaporative cooling
The NREL-linked conference paper on the NSIDC retrofit reports cooling energy reductions of over 70% (summer) and over 90% (cooler winter months) after retrofit, enabled by airside economization and indirect evaporative cooling. (research-hub.nrel.gov)
How Score Group supports free cooling projects (Energy × Digital × New Tech)
At Score Group, we act as a global integrator across three pillars—Energy, Digital, and New Tech—to improve operational efficiency, sustainability, and resilience.
Noor ITS: data center design, infrastructure, and operational reliability
Our Noor ITS division supports data center optimization from assessment to implementation, including infrastructure and operational constraints (redundancy, monitoring, maintenance). Discover our scope on NOOR-ITS and our dedicated page DataCenters Score Group: performance, security and storage. We also help ensure that supporting systems (network, sensors, supervision) are robust via IT Infrastructure: networks, servers and storage.
Noor Energy: energy management, building systems, and efficiency governance
Free cooling performance depends on controls, setpoints, and system-level optimization. Our Noor Energy division addresses energy governance and building intelligence through Energy management (monitoring, control, optimization) and Building Management (GTB/GTC and smart systems).
Noor Technology: data, automation, and smarter operations
Economizers are control-heavy. Our Noor Technology division can support the integration of digital layers (IoT, analytics, automation) to help maintain performance over time (drift detection, anomaly detection, predictive maintenance), aligned with operational constraints.
Managed services: keeping performance sustainable over time
Many efficiency gains erode without continuous follow-up. Score Group can support ongoing monitoring and improvement through Managed Services: maintenance and performance.
FAQ: Free Cooling in Data Centers (Limits & Best Practices)
How do I know if my site can use air-side free cooling safely?
Start with a climate and air-quality screening. Use hourly weather data to count how many hours meet your chosen inlet envelope (temperature and dew point), then overlay local constraints: pollution events, wildfire smoke risk, salt fog, and dust. ASHRAE notes humidity outside limits can block air-side economization even when temperature is favorable, and direct air-side can introduce pollutants requiring additional filtration. (ashrae.org) A safe design typically includes robust filtration, clear lockout modes, and tight monitoring at server inlets—not just room averages.
Is water-side economization “easier” than air-side economizers?
Often, yes—because water-side economizers avoid bringing outdoor air directly into the data hall, which reduces contamination concerns and helps maintain tighter humidity control. ASHRAE describes water-side economization as using outdoor air indirectly to chill liquid that then cools data center air via coils, with the advantage of not bringing outside air into the IT space. (ashrae.org) That said, “easier” doesn’t mean “simple”: you still must manage approach temperatures, pumping energy, freeze protection, water treatment (if towers), and commissioning of control sequences.
What temperature setpoints maximize free cooling without risking IT equipment?
There is no universal setpoint, but a common baseline is aligning with ASHRAE’s recommended guidance (for many environments, 18–27°C at the inlet, with dew point limits). (ashrae.org) Raising setpoints can increase economizer hours—ASHRAE notes economizer hours of beneficial use increase as computer room temperatures increase. (ashrae.org) Best practice is to validate against your actual IT fleet specs (including legacy gear), then implement changes gradually with strong monitoring and alarm management.
How should we report results: PUE only, or include water metrics too?
PUE is foundational and standardized: ISO/IEC 30134-2 defines it as the ratio of total data center energy to IT equipment energy over the same period. (docbox.etsi.org) But if you use adiabatic/evaporative cooling, you should also track water. Microsoft summarizes WUE as liters per kWh and defines it as annual liters of water used for humidification and cooling divided by annual IT kWh. (datacenters.microsoft.com) Reporting both avoids shifting the burden from electricity to water without visibility.
Can free cooling be compatible with high-density AI workloads?
It can be compatible, but the margin for error is smaller. High-density racks often require tighter control of airflow, higher heat flux management, and sometimes a move toward liquid cooling or hybrid architectures. Economizers may still reduce mechanical cooling hours, but you must ensure stable inlet conditions during fast load transients and weather swings. A robust approach combines containment, precise sensor placement, advanced controls, and clear failover sequences. When density rises, many operators shift to architectures where free cooling supports heat rejection loops (water-side economization) rather than relying on direct outside air in the IT space.
What’s next?
If you want to reduce cooling overhead while protecting availability, free cooling should be treated as a system engineering program—not a single equipment choice. At Score Group, we can help you assess feasibility, design the right economizer architecture, integrate controls (BMS/DCIM), and keep performance stable over time through monitoring and managed services. Explore our capabilities on Noor ITS – DataCenters and Noor Energy – Energy Management, or start from score-grp.com.



