The Growing Power and Sustainability Challenges of HPC and AI Clusters

Experts discuss the need for sustainable practices and innovative solutions in the face of increasing power consumption and environmental impact.

As the demand for more powerful high-performance computing (HPC) and artificial intelligence (AI) clusters continues to rise, so does the need for larger and more energy-intensive facilities. With the slowing of Moore’s Law, buying more hardware becomes essential for achieving better performance, resulting in increased energy dissipation and cooling requirements. At the recent SC23 supercomputing conference in Denver, experts from various fields gathered to discuss the challenges and potential solutions for building sustainable and carbon-neutral HPC and AI facilities.

Power Efficiency Is Great, But Not At The Expense Of Water

Power usage effectiveness (PUE) is a widely used metric for measuring data center efficiency. However, the pursuit of low PUE has led to some detrimental practices, as Nicolas Dubé of Hewlett Packard Enterprise (HPE) highlighted. Some hyperscalers have built data centers in dry regions and optimized their cooling systems with evaporative cooling, which consumes significant amounts of water. This approach raises concerns about the impact on local communities and resources. Experts argue that shaving a few percentage points off energy consumption should not come at the expense of water scarcity or conservation efforts.
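For readers unfamiliar with the metric, PUE is simply the ratio of total facility energy to the energy delivered to IT equipment, so a value of 1.0 would mean zero overhead for cooling, power conversion, and lighting. A minimal sketch, with illustrative numbers that are not from the article:

```python
# PUE (power usage effectiveness) = total facility power / IT equipment power.
# The 1.2 MW / 1.0 MW figures below are illustrative, not from the article.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return the PUE ratio; 1.0 is the theoretical ideal."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# A facility drawing 1.2 MW in total to support 1.0 MW of IT load:
print(pue(1200.0, 1000.0))  # 1.2
```

The catch the article describes is that this ratio says nothing about water: evaporative cooling can lower PUE while consuming resources the metric never sees.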

Location And Planning Matter

Location plays a crucial role in mitigating the environmental impact of HPC and AI facilities. Dubé suggests deploying data centers in regions with abundant green energy sources to reduce the carbon footprint. For example, a data center facility in Quebec is being developed with nearly 100% renewable power from hydro and wind sources. Additionally, there is an opportunity to utilize the waste heat generated by these facilities. The QScale facility in Quebec plans to collocate with agricultural greenhouses, utilizing the captured waste heat to warm the greenhouses during winter months. This approach not only reduces waste but also creates additional value by supporting local agriculture.

What If We Turn Off Systems Sometimes?

Andrew Chien from the University of Chicago’s CERES Center for Unstoppable Computing suggests a more dynamic approach to operating HPC clusters and data centers. Instead of running at constant capacity, operators can vary the utilization of these systems based on the availability and mix of power from the grid. For instance, during periods of high wind or solar output, a facility can operate at higher capacity, reducing both power costs and carbon emissions. This dynamic approach, combined with grid improvements, could lead to significant reductions in energy consumption and environmental impact.
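The idea of varying utilization with the grid mix can be sketched as a simple control policy. The mapping below, the threshold floor, and the function name are all illustrative assumptions, not anything Chien or the article specifies:

```python
# Hypothetical sketch: scale a cluster's utilization target with the grid's
# current renewable fraction. The 30% floor (to keep long-running jobs alive)
# and the linear mapping are illustrative assumptions, not from the article.

def target_utilization(renewable_fraction: float,
                       floor: float = 0.3,
                       ceiling: float = 1.0) -> float:
    """Map the grid's renewable fraction (0..1) to a utilization target."""
    renewable_fraction = max(0.0, min(1.0, renewable_fraction))
    return floor + (ceiling - floor) * renewable_fraction

print(target_utilization(0.9))  # high wind/solar output -> run near capacity
print(target_utilization(0.1))  # fossil-heavy mix -> throttle back
```

A real deployment would react to carbon-intensity signals from the grid operator and checkpoint or migrate jobs before throttling, but the core idea is the same: treat compute capacity as a dial rather than a constant.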

Better, More Consistent Reporting Is Needed

To address the sustainability challenges of HPC and AI clusters, better and more consistent reporting is essential. Data center operators need to track and report sustainability metrics accurately. Schneider Electric has proposed 28 metrics, including total power consumption, PUE, renewable energy consumption, water consumption, energy reuse, and service utilization. However, the current reporting practices vary widely among operators, making it difficult to compare and assess the environmental impact. Schneider Electric suggests starting with a subset of these metrics and gradually expanding reporting efforts to improve transparency and accountability.
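The suggestion to start with a subset of metrics amounts to agreeing on a small, fixed reporting schema. A minimal sketch of such a record, covering a few of the metric categories named above; the field names are illustrative and are not Schneider Electric's actual definitions:

```python
# Hypothetical sketch of a consistent sustainability report covering a subset
# of the metric categories mentioned above (total energy, PUE, renewable
# share, water use). Field names and values are illustrative assumptions.

from dataclasses import dataclass, asdict

@dataclass
class SustainabilityReport:
    site: str
    total_energy_mwh: float    # total facility energy over the period
    pue: float                 # power usage effectiveness
    renewable_fraction: float  # share of energy from renewables (0..1)
    water_use_m3: float        # cooling-related water consumption

report = SustainabilityReport("example-dc", 8760.0, 1.25, 0.6, 12000.0)
print(asdict(report))  # serializable, so reports are directly comparable
```

The point of a shared schema is comparability: if every operator reports the same fields with the same units, the cross-site comparisons the article says are currently difficult become straightforward.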

Conclusion

As the demand for more powerful HPC and AI clusters continues to grow, so does the need for sustainable practices in their design, deployment, and operation. Experts emphasize the importance of considering water conservation, location selection, waste heat utilization, dynamic operation strategies, and consistent reporting. By adopting these measures, the HPC and AI community can work towards reducing the environmental impact of these facilities while continuing to meet the increasing computational demands of the future.

