DOE's Argonne Leadership Computing Facility and HPE expand high-performance computing (HPC) storage capacity for exascale

Press release

[Image caption: Aurora — The ALCF’s next-generation supercomputer, Aurora, will be one of the nation’s first exascale systems when it is delivered in 2021. Designed in collaboration with Intel and HPE, Aurora will enable researchers to accelerate discoveries and innovation across scientific disciplines.]

The ALCF is expanding its capabilities for complex scientific research using modeling, simulation, and AI ahead of the arrival of its Aurora exascale supercomputer.

SAN JOSE, Calif. – January 30, 2020 – Hewlett Packard Enterprise (HPE) and the Argonne Leadership Computing Facility (ALCF), a U.S. Department of Energy (DOE) Office of Science User Facility located at Argonne National Laboratory, today announced that the ALCF will deploy the new Cray ClusterStor E1000, the most efficient parallel storage solution, as its newest storage system. The collaboration supports the ALCF’s scientific research in areas such as earthquake seismic activity, aerospace turbulence and shock waves, physical genomics, and more. The deployment expands storage capacity for research requiring converged modeling, simulation, artificial intelligence (AI), and analytics workloads, in preparation for Aurora, the ALCF’s forthcoming exascale supercomputer powered by HPE and Intel, which is expected to be one of the first exascale systems delivered in the U.S. when it arrives in 2021.


The Cray ClusterStor E1000 system utilizes purpose-built software and hardware features to meet high-performance storage requirements of any size with significantly fewer drives. Designed to support the Exascale Era, which is characterized by the explosion of data and converged workloads, the Cray ClusterStor E1000 will power ALCF’s future Aurora supercomputer to target a multitude of data-intensive workloads required to make breakthrough discoveries at unprecedented speed.                                               

“ALCF has their eye set on the Exascale Era by putting in place the infrastructure required for converged workloads in HPC, AI, analytics, modeling and simulation,” said Peter Ungaro, senior vice president and general manager, HPC and AI, at HPE. “Cray ClusterStor E1000 delivers the scalability and performance ALCF requires to unlock insights and discovery from these data intensive workloads.”

The ALCF’s two new storage systems, named “Grand” and “Eagle,” are built on the Cray ClusterStor E1000, giving the facility a new, cost-effective high-performance computing (HPC) storage solution that can efficiently manage the growing converged workloads today’s offerings cannot support.

“When Grand launches, it will benefit ALCF’s legacy petascale machines, providing increased capacity for the Theta compute system and enabling new levels of performance, not just for traditional checkpoint-restart workloads, but also for complex workflows and metadata-intensive work,” said Mark Fahey, director of operations, ALCF.

“Eagle will help support the ever-increasing importance of data in the day-to-day activities of science,” said Michael E. Papka, director, ALCF. “By leveraging our experience with our current data-sharing system, Petrel, this new storage will help eliminate barriers to productivity and improve collaborations throughout the research community.”

Together, the two new systems will provide a total of 200 petabytes (PB) of storage capacity, and through the Cray ClusterStor E1000’s intelligent software and hardware designs, will more accurately align data flows with target workloads. The ALCF’s Grand and Eagle systems will help researchers accelerate a range of scientific discoveries across disciplines, with each system addressing the following:

  • Computational capacity - Grand provides 150 PB of center-wide storage and new levels of input/output (I/O) performance to support massive computational needs for its users. 
  • Simplified data-sharing - Eagle provides a 50 PB community file system that makes data-sharing easier than ever among ALCF users, their collaborators, and third parties.

The ALCF plans to deliver its Grand and Eagle storage systems in early 2020. The systems will initially connect to existing ALCF supercomputers powered by HPE HPC systems: Theta, based on the Cray® XC40-AC™ and Cooley, based on the Cray CS-300. Grand, which is capable of 1 terabyte per second (TB/s) bandwidth, will be optimized to support converged simulation science and data-intensive workloads once the Aurora exascale supercomputer is operational.

About Hewlett Packard Enterprise
Hewlett Packard Enterprise is the global edge-to-cloud platform-as-a-service company that helps organizations accelerate outcomes by unlocking value from all of their data, everywhere. Built on decades of reimagining the future and innovating to advance the way we live and work, HPE delivers unique, open and intelligent technology solutions, with a consistent experience across all clouds and edges, to help customers develop new business models, engage in new ways, and increase operational performance. For more information, visit: www.hpe.com.

About Argonne National Laboratory
Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation's first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America's scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy's Office of Science.

About the U.S. Department of Energy's Office of Science
The U.S. Department of Energy's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.


Media Contacts:
Brian Grabowski
Argonne National Laboratory
media@anl.gov

Nahren Khizeran 
Hewlett Packard Enterprise
Mobile: 209-456-0812
