Monash University has implemented a multi-petabyte deployment at its eResearch Centre, giving the Melbourne-based advanced computing facility the capacity to store and manage massive workloads of data.
The university implemented a software-defined solution using Red Hat Ceph Storage on Dell EMC PowerEdge R630 and R730xd rack servers, which it expects will accelerate application performance, simplify systems management, and address its growing data storage requirements.
The eResearch Centre is now able to store and manage its massive workload of data — which already encompasses five petabytes — within a single infrastructure, Monash said.
“As a research institution, we are faced with the challenge of virtually limitless data, not only from new projects, but from archived and long-tail research. One of our key concerns in this process was having enough storage space in an OpenStack cloud environment, as it supports the majority of use cases from our researchers,” Monash eResearch Centre deputy director Steve Quenette added.
The university’s eResearch Centre focuses on “21st century research discovery” and partners with research groups to accelerate and transform research in both fundamental and applied sciences, with an emphasis on imaging and data science. The university said that its research centre also connects researchers and partners to the most appropriate hardware, software, and services to sustain their respective research capabilities.
In conducting its research and development, the eResearch Centre works with AuScope, Monash University's supercomputing facility MASSIVE, and the Monash Bioinformatics and Immersive Visualisation platforms.
Monash University's Multi-modal Australian ScienceS Imaging and Visualisation Environment (MASSIVE) received a high-performance computing upgrade last February with its M3 system, built on Dell's supercomputing platform and powered by graphics processing unit (GPU) giant Nvidia.
M3 is used in conjunction with the CSIRO and the Australian Synchrotron specifically to process complex data. According to Monash, over the past five years, MASSIVE has played a key role in driving discoveries across many disciplines including biomedical sciences, materials research, engineering, and geosciences.
Similarly, the National Computational Infrastructure (NCI), Australia’s national research computing service, purchased four IBM Power System servers for high performance computing in December, in a bid to advance its research efforts through artificial intelligence, deep learning, high performance data analytics, and other compute-heavy workloads.
The announcement came a month after the NCI announced that Xenon Systems would supply a Lenovo NeXtScale system as an extension of Raijin — the NCI supercomputer that was instrumental in computing whole human genome sequences.
The Commonwealth Scientific and Industrial Research Organisation (CSIRO) expects to have a new supercomputer up and running in mid-2017, replacing its existing Bragg accelerator cluster, a system the organisation currently uses to solve big data challenges in fields such as bioscience, image analysis, fluid dynamics modelling, and environmental science.
The CSIRO also went to tender in September to find a new Advanced Technology Cluster (ATC) to replace the decommissioned Fornax system at the Pawsey Supercomputing Centre in Perth, a national supercomputing joint venture between the CSIRO, Curtin University, Edith Cowan University, Murdoch University, and the University of Western Australia.