Andreas Wicenec – Professor of Data Intensive Research, ICRAR; University of Western Australia


Presentation at III Multicore World

Comparative Scalability I/O Studies in HPC Clusters

Prof. Andreas Wicenec, UWA


Typical I/O-to-FLOP ratios (Amdahl numbers) of current HPC clusters are orders of magnitude below one [1]. Apart from this theoretical limitation, the actually achievable I/O rates, and thus the scalability of applications requiring access to very large data volumes, are very often affected by non-optimised configurations of hardware and/or the various software layers. In this paper we present the results of a series of experiments showing the influence of configuration changes and of different I/O libraries, and comparing cheap local, node-based storage with a high-end Lustre global file system. The results suggest that for certain extremely data-intensive and data-parallel problems, scalability can be achieved by adopting an extreme shared-nothing paradigm. On the other hand, it is also clear that proper configuration and the choice and optimisation of the underlying I/O software stack, including the OS I/O subsystem, are equally important.

[1] Szalay, A. S. (2011). Amdahl's Laws and Extreme Data-Intensive Scientific Computing. ADASS, 442.
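To illustrate the Amdahl number mentioned above: it is the ratio of a system's sequential I/O bandwidth (in bits per second) to its instruction or FLOP rate, with a "balanced" system scoring roughly one. The sketch below uses hypothetical, made-up hardware figures purely for illustration, not measurements from the experiments described in the abstract:

```python
def amdahl_number(io_bytes_per_s: float, flops: float) -> float:
    """Ratio of sequential I/O bandwidth (converted to bits/s) to FLOP rate.

    A balanced system in Amdahl's sense has a ratio of about 1;
    typical HPC clusters fall orders of magnitude below that.
    """
    return (io_bytes_per_s * 8) / flops

# Hypothetical node: 1 GB/s of local disk bandwidth, 1 TFLOP/s of compute.
ratio = amdahl_number(io_bytes_per_s=1e9, flops=1e12)
print(f"{ratio:.3f}")  # 0.008 -- orders of magnitude below one
```

The function makes the scaling argument concrete: raising the compute rate without a proportional increase in I/O bandwidth only drives the ratio further below one, which is why node-local storage in a shared-nothing layout can outperform a shared file system for data-parallel workloads.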


Presentation at the Computing for SKA workshop

A Persistence Layer for the SKA Science Data Processor

Prof. Andreas Wicenec, UWA


The SKA will require flexible persistent short-, mid- and long-term storage and data-access infrastructures distributed across several locations on wide-area networks spanning hundreds of kilometres. Long-term archival storage and access will potentially be spread around the globe across several regional data and processing centres. Models similar to the Worldwide LHC Computing Grid (WLCG [1]) could be adopted and adjusted to the SKA requirements, provided we are able to attract similar scientific and political interest in certain regions of the world. Although the SKA requirements appear significantly different at first glance, a more careful look reveals quite similar ‘valuable’ information content in SKA and LHC data. Filtering and preserving those valuable data items requires novel, automated data mining and a sophisticated data life-cycle management infrastructure, both of which are largely absent from current astronomical data archives. In this paper we present the basic design of the SKA data layer and discuss why we think this design and the development approach will be able to address the challenges of the SKA.

[1] The LHC’s worldwide computer, The CERN Courier, 2013.


Andreas Wicenec

Professor of Data Intensive Research
International Centre for Radio Astronomy Research (ICRAR)
The University of Western Australia
M468, 35 Stirling Highway


Andreas Wicenec has been Winthrop Research Professor at the University of Western Australia since 2010, leading the Information and Communication Technology (ICT) Program of the International Centre for Radio Astronomy Research (ICRAR) to research, design and implement data flows and high-performance scientific computing for the Murchison Widefield Array (MWA), the Australian SKA Pathfinder (ASKAP) and the SKA. During his graduate, post-graduate and post-doctoral appointments he was involved in the software development for, and the reduction of, photometric and astrometric Tycho data from the Hipparcos satellite. He joined ESO in 1997 as an archive specialist and worked on the final implementation of the archive for ESO's Very Large Telescope (VLT) and the ESO Imaging Survey. Between 2002 and 2010 he was ESO's Archive Scientist and led the ALMA archive subsystem development group. Prof. Wicenec is also involved in the International Virtual Observatory Alliance (IVOA). His scientific interests and publications include high-precision global astrometry, optical background radiation, stellar photometry, the dynamics and evolution of planetary nebulae, and observational survey astronomy and its related scheduling and computational concepts.



