Accelerating discovery via cloud services
We have made much progress over the past decade toward harnessing the collective power of IT resources distributed across the globe. In high-energy physics, astronomy, and climate science, thousands of researchers work daily within virtual computing systems of global scope. But we now face a far greater challenge: exploding data volumes and powerful simulation tools mean that many more researchers (ultimately most?) will soon require capabilities not so different from those used by such big-science teams. How are we to meet these needs? Must every lab be filled with computers and every researcher become an IT specialist? Perhaps the solution is instead to move research IT out of the lab entirely: to leverage the "cloud" (whether private or public) to achieve economies of scale and reduce cognitive load. I explore the past, current, and potential future of large-scale outsourcing and automation for science, and suggest opportunities and challenges for today's researchers. I use examples from Globus, Swift, and other projects to demonstrate what can be achieved.
Prof. Ian Foster
Director, Computation Institute – University of Chicago & Argonne National Laboratory, USA
Professor, Department of Computer Science – University of Chicago.
Distinguished Fellow, Argonne National Laboratory.
Prof. Foster, who keynoted Multicore World 2013, will be speaking in 2015 about the Swift language, a parallel scripting language that offers "fast easy parallel scripting – on multicores, clusters, clouds and supercomputers." His presentation will also mention the "Data Manager" for the Science Data Processor (SDP) of the SKA, and the Argo operating system.
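Swift's central idea is implicitly parallel scripting: independent task invocations return futures, so any tasks whose inputs are ready run concurrently without the script author writing explicit threading code. As a loose analogy only (this is Python with `concurrent.futures`, not Swift itself, and `simulate` is a hypothetical stand-in for a scientific task), the same pattern might be sketched as:

```python
# Loose analogy to Swift's model: submit independent tasks,
# collect futures, and let ready tasks execute in parallel.
from concurrent.futures import ThreadPoolExecutor

def simulate(i):
    # Hypothetical stand-in for a real scientific computation.
    return i * i

with ThreadPoolExecutor(max_workers=4) as pool:
    # Each submit() returns a future immediately; tasks run concurrently.
    futures = [pool.submit(simulate, i) for i in range(8)]
    # result() blocks only until that particular task completes.
    results = [f.result() for f in futures]

print(results)  # → [0, 1, 4, 9, 16, 25, 36, 49]
```

In Swift itself the parallelism is implicit in the language's dataflow semantics rather than requested through an executor, which is what lets the same script scale from multicore laptops to clusters, clouds, and supercomputers.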