We have made much progress over the past decade toward harnessing the collective power of IT resources distributed across the globe. In big-science projects in high-energy physics, astronomy, and climate science, thousands of researchers work daily within virtual computing systems of global scope. But we now face a far greater challenge: exploding data volumes and powerful simulation tools mean that many more researchers (ultimately, perhaps most) will soon require capabilities not so different from those used by such big-science teams. How are we to meet these needs? Must every lab be filled with computers and every researcher become an IT specialist? Perhaps the solution is instead to move research IT out of the lab entirely: to develop suites of science services to which researchers can dispatch mundane but time-consuming tasks, thus achieving economies of scale and reducing cognitive load. I explore the past, present, and potential future of large-scale outsourcing and automation for science, and suggest opportunities and challenges for today's researchers. I use examples from Globus and other projects to demonstrate what can be achieved.
Ian Foster
University of Chicago / Argonne National Laboratory

NSCI Committee
1:00 p.m. - 2:00 p.m. (Gaithersburg, Building 101, Red Auditorium)
11:00 a.m. - 12:00 p.m. (Boulder, VTC in 2-0113)

The presentation is available for viewing online.