
Splinter-Meeting: "eScience & Virtual Observatory"

The Splinter-Meeting "eScience & Virtual Observatory" at the Herbsttagung of the Astronomische Gesellschaft in Hamburg will take place on Wednesday, 26 September 2012, from 2:15 pm to 6:15 pm.

Program

Conveners: Harry Enke, Joachim Wambsganß

Time Speaker Title
14:15 - 14:20 Harry Enke Introduction
14:20 - 14:45 Shantanu Desai Data Management for next generation optical photometric surveys
14:45 - 15:10 Sebastian Els The future Gaia archive
15:10 - 15:35 Felix Stoehr Taming the ALMA data avalanche
15:35 - 16:00 Coffee Break
16:00 - 16:25 Joerg Knoche Planck Data Management - Ideas and Implementation
16:25 - 16:50 Harry Enke MUSE data management system MuseWISE
16:50 - 17:15 Matthias Hoffmann IGE/EGCF - Connecting Computing Centres and Users on a European Scale
17:15 - 18:00 Discussion and Summary of the splinter session

Abstracts

Data Management for next generation optical photometric surveys
Shantanu Desai
The Dark Energy Survey Data Management System (DESDM) was set up around 2004 to process and calibrate the high-volume data from the Dark Energy Survey (which will start taking data at the end of the year) on high-performance supercomputers. DESDM has been extensively tested using simulated data from DES in a series of yearly data challenges. It has also been used to process real data from the Blanco Cosmology Survey, as well as imaging data, taken with a variety of NOAO and ESO telescopes, for the optical follow-up of galaxy clusters detected by the South Pole Telescope. We shall provide technical as well as scientific details of the various pipelines in DESDM and discuss how the workflows are designed. We shall also highlight a few science results obtained with DESDM. In the coming decade at USM, we shall extend this existing data management system to the analysis of optical data from ongoing and upcoming photometric surveys such as Pan-STARRS, CFHT, and HSC.

The future Gaia archive
Sebastian Els
With the launch of ESA's astrometry mission Gaia approaching in the near future, early steps have been taken to prepare for the generation of the Gaia archive. During the five years of its operational life, Gaia will observe more than one billion objects down to 20 mag. The Gaia observations will result in astrometric, photometric and spectroscopic data and will impact all areas of astrophysical research. To allow for the best scientific use of these data, access will be ensured by means of the Gaia archive. This presentation outlines the Gaia mission, the expected data products, and the early steps which are being taken to prepare for the future Gaia archive.

MUSE data management system MuseWISE
Harry Enke
The MUSE instrument will start operations in 2013. A data management system was developed for the MUSE collaboration: the data will be hosted in three data centers using a system which combines the MUSE pipeline with an extension of AstroWISE. The MuseWISE system provides a powerful and extensible tool to make the 3D data 'science ready', and it is flexible enough to accommodate special pipeline processing for scientific questions not covered by the standard processing pipeline. The distributed data hosting allows different tools to be added while avoiding a network-imposed bottleneck when accessing the data.

IGE/EGCF - Connecting Computing Centres and Users on a European Scale
Matthias Hoffmann
The Initiative for Globus in Europe (IGE) provides European researchers with tools to share computing power, databases, instruments, and other online resources. This talk will provide a basic overview of the software that is developed within or distributed by IGE, as well as of IGE services such as the requirements tracker and the training hub. Since moving huge data sets is of special importance in astrophysics, dedicated Globus tools such as the new Globus Online cloud service will also be presented.
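
As an illustration of the kind of wide-area data movement these tools address, the following is a minimal sketch assuming the classic Globus Toolkit GridFTP client globus-url-copy is installed and a valid grid proxy exists. The host names and paths are placeholders, not actual IGE or Globus Online endpoints.

import subprocess

def gridftp_copy(src_url: str, dst_url: str, streams: int = 4) -> None:
    """Copy one file via GridFTP using several parallel TCP streams."""
    subprocess.run(
        ["globus-url-copy",
         "-vb",               # report transfer performance while copying
         "-p", str(streams),  # parallel data streams help on wide-area links
         src_url, dst_url],
        check=True,
    )

# Example: pull a raw exposure from a (placeholder) GridFTP server to local disk.
gridftp_copy("gsiftp://gridftp.example.org/data/exposure_0001.fits",
             "file:///scratch/exposure_0001.fits")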

Planck Data Management - Ideas and Implementation
Joerg Knoche
The handling and processing of large amounts of heterogeneous data for a space mission like Planck is difficult to plan, particularly when the algorithmic strategies can be expected to change at any time and are contributed by a large number of individual scientists. In anticipation of such challenges, a versatile data processing system for the Planck mission was envisaged, consisting of a database back end (the data management component) and a scientific workflow engine (the Process Coordinator). In this talk, I will give an overview of this data processing infrastructure, developed at the Max Planck Institute for Astrophysics in Garching as a German contribution to the Planck mission.
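
To make the division of labour concrete, here is a minimal, purely illustrative sketch of the general pattern described above: a workflow engine dispatching exchangeable processing modules against a database back end. The names (DataManager, ProcessCoordinator, register, run) are hypothetical and do not describe the actual Planck software.

# Illustrative sketch only: a workflow engine coordinating exchangeable
# pipeline modules, with results recorded in a database back end.
import sqlite3
from typing import Callable, Dict, List

class DataManager:
    """Toy database back end: stores named data objects in SQLite."""
    def __init__(self, path: str = ":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS objects (name TEXT PRIMARY KEY, value TEXT)")

    def put(self, name: str, value: str) -> None:
        self.db.execute("INSERT OR REPLACE INTO objects VALUES (?, ?)", (name, value))
        self.db.commit()

    def get(self, name: str):
        row = self.db.execute(
            "SELECT value FROM objects WHERE name = ?", (name,)).fetchone()
        return row[0] if row else None

class ProcessCoordinator:
    """Toy workflow engine: runs registered modules in order, so scientists
    can swap an algorithm by re-registering a module under the same name."""
    def __init__(self, dm: DataManager):
        self.dm = dm
        self.modules: Dict[str, Callable[[DataManager], None]] = {}
        self.order: List[str] = []

    def register(self, name: str, func: Callable[[DataManager], None]) -> None:
        self.modules[name] = func
        if name not in self.order:
            self.order.append(name)

    def run(self) -> None:
        for name in self.order:
            self.modules[name](self.dm)

# Example usage: two exchangeable processing steps.
dm = DataManager()
pc = ProcessCoordinator(dm)
pc.register("simulate", lambda d: d.put("tod", "time-ordered data"))
pc.register("map", lambda d: d.put("map", f"map from {d.get('tod')}"))
pc.run()
print(dm.get("map"))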

Taming the ALMA data avalanche
Felix Stoehr
The Atacama Large Millimeter/submillimeter Array (ALMA) is nearing its phase of full operations. ALMA will collect about 200 TB/year of astronomical data, which will be reduced by an automatic pipeline and turned into fully calibrated, science-ready data products. We present design choices, challenges, and solutions, from data storage through data reduction and data distribution to archival research, that make it possible to deal with these large amounts of data and, hopefully, to achieve the maximum science return.
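
For a rough sense of scale, the quoted 200 TB/year corresponds to an average sustained data rate of only a few MB/s. A quick back-of-the-envelope check, using nothing beyond the figure in the abstract:

# Average sustained rate implied by 200 TB/year (decimal terabytes assumed).
TB = 1e12                                # bytes
yearly_volume = 200 * TB                 # bytes per year
seconds_per_year = 365.25 * 24 * 3600
rate_mb_s = yearly_volume / seconds_per_year / 1e6
print(f"average sustained rate: {rate_mb_s:.1f} MB/s")  # about 6.3 MB/s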