Members of the Computing group are involved in a number of different projects, of which the LOFAR project is the largest. Progress of the LOFAR project is reported on the Project pages, but here we mention some of the highlights. The SAS/MAC group finished Navigator 2.0, the interface for scheduling and monitoring LOFAR observations. MSc student Peter Boonstoppel, together with the OLAP group and people from Argonne National Laboratory in Chicago, developed ZOID, a fast I/O protocol for the Blue Gene/L. The group then ported the software from the Blue Gene/L to the Blue Gene/P, which resulted in first fringes in August. The Off-line group delivered an initial version of the pre-processing pipeline; in the second half of the year this version was automated and ported to the temporary LOFAR cluster in Groningen. BBS was extended with a Global Solver, which allows for solutions over distributed subbands, and a first version of a distributed imager became operational. In the second half of the year we also started working on the second data processing pipeline: the Pulsar pipeline. This pipeline has a large number of components running on the Blue Gene supercomputer. Finally, initial imaging observations with LOFAR20 (the MSSS: Million Source Shallow Survey) were specified and planned.
In the SKADS project, contributions were made to the Design and Costing document. In DS2-T2, frameworks for doing instrument simulations in the MeqTree package were developed. Furthermore, parallelization approaches were investigated, and in the LIONS (LOFAR IONospheric Simulations) group the effect of the ionosphere on low-frequency radio observations was studied. The latter resulted in a document describing the Minimum Ionospheric Model that can be used for ionospheric calibration.
For the APERTIF project software was made available for reducing DIGESTIF data.
In the AstroStream project we are investigating, together with TU Delft and the Free University of Amsterdam, the possibilities of streaming astronomical processing. Correlator software was rewritten to run on an IBM Cell processor and on an ATI GPU card. We now have correlator implementations on NVIDIA and ATI GPUs, the Cell/B.E. processor, Intel/SSE, and the BG/P. The gridding part of the imaging software was ported to the Cell and GPU and compared to CPU performance. The imaging code was parallelized on Intel multi-core machines using multi-threading and SSE3 vector parallelism. Part of the calibration software was also considered for running on the Cell. Finally, tied-array beamforming and an asynchronous transpose on the Blue Gene/P were looked into.
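At its core, the correlation step implemented on all of these platforms is the same conjugate multiply-accumulate per antenna pair. The sketch below (illustrative names; no platform-specific optimizations, unlike the actual SSE/GPU/Cell ports) shows that arithmetic for a single polarization and frequency channel:

```python
# Minimal correlator sketch: for each antenna pair (baseline), accumulate
# the product of one antenna's voltage samples with the complex conjugate
# of the other's. The production ports vectorize and parallelize this loop;
# the arithmetic itself is unchanged.
import cmath

def correlate(samples):
    """samples: list of per-antenna lists of complex voltage samples
    (all the same length). Returns {(i, j): visibility} for i <= j."""
    n_ant = len(samples)
    visibilities = {}
    for i in range(n_ant):
        for j in range(i, n_ant):
            acc = 0 + 0j
            for a, b in zip(samples[i], samples[j]):
                acc += a * b.conjugate()
            visibilities[(i, j)] = acc
    return visibilities

# Illustrative use: two antennas seeing the same tone, one phase-shifted.
sig = [cmath.exp(1j * 0.1 * t) for t in range(1000)]
shifted = [s * cmath.exp(-1j * 0.5) for s in sig]
vis = correlate([sig, shifted])
# The phase of the cross-correlation vis[(0, 1)] recovers the 0.5 rad shift.
```

The pair loop is what the GPU and Cell ports distribute over threads and SPEs; the inner multiply-accumulate is what SSE vectorizes.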
A small project for the DataSim company was concluded together with Dysi. In this project, parallelisation and the application of calibration techniques to financial models were investigated. A Levenberg-Marquardt solver code was delivered.
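Levenberg-Marquardt iterates damped Gauss-Newton steps, increasing the damping factor when a step fails to reduce the cost and decreasing it when it succeeds. A minimal one-parameter sketch of the technique (the model, data, and names here are illustrative, not the delivered code):

```python
# Levenberg-Marquardt for a single scalar parameter k: solve the damped
# normal equations (J'J + lam*J'J) step = -J'r and adapt lam by whether
# the step reduced the sum of squared residuals.
import math

def levenberg_marquardt(f, df, k, ts, ys, lam=1e-3, iters=50):
    """Fit k so that f(k, t) matches ys at the points ts.
    f: model, df: derivative of f with respect to k. Returns fitted k."""
    def cost(k):
        return sum((f(k, t) - y) ** 2 for t, y in zip(ts, ys))
    for _ in range(iters):
        jtj = sum(df(k, t) ** 2 for t in ts)                      # J'J
        jtr = sum(df(k, t) * (f(k, t) - y) for t, y in zip(ts, ys))  # J'r
        step = -jtr / (jtj + lam * jtj)
        if cost(k + step) < cost(k):
            k += step
            lam *= 0.5   # good step: behave more like Gauss-Newton
        else:
            lam *= 2.0   # bad step: increase damping, retry smaller
    return k

# Illustrative use: recover the rate of y = exp(0.7 t) from samples.
ts = [0.1 * i for i in range(20)]
ys = [math.exp(0.7 * t) for t in ts]
k_fit = levenberg_marquardt(lambda k, t: math.exp(k * t),
                            lambda k, t: t * math.exp(k * t),
                            k=0.0, ts=ts, ys=ys)
```

In the multi-parameter case jtj becomes the matrix J'J and the step requires a linear solve, which is where the parallelisation effort concentrates.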
For the PrepSKA and RadioNet-FP7 ALBiUS projects we are in the process of defining the work packages. PrepSKA WP2 had its kick-off meeting in November at the SPDO in Manchester. The ALBiUS kick-off meeting will be held early in 2009.
The following people are working in the Computing Group:
Ronald Nijboer - Competence Group Leader
Chris Broekema - Researcher
Arthur Coolen - Design Engineer
Ger van Diepen - System Engineer
Ruud Overeem - Instrument Engineer
Marcel Loose - System Engineer
Vishambhar Nath Pandey - System Researcher
Dr. John Romein - System Researcher
Dr. Sarod Yatawatta - System Researcher
Alexander van Amesfoort - HPC Software Engineer
Dr. Tammo Jan Dijkema - Scientific Software Engineer
Yan Grange - Post-doc DOME - P2
Bas van der Tol - Scientific Software Engineer
Dr. Agnes Mika - DOME SKA Liaison Engineer
Bram Veenboer - PhD Researcher
Matthias Petschow - Postdoctoral Researcher / Software Developer