GALAH data reduction is carried out with custom pipelines developed by the survey team.
The data reduction pipeline combines existing IRAF routines with new Python code and is largely automated. For each night of data, the following steps are carried out:
- Subtract bias
- Remove cosmic rays and vertical streaks
- Interpolate over bad columns
- Create a nightly average flat-field image
- Identify spectrum traces from the average flat field
- Correct 2D spectra for optical aberrations
- Subtract scattered light
- Subtract fibre cross-talk
- Extract spectra
- Solve for wavelength calibration using ThXe arc lamp exposures taken with the science exposures
- Subtract sky spectrum
- Correct telluric absorption
- Correct for barycentric velocity
- Propagate error spectra
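Two of the steps above, bias subtraction and spectrum extraction, can be illustrated with a minimal sketch. The frame layout, bias level, overscan width, trace position, and function names below are invented for the demonstration and are not the pipeline's actual implementation; it shows only the idea of estimating the bias from an overscan region and summing pixels along a fibre trace.

```python
import numpy as np

def subtract_bias(frame, overscan_cols=20):
    """Subtract a bias level estimated from the overscan columns (hypothetical)."""
    bias = np.median(frame[:, -overscan_cols:])
    return frame - bias

def extract_aperture(frame, trace_row, half_width=2):
    """Sum pixels in a fixed window around the fibre trace, column by column."""
    lo, hi = trace_row - half_width, trace_row + half_width + 1
    return frame[lo:hi, :].sum(axis=0)

# Synthetic raw frame: constant bias plus one flat fibre trace; the last
# 20 columns carry no flux and act as the overscan region.
ny, nx, bias_level, trace_row = 50, 120, 200.0, 25
true_flux = 1000.0
frame = np.full((ny, nx), bias_level)
frame[trace_row, :100] += true_flux

debiased = subtract_bias(frame)
spectrum = extract_aperture(debiased, trace_row)
print(spectrum[:100].mean())   # recovers the injected flux of 1000
```

The real pipeline fits the trace from the average flat field rather than assuming a fixed row, and uses optimal rather than simple summed extraction, but the data flow per fibre is the same.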
Once 1D spectra have been produced, they are passed to the next stage of the data reduction pipeline, where they are cross-correlated against a grid of 15 AMBRE synthetic spectra with log(g) = 4.5, [Fe/H] = 0, and Teff ranging from 4000 K to 7500 K in steps of 250 K. This stage returns initial estimates of those parameters for each star, along with radial velocities measured separately in the blue, green, and red arms and their mean. Spectra are then normalized using carefully chosen continuum points.
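The radial-velocity measurement by cross-correlation can be sketched as follows. The single Gaussian absorption line, wavelength range, and pixel shift are synthetic stand-ins for the AMBRE templates and real GALAH spectra; the key idea is that on a grid uniform in log-wavelength, a Doppler shift becomes a constant pixel offset, so the peak of the cross-correlation function gives the velocity directly.

```python
import numpy as np

C_KMS = 299_792.458  # speed of light in km/s

# Uniform grid in log-wavelength over a made-up 6500-6600 A window.
n_pix = 2000
loglam = np.linspace(np.log(6500.0), np.log(6600.0), n_pix)
dloglam = loglam[1] - loglam[0]
dv = C_KMS * dloglam  # velocity width of one pixel, ~2.3 km/s here

def gaussian_absorption(loglam, center, sigma=5e-5, depth=0.6):
    """Toy spectrum: flat continuum with one Gaussian absorption line."""
    return 1.0 - depth * np.exp(-0.5 * ((loglam - center) / sigma) ** 2)

line_center = np.log(6550.0)
shift_pix = 10  # true Doppler shift, a whole number of pixels for clarity
template = gaussian_absorption(loglam, line_center)
observed = gaussian_absorption(loglam, line_center + shift_pix * dloglam)

# Cross-correlate the continuum-subtracted signals and read off the peak lag.
ccf = np.correlate(1.0 - observed, 1.0 - template, mode="full")
lag = np.argmax(ccf) - (n_pix - 1)
rv = lag * dv
print(lag, rv)  # lag of 10 pixels, i.e. ~22.9 km/s
```

In practice the CCF peak is fit with a smooth function to reach sub-pixel (sub-km/s) precision, and the template yielding the strongest peak supplies the stellar-parameter estimate.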
Full details of the data reduction are given in Kos et al. (2017).