
Local Volume Mapper (LVM) Data Reduction Pipeline (DRP)

The LVM DRP is based on a collection of routines from Py3D.

Prerequisites

This code is developed and tested on an Ubuntu-based OS, using Python 3.10. We recommend using a Python environment manager such as Anaconda (or similar) so that you work on the same Python version and avoid cluttering the OS's Python installation. We assume you are a member of the Github sdss organization, and that you have an SSH key set up on your local machine and registered in your Github account. If not, please follow these instructions to set one up.
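
You can verify that your SSH key is correctly registered by authenticating against Github; this is standard Github tooling, not DRP-specific:

    ssh -T git@github.com

A greeting containing your Github username confirms the key works.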

To properly install and run the DRP you'll need to follow these steps first:

  1. Download the current version of LVM Core:

    git clone git@github.com:sdss/lvmcore.git

    and set the environment variable LVMCORE_DIR to point to the root directory lvmcore in your .bashrc (or equivalent):

    export LVMCORE_DIR="path/to/lvmcore"
  2. Define this environment variable in your .bashrc (or equivalent) to point to your local mirror of the SAS:

    export SAS_BASE_DIR="path/to/sas-root-directory"

    You can download a target from the SAS while preserving the directory structure using this command:

    wget -X css --reject html -nH -nc -t0 -r --level=2 -E --ignore-length -x -k -p -e robots=off -np -N https://data.sdss5.org/sas/sdsswork/data/lvm/lco/<mjd>/ --user <user> --password <password>

    NOTE: we strongly recommend that you use the SDSS access product to achieve the same results.

  3. Create a new python environment. This is optional, but strongly recommended. With conda this is done like this:

    conda create -n lvmdrp python=3.10
  4. Make sure you are in the intended python environment and directory:

    conda activate lvmdrp
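
Putting the prerequisite steps together, here is a minimal sketch of the full setup; the paths are placeholders you should adapt to your machine:

    git clone git@github.com:sdss/lvmcore.git
    # in your .bashrc (or equivalent); adjust paths as needed
    export LVMCORE_DIR="$HOME/lvmcore"
    export SAS_BASE_DIR="$HOME/sas"
    # optional but strongly recommended: a dedicated environment
    conda create -n lvmdrp python=3.10
    conda activate lvmdrp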

Installation

If you are planning on installing the DRP on a system other than Ubuntu (e.g., MacOS), please read the troubleshooting section before you continue with the steps below.

To install the DRP along with its dependencies, follow these steps:

  1. Clone the Github repository:

    git clone git@github.com:sdss/lvmdrp.git
  2. Go into the lvmdrp directory:

    cd lvmdrp
  3. Install the DRP package in the current python environment (see the contributing section below for a replacement of this step):

    pip install .
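
As a quick sanity check, the drp command-line tool used throughout this README should now be available in your environment (assuming the install succeeded):

    drp --help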

Testing the installation

There is a tool to quickly verify that all the needed environment variables are in place. You can run it like this:

envcheck

If the variables are correctly set, you should see the value of each one and a success message.
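
If you prefer to check by hand, here is a rough bash equivalent for the two variables set in the Prerequisites section (the exact set of variables envcheck inspects is defined by the pipeline):

    for var in LVMCORE_DIR SAS_BASE_DIR; do
        echo "$var=${!var:-<not set>}"
    done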

Setup Calibration Files

Download the current set of calibrations from the SAS sandbox. After installing the pipeline, you can use the command drp get-calibs. For usage, run drp get-calibs --help. For example, to download all the calibration files for MJD 60255, run

drp get-calibs -m 60255

This command will download the files using sdss-access and place them in $LVM_MASTER_DIR, which the pipeline defines as $LVM_SANDBOX/calib, mirroring the SAS. Both variables are defined automatically relative to your root $SAS_BASE_DIR, so you will find the files at $SAS_BASE_DIR/sdsswork/lvm/sandbox/calib/
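
To confirm the download, you can list that directory:

    ls $SAS_BASE_DIR/sdsswork/lvm/sandbox/calib/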

Running the DRP

Say you want to reduce all exposures within <mjd>. You can do so by running the following in the shell:

drp run -m <mjd>

or you can reduce a single exposure number by running:

drp run -e <expnum>

or a list of exposure numbers in a file <expnum_file>, by running:

drp run -F <expnum_file>

More options are available; you can see them by running:

drp run --help
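
For example, here is a hypothetical run over a list of exposures, assuming the file lists one exposure number per line (the exposure numbers below are placeholders):

    printf '%s\n' 7230 7231 > expnums.txt
    drp run -F expnums.txt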

Running the DRP requires that you have correctly set up your environment by following the instructions in the Prerequisites and Installation sections.

The drp run command will reduce your target exposures. Here is a list of the reduction steps carried out by the DRP:

  • Preprocessing: overscan trimming and subtraction and pixel masking
  • Detrending: bias and dark subtraction, Poisson error calculation, flatfielding (pixel level, when available), units conversion (e-/s)
  • Astrometry: adds astrometry to the primary header and RA and DEC for each fiber to the slitmap extension
  • Stray light: modelling and subtraction of the straylight field
  • Extraction: spectral extraction by fitting the fiber profiles, taking into account thermal fiber shifts in the Y direction
  • Spectrograph combination: row-stacking of spectrograph fibers
  • Wavelength calibration: pixel-to-wavelength mapping and line spread function (LSF) per fiber
  • Fiberflat: flatfielding (fiber level) using twilight fiberflats
  • Wavelength refinement: refines the wavelength solution by matching the sky line positions, takes into account fiber thermal shifts in the wavelength direction, only used to subtract sky from standard fibers
  • Sky fiber interpolation: interpolation of sky fibers along fiber ID by fitting the supersampled sky spectrum, per sky telescope
  • Wavelength resampling: wavelength resampling to a common grid (~0.5 Angstrom)
  • Flux calibration: calculates sensitivity curves for each exposed standard star and flux-calibrates the science fibers using the average sensitivity
  • Channel combination: stitching together spectrographs' channels
  • Sky subtraction: final sky subtraction separating sky lines and continuum and combining into master sky in a predefined way
  • Generate/update summary: adds a new record to the summary file (see description below)

The main outputs will be stored in the SAS directory:

$SAS_BASE_DIR/sdsswork/lvm/spectro/redux/<drpver>/<tilegrp>/<tileid>/<mjd>/

where you should find your lvmCFrame-<expnum:08d>.fits file, the raw_metadata.hdf5 file, and the ancillary folder. Within ancillary you'll find files following these naming conventions:

  • lvm-[pdxlwhs]object-<camera>-<expnum:08d>.fits
  • lvm-dstray-<camera>-<expnum:08d>.fits: contains the stray light modelling information
  • lvm-[wh]sky_[ew]-<camera>-<expnum:08d>.fits: contains the supersampled sky fitting

where each letter in pdxlwhs denotes the processing stage of the frame: preprocessed, detrended, extracted, stray-light subtracted, wavelength-calibrated, wavelength-resampled, and sky-subtracted, respectively. ew refers to the east and west sky telescopes, respectively.

The main products of the pipeline are:

  • lvmFrame-<channel>-<expnum:08d>.fits: extracted, spectrograph stacked, wavelength calibrated and flatfielded frame in electrons
  • lvmFFrame-<channel>-<expnum:08d>.fits: flux calibrated frame in physical units
  • lvmCFrame-<expnum:08d>.fits: channel combined flux calibrated frame
  • lvmSFrame-<expnum:08d>.fits: sky subtracted frame
  • drpall-<drpver>.h5: summary file of the reductions using <drpver> version of the DRP
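
For instance, to list the sky-subtracted frames of a reduction, using the output path shown above (same placeholders):

    ls $SAS_BASE_DIR/sdsswork/lvm/spectro/redux/<drpver>/<tilegrp>/<tileid>/<mjd>/lvmSFrame-*.fits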

ESO sky routines installation and configuration

NOTE: you don't need to install the ESO sky routines to run the science reductions shown in the previous section.

If you are planning on using the sky module, you will need to install the ESO routines first: both skycorr and the ESO Sky Model, following the installation instructions on their respective ESO pages. Additionally, you'll need to set the following environment variable in your .bashrc (or equivalent):

export LVM_ESOSKY_DIR="path/to/eso-routines"

where eso-routines is a directory containing the root directories of both the skycorr and the ESO sky model installations.
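
A layout satisfying this might look as follows; the subdirectory names are illustrative, not mandated by the pipeline:

    $LVM_ESOSKY_DIR/
    ├── skycorr/     # skycorr installation root
    └── skymodel/    # ESO Sky Model installation root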

Creating test data

We encourage you to use the LVM data simulator to generate data for testing the DRP. If you want to skip that step, we already provide simulations produced with the same simulator, so you don't have to run it yourself; 2D simulations in particular can be computationally demanding.

If you follow the examples below, you will have access to the above-mentioned simulations.

Examples

You will find tutorial notebooks for running different DRP routines in the examples folder. There you will find Jupyter notebooks that illustrate different tasks in the DRP:

  • Basic Calibration: reduction of calibration images: bias, dark and pixel flats; as well as reduction of arcs and fiber flats.
  • Wavelength Calibration: automatic pixel to wavelength mapping and wavelength and LSF fitting.
  • Flux Calibration: conversion of the extracted spectra from electrons to flux calibrated spectra.
  • Sky Module: several procedures to sky-subtract science spectra.

In each of the links above you will find a short description of what each example does, as well as the order in which they are intended to be followed.

Contributing to LVM-DRP development

There are two ways in which you can contribute:

  • Testing the DRP and reporting bugs on Github, or
  • Diving into the code to fix bugs and implement new features

For those willing to contribute by coding, there are some steps to streamline the development process:

  1. Make sure you install the pipeline in your environment in editable (developer) mode, like this:

    pip install -e ".[dev]"
  2. Before you start coding on a new feature/bug-fix, make sure your local master branch is up to date:

    git pull origin master
  3. Create a branch to work on and make sure the name maps clearly to the work you intend to do:

    git checkout -b <feature_name>
  4. Start coding. Once you're done implementing changes:

    git status  # check what has changed and identify the files you want to commit
    git add <changed_files>
    git commit -m "commit message"
  5. Afterwards, you can push your updates to the remote branch on Github:

    git push
  6. Finally, if you consider your feature ready to be merged into the master branch, you can create a new pull request on Github.

Regarding commits, we aim for an atomic approach, where each commit has a single purpose. Please avoid pushing lots of unrelated changes in a single commit.
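
One way to keep commits atomic is to stage related hunks interactively rather than whole files; this is standard git, not a DRP-specific requirement:

    git add -p                       # interactively pick the hunks belonging to one logical change
    git commit -m "<single-purpose commit message>"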

Troubleshooting

On some MacOS versions, extra installation steps may be needed before getting into the steps described in the installation section.

Issue importing CSafeLoader

Some Mac users have found the following error while importing CSafeLoader from the PyYAML package (~6.0):

AttributeError: module 'yaml' has no attribute 'CSafeLoader'

PyYAML is installed as a dependency of PyTables. As of Aug 7, 2023, the problem seems to be solved either by installing PyTables directly from conda (instead of using pip install) or by installing PyTables from their master branch.
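
Concretely, one of the following should work; treat the exact invocations as a sketch (the git URL points to the upstream PyTables repository):

    conda install -c conda-forge pytables
    # or, from the master branch:
    pip install git+https://github.com/PyTables/PyTables.git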

For MacOS (Monterey v12.6.2)

You will need to run this extra step before continuing with the regular DRP installation:

sudo port install py38-healpy

See healpy documentation for a statement on this issue.

After this step, you should be able to proceed with the DRP installation as described in the installation section.

For MacOS (Mojave v10.14.6)

The installation of the scipy package (a core dependency of the DRP) requires openBLAS in order to compile the source files. If you are running an old MacOS version, please follow these steps:

  1. Install openBLAS by doing:

    brew install openblas
  2. Set $PKG_CONFIG_PATH to point to your installation of openBLAS. This may look like this:

    export PKG_CONFIG_PATH="/usr/local/opt/openblas/lib/pkgconfig"
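
You can verify that pkg-config resolves openBLAS before retrying the installation (assuming Homebrew placed the .pc file under the path above):

    pkg-config --libs openblas   # should print linker flags such as -lopenblas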

After these steps, you should be able to proceed with the DRP installation as described in the installation section.