Add documentation for kwargs passed to reader+writer backends #1157

Merged · 3 commits · Aug 7, 2024
11 changes: 6 additions & 5 deletions .github/workflows/ci_workflows.yml
@@ -37,11 +37,6 @@ jobs:
python: '3.12'
toxenv: py312-test

- name: Python 3.12 (MacOS X)
os: macos-latest
python: '3.12'
toxenv: py312-test

- name: Python 3.10
os: ubuntu-latest
python: '3.10'
@@ -114,6 +109,12 @@ jobs:
toxenv: py312-test-devdeps
toxposargs: --remote-data=any

# doctest failure due to different no. of significant digits on arm64 (#1146)
- name: Python 3.12 (macOS)
os: macos-latest
python: '3.12'
toxenv: py312-test

steps:
- name: Checkout code
uses: actions/checkout@v4
4 changes: 2 additions & 2 deletions docs/conf.py
@@ -57,8 +57,8 @@
# check_sphinx_version("1.2.1")

# Include other packages to link against
intersphinx_mapping['astropy'] = ('https://docs.astropy.org/en/latest/', None)
intersphinx_mapping['gwcs'] = ('https://gwcs.readthedocs.io/en/latest/', None)
intersphinx_mapping['astropy'] = ('https://docs.astropy.org/en/stable/', None)
intersphinx_mapping['gwcs'] = ('https://gwcs.readthedocs.io/en/stable/', None)
intersphinx_mapping['reproject'] = ('https://reproject.readthedocs.io/en/stable/', None)
intersphinx_mapping['mpl_animators'] = ('https://docs.sunpy.org/projects/mpl-animators/en/stable/', None)

31 changes: 25 additions & 6 deletions docs/spectrum1d.rst
@@ -3,15 +3,15 @@ Working with Spectrum1Ds
========================

As described in more detail in :doc:`types_of_spectra`, the core data class in
specutils for a single spectrum is `~specutils.Spectrum1D`. This object
specutils for a single spectrum is :class:`~specutils.Spectrum1D`. This object
can represent either one or many spectra, all with the same ``spectral_axis``.
This section describes some of the basic features of this class.

Basic Spectrum Creation
-----------------------

The simplest way to create a `~specutils.Spectrum1D` is to
create it explicitly from arrays or `~astropy.units.Quantity` objects:
The simplest way to create a :class:`~specutils.Spectrum1D` is to create
it explicitly from arrays or :class:`~astropy.units.Quantity` objects:

.. plot::
:include-source:
@@ -66,8 +66,8 @@ information to automatically identify a loader.

>>> from specutils import Spectrum1D
>>> import urllib
>>> specs = urllib.request.urlopen('https://data.sdss.org/sas/dr14/sdss/spectro/redux/26/spectra/0751/spec-0751-52251-0160.fits') # doctest: +REMOTE_DATA
>>> Spectrum1D.read(specs, format="SDSS-III/IV spec") # doctest: +REMOTE_DATA
>>> spec = urllib.request.urlopen('https://data.sdss.org/sas/dr14/sdss/spectro/redux/26/spectra/0751/spec-0751-52251-0160.fits') # doctest: +REMOTE_DATA
>>> Spectrum1D.read(spec, format="SDSS-III/IV spec") # doctest: +REMOTE_DATA
<Spectrum1D(flux=[30.59662628173828 ... 51.70271682739258] 1e-17 erg / (Angstrom s cm2) (shape=(3841,), mean=51.88042 1e-17 erg / (Angstrom s cm2)); spectral_axis=<SpectralAxis [3799.2686 3800.1426 3801.0188 ... 9193.905 9196.0205 9198.141 ] Angstrom> (length=3841); uncertainty=InverseVariance)>

Note that the same spectrum could be more conveniently downloaded via
@@ -94,7 +94,10 @@ installed, which is an optional dependency for ``specutils``.

Call the help function for a specific loader to access further documentation
on that format and optional parameters accepted by the ``read`` function,
e.g. as ``Spectrum1D.read.help('tabular-fits')``.
e.g. as ``Spectrum1D.read.help('tabular-fits')``. Additional optional parameters
are generally passed through to the backend functions performing the actual
reading operation, which depend on the loader. For FITS-based loaders, for example,
this will often be :func:`astropy.io.fits.open`.
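
For instance, keyword arguments accepted by :func:`astropy.io.fits.open`, such as
``memmap``, can be passed on directly through ``read`` (a minimal sketch using a
hypothetical file name; the extra keyword is simply forwarded to the backend)::

    >>> spec = Spectrum1D.read('my_spectrum.fits', format='tabular-fits', memmap=False)  # doctest: +SKIP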

More information on creating custom loaders for formats not covered
by the above list can be found in the :doc:`custom loading </custom_loading>` page.
@@ -115,9 +118,25 @@ any format, will default to the ``wcs1d-fits`` loader if the `~specutils.Spectru
has a compatible WCS, and to ``tabular-fits`` otherwise, or if writing
to an HDU other than the primary one (``hdu=0``) has been selected.
For better control of the file type, the ``format`` parameter should be explicitly passed.
Again, additional optional parameters are forwarded to the backend writing functions,
which for the FITS writers is :meth:`astropy.io.fits.HDUList.writeto`.
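
For example, the ``overwrite`` flag understood by :meth:`~astropy.io.fits.HDUList.writeto`
can be passed along when re-writing an existing file (a sketch with a hypothetical file name)::

    >>> spec.write('my_spectrum.fits', format='tabular-fits', overwrite=True)  # doctest: +SKIP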

| More information on creating custom writers can be found in :ref:`custom_writer`.

Metadata
--------

The :attr:`specutils.Spectrum1D.meta` attribute provides a dictionary to store
additional information on the data, such as its origin, date, and other circumstances.
For spectra read from files containing header-like attributes, such as a FITS
:class:`~astropy.io.fits.Header` or :attr:`astropy.table.Table.meta`,
loaders conventionally store this in ``Spectrum1D.meta['header']``.

The two provided FITS writers (``tabular-fits`` and ``wcs1d-fits``) save the contents of
``Spectrum1D.meta['header']`` (which should be an :class:`astropy.io.fits.Header`
or any object, like a `dict`, that can instantiate one) as the header of the
:class:`~astropy.io.fits.hdu.PrimaryHDU`.
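
As an illustrative sketch (assuming ``flux`` and ``wavelength`` are existing
:class:`~astropy.units.Quantity` arrays; the header keyword and file name below
are hypothetical)::

    >>> from astropy.io import fits
    >>> hdr = fits.Header()
    >>> hdr['OBSERVER'] = 'A. Observer'
    >>> spec = Spectrum1D(spectral_axis=wavelength, flux=flux, meta={'header': hdr})
    >>> spec.write('my_spectrum.fits', format='tabular-fits')  # doctest: +SKIP
    >>> fits.getheader('my_spectrum.fits')['OBSERVER']  # doctest: +SKIP
    'A. Observer'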

Including Uncertainties
-----------------------

4 changes: 2 additions & 2 deletions specutils/io/default_loaders/ascii.py
@@ -51,7 +51,7 @@ def ascii_loader(file_name, column_mapping=None, **kwargs):
# If no column mapping is given, attempt to parse the ascii files using
# unit information
if column_mapping is None:
return generic_spectrum_from_table(tab, **kwargs)
return generic_spectrum_from_table(tab)

return spectrum_from_column_mapping(tab, column_mapping)

@@ -95,6 +95,6 @@ def ipac_loader(file_name, column_mapping=None, **kwargs):
# If no column mapping is given, attempt to parse the ascii files using
# unit information
if column_mapping is None:
return generic_spectrum_from_table(tab, **kwargs)
return generic_spectrum_from_table(tab)

return spectrum_from_column_mapping(tab, column_mapping)
2 changes: 1 addition & 1 deletion specutils/io/default_loaders/generic_ecsv_reader.py
@@ -45,6 +45,6 @@ def generic_ecsv(file_name, column_mapping=None, **kwargs):
table = Table.read(file_name, format='ascii.ecsv')

if column_mapping is None:
return generic_spectrum_from_table(table, **kwargs)
return generic_spectrum_from_table(table)

return spectrum_from_column_mapping(table, column_mapping)
41 changes: 25 additions & 16 deletions specutils/io/default_loaders/tabular_fits.py
@@ -43,16 +43,16 @@ def identify_tabular_fits(origin, *args, **kwargs):
dtype=Spectrum1D, extensions=['fits', 'fit'], priority=6)
def tabular_fits_loader(file_obj, column_mapping=None, hdu=1, store_data_header=False, **kwargs):
"""
Load spectrum from a FITS file.
Load spectrum from a FITS file tabular extension.

Parameters
----------
file_obj: str, file-like, or HDUList
file_obj : str, file-like, or :class:`~astropy.io.fits.HDUList`
FITS file name, object (provided from name by Astropy I/O Registry),
or HDUList (as resulting from astropy.io.fits.open()).
hdu: int
or HDU list (as resulting from `~astropy.io.fits.open`).
hdu : int
The HDU of the fits file (default: 1st extension) to read from
store_data_header: bool
store_data_header : bool
Defaults to ``False``, which stores the primary header in ``Spectrum1D.meta['header']``.
Set to ``True`` to instead store the header from the specified data HDU.
column_mapping : dict
@@ -61,14 +61,20 @@ def tabular_fits_loader(file_obj, column_mapping=None, hdu=1, store_data_header=
information. The dictionary keys should be the FITS file column names
while the values should be a two-tuple where the first element is the
associated `Spectrum1D` keyword argument, and the second element is the
unit for the ASCII file column::
unit for the file column (or ``None`` to take unit from the table header)::

column_mapping = {'FLUX': ('flux', 'Jy')}
column_mapping = {'FLUX': ('flux', 'Jy'),
'WAVE': ('spectral_axis', 'um')}

**kwargs
Additional optional keywords passed to
:func:`~specutils.io.parsing_utils.read_fileobj_or_hdulist`, and when
reading from a file-like object, through to :func:`~astropy.io.fits.open`.

Returns
-------
data: Spectrum1D
The spectrum that is represented by the data in this table.
data : :class:`Spectrum1D`
The spectrum that is represented by the data in the input table.
"""
# Parse the wcs information. The wcs will be passed to the column finding
# routines to search for spectral axis information in the file.
@@ -90,7 +96,7 @@ def tabular_fits_loader(file_obj, column_mapping=None, hdu=1, store_data_header=
# If no column mapping is given, attempt to parse the file using
# unit information
if column_mapping is None:
return generic_spectrum_from_table(tab, wcs=wcs, **kwargs)
return generic_spectrum_from_table(tab, wcs=wcs)

return spectrum_from_column_mapping(tab, column_mapping, wcs=wcs)

@@ -102,15 +108,15 @@ def tabular_fits_writer(spectrum, file_name, hdu=1, update_header=False, store_d

Parameters
----------
spectrum: Spectrum1D
file_name: str
The path to the FITS file
hdu: int
spectrum : :class:`Spectrum1D`
file_name : str, file-like or `pathlib.Path`
File to write to. If a file object, must be opened in a writeable mode.
hdu : int
Header Data Unit in FITS file to write to (currently only extension HDU 1)
update_header: bool
update_header : bool
Write all compatible items in ``Spectrum1D.meta`` directly to FITS header;
this will overwrite any identically named keys from ``Spectrum1D.meta['header']``.
store_data_header: bool
store_data_header : bool
If ``True``, store ``Spectrum1D.meta['header']`` in the header of the target data HDU
instead of the primary header (default ``False``).
wunit : str or `~astropy.units.Unit`
@@ -121,6 +127,9 @@ def tabular_fits_writer(spectrum, file_name, hdu=1, update_header=False, store_d
Floating point type for storing spectral axis array
ftype : str or `~numpy.dtype`
Floating point type for storing flux array
hdulist : :class:`~astropy.io.fits.HDUList`
**kwargs
Additional optional keywords passed to :meth:`~astropy.io.fits.HDUList.writeto`.
"""
if hdu < 1:
raise ValueError(f'FITS does not support BINTABLE extension in HDU {hdu}.')
24 changes: 14 additions & 10 deletions specutils/io/default_loaders/wcs_fits.py
@@ -92,7 +92,9 @@ def wcs1d_fits_loader(file_obj, spectral_axis_unit=None, flux_unit=None,
The ``uncertainty_type`` of `~astropy.nddata.NDUncertainty`
(one of 'std', 'var', 'ivar'; default: try to infer from HDU EXTNAME).
**kwargs
Extra keywords for :func:`~specutils.io.parsing_utils.read_fileobj_or_hdulist`.
Additional optional keywords passed to
:func:`~specutils.io.parsing_utils.read_fileobj_or_hdulist`, and when
reading from a file-like object, through to :func:`~astropy.io.fits.open`.

Returns
-------
@@ -225,22 +227,24 @@ def wcs1d_fits_writer(spectrum, file_name, hdu=0, update_header=False,
Parameters
----------
spectrum : :class:`~specutils.Spectrum1D`
file_name : str
The path to the FITS file
file_name : str, file-like or `pathlib.Path`
File to write to. If a file object, must be opened in a writeable mode.
hdu : int, optional
Header Data Unit in FITS file to write to (base 0; default primary HDU)
Header Data Unit in FITS file to write to (base 0; default primary HDU).
update_header : bool, optional
Update FITS header with all compatible entries in `spectrum.meta`
Update FITS header with all compatible entries in `spectrum.meta`.
flux_name : str, optional
HDU name to store flux spectrum under (default 'FLUX')
HDU name to store flux spectrum under (default 'FLUX').
mask_name : str or `None`, optional
HDU name to store mask under (default 'MASK'; `None`: do not save)
HDU name to store mask under (default 'MASK'; `None`: do not save).
uncertainty_name : str or `None`, optional
HDU name to store uncertainty under (default set from type; `None`: do not save)
HDU name to store uncertainty under (default set from type; `None`: do not save).
unit : str or :class:`~astropy.units.Unit`, optional
Unit for the flux (and associated uncertainty; defaults to ``spectrum.flux.unit``)
Unit for the flux (and associated uncertainty; defaults to ``spectrum.flux.unit``).
dtype : str or :class:`~numpy.dtype`, optional
Floating point type for storing flux array (defaults to ``spectrum.flux.dtype``)
Floating point type for storing flux array (defaults to ``spectrum.flux.dtype``).
**kwargs
Additional optional keywords passed to :meth:`~astropy.io.fits.HDUList.writeto`.
"""
# Create HDU list from WCS
try:
20 changes: 11 additions & 9 deletions specutils/io/parsing_utils.py
@@ -68,10 +68,10 @@ def spectrum_from_column_mapping(table, column_mapping, wcs=None, verbose=False)
information. The dictionary keys should be the table column names
while the values should be a two-tuple where the first element is the
associated `Spectrum1D` keyword argument, and the second element is the
unit for the file column (or ``None`` to take unit from the table)::
unit for the file column (or ``None`` to take unit from the table header)::

column_mapping = {'FLUX': ('flux', 'Jy'),
'WAVE': ('spectral_axis'spectral_axisu', 'um')}
'WAVE': ('spectral_axis', 'um')}

wcs : :class:`~astropy.wcs.WCS` or :class:`gwcs.WCS`
WCS object passed to the Spectrum1D initializer.
@@ -138,27 +138,29 @@ def spectrum_from_column_mapping(table, column_mapping, wcs=None, verbose=False)
return Spectrum1D(**spec_kwargs, wcs=wcs, meta={'header': table.meta})


def generic_spectrum_from_table(table, wcs=None, **kwargs):
def generic_spectrum_from_table(table, wcs=None):
"""
Load spectrum from an Astropy table into a Spectrum1D object.
Uses the following logic to figure out which column is which:

* Spectral axis (dispersion) is the first column with units
compatible with u.spectral() or with length units such as 'pix'.
compatible with ``u.spectral()`` or with length units such as 'pix'.
Need not be present, if a valid ``wcs`` parameter is passed.

* Flux is taken from the first column with units compatible with
u.spectral_density(), or with other likely culprits such as
``u.spectral_density()``, or with other likely culprits such as
'adu' or 'cts/s'.

* Uncertainty comes from the next column with the same units as flux.

Parameters
----------
file_name: str
The path to the ECSV file
table : :class:`~astropy.table.Table`
Table containing a column of ``flux``, and optionally ``spectral_axis``
and ``uncertainty`` as defined above.
wcs : :class:`~astropy.wcs.WCS`
A FITS WCS object. If this is present, the machinery will fall back
to using the wcs to find the dispersion information.
and default to using the ``wcs`` to find the dispersion information.

Returns
-------
@@ -212,7 +214,7 @@ def _find_spectral_column(table, columns_to_search, spectral_axis):
additional_valid_units = [u.Unit('adu'), u.Unit('ct/s'), u.Unit('count')]
found_column = None

# First, search for a column with units compatible with Janskies
# First, search for a column with units compatible with Jansky
for c in columns_to_search:
try:
# Check for multi-D flux columns