Merge pull request #330 from simpeg/patches
Patches
kkappler authored Aug 29, 2024
2 parents b11970b + f98c1f8 commit ca403e6
Showing 137 changed files with 22,905 additions and 85,848 deletions.
2 changes: 1 addition & 1 deletion .bumpversion.cfg
@@ -1,3 +1,3 @@
[bumpversion]
current_version = 0.3.13
current_version = 0.3.14
files = setup.py aurora/__init__.py
17 changes: 11 additions & 6 deletions .github/workflows/tests.yml
@@ -22,7 +22,7 @@ jobs:
# python-version: ["3.10", ]

steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4

- name: Setup Miniconda
uses: conda-incubator/setup-miniconda@v2.1.1
@@ -37,6 +37,7 @@ jobs:
pip install -r requirements-dev.txt
pip install git+https://github.com/kujaku11/mt_metadata.git@main
pip install git+https://github.com/kujaku11/mth5.git@master
pip install git+https://github.com/MTgeophysics/mtpy-v2.git@main
- name: Install Our Package
run: |
@@ -53,12 +54,12 @@ jobs:
- name: Execute Jupyter Notebooks
run: |
jupyter nbconvert --to notebook --execute docs/examples/dataset_definition.ipynb
jupyter nbconvert --to notebook --execute docs/examples/make_cas04_single_station_h5.ipynb
jupyter nbconvert --to notebook --execute docs/examples/operate_aurora.ipynb
jupyter nbconvert --to notebook --execute tests/test_run_on_commit.ipynb
jupyter nbconvert --to notebook --execute docs/tutorials/pole_zero_fitting/lemi_pole_zero_fitting_example.ipynb
jupyter nbconvert --to notebook --execute docs/tutorials/processing_configuration.ipynb
jupyter nbconvert --to notebook --execute docs/tutorials/process_cas04_single_station.ipynb
jupyter nbconvert --to notebook --execute docs/tutorials/synthetic_data_processing.ipynb
jupyter nbconvert --to notebook --execute tests/test_run_on_commit.ipynb
# Replace "notebook.ipynb" with your notebook's filename
# - name: Commit changes (if any)
@@ -72,13 +73,17 @@ jobs:

- name: Run Tests
run: |
# pytest -s -v tests/synthetic/test_fourier_coefficients.py --cov=./ --cov-report=xml --cov=aurora
# pytest -s -v tests/synthetic/test_fourier_coefficients.py
# pytest -s -v tests/test_general_helper_functions.py
pytest -s -v --cov=./ --cov-report=xml --cov=aurora
- name: "Upload coverage to Codecov"
uses: codecov/codecov-action@v1
- name: "Upload coverage reports to Codecov"
uses: codecov/codecov-action@v4
with:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
fail_ci_if_error: false
flags: tests
# token: ${{ secrets.CODECOV_TOKEN }}

- name: Build Doc
if: ${{ (github.ref == 'refs/heads/main') && (matrix.python-version == '3.8')}}
11 changes: 11 additions & 0 deletions .gitignore
@@ -1,9 +1,17 @@
.idea
data/figures
data/parkfield/*png
data/parkfield/aurora_results/
data/parkfield/*h5
data/synthetic/aurora_results/
data/synthetic/config/
data/synthetic/mth5/
docs/tutorials/*h5
docs/tutorials/CAS04*edi
docs/tutorials/CAS04*zrr
docs/tutorials/CAS04*xml
docs/tutorials/CAS04*png
docs/tutorials/config.json
tests/io/from_matlab.zss
*ignore*
*fix_issue*
@@ -27,6 +35,9 @@ tests/synthetic/data/*h5
tests/synthetic/test2*csv
tests/synthetic/out.png

*.h5
*.xml

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[codx]
92 changes: 92 additions & 0 deletions CITATION.cff
@@ -0,0 +1,92 @@
# This CITATION.cff file was generated with cffinit.
# Visit https://bit.ly/cffinit to generate yours today!

cff-version: 1.2.0
title: aurora
message: >-
If you use this software, please cite it using the
metadata from this file.
type: software
authors:
- given-names: Karl
name-particle: N.
family-names: Kappler
orcid: 'https://orcid.org/0000-0002-1877-1255'
- given-names: Jared
name-particle: R.
family-names: Peacock
orcid: 'https://orcid.org/0000-0002-0439-0224'
- given-names: Gary
name-particle: D.
family-names: Egbert
orcid: 'https://orcid.org/0000-0003-1276-8538'
- orcid: 'https://orcid.org/0000-0002-8818-3731'
given-names: Andrew
family-names: Frassetto
- given-names: Lindsey
family-names: Heagy
orcid: 'https://orcid.org/0000-0002-1551-5926'
- given-names: Anna
family-names: Kelbert
orcid: 'https://orcid.org/0000-0003-4395-398X'
- given-names: Laura
family-names: Keyson
- given-names: Douglas
family-names: Oldenburg
orcid: 'https://orcid.org/0000-0002-4327-2124'
- given-names: Timothy
family-names: Ronan
orcid: 'https://orcid.org/0000-0001-8450-9573'
- given-names: Justin
family-names: Sweet
orcid: 'https://orcid.org/0000-0001-7323-9758'
identifiers:
- type: doi
value: 10.5281/zenodo.13334589
description: >-
Contains the software at time of manuscript
acceptance.
- type: doi
value: 10.21105/joss.06832
description: The JOSS manuscript
repository-code: 'https://github.com/simpeg/aurora'
url: 'https://simpeg.xyz/aurora/'
abstract: >-
The Aurora software package robustly estimates single station
and remote reference electromagnetic transfer functions (TFs)
from magnetotelluric (MT) time series. Aurora is part of an
open-source processing workflow that leverages the
self-describing data container MTH5, which in turn leverages
the general mt_metadata framework to manage metadata. These
pre-existing packages simplify the processing workflow by
providing managed data structures, allowing transfer functions
to be generated with only a few lines of code. The processing
depends on two inputs – a table defining the data to use for
TF estimation and a JSON file specifying the processing
parameters, both of which are generated automatically and can
be modified if desired. Output TFs are returned as mt_metadata
objects, and can be exported to a variety of common formats
for plotting, modeling, and inversion.
keywords:
- open-source
- python
- magnetotelluric
- processing
- transfer function
license: MIT
commit: abcdefgh
version: 0.3.14
date-released: '2024-08-28'
14 changes: 14 additions & 0 deletions CONTRIBUTING.rst
@@ -0,0 +1,14 @@
Contributing to Aurora
----------------------


If you'd like to contribute, start by searching through the `issues <https://github.com/simpeg/aurora/issues>`_ to see whether someone else has raised a similar idea or question. If you don't see your idea listed, open an issue, and if there are code changes, also a pull request.


There are many ways to contribute to aurora, such as:

* Report/fix bugs or issues encountered when using the software
* Suggest additional features or functionalities
* Fix editorial inconsistencies or inaccuracies

Aurora is hosted by SimPEG; please refer to their `contributing guidelines page <https://docs.simpeg.xyz/latest/content/getting_started/contributing/index.html>`_ for more details.
36 changes: 32 additions & 4 deletions README.rst
@@ -1,11 +1,9 @@
.. image:: ../docs/figures/aurora_logo.png
.. image:: docs/figures/aurora_logo.png
:width: 900
:alt: AURORA

|


.. image:: https://img.shields.io/pypi/v/aurora.svg
:target: https://pypi.python.org/pypi/aurora

@@ -15,6 +13,36 @@
.. image:: https://img.shields.io/pypi/l/aurora.svg
:target: https://pypi.python.org/pypi/aurora

Aurora is a Python library for processing natural source electromagnetic data. It uses MTH5 formatted Magnetotelluric data and a configuration file as inputs and generates transfer functions that can be formatted as EMTF XML or other output formats.
Aurora is an open-source package that robustly estimates single station and remote reference electromagnetic transfer functions (TFs) from magnetotelluric (MT) time series. Aurora is part of an open-source processing workflow that leverages the self-describing data container `MTH5 <https://github.com/kujaku11/mth5>`_, which in turn leverages the general `mt_metadata <https://github.com/kujaku11/mt_metadata>`_ framework to manage metadata. These pre-existing packages simplify the processing by providing managed data structures, allowing transfer functions to be generated with only a few lines of code. The processing depends on two inputs -- a table defining the data to use for TF estimation, and a JSON file specifying the processing parameters, both of which are generated automatically and can be modified if desired. Output TFs are returned as mt_metadata objects, and can be exported to a variety of common formats for plotting, modeling and inversion.

Key Features
-------------

- Tabular data indexing and management (Pandas dataframes)
- Dictionary-like processing parameters configuration
- Programmatic or manual editing of inputs
- Largely automated workflow

Documentation for the Aurora project can be found at http://simpeg.xyz/aurora/

Installation
---------------

We suggest installing from PyPI, the default package repository:

``pip install aurora``

Aurora can also be installed from conda-forge, though the conda package is not updated as often:

``conda install -c conda-forge aurora``

General Workflow
-------------------

1. Convert raw time series data to MTH5 format; see the `MTH5 Documentation and Examples <https://mth5.readthedocs.io/en/latest/index.html>`_.
2. Build a ``RunSummary`` of the time series data to decide which runs to process for the local station.
3. Choose a remote reference station and pair it with the local station in a ``KernelDataset``.
4. Create a ``Config``, the recipe describing how the data will be processed.
5. Estimate the transfer function with ``process_mth5``; the result is a ``mt_metadata.transfer_function.core.TF`` object that can be written to [ EMTFXML | EDI | ZMM | ZSS | ZRR ] files. A sketch of these steps is shown below.
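The following is a minimal sketch of steps 2-5 in Python, based on the aurora tutorials. The import locations, station names ("CAS04", "NVR08"), and file names are illustrative assumptions and may differ between aurora/MTH5 versions.

.. code-block:: python

    # Minimal sketch of the workflow above -- station names and file paths are illustrative.
    from mth5.processing import RunSummary, KernelDataset  # older aurora versions provide these classes themselves
    from aurora.config.config_creator import ConfigCreator
    from aurora.pipelines.process_mth5 import process_mth5

    # 2. Summarize the runs available in the MTH5 archive(s)
    run_summary = RunSummary()
    run_summary.from_mth5s(["local_station.h5", "remote_station.h5"])

    # 3. Pair the local station with a remote reference station
    kernel_dataset = KernelDataset()
    kernel_dataset.from_run_summary(run_summary, "CAS04", "NVR08")

    # 4. Create the processing configuration (the "recipe")
    config = ConfigCreator().create_from_kernel_dataset(kernel_dataset)

    # 5. Estimate the transfer function and export it
    tf = process_mth5(config, kernel_dataset)
    tf.write(fn="CAS04.edi")  # EMTF XML, ZMM, ZSS, ZRR output is also supported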


2 changes: 1 addition & 1 deletion aurora/__init__.py
@@ -1,4 +1,4 @@
__version__ = "0.3.13"
__version__ = "0.3.14"

import sys
from loguru import logger