Commit

Merge main into fix_li_lfl_float so the CI pipeline passes
ClementLaplace committed Jan 27, 2025
2 parents a02f8b1 + 3990f37 commit f06d07e
Showing 63 changed files with 3,678 additions and 622 deletions.
1 change: 1 addition & 0 deletions .github/ISSUE_TEMPLATE/bug_report.md
@@ -1,6 +1,7 @@
---
name: Bug report
about: Create a report to help us improve
type: 'bug'

---

1 change: 1 addition & 0 deletions .github/ISSUE_TEMPLATE/feature_request.md
@@ -1,6 +1,7 @@
---
name: Feature request
about: Suggest an idea for this project
type: 'feature'

---

4 changes: 2 additions & 2 deletions .github/workflows/ci.yaml
@@ -115,7 +115,7 @@ jobs:
pytest -n auto --cov=satpy satpy/tests --cov-report=xml --cov-report=
- name: Upload unittest coverage to Codecov
uses: codecov/codecov-action@v4
uses: codecov/codecov-action@v5
with:
flags: unittests
file: ./coverage.xml
@@ -136,7 +136,7 @@ jobs:
coverage xml
- name: Upload behaviour test coverage to Codecov
uses: codecov/codecov-action@v4
uses: codecov/codecov-action@v5
with:
flags: behaviourtests
file: ./coverage.xml
2 changes: 1 addition & 1 deletion .github/workflows/deploy-sdist.yaml
@@ -23,7 +23,7 @@ jobs:
- name: Publish package to PyPI
if: github.event.action == 'published'
uses: pypa/gh-action-pypi-publish@v1.11.0
uses: pypa/gh-action-pypi-publish@v1.12.3
with:
user: __token__
password: ${{ secrets.pypi_password }}
6 changes: 3 additions & 3 deletions .pre-commit-config.yaml
@@ -3,7 +3,7 @@ fail_fast: false
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: 'v0.7.2'
rev: 'v0.8.6'
hooks:
- id: ruff
- repo: https://github.com/pre-commit/pre-commit-hooks
@@ -14,12 +14,12 @@ repos:
- id: check-yaml
args: [--unsafe]
- repo: https://github.com/PyCQA/bandit
rev: '1.7.10' # Update me!
rev: '1.8.0' # Update me!
hooks:
- id: bandit
args: [--ini, .bandit]
- repo: https://github.com/pre-commit/mirrors-mypy
rev: 'v1.13.0' # Use the sha / tag you want to point at
rev: 'v1.14.1' # Use the sha / tag you want to point at
hooks:
- id: mypy
additional_dependencies:
2 changes: 1 addition & 1 deletion AUTHORS.md
@@ -106,5 +106,5 @@ The following people have made contributions to this project:
- [Clement Laplace (ClementLaplace)](https://github.com/ClementLaplace)
- [Will Sharpe (wjsharpe)](https://github.com/wjsharpe)
- [Sara Hörnquist (shornqui)](https://github.com/shornqui)
- [Antonio Valentino](https://github.com/avalentino)
- [Clément (ludwigvonkoopa)](https://github.com/ludwigVonKoopa)
- [Xuanhan Lai (sgxl)](https://github.com/sgxl)
53 changes: 53 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,56 @@
## Version 0.54.0 (2025/01/20)

### Issues Closed

* [Issue 3020](https://github.com/pytroll/satpy/issues/3020) - Re-implement essl_colorized_low_level_moisture using colorize ([PR 3021](https://github.com/pytroll/satpy/pull/3021) by [@gerritholl](https://github.com/gerritholl))
* [Issue 3009](https://github.com/pytroll/satpy/issues/3009) - artefacts in FCI RGBs using 3.8 µm ([PR 3013](https://github.com/pytroll/satpy/pull/3013) by [@gerritholl](https://github.com/gerritholl))
* [Issue 2991](https://github.com/pytroll/satpy/issues/2991) - Resampling MTG FCI high res bands fails when the resample includes bands at different spatial resolutions
* [Issue 2981](https://github.com/pytroll/satpy/issues/2981) - Fix the bug with `satpy` when using `numpy 2.x` which leads to `SEVIRI` resampled files having a double size ([PR 2983](https://github.com/pytroll/satpy/pull/2983) by [@pkhalaj](https://github.com/pkhalaj))
* [Issue 2979](https://github.com/pytroll/satpy/issues/2979) - Improving resolution when setting extent
* [Issue 2977](https://github.com/pytroll/satpy/issues/2977) - CRS data is being printed to title of image
* [Issue 2975](https://github.com/pytroll/satpy/issues/2975) - can't create ABI geo_color composite
* [Issue 2963](https://github.com/pytroll/satpy/issues/2963) - ahi_hrit reader cannot create a Scene
* [Issue 2814](https://github.com/pytroll/satpy/issues/2814) - Reading LI L2 point data is not daskified ([PR 2985](https://github.com/pytroll/satpy/pull/2985) by [@ClementLaplace](https://github.com/ClementLaplace))
* [Issue 2566](https://github.com/pytroll/satpy/issues/2566) - Wrong version numbers at readthedocs
* [Issue 1997](https://github.com/pytroll/satpy/issues/1997) - Resampling from SwathDefinition to AreaDefinition fails with OSError and AssertionError
* [Issue 1788](https://github.com/pytroll/satpy/issues/1788) - integration / regression tests that compare images
* [Issue 1755](https://github.com/pytroll/satpy/issues/1755) - Store project metadata in pyproject.toml
* [Issue 1240](https://github.com/pytroll/satpy/issues/1240) - iber projection lost in the North Pacific

In this release 14 issues were closed.

### Pull Requests Merged

#### Bugs fixed

* [PR 3035](https://github.com/pytroll/satpy/pull/3035) - Pin dask to avoid dataframe problem
* [PR 3030](https://github.com/pytroll/satpy/pull/3030) - Fix sdist tarball including unnecessary files
* [PR 2995](https://github.com/pytroll/satpy/pull/2995) - Add new ABI L2 "CPS" variable name for Cloud Particle Size
* [PR 2985](https://github.com/pytroll/satpy/pull/2985) - li2_nc reader daskified ([2814](https://github.com/pytroll/satpy/issues/2814))
* [PR 2983](https://github.com/pytroll/satpy/pull/2983) - Fix dtype promotion in SEVIRI native reader ([2981](https://github.com/pytroll/satpy/issues/2981))
* [PR 2976](https://github.com/pytroll/satpy/pull/2976) - Fix dtype promotion in `mersi2_l1b` reader
* [PR 2969](https://github.com/pytroll/satpy/pull/2969) - Fix geos proj parameters for Insat 3d satellites
* [PR 2959](https://github.com/pytroll/satpy/pull/2959) - Modified the issue with the calibration coefficient indices for FY-3 satellite data reader

#### Features added

* [PR 3034](https://github.com/pytroll/satpy/pull/3034) - Set issue type in templates
* [PR 3021](https://github.com/pytroll/satpy/pull/3021) - Change ESSL colorisation approach ([3020](https://github.com/pytroll/satpy/issues/3020))
* [PR 3013](https://github.com/pytroll/satpy/pull/3013) - Clip negative FCI radiances ([3009](https://github.com/pytroll/satpy/issues/3009))
* [PR 3007](https://github.com/pytroll/satpy/pull/3007) - Add t865 dataset to olci l2 list ([1767](https://github.com/pytroll/satpy/issues/1767))
* [PR 2999](https://github.com/pytroll/satpy/pull/2999) - Add Accsos image comparison tests
* [PR 2941](https://github.com/pytroll/satpy/pull/2941) - Refactor MVIRI dataset access
* [PR 2565](https://github.com/pytroll/satpy/pull/2565) - Add level-1 readers for the arctic weather satellite data

#### Clean ups

* [PR 3030](https://github.com/pytroll/satpy/pull/3030) - Fix sdist tarball including unnecessary files
* [PR 3014](https://github.com/pytroll/satpy/pull/3014) - Remove xarray-datatree dependency from CI
* [PR 3010](https://github.com/pytroll/satpy/pull/3010) - Remove version limit on pytest in CI

In this release 18 pull requests were closed.


## Version 0.53.0 (2024/11/08)

### Issues Closed
17 changes: 0 additions & 17 deletions MANIFEST.in

This file was deleted.

5 changes: 2 additions & 3 deletions continuous_integration/environment.yaml
@@ -3,7 +3,7 @@ channels:
- conda-forge
dependencies:
- xarray!=2022.9.0
- dask
- dask<2025.1.0
- distributed
- dask-image
- donfig
@@ -43,7 +43,7 @@ dependencies:
- python-eccodes
# 2.19.1 seems to cause library linking issues
- eccodes>=2.20
- pytest<8.0.0
- pytest
- pytest-cov
- fsspec
- botocore>=1.33
@@ -53,7 +53,6 @@ dependencies:
- pip
- skyfield
- astropy
- xarray-datatree
- pint-xarray
- ephem
- bokeh
9 changes: 7 additions & 2 deletions doc/source/conf.py
@@ -85,10 +85,15 @@ def __getattr__(cls, name):
# Add any Sphinx extension module names here, as strings. They can be extensions
# coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = ["sphinx.ext.autodoc", "sphinx.ext.intersphinx", "sphinx.ext.todo", "sphinx.ext.coverage",
"sphinx.ext.doctest", "sphinx.ext.napoleon", "sphinx.ext.autosummary", "doi_role",
"sphinx.ext.viewcode", "sphinxcontrib.apidoc",
"sphinx.ext.doctest", "sphinx.ext.napoleon", "sphinx.ext.autosummary", "sphinx.ext.autosectionlabel",
"doi_role", "sphinx.ext.viewcode", "sphinxcontrib.apidoc",
"sphinx.ext.mathjax"]

# Autosectionlabel
# Make sure target is unique
autosectionlabel_prefix_document = True
autosectionlabel_maxdepth = 3

# API docs
apidoc_module_dir = "../../satpy"
apidoc_output_dir = "api"
2 changes: 1 addition & 1 deletion doc/source/config.rst
@@ -272,7 +272,7 @@ If ``clip_negative_radiances=False``, pixels with negative radiances will have

Clipping of negative radiances is currently implemented for the following readers:

* ``abi_l1b``, ``ami_l1b``
* ``abi_l1b``, ``ami_l1b``, ``fci_l1c_nc``


Temporary Directory
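As a hedged illustration of the documented setting above (the exact config key is assumed here, and the file list is a placeholder), enabling clipping for the newly supported ``fci_l1c_nc`` reader might look like::

    import satpy
    from satpy import Scene

    # Assumed config key; enables clipping of negative radiances for supported readers
    satpy.config.set(readers={"clip_negative_radiances": True})

    # my_fci_files is a placeholder list of FCI L1c file paths
    scn = Scene(filenames=my_fci_files, reader="fci_l1c_nc")
    scn.load(["ir_38"])  # negative 3.8 µm radiances are then clipped by the reader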
3 changes: 3 additions & 0 deletions doc/source/reading.rst
@@ -216,6 +216,9 @@ load the datasets using e.g.::
:meth:`scn.missing_datasets <satpy.scene.Scene.missing_datasets>`
property for any ``DataID`` that could not be loaded.

Available datasets
------------------

To find out what datasets are available from a reader from the files that were
provided to the ``Scene`` use
:meth:`~satpy.scene.Scene.available_dataset_ids`::
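For quick reference alongside the new "Available datasets" heading, a short sketch of the Scene methods it describes (file list and reader name are placeholders)::

    from satpy import Scene

    # my_files is a placeholder list of input files for any supported reader
    scn = Scene(filenames=my_files, reader="seviri_l1b_native")
    print(scn.available_dataset_ids())    # full DataID objects the reader can provide
    print(scn.available_dataset_names())  # just the human-readable dataset names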
16 changes: 13 additions & 3 deletions pyproject.toml
@@ -7,7 +7,7 @@ authors = [
]
dependencies = [
"platformdirs",
"dask[array]>=0.17.1",
"dask[array]>=0.17.1,<2025.1.0",
"donfig",
"numpy>=1.21",
"packaging",
@@ -62,7 +62,7 @@ seviri_l2_bufr = ["eccodes"]
seviri_l2_grib = ["eccodes"]
hsaf_grib = ["pygrib"]
remote_reading = ["fsspec"]
insat_3d = ["xarray-datatree"]
insat_3d = ["xarray>=2024.10.0"]
gms5-vissr_l1b = ["numba"]
# Writers:
cf = ["h5netcdf >= 0.7.3"]
@@ -87,7 +87,7 @@ satpos_from_tle = ["skyfield", "astropy"]
tests = ["behave", "h5py", "netCDF4", "pyhdf", "imageio",
"rasterio", "geoviews", "trollimage", "fsspec", "bottleneck",
"rioxarray", "pytest", "pytest-lazy-fixtures", "defusedxml",
"s3fs", "eccodes", "h5netcdf", "xarray-datatree",
"s3fs", "eccodes", "h5netcdf", "xarray>=2024.10.0",
"skyfield", "ephem", "pint-xarray", "astropy", "dask-image", "python-geotiepoints", "numba"]
dev = ["satpy[doc,tests]"]

@@ -112,6 +112,16 @@ build-backend = "hatchling.build"
[tool.hatch.metadata]
allow-direct-references = true

[tool.hatch.build.targets.sdist]
only-include = [
"satpy",
"doc",
"AUTHORS.md",
"CHANGELOG.md",
"SECURITY.md",
"CITATION",
]

[tool.hatch.build.targets.wheel]
packages = ["satpy"]

108 changes: 108 additions & 0 deletions satpy/composites/lightning.py
@@ -0,0 +1,108 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Copyright (c) 2019 Satpy developers
#
# This file is part of satpy.
#
# satpy is free software: you can redistribute it and/or modify it under the
# terms of the GNU General Public License as published by the Free Software
# Foundation, either version 3 of the License, or (at your option) any later
# version.
#
# satpy is distributed in the hope that it will be useful, but WITHOUT ANY
# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR
# A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with
# satpy. If not, see <http://www.gnu.org/licenses/>.
"""Composite classes for the LI instrument."""

import logging

import numpy as np
import xarray as xr

from satpy.composites import CompositeBase

LOG = logging.getLogger(__name__)


class LightningTimeCompositor(CompositeBase):
    """Compositor that creates the flash_age composite, useful for lightning event visualisation.

    The input data are the dates of lightning events, normalised to the range [0, 1].
    The value 1 corresponds to the most recent lightning event and the value 0 corresponds
    to the most recent lightning event minus time_range. The time_range is defined in
    satpy/etc/composites/li.yaml and is given in minutes.
    """

    def __init__(self, name, prerequisites=None, optional_prerequisites=None, **kwargs):
        """Initialise the class."""
        super().__init__(name, prerequisites, optional_prerequisites, **kwargs)
        # Get the time_range, which is in minutes
        self.time_range = self.attrs["time_range"]
        self.standard_name = self.attrs["standard_name"]
        self.reference_time_attr = self.attrs["reference_time"]

    def _normalize_time(self, data: xr.DataArray, attrs: dict) -> xr.DataArray:
        """Normalize the time to the range [end_time - time_range, end_time].

        The normalised data lie between 0 and 1, where 0 corresponds to the date
        end_time - time_range and 1 to end_time. Here end_time is the most recent
        lightning event and time_range is the period of time over which events are
        shown. Dates earlier than end_time - time_range are removed.

        Args:
            data (xr.DataArray): data containing the dates to be normalised
            attrs (dict): attributes suited to the flash_age composite

        Returns:
            xr.DataArray: normalised time
        """
        # Compute the maximum time value
        end_time = np.array(np.datetime64(data.attrs[self.reference_time_attr]))
        # Compute the minimum time value based on the time range
        begin_time = end_time - np.timedelta64(self.time_range, "m")
        # Drop values that are below begin_time
        condition_time = data >= begin_time
        condition_time_computed = condition_time.compute()
        data = data.where(condition_time_computed, drop=True)
        # Exit if data is empty after filtering
        if data.size == 0:
            LOG.error(f"All the flash_age events happened before {begin_time}")
            raise ValueError(f"Invalid data: data size is zero. All flash_age "
                             f"events occurred before the specified start time ({begin_time})."
                             )
        # Normalize the time values
        normalized_data = (data - begin_time) / (end_time - begin_time)
        # Ensure the result is still an xarray.DataArray
        return xr.DataArray(normalized_data, dims=data.dims, coords=data.coords, attrs=attrs)

    @staticmethod
    def _update_missing_metadata(existing_attrs, new_attrs):
        """Fill in attributes missing from existing_attrs with values from new_attrs."""
        for key, val in new_attrs.items():
            if key not in existing_attrs and val is not None:
                existing_attrs[key] = val

    def _redefine_metadata(self, attrs: dict) -> dict:
        """Modify the standard_name and name metadata.

        Args:
            attrs (dict): the data's attributes

        Returns:
            dict: updated attributes
        """
        attrs["name"] = self.standard_name
        attrs["standard_name"] = self.standard_name
        # Attributes that describe the value range
        return attrs

    def __call__(self, projectables, nonprojectables=None, **attrs):
        """Normalise the dates."""
        data = projectables[0]
        new_attrs = data.attrs.copy()
        self._update_missing_metadata(new_attrs, attrs)
        new_attrs = self._redefine_metadata(new_attrs)
        return self._normalize_time(data, new_attrs)
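As a rough usage sketch (not part of the committed file): the compositor is meant to back the ``flash_age`` composite configured in satpy/etc/composites/li.yaml, so loading it from LI L2 flash data might look like the following (the reader name and file list are assumptions)::

    from satpy import Scene

    # li_files is a placeholder list of MTG LI L2 flash files; "li_l2_nc" is the assumed reader name
    scn = Scene(filenames=li_files, reader="li_l2_nc")
    scn.load(["flash_age"])
    flash_age = scn["flash_age"]  # times normalised to [0, 1]: 0 = end_time - time_range, 1 = end_time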
