Bump pandas to 0.25.0; test updates #1448

Merged: 15 commits, Apr 28, 2022
2 changes: 1 addition & 1 deletion benchmarks/asv.conf.json
@@ -116,7 +116,7 @@
{
"python": "3.6",
"numpy": "1.16.0",
-"pandas": "0.22.0",
+"pandas": "0.25.0",
"scipy": "1.2.0",
// Note: these don't have a minimum in setup.py
"h5py": "2.10.0",
2 changes: 1 addition & 1 deletion ci/requirements-py36-min.yml
@@ -16,7 +16,7 @@ dependencies:
- dataclasses
- h5py==3.1.0
- numpy==1.16.0
-- pandas==0.22.0
+- pandas==0.25.0
- scipy==1.2.0
- pytest-rerunfailures # conda version is >3.6
- pytest-remotedata # conda package is 0.3.0, needs > 0.3.1
2 changes: 1 addition & 1 deletion ci/requirements-py36.yml
@@ -11,7 +11,7 @@ dependencies:
- nose
- numba
- numpy >= 1.16.0
-- pandas >= 0.22.0
+- pandas >= 0.25.0
- pip
- pytest
- pytest-cov
2 changes: 1 addition & 1 deletion ci/requirements-py37.yml
@@ -11,7 +11,7 @@ dependencies:
- nose
- numba
- numpy >= 1.16.0
-- pandas >= 0.22.0
+- pandas >= 0.25.0
- pip
- pytest
- pytest-cov
2 changes: 1 addition & 1 deletion ci/requirements-py38.yml
@@ -11,7 +11,7 @@ dependencies:
- nose
- numba
- numpy >= 1.16.0
-- pandas >= 0.22.0
+- pandas >= 0.25.0
- pip
- pytest
- pytest-cov
2 changes: 1 addition & 1 deletion ci/requirements-py39.yml
@@ -11,7 +11,7 @@ dependencies:
- nose
# - numba # python 3.9 compat in early 2021
- numpy >= 1.16.0
-- pandas >= 0.22.0
+- pandas >= 0.25.0
- pip
- pytest
- pytest-cov
10 changes: 9 additions & 1 deletion docs/sphinx/source/whatsnew/v0.9.2.rst
@@ -13,7 +13,13 @@ Bug fixes
~~~~~~~~~
* :py:func:`pvlib.irradiance.get_total_irradiance` and
:py:func:`pvlib.solarposition.spa_python` now raise an error instead
-  of silently ignoring unknown parameters (:ghpull:`1437`)
+  of silently ignoring unknown parameters (:pull:`1437`)
+* Fix a bug in :py:func:`pvlib.solarposition.sun_rise_set_transit_ephem`
+  where passing localized timezones with large UTC offsets could return
+  rise/set/transit times for the wrong day in recent versions of ``ephem``
+  (:issue:`1449`, :pull:`1448`)


Testing
~~~~~~~

@@ -23,8 +29,10 @@ Documentation
Benchmarking
~~~~~~~~~~~~~
* Updated version of numba in asv.conf from 0.36.1 to 0.40.0 to solve numba/numpy conflict. (:issue:`1439`, :pull:`1440`)

Requirements
~~~~~~~~~~~~
* Minimum pandas version increased to v0.25.0, released July 18, 2019. (:pull:`1448`)

Contributors
~~~~~~~~~~~~
18 changes: 15 additions & 3 deletions pvlib/iotools/crn.py
@@ -2,6 +2,7 @@
"""

import pandas as pd
+import numpy as np


HEADERS = [
@@ -107,13 +108,24 @@ def read_crn(filename, map_variables=True):
"""

# read in data
+# TODO: instead of parsing as strings and then post-processing, switch to
+# pd.read_fwf(..., dtype=dict(zip(HEADERS, DTYPES)), skip_blank_lines=True)
+# when our minimum pandas >= 1.2.0 (skip_blank_lines bug for <1.2.0).
+# As a workaround, parse all values as strings, then drop NaN, then cast
+# to the appropriate dtypes, and mask "sentinel" NaN (e.g. -9999.0)
data = pd.read_fwf(filename, header=None, names=HEADERS, widths=WIDTHS,
-                   na_values=NAN_DICT)
+                   dtype=str)

-# Remove rows with all nans
+# drop empty (bad) lines
data = data.dropna(axis=0, how='all')

-# set dtypes here because dtype kwarg not supported in read_fwf until 0.20
+# can't set dtypes in read_fwf because int cols can't contain NaN, so
+# do it here instead
data = data.astype(dict(zip(HEADERS, DTYPES)))

+# finally, replace -999 values with NaN
+data = data.replace(NAN_DICT, value=np.nan)
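The parse-as-strings / drop / cast / mask pipeline in this hunk can be sketched on a toy fixed-width file. The column names, widths, and sentinel below are hypothetical stand-ins for the real CRN layout:

```python
import io

import numpy as np
import pandas as pd

# hypothetical two-column fixed-width file; -9999 is a missing-data sentinel
raw = "23907 -9999\n23907  12.5\n"
headers = ["WBANNO", "T_CALC"]
dtypes = [int, float]

# 1) parse everything as strings so pandas does no dtype inference
data = pd.read_fwf(io.StringIO(raw), header=None, names=headers,
                   widths=[6, 6], dtype=str)
# 2) drop fully blank (bad) lines
data = data.dropna(axis=0, how="all")
# 3) cast to the target dtypes now that the bad lines are gone
data = data.astype(dict(zip(headers, dtypes)))
# 4) mask the sentinel values as NaN
data = data.replace(-9999.0, np.nan)
```

Because the cast happens before masking, the integer station column keeps an integer dtype while the float column gets real NaN where the sentinel appeared.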
Comment from Member Author:
I think this reworking of read_crn is more correct in two ways:

  • It doesn't add .0 to string columns that happen to only contain numeric characters.
  • It only drops all-blank rows, not rows containing all -999 values. Not sure the latter would ever happen in reality, but if it does then it will return a row of NaN instead of dropping it altogether.

Technically this change does affect the returned values (e.g. '3' vs '3.0'); does it warrant a what's new entry?

Comment from Member:

> Technically this change does affect the returned values (e.g. '3' vs '3.0'); does it warrant a what's new entry?

I don't think so
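The '.0' artifact discussed above is easy to reproduce: when a string-typed column also contains a sentinel, type inference parses the surviving values as floats, and a later cast back to str bakes in the '.0'. A minimal sketch (not the actual CRN reader, just the mechanism):

```python
import io

import pandas as pd

raw = "3\n-9999\n"

# old behavior: na_values plus inference makes the column float64,
# so the version string '3' round-trips to '3.0'
old = pd.read_fwf(io.StringIO(raw), header=None, names=["CRX_VN"],
                  widths=[5], na_values=[-9999]).astype({"CRX_VN": str})

# new behavior: dtype=str keeps the characters exactly as written
new = pd.read_fwf(io.StringIO(raw), header=None, names=["CRX_VN"],
                  widths=[5], dtype=str)

assert old["CRX_VN"].iloc[0] == "3.0"  # the unwanted '.0'
assert new["CRX_VN"].iloc[0] == "3"
```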


# set index
# UTC_TIME does not have leading 0s, so must zfill(4) to comply
# with %H%M format
8 changes: 5 additions & 3 deletions pvlib/solarposition.py
@@ -22,6 +22,7 @@
import pandas as pd
import scipy.optimize as so
import warnings
+import datetime

from pvlib import atmosphere
from pvlib.tools import datetime_to_djd, djd_to_datetime
@@ -574,9 +575,10 @@ def sun_rise_set_transit_ephem(times, latitude, longitude,
trans = []
for thetime in times:
thetime = thetime.to_pydatetime()
-# pyephem drops timezone when converting to its internal datetime
-# format, so handle timezone explicitly here
-obs.date = ephem.Date(thetime - thetime.utcoffset())
+# older versions of pyephem ignore timezone when converting to its
+# internal datetime format, so convert to UTC here to support
+# all versions. GH #1449
+obs.date = ephem.Date(thetime.astimezone(datetime.timezone.utc))
sunrise.append(_ephem_to_timezone(rising(sun), tzinfo))
sunset.append(_ephem_to_timezone(setting(sun), tzinfo))
trans.append(_ephem_to_timezone(transit(sun), tzinfo))
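The difference between the two conversions only matters once ``ephem`` itself became timezone-aware: the old code shifted the wall clock but left the original tzinfo attached, so a consumer that honors tzinfo subtracts the offset a second time. A minimal sketch with a hypothetical UTC+13 zone, using only the stdlib:

```python
import datetime

tz = datetime.timezone(datetime.timedelta(hours=13))
local = datetime.datetime(2022, 1, 1, 0, 30, tzinfo=tz)

# old approach: correct UTC wall-clock reading, but still labeled UTC+13
shifted = local - local.utcoffset()
# new approach: an unambiguous UTC timestamp
utc = local.astimezone(datetime.timezone.utc)

# same clock reading...
assert shifted.replace(tzinfo=None) == utc.replace(tzinfo=None)
# ...but as aware datetimes they denote instants 13 hours apart, which
# is how a tz-aware consumer ends up on the wrong day
assert utc - shifted == local.utcoffset()
```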
2 changes: 1 addition & 1 deletion pvlib/tests/iotools/test_crn.py
@@ -83,7 +83,7 @@ def test_read_crn_problems(testfile_problems, columns_mapped, dtypes):
'2020-07-06 13:10:00'],
freq=None).tz_localize('UTC')
values = np.array([
-[92821, 20200706, 1200, 20200706, 700, '3.0', -80.69, 28.62, 24.9,
+[92821, 20200706, 1200, 20200706, 700, '3', -80.69, 28.62, 24.9,
Comment from Member Author:
I think '3' is the correct value here, see previous discussion #1368 (comment)

Comment from Member:
mystery solved, thanks!

0.0, np.nan, 0, 25.5, 'C', 0, 93.0, 0, nan, nan, 990, 0, 1.57, 0],
[92821, 20200706, 1310, 20200706, 810, '2.623', -80.69, 28.62,
26.9, 0.0, 430.0, 0, 30.2, 'C', 0, 87.0, 0, nan, nan, 989, 0,
2 changes: 1 addition & 1 deletion pvlib/tests/iotools/test_psm3.py
@@ -170,7 +170,7 @@ def test_read_psm3_map_variables():
data, metadata = psm3.read_psm3(MANUAL_TEST_DATA, map_variables=True)
columns_mapped = ['Year', 'Month', 'Day', 'Hour', 'Minute', 'dhi', 'dni',
'ghi', 'dhi_clear', 'dni_clear', 'ghi_clear',
-'Cloud Type', 'Dew Point', 'apparent_zenith',
+'Cloud Type', 'Dew Point', 'solar_zenith',
'Fill Flag', 'albedo', 'wind_speed',
'precipitable_water', 'wind_direction',
'relative_humidity', 'temp_air', 'pressure']
19 changes: 9 additions & 10 deletions pvlib/tests/test_conftest.py
@@ -52,22 +52,21 @@ def test_use_fixture_with_decorator(some_data):
'assert_frame_equal'])
@pytest.mark.parametrize('pd_version', ['1.0.0', '1.1.0'])
@pytest.mark.parametrize('check_less_precise', [True, False])
-def test__check_pandas_assert_kwargs(mocker, monkeypatch,
-                                     function_name, pd_version,
+def test__check_pandas_assert_kwargs(mocker, function_name, pd_version,
                                      check_less_precise):
# test that conftest._check_pandas_assert_kwargs returns appropriate
# kwargs for the assert_x_equal functions

-# patch the pandas assert; not interested in actually calling them:
-def patched_assert(*args, **kwargs):
-    pass
+# NOTE: be careful about mixing mocker.patch and pytest.MonkeyPatch!
+# they do not coordinate their cleanups, so it is safest to only
+# use one or the other. GH #1447

-monkeypatch.setattr(pandas.testing, function_name, patched_assert)
-# then attach a spy to it so we can see what args it is called with:
-mocked_function = mocker.spy(pandas.testing, function_name)
+# patch the pandas assert; not interested in actually calling them,
+# plus we want to spy on how they get called.
+spy = mocker.patch('pandas.testing.' + function_name)
# patch pd.__version__ to exercise the two branches in
# conftest._check_pandas_assert_kwargs
-monkeypatch.setattr(pandas, '__version__', pd_version)
+mocker.patch('pandas.__version__', new=pd_version)

# finally, run the function and check what args got passed to pandas:
assert_function = getattr(conftest, function_name)
@@ -79,4 +78,4 @@ def patched_assert(*args, **kwargs):
else:
expected_kwargs = {'check_less_precise': check_less_precise}

-mocked_function.assert_called_with(*args, **expected_kwargs)
+spy.assert_called_once_with(*args, **expected_kwargs)
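The consolidation works because a single mock both replaces the target and records its calls, so the old monkeypatch.setattr-plus-mocker.spy pair is redundant. A sketch using stdlib ``unittest.mock`` (which pytest-mock's ``mocker.patch`` wraps); the call arguments are arbitrary placeholders:

```python
from unittest import mock

import pandas

# one patch both stubs the assert (it never actually compares anything)
# and records the arguments it was called with:
with mock.patch("pandas.testing.assert_frame_equal") as stub:
    pandas.testing.assert_frame_equal("left", "right", check_names=False)
    stub.assert_called_once_with("left", "right", check_names=False)

# plain attributes like pandas.__version__ can be patched the same way,
# with cleanup handled by the same context manager / fixture:
with mock.patch("pandas.__version__", new="1.1.0"):
    patched_version = pandas.__version__

assert patched_version == "1.1.0"
```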
3 changes: 3 additions & 0 deletions pvlib/tracking.py
@@ -510,6 +510,9 @@ def singleaxis(apparent_zenith, apparent_azimuth,

# Calculate surface_tilt
dotproduct = (panel_norm_earth * projected_normal).sum(axis=1)
+# for edge cases like axis_tilt=90, numpy's SIMD can produce values like
+# dotproduct = (1 + 2e-16). Clip off the excess so that arccos works:
+dotproduct = np.clip(dotproduct, -1, 1)
surface_tilt = 90 - np.degrees(np.arccos(dotproduct))

# Bundle DataFrame for return values and filter for sun below horizon.
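The clip is the standard guard for round-off on arccos arguments. A quick numpy demonstration of the failure mode, using the 1 + 2e-16 value from the comment above (not tied to any particular tracker input):

```python
import numpy as np

d = 1 + 2e-16  # a "unit" dot product nudged just past 1 by round-off

with np.errstate(invalid="ignore"):
    bad = np.arccos(d)               # nan: arccos is undefined outside [-1, 1]
good = np.arccos(np.clip(d, -1, 1))  # clipping restores a valid argument

assert np.isnan(bad)
assert 90 - np.degrees(good) == 90.0  # the surface_tilt formula stays finite
```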
2 changes: 1 addition & 1 deletion setup.py
@@ -39,7 +39,7 @@
URL = 'https://github.com/pvlib/pvlib-python'

INSTALL_REQUIRES = ['numpy >= 1.16.0',
-'pandas >= 0.22.0',
+'pandas >= 0.25.0',
'pytz',
'requests',
'scipy >= 1.2.0',