[MNT] numpy 2 compatibility #1594

Closed
fkiraly opened this issue Aug 22, 2024 · 1 comment · Fixed by #1599
Labels
maintenance (Continuous integration, unit testing & package distribution)

Comments

fkiraly (Collaborator) commented Aug 22, 2024

Currently, the package is incompatible with numpy 2, although the dependency bounds do not exclude it.

If numpy 2 is installed, the failure traceback is:

pytorch_forecasting/__init__.py:31: in <module>
    from pytorch_forecasting.models import (
pytorch_forecasting/models/__init__.py:4: in <module>
    from pytorch_forecasting.models.base_model import (
pytorch_forecasting/models/base_model.py:20: in <module>
    from numpy.lib.function_base import iterable
E   ModuleNotFoundError: No module named 'numpy.lib.function_base'
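
For context, `numpy.lib.function_base` is a private module path that is no longer importable in numpy 2. One possible fix (a sketch only, not necessarily the one adopted in #1599) is to switch to the public `np.iterable`, which exists in both numpy 1.x and 2.x:

```python
# Sketch of a numpy-2-compatible replacement for the failing import.
# Old (breaks on numpy 2, which removed the private module path):
#     from numpy.lib.function_base import iterable
# New: use the public API instead.
import numpy as np

def _is_iterable(obj) -> bool:
    """Return True if obj can be iterated over (helper name is hypothetical)."""
    return bool(np.iterable(obj))
```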
fkiraly added the maintenance label (Continuous integration, unit testing & package distribution) on Aug 22, 2024
fkiraly added a commit that referenced this issue Aug 25, 2024
…ests`, MacOS MPS

Fixes #1594, fixes #1595, fixes #1596

Added or moved some dependencies to the core dependency set.

Fixed some `numpy2` and `optuna-integrations` problems.

`requests` replaced by `urllib.request.urlretrieve`.
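
As an illustration of the `requests` to `urllib.request.urlretrieve` swap mentioned in the commit message (a sketch only, not the code actually merged), a download helper can be written with the standard library alone:

```python
# Minimal sketch: download a file without the third-party `requests` dependency.
# The function name and example URL are illustrative, not from the repository.
from urllib.request import urlretrieve

def download_file(url: str, target: str) -> str:
    """Download url to the local path target and return that path."""
    path, _headers = urlretrieve(url, target)
    return path

# Hypothetical usage:
# download_file("https://example.com/data.parquet", "data.parquet")
```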
fkiraly (Collaborator, Author) commented Aug 25, 2024

Another failure, this time only on macos-13:


Epoch 0:   0%|          | 0/33 [00:01<?, ?it/s]
________________________ test_prediction_with_dataframe ________________________

model = TemporalFusionTransformer(
  	"attention_head_size":               1
  	"categorical_groups":                {}
  	"ca...4,), eps=1e-05, elementwise_affine=True)
    )
  )
  (output_layer): Linear(in_features=4, out_features=1, bias=True)
)
data_with_covariates =          agency     sku    volume  ...     weight  time_idx  target
0     Agency_22  SKU_01   52.2720  ...   8.229938 ...1.963758        59   1.000
6772  Agency_22  SKU_04   72.0153  ...   9.486183        59   1.000

[180 rows x 31 columns]

    def test_prediction_with_dataframe(model, data_with_covariates):
>       model.predict(data_with_covariates, fast_dev_run=True)

tests/test_models/test_temporal_fusion_transformer.py:372: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pytorch_forecasting/models/base_model.py:1432: in predict
    trainer.predict(self, dataloader)
../../../hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/lightning/pytorch/trainer/trainer.py:858: in predict
    return call._call_and_handle_interrupt(
../../../hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/lightning/pytorch/trainer/call.py:47: in _call_and_handle_interrupt
    return trainer_fn(*args, **kwargs)
../../../hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/lightning/pytorch/trainer/trainer.py:897: in _predict_impl
    results = self._run(model, ckpt_path=ckpt_path)
../../../hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/lightning/pytorch/trainer/trainer.py:981: in _run
    results = self._run_stage()
../../../hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/lightning/pytorch/trainer/trainer.py:1020: in _run_stage
    return self.predict_loop.run()
../../../hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/lightning/pytorch/loops/utilities.py:178: in _decorator
    return loop_run(self, *args, **kwargs)
../../../hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/lightning/pytorch/loops/prediction_loop.py:121: in run
    batch, batch_idx, dataloader_idx = next(data_fetcher)
../../../hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/lightning/pytorch/loops/fetchers.py:133: in __next__
    batch = super().__next__()
../../../hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/lightning/pytorch/loops/fetchers.py:60: in __next__
    batch = next(self.iterator)
../../../hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/lightning/pytorch/utilities/combined_loader.py:341: in __next__
    out = next(self._iterator)
../../../hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/lightning/pytorch/utilities/combined_loader.py:142: in __next__
    out = next(self.iterators[0])
../../../hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/torch/utils/data/dataloader.py:631: in __next__
    data = self._next_data()
../../../hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/torch/utils/data/dataloader.py:675: in _next_data
    data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
../../../hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py:51: in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
../../../hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py:51: in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
pytorch_forecasting/data/timeseries.py:1538: in __getitem__
    decoder_length = self.calculate_decoder_length(time[-1], sequence_length)
pytorch_forecasting/data/timeseries.py:1462: in calculate_decoder_length
    time_last - (self.min_prediction_idx - 1),
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = tensor(59), dtype = None

    def __array__(self, dtype=None):
        if has_torch_function_unary(self):
            return handle_torch_function(Tensor.__array__, (self,), self, dtype=dtype)
        if dtype is None:
>           return self.numpy()
E           RuntimeError: Numpy is not available
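
For reference, torch raises `RuntimeError: Numpy is not available` when a tensor's implicit `__array__` conversion calls `self.numpy()` but the installed numpy cannot be used by that torch build (typically torch wheels built against numpy 1 running alongside numpy 2). One way to sidestep the implicit conversion in code like `calculate_decoder_length` is to extract a plain Python scalar before the mixed arithmetic; the sketch below is illustrative, not the fix adopted in #1599:

```python
# Sketch: avoid the implicit tensor -> numpy conversion seen in the traceback.
# The helper and its body are illustrative, not the code from timeseries.py.
import torch

def decoder_length_from(time_last, sequence_length: int, min_prediction_idx: int) -> int:
    if isinstance(time_last, torch.Tensor):
        # .item() yields a plain Python int, so nothing downstream ever triggers
        # Tensor.__array__ / self.numpy(), which is what raised the RuntimeError.
        time_last = int(time_last.item())
    return min(time_last - (min_prediction_idx - 1), sequence_length)
```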

fkiraly reopened this on Aug 25, 2024
fkiraly added a commit that referenced this issue Aug 25, 2024
This PR rewrites the GHA test workflow to be based on `pip`.

The idea is to make it more maintainable by a larger number of contributors, by simplifying the GHA workflow, and making it uniform across packages.

Also makes small improvements:

* updates GHA action versions
* adds concurrency management flag
* adds `numpy<2` dependency, see #1594