🐛 Bug Report
Currently, `DeepARModel` predicts different values on each forecast call. This does not seem to be a general problem for neural nets: `TFTModel` predicts the same values each time.

We can probably make the forecast deterministic if we fix the `n_samples` parameter for prediction in the init of `DeepARModel`. For deterministic behaviour `n_samples` should be set to `None`.
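A minimal sketch of the proposed deterministic setup, assuming `n_samples` becomes an `__init__` argument of `DeepARModel` that is forwarded to the underlying prediction call; the import path and the other arguments shown are illustrative assumptions, not the fixed API:

```python
from etna.models.nn import DeepARModel  # assumed import path

# Hypothetical usage of the proposed parameter: `n_samples=None` would switch
# the model to a point prediction instead of sampling, so repeated forecasts
# on the same data return identical values.
model = DeepARModel(
    max_epochs=5,     # illustrative training setting
    n_samples=None,   # proposed: None -> deterministic forecast
)
```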
Expected behavior
Each time `DeepARModel` predicts the same values.

After fixing you should:
- Init `DeepARModel` in `test_inference` as deterministic (see the sketch after this list).
- Remove torch random initialization from `test_inference._test_forecast_out_sample_prefix`.
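A rough sketch of how the determinism check in `test_inference` might look, assuming the objects prepared by the existing fixtures (a fitted `model` and a `TSDataset` named `ts`) and the usual `ts.make_future(horizon)` / `model.forecast(future_ts)` flow; the helper name and horizon are made up for illustration:

```python
import pandas as pd

def assert_forecast_is_deterministic(model, ts, horizon=7):
    """Forecast twice on identical futures and check the results match exactly."""
    first = model.forecast(ts.make_future(horizon))
    second = model.forecast(ts.make_future(horizon))
    # With a deterministic DeepARModel these frames should be equal.
    pd.testing.assert_frame_equal(first.to_pandas(), second.to_pandas())
```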
How To Reproduce
Environment
No response
Additional context
No response
Checklist