climex.ipynb, eccc-geoapi-climate-stations.ipynb: fix Jenkins failure #276
Conversation
Probably because PR #268 has not been refreshed using current Jupyter env. ``` _________ pavics-sdi-master/docs/source/notebooks/climex.ipynb::Cell 1 _________ Notebook cell execution failed Cell 1: Cell outputs differ Input: # Opening the link takes a while, because the server creates an aggregated view of 435,000 individual files. url = cat.df.path[0] ds = xr.open_dataset(url, chunks=dict(realization=2, time=30 * 3)) ds Traceback: mismatch 'text/plain' assert reference_output == test_output failed: '<xarray.Data... RCM' == '<xarray.Data...N.string1: 1' Skipping 599 identical leading characters in diff, use -v to show - ted_pole |S64 ... + ted_pole (time) |S64 dask.array<chunksize=(90,), meta=np.ndarray> tasmin (realization, time, rlat, rlon) float32 dask.array<chunksize=(2, 90, 280, 280), meta=np.ndarray> tasmax (realization, time, rlat, rlon) float32 dask.array<chunksize=(2, 90, 280, 280), meta=np.ndarray> tas (realization, time, rlat, rlon) float32 dask.array<chunksize=(2, 90, 280, 280), meta=np.ndarray> pr (realization, time, rlat, rlon) float32 dask.array<chunksize=(2, 90, 280, 280), meta=np.ndarray> prsn (realization, time, rlat, rlon) float32 dask.array<chunksize=(2, 90, 280, 280), meta=np.ndarray> - Attributes: (12/30) ? ^^ + Attributes: (12/29) ? ^^ - Conventions: CF-1.6 ? --- + Conventions: CF-1.6 - DODS.dimName: string1 ? --- + DODS.dimName: string1 - DODS.strlen: 0 ? ^^^^ + DODS.strlen: 1 ? ^ - EXTRA_DIMENSION.bnds: 2 ? --- + EXTRA_DIMENSION.bnds: 2 - NCO: "4.5.2" ? --- + NCO: "4.5.2" - abstract: The ClimEx CRCM5 Large Ensemble of high-resolut... ? --- + abstract: The ClimEx CRCM5 Large Ensemble of high-resolution... ? +++ - ... ... ? --- + ... ... + product: output - project_id: CLIMEX ? --- + project_id: CLIMEX - rcm_version_id: v3331 ? --- + rcm_version_id: v3331 - terms_of_use: http://www.climex-project.org/sites/default/fil... ? --- + terms_of_use: http://www.climex-project.org/sites/default/files/... ? 
+++ - title: The ClimEx CRCM5 Large Ensemble ? --- + title: The ClimEx CRCM5 Large Ensemble - type: RCM ? --- - + type: RCM - EXTRA_DIMENSION.string1: 1 _________ pavics-sdi-master/docs/source/notebooks/climex.ipynb::Cell 8 _________ Notebook cell execution failed Cell 8: Cell outputs differ Input: # Subset over the Montreal gridpoint ds = xr.open_dataset(url, chunks=dict(realization=2, time=-1, rlon=25, rlat=25)) pt = subset_gridpoint(ds, lon=-73.69, lat=45.50) print("Input dataset for Montreal :") display(pt) out = xclim.atmos.max_1day_precipitation_amount(pr=pt.pr, freq="YS") print("Maximim 1-day precipitation `lazy` output ..") out Traceback: mismatch 'text/plain' assert reference_output == test_output failed: '<xarray.Data... RCM' == '<xarray.Data...N.string1: 1' Skipping 372 identical leading characters in diff, use -v to show - ted_pole |S64 b'' + ted_pole (time) |S64 dask.array<chunksize=(52924,), meta=np.ndarray> tasmin (realization, time) float32 dask.array<chunksize=(2, 52924), meta=np.ndarray> tasmax (realization, time) float32 dask.array<chunksize=(2, 52924), meta=np.ndarray> tas (realization, time) float32 dask.array<chunksize=(2, 52924), meta=np.ndarray> pr (realization, time) float32 dask.array<chunksize=(2, 52924), meta=np.ndarray> prsn (realization, time) float32 dask.array<chunksize=(2, 52924), meta=np.ndarray> - Attributes: (12/30) ? ^^ + Attributes: (12/29) ? ^^ - Conventions: CF-1.6 ? --- + Conventions: CF-1.6 - DODS.dimName: string1 ? --- + DODS.dimName: string1 - DODS.strlen: 0 ? ^^^^ + DODS.strlen: 1 ? ^ - EXTRA_DIMENSION.bnds: 2 ? --- + EXTRA_DIMENSION.bnds: 2 - NCO: "4.5.2" ? --- + NCO: "4.5.2" - abstract: The ClimEx CRCM5 Large Ensemble of high-resolut... ? --- + abstract: The ClimEx CRCM5 Large Ensemble of high-resolution... ? +++ - ... ... ? --- + ... ... + product: output - project_id: CLIMEX ? --- + project_id: CLIMEX - rcm_version_id: v3331 ? 
--- + rcm_version_id: v3331 - terms_of_use: http://www.climex-project.org/sites/default/fil... ? --- + terms_of_use: http://www.climex-project.org/sites/default/files/... ? +++ - title: The ClimEx CRCM5 Large Ensemble ? --- + title: The ClimEx CRCM5 Large Ensemble - type: RCM ? --- - + type: RCM - EXTRA_DIMENSION.string1: 1 _ pavics-sdi-master/docs/source/notebooks/eccc-geoapi-climate-stations.ipynb::Cell 3 _ Notebook cell execution failed Cell 3: Cell outputs differ Input: import pandas as pd # Create a datetime.Timedelta object from the subtraction of two dates. delta = pd.to_datetime(stations["DLY_LAST_DATE"]) - pd.to_datetime( stations["DLY_FIRST_DATE"] ) # Get the number of days in the time delta stations["n_days"] = delta.apply(lambda x: x.days) # Compute condition over_50 = stations["n_days"] > 50 * 365.25 # Index the data frame using the condition select = stations[over_50] select.head() Traceback: mismatch 'text/plain' assert reference_output == test_output failed: ' id ...x 35 columns]' == ' id ...x 35 columns]' - id STN_ID STATION_NAME PROV_STATE_TERR_CODE ENG_PROV_NAME \ + id STN_ID STATION_NAME PROV_STATE_TERR_CODE ENG_PROV_NAME \ ? +++ - 2 8203400 6399 MALAY FALLS NS NOVA SCOTIA + 2 8203400 6399 MALAY FALLS NS NOVA SCOTIA ? +++ - 18 8205090 6465 SHEARWATER A NS NOVA SCOTIA - 19 8205698 6485 SYDNEY NS NOVA SCOTIA ? ^ + 18 8205698 6485 SYDNEY NS NOVA SCOTIA ? ^ +++ - 23 8206300 6506 WHITEHEAD NS NOVA SCOTIA ? ^ + 22 8206300 6506 WHITEHEAD NS NOVA SCOTIA ? ^ +++ - 24 8206440 6513 WOLFVILLE NS NOVA SCOTIA ? ^ + 23 8206440 6513 WOLFVILLE NS NOVA SCOTIA ? ^ +++ + 33 8200100 6289 ANNAPOLIS ROYAL NS NOVA SCOTIA FRE_PROV_NAME COUNTRY LATITUDE LONGITUDE TIMEZONE ... \ 2 NOUVELLE-��COSSE CAN 445900000 -622900000 AST ... - 18 NOUVELLE-��COSSE CAN 443800000 -633000000 AST ... - 19 NOUVELLE-��COSSE CAN 460900000 -601200000 AST ... ? ^ + 18 NOUVELLE-��COSSE CAN 460900000 -601200000 AST ... ? ^ - 23 NOUVELLE-��COSSE CAN 451300000 -611100000 AST ... ? 
^ + 22 NOUVELLE-��COSSE CAN 451300000 -611100000 AST ... ? ^ - 24 NOUVELLE-��COSSE CAN 450600000 -642200000 AST ... ? ^ + 23 NOUVELLE-��COSSE CAN 450600000 -642200000 AST ... ? ^ + 33 NOUVELLE-��COSSE CAN 444500000 -653100000 AST ... HLY_LAST_DATE DLY_FIRST_DATE DLY_LAST_DATE MLY_FIRST_DATE \ 2 /DATE/TIME/ /DATE/ /DATE/ /DATE/ - 18/DATE/TIME/ /DATE/ /DATE/ /DATE/ - 19 NaT /DATE/ /DATE/ /DATE/ ? ^ + 18 NaT /DATE/ /DATE/ /DATE/ ? ^ + 22 NaT /DATE/ /DATE/ /DATE/ 23 NaT /DATE/ /DATE/ /DATE/ - 24 NaT /DATE/ /DATE/ /DATE/ ? ^^ + 33 NaT /DATE/ /DATE/ /DATE/ ? ^^ MLY_LAST_DATE HAS_MONTHLY_SUMMARY HAS_NORMALS_DATA HAS_HOURLY_DATA \ 2 /DATE/ Y N N - 18 /DATE/ Y Y Y - 19 /DATE/ Y N N ? ^ + 18 /DATE/ Y N N ? ^ + 22 /DATE/ Y N N 23 /DATE/ Y N N - 24 /DATE/ Y N N ? ^^ + 33 /DATE/ Y N N ? ^^ geometry n_days 2 POINT (-62.48333 44.98333) 18474 - 18 POINT (-63.50000 44.63333) 23325 - 19 POINT (-60.20000 46.15000) 26021 ? ^ + 18 POINT (-60.20000 46.15000) 26021 ? ^ - 23 POINT (-61.18333 45.21667) 27970 ? ^ + 22 POINT (-61.18333 45.21667) 27970 ? ^ - 24 POINT (-64.36667 45.10000) 28883 ? ^ + 23 POINT (-64.36667 45.10000) 28883 ? ^ + 33 POINT (-65.51667 44.75000) 34032 [5 rows x 35 columns] _ pavics-sdi-master/docs/source/notebooks/eccc-geoapi-climate-stations.ipynb::Cell 5 _ Notebook cell execution failed Cell 5: Cell outputs differ Input: # Adjust directory if running this locally. # rect = gpd.read_file("~/Downloads/data.geojson") # Here we're using an existing file so the notebook runs without user interaction. 
rect = gpd.read_file("./data.geojson") # Filter stations DataFrame using bbox inbox = select.within(rect.loc[0].geometry) print("Number of stations within subregion: ", sum(inbox)) sub_select = select[inbox] sub_select.head() Traceback: mismatch 'text/plain' assert reference_output == test_output failed: ' id...x 35 columns]' == ' id...x 35 columns]' Skipping 64 identical leading characters in diff, use -v to show Skipping 51 identical trailing characters in diff, use -v to show _NAME \ - 19 8205698 6485 SYDNEY NS NOVA SCOTIA ? ^ + 18 8205698 6485 SYDNEY NS NOVA SCOTIA ? ^ - 23 8206300 6506 WHITEHEAD NS NOVA SCOTIA ? ^ + 22 8206300 6506 WHITEHEAD NS NOVA SCOTIA ? ^ - 45 8201410 6336 DEMING NS NOVA SCOTIA ? ^ + 43 8201410 6336 DEMING NS NOVA SCOTIA ? ^ - 134 8205600 6481 STILLWATER NS NOVA SCOTIA ? ^ + 133 8205600 6481 STILLWATER NS NOVA SCOTIA ? ^ - 146 8201000 6329 COLLEGEVILLE NS NOVA SCOTIA ? ^ + 144 8201000 6329 COLLEGEVILLE NS NOVA SCOTIA ? ^ FRE_PROV_NAME COUNTRY LATITUDE LONGITUDE TIMEZONE ... \ - 19 NOUVELLE-��COSSE CAN 460900000 -601200000 AST ... ? ^ + 18 NOUVELLE-��COSSE CAN 460900000 -601200000 AST ... ? ^ - 23 NOUVELLE-��COSSE CAN 451300000 -611100000 AST ... ? ^ + 22 NOUVELLE-��COSSE CAN 451300000 -611100000 AST ... ? ^ - 45 NOUVELLE-��COSSE CAN 451259007 -611040090 AST ... ? ^ + 43 NOUVELLE-��COSSE CAN 451259007 -611040090 AST ... ? ^ - 134 NOUVELLE-��COSSE CAN 451100000 -620000000 AST ... ? ^ + 133 NOUVELLE-��COSSE CAN 451100000 -620000000 AST ... ? ^ - 146 NOUVELLE-��COSSE CAN 452900000 -620100000 AST ... ? ^ + 144 NOUVELLE-��COSSE CAN 452900000 -620100000 AST ... ? ^ HLY_LAST_DATE DLY_FIRST_DATE DLY_LAST_DATE MLY_FIRST_DATE MLY_LAST_DATE \ - 19 NaT /DATE/ /DATE/ /DATE/ /DATE/ ? ^ + 18 NaT /DATE/ /DATE/ /DATE/ /DATE/ ? ^ - 23 NaT /DATE/ /DATE/ /DATE/ /DATE/ ? ^ + 22 NaT /DATE/ /DATE/ /DATE/ /DATE/ ? ^ - 45 NaT /DATE/ /DATE/ /DATE/ /DATE/ ? ^ + 43 NaT /DATE/ /DATE/ /DATE/ /DATE/ ? ^ - 134 NaT /DATE/ /DATE/ /DATE/ /DATE/ ? 
^ + 133 NaT /DATE/ /DATE/ /DATE/ /DATE/ ? ^ - 146 NaT /DATE/ /DATE/ /DATE/ /DATE/ ? ^ + 144 NaT /DATE/ /DATE/ /DATE/ /DATE/ ? ^ HAS_MONTHLY_SUMMARY HAS_NORMALS_DATA HAS_HOURLY_DATA \ - 19 Y N N ? ^ + 18 Y N N ? ^ - 23 Y N N ? ^ + 22 Y N N ? ^ - 45 Y Y N ? ^ + 43 Y Y N ? ^ - 134 Y N N ? ^ + 133 Y N N ? ^ - 146 Y Y N ? ^ + 144 Y Y N ? ^ geometry n_days - 19 POINT (-60.20000 46.15000) 26021 ? ^ + 18 POINT (-60.20000 46.15000) 26021 ? ^ - 23 POINT (-61.18333 45.21667) 27970 ? ^ + 22 POINT (-61.18333 45.21667) 27970 ? ^ - 45 POINT (-61.17780 45.21639) 20179 ? ^ + 43 POINT (-61.17780 45.21639) 20179 ? ^ - 134 POINT (-62.00000 45.18333) 23345 ? ^ + 133 POINT (-62.00000 45.18333) 23345 ? ^ - 146 POINT ( ? ^ + 144 POINT ( ? ^ ```
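To make the failure mode above concrete: the notebook tests compare each cell's stored reference output text against the text produced by a fresh execution, so a single server-side metadata change (e.g. `DODS.strlen: 0` becoming `DODS.strlen: 1`, or the attribute count dropping from 30 to 29) fails the whole cell. This is a minimal illustrative sketch of that comparison, not the actual pytest/nbval internals; the strings are made-up stand-ins for the cell outputs quoted in the log.

```python
# Hypothetical minimal illustration of why the cells above fail: the test
# does an exact string comparison between the notebook's stored output and
# the output of a fresh run, then reports an ndiff-style hint of what moved.
import difflib

# Stand-ins for the stored and freshly-executed cell output text.
reference_output = "Attributes: (12/30)\nDODS.strlen: 0\ntype: RCM"
test_output = "Attributes: (12/29)\nDODS.strlen: 1\ntype: RCM"

# The comparison is an exact match, so any upstream metadata drift fails it.
outputs_match = reference_output == test_output

# The report shows per-line -/+/? hints, like the Jenkins log above.
diff = list(difflib.ndiff(reference_output.splitlines(),
                          test_output.splitlines()))
```

Refreshing the notebook outputs against the current environment (as the comment suggests was missing from PR #268) is what brings the stored side back in sync.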
I ran the ECCC notebooks on PAVICS, but I think the catalog may have changed in the meantime. This error is suspect.
Yes, I think this is possibly the culprit still breaking the build, even after my refresh. If I cannot find anything, I will probably have to use
…ince the STN_ID seems changing Changing STN_ID seems to have a cascading effect. Errors on Jenkins re-run: ``` _ pavics-sdi-lvutest/docs/source/notebooks/eccc-geoapi-climate-stations.ipynb::Cell 1 _ Notebook cell execution failed Cell 1: Cell outputs differ Input: resp.content[:100] Traceback: mismatch 'text/plain' assert reference_output == test_output failed: 'b\'{"type": ...390, "STATI\'' == 'b\'{"type": ...298, "STATI\'' Skipping 81 identical leading characters in diff, use -v to show - STN_ID": 6298, "STATI' ? ^ ^ + STN_ID": 6390, "STATI' ? ^ ^ _ pavics-sdi-lvutest/docs/source/notebooks/eccc-geoapi-climate-stations.ipynb::Cell 2 _ Notebook cell execution failed Cell 2: Cell outputs differ Input: # The first approach would look like this: # import json # stations = gpd.GeoDataFrame.from_features(json.loads(resp.content)) stations = gpd.read_file(str(url)) stations.head() Traceback: Missing output fields from running code: {'stderr'} _ pavics-sdi-lvutest/docs/source/notebooks/eccc-geoapi-climate-stations.ipynb::Cell 3 _ Notebook cell execution failed Cell 3: Cell outputs differ Input: import pandas as pd # Create a datetime.Timedelta object from the subtraction of two dates. delta = pd.to_datetime(stations["DLY_LAST_DATE"]) - pd.to_datetime( stations["DLY_FIRST_DATE"] ) # Get the number of days in the time delta stations["n_days"] = delta.apply(lambda x: x.days) # Compute condition over_50 = stations["n_days"] > 50 * 365.25 # Index the data frame using the condition select = stations[over_50] select.head() Traceback: mismatch 'text/plain' assert reference_output == test_output failed: ' id ...x 35 columns]' == ' id ...x 35 columns]' - id STN_ID STATION_NAME PROV_STATE_TERR_CODE ENG_PROV_NAME \ ? --- + id STN_ID STATION_NAME PROV_STATE_TERR_CODE ENG_PROV_NAME \ - 8 8201410 6336 DEMING NS NOVA SCOTIA - 12 8202000 6354 GREENWOOD A NS NOVA SCOTIA - 17 8200100 6289 ANNAPOLIS ROYAL NS NOVA SCOTIA - 19 8203400 6399 MALAY FALLS NS NOVA SCOTIA ? 
^^ --- + 2 8203400 6399 MALAY FALLS NS NOVA SCOTIA ? ^^ + 18 8205090 6465 SHEARWATER A NS NOVA SCOTIA - 35 8205698 6485 SYDNEY NS NOVA SCOTIA ? ^^ --- + 19 8205698 6485 SYDNEY NS NOVA SCOTIA ? ^^ + 23 8206300 6506 WHITEHEAD NS NOVA SCOTIA + 24 8206440 6513 WOLFVILLE NS NOVA SCOTIA FRE_PROV_NAME COUNTRY LATITUDE LONGITUDE TIMEZONE ... \ - 8 NOUVELLE-��COSSE CAN 451259007 -611040090 AST ... - 12 NOUVELLE-��COSSE CAN 445900000 -645500000 AST ... - 17 NOUVELLE-��COSSE CAN 444500000 -653100000 AST ... - 19 NOUVELLE-��COSSE CAN 445900000 -622900000 AST ... ? ^^ + 2 NOUVELLE-��COSSE CAN 445900000 -622900000 AST ... ? ^^ + 18 NOUVELLE-��COSSE CAN 443800000 -633000000 AST ... - 35 NOUVELLE-��COSSE CAN 460900000 -601200000 AST ... ? ^^ + 19 NOUVELLE-��COSSE CAN 460900000 -601200000 AST ... ? ^^ + 23 NOUVELLE-��COSSE CAN 451300000 -611100000 AST ... + 24 NOUVELLE-��COSSE CAN 450600000 -642200000 AST ... HLY_LAST_DATE DLY_FIRST_DATE DLY_LAST_DATE MLY_FIRST_DATE \ - 8 NaT /DATE/ /DATE/ /DATE/ - 12/DATE/TIME/ /DATE/ /DATE/ /DATE/ ? - + 2 /DATE/TIME/ /DATE/ /DATE/ /DATE/ ? + + 18/DATE/TIME/ /DATE/ /DATE/ /DATE/ - 17 NaT /DATE/ /DATE/ /DATE/ ? ^ + 19 NaT /DATE/ /DATE/ /DATE/ ? ^ - 19/DATE/TIME/ /DATE/ /DATE/ /DATE/ - 35 NaT /DATE/ /DATE/ /DATE/ ? - + 23 NaT /DATE/ /DATE/ /DATE/ ? + + 24 NaT /DATE/ /DATE/ /DATE/ MLY_LAST_DATE HAS_MONTHLY_SUMMARY HAS_NORMALS_DATA HAS_HOURLY_DATA \ - 8 /DATE/ Y Y N + 2 /DATE/ Y N N - 12 /DATE/ Y Y Y ? ^ + 18 /DATE/ Y Y Y ? ^ - 17 /DATE/ Y N N 19 /DATE/ Y N N - 35 /DATE/ Y N N ? - + 23 /DATE/ Y N N ? + + 24 /DATE/ Y N N geometry n_days - 8 POINT (-61.17780 45.21639) 20179 - 12 POINT (-64.91667 44.98333) 29301 - 17 POINT (-65.51667 44.75000) 34032 - 19 POINT (-62.48333 44.98333) 18474 ? ^^ + 2 POINT (-62.48333 44.98333) 18474 ? ^^ + 18 POINT (-63.50000 44.63333) 23325 - 35 POINT (-60.20000 46.15000) 26021 ? ^^ + 19 POINT (-60.20000 46.15000) 26021 ? 
^^ + 23 POINT (-61.18333 45.21667) 27970 + 24 POINT (-64.36667 45.10000) 28883 [5 rows x 35 columns] _ pavics-sdi-lvutest/docs/source/notebooks/eccc-geoapi-climate-stations.ipynb::Cell 5 _ Notebook cell execution failed Cell 5: Cell outputs differ Input: # Adjust directory if running this locally. # rect = gpd.read_file("~/Downloads/data.geojson") # Here we're using an existing file so the notebook runs without user interaction. rect = gpd.read_file("./data.geojson") # Filter stations DataFrame using bbox inbox = select.within(rect.loc[0].geometry) print("Number of stations within subregion: ", sum(inbox)) sub_select = select[inbox] sub_select.head() Traceback: mismatch 'text/plain' assert reference_output == test_output failed: ' id...x 35 columns]' == ' id...x 35 columns]' Skipping 63 identical leading characters in diff, use -v to show V_NAME \ - 8 8201410 6336 DEMING NS NOVA SCOTIA - 35 8205698 6485 SYDNEY NS NOVA SCOTIA ? ^^ + 19 8205698 6485 SYDNEY NS NOVA SCOTIA ? ^^ - 39 8206300 6506 WHITEHEAD NS NOVA SCOTIA ? - + 23 8206300 6506 WHITEHEAD NS NOVA SCOTIA ? + - 119 8201000 6329 COLLEGEVILLE NS NOVA SCOTIA + 45 8201410 6336 DEMING NS NOVA SCOTIA - 138 8205600 6481 STILLWATER NS NOVA SCOTIA ? ^ + 134 8205600 6481 STILLWATER NS NOVA SCOTIA ? ^ + 146 8201000 6329 COLLEGEVILLE NS NOVA SCOTIA FRE_PROV_NAME COUNTRY LATITUDE LONGITUDE TIMEZONE ... \ - 8 NOUVELLE-��COSSE CAN 451259007 -611040090 AST ... - 35 NOUVELLE-��COSSE CAN 460900000 -601200000 AST ... ? ^^ + 19 NOUVELLE-��COSSE CAN 460900000 -601200000 AST ... ? ^^ - 39 NOUVELLE-��COSSE CAN 451300000 -611100000 AST ... ? - + 23 NOUVELLE-��COSSE CAN 451300000 -611100000 AST ... ? + - 119 NOUVELLE-��COSSE CAN 452900000 -620100000 AST ... ? ^^^ ----------- + 45 NOUVELLE-��COSSE CAN 451259007 -611040090 AST ... ? ^^^ + + ++++++++ + - 138 NOUVELLE-��COSSE CAN 451100000 -620000000 AST ... ? ^ + 134 NOUVELLE-��COSSE CAN 451100000 -620000000 AST ... ? ^ + 146 NOUVELLE-��COSSE CAN 452900000 -620100000 AST ... 
HLY_LAST_DATE DLY_FIRST_DATE DLY_LAST_DATE MLY_FIRST_DATE MLY_LAST_DATE \ - 8 NaT /DATE/ /DATE/ /DATE/ /DATE/ - 35 NaT /DATE/ /DATE/ /DATE/ /DATE/ - 39 NaT /DATE/ /DATE/ /DATE/ /DATE/ ? ^ + 19 NaT /DATE/ /DATE/ /DATE/ /DATE/ ? ^ - 119 NaT /DATE/ /DATE/ /DATE/ /DATE/ ? ^^^ + 23 NaT /DATE/ /DATE/ /DATE/ /DATE/ ? ^^^ + 45 NaT /DATE/ /DATE/ /DATE/ /DATE/ - 138 NaT /DATE/ /DATE/ /DATE/ /DATE/ ? ^ + 134 NaT /DATE/ /DATE/ /DATE/ /DATE/ ? ^ + 146 NaT /DATE/ /DATE/ /DATE/ /DATE/ HAS_MONTHLY_SUMMARY HAS_NORMALS_DATA HAS_HOURLY_DATA \ - 8 Y Y N - 35 Y N N - 39 Y N N ? ^ + 19 Y N N ? ^ + 23 Y N N - 119 Y Y N ? ^^^ + 45 Y Y N ? ^^^ - 138 Y N N ? ^ + 134 Y N N ? ^ + 146 Y Y N geometry n_days - 8 POINT (-61.17780 45.21639) 20179 - 35 POINT (-60.20000 46.15000) 26021 ? ^^ + 19 POINT (-60.20000 46.15000) 26021 ? ^^ - 39 POINT (-61.18333 45.21667) 27970 ? - + 23 POINT (-61.18333 45.21667) 27970 ? + - 119 POINT (-62.01667 45.48333) 36646 + 45 POINT (-61.17780 45.21639) 20179 - 138 POINT (-62.00000 45.18333) 23345 ? ^ + 134 POINT (-62.00000 45.18333) 23345 ? ^ + 146 POINT (-62.01667 45.48333) 36646 [5 rows x 35 columns] _ pavics-sdi-lvutest/docs/source/notebooks/eccc-geoapi-climate-stations.ipynb::Cell 6 _ Notebook cell execution failed Cell 6: Cell outputs differ Input: coll = host / "collections" / "climate-daily" / "items" station_id = "8201410" # Restricting the number of entries returned to keep things fast. url = str(coll.with_query({"CLIMATE_IDENTIFIER": station_id, "limit": 365})) print("Request: ", url) data = gpd.read_file(url) data.head() Traceback: mismatch 'text/plain' assert reference_output == test_output failed: ' ...x 36 columns]' == ' ...x 36 columns]' - id STATION_NAME CLIMATE_IDENTIFIER ID \ ? 
- - + id STATION_NAME CLIMATE_IDENTIFIER ID \ + 0 8201410.1970.5.12 DEMING 8201410 8201410.1970.5.12 + 1 8201410.1970.5.13 DEMING 8201410 8201410.1970.5.13 + 2 8201410.1970.5.14 DEMING 8201410 8201410.1970.5.14 + 3 8201410.1970.5.15 DEMING 8201410 8201410.1970.5.15 - 0 8201410.2009.11.6 DEMING 8201410 8201410.2009.11.6 ? ^^ ^ ^^ -- - ^ ^^ -- + 4 8201410.1970.5.16 DEMING 8201410 8201410.1970.5.16 ? ^ ^^^ ^^ ^^^ ^^ - 1 8201410.2009.11.7 DEMING 8201410 8201410.2009.11.7 - 2 8201410.2009.11.8 DEMING 8201410 8201410.2009.11.8 - 3 8201410.2009.11.9 DEMING 8201410 8201410.2009.11.9 - 4 8201410.2009.11.10 DEMING 8201410 8201410.2009.11.10 LOCAL_DATE PROVINCE_CODE LOCAL_YEAR LOCAL_MONTH LOCAL_DAY \ - 0/DATE/ NS 2009 11 6 - 1/DATE/ NS 2009 11 7 - 2/DATE/ NS 2009 11 8 - 3/DATE/ NS 2009 11 9 - 4/DATE/ NS 2009 11 10 + 0/DATE/ NS 1970 5 12 + 1/DATE/ NS 1970 5 13 + 2/DATE/ NS 1970 5 14 + 3/DATE/ NS 1970 5 15 + 4/DATE/ NS 1970 5 16 MEAN_TEMPERATURE ... SPEED_MAX_GUST_FLAG COOLING_DEGREE_DAYS \ - 0 2.5 ... None 0.0 - 1 4.3 ... None 0.0 - 2 5.0 ... None 0.0 ? ^ + 0 5.0 ... None 0.0 ? ^ + 1 5.8 ... None 0.0 + 2 6.1 ... None 0.0 - 3 10.0 ... None 0.0 ? ^^ ^ + 3 7.8 ... None 0.0 ? ^^ ^ - 4 12.0 ... None 0.0 ? ^^ ^ + 4 3.9 ... None 0.0 ? ^^ ^ COOLING_DEGREE_DAYS_FLAG HEATING_DEGREE_DAYS HEATING_DEGREE_DAYS_FLAG \ - 0 None 15.5 None - 1 None 13.7 None - 2 None 13.0 None ? ^ + 0 None 13.0 None ? ^ + 1 None 12.2 None + 2 None 11.9 None - 3 None 8.0 None ? ^^ ^ + 3 None 10.2 None ? ^^ ^ - 4 None 6.0 None ? ^^ ^ + 4 None 14.1 None ? 
^^ ^ MIN_REL_HUMIDITY MIN_REL_HUMIDITY_FLAG MAX_REL_HUMIDITY \ 0 None None None 1 None None None 2 None None None 3 None None None 4 None None None MAX_REL_HUMIDITY_FLAG geometry 0 None POINT (-61.17780 45.21639) 1 None POINT (-61.17780 45.21639) 2 None POINT (-61.17780 45.21639) 3 None POINT (-61.17780 45.21639) 4 None POINT (-61.17780 45.21639) [5 rows x 36 columns] _ pavics-sdi-lvutest/docs/source/notebooks/eccc-geoapi-climate-stations.ipynb::Cell 12 _ Notebook cell execution failed Cell 12: Cell outputs differ Input: # The url to query station metadata - this should behave similarly as `climate-stations` ahccd_stations = host / "collections" / "ahccd-stations" / "items" url = ahccd_stations.with_query({"limit": 1}) print(ahccd_stations) gpd.read_file(str(url)) Traceback: mismatch 'text/plain' assert reference_output == test_output failed: ' id i...0 66.13000) ' == ' id i...0 56.54000) ' Skipping 55 identical leading characters in diff, use -v to show on \ - 0 2303667 2303667 2303667 ? --- --- --- + 0 2403053 2403053 2403053 ? ++ + ++ + ++ + station_name__nom_station measurement_type__type_mesure period__periode \ - 0 SANIKILUAQ A pressure_sea_level Ann ? ^^ ^^^ --- + 0 PANGNIRTUNG A pressure_sea_level Ann ? ^ ++ ^^ +++ trend_value__valeur_tendance elevation__elevation province__province \ - 0 None 31.7 NU ? ^^ ^ + 0 None 22.6 NU ? ^^ ^ joined__rejoint year_range__annees start_date__date_debut \ - 0 0 None /DATE/ ? ^ + 0 1 None /DATE/ ? ^ end_date__date_fin geometry - 0 /DATE/ POINT (-79.25000 56.54000) ? ---- ^^ ^^ + 0 /DATE/ POINT (-65.70000 66.13000) ? 
+++ ^^^ ^^ _ pavics-sdi-lvutest/docs/source/notebooks/eccc-geoapi-climate-stations.ipynb::Cell 13 _ Notebook cell execution failed Cell 13: Cell outputs differ Input: url = ahccd_stations.with_query({"province__province": "YT"}) gpd.read_file(str(url)) Traceback: mismatch 'text/plain' assert reference_output == test_output failed: ' id ...0 60.70950) ' == ' id ...0 60.70000) ' Skipping 60 identical leading characters in diff, use -v to show \ - 0 2100100 2100100 2100100 - 1 2100160 2100160 2100160 - 2 2100400 2100400 2100400 - 3 2100402 2100402 2100402 - 4 2100517 2100517 2100517 - 5 2100636 2100636 2100636 - 6 2100685 2100685 2100685 - 7 2100800 2100800 2100800 - 8 2100935 2100935 2100935 - 9 2101000 2101000 2101000 - 10 2100182 2100182 2100182 - 11 2100700 2100700 2100700 - 12 2101100 2101100 2101100 - 13 2101200 2101200 2101200 - 14 2101300 2101300 2101300 - 15 2100300 2100300 2100300 ? ^^ + 0 2100300 2100300 2100300 ? ^^ - 16 2100460 2100460 2100460 ? ^ + 1 2100460 2100460 2100460 ? ^ - 17 2100631 2100631 2100631 ? ^^ + 2 2100631 2100631 2100631 ? ^^ - 18 2101081 2101081 2101081 ? ^^ + 3 2101081 2101081 2101081 ? ^^ - 19 2100184 2100184 2100184 ? ^^ + 4 2100184 2100184 2100184 ? ^^ - 20 2100301 2100301 2100301 ? ^^ + 5 2100301 2100301 2100301 ? ^^ - 21 2100LRP 2100LRP 2100LRP ? ^^ + 6 2100LRP 2100LRP 2100LRP ? ^^ - 22 2100518 2100518 2100518 ? ^^ + 7 2100518 2100518 2100518 ? ^^ - 23 2100630 2100630 2100630 ? ^^ + 8 2100630 2100630 2100630 ? ^^ - 24 2100660 2100660 2100660 ? ^^ + 9 2100660 2100660 2100660 ? ^^ - 25 2100682 2100682 2100682 ? ^^ + 10 2100682 2100682 2100682 ? ^^ - 26 2100693 2100693 2100693 ? ^^ + 11 2100693 2100693 2100693 ? ^^ - 27 2100701 2100701 2100701 ? - + 12 2100701 2100701 2100701 ? + - 28 2100805 2100805 2100805 ? ^^ + 13 2100805 2100805 2100805 ? ^^ - 29 2100880 2100880 2100880 ? ^^ + 14 2100880 2100880 2100880 ? ^^ - 30 2100941 2100941 2100941 ? ^^ + 15 2100941 2100941 2100941 ? ^^ - 31 2100950 2100950 2100950 ? 
- + 16 2100950 2100950 2100950 ? + - 32 2101102 2101102 2101102 ? ^^ + 17 2101102 2101102 2101102 ? ^^ - 33 2101135 2101135 2101135 ? ^^ + 18 2101135 2101135 2101135 ? ^^ - 34 2101204 2101204 2101204 ? ^^ + 19 2101204 2101204 2101204 ? ^^ - 35 2101310 2101310 2101310 ? ^^ + 20 2101310 2101310 2101310 ? ^^ - 36 2101303 2101303 2101303 ? ^^ + 21 2101303 2101303 2101303 ? ^^ + 22 2100100 2100100 2100100 + 23 2100160 2100160 2100160 + 24 2100400 2100400 2100400 + 25 2100402 2100402 2100402 + 26 2100517 2100517 2100517 + 27 2100636 2100636 2100636 + 28 2100685 2100685 2100685 + 29 2100800 2100800 2100800 + 30 2100935 2100935 2100935 + 31 2101000 2101000 2101000 + 32 2100182 2100182 2100182 + 33 2100700 2100700 2100700 + 34 2101100 2101100 2101100 + 35 2101200 2101200 2101200 + 36 2101300 2101300 2101300 station_name__nom_station measurement_type__type_mesure period__periode \ - 0 AISHIHIK A pressure_sea_level Ann - 1 BEAVER CREEK A pressure_sea_level Ann - 2 DAWSON pressure_sea_level Ann - 3 DAWSON A pressure_sea_level Ann - 4 FARO A pressure_sea_level Ann - 5 HERSCHEL ISLAND pressure_sea_level Ann - 6 KOMAKUK BEACH A pressure_sea_level Ann - 7 OLD CROW A pressure_sea_level Ann - 8 ROCK RIVER pressure_sea_level Ann - 9 SNAG A pressure_sea_level Ann - 10 BURWASH A wind_speed Ann - 11 MAYO A wind_speed Ann - 12 TESLIN A wind_speed Ann - 13 WATSON LAKE A wind_speed Ann - 14 WHITEHORSE A wind_speed Ann - 15 CARMACKS snow Ann ? ^^ + 0 CARMACKS snow Ann ? ^^ - 16 DRURY CREEK snow Ann ? ^ + 1 DRURY CREEK snow Ann ? ^ - 17 HAINES JUNCTION snow Ann ? ^^ + 2 HAINES JUNCTION snow Ann ? ^^ - 18 SWIFT RIVER snow Ann ? ^^ + 3 SWIFT RIVER snow Ann ? ^^ - 19 BURWASH temp_mean Ann ? ^^ + 4 BURWASH temp_mean Ann ? ^^ - 20 CARMACKS temp_mean Ann ? ^^ + 5 CARMACKS temp_mean Ann ? ^^ - 21 DAWSON temp_mean Ann ? ^^ + 6 DAWSON temp_mean Ann ? ^^ - 22 FARO_(AUT) temp_mean Ann ? ^^ + 7 FARO_(AUT) temp_mean Ann ? ^^ - 23 HAINES_JUNCTION temp_mean Ann ? ^^ + 8 HAINES_JUNCTION temp_mean Ann ? 
^^ - 24 IVVAVIK_NAT_PARK temp_mean Ann ? ^^ + 9 IVVAVIK_NAT_PARK temp_mean Ann ? ^^ - 25 KOMAKUK_BEACH temp_mean Ann ? ^^ + 10 KOMAKUK_BEACH temp_mean Ann ? ^^ - 26 MACMILLAN_PASS temp_mean Ann ? ^^ + 11 MACMILLAN_PASS temp_mean Ann ? ^^ - 27 MAYO temp_mean Ann ? - + 12 MAYO temp_mean Ann ? + - 28 OLD_CROW temp_mean Ann ? ^^ + 13 OLD_CROW temp_mean Ann ? ^^ - 29 PELLY_RANCH temp_mean Ann ? ^^ + 14 PELLY_RANCH temp_mean Ann ? ^^ - 30 ROSS_RIVER temp_mean Ann ? ^^ + 15 ROSS_RIVER temp_mean Ann ? ^^ - 31 SHINGLE_POINT temp_mean Ann ? - + 16 SHINGLE_POINT temp_mean Ann ? + - 32 TESLIN temp_mean Ann ? ^^ + 17 TESLIN temp_mean Ann ? ^^ - 33 TUCHITUA temp_mean Ann ? ^^ + 18 TUCHITUA temp_mean Ann ? ^^ - 34 WATSON_LAKE temp_mean Ann ? ^^ + 19 WATSON_LAKE temp_mean Ann ? ^^ - 35 WHITEHORSE temp_mean Ann ? ^^ + 20 WHITEHORSE temp_mean Ann ? ^^ - 36 WHITEHORSE_A temp_mean Ann ? ^^ + 21 WHITEHORSE_A temp_mean Ann ? ^^ + 22 AISHIHIK A pressure_sea_level Ann + 23 BEAVER CREEK A pressure_sea_level Ann + 24 DAWSON pressure_sea_level Ann + 25 DAWSON A pressure_sea_level Ann + 26 FARO A pressure_sea_level Ann + 27 HERSCHEL ISLAND pressure_sea_level Ann + 28 KOMAKUK BEACH A pressure_sea_level Ann + 29 OLD CROW A pressure_sea_level Ann + 30 ROCK RIVER pressure_sea_level Ann + 31 SNAG A pressure_sea_level Ann + 32 BURWASH A wind_speed Ann + 33 MAYO A wind_speed Ann + 34 TESLIN A wind_speed Ann + 35 WATSON LAKE A wind_speed Ann + 36 WHITEHORSE A wind_speed Ann trend_value__valeur_tendance elevation__elevation province__province \ - 0 NaN 966.20 YT ? ^^^ - + 0 NaN 525.00 YT ? ^^^ + - 1 NaN 649.00 YT ? ^ + 1 NaN 609.00 YT ? ^ - 2 NaN 320.00 YT - 3 NaN 370.30 YT - 4 NaN 716.60 YT - 5 NaN 1.20 YT - 6 NaN 7.30 YT - 7 NaN 251.20 YT - 8 NaN 731.00 YT - 9 NaN 586.70 YT - 10 NaN 806.20 YT - 11 0.00 503.80 YT - 12 NaN 705.00 YT - 13 -0.72 687.35 YT - 14 -0.25 706.20 YT - 15 NaN 525.00 YT - 16 NaN 609.00 YT - 17 NaN 596.00 YT ? ^^ + 2 NaN 596.00 YT ? ^^ - 18 NaN 891.00 YT ? ^^ + 3 NaN 891.00 YT ? 
^^ - 19 NaN 80.00 YT ? ^^ + 4 NaN 80.00 YT ? ^^ - 20 NaN 54.00 YT ? ^^ + 5 NaN 54.00 YT ? ^^ - 21 2.34 37.00 YT ? ^^ + 6 2.34 37.00 YT ? ^^ - 22 NaN 71.00 YT ? ^^ + 7 NaN 71.00 YT ? ^^ - 23 NaN 59.00 YT ? ^^ + 8 NaN 59.00 YT ? ^^ - 24 NaN 24.00 YT ? ^^ + 9 NaN 24.00 YT ? ^^ - 25 NaN 1.00 YT ? ^^ + 10 NaN 1.00 YT ? ^^ - 26 NaN 137.00 YT ? ^^ + 11 NaN 137.00 YT ? ^^ - 27 NaN 50.00 YT ? - + 12 NaN 50.00 YT ? + - 28 NaN 25.00 YT ? ^^ + 13 NaN 25.00 YT ? ^^ - 29 NaN 44.00 YT ? ^^ + 14 NaN 44.00 YT ? ^^ - 30 NaN 69.00 YT ? ^^ + 15 NaN 69.00 YT ? ^^ - 31 NaN 4.00 YT ? - + 16 NaN 4.00 YT ? + + 17 NaN 70.00 YT + 18 NaN 72.00 YT + 19 1.53 68.00 YT - 32 NaN 70.00 YT ? - + 20 NaN 70.00 YT ? + - 33 NaN 72.00 YT - 34 1.53 68.00 YT - 35 NaN 70.00 YT - 36 1.80 70.00 YT ? ^^ + 21 1.80 70.00 YT ? ^^ + 22 NaN 966.20 YT + 23 NaN 649.00 YT + 24 NaN 320.00 YT + 25 NaN 370.30 YT + 26 NaN 716.60 YT + 27 NaN 1.20 YT + 28 NaN 7.30 YT + 29 NaN 251.20 YT + 30 NaN 731.00 YT + 31 NaN 586.70 YT + 32 NaN 806.20 YT + 33 0.00 503.80 YT + 34 NaN 705.00 YT + 35 -0.72 687.35 YT + 36 -0.25 706.20 YT joined__rejoint year_range__annees start_date__date_debut \ 0 0 None /DATE/ - 1 1 None /DATE/ ? ^ + 1 0 None /DATE/ ? ^ - 2 0 None /DATE/ ? ^ + 2 1 None /DATE/ ? ^ - 3 1 None /DATE/ ? ^ + 3 0 None /DATE/ ? ^ 4 1 None /DATE/ - 5 0 None /DATE/ ? ^ + 5 1 None /DATE/ ? ^ - 6 0 None /DATE/ + 6 1 1901-2019 /DATE/ 7 1 None /DATE/ 8 0 None /DATE/ 9 0 None /DATE/ 10 1 None /DATE/ - 11 1 1953-2014 /DATE/ + 11 0 None /DATE/ 12 1 None /DATE/ - 13 0 1953-2014 /DATE/ - 14 1 1953-2014 /DATE/ ? ^ ^^^^^^^^^ + 13 1 None /DATE/ ? ^ ^^^^^^^^^ - 15 0 None /DATE/ ? ^ + 14 0 None /DATE/ ? ^ + 15 1 None /DATE/ 16 0 None /DATE/ 17 1 None /DATE/ 18 0 None /DATE/ - 19 1 None /DATE/ ? ^^^^^^^^^ + 19 1 1939-2019 /DATE/ ? ^^^^^^^^^ 20 1 None /DATE/ - 21 1 1901-2019 /DATE/ ? ^^ + 21 1 1943-2019 /DATE/ ? ^^ - 22 1 None /DATE/ ? ^ + 22 0 None /DATE/ ? ^ - 23 0 None /DATE/ ? ^ + 23 1 None /DATE/ ? 
(…station listing diff continues: the same stations appear with their row
indices rotated — rows previously numbered 15–36 now come first as 0–21 and
rows 0–14 move to the end as 22–36 — and several end_date__date_fin values
differ, e.g. 1939-2019 -> 1953-2014.)

_ pavics-sdi-lvutest/docs/source/notebooks/eccc-geoapi-climate-stations.ipynb::Cell 14 _
Notebook cell execution failed
Cell 14: Cell outputs differ
Input:
ahccd_mon = host / "collections" / "ahccd-monthly" / "items"
url = ahccd_mon.with_query({"station_id__id_station": "2100LRP"})
mts = gpd.read_file(str(url))
mts.head()
Traceback: mismatch 'text/plain'
assert reference_output == test_output failed
(…the rows returned for station 2100LRP come back in a different order, so
the sampled record IDs, months, and temperatures all differ, e.g.
2100LRP.1975.02 -> 2100LRP.1983.03, temp_mean -27.2 -> -15.7.)
```
It was indeed that PROJ error, but also the changing output order.

This is suspicious — I saw it yesterday and was intrigued, but I assumed something was going on on the ECCC side. If that's the only issue remaining, feel free to merge. Short story: I don't think we should NBVAL-compare the response from the API, as it's bound to change all the time. What we want is to make sure the functionality remains.
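To illustrate "make sure the functionality remains": instead of letting nbval diff the cell output byte-for-byte (which breaks whenever ECCC reorders or updates records), the cell can carry `# NBVAL_IGNORE_OUTPUT` and assert only on invariants of the response. A minimal, self-contained sketch — the payloads below are hypothetical stand-ins for the GeoJSON the API returns, not real ECCC data:

```python
# NBVAL_IGNORE_OUTPUT  <- first line of the notebook cell, so nbval skips
#                         comparing the volatile output.
import json

def check_items(payload: str) -> list:
    """Validate the structure of a features payload without pinning values."""
    features = json.loads(payload)["features"]
    assert features, "expected at least one feature"
    for feat in features:
        props = feat["properties"]
        # Functional invariants: the fields the notebook relies on exist...
        assert {"station_id__id_station", "date"} <= props.keys()
        # ...and every row belongs to the station that was queried.
        assert props["station_id__id_station"] == "2100LRP"
    return features

# Two mock "responses" with different rows, mimicking the Jenkins diff:
resp_a = json.dumps({"features": [
    {"properties": {"station_id__id_station": "2100LRP", "date": "1975-02"}}]})
resp_b = json.dumps({"features": [
    {"properties": {"station_id__id_station": "2100LRP", "date": "1983-03"}}]})

# Both pass the functional check, even though a byte-for-byte
# output comparison of the two would fail.
check_items(resp_a)
check_items(resp_b)
```

The same idea applies to the `mts.head()` cell: checking column names and the queried station ID survives reordering, while the rendered DataFrame text does not.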
Yes, so I still need an approval to merge :)
Related to PR #268
Jenkins build passed: http://jenkins.ouranos.ca/job/PAVICS-e2e-workflow-tests/job/current-production-version/85/console
Note the `STN_ID` seems to change, causing cascading output changes. Had to use `# NBVAL_IGNORE_OUTPUT` to get the notebook to pass. See the commit description of 7a52fc2 to see the `STN_ID` change.

Original Jenkins failure fixed: