Test multi-ghe network on CI #15

Merged Feb 29, 2024 · 25 commits (changes shown from 24 commits)
fa0a198
experiment with 2_ghe test on new github runners
vtnate Jan 23, 2024
8aa8384
downselect geojson to only include buildings from sys-params file
vtnate Jan 25, 2024
983aca8
Merge branch 'main' into test_2_ghe
vtnate Jan 25, 2024
4fc8e72
Merge branch 'main' into downselect-geojson
vtnate Jan 25, 2024
f315fa9
Merge branch 'main' into test_2_ghe
vtnate Feb 1, 2024
ce15380
update TN version identifier in ghe sys-params test files
vtnate Feb 5, 2024
8800508
Merge branch 'main' into downselect-geojson
vtnate Feb 6, 2024
3b75701
more debug logging to check run time
vtnate Feb 6, 2024
132585a
TEMPORARY: notebook code to run examples a little more granularly
vtnate Feb 6, 2024
0e6a32c
don't run coverage every time, and adapt to new tests dir location
vtnate Feb 9, 2024
68e3d3f
move tests folder up to project root
vtnate Feb 9, 2024
0f4e4e0
upgrade ruff to v0.2.1
vtnate Feb 9, 2024
d69a902
restore coverage, now outputting to `htmlcov` dir
vtnate Feb 9, 2024
7207374
TEMPORARY: updates to notebook to generate experimental test files
vtnate Feb 9, 2024
a24d379
update GHED & ruff versions
vtnate Feb 29, 2024
08b7183
use new continue_if_design_unmet parameter of GHED
vtnate Feb 29, 2024
3bde3fa
add more comments to building downselect code
vtnate Feb 29, 2024
7b723ab
point test sys-param file to the test buildings that exist
vtnate Feb 29, 2024
52a3679
Merge branch 'downselect-geojson' into test_2_ghe
vtnate Feb 29, 2024
9d467d8
move tests to their own top-level dir. Did I not already do this?
vtnate Feb 29, 2024
272971c
add max_boreholes parameter to the GHED call
vtnate Feb 29, 2024
7da35e2
Revert "TEMPORARY: updates to notebook to generate experimental test …
vtnate Feb 29, 2024
253a6aa
Revert "TEMPORARY: notebook code to run examples a little more granul…
vtnate Feb 29, 2024
da3335c
lint & format
vtnate Feb 29, 2024
8e96f9f
Merge branch 'main' into test_2_ghe
vtnate Feb 29, 2024
2 changes: 1 addition & 1 deletion .gitignore
@@ -136,7 +136,7 @@ dmypy.json
.vscode/

# Test output files
thermalnetwork/tests/test_outputs
tests/test_outputs

output/
tmp/
7 changes: 4 additions & 3 deletions .pre-commit-config.yaml
@@ -24,16 +24,17 @@ repos:
# hooks:
# - id: check-useless-excludes # Ensure the exclude syntax is correct
# - id: check-hooks-apply # Fails if a hook doesn't apply to any file
# Run the Ruff linter
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.1.3
rev: v0.3.0
hooks:
# Run the Ruff linter
- id: ruff
args: [--fix, --exit-non-zero-on-fix]
types_or: [python, pyi, jupyter]
# Run the Ruff formatter
# https://docs.astral.sh/ruff/integrations/#pre-commit
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.1.3
rev: v0.3.0
hooks:
- id: ruff-format
types_or: [python, pyi, jupyter]
@@ -1,7 +1,7 @@
{
"buildings": [
{
"geojson_id": "2",
"geojson_id": "8",
"load_model": "time_series",
"load_model_parameters": {
"time_series": {
@@ -43,7 +43,7 @@
}
},
{
"geojson_id": "5",
"geojson_id": "9",
"load_model": "time_series",
"load_model_parameters": {
"time_series": {
@@ -88,7 +88,7 @@
"district_system": {
"fifth_generation": {
"ghe_parameters": {
"version": "1.0",
"version": "0.2.3",
"ghe_dir": "tests\\management\\data\\sdk_project_scraps\\run\\baseline_scenario\\ghe_dir",
"fluid": {
"fluid_name": "Water",
@@ -88,7 +88,7 @@
"district_system": {
"fifth_generation": {
"ghe_parameters": {
"version": "1.0",
"version": "0.2.3",
"ghe_dir": "tests\\management\\data\\sdk_project_scraps\\run\\baseline_scenario\\ghe_dir",
"fluid": {
"fluid_name": "Water",
@@ -88,7 +88,7 @@
"district_system": {
"fifth_generation": {
"ghe_parameters": {
"version": "1.0",
"version": "0.2.3",
"ghe_dir": "ghe\\run\\baseline_scenario\\ghe_dir",
"fluid": {
"fluid_name": "Water",
36 changes: 17 additions & 19 deletions pyproject.toml
@@ -30,7 +30,7 @@ classifiers = [
]
dependencies = [
'click ~= 8.1',
'ghedesigner ~= 1.3',
'ghedesigner ~= 1.4',
'pandas ~= 2.1',
"rich ~= 13.6",
]
@@ -40,7 +40,7 @@ dev = [
"pytest >= 6.0",
"pytest-cov ~= 4.1",
"pre-commit ~= 3.5",
"ruff ~= 0.1",
"ruff ~= 0.3",
"jupyterlab ~= 4.0",
]

@@ -56,36 +56,34 @@ dev = [
[tool.setuptools.dynamic]
readme = {file = "README.md", content-type = "text/markdown"}

# https://setuptools-scm.readthedocs.io/
# Presence of this command tells it to find the version from GitHub
[tool.setuptools_scm]

# https://docs.pytest.org/en/6.2.x/customize.html#pyproject-toml
[tool.pytest.ini_options]
minversion = "6.0"
testpaths = "thermalnetwork/tests"
addopts = ["--cov=thermalnetwork"]
testpaths = "tests"
# Manually add these flags to `pytest` when running locally for coverage details.
addopts = ["--cov=thermalnetwork", "--cov-report=html"]


# https://pytest-cov.readthedocs.io/en/latest/config.html
# https://coverage.readthedocs.io/en/latest/config.html
[tool.coverage.run]
omit = [
"thermalnetwork/tests/**"
]

# https://docs.astral.sh/ruff/settings/
# https://docs.astral.sh/ruff/tutorial/#configuration
[tool.ruff]
fix = true # automatically fix problems if possible
select = ["RUF", "E", "F", "I", "UP", "N", "S", "BLE", "A", "C4", "T10", "ISC", "ICN", "PT",
line-length = 120

# https://docs.astral.sh/ruff/linter/#rule-selection
[tool.ruff.lint]
extend-select = ["RUF", "E", "F", "I", "UP", "N", "S", "BLE", "A", "C4", "T10", "ISC", "ICN", "PT",
"Q", "SIM", "TID", "ARG", "DTZ", "PD", "PGH", "PLC", "PLE", "PLR", "PLW", "PIE", "COM"] # Enable these rules
ignore = ["PLR0913", "PLR2004", "PLR0402", "COM812", "COM819", "SIM108", "ARG002", "ISC001"] # except for these specific errors
line-length = 120

# https://docs.astral.sh/ruff/settings/#format
[tool.ruff.lint.per-file-ignores]
"tests/*" = ["S101", "S607", "S603"] # assert statements are allowed in tests, and paths are safe

# https://docs.astral.sh/ruff/formatter/#configuration
[tool.ruff.format]
# quote-style = "double"

[tool.ruff.per-file-ignores]
"thermalnetwork/tests/*" = ["S101", "S607", "S603"] # assert statements are allowed in tests, and paths are safe

[project.scripts]
thermalnetwork = "thermalnetwork.network:run_sizer_from_cli"
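For reviewers unfamiliar with the Ruff 0.2+ config migration reflected above: linter-specific settings moved out of the top-level `[tool.ruff]` table into a `lint` namespace. A simplified before/after fragment (illustrative keys only, not the project's full config):

```toml
# Old (Ruff < 0.2): everything top-level
# [tool.ruff]
# select = ["E", "F", "I"]
# [tool.ruff.per-file-ignores]
# "tests/*" = ["S101"]

# New (Ruff >= 0.2): linter settings live under [tool.ruff.lint]
[tool.ruff]
line-length = 120

[tool.ruff.lint]
extend-select = ["E", "F", "I"]

[tool.ruff.lint.per-file-ignores]
"tests/*" = ["S101"]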
2 changes: 1 addition & 1 deletion thermalnetwork/tests/test_base.py → tests/test_base.py
@@ -8,7 +8,7 @@ def setUp(self) -> None:
here = Path(__file__).parent

# -- Input paths
self.demos_path = here.parent.parent / "demos"
self.demos_path = here.parent / "demos"

self.geojson_file_path_1_ghe = (self.demos_path / "sdk_output_skeleton_1_ghe" / "network.geojson").resolve()
self.scenario_directory_path_1_ghe = (
@@ -1,5 +1,5 @@
from tests.test_base import BaseCase
from thermalnetwork.ground_heat_exchanger import GHE
from thermalnetwork.tests.test_base import BaseCase


class TestGHE(BaseCase):
@@ -1,7 +1,7 @@
import pytest

from tests.test_base import BaseCase
from thermalnetwork.heat_pump import HeatPump
from thermalnetwork.tests.test_base import BaseCase


class TestHeatPump(BaseCase):
@@ -1,9 +1,7 @@
import json

import pytest

from tests.test_base import BaseCase
from thermalnetwork.network import run_sizer_from_cli_worker
from thermalnetwork.tests.test_base import BaseCase


class TestNetwork(BaseCase):
@@ -43,7 +41,6 @@ def test_network_one_ghe(self):
# Restore the trailing newline
sys_param_file.write("\n")

@pytest.mark.skip(reason="Test consumes too much memory/cpu for GHA runners. Please run locally instead")
def test_network_two_ghe(self):
# -- Set up
output_path = self.test_outputs_path / "two_ghe"
7 changes: 7 additions & 0 deletions thermalnetwork/ground_heat_exchanger.py
@@ -55,6 +55,8 @@ def ghe_size(self, total_space_loads, output_path: Path) -> float:
min_eft=self.json_data["design"]["min_eft"],
max_height=self.json_data["geometric_constraints"]["max_height"],
min_height=self.json_data["geometric_constraints"]["min_height"],
continue_if_design_unmet=True,
max_boreholes=2500,
)
ghe.set_ground_loads_from_hourly_list(self.json_data["loads"]["ground_loads"])
ghe.set_geometry_constraints_rectangle(
@@ -82,13 +84,18 @@ def ghe_size(self, total_space_loads, output_path: Path) -> float:
file_name = output_file_directory / "ground_loads.csv"
logger.info(f"saving loads to: {file_name}")
ground_loads_df.to_csv(file_name, index=False)
logger.debug("loads saved to csv file")

ghe.find_design()
logger.debug("design found")
ghe.prepare_results("Project Name", "Notes", "Author", "Iteration Name")
logger.debug("results prepared for writing to output directory")

ghe.write_output_files(output_file_directory, "")
logger.debug("output written to output directory")
u_tube_height = ghe.results.output_dict["ghe_system"]["active_borehole_length"]["value"]
# selected_coordinates = ghe.results.borehole_location_data_rows # includes a header row
logger.debug("Done writing output")
return u_tube_height

def get_atlanta_loads(self) -> list[float]:
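The `logger.debug` calls added above bracket each long-running sizing step so CI run time can be inspected. That pattern can be sketched as a small self-contained helper (the names here are illustrative, not GHEDesigner's API):

```python
import logging
import time

logging.basicConfig(level=logging.DEBUG, format="%(asctime)s %(levelname)s %(message)s")
logger = logging.getLogger(__name__)


def timed_step(label, func, *args, **kwargs):
    """Run one step and log its wall-clock duration at DEBUG level."""
    start = time.perf_counter()
    result = func(*args, **kwargs)
    logger.debug("%s finished in %.2f s", label, time.perf_counter() - start)
    return result


# Illustrative usage standing in for a long-running sizing call:
design_height = timed_step("find_design", lambda: 134.9)
```

Wrapping steps this way keeps the timing concern out of the sizing logic itself, which is useful when deciding which tests are too heavy for GitHub-hosted runners.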
24 changes: 23 additions & 1 deletion thermalnetwork/network.py
@@ -510,7 +510,29 @@ def run_sizer_from_cli_worker(
return 1

system_parameters_data = json.loads(system_parameter_path.read_text())
# print(f"system_parameters_data: {system_parameters_data}")

# Downselect the buildings in the geojson that are in the system parameters file

# List the building ids from the system parameters file
building_id_list = []
for building in system_parameters_data["buildings"]:
building_id_list.append(building["geojson_id"])

# Select the buildings in the geojson that are in the system parameters file
building_features = [
feature
for feature in geojson_data["features"]
if feature["properties"]["type"] == "Building" and feature["properties"]["id"] in building_id_list
]

# Rebuild the geojson data using only the buildings in the system parameters file
# Put in everything that isn't a building
geojson_data["features"] = [
feature for feature in geojson_data["features"] if feature["properties"]["type"] != "Building"
]
# Only add the buildings in the system parameters file back to the geojson data
# This has the effect of removing buildings that are not in the system parameters file
geojson_data["features"].extend(building_features)

# load all input data
sys_param_version: int = system_parameters_data["district_system"]["fifth_generation"]["ghe_parameters"]["version"]
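The downselect added to `run_sizer_from_cli_worker` above keeps every non-building feature and only those buildings named in the sys-params file. Extracted into a standalone helper it looks like this (a minimal sketch; `filter_geojson_features` is a hypothetical name, not part of the package):

```python
def filter_geojson_features(geojson_data: dict, system_parameters_data: dict) -> dict:
    """Keep only Building features whose id appears in the system parameters file.

    Non-Building features (connectors, junctions, district systems) are always retained.
    """
    building_ids = {b["geojson_id"] for b in system_parameters_data["buildings"]}
    features = geojson_data["features"]
    kept_buildings = [
        f for f in features
        if f["properties"]["type"] == "Building" and f["properties"]["id"] in building_ids
    ]
    non_buildings = [f for f in features if f["properties"]["type"] != "Building"]
    # Same effect as the diff: non-buildings first, then the selected buildings
    geojson_data["features"] = non_buildings + kept_buildings
    return geojson_data
```

Note that, as in the diff, this reorders the feature list (buildings end up after everything else), which is fine as long as downstream code looks features up by id rather than by position.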
39 changes: 21 additions & 18 deletions topology/final.ipynb
@@ -8,7 +8,7 @@
"outputs": [],
"source": [
"import json\n",
"from pathlib import Path\n"
"from pathlib import Path"
]
},
{
@@ -20,7 +20,7 @@
"source": [
"geojson_path = Path.cwd().parent / \"demos\" / \"sdk_output_skeleton_1_ghe\" / \"network.geojson\"\n",
"with open(geojson_path) as f:\n",
" geojson_data = json.load(f)\n"
" geojson_data = json.load(f)"
]
},
{
@@ -32,7 +32,7 @@
"source": [
"geojson_path = Path.cwd().parent / \"demos\" / \"sdk_output_skeleton_2_ghe_sequential\" / \"network.geojson\"\n",
"with open(geojson_path) as f:\n",
" geojson_data2 = json.load(f)\n"
" geojson_data2 = json.load(f)"
]
},
{
@@ -44,7 +44,7 @@
"source": [
"geojson_path = Path.cwd().parent / \"demos\" / \"sdk_output_skeleton_2_ghe_staggered\" / \"network.geojson\"\n",
"with open(geojson_path) as f:\n",
" geojson_data2_staggered = json.load(f)\n"
" geojson_data2_staggered = json.load(f)"
]
},
{
@@ -73,21 +73,21 @@
"metadata": {},
"outputs": [],
"source": [
"\n",
"#THIS IS GOOD\n",
"# THIS IS GOOD\n",
"def find_startloop_feature_id(features):\n",
" for feature in features:\n",
" if feature[\"properties\"].get(\"is_ghe_start_loop\") == \"true\":\n",
" start_feature_id = feature[\"properties\"].get(\"buildingId\") or feature[\"properties\"].get(\"DSId\")\n",
" return start_feature_id\n",
" return None\n",
"\n",
"\n",
"def get_connected_features(geojson_data):\n",
" features = geojson_data[\"features\"]\n",
" connectors = [feature for feature in features if feature[\"properties\"][\"type\"] == \"ThermalConnector\"]\n",
" connected_features = []\n",
"\n",
" #get the id of the building or ds from the thermaljunction that has startloop: true\n",
" # get the id of the building or ds from the thermaljunction that has startloop: true\n",
" startloop_feature_id = find_startloop_feature_id(features)\n",
"\n",
" # Start with the first connector\n",
@@ -113,19 +113,22 @@
" for feature in features:\n",
" feature_id = feature[\"properties\"][\"id\"]\n",
" if feature_id in connected_features and feature[\"properties\"][\"type\"] in [\"Building\", \"District System\"]:\n",
" connected_objects.append({\n",
" \"id\": feature_id,\n",
" \"type\": feature[\"properties\"][\"type\"],\n",
" \"name\": feature[\"properties\"].get(\"name\", \"\"),\n",
" \"start_loop\": \"true\" if feature_id == startloop_feature_id else None\n",
" })\n",
" connected_objects.append(\n",
" {\n",
" \"id\": feature_id,\n",
" \"type\": feature[\"properties\"][\"type\"],\n",
" \"name\": feature[\"properties\"].get(\"name\", \"\"),\n",
" \"start_loop\": \"true\" if feature_id == startloop_feature_id else None,\n",
" }\n",
" )\n",
"\n",
" return connected_objects\n",
"\n",
"\n",
"def reorder_connected_features(features):\n",
" while features[0].get(\"start_loop\") != \"true\":\n",
" features.append(features.pop(0))\n",
" return features\n"
" return features"
]
},
{
@@ -135,7 +138,7 @@
"metadata": {},
"outputs": [],
"source": [
"connected_features = get_connected_features(geojson_data2)\n"
"connected_features = get_connected_features(geojson_data2)"
]
},
{
@@ -159,7 +162,7 @@
"source": [
"connected_features = get_connected_features(geojson_data2)\n",
"for feature in connected_features:\n",
" print(feature)\n"
" print(feature)"
]
},
{
@@ -169,7 +172,7 @@
"metadata": {},
"outputs": [],
"source": [
"reordered_features = reorder_connected_features(connected_features)\n"
"reordered_features = reorder_connected_features(connected_features)"
]
},
{
@@ -200,7 +203,7 @@
}
],
"source": [
"reordered_features\n"
"reordered_features"
]
},
{
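The notebook's `reorder_connected_features` rotates the feature list until the loop-start feature is first. A self-contained sketch of that rotation (function body reproduced from the notebook cell above):

```python
def reorder_connected_features(features):
    # Rotate the list until the feature flagged as the loop start is first
    while features[0].get("start_loop") != "true":
        features.append(features.pop(0))
    return features


features = [
    {"id": "a", "start_loop": None},
    {"id": "b", "start_loop": "true"},
    {"id": "c", "start_loop": None},
]
ordered = reorder_connected_features(features)
# ordered is now [b, c, a]: the relative cycle order is preserved
```

One caveat worth noting in review: if no feature carries `start_loop == "true"` (e.g. `find_startloop_feature_id` returned `None`), this loop never terminates, so callers should guard that case.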