Fix v2.5.0 bug in DatasetRead response validation #1784

Merged
5 changes: 4 additions & 1 deletion CHANGELOG.md
@@ -1,15 +1,18 @@
 **Note**: Numbers like (\#1234) point to closed Pull Requests on the fractal-server repository.
 
 
-# 2.5.1 (Unreleased)
+# 2.5.1
 
 * API:
+    * Make `WorkflowTaskDumpV2` attributes `task_id` and `task` optional (\#1784).
     * Add validation for user-provided strings that execute commands with subprocess or remote-shell (\#1767).
 * Runner and task collection:
     * Validate commands before running them via `subprocess` or `fabric` (\#1767).
 
 # 2.5.0
 
+> WARNING: This release has a minor API bug when displaying a V2 dataset with a history that contains legacy tasks. It's recommended to update to 2.5.1.
+
 This release removes support for including V1 tasks in V2 workflows. This comes
 with changes to the database (data and metadata), to the API, and to the V2
 runner.
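
The WARNING above refers to a response-validation failure: before this PR, `WorkflowTaskDumpV2` declared `task_id` and `task` as required, while history entries coming from legacy tasks carry `None` for both, so serializing such a dataset failed. Below is a minimal sketch of that failure mode, using a hypothetical `StrictDump` model rather than the actual fractal-server schema (the error wording in the comment assumes Pydantic v1):

from pydantic import BaseModel, ValidationError

class StrictDump(BaseModel):
    # A required int field, mirroring `task_id: int` before this PR
    task_id: int

try:
    # Legacy-task history entries hold task_id=None, which a required
    # field rejects during response validation
    StrictDump(task_id=None)
except ValidationError as err:
    print(err)  # e.g. "none is not an allowed value" under Pydantic v1
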
13 changes: 11 additions & 2 deletions fractal_server/app/schemas/v2/dumps.py
@@ -39,14 +39,23 @@ class TaskDumpV2(BaseModel):
 
 
 class WorkflowTaskDumpV2(BaseModel):
+    """
+    Before v2.5.0, WorkflowTaskV2 could have `task_id=task=None` and
+    non-`None` `task_legacy_id` and `task_legacy`. Since these objects
+    may still exist in the database after version updates, we are setting
+    `task_id` and `task` to `Optional` to avoid response-validation errors
+    for the endpoints that GET datasets.
+    Ref issue #1783.
+    """
+
     id: int
     workflow_id: int
     order: Optional[int]
 
     input_filters: Filters
 
-    task_id: int
-    task: TaskDumpV2
+    task_id: Optional[int]
+    task: Optional[TaskDumpV2]
 
 
 class WorkflowDumpV2(BaseModel, extra=Extra.forbid):
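
A minimal sketch of why the `Optional` annotation above resolves the issue: an `Optional` field accepts an explicit `None`, which is exactly what legacy-task history entries hold for `task_id` and `task`. The `RelaxedDump` model below is illustrative rather than fractal-server code, and the explicit `= None` defaults are added only so the sketch behaves the same under Pydantic v1 and v2 (the PR's own fields omit them):

from typing import Optional
from pydantic import BaseModel

class RelaxedDump(BaseModel):
    # Optional fields accept None, so dumps built from legacy-task
    # history entries validate without errors
    task_id: Optional[int] = None
    task: Optional[dict] = None

print(RelaxedDump(task_id=None, task=None))  # no ValidationError raised
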
55 changes: 55 additions & 0 deletions tests/v2/03_api/test_unit_issue_1783.py
@@ -0,0 +1,55 @@
from fractal_server.app.schemas.v2.dataset import _DatasetHistoryItemV2
from fractal_server.app.schemas.v2.workflowtask import WorkflowTaskStatusTypeV2


def test_issue_1783():
    """
    Given a dataset that was processed with legacy tasks from within V2
    workflows, verify that the response of its GET endpoint is valid.

    The two history items below differ in that the first references a V2
    task (`task_id`/`task`) while the second references a legacy task
    (`task_legacy_id`/`task_legacy`).
    """

    wftask1 = dict(
        id=1,
        workflow_id=1,
        order=0,
        input_filters=dict(attributes=dict(), types=dict()),
        task_id=1,
        task=dict(
            id=1,
            name="name",
            type="parallel",
            source="source",
            input_types={},
            output_types={},
        ),
    )
    history_item_1 = dict(
        workflowtask=wftask1,
        status=WorkflowTaskStatusTypeV2.FAILED,
        parallelization=dict(),
    )
    _DatasetHistoryItemV2(**history_item_1)

    wftask2 = dict(
        id=1,
        workflow_id=1,
        order=0,
        input_filters=dict(attributes=dict(), types=dict()),
        task_legacy_id=1,
        task_legacy=dict(
            id=1,
            input_type="image",
            output_type="zarr",
            command="echo",
            name="name",
            source="source",
        ),
    )
    history_item_2 = dict(
        workflowtask=wftask2,
        status=WorkflowTaskStatusTypeV2.FAILED,
        parallelization=dict(),
    )
    _DatasetHistoryItemV2(**history_item_2)