
chore: use start-stack command for e2e tests, replace Makefile by poetry-exec-plugin (DEV-1597) #279

Merged · 19 commits · Jan 16, 2023
17 changes: 5 additions & 12 deletions .github/workflows/daily-test.yml
```diff
@@ -1,29 +1,22 @@
 name: Daily test

 on:
-    schedule:
-        - cron: '0 7 * * *'
+  schedule:
+    - cron: '0 7 * * *'

 jobs:
-  tests:
-    name: daily-tests
+  daily-test:
     runs-on: ubuntu-latest
     steps:
-      - name: Checkout source
+      - name: Check out repo
        uses: actions/checkout@v3
        with:
          fetch-depth: 1
-      - name: Set up JDK 17
-        uses: actions/setup-java@v3
-        with:
-          distribution: "temurin"
-          java-version: "17"
      - name: Set up Python 3.9
        uses: actions/setup-python@v4
        with:
          python-version: 3.9
      - name: Install dependencies
-        run: |
-          make install
+        run: make install
      - name: Run e2e tests
        run: make test-end-to-end-ci
```
```diff
@@ -1,42 +1,37 @@
-name: Release
+name: Publish to PyPI

 on:
   release:
     types: [released]

-# set environment variables
 env:
   TOKEN: ${{ secrets.GH_TOKEN }}
   POETRY_HTTP_BASIC_PYPI_USERNAME: ${{ secrets.PYPI_USER }}
   POETRY_HTTP_BASIC_PYPI_PASSWORD: ${{ secrets.PYPI_PW }}

 jobs:
-  # release to PyPI
-  release-pypi:
-    name: Release to PyPI
+  publish-to-pypi:
     runs-on: ubuntu-latest
     steps:
-      # check out repo
-      - uses: actions/checkout@v3
+      - name: Check out repo
+        uses: actions/checkout@v3
        with:
          fetch-depth: 1
-      # install python
-      - name: Set up Python
+      - name: Set up Python 3.9
        uses: actions/setup-python@v4
        with:
          python-version: '3.9'
-      # install python dependencies
      - name: Install dependencies
-        run: |
-          make install
-      # release new version to PyPI
-      - run: |
-          make dist
-          make upload
+        run: make install
+      - name: publish new version to PyPI
+        run: |
+          rm -rf dist/ build/
+          poetry build # generate distribution package
+          poetry publish # upload distribution package to PyPI

   notification:
     name: Google chat notification about release and published version
-    needs: [release-pypi]
+    needs: [publish-to-pypi]
     runs-on: ubuntu-latest
     steps:
       - name: Send notification to google chat room "DSP releases"
```
5 changes: 1 addition & 4 deletions .github/workflows/release-please.yml
```diff
@@ -1,18 +1,15 @@
 name: Release-Please

-# triggered when a PR is merged into main
 on:
   pull_request:
     types:
       - closed

 jobs:
   release-please:
+    # triggered when a PR is merged into main
     if: github.event.pull_request.merged == true
-    # Automate releases with Conventional Commit Messages as Pull Requests are merged into "main" branch
     name: Prepare next release
     runs-on: ubuntu-latest
-    # release only if tests pass
     steps:
       - uses: google-github-actions/release-please-action@v3
         with:
```
22 changes: 8 additions & 14 deletions .github/workflows/tests-on-push.yml
```diff
@@ -6,28 +6,22 @@ on:
       - main

 jobs:
-  tests:
-    name: tests
+  tests-on-push:
     runs-on: ubuntu-latest
-    strategy:
-      matrix:
-        target: [ 'test-end-to-end-ci', 'test-unittests', 'docs-build' ]
     steps:
      - name: Checkout source
        uses: actions/checkout@v3
        with:
          fetch-depth: 1
-      - name: Set up JDK 17
-        uses: actions/setup-java@v3
-        with:
-          distribution: "temurin"
-          java-version: "17"
      - name: Set up Python 3.9
        uses: actions/setup-python@v4
        with:
          python-version: 3.9
      - name: Install dependencies
-        run: |
-          make install
-      - name: Run tests
-        run: make ${{ matrix.target }}
+        run: make install
+      - name: unittests
+        run: make test-unittests
+      - name: e2e tests
+        run: make test-end-to-end-ci
+      - name: build docs
+        run: poetry run mkdocs build
```
48 changes: 10 additions & 38 deletions Makefile
```diff
@@ -7,33 +7,6 @@ CURRENT_DIR := $(shell dirname $(realpath $(firstword $(MAKEFILE_LIST))))
 # Make targets for dsp-tools
 ############################

-.PHONY: dsp-stack
-dsp-stack: ## clone the dsp-api git repository and run the dsp-stack
-	@mkdir -p .tmp
-	@git clone --branch v24.0.8 --single-branch https://github.com/dasch-swiss/dsp-api.git .tmp/dsp-stack
-	$(MAKE) -C .tmp/dsp-stack env-file
-	$(MAKE) -C .tmp/dsp-stack init-db-test
-	$(MAKE) -C .tmp/dsp-stack stack-up
-	$(MAKE) -C .tmp/dsp-stack stack-logs-api-no-follow
-
-.PHONY: stack-down
-stack-down: ## stop dsp-stack and remove the cloned dsp-api repository
-	$(MAKE) -C .tmp/dsp-stack stack-down-delete-volumes
-	@test -x .tmp && rm -rf .tmp
-
-.PHONY: dist
-dist: ## generate distribution package
-	@rm -rf dist/ build/
-	poetry build
-
-.PHONY: upload
-upload: ## upload distribution package to PyPI
-	poetry publish
-
-.PHONY: docs-build
-docs-build: ## build docs into the local 'site' folder
-	poetry run mkdocs build
-
 .PHONY: docs-serve
 docs-serve: ## serve docs for local viewing
 	poetry run mkdocs serve --dev-addr=localhost:7979
```
```diff
@@ -44,25 +17,24 @@ install: ## install Poetry, which in turn installs the dependencies and makes an
 	poetry install

 .PHONY: test
-test: dsp-stack ## run all tests located in the "test" folder (intended for local usage)
-	-poetry run pytest test/ # ignore errors, continue anyway with stack-down (see https://www.gnu.org/software/make/manual/make.html#Errors)
-	$(MAKE) stack-down
+test: ## run all tests located in the "test" folder
+	poetry run dsp-tools start-stack --no-prune
+	-poetry run pytest test/ # ignore errors, continue anyway with stop-stack (see https://www.gnu.org/software/make/manual/make.html#Errors)
+	poetry run dsp-tools stop-stack
```
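The `-` prefix in the `test` recipe tells make to ignore a failing pytest exit code so that `stop-stack` always runs. The same start / test / always-stop control flow can be sketched in Python; the `runner` parameter is a hypothetical stand-in added here so the logic can be shown (and tested) without Poetry or a DSP stack installed:

```python
import subprocess
from typing import Callable, Sequence


def _default_runner(cmd: Sequence[str]) -> int:
    """Run one command and return its exit code (like one Makefile recipe line)."""
    return subprocess.run(cmd).returncode


def run_tests_with_stack(
    start: Sequence[str],
    tests: Sequence[str],
    stop: Sequence[str],
    runner: Callable[[Sequence[str]], int] = _default_runner,
) -> int:
    """Start the stack, run the tests, and always stop the stack afterwards.

    Mirrors the make semantics: the '-' prefix makes make ignore the pytest
    exit code, so the teardown runs even when tests fail.
    """
    if runner(start) != 0:
        raise RuntimeError("could not start the DSP stack")
    try:
        return runner(tests)  # exit code is reported, but never skips the teardown
    finally:
        runner(stop)  # equivalent of 'poetry run dsp-tools stop-stack'
```

Called with the commands from the Makefile, this would be `run_tests_with_stack(["poetry", "run", "dsp-tools", "start-stack", "--no-prune"], ["poetry", "run", "pytest", "test/"], ["poetry", "run", "dsp-tools", "stop-stack"])`.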
> **Collaborator:** you could also look into the poetry exec plugin to automate poetry workflows, which might spare you even more of the makefile
>
> **Collaborator:** (for a discussion, see python-poetry/poetry#241)
>
> **Collaborator (Author):** Good idea, it would be cool to get rid of the Makefile, esp. because there are only 2 commands left in it, each a 1-liner. But I'm a bit reluctant to use a plugin which is still pre-1.0... What do you think?
>
> **Collaborator:** I think it would be worth trying, even if it's only to see how well it works. In any case, it would be easy enough to revert to having the make file (and if it's only two one-liners, you could just as well put them into the readme and execute them manually). Speaking of those two commands: in the make clean command, are most of them still needed? I would guess that both mkdocs (for the docs) and poetry (for building) take care of removing old files pretty well.
>
> **Collaborator (Author):** OK, done :-) There is only one exec-script remaining, and that one is only used by me. This allowed me to keep the documentation of the poetry-exec-plugin minimal.
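For context on the plugin discussed here: poetry-exec-plugin declares its scripts in `pyproject.toml`. A minimal sketch of what the remaining exec-script could look like, with the command mirroring the `docs-serve` Makefile target; the table name and key layout are an assumption to verify against the plugin's README, not taken from this PR:

```toml
# hypothetical pyproject.toml fragment - verify keys against poetry-exec-plugin's README
[tool.poetry-exec-plugin.commands]
docs-serve = "mkdocs serve --dev-addr=localhost:7979"
```

Such a script would then be invoked with `poetry exec docs-serve`.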


```diff
 .PHONY: test-no-stack
 test-no-stack: ## run all tests located in the "test" folder, without starting the stack (intended for local usage)
 	poetry run pytest test/

 .PHONY: test-end-to-end
-test-end-to-end: dsp-stack ## run e2e tests (intended for local usage)
-	-poetry run pytest test/e2e/ # ignore errors, continue anyway with stack-down (see https://www.gnu.org/software/make/manual/make.html#Errors)
-	$(MAKE) stack-down
+test-end-to-end: ## run e2e tests
+	poetry run dsp-tools start-stack --no-prune
+	-poetry run pytest test/e2e/ # ignore errors, continue anyway with stop-stack (see https://www.gnu.org/software/make/manual/make.html#Errors)
+	poetry run dsp-tools stop-stack

 .PHONY: test-end-to-end-ci
-test-end-to-end-ci: dsp-stack ## run e2e tests (intended for GitHub CI, where it isn't possible nor necessary to remove .tmp)
-
-.PHONY: test-end-to-end-no-stack
-test-end-to-end-no-stack: ## run e2e tests without starting the dsp-stack (intended for local usage)
+test-end-to-end-ci: ## run e2e tests (intended for GitHub CI, where the errors must not be ignored, and the stack doesn't have to be stopped)
+	poetry run dsp-tools start-stack --no-prune
 	poetry run pytest test/e2e/
```

.PHONY: test-unittests
6 changes: 3 additions & 3 deletions docs/developers-packaging.md
````diff
@@ -128,7 +128,7 @@ steps below.
 Generate the distribution package:

 ```bash
-make dist
+poetry build
 ```

 You can install the package locally from the dist:
@@ -137,8 +137,8 @@ You can install the package locally from the dist:
 ```bash
 pip install dist/some_name.whl
 ```

-Upload package works also with `make`:
+Upload the package:

 ```bash
-make upload
+poetry publish
 ```
````
52 changes: 28 additions & 24 deletions docs/index.md
```diff
@@ -2,46 +2,50 @@

 # DSP-TOOLS documentation

-DSP-TOOLS is a Python package with a command line interface that helps you interact with a DSP server. The DSP server
-you interact with can be on a remote server, or on your local machine. The two main tasks that DSP-TOOLS serves for are:
+DSP-TOOLS is a Python package with a command line interface that helps you interact with a DSP server. A DSP server
+is a remote server or a local machine on which the [DSP-API](https://github.com/dasch-swiss/dsp-api) is running. The
+two main tasks of DSP-TOOLS are:

-**Create a project with its data model(s), described in a JSON file, on a DSP server**
-In order to archive your data on the DaSCH Service Platform, you need a data model that describes your data.
-The data model is defined in a JSON project definition file which has to be transmitted to the DSP server. If the DSP
-server is aware of the data model for your project, conforming data can be uploaded into the DSP repository.
-
-**Upload data, described in an XML file, to a DSP server that has a project with a matching data model**
-Sometimes, data is added in large quantities. Therefore, DSP-TOOLS allows you to perform bulk imports of your
-data. In order to do so, the data has to be described in an XML file. DSP-TOOLS is able to read the XML file and upload
-all data to the DSP server.
+- **Create a project with its data model(s), described in a JSON file, on a DSP server**
+  In order to archive your data on the DaSCH Service Platform, you need a data model that describes your data.
+  The data model is defined in a JSON project definition file which has to be transmitted to the DSP server. If the
+  DSP server is aware of the data model for your project, conforming data can be uploaded into the DSP repository.
+- **Upload data, described in an XML file, to a DSP server that has a project with a matching data model**
+  Sometimes, data is added in large quantities. Therefore, DSP-TOOLS allows you to perform bulk imports of your
+  data. In order to do so, the data has to be described in an XML file. DSP-TOOLS is able to read the XML file and
+  upload all data to the DSP server.

 All of DSP-TOOLS' functionality revolves around these two basic tasks.

 DSP-TOOLS provides the following functionalities:

-- [`dsp-tools create`](./dsp-tools-usage.md#create-a-project-on-a-dsp-server) creates the project with its data model(s)
-  on a DSP server from a JSON file.
-- [`dsp-tools get`](./dsp-tools-usage.md#get-a-project-from-a-dsp-server) reads a project with its data model(s) from
+- [`dsp-tools create`](https://docs.dasch.swiss/latest/DSP-TOOLS/dsp-tools-usage/#create-a-project-on-a-dsp-server)
+  creates the project with its data model(s) on a DSP server from a JSON file.
+- [`dsp-tools get`](https://docs.dasch.swiss/latest/DSP-TOOLS/dsp-tools-usage/#get-a-project-from-a-dsp-server) reads a project with its data model(s) from
   a DSP server and writes it into a JSON file.
-- [`dsp-tools xmlupload`](./dsp-tools-usage.md#upload-data-to-a-dsp-server) uploads data from an XML file (bulk
+- [`dsp-tools xmlupload`](https://docs.dasch.swiss/latest/DSP-TOOLS/dsp-tools-usage/#upload-data-to-a-dsp-server)
+  uploads data from an XML file (bulk
   data import) and writes the mapping from internal IDs to IRIs into a local file.
-- [`dsp-tools excel2json`](./dsp-tools-usage.md#create-a-json-project-file-from-excel-files) creates an entire JSON
+- [`dsp-tools excel2json`](https://docs.dasch.swiss/latest/DSP-TOOLS/dsp-tools-usage/#create-a-json-project-file-from-excel-files)
+  creates an entire JSON
   project file from a folder with Excel files in it.
-- [`dsp-tools excel2lists`](./dsp-tools-usage.md#create-the-lists-section-of-a-json-project-file-from-excel-files)
+- [`dsp-tools excel2lists`](https://docs.dasch.swiss/latest/DSP-TOOLS/dsp-tools-usage/#create-the-lists-section-of-a-json-project-file-from-excel-files)
   creates the "lists" section of a JSON project file from one or several Excel files. The resulting section can be
   integrated into a JSON project file and then be uploaded to a DSP server with `dsp-tools create`.
-- [`dsp-tools excel2resources`](./dsp-tools-usage.md#create-the-resources-section-of-a-json-project-file-from-an-excel-file)
+- [`dsp-tools excel2resources`](https://docs.dasch.swiss/latest/DSP-TOOLS/dsp-tools-usage/#create-the-resources-section-of-a-json-project-file-from-an-excel-file)
   creates the "resources" section of a JSON project file from an Excel file. The resulting section can be integrated
   into a JSON project file and then be uploaded to a DSP server with `dsp-tools create`.
-- [`dsp-tools excel2properties`](./dsp-tools-usage.md#create-the-properties-section-of-a-json-project-file-from-an-excel-file)
+- [`dsp-tools excel2properties`](https://docs.dasch.swiss/latest/DSP-TOOLS/dsp-tools-usage/#create-the-properties-section-of-a-json-project-file-from-an-excel-file)
   creates the "properties" section of a JSON project file from an Excel file. The resulting section can be integrated
   into a JSON project file and then be uploaded to a DSP server with `dsp-tools create`.
-- [`dsp-tools excel2xml`](./dsp-tools-usage.md#create-an-xml-file-from-excelcsv) transforms a data source to XML if it
-  is already structured according to the DSP specifications.
-- [The module `excel2xml`](./dsp-tools-usage.md#use-the-module-excel2xml-to-convert-a-data-source-to-xml) provides helper
+- [`dsp-tools excel2xml`](https://docs.dasch.swiss/latest/DSP-TOOLS/dsp-tools-usage/#create-an-xml-file-from-excelcsv)
+  transforms a data source to XML if it is already structured according to the DSP specifications.
+- [The module `excel2xml`](https://docs.dasch.swiss/latest/DSP-TOOLS/dsp-tools-usage/#use-the-module-excel2xml-to-convert-a-data-source-to-xml) provides helper
   methods that can be used in a Python script to convert data from a tabular format into XML.
-- [`dsp-tools id2iri`](./dsp-tools-usage.md#replace-internal-ids-with-iris-in-xml-file)
+- [`dsp-tools id2iri`](https://docs.dasch.swiss/latest/DSP-TOOLS/dsp-tools-usage/#replace-internal-ids-with-iris-in-xml-file)
   takes an XML file for bulk data import and replaces referenced internal IDs with IRIs. The mapping has to be provided
   with a JSON file.
-- [`dsp-tools start-stack / stop-stack`](./dsp-tools-usage.md#start-a-dsp-stack-on-your-local-machine)
+- [`dsp-tools start-stack / stop-stack`](https://docs.dasch.swiss/latest/DSP-TOOLS/dsp-tools-usage/#start-a-dsp-stack-on-your-local-machine)
   assist you in running a DSP stack on your local machine.
```
6 changes: 5 additions & 1 deletion src/dsp_tools/utils/stack_handling.py
```diff
@@ -71,6 +71,7 @@ def start_stack(
         time.sleep(1)

     # inside fuseki, create the "knora-test" repository
+    # (same behaviour as dsp-api/webapi/target/docker/stage/opt/docker/scripts/fuseki-init-knora-test.sh)
     repo_template = requests.get(f"{url_prefix}webapi/scripts/fuseki-repository-config.ttl.template").text
     repo_template = repo_template.replace("@REPOSITORY@", "knora-test")
     response = requests.post(
@@ -83,6 +84,7 @@
                 "running already?")

     # load some basic ontos and data into the repository
+    # (same behaviour as dsp-api/webapi/target/docker/stage/opt/docker/scripts/fuseki-init-knora-test.sh)
     graph_prefix = "http://0.0.0.0:3030/knora-test/data?graph="
     ttl_files = [
         ("knora-ontologies/knora-admin.ttl", "http://www.knora.org/ontology/knora-admin"),
@@ -91,7 +93,9 @@
         ("knora-ontologies/standoff-data.ttl", "http://www.knora.org/data/standoff"),
         ("knora-ontologies/salsah-gui.ttl", "http://www.knora.org/ontology/salsah-gui"),
         ("test_data/all_data/admin-data-minimal.ttl", "http://www.knora.org/data/admin"),
-        ("test_data/all_data/permissions-data-minimal.ttl", "http://www.knora.org/data/permissions")
+        ("test_data/all_data/permissions-data-minimal.ttl", "http://www.knora.org/data/permissions"),
+        ("test_data/ontologies/anything-onto.ttl", "http://www.knora.org/ontology/0001/anything"),
+        ("test_data/all_data/anything-data.ttl", "http://www.knora.org/data/0001/anything")
     ]
     for ttl_file, graph in ttl_files:
         ttl_text = requests.get(url_prefix + ttl_file).text
```
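The loop at the end of this hunk fetches each Turtle file and posts it into a named graph of the Fuseki data endpoint. A self-contained sketch of that upload pattern, where `fetch` and `post` are injectable stand-ins (not part of dsp-tools) so the flow can be shown and exercised without a running stack:

```python
from typing import Callable, Iterable


def load_ttl_files(
    url_prefix: str,
    graph_prefix: str,
    ttl_files: Iterable[tuple[str, str]],
    fetch: Callable[[str], str],
    post: Callable[[str, str], int],
) -> None:
    """Fetch each TTL file (relative to url_prefix) and POST it into its named graph.

    Mirrors the structure of the start_stack() loop: the graph IRI is appended
    to the '?graph=' query of the Fuseki data endpoint.
    """
    for ttl_file, graph in ttl_files:
        ttl_text = fetch(url_prefix + ttl_file)  # raw Turtle, e.g. fetched from GitHub
        status = post(graph_prefix + graph, ttl_text)  # upload into the named graph
        if status != 200:
            raise RuntimeError(f"Cannot load {ttl_file} into graph {graph}")
```

In `start_stack()` itself, `fetch` and `post` correspond to `requests.get(...).text` and a `requests.post(...)` with Turtle content.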