Update delta Dockerfile
(Cherry-pick of c3020e1 to branch-2.0)

- update the delta Dockerfile to install twine, setuptools, and wheel
- remove the `pip install` of the above packages from the pypi tests that are run as part of the python unit tests
- do not uninstall pyspark during the unit tests. **Without this change**, installing the local delta-spark artifact would re-install pyspark, which is not what we want.
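The last bullet's reasoning can be sketched in Python (a minimal sketch; `run_cmd` here is an illustrative stand-in for the helper defined in python/run-tests.py, not the actual implementation):

```python
import subprocess

def run_cmd(cmd, stream_output=True):
    # Illustrative stand-in for the run_cmd helper in python/run-tests.py:
    # run the command and raise if it exits with a non-zero status.
    return subprocess.run(cmd, check=True, capture_output=not stream_output)

# After this change, only delta-spark is uninstalled; pyspark stays in place.
# Installing the freshly built wheel then satisfies the pyspark requirement
# with the already-installed copy instead of re-resolving it from PyPI:
#
#   run_cmd(["pip3", "uninstall", "--yes", "delta-spark"])
#   run_cmd(["pip3", "install", "dist/delta_spark-<version>-py3-none-any.whl"])
```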

GitOrigin-RevId: 9971993e0de5e35e24293387177305039960671c
scottsand-db authored and vkorukanti committed Jan 6, 2023
1 parent e991f71 commit 955cff4
Showing 3 changed files with 20 additions and 9 deletions.
4 changes: 4 additions & 0 deletions .github/workflows/test.yaml
@@ -45,6 +45,10 @@ jobs:
         pipenv run pip install flake8==3.5.0 pypandoc==1.3.3
         pipenv run pip install importlib_metadata==3.10.0
         pipenv run pip install mypy==0.910
+        pipenv run pip install cryptography==37.0.4
+        pipenv run pip install twine==4.0.1
+        pipenv run pip install wheel==0.33.4
+        pipenv run pip install setuptools==41.0.1
     - name: Run Scala/Java and Python tests
       run: |
         pipenv run python run-tests.py
10 changes: 10 additions & 0 deletions Dockerfile
@@ -24,6 +24,16 @@ RUN pip install mypy==0.910

 RUN pip install importlib_metadata==3.10.0

+RUN pip install cryptography==37.0.4
+
+# We must install cryptography before twine. Else, twine will pull a newer version of
+# cryptography that requires a newer version of Rust and may break tests.
+RUN pip install twine==4.0.1
+
+RUN pip install wheel==0.33.4
+
+RUN pip install setuptools==41.0.1
+
# Do not add any non-deterministic changes (e.g., copy from files
# from repo) in this Dockerfile, so that the docker image
# generated from this can be reused across builds.
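For reference, the four pins added above can be pulled out of the Dockerfile text programmatically (a minimal sketch with the snippet inlined for illustration; nothing here is part of the actual build):

```python
import re

# The RUN lines added in this change, inlined for illustration.
dockerfile_snippet = """\
RUN pip install cryptography==37.0.4
RUN pip install twine==4.0.1
RUN pip install wheel==0.33.4
RUN pip install setuptools==41.0.1
"""

# Ordered pins; dicts preserve insertion order, so cryptography comes first,
# matching the requirement that it be installed before twine.
pins = dict(re.findall(r"pip install (\S+)==(\S+)", dockerfile_snippet))
print(list(pins))     # ['cryptography', 'twine', 'wheel', 'setuptools']
print(pins["twine"])  # 4.0.1
```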
15 changes: 6 additions & 9 deletions python/run-tests.py
@@ -115,13 +115,13 @@ def run_mypy_tests(root_dir):

 def run_pypi_packaging_tests(root_dir):
     """
-    We want to test that the PyPi artifact for this delta version can be generated,
+    We want to test that the delta-spark PyPi artifact for this delta version can be generated,
     locally installed, and used in python tests.
-    We will uninstall any existing local delta PyPi artifact.
-    We will generate a new local delta PyPi artifact.
+    We will uninstall any existing local delta-spark PyPi artifact.
+    We will generate a new local delta-spark PyPi artifact.
     We will install it into the local PyPi repository.
-    And then we will run relevant python tests to ensure everything works as exepcted.
+    And then we will run relevant python tests to ensure everything works as expected.
     """
     print("##### Running PyPi Packaging tests #####")

@@ -130,10 +130,7 @@ def run_pypi_packaging_tests(root_dir):
         version = fd.readline().split('"')[1]

     # uninstall packages if they exist
-    run_cmd(["pip3", "uninstall", "--yes", "delta-spark", "pyspark"], stream_output=True)
-
-    # install helper pip packages
-    run_cmd(["pip3", "install", "wheel", "twine", "setuptools", "--upgrade"], stream_output=True)
+    run_cmd(["pip3", "uninstall", "--yes", "delta-spark"], stream_output=True)

     wheel_dist_dir = path.join(root_dir, "dist")

@@ -152,7 +149,7 @@ def run_pypi_packaging_tests(root_dir):
     version_formatted = version.replace("-", "_")
     delta_whl_name = "delta_spark-" + version_formatted + "-py3-none-any.whl"

-    # this will install delta-spark-$version and pyspark
+    # this will install delta-spark-$version
     install_whl_cmd = ["pip3", "install", path.join(wheel_dist_dir, delta_whl_name)]
     run_cmd(install_whl_cmd, stream_output=True)
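The wheel file name above is derived purely from the version string read out of version.sbt; the derivation can be traced in isolation (the version.sbt line below is hypothetical, chosen to show the dash-to-underscore rewrite):

```python
# Hypothetical version.sbt first line, mirroring how run-tests.py reads it.
line = 'version in ThisBuild := "2.0.1-SNAPSHOT"\n'

version = line.split('"')[1]                   # "2.0.1-SNAPSHOT"
version_formatted = version.replace("-", "_")  # wheel names use "_", not "-"
delta_whl_name = "delta_spark-" + version_formatted + "-py3-none-any.whl"
print(delta_whl_name)  # delta_spark-2.0.1_SNAPSHOT-py3-none-any.whl
```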

