Add robots for acceptance level testing (jupyter-server#252)
* add robots

* subclass jupyter robots

* working robots

* ignore robots logic for test coverage score

* fix command for running robots in make file

* add dependency on jupyter robot framework

* try a different test image

* setup chromedriver

* try installing chrome

* install chromedriver from source

* install chrome

* try different image

* remove webdriver

* top level docker image in rio

* see if new docker image is used

* build chrome and chromedriver from scratch

* different installing command for chrome

* wrong package name in rio

* just install driver

* install webdriver with new docker image

* remove robots from rio

* update manifest to robots
Zsailer authored and GitHub Enterprise committed Dec 4, 2021
1 parent f3cf1a1 commit 5676e2f
Showing 14 changed files with 370 additions and 181 deletions.
1 change: 1 addition & 0 deletions .coveragerc
@@ -1,4 +1,5 @@
[run]
omit = data_studio_jupyter_extensions/tests/*
data_studio_jupyter_extensions/config/*
data_studio_jupyter_extensions/robots/*
conftest.py
2 changes: 2 additions & 0 deletions MANIFEST.in
@@ -22,6 +22,7 @@ graft data_studio_jupyter_extensions/labextension
recursive-include data_studio_jupyter_extensions *.html
recursive-include data_studio_jupyter_extensions *.png
recursive-include data_studio_jupyter_extensions *.yaml
recursive-include data_studio_jupyter_extensions *.robot
recursive-include docs *.md

# Javascript files
@@ -33,6 +34,7 @@ prune lib
prune binder
prune builder.d
exclude Dockerfile
graft atests

# Patterns to exclude from any directory
global-exclude *~
9 changes: 6 additions & 3 deletions Makefile
@@ -1,12 +1,12 @@
# Makefile for data_studio_jupyter_extensions

.PHONY: install install-dev build watch run-remote run-local test version dist build-docker checkout-docker-deps run-docker-local run-docker-remote run-docker-local-dev test-python
.PHONY: install install-dev build watch run-remote run-local test version dist build-docker checkout-docker-deps run-docker-local run-docker-remote run-docker-local-dev test-python robots

install:
pip install -q ".[test]" --index-url https://pypi.apple.com/simple
pip install -q ".[test,robots]" --index-url https://pypi.apple.com/simple

install-dev:
pip install -q -e ".[test]" --index-url https://pypi.apple.com/simple
pip install -q -e ".[test,robots]" --index-url https://pypi.apple.com/simple
jupyter server extension enable data_studio_jupyter_extensions
jupyter labextension develop . --overwrite
# Build JupyterLab to pick up source maps
@@ -19,6 +19,9 @@ build:
watch:
jlpm run watch

robots:
robot --log NONE --report NONE --output NONE --name "Robot Tests" atests

test-python:
pytest --cov=data_studio_jupyter_extensions
coverage report --fail-under=80
178 changes: 2 additions & 176 deletions README.md
@@ -2,180 +2,6 @@

**JupyterLab, "Data Studio style"**

[CI Link](https://rio.apple.com/projects/aci-pie-data-studio-data-studio-jupyter-extensions?search=&sortBy=name)
[RIO CI/CD Page](https://rio.apple.com/projects/aci-pie-data-studio-data-studio-jupyter-extensions?search=&sortBy=name)

The repository exposes a single entry-point, `jupyter datastudio`.

When you run this command, you get an enhanced JupyterLab experience built specifically for Data Studio customers.
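
For example, once the package is installed (see "Getting started" below), a typical invocation looks like this:

```bash
# Launch JupyterLab Data Studio; the --mode flag (described below) selects
# where JupyterLab and the Notebook Service run.
jupyter datastudio --mode local-local
```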

## JupyterLab Data Studio Modes

JupyterLab Data Studio depends on the Data Studio Team's [Notebook Service](https://github.pie.apple.com/pie-data-studio/notebook-service). The Notebook Service is designed to run on a Kubernetes cluster, so JupyterLab DS is meant to run side-by-side with Notebook Service in that cluster. However, JupyterLab DS can also run locally.

JupyterLab Data Studio has "modes": preset configurations that tailor the application to wherever you're running it. These modes include:

- `local-local`: local JupyterLab DS pointing at a running, [mock Notebook Service](https://github.pie.apple.com/pie-data-studio/notebook_service_gateway), running kernels locally.
- `local-cluster`: local JupyterLab DS pointing at a remote Notebook Service running on a Kubernetes Cluster.
- `cluster-cluster`: remote JupyterLab DS pointing at a remote Notebook Service.

## Getting started

Install this package from Apple's PyPI repository:

```bash
pip install data_studio_jupyter_extensions --index-url https://pypi.apple.com/simple
```

If you'd like to run a fully local version of JupyterLab Data Studio, install the (mock) Notebook Service package, [notebook_service_gateway](https://github.pie.apple.com/pie-data-studio/notebook_service_gateway). Launch that application in a separate terminal window.

Then run JupyterLab Data Studio in `local-local` mode:

```
jupyter datastudio --mode local-local
```

JupyterLab Data Studio can be run against a remote Notebook Service by simply switching the mode:

```
jupyter datastudio --mode local-cluster
```

When running the application against a remote Notebook Service, JupyterLab Data Studio expects the following variables in the current environment:

```
DATASTUDIO_PROJECT_ID=... # Get from Data Studio Projects page
DATASTUDIO_NOTEBOOK_ID=... # Get from Data Studio Notebook Servers page.
DATASTUDIO_UI_URL=https://ds-int.apple.com/ # Data Studio's UI page.
DATASTUDIO_API_URL=https://notebook-service-dev.us-east-1a.app.apple.com/api/v1 # Data Studio's API URL
DS_NAMESPACE=spark-data-studio-test-us-west-3a-dev1 # Get from Kubernetes cluster. Namespace where kernels are running.
ISSUER=https://iam.corp.apple.com
KEY_URL=https://iam.corp.apple.com/oauth2/v3/certs
AUDIENCE=notebook-server-int
CLIENT_ID=d314ef93-a7b3-4451-b68d-51bd395ba7f2
HBPORT=18525
SHELLPORT=27965
IOPUBPORT=6232
STDINPORT=10270
CONTROLPORT=25232
API_TOKEN=... # Get from Data Studio page. This is the cookie value for `datastudio-session-token`.
IAS_CLIENT_SECRET=... # [SECRET] Get from Data Studio UI Team.
IAS_CLIENT_ID=... # [SECRET] Get from Data Studio UI Team
```

## Local development

Install using the Makefile. This installs a local, development version of the Python package and TypeScript extensions.

```
make install-dev
```

To run against a remote Notebook Service

```bash
int:sc
export API_TOKEN="<api token from ds int ui>"
make run-remote
```

To use with local kernels

```bash
make run-local
```

To update the lab extension

```bash
make build
```

To watch the lab extension

```bash
make watch
```

## Development

Run the tests using pytest:

```bash
pytest
```

You can invoke the pre-commit hook manually at any time with

```bash
pre-commit run
```

To upgrade the package version, run:

```bash
tbump --only-patch <version>
```

## Docker Development

### Build

First, build the docker container:

```
make build-docker
```

The Docker image comes with multiple modes.

### Local Mode

Local mode means you're running a "local" Notebook Service, which is useful for development. We recommend the mock Notebook Service Gateway for this. If you're using it, start the Docker container against the local service with:

```bash
make run-docker-local
```

To run with local checkouts of first party extensions, run:

```bash
make checkout-deps
```

Check out the desired branch(es) and make your edits in the `checkout/` folder.

Then, start the local checkout version of `notebook-service` in another terminal and run:

```bash
make run-docker-local-dev
```

### Remote Mode

Remote mode means "remote Notebook Service". Multiple environment variables must be set in the running Docker container to ensure the Data Studio JupyterLab app can properly communicate with this remote service.

```
DATASTUDIO_PROJECT_ID=... # Get from Data Studio Projects page
DATASTUDIO_NOTEBOOK_ID=... # Get from Data Studio Notebook Servers page.
DATASTUDIO_UI_URL=https://ds-int.apple.com/ # Data Studio's UI page.
DATASTUDIO_API_URL=https://notebook-service-dev.us-east-1a.app.apple.com/api/v1 # Data Studio's API URL
DS_NAMESPACE=spark-data-studio-test-us-west-3a-dev1 # Get from Kubernetes cluster. Namespace where kernels are running.
ISSUER=https://iam.corp.apple.com
KEY_URL=https://iam.corp.apple.com/oauth2/v3/certs
AUDIENCE=notebook-server-int
CLIENT_ID=d314ef93-a7b3-4451-b68d-51bd395ba7f2
HBPORT=18525
SHELLPORT=27965
IOPUBPORT=6232
STDINPORT=10270
CONTROLPORT=25232
API_TOKEN=... # Get from Data Studio page. This is the cookie value for `datastudio-session-token`.
IAS_CLIENT_SECRET=... # [SECRET] Get from Data Studio UI Team.
IAS_CLIENT_ID=... # [SECRET] Get from Data Studio UI Team
```

If these environment variables are stored in a local file named `myenv.sh` (DO NOT check this file into git), start the Docker container against a remote Notebook Service using:

```
make run-docker-remote
```
This repository provides a collection of JupyterLab plugins that, together, create the Data Studio Notebooks Experience.
28 changes: 28 additions & 0 deletions atests/TestNotebookService.robot
@@ -0,0 +1,28 @@
*** Settings ***

Library    data_studio_notebook_service_gateway.robots.NotebookService
Library    data_studio_jupyter_extensions.robots.DataStudioNotebooks

*** Keyword ***

Launch Data Studio Notebooks
    Setup Notebook Service
    ${server} =    Start New Jupyter Server
    Wait for Jupyter Server to be ready    ${server}
    Authenticate with Jupyter Server    ${server}

Teardown Data Studio Notebooks
    Terminate all Jupyter Servers
    Teardown Notebook Service

*** Test Cases ***

Talk to Notebook Service
    [Setup]    Launch Data Studio Notebooks
    Open JupyterLab    headlesschrome
    Launch a new JupyterLab Document    Python 3 (ipykernel)
    Wait Until JupyterLab Kernel Is Idle
    Add and Run JupyterLab Code Cell    print("hello world")
    Wait Until JupyterLab Kernel Is Idle
    Page Should Contain    hello world
    [Teardown]    Teardown Data Studio Notebooks
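
To run this suite locally, a minimal sketch based on the targets added in this commit (it assumes a Chrome/chromedriver install that Robot Framework can drive as `headlesschrome`):

```bash
# Install the package with the new "robots" extra, then run the acceptance tests.
pip install -e ".[test,robots]" --index-url https://pypi.apple.com/simple
# Same command the new `make robots` target runs.
robot --log NONE --report NONE --output NONE --name "Robot Tests" atests
```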
2 changes: 2 additions & 0 deletions build.sh
@@ -3,6 +3,7 @@ set -ex

cd /workspace


echo "++++++++++++++++++++CI SETUP+++++++++++++++++++++++++++++++++++"
export NODE_OPTIONS="--max-old-space-size=4096"
export NPM_CONFIG_CAFILE="/etc/ssl/certs/ca-certificates.crt"
@@ -12,6 +13,7 @@ export PIP_INDEX_URL=https://pypi.apple.com/simple
export PIP_CACHE_DIR=${CI_CACHE_DIR}/pip
export PRE_COMMIT_HOME=${CI_CACHE_DIR}/pre-commit


curl https://artifacts.apple.com/conda-dist/miniconda/Miniconda3-py38_4.8.3-Linux-x86_64.sh --output miniconda.sh
bash miniconda.sh -b -p ./miniconda
rm miniconda.sh
4 changes: 4 additions & 0 deletions data_studio_jupyter_extensions/app.py
@@ -6,6 +6,8 @@
from jupyter_core.paths import jupyter_runtime_dir
from jupyter_server.extension.application import ExtensionApp
from jupyter_server.extension.application import ExtensionAppJinjaMixin
from jupyter_server.serverapp import aliases as jpserver_aliases
from jupyter_server.serverapp import flags as jpserver_flags
from traitlets import default
from traitlets import TraitError
from traitlets import Type
@@ -41,6 +43,7 @@
]

aliases = {name: f"DataStudioJupyterExtensions.{name}" for name in alias_list}
aliases.update(jpserver_aliases)


class DataStudioJupyterExtensions(ExtensionAppJinjaMixin, ExtensionApp):
@@ -66,6 +69,7 @@ class DataStudioJupyterExtensions(ExtensionAppJinjaMixin, ExtensionApp):
# in the --help.
classes = [JWTAuthenticator, HubbleAgentConfigurable]

flags = jpserver_flags
aliases = aliases

mode = UnicodeFromEnv(
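Merging `jupyter_server`'s aliases and flags means standard server options should now pass through the `jupyter datastudio` entry point as well; a hedged example, assuming the usual `jupyter_server` aliases and flags such as `port` and `no-browser`:

```bash
# Standard jupyter_server options resolve alongside the Data Studio-specific ones.
jupyter datastudio --mode local-local --port 8889 --no-browser
```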
42 changes: 42 additions & 0 deletions data_studio_jupyter_extensions/robots/DataStudioNotebooks.py
@@ -0,0 +1,42 @@
from JupyterLibrary.core import JupyterLibrary
from JupyterLibrary.core import JupyterLibraryListener

from .DataStudioNotebooksServerKeywords import DataStudioNotebooksServerKeywords


component_classes = [DataStudioNotebooksServerKeywords]


class DataStudioNotebooks(JupyterLibrary):
"""JupyterLibrary is a Jupyter testing library for Robot Framework."""

def __init__(
self,
timeout=5.0,
implicit_wait=0.0,
run_on_failure="Capture Page Screenshot",
screenshot_root_directory=None,
**kwargs
):
"""JupyterLibrary can be imported with several optional arguments.
- ``timeout``:
Default value for `timeouts` used with ``Wait ...`` keywords.
- ``implicit_wait``:
Default value for `implicit wait` used when locating elements.
- ``run_on_failure``:
Default action for the `run-on-failure functionality`.
- ``screenshot_root_directory``:
Location where possible screenshots are created. If not given,
the directory where the log file is written is used.
"""
super(JupyterLibrary, self).__init__(
timeout=timeout,
implicit_wait=implicit_wait,
run_on_failure=run_on_failure,
screenshot_root_directory=screenshot_root_directory,
**kwargs
)
self.add_library_components(
[Component(self) for Component in component_classes]
)
self.ROBOT_LIBRARY_LISTENER = JupyterLibraryListener()