ENH: adding sphinx documentation for the dandi-cli #712

Merged: 10 commits, Jul 16, 2021
42 changes: 42 additions & 0 deletions .github/workflows/docs.yml
@@ -0,0 +1,42 @@
name: Build Docs

on:
  push:
    branches:
      - master
  pull_request:

jobs:
  docs:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        python:
          - 3.7
          #- 3.8
          #- 3.9
    steps:
      - name: Check out repository
        uses: actions/checkout@v2
        with:
          # Fetch all commits so that versioneer will return something compatible
          # with semantic-version
          fetch-depth: 0

      - name: Set up Python ${{ matrix.python }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python }}

      - name: Install hdf5 (Ubuntu)
        if: matrix.python == '3.9' && startsWith(matrix.os, 'ubuntu')
        run: sudo apt-get update && sudo apt-get install -y libhdf5-dev

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip wheel
          python -m pip install --upgrade tox

      - name: Build docs
        run: tox -e docs
2 changes: 2 additions & 0 deletions .gitignore
@@ -3,12 +3,14 @@
.*.swp
.coverage
.coverage.*
.docker/
.eggs
.idea
.tox/
__pycache__/
build/
dist/
docs/**/generated/
pip-wheel-metadata/
sandbox/
venv/
46 changes: 20 additions & 26 deletions dandi/dandiapi.py
@@ -526,16 +526,13 @@ def upload_raw_asset(
this version of the Dandiset and return the resulting asset. Blocks
until the upload is complete.

Parameters
----------
filepath: str or PathLike
the path to the local file to upload
asset_metadata: dict
Metadata for the uploaded asset file. Must include a "path" field
giving the POSIX path at which the uploaded file will be placed on
the server.
jobs: int
Number of threads to use for uploading; defaults to 5
:param filepath: the path to the local file to upload
:type filepath: str or PathLike
:param dict asset_metadata:
Metadata for the uploaded asset file. Must include a "path" field
giving the POSIX path at which the uploaded file will be placed on
the server.
:param int jobs: Number of threads to use for uploading; defaults to 5
"""
for status in self.iter_upload_raw_asset(filepath, asset_metadata, jobs=jobs):
if status["status"] == "done":
@@ -553,22 +550,19 @@ def iter_upload_raw_asset(
this version of the Dandiset, returning a generator of status
`dict`\\s.

Parameters
----------
filepath: str or PathLike
the path to the local file to upload
asset_metadata: dict
Metadata for the uploaded asset file. Must include a "path" field
giving the POSIX path at which the uploaded file will be placed on
the server.
jobs: int
Number of threads to use for uploading; defaults to 5

Returns
-------
A generator of `dict`\\s containing at least a ``"status"`` key. Upon
successful upload, the last `dict` will have a status of ``"done"`` and
an ``"asset"`` key containing the resulting `RemoteAsset`.
:param filepath: the path to the local file to upload
:type filepath: str or PathLike
:param dict asset_metadata:
Metadata for the uploaded asset file. Must include a "path" field
giving the POSIX path at which the uploaded file will be placed on
the server.
:param int jobs:
Number of threads to use for uploading; defaults to 5
:returns:
A generator of `dict`\\s containing at least a ``"status"`` key.
Upon successful upload, the last `dict` will have a status of
``"done"`` and an ``"asset"`` key containing the resulting
`RemoteAsset`.
"""
from .support.digests import get_dandietag

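For orientation, here is a minimal usage sketch of the two methods whose docstrings are converted above. Only the `upload_raw_asset`/`iter_upload_raw_asset` call shapes and the status-dict loop come from this diff; the instance URL, dandiset ID, file paths, and the `DandiAPIClient.get_dandiset()` step are assumptions for illustration.

```python
from dandi.dandiapi import DandiAPIClient

# Hypothetical setup: URL, dandiset ID, and get_dandiset() are placeholders
# assumed to yield the Dandiset handle these upload methods live on.
client = DandiAPIClient("https://api.dandiarchive.org/api")
dandiset = client.get_dandiset("000003", "draft")

# Per the docstring, asset_metadata must include a "path" field giving the
# POSIX path at which the uploaded file will be placed on the server.
metadata = {"path": "sub-01/sub-01_ses-01.nwb"}

# Blocking variant: returns the resulting asset once the upload completes.
asset = dandiset.upload_raw_asset("local/sub-01_ses-01.nwb", metadata, jobs=5)

# Generator variant: yields status dicts; the last one has status "done"
# and carries the resulting RemoteAsset under the "asset" key.
for status in dandiset.iter_upload_raw_asset("local/sub-01_ses-01.nwb", metadata, jobs=5):
    print(status["status"])
    if status["status"] == "done":
        asset = status["asset"]
```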
27 changes: 12 additions & 15 deletions dandi/dandiarchive.py
@@ -206,15 +206,11 @@ def get_assets(self, client: DandiAPIClient) -> Iterator[RemoteAsset]:
def navigate_url(url):
"""Context manager to 'navigate' URL pointing to DANDI archive.

Parameters
----------
url: str
URL which might point to a dandiset, a folder, or an asset(s)

Yields
------
client, dandiset, assets (generator)
`client` will have established a session for the duration of the context
:param str url: URL which might point to a dandiset, a folder, or an
asset(s)

:returns: Generator of one ``(client, dandiset, assets)``; ``client`` will
have established a session for the duration of the context
"""
parsed_url = parse_dandi_url(url)
with parsed_url.navigate() as (client, dandiset, assets):
@@ -337,27 +333,28 @@ class _dandi_url_parser:

@classmethod
def parse(cls, url, *, map_instance=True):
"""Parse url like and return server (address), asset_id and/or directory
"""
Parse url like and return server (address), asset_id and/or directory

Example URLs (as of 20210428):

- Dataset landing page metadata:
https://gui.dandiarchive.org/#/dandiset/000003

Individual and multiple files:
- dandi???

- dandi???

Multiple selected files + folders -- we do not support ATM, then further
RFing would be due, probably making this into a generator or returning a
list of entries.

"Features":

- uses some of `known_instance`s to map some urls, e.g. from
- uses some of `known_instances` to map some urls, e.g. from
gui.dandiarchive.org ones into girder.

Returns
-------
ParsedDandiURL
:rtype: ParsedDandiURL
"""
lgr.debug("Parsing url %s", url)

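A quick sketch of how `navigate_url` is used, matching the converted docstring above: it yields a ``(client, dandiset, assets)`` tuple, with the client keeping its session open for the duration of the context. The example URL is the dandiset landing page already cited in the `parse` docstring; printing the raw asset objects is an assumption, since no specific asset fields are documented in this diff.

```python
from dandi.dandiarchive import navigate_url

# navigate_url() parses the URL and yields (client, dandiset, assets);
# per the docstring, the client has an established session for the
# duration of the with-block.
with navigate_url("https://gui.dandiarchive.org/#/dandiset/000003") as (client, dandiset, assets):
    for asset in assets:   # `assets` is a generator of remote assets
        print(asset)       # printing the raw object; specific fields are not assumed
```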
14 changes: 7 additions & 7 deletions dandi/tests/skip.py
@@ -12,13 +12,13 @@

There are two main ways to skip in pytest:

* decorating a test function, such as
* decorating a test function, such as::

@pytest.mark.skip(sys.platform.startswith("win"), reason="on windows")
def test_func():
[...]

* skipping inline, such as
* skipping inline, such as::

def test_func():
if sys.platform.startswith("win"):
@@ -28,16 +28,16 @@ def test_func():
This module provides a mechanism to register a reason and condition as both a
decorator and an inline function:

* Within this module, create a condition function that returns a tuple of the
form (REASON, COND). REASON is a str that will be shown as the reason for
the skip, and COND is a boolean indicating if the test should be skipped.
* Within this module, create a condition function that returns a tuple of the
form (REASON, COND). REASON is a str that will be shown as the reason for
the skip, and COND is a boolean indicating if the test should be skipped.

For example
For example::

def windows():
return "on windows", sys.platform.startswith("win")

* Then add the above function to CONDITION_FNS.
* Then add the above function to CONDITION_FNS.

Doing that will make the skip condition available in two places:
`mark.skipif_NAME` and `skipif.NAME`. So, for the above example, there would
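As a concrete illustration of the two access points named in the docstring above (`mark.skipif_NAME` and `skipif.NAME`), here is a hedged sketch of how a test might use the `windows` condition; the import location of `mark` and `skipif` is an assumption, since only the module docstring appears in this diff.

```python
# Assumed import location for the helpers registered by dandi/tests/skip.py
from dandi.tests.skip import mark, skipif


@mark.skipif_windows          # decorator form: skips the whole test on Windows
def test_decorated():
    ...


def test_inline():
    skipif.windows()          # inline form: skips from inside the test body
    ...
```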
20 changes: 20 additions & 0 deletions docs/Makefile
@@ -0,0 +1,20 @@
# Minimal makefile for Sphinx documentation
#

# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?= -W
SPHINXBUILD ?= sphinx-build
SOURCEDIR = source
BUILDDIR = build

# Put it first so that "make" without argument is like "make help".
help:
	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
File renamed without changes.
35 changes: 35 additions & 0 deletions docs/make.bat
@@ -0,0 +1,35 @@
@ECHO OFF

pushd %~dp0

REM Command file for Sphinx documentation

if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=source
set BUILDDIR=build

if "%1" == "" goto help

%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
echo.installed, then set the SPHINXBUILD environment variable to point
echo.to the full path of the 'sphinx-build' executable. Alternatively you
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.http://sphinx-doc.org/
exit /b 1
)

%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end

:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%

:end
popd
2 changes: 2 additions & 0 deletions docs/requirements.txt
@@ -0,0 +1,2 @@
alabaster
Sphinx
16 changes: 16 additions & 0 deletions docs/source/cmdline/delete.rst
@@ -0,0 +1,16 @@
:program:`dandi delete`
=======================

::

dandi [<global options>] delete [<options>] [<paths> ...]

Delete dandisets and assets from the server.

PATH could be a local path or a URL to an asset, directory, or an entire
dandiset.

Options
-------

.. option:: --skip-missing
Review thread on the ``--skip-missing`` option:

Member (PR author): oh -- you just manually produced them? great for now but we should look into making them generated automagically... maybe smth like https://github.com/click-contrib/sphinx-click could be used ?

Member: I don't really like the way the sphinx-click output looks.

Member (PR author): Maybe there is a way to improve that? I just fear that manually produced docs would quickly become out of sync with the actual CLI.

15 changes: 15 additions & 0 deletions docs/source/cmdline/digest.rst
@@ -0,0 +1,15 @@
:program:`dandi digest`
=======================

::

dandi [<global options>] digest [<options>] [<path> ...]

Calculate file digests

Options
-------

.. option:: -d, --digest [dandi-etag|md5|sha1|sha256|sha512]

Digest algorithm to use [default: dandi-etag]
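The default ``dandi-etag`` algorithm is also reachable from Python via `get_dandietag`, which the dandiapi.py hunk above imports from `dandi.support.digests`. A minimal sketch of the call shape follows; the exact return type and its printable form are assumptions for illustration.

```python
from dandi.support.digests import get_dandietag

# get_dandietag() is imported the same way in dandi/dandiapi.py above;
# it computes the DANDI ETag for a local file. Rendering the result via
# str() is an assumption; the precise return type is not shown in this PR.
etag = get_dandietag("sub-01/sub-01_ses-01.nwb")
print(str(etag))
```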
36 changes: 36 additions & 0 deletions docs/source/cmdline/download.rst
@@ -0,0 +1,36 @@
:program:`dandi download`
=========================

::

dandi [<global options>] download [<options>] [<url> ...]

Options
-------

.. option:: -o, --output-dir <dir>

Directory where to download to (directory must exist). Files will be
downloaded with paths relative to that directory.

.. option:: -e, --existing [error|skip|overwrite|overwrite-different|refresh]

What to do if a file is found to already exist locally. 'refresh': verify
that, according to size and mtime, it is the same file; if not,
download and overwrite.

.. option:: -f, --format [pyout|debug]

Choose the format/frontend for output.

.. option:: -J, --jobs <int>

Number of parallel download jobs.

.. option:: --download [dandiset.yaml,assets,all]

Comma-separated list of elements to download

.. option:: --sync

Delete local assets that do not exist on the server
8 changes: 8 additions & 0 deletions docs/source/cmdline/index.rst
@@ -0,0 +1,8 @@
**********************
Command-Line Interface
**********************

.. toctree::
:glob:

*