docs: Update cloud upload docs (#168)
joeyparrish authored Oct 28, 2024
1 parent 7bb46d4 commit fddf413
Showing 3 changed files with 27 additions and 11 deletions.
5 changes: 2 additions & 3 deletions docs/source/overview.rst
@@ -158,9 +158,8 @@ All input types are read directly by ``TranscoderNode``. If the input type is
 loop that input file indefinitely.
 
 If the ``-c`` option is given with a Google Cloud Storage URL, then an
-additional node called ``CloudNode`` is added after ``PackagerNode``. It runs a
-thread which watches the output of the packager and pushes updated files to the
-cloud.
+additional node called ``ProxyNode`` is added after ``PackagerNode``. It runs a
+local webserver which takes the output of packager and pushes to cloud storage.
 
 The pipeline and the nodes in it are constructed by ``ControllerNode`` based on
 your config files. If you want to write your own front-end or interface
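The new description is brief, so here is a rough sketch of the idea behind such a proxy, purely for illustration: a small local webserver accepts the HTTP PUT requests coming out of the packager and forwards each file to a Google Cloud Storage bucket. This is not the actual ``ProxyNode`` code; the port, bucket name, and use of the ``google-cloud-storage`` module are assumptions for the sketch.

.. code:: python

   # Illustration only -- not the actual ProxyNode implementation.
   # Assumes `python3 -m pip install google-cloud-storage` and default credentials.
   from http.server import BaseHTTPRequestHandler, HTTPServer

   from google.cloud import storage

   bucket = storage.Client().bucket('my-output-bucket')  # hypothetical bucket name


   class UploadProxyHandler(BaseHTTPRequestHandler):
     def do_PUT(self):
       # Read the file body that was PUT to this local server...
       length = int(self.headers.get('Content-Length', 0))
       data = self.rfile.read(length)
       # ...and push it to cloud storage under the same path.
       blob = bucket.blob(self.path.lstrip('/'))
       blob.upload_from_string(
           data,
           content_type=self.headers.get('Content-Type', 'application/octet-stream'))
       self.send_response(201)
       self.end_headers()


   # The packager's output URL would then point at this local server.
   HTTPServer(('localhost', 8080), UploadProxyHandler).serve_forever()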
30 changes: 24 additions & 6 deletions docs/source/prerequisites.rst
@@ -119,21 +119,29 @@ Cloud Storage (optional)
 ------------------------
 
 Shaka Streamer can push content directly to a Google Cloud Storage or Amazon S3
-bucket. To use this feature, the Google Cloud SDK is required.
-
-See https://cloud.google.com/sdk/install for details on installing the Google
-Cloud SDK on your platform.
+bucket. To use this feature, additional Python modules are required.
 
 
 Google Cloud Storage
 ~~~~~~~~~~~~~~~~~~~~
 
-If you haven’t already, you will need to initialize your gcloud environment and
-log in through your browser.
+First install the Python module if you haven't yet:
+
+.. code:: sh
+
+   python3 -m pip install google-cloud-storage
+
+To use the default authentication, you will need default application
+credentials installed. On Linux, these live in
+``~/.config/gcloud/application_default_credentials.json``.
+
+The easiest way to install default credentials is through the Google Cloud SDK.
+See https://cloud.google.com/sdk/docs/install-sdk to install the SDK. Then run:
 
 .. code:: sh
 
    gcloud init
+   gcloud auth application-default login
 
 Follow the instructions given to you by gcloud to initialize the environment
 and login.
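Once the module and the default credentials above are in place, a quick way to sanity-check them is a tiny upload along these lines (a sketch only; the bucket and object names are made up):

.. code:: python

   # Sketch: confirm google-cloud-storage can use the application default credentials.
   from google.cloud import storage

   client = storage.Client()  # picks up application_default_credentials.json
   bucket = client.bucket('my-output-bucket')  # hypothetical bucket name
   bucket.blob('test/hello.txt').upload_from_string('hello')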
@@ -142,9 +150,19 @@ and login.
 Amazon S3
 ~~~~~~~~~
 
+First install the Python module if you haven't yet:
+
+.. code:: sh
+
+   python3 -m pip install boto3
+
 To authenticate to Amazon S3, you can either add credentials to your `boto
 config file`_ or login interactively using the `AWS CLI`_.
 
+.. code:: sh
+
+   aws configure
+
 
 Test Dependencies (optional)
 ----------------------------
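Likewise for the Amazon S3 section above: after ``boto3`` is installed and ``aws configure`` has stored credentials, a minimal upload check could look like this (a sketch only; the file, bucket, and key names are made up):

.. code:: python

   # Sketch: confirm boto3 can reach S3 with the credentials from `aws configure`.
   import boto3

   s3 = boto3.client('s3')
   s3.upload_file('output_files/stream.mpd',   # a local file to upload
                  'my-output-bucket',          # hypothetical bucket
                  'test/stream.mpd')           # destination key in the bucket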
3 changes: 1 addition & 2 deletions shaka-streamer
@@ -68,8 +68,7 @@ def main():
   parser.add_argument('-o', '--output',
                       default='output_files',
                       help='The output folder to write files to, or an HTTP ' +
-                          'or HTTPS URL where files will be PUT.' +
-                          'Used even if uploading to cloud storage.')
+                          'or HTTPS URL where files will be PUT.')
   parser.add_argument('--skip-deps-check',
                       action='store_true',
                       help='Skip checks for dependencies and their versions. ' +
