From d8fb96bf524486aa8c707ad245a77c55ab044c53 Mon Sep 17 00:00:00 2001 From: Malte Sander Date: Mon, 30 Sep 2024 12:50:29 +0200 Subject: [PATCH] Apply suggestions from code review --- docs/modules/airflow/pages/getting_started/installation.adoc | 2 +- docs/modules/airflow/pages/index.adoc | 2 +- docs/modules/airflow/pages/usage-guide/mounting-dags.adoc | 2 +- .../pages/usage-guide/operations/graceful-shutdown.adoc | 2 +- docs/modules/airflow/pages/usage-guide/overrides.adoc | 2 +- docs/modules/airflow/pages/usage-guide/security.adoc | 4 ++-- 6 files changed, 7 insertions(+), 7 deletions(-) diff --git a/docs/modules/airflow/pages/getting_started/installation.adoc b/docs/modules/airflow/pages/getting_started/installation.adoc index 4d74d26b..f056ccbc 100644 --- a/docs/modules/airflow/pages/getting_started/installation.adoc +++ b/docs/modules/airflow/pages/getting_started/installation.adoc @@ -31,7 +31,7 @@ Follow the instructions of those components for a production setup. == Stackable operators There are multiple ways to install the Stackable operator for Apache Airflow. -xref:management:stackablectl:index.adoc[] is the preferred way but Helm is also supported. +xref:management:stackablectl:index.adoc[] is the preferred way, but Helm is also supported. OpenShift users may prefer installing the operator from the RedHat Certified Operator catalog using the OpenShift web console. [tabs] diff --git a/docs/modules/airflow/pages/index.adoc b/docs/modules/airflow/pages/index.adoc index 3effeaa3..58a32b69 100644 --- a/docs/modules/airflow/pages/index.adoc +++ b/docs/modules/airflow/pages/index.adoc @@ -97,7 +97,7 @@ When using `spec.kubernetesExecutors` the scheduler takes direct responsibility == Using custom workflows/DAGs {dags}[Direct acyclic graphs (DAGs) of tasks] are the core entities you use in Airflow. -Have a look at the page on xref:usage-guide/mounting-dags.adoc[] to learn about the different ways of loading your custom DAGs into Airflow. 
+Take a look at the page on xref:usage-guide/mounting-dags.adoc[] to learn about the different ways of loading your custom DAGs into Airflow. == Demo diff --git a/docs/modules/airflow/pages/usage-guide/mounting-dags.adoc b/docs/modules/airflow/pages/usage-guide/mounting-dags.adoc index 59a5af5d..6f106213 100644 --- a/docs/modules/airflow/pages/usage-guide/mounting-dags.adoc +++ b/docs/modules/airflow/pages/usage-guide/mounting-dags.adoc @@ -49,7 +49,7 @@ include::example$example-airflow-gitsync.yaml[] <6> The depth of syncing i.e. the number of commits to clone (defaults to 1) <7> The synchronisation interval in seconds (defaults to 20 seconds) <8> The name of the Secret used to access the repository if it is not public. - This should include two fields: `user` and `password` (which can be either a password -- which is not recommended -- or a github token, as described https://github.com/kubernetes/git-sync/tree/v3.6.4#flags-which-configure-authentication[here]) + This should include two fields: `user` and `password` (which can be either a password -- which is not recommended -- or a GitHub token, as described https://github.com/kubernetes/git-sync/tree/v3.6.4#flags-which-configure-authentication[here]) <9> A map of optional configuration settings that are listed in https://github.com/kubernetes/git-sync/tree/v4.2.1?tab=readme-ov-file#manual[this] configuration section (and the ones that follow on that link) <10> An example showing how to specify a target revision (the default is HEAD). The revision can also be a tag or a commit, though this assumes that the target hash is contained within the number of commits specified by `depth`. 
diff --git a/docs/modules/airflow/pages/usage-guide/operations/graceful-shutdown.adoc b/docs/modules/airflow/pages/usage-guide/operations/graceful-shutdown.adoc index ffe19aad..29fee9fd 100644 --- a/docs/modules/airflow/pages/usage-guide/operations/graceful-shutdown.adoc +++ b/docs/modules/airflow/pages/usage-guide/operations/graceful-shutdown.adoc @@ -4,7 +4,7 @@ You can configure the graceful shutdown as described in xref:concepts:operations The Airflow processes receive a `SIGTERM` signal when Kubernetes wants to terminate the Pod. The Pod logs the received signal as shown in the log below and initiate a graceful shutdown. -After the graceful shutdown timeout runs out, and the process still didn't exit, Kubernetes issues a `SIGKILL` signal. +If the process has not exited after the graceful shutdown timeout runs out, Kubernetes issues a `SIGKILL` signal. == Scheduler diff --git a/docs/modules/airflow/pages/usage-guide/overrides.adoc b/docs/modules/airflow/pages/usage-guide/overrides.adoc index 122407c7..3ed11cd5 100644 --- a/docs/modules/airflow/pages/usage-guide/overrides.adoc +++ b/docs/modules/airflow/pages/usage-guide/overrides.adoc @@ -5,7 +5,7 @@ The cluster definition allows overriding configuration properties and environment variables per role or role group, with role group overrides taking precedence. -IMPORTANT: Overriding operator-set properties (e.g., HTTP port) may cause issues. +IMPORTANT: Overriding operator-set properties (e.g. HTTP port) may cause issues. Additionally, ensure consistent configurations across roles. Not all roles use each setting, but some things -- such as external endpoints -- need to be consistent to avoid problems.
diff --git a/docs/modules/airflow/pages/usage-guide/security.adoc b/docs/modules/airflow/pages/usage-guide/security.adoc index 44a8c647..3c8fea46 100644 --- a/docs/modules/airflow/pages/usage-guide/security.adoc +++ b/docs/modules/airflow/pages/usage-guide/security.adoc @@ -3,7 +3,7 @@ :airflow-access-control-docs: https://airflow.apache.org/docs/apache-airflow/stable/security/access-control.html Secure Apache Airflow by configuring user authentication and authorization. -Airflow provides built-in user and role management, but can also connected to an LDAP server to manage users centrally instead. +Airflow provides built-in user and role management, but can also connect to an LDAP server to manage users centrally instead. == Authentication @@ -19,7 +19,7 @@ image::airflow_security.png[Airflow Security menu] === LDAP -Airflow supports xref:concepts:authentication.adoc[user authentication] via an LDAP server. +Airflow supports xref:concepts:authentication.adoc[user authentication] via LDAP. Set up an AuthenticationClass for the LDAP server and reference it in the Airflow Stacklet resource as shown: [source,yaml]