Content Updates #36

Merged · 1 commit · Aug 20, 2019
8 changes: 6 additions & 2 deletions workshop/content/concepts.adoc
@@ -1,4 +1,7 @@
Tekton defines a number of link:https://kubernetes.io/docs/concepts/extend-kubernetes/api-extension/custom-resources/[Kubernetes custom resources] as building blocks in order to standardize pipeline concepts and provide a terminology that is consistent across CI/CD solutions. These custom resources are an extension of the Kubernetes API that let users create and interact with these objects using the OpenShift CLI (`oc`), `kubectl`, and other Kubernetes tools.
Tekton defines a number of link:https://kubernetes.io/docs/concepts/extend-kubernetes/api-extension/custom-resources/[Kubernetes custom resources]
as building blocks in order to standardize pipeline concepts and provide a terminology
that is consistent across CI/CD solutions. These custom resources are an extension of the
Kubernetes API that let users create and interact with these objects using the OpenShift CLI (`oc`), `kubectl`, and other Kubernetes tools.

The custom resources needed to define a pipeline are listed below:

@@ -16,6 +19,7 @@ In short, in order to create a pipeline, one does the following:
* Create a pipeline and pipeline resources to define your application's delivery pipeline
* Create a pipelinerun to instantiate and invoke the pipeline

For further details on pipeline concepts, refer to the link:https://github.com/tektoncd/pipeline/tree/master/docs#learn-more[Tekton documentation] that provides an excellent guide for understanding various parameters and attributes available for defining `pipelines`.
For further details on pipeline concepts, refer to the link:https://github.com/tektoncd/pipeline/tree/master/docs#learn-more[Tekton documentation]
that provides an excellent guide for understanding various parameters and attributes available for defining pipelines.

In the following sections, you will go through each of the above steps to define and execute a pipeline.
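
To make these building blocks concrete, here is a minimal, hedged sketch of a Tekton task; the resource name, image, and `apiVersion` below are illustrative assumptions, not the values used later in this workshop:

[source,yaml]
----
# Minimal illustrative Task: a single step that runs one container command.
# The name, image, and apiVersion are assumptions, not workshop values.
apiVersion: tekton.dev/v1alpha1
kind: Task
metadata:
  name: say-hello
spec:
  steps:
    - name: echo
      image: busybox
      command: ["echo"]
      args: ["Hello from Tekton"]
----

A pipeline then refers to tasks like this one by name via `taskRef`, which is exactly what the following exercises walk through.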
9 changes: 9 additions & 0 deletions workshop/content/exercises/create-pipeline.adoc
@@ -54,6 +54,10 @@ Each pipeline has a `tasks` property. Under this property, each `task` has a `name`
This pipeline has two tasks named `build` and `deploy`. The `taskRef` property under each
task `name` is where the tasks you just created can be specified as part of the pipeline.

The visual below shows how `s2i-nodejs` fits into the pipeline above:

image:../images/task-visual.png[s2i-nodejs Visualization]

You might have noticed that there are no references to the nodejs-ex git repository
and its image in the registry. That's because pipelines in Tekton are designed to
be generic and reusable across environments and stages throughout the application's lifecycle.
@@ -74,6 +78,11 @@ defined between the tasks via `inputs` and `outputs` as well as explicit orders
that are defined via `runAfter`. You'll notice the `deploy` task above has a `runAfter`
property specifying to only execute after the `build` task is complete.

A visualization of the pipeline above is shown below to help illustrate what has been
described so far in this section:

image:../images/pipeline-visual.png[Pipeline Visualization]
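
As a rough, hedged sketch of how these pieces fit together (the pipeline resource names and the `deploy` task's `taskRef` below are placeholders, and the `apiVersion` may differ from the one used in the workshop):

[source,yaml]
----
# Illustrative two-task pipeline: the deploy task runs only after build
# completes because of runAfter. Resource names here are placeholders.
apiVersion: tekton.dev/v1alpha1
kind: Pipeline
metadata:
  name: deploy-pipeline
spec:
  resources:
    - name: app-git
      type: git
    - name: app-image
      type: image
  tasks:
    - name: build
      taskRef:
        name: s2i-nodejs
      resources:
        inputs:
          - name: source
            resource: app-git
        outputs:
          - name: image
            resource: app-image
    - name: deploy
      taskRef:
        name: openshift-cli        # placeholder task name
      runAfter:
        - build
----

The key point is the shape: `taskRef` links to the tasks you created, and `runAfter` encodes the execution order between them.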

Create the Pipeline
-------------------

2 changes: 1 addition & 1 deletion workshop/content/exercises/serviceaccount.adoc
@@ -26,7 +26,7 @@ later in this workshop. This is how you will connect the `pipeline` service account
to the pipelinerun.

For reference, you can add the permissions used by `pipeline` to a service account
using the OpenShift CLI (`oc`) commands below:
using the `oc` commands below:

[source,bash]
----
5 changes: 0 additions & 5 deletions workshop/content/exercises/trigger-pipeline.adoc
@@ -33,11 +33,6 @@ spec:
Under the `spec` property, you'll see the `pipelineRef` property where the pipeline
to be used is specified. You should see the name of the pipeline you created (i.e. `deploy-pipeline`).

Since this pipelinerun will be triggered manually by you using `tkn`, the `trigger`
property is set to `manual`. The `trigger` property is used to specify when the pipeline
should execute. This property is how a webhook can be used to trigger a pipelinerun
via a code commit to a git repository instead of triggering manually.

The last property of the pipelinerun to note is `resources`. This is how the specific
git repository and image registry URLs are supplied to the pipelinerun. You'll
see the pipeline resource references you just created in the pipelinerun definition.
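
As a hedged sketch of how those properties line up (the resource and PipelineResource names below are placeholders, and field names can vary between Tekton API versions):

[source,yaml]
----
# Illustrative PipelineRun: pipelineRef picks the pipeline to run, and
# resources binds concrete PipelineResources (git repo, output image).
apiVersion: tekton.dev/v1alpha1
kind: PipelineRun
metadata:
  name: deploy-pipeline-run-1
spec:
  serviceAccount: pipeline         # older API versions; newer ones use serviceAccountName
  pipelineRef:
    name: deploy-pipeline
  resources:
    - name: app-git
      resourceRef:
        name: nodejs-ex-git        # placeholder PipelineResource name
    - name: app-image
      resourceRef:
        name: nodejs-ex-image      # placeholder PipelineResource name
----

Applying a pipelinerun like this (or letting `tkn` generate one for you) is what actually starts an execution of the pipeline.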
Binary file added workshop/content/images/pipeline-visual.png
Binary file added workshop/content/images/task-visual.png
4 changes: 4 additions & 0 deletions workshop/content/install-tekton-operator.adoc
@@ -22,6 +22,10 @@ Next, you would click on the **Integration & Delivery** category to find the

image:images/operatorhub.png[OpenShift OperatorHub]

Having trouble viewing the image above? Drag the divider that separates the workshop content
you are reading now from the terminal towards the right. This enlarges the images and makes
them easier to see.

Click on **OpenShift Pipelines Operator**, **Continue**, and then **Install** as
shown below:

19 changes: 19 additions & 0 deletions workshop/content/workshop-introduction.adoc
@@ -0,0 +1,19 @@
Continuous Integration, Continuous Delivery (CI/CD)
--------------------------------------------------

A continuous integration, continuous delivery (CI/CD) pipeline is an automated expression
of your process for getting software from version control right through to your users and customers.
Every change to your software (i.e. committed in source control) goes through a complex
process on its way to being released. This process involves building the software in a
reliable and repeatable manner as well as progressing the built software (called a "build")
through multiple stages of testing and deployment.

Kubernetes
----------

Kubernetes is an open-source system for automating deployment, scaling, and management
of containerized applications.

As OpenShift is a platform based on Kubernetes, there will be Kubernetes concepts
used throughout this workshop. The workshop assumes that attendees will have knowledge
of fundamental Kubernetes concepts.
2 changes: 2 additions & 0 deletions workshop/modules.yaml
@@ -1,4 +1,6 @@
modules:
workshop-introduction:
name: Workshop Introduction
workshop-overview:
name: Workshop Overview
concepts:
1 change: 1 addition & 0 deletions workshop/workshop.yaml
@@ -2,6 +2,7 @@ name: OpenShift Pipelines with Tekton

modules:
activate:
- workshop-introduction
- workshop-overview
- concepts
- install-tekton-operator