
Triggering Tekton Pipelines for only specific branches

Tekton is a fantastic open source cloud native pipeline system, delivered and supported by Red Hat as OpenShift Pipelines on the OpenShift platform. Pipelines may be triggered by various actions on a Git repository via the webhook process.

Many different actions take place on a Git repository, such as pushing commits, creating and merging pull requests, and so on. Given that most teams use different branches for specific tasks within the development lifecycle, it is fair to assume that not all events should trigger the same operations on all branches. For example, a commit to a feature branch may require a simple build and test process, whereas a commit to a release branch may require a full build, test, vulnerability analysis, deployment to QA and a full QA testing cycle to be initiated.

So the requirement is for a filtering mechanism, available within the Tekton trigger process, that allows the specific action, branch or other condition to be used to select whether a particular Tekton pipeline is triggered.
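
As a sketch of what this can look like in practice, a single event listener can contain several triggers, each with its own CEL filter and its own trigger template, so that different branches launch different pipelines. The binding and template names below (git-push-binding, build-and-test-template and full-release-template) are hypothetical placeholders; the full working example used in this article follows later in the text.

apiVersion: triggers.tekton.dev/v1alpha1
kind: EventListener
metadata:
  name: branch-aware-listener
spec:
  serviceAccountName: pipeline
  triggers:
  # Lightweight build and test for feature branches
  - name: feature-branch-trigger
    interceptors:
    - ref:
        name: "cel"
      params:
      - name: "filter"
        value: 'body.ref.startsWith("refs/heads/feature/")'
    bindings:
    - ref: git-push-binding
    template:
      ref: build-and-test-template
  # Full build, scan and QA deployment for release branches
  - name: release-branch-trigger
    interceptors:
    - ref:
        name: "cel"
      params:
      - name: "filter"
        value: 'body.ref.startsWith("refs/heads/release/")'
    bindings:
    - ref: git-push-binding
    template:
      ref: full-release-template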

The main Tekton Triggers documentation is available here. This content covers the use of event listeners, trigger templates and trigger bindings, for which a further document here may be useful in understanding how the various parts fit together. The Tekton trigger documentation page also covers the use of interceptors, which can be used to filter the webhook payload. Interceptors can also verify the identity of the triggering repository using a secret, and can transform and extract specific content from the webhook payload.

After the webhook data has been processed by the interceptor it passes to the trigger, which uses the trigger binding to forward specific fields to the trigger template. The trigger template then forwards the data to the pipeline to initiate the pipeline run. The diagram in figure 1 shows the event listener object and how it interacts with the trigger binding and trigger template to eventually call the pipeline. The interceptor process takes place within the event listener.

Figure 1: Components of a Tekton triggering process

A further function of the interceptor is to create new fields that contain extractions / manipulations of fields within the webhook body. Such fields go forward to the trigger binding as described below.

Which SCM system is being used?

Specific interception capabilities and syntax exist depending on whether you are using GitHub, GitLab or BitBucket. Take a look at the system-specific documentation here for more information.
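
As an illustration, when the source repository is on GitHub the dedicated GitHub interceptor can validate the webhook signature and restrict the accepted event types before any further filtering takes place. The snippet below is a minimal sketch; the secret name github-webhook-secret and key secretToken are hypothetical and would need to match a secret created in the namespace of the event listener.

    interceptors:
    - ref:
        name: "github"
      params:
      - name: "secretRef"
        value:
          secretName: github-webhook-secret    # hypothetical secret holding the shared webhook token
          secretKey: secretToken
      - name: "eventTypes"
        value: ["push", "pull_request"]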

Extending the event listener – Filtering

In order to filter on specific content within the body of the webhook payload, additional fields need to be added to the event listener. The example below shows the interceptor section of the trigger, in which two fields from the webhook payload and one header are filtered.

(Note that I use a local Gogs instance running on OpenShift as it is easy to create test repositories for triggering builds. Since my test cluster is behind a VPN connection it is effectively hidden from GitHub, so a local Git server is required.)

The Git webhook payload that the filter will operate on is shown after the event listener text. The full range of syntax that can be used is included here in the Tekton reference page for the Common Expression Language (CEL).

apiVersion: triggers.tekton.dev/v1alpha1
kind: EventListener
metadata:
  name: pipeline-test-listener-interceptor
spec:
  serviceAccountName: pipeline
  triggers:
  - name: github-listener
    interceptors:
    - ref:
        name: "cel"
      params:
      - name: "filter"
        value: 'body["ref"].contains("main") && body["repository"]["name"] == "pipeline-test-gogs" && header.match("X-Gogs-Event", "push")'
      - name: "overlays"
        value:
        - key: branch
          expression: "body.ref.split('/')[2]"
    bindings:
    - ref: pipeline-test-binding
    template:
      ref: pipeline-test-trigger-template

The filtering action

The important section of the event listener with respect to the filtering process is the value associated with the “filter” parameter. This line takes three distinct filter expressions and combines them with the AND (&&) operator. The individual parts are:

  • body["ref"].contains("main") – Filter on the presence of the word main in the branch name.
  • body["repository"]["name"] == "pipeline-test-gogs" – Filter on the value indicated for the field repository.name.
  • header.match("X-Gogs-Event", "push") – Filter on the Gogs event "push".
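
One caveat with the first expression is that contains("main") will also match branch names that merely include the word, such as maintenance. Where an exact branch match is required, a stricter variant of the filter (shown below as a sketch, with the other two conditions unchanged) can compare the full Git reference instead:

      - name: "filter"
        value: 'body.ref == "refs/heads/main" && body["repository"]["name"] == "pipeline-test-gogs" && header.match("X-Gogs-Event", "push")'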

In the JSON content below, the fields referred to in the filter above can be clearly seen. All of the fields below are prefixed with ‘body’ when referenced by the event listener.

Webhook payload text (abbreviated slightly)

{
  "secret": "",
  "ref": "refs/heads/main",
  "before": "7485758892c17e098cc68157036e1d551f38608e",
  "after": "7485758892c17e098cc68157036e1d551f38608e",
  "compare_url": "",
  "commits": [
    {
    
    }
  ],
  "repository": {
    "id": 2,
    "owner": {
      "id": 1,
      "login": "mroberts"
    },
    "name": "pipeline-test-gogs",
    "full_name": "mroberts/pipeline-test-gogs",
    "description": "",
    "private": false,
    "fork": false,
    "html_url": "http://gogs.apps.skylake.demolab.local/mroberts/pipeline-test-gogs",
    "ssh_url": "gogs@gogs.apps.skylake.demolab.local:mroberts/pipeline-test-gogs.git",
    "clone_url": "http://gogs.apps.skylake.demolab.local/mroberts/pipeline-test-gogs.git",
    "website": "",
    "stars_count": 0,
    "forks_count": 0,
    "watchers_count": 1,
    "open_issues_count": 0,
    "default_branch": "master",
    "created_at": "2022-05-11T10:50:04Z",
    "updated_at": "2022-05-11T14:02:41Z"
  },
  "pusher": {
    "id": 1,
    "login": "mroberts",
    "full_name": "",
    "email": "mroberts@redhat.com",
    "avatar_url": "https://secure.gravatar.com/avatar/36e314427be65ccf15b32827d888f816",
    "username": "mroberts"
  },
  "sender": {
    "id": 1,
    "login": "mroberts",
    "full_name": "",
    "email": "mroberts@redhat.com",
    "avatar_url": "https://secure.gravatar.com/avatar/36e314427be65ccf15b32827d888f816",
    "username": "mroberts"
  }
}

Extending the event listener – New fields

The event listener shown above includes the following overlay section:

      - name: "overlays"
        value:
        - key: branch
          expression: "body.ref.split('/')[2]"

The above block will generate a new field called ‘branch’ containing the third element of the ‘body.ref’ field when it is split on the ‘/’ character. Since this field contains the text ‘refs/heads/main’ in the example webhook payload, the ‘branch’ parameter will hold the value ‘main’.
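
Multiple overlays can be declared within the same block. As an example, and assuming the CEL extension functions documented for Tekton Triggers are available, a shortened commit identifier could be generated alongside the branch name; the short-sha key below is a hypothetical addition rather than something used elsewhere in this article.

      - name: "overlays"
        value:
        - key: branch
          expression: "body.ref.split('/')[2]"
        - key: short-sha
          expression: "body.after.truncate(7)"    # first 7 characters of the commit SHA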

Passing new fields to the pipeline

There is a requirement to pass newly generated fields, such as ‘branch’ above, to the pipeline so that they can be used within its tasks and steps. This is done by adding an entry to the trigger binding object as shown below (final two lines):

apiVersion: triggers.tekton.dev/v1alpha1
kind: TriggerBinding
metadata:
  name: pipeline-test-binding
spec:
  params:
  - name: gitrepository.url
    value: $(body.repository.html_url)
  - name: gitrevision
    value: $(body.after)
  - name: branch
    value: $(extensions.branch)

The trigger binding above references a number of webhook payload fields that all begin with the term ‘body’. Fields added by the interceptor block all begin with the term ‘extensions’. The extracted branch field will go forward to the trigger template under the name ‘branch’.

The trigger template that consumes the fields supplied to it by the trigger binding is shown below. Note the use of $(tt.params.<property-name>) to refer to properties supplied by the trigger process.

apiVersion: triggers.tekton.dev/v1alpha1
kind: TriggerTemplate
metadata:
  name: pipeline-test-trigger-template
spec:
  params:
  - name: gitrepository.url
  - name: gitrevision
  - name: branch
  resourcetemplates:
  - apiVersion: tekton.dev/v1beta1
    kind: PipelineRun
    metadata:
      generateName: pipeline-test-pr-tr-
    spec:
      serviceAccountName: pipeline
      pipelineRef:
        name: pipeline-test
      params:
        - name: git-url
          value: $(tt.params.gitrepository.url)
        - name: git-revision
          value: $(tt.params.gitrevision)
        - name: branch
          value: $(tt.params.branch)
      workspaces:
      - name: resources
        volumeClaimTemplate:
          spec:
            accessModes:
            - ReadWriteOnce
            resources:
              requests:
                storage: 5Gi
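
The branch value delivered to the pipeline can then be used to drive conditional behaviour. The fragment below is a hypothetical sketch rather than the actual pipeline-test pipeline from the sample repository: it uses a Tekton when expression so that a QA deployment task (the deploy-to-qa and deploy-task names are invented for illustration) only runs when the branch is main.

apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: pipeline-test
spec:
  params:
  - name: git-url
    type: string
  - name: git-revision
    type: string
  - name: branch
    type: string
  tasks:
  # Hypothetical task - only executed when the branch parameter is 'main'
  - name: deploy-to-qa
    when:
    - input: "$(params.branch)"
      operator: in
      values: ["main"]
    taskRef:
      name: deploy-task
    params:
    - name: git-url
      value: $(params.git-url)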

When called by a user operation such as a git push, the filter within the event listener will process the incoming data and will then either call the trigger or choose not to call it. When the trigger is not called, the log for the event listener will show that the combined filter block did not return true, but sadly it does not indicate which specific condition was not matched.

{"level":"info","ts":"2022-05-12T11:03:20.814Z","logger":"eventlistener","caller":"sink/sink.go:387","msg":"interceptor stopped trigger processing: rpc error: code = FailedPrecondition desc = expression body[\"ref\"].contains(\"main\") && body[\"repository\"][\"name\"] == \"pipeline-test-gogs\" did not return true","commit":"8b4da3f","eventlistener":"pipeline-test-listener-interceptor","namespace":"pipeline-test","/triggers-eventid":"97abd2d1-bd19-4156-b07f-1722f5dc2758","eventlistenerUID":"434581cc-86c1-4dd7-82c2-202c88a72679","/triggers-eventid":"97abd2d1-bd19-4156-b07f-1722f5dc2758","/trigger":"github-listener"}

Custom Event Processors

It is possible to have your event listener call a REST-based application running on your OpenShift cluster if required. This gives complete flexibility for the analysis of the webhook payload and for manipulating it to produce whatever is required. There is an obvious maintenance overhead to running an additional microservice for that purpose, so where possible the filtering within the event listener should be preferred.
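
In Tekton Triggers terms this is a webhook interceptor: the trigger references a Kubernetes service that receives the payload, and the JSON body returned by the service is passed on to the rest of the trigger chain. A minimal sketch, assuming a hypothetical service named custom-payload-filter deployed in the same namespace as the event listener, looks like this:

    interceptors:
    - webhook:
        objectRef:
          kind: Service
          name: custom-payload-filter    # hypothetical service implementing the custom processing
          apiVersion: v1
          namespace: pipeline-test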

Sample code

If you want to take a look at the Tekton pipeline that was used to write this article then please go to:

Repository : https://github.com/marrober/pipelineBuildExample.git
Branch : acs
Directory : simple-triggered-process

(Don’t forget that the filter parameters in the event listener will need to be updated to match your repository.)

Summary

The use of filters can help teams to reduce the noise of unwanted builds by reacting only to events on specific branches, or to events of a specific type.
