Multiple GitHub Applications

Multiple GitHub application support #

Multiple GitHub apps support is a Technology Preview feature only. Technology Preview features are not currently supported and might not be functionally complete. We do not recommend using them in production. These features provide early access to upcoming Pipelines-as-Code features, enabling you to test functionality and provide feedback during the development process.

Pipelines-as-Code supports running multiple GitHub applications on the same cluster. This allows you to have multiple GitHub applications pointing to the same cluster from different installations (for example, public GitHub and GitHub Enterprise).

Running a second controller with a different GitHub application #

Each new installation for a different GitHub application has its own controller with a Service and an Ingress or an OpenShift Route attached to it.

Each controller can have its own ConfigMap for its configuration and should have its own Secret with the GitHub application private key, application ID, and webhook secret. See the documentation on how to configure those secrets here.
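
As a minimal sketch, such a Secret could be created with kubectl. This assumes the pipelines-as-code namespace, a Secret named ghe-secret, and the same key names (github-application-id, github-private-key, webhook.secret) as the default Pipelines-as-Code GitHub application secret; adjust these to match your setup:

kubectl -n pipelines-as-code create secret generic ghe-secret \
  --from-literal=github-application-id=APP_ID \
  --from-file=github-private-key=/path/to/private.key \
  --from-literal=webhook.secret=WEBHOOK_SECRET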

The controller has three environment variables on its container to drive this:

Environment Variable       Description                                                 Example Value
PAC_CONTROLLER_LABEL       A unique label to identify this controller                  ghe
PAC_CONTROLLER_SECRET      The Kubernetes Secret with the GitHub application secret    ghe-secret
PAC_CONTROLLER_CONFIGMAP   The ConfigMap with the Pipelines-as-Code configuration      ghe-configmap
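
As an illustration, and assuming the controllers run in the pipelines-as-code namespace and the second controller Deployment is named pipelines-as-code-ghe-controller (hypothetical names that may differ on your cluster), these variables could be set with kubectl:

kubectl -n pipelines-as-code set env deployment/pipelines-as-code-ghe-controller \
  PAC_CONTROLLER_LABEL=ghe \
  PAC_CONTROLLER_SECRET=ghe-secret \
  PAC_CONTROLLER_CONFIGMAP=ghe-configmap
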
While you need multiple controllers for different GitHub applications, only one watcher (the Pipelines-as-Code reconciler that reconciles the status on the GitHub interface) is needed.

Script to help run a second controller #

We have a script in our source code repository to help deploy a second controller with its associated Service and ConfigMap, as well as set the environment variables.

It is located in the ./hack directory and is called second-controller.py.

To use it first check-out the Pipelines-as-Code repository:

git clone https://github.com/openshift-pipelines/pipelines-as-code
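
Then change into the repository directory, since the script paths below are relative to the repository root:

cd pipelines-as-code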

You need to make sure the PyYAML module is installed. You can install it in multiple ways (for example, with your operating system package manager) or simply use pip:

python3 -mpip install PyYAML

And run it with:

python3 ./hack/second-controller.py LABEL

This will output the generated YAML on standard output. If you are happy with the output, you can apply it to your cluster with kubectl:

python3 ./hack/second-controller.py LABEL | kubectl apply -f -

There are multiple flags you can use to fine-tune the output of this script; use the --help flag to list all the flags that can be passed to the script.
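
For example, to list all available flags:

python3 ./hack/second-controller.py --help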