I just finished setting up my first automated deployment pipeline from GitHub to Google Cloud Functions using Cloud Build.
The benefits: a more streamlined deployment process, and knowing exactly which commit is currently deployed as the Cloud Function.
This turned out to be deceptively tricky because I was using a private GitHub repo as a submodule. After a lot of googling and learning how to use SSH, I finally got it to work!
The Problem 🤔
Google Cloud Build: easily connect to a GitHub repo (even a private one!) and deploy. Sounds super easy! WRONG! There's no built-in support for submodules.
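For context, the parent repo pulls in the shared code as an ordinary Git submodule over HTTPS. The .gitmodules file looks something like this (the path and repo names here are placeholders, not the real ones):

```ini
[submodule "common"]
	path = common
	url = https://github.com/my-org/private-common-repo.git
```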
The Solution 💡
So we need to fetch the submodule ourselves. We can add a step to our cloudbuild.yaml file that clones the submodule, so Cloud Build can then access its contents in the deployment step. Easy, right?
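My first pass was roughly the following (simplified): a plain Git step with no credentials configured for the private submodule.

```yaml
# First attempt (simplified): fetch the submodule with no extra auth
steps:
- name: 'gcr.io/cloud-builders/git'
  args: ['submodule', 'update', '--init', '--recursive']
```

Cloud Build promptly responded with: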
```
fatal: could not read Username for 'https://github.com': No such device or address
```
No dice.
The Problem #2 🤔
Despite how simple it was to connect Cloud Build to our parent repo, it turns out connecting to the submodule repo is not so straightforward.
At this stage, we are authenticating with a GitHub account that has admin access to both repos.
The Solution(s?) #2 💡
A bit of googling later, there seem to be two ways to get around this, both using secrets in our build config file to authenticate directly with GitHub:
- Using Personal Access Tokens (and HTTPS connection)
- Using SSH
If you're past-me, you spend a lot of time trying to get the easy method (no. 1) to work to no avail. If you're current me, you will avoid that mistake and go straight to using SSH.
Here's how:
- I followed this guide from the Google Cloud docs to set up SSH access to my submodule's GitHub repo.
A note about the note in that guide: it doesn't apply in this case. Cloud Build can already access our parent repo through its trigger, but SSH is required for the submodule repo.
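For reference, the setup in that guide boils down to roughly the following (the secret name and file names are whatever you pick):

```bash
# Generate an SSH key pair for Cloud Build to use (no passphrase)
ssh-keygen -t rsa -b 4096 -N '' -f id_rsa -C "cloud-build"

# Store the private key in Secret Manager (referenced later as SECRET_NAME)
gcloud secrets create SECRET_NAME --data-file=id_rsa

# Capture GitHub's host key so the build can verify the connection
ssh-keyscan -t rsa github.com > known_hosts.github
```

The public key (id_rsa.pub) gets added as a read-only deploy key on the submodule repo, and known_hosts.github gets committed to the parent repo so the build step below can copy it into place. One more gotcha worth mentioning: for the SSH key to actually be used, the submodule URL in .gitmodules needs to be the SSH form (git@github.com:...) rather than HTTPS, otherwise Git will keep trying to prompt for a username.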
Now, with SSH set up as per the guide, the cloudbuild.yaml file will end up looking something like this, with three steps:
- copy the SSH key and the GitHub known_hosts file into place
- clone our submodule repo
- deploy our Cloud Function
```yaml
# Configure Cloud Build steps
steps:
# Step 1: install the SSH key and GitHub's host key
- name: 'gcr.io/cloud-builders/git'
  secretEnv: ['SSH_KEY']
  entrypoint: 'bash'
  args:
  - -c
  - |
    echo "$$SSH_KEY" >> /root/.ssh/id_rsa
    chmod 400 /root/.ssh/id_rsa
    cp known_hosts.github /root/.ssh/known_hosts
  volumes:
  - name: 'ssh'
    path: /root/.ssh

# Step 2: clone the submodule over SSH
- name: 'gcr.io/google.com/cloudsdktool/cloud-sdk' # Cloud SDK container
  args:
  - git
  - submodule
  - update
  - --init
  - --recursive # to clone the submodule
  volumes:
  - name: 'ssh'
    path: /root/.ssh

# Step 3: deploy the Cloud Function
- name: 'gcr.io/google.com/cloudsdktool/cloud-sdk' # Cloud SDK container
  args:
  - gcloud
  - functions
  - deploy
  - cloud-function-name
  - --region=australia-southeast1
  - --source=.
  - --trigger-http
  - --runtime=python312

# Make the SSH key from Secret Manager available to the build
availableSecrets:
  secretManager:
  - versionName: projects/PROJECT_ID/secrets/SECRET_NAME/versions/latest
    env: 'SSH_KEY'

options:
  logging: CLOUD_LOGGING_ONLY
```
And voilà! Now, magically (via SSH), Cloud Build can authenticate with our submodule repo and has no problems running all the build steps and successfully deploying the Cloud Function.
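If you want to test the pipeline without pushing a commit, you can also kick off the build by hand from the parent repo root (assuming gcloud is pointed at the right project):

```bash
# Run the build manually instead of waiting for the GitHub trigger
gcloud builds submit --config cloudbuild.yaml .
```

One last note: the Cloud Build service account needs the Secret Manager Secret Accessor role on the SSH key secret, and permission to deploy Cloud Functions, or the corresponding steps will fail.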
This task plagued me for a Friday afternoon but now that it is solved, managing our deployment process for multiple cloud functions which all use a common submodule repo will be a piece of cake. 🍰