My name is Daniel Hookins, and I am a professional Software Engineer.
This site is about building software. It's about exploration and trying out new things.
My goal is to take this site from humble HTML files and build it out into some cool web apps.
Note: The purpose of this site is just to have some fun and learn / teach things along the way, so please don't take it too seriously!
For small home projects it's not worth setting up a massively complex CI pipeline.
But it's also annoying to have to log in to your server, pull the changes, docker down, docker up, etc.
So instead we are going to make use of Bitbucket's Pipelines tool and automate the process.
The end goal is that when we push to the master branch, a 'deploy to production' script will be triggered and take care of everything for us.
We are starting from the work previously done in Serving up some HTML files with Nginx - Using Docker and Git. In short, we have a Docker container running Nginx serving up some HTML files, and a Git repo initialized with the 'master' branch used for production.
Our production server is a Digital Ocean droplet, but yours can be anywhere, as long as you can SSH in and set this up.
In order to use Bitbucket's Pipeline tool, you will need to be using Bitbucket (obviously).
The first thing we need to do is write the script that will be doing all the work on the production server. Let's call it deploy-to-production.sh
This is the script that will run when we push our code to master, so it will need to go through all of the steps that you would take if you were SSHing into the production server manually.
For me, those steps are:
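To make those steps concrete, here is a minimal sketch of what deploy-to-production.sh could look like. The repo path and the compose file name are assumptions based on this setup, so adjust them for your server. The steps are wrapped in a function so the script is easy to read and to source:

```shell
#!/usr/bin/env bash
# deploy-to-production.sh -- minimal sketch; adjust APP_DIR for your server.
set -euo pipefail

# Hypothetical location of the git checkout on the production server.
APP_DIR="${APP_DIR:-$HOME/mysite}"

deploy_to_production() {
  cd "$APP_DIR"

  # Pull the latest changes on the master branch
  git pull origin master

  # Tear down the running containers and bring them back up
  # with the production compose file, rebuilding the image.
  docker-compose -f docker-compose-prod.yml down
  docker-compose -f docker-compose-prod.yml up -d --build
}
```

On the server, add a line calling deploy_to_production at the bottom of the file, make it executable with chmod +x, and it is ready to be run over SSH.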
As we saw in the script, we are going to use a different docker-compose file for production, which we will call docker-compose-prod.yml.
At this stage the file is almost identical to our existing docker-compose.yml file, the only difference is that we have removed the volume.
The volume is removed because it is not necessary on production: the files are not going to be edited on the fly, so they only need to be copied over once, which is already done in our original Dockerfile.
While we probably could have just used the same docker-compose.yml file, setting up one specifically for production will help to give us some flexibility in the future.
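As a rough sketch, docker-compose-prod.yml could look like the following; the service name and port mapping are assumptions, since the file is just your existing docker-compose.yml with the volumes entry deleted:

```yaml
# docker-compose-prod.yml -- sketch; same as docker-compose.yml,
# but with the volumes: entry removed (files are baked into the
# image by the Dockerfile's COPY instead).
version: '3'
services:
  web:
    build: .
    ports:
      - "80:80"
```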
It's now time to see this in action.
Commit all the changes you have made and merge them into master.
After you have merged staging into master, remember to push to Bitbucket.
Then SSH into the production server and run the script to make sure it works as intended.
You should see something similar to the output below.
Once it's all working as intended, we can automate the process with a Bitbucket Pipeline.
Essentially, what we need to do is have something call the deploy-to-production.sh script every time we push something to master. Luckily, Bitbucket offers this kind of service.
To get it working we need to:

1. Add a bitbucket-pipelines.yml file to the repo
2. Enable Pipelines in the repository settings
3. Add the variables that the file refers to
4. Set up the SSH keys and known hosts
This is the file that tells Bitbucket what to do during a deploy.
The most important part of what we are doing here is the step that SSHes into the production server and runs the deploy script.
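Here is a sketch of what the bitbucket-pipelines.yml could look like. The build image, the repo path on the server, and the variable names ($PRODUCTION_USER, $PRODUCTION_HOST) are assumptions; use whatever names you define in the repository variables:

```yaml
# bitbucket-pipelines.yml -- sketch; variable names are assumptions
# and must match the repository variables you configure in Bitbucket.
image: atlassian/default-image:2

pipelines:
  branches:
    master:
      - step:
          name: Deploy to production
          deployment: production
          script:
            # SSH into the server and run the deploy script
            - ssh $PRODUCTION_USER@$PRODUCTION_HOST 'bash ~/mysite/deploy-to-production.sh'
```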
Now commit those changes, merge them into master and push them.
Now we need to set up the configuration in Bitbucket.
Go into the Pipelines settings and enable them:
Settings > Pipelines Settings > Enable Pipelines
Now we need to add the variables that the bitbucket-pipelines.yml file refers to:
The last settings you need to update are the SSH keys.
Generate a new public key here and then copy it over to your ~/.ssh/authorized_keys on your server (instructions are on screen).
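On the server side, installing the generated key is just an append to the deploy user's authorized_keys file. A sketch, where the key string is a placeholder; paste the actual public key Bitbucket shows you:

```shell
# Create ~/.ssh with the right permissions if it doesn't exist yet
mkdir -p ~/.ssh && chmod 700 ~/.ssh

# Append the public key from Bitbucket's SSH keys settings page.
# The key below is a placeholder, not a real key.
echo 'ssh-rsa AAAAB3...placeholder... pipelines@bitbucket' >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
```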
I blurred out my key there, but it is a public key, so I didn't really need to.
Also update your known hosts here. Just add the host address and click Fetch.
Again, I blurred the fingerprint, but didn't really need to.
From this point on it should be all set up for deployments.
Push some new content to the master branch, and that should automatically trigger the build in Bitbucket.
Navigate to "Pipelines" in your Bitbucket repository and hopefully you will see a success status.
If anything went wrong you can drill down on it by clicking on the status or the commit name.
From here we can see the same deploy-to-production.sh script that we wrote being run via SSH during the build.
If you have a failure in your build you should be able to debug it from here. If you're seeing success then you should be good to go.
You will now have a simplified CI / CD pipeline set up, and any time you push something to master your live site will be updated.