Welcome to HooDev

My name is Daniel Hookins, and I am a professional Software Engineer.

This site is about building software. It's about exploration and trying out new things.

My goal is to take this site from humble HTML files and build it out into some cool web apps.

Note: The purpose of this site is just to have some fun and learn / teach things along the way, so please don't take it too seriously!


Continuous Integration for a Small Project - Using Bitbucket

For small home projects it's not worth setting up a massively complex CI pipeline.

But it's also annoying to have to log in to your server, pull the changes, docker down, docker up, etc.

So instead we are going to make use of Bitbucket's Pipelines tool and automate the process.

The end goal is that when we push to the master branch, a 'deploy to production' script will be triggered and take care of everything for us.

Building on What We Have

We are starting from the work previously done in Serving up some HTML files with Nginx - Using Docker and Git. Basically, we have a Docker container running Nginx serving up some HTML files, a Git repo initialized, and the 'master' branch used for production.
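For reference, the starting point looks roughly like the docker-compose.yml below. The 'web' service name, port, and ./html path are my assumptions here; adjust them to whatever your own setup uses.

    version: '3'
    services:
      web:
        build: .
        ports:
          - "80:80"
        volumes:
          - ./html:/usr/share/nginx/html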

Our production server is a DigitalOcean droplet, but yours can be anywhere, as long as you can SSH in and set this up.

In order to use Bitbucket's Pipeline tool, you will need to be using Bitbucket (obviously).

Writing the script that will do all the work

The first thing we need to do is write the script that will be doing all the work on the production server. Let's call it deploy-to-production.sh


This is the script that will run when we push our code to master, so it will need to go through all of the steps that you would take if you were SSHing into the production server manually.

For me, those steps are:

  1. cd into the directory where our docker-compose project is running
  2. Take that project down
  3. Pull the latest changes from the origin master branch
  4. Re-build the docker containers using a special production docker-compose file (we will look at that more in a minute)
  5. Give a little user feedback
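Putting those steps together, a minimal sketch of the script might look like this (the /var/www/hoodev path is a placeholder for wherever your repo lives on the server):

    #!/bin/sh
    # deploy-to-production.sh

    # 1. cd into the directory where the docker-compose project is running
    cd /var/www/hoodev || exit 1

    # 2. Take the running project down
    docker-compose -f docker-compose-prod.yml down

    # 3. Pull the latest changes from the origin master branch
    git pull origin master

    # 4. Re-build the containers using the production compose file
    docker-compose -f docker-compose-prod.yml up -d --build

    # 5. Give a little user feedback
    echo "Deployed to production."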

Updating Docker Compose for Production

As we saw in the script, we are going to use a different docker-compose file for production, which we will call docker-compose-prod.yml

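Here is a sketch of what it can look like, following on from the assumed dev file above:

    version: '3'
    services:
      web:
        build: .
        ports:
          - "80:80"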

At this stage the file is almost identical to our existing docker-compose.yml file; the only difference is that we have removed the volume.

This is removed because it is not necessary to use the volume on production. The files are not going to be edited on the fly, and therefore we only need to copy them over once. This is done in our original Dockerfile.

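That Dockerfile looks something like this (the nginx:alpine base image and the paths are assumptions carried over from the sketch above):

    FROM nginx:alpine

    # Copy the site into the image once, at build time
    COPY ./html /usr/share/nginx/html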

While we probably could have just used the same docker-compose.yml file, setting up one specifically for production will help to give us some flexibility in the future.

Running the script on Production

It's now time to see this in action.

Commit all the changes you have made and merge them into master.

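Assuming you are working on a staging branch, that is something like the following (the commit message is just a placeholder):

    git add .
    git commit -m "Add production deploy script"
    git checkout master
    git merge staging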

After you have merged staging into master, remember to push it to Bitbucket.

Then ssh into the production server to make sure that the script is working as intended.

On the server, the steps are:
  1. cd into the project directory
  2. git pull to get the latest changes we just made (the ones that contain the deploy script)
  3. sh ./deploy-to-production.sh (run the script!)
  4. You should see the script go through the steps and result in some kind of success.
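Putting that together, the whole check looks something like this (the user, host, and project path are placeholders):

    # from your local machine
    ssh user@your-server-ip

    # then, on the production server
    cd /var/www/hoodev
    git pull origin master
    sh ./deploy-to-production.sh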

Once it's all working as intended, we can automate the process with a Bitbucket Pipeline.

Setting up the Bitbucket Pipeline

Essentially, what we need to do is have something call the deploy-to-production.sh script every time we push something to master. Luckily, Bitbucket offers this kind of service.

To get it working we need to:

  1. Create a bitbucket-pipelines.yml file
  2. Enable pipelines in your Bitbucket settings
  3. Set up our environment variables

Create a bitbucket-pipelines.yml file

This is the file that tells Bitbucket what to do during a deploy.

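Here is a minimal sketch. PRODUCTION_USER and PRODUCTION_HOST are names I have chosen for the repository variables we will set up in a minute, and the project path is the same placeholder as before:

    image: atlassian/default-image:2

    pipelines:
      branches:
        master:
          - step:
              name: Deploy to production
              deployment: production
              script:
                - ssh $PRODUCTION_USER@$PRODUCTION_HOST "cd /var/www/hoodev && sh ./deploy-to-production.sh"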

The most important part of what we are doing here is the script step: on every push to master, it SSHes into our production server and runs the deploy-to-production.sh script we wrote earlier.

Now commit those changes, merge them into master and push them.

Then we need to set up the configuration in Bitbucket.

Bitbucket Configuration

Go into the Pipelines settings and enable them:


Settings > Pipelines Settings > Enable Pipelines

Now we need to add the repository variables that the bitbucket-pipelines.yml file refers to (PRODUCTION_USER and PRODUCTION_HOST in the sketch above).


The last settings you need to update are the SSH Keys.


Generate a new key pair here and then copy the public key over to ~/.ssh/authorized_keys on your server (instructions are on screen).
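On the server, that is a one-line append (the key here is abbreviated; paste the real one from the settings page):

    # on the production server
    echo "ssh-rsa AAAA...rest-of-the-key..." >> ~/.ssh/authorized_keys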


Also update your known hosts here. Just add the host address and click Fetch.


From this point on it should be all set up for deployments.

Push some content to master and watch it go!

Push some new content to the master branch


And that should automatically trigger the build in Bitbucket.

Navigate to "Pipelines" in your Bitbucket and hopefully you will see a success status.


If anything went wrong you can drill down on it by clicking on the status or the commit name.


From here we can see the same deploy-to-production.sh script that we wrote being run via SSH during the build.

If you have a failure in your build you should be able to debug it from here. If you're seeing success then you should be good to go.

You will now have a simplified CI / CD pipeline set up, and any time you push something to master your live site will be updated.
