Continuous delivery for WordPress using Bitbucket Pipelines

Keep local, staging and production sites in sync for free.

I look after a team of developers who all follow the practice of continuous delivery. I thought it was about time I applied the same principles to my own site, but I wanted to do it for free.

Continuous delivery aims to automate the testing and release process so you can deploy new code with a click of a button.

The problem

My current setup uses Atlassian’s Sourcetree (which is free). It helps simplify how I interact with my Git repository. If you don’t know what Git is you can read the Git basics here.

My repository (repo) is then backed up in the cloud using Atlassian’s Bitbucket which is also free for small teams. So far, so good.

Now here comes the problem. To deploy code I would have to manually move files via SFTP (a secure version of FTP) from my local version of the site, to staging and finally to production. This process was slow, prone to errors and had no version control.


I wanted:

  • all main branch commits to automatically update my staging website
  • all deployments to be versioned
  • the ability to roll back to a previous version
  • to manually trigger a deploy to production with one click.

Pipeline setup

To achieve these goals I used Bitbucket Pipelines, another Atlassian product which will automatically deploy and test a website.

For my setup I decided to use Pipelines only to update my WordPress theme, uploading files via SFTP, but you could use FTP.

For this setup you need to be using Bitbucket and to enable Pipelines.

To enable Pipelines go to your repository settings and then select Pipelines settings.

Toggle the switch to enable Pipelines. Before you configure your bitbucket-pipelines.yml file you need to set up some environment variables which I’ll explain next.

Environment variables

To set up your environment variables:

  • go to ‘Settings’ (within your Bitbucket repository)
  • select Pipelines ‘Environment variables’.

Screenshot selecting environment variables within Pipelines

I created two variables:

  • SFTP_password
  • SFTP_username

The variables are the username and password required to access my site via SFTP. If you don’t know the password and username just ask your web host.

The variables have to be all one word but you can call them anything you want.

.yml file

Bitbucket Pipelines runs your builds in Docker containers. These containers run a Docker image that defines the build environment. You can use the default image provided by Bitbucket or supply a custom one.

To create your Docker container you need to create a bitbucket-pipelines.yml file. Go to your repository settings, then Pipelines settings, and click the button ‘Configure bitbucket-pipelines.yml’. I selected PHP as the language and replaced the template with the following.

image: php:5.6.31

pipelines:
  default:
    - step:
        script:
          - apt-get update
          - apt-get -qq install git-ftp
          - git ftp init --user $SFTP_username --passwd $SFTP_password sftp://…

This updated .yml file tells Bitbucket to take the main branch of the repository and copy all the files to the sftp:// URL.

The line ‘image: php:5.6.31’ specifies the Docker image for the container. The image version I’m using matches the PHP version my WordPress site runs on.

The next two lines install git-ftp. This tool keeps track of the uploaded files by storing the commit id in a log file on the server. It uses Git to determine which local files have changed.

- apt-get update
- apt-get -qq install git-ftp
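As a side note, the ‘which local files have changed’ part is plain Git. This small self-contained sketch (the file names are invented for illustration) shows the mechanism git-ftp builds on: diffing against the last-deployed commit id to list only the changed files.

```shell
# Demo of the Git feature git-ftp relies on: listing only the files
# changed since a recorded commit id (the one git-ftp stores on the server).
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q
git config user.email demo@example.com
git config user.name demo
echo "body { color: red; }" > style.css
git add . && git commit -qm "first deploy"
first=$(git rev-parse HEAD)            # the id git-ftp would store on the server
echo "<?php // small tweak" > functions.php
git add . && git commit -qm "second deploy"
git diff --name-only "$first" HEAD     # prints only: functions.php
```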

The last line is the most important. It references the environment variables we created earlier. It also has the URL of where to sync our repository files.

- git ftp init --user $SFTP_username --passwd $SFTP_password sftp://…

In my case I only want to use git-ftp to update my WordPress theme. You should replace the sftp:// URL with the location where your repository needs to sync. If you can’t use SFTP you can use FTP.
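For illustration only, here is what that final argument might look like for a theme deploy. The host and theme name below are made-up placeholders, not values from this post; wp-content/themes/ is simply where WordPress keeps its themes.

    - git ftp push --user $SFTP_username --passwd $SFTP_password sftp://example.com/wp-content/themes/my-theme/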

git ftp init copies all the files in your repo to the URL destination, using the username and password from your environment variables.

Note: if you already have the files on your site you need to change ‘init’ to ‘catchup’. Catchup assumes your site already has the latest files and simply records the commit id on the server.

After your first successful pipeline build, change the word ‘init’ (or ‘catchup’) to ‘push’. Push will then only upload the files that have changed in your repo.

This change makes any commits pushed to the main branch go live automatically. For me this meant my staging site would automatically update whenever I committed any files from my local site.

image: php:5.6.31

pipelines:
  default:
    - step:
        script:
          - apt-get update
          - apt-get -qq install git-ftp
          - git ftp push --user $SFTP_username --passwd $SFTP_password sftp://…

Manually trigger a deploy

The next step for me was to add my production site to Pipelines. Instead of automatically pushing code this would be manually triggered.

To do this I added another section called ‘custom’ to my .yml file (see below). Custom pipelines only run when triggered manually.

Under ‘custom’ I created a pipeline called ‘deployment-to-prod’, but you could call it anything. I then copied the same steps as in ‘default’ but updated the URL to point to my production site.

The first time you run it, make sure it is set to ‘init’ (just like above), but keep the ‘default’ section set to ‘push’, as that one has already been initialised.

image: php:5.6.31

pipelines:
  default: # Pipelines triggered automatically
    - step:
        script:
          - apt-get update
          - apt-get -qq install git-ftp
          - git ftp push --user $SFTP_username --passwd $SFTP_password sftp://…
  custom: # Pipelines triggered manually
    deployment-to-prod:
      - step:
          script:
            - apt-get update
            - apt-get -qq install git-ftp
            - git ftp init --user $SFTP_username --passwd $SFTP_password sftp://…

Once you have a successful pipeline build change ‘init’ to ‘push’ and then commit the pipeline again.

Now you’ve set up your pipeline, every time you commit and push code to your Bitbucket repository your staging site will automatically update; production, however, will require you to trigger the deploy manually.

To manually trigger your deploy on production:

  • go to your Bitbucket repository
  • select ‘Branches’
  • on the master branch line click the three little dots ‘…’
  • select ‘Run pipeline for a branch’.

Screenshot to manually deploy pipeline

Once selected you will see the custom pipeline we created, called ‘deployment-to-prod’. Click ‘Run’ to start the deployment.

You should now have a fully functioning pipeline with proper versioning, allowing you to roll back any change.
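The article doesn’t spell out the rollback mechanics. One approach (my assumption, not something the post prescribes) is to revert the bad commit in Git and let the pipeline redeploy the restored files. A self-contained sketch with made-up file contents:

```shell
# Sketch: roll back by reverting the bad commit; pushing the revert commit
# would trigger the pipeline, and git-ftp would re-upload the restored file.
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q
git config user.email demo@example.com
git config user.name demo
echo "good stylesheet" > style.css
git add . && git commit -qm "working version"
echo "broken stylesheet" > style.css
git add . && git commit -qm "bad deploy"
git revert --no-edit HEAD              # new commit restoring the old content
cat style.css                          # prints: good stylesheet
```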


Having Bitbucket Pipelines is great, even for this simple example. I can now easily deploy code from local to staging and production with very little effort, and at the same time everything is version controlled.

The only downside is you only get 50 free minutes a month to deploy your code.

Each deploy lasted around 30 to 60 seconds, which means I could do around 30 commits a month. I think for most hobby sites this is probably fine.

Next I’m going to see if I can introduce some testing with each commit such as unit tests or some automated tests with Selenium.


  1. Kyle says:

This post assumes little knowledge of web development workflow and clearly explains advanced terms and processes for novice/hobbyist devs. Thanks for dummy-friendly instructions!

  2. Fojameru says:


Nice information, unfortunately when I try the .yml file it seems to error on --user in the third step. The error is as follows:
    + apt-get update - apt-get -qq install git-ftp - git ftp init --user $**** --passwd $****
    E: Command line option --user is not understood

    Is there any hint?

  3. Peter Brumby says:

Hi Fojameru, I’m not sure. It could be a number of things; the most obvious is that you haven’t set up the environment variables in Bitbucket.

    You could also take a look at this page which lists all the commands for git-ftp

    Good luck

  4. Kyle says:

This configuration has recently stopped working for me. It seems git-ftp is no longer able to log in over SFTP.

    The maintainer of git-ftp suggests that git-ftp ignored fingerprints in the past, but now requires them.

    Have you encountered this same issue? If so, how did you resolve?


  5. Peter Brumby says:


Just tried deploying to my dev and live site and didn’t have a problem, sorry 🙃

I’ve had one issue in the past where my hosting provider changed the URL path, and another when setting up SFTP: the host provider had to enable it for my hosting. Have you checked with your host?

  6. kilinkis says:

    Good article, thank you for sharing.
    It would be nice to see how to do this with CircleCI.
    That’s my next research topic 🙂

  7. Putnik says:

Hi, how do you process WP updates, especially those which change the db?

    • Peter Brumby says:

Hi Putnik, the quick answer is I don’t know. For WP updates and plugin updates you could use the process I’ve described above. For DB changes I’ve always used plugins to help migrate, but this is just a hobby site.

  8. Peter Woodward says:

    The article mentions the limit on the build minutes. It should be noted that more build minutes are available for as little as $10 a month.

    I’ve been using this method for about 3 years for various clients and it’s a huge time saver!

  9. Danne says:

Is there a way to make this process faster? For example the “- apt-get -qq install git-ftp” step. I read somewhere that you can create your own image with git-ftp pre-installed so that it doesn’t run in every pipeline. Do you by any chance know a way around that without having to create an entirely new image?

  10. Robin says:


Nice article. Instead of using SFTP I have my Git and Bitbucket connected via SSH, and I would now like to automate the system and add pipelines. Can you point me to how/what I need to change above to use SSH instead of SFTP?

