Updating distributed Raspberry Pis with automatic code updates

I’m working on a project that has multiple Raspberry Pis, distributed across multiple locations but all running the same code (Django Channels with Python background worker scripts). I wanted to be able to deploy updates to the main code to all the Pis with as little manual interaction as possible. I couldn’t find an obvious way to do this, so I thought I’d summarise what I’ve set up in the hope of getting some feedback, or that it might help someone else.

Basic overview of the solution

The main code base is stored in a Git repository in BitBucket.

I have a development Pi set up, and when I’m ready I push code changes from it to the master branch of the BitBucket repo.

Each deployed Pi runs an updateWorker Python script that periodically downloads the remote code from the BitBucket repository using the git fetch command. It then checks for changes using the git status command and, if there are changes, updates the code using git reset.

Running Git from Python

To run the Git commands from my Python script I’m using a Python package called sh. It works as a subprocess replacement that lets you call any program as if it were a function. I plan to write another post about using it, but the basics are intuitive. For example, to run git status in the local directory and save the response in a variable called statusCheck:

import sh
from sh import git

statusCheck = git("status")

(See this tutorial for more details on how the login to BitBucket is handled)

updateWorker Script

import sh
from sh import git
import time
import os, sys

aggregated = ""

def CheckForUpdate(workingDir):
    print("Fetching most recent code from source... " + workingDir)

    # Fetch the most up-to-date version of the code.
    p = git("--git-dir=" + workingDir + ".git/", "--work-tree=" + workingDir, "fetch", "origin", "master", _out=ProcessFetch, _out_bufsize=0, _tty_in=True)
    print("Fetch complete.")
    print("Checking status for " + workingDir + "...")

    statusCheck = git("--git-dir=" + workingDir + ".git/", "--work-tree=" + workingDir, "status")

    # Newer versions of Git print "up to date" without the hyphens.
    if "Your branch is up-to-date" in statusCheck or "Your branch is up to date" in statusCheck:
        print("Status check passes.")
        print("Code up to date.")
        return False
    else:
        print("Code update available.")
        return True

def ProcessFetch(char, stdin):
    global aggregated

    aggregated += char
    if aggregated.endswith("Password for 'https://yourrepo@bitbucket.org':"):
        print("Entering password...")
        # Write the password to git's stdin. "yourPassword" is a
        # placeholder - see the tutorial linked above for how the
        # BitBucket login is handled.
        stdin.put("yourPassword" + "\n")

if __name__ == "__main__":
    checkTimeSec = 60
    gitDir = "/var/testupdate/"
    while True:
        print("*********** Checking for code update **************")
        if CheckForUpdate(gitDir):
            print("Resetting code...")
            resetCheck = git("--git-dir=" + gitDir + ".git/", "--work-tree=" + gitDir, "reset", "--hard", "origin/master")
        print("Check complete. Waiting for " + str(checkTimeSec) + " seconds until next check...")
        time.sleep(checkTimeSec)

updateWorker uses the --git-dir and --work-tree options to tell git where the local repo and working tree for the code live.
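Since every call repeats the same two options, they could be built by a small helper (a hypothetical refactor, not in the original script):

```python
def git_args(working_dir, *cmd):
    """Build the argument list for running git against a repo that
    lives in working_dir, from any current directory."""
    return ["--git-dir=" + working_dir + ".git/",
            "--work-tree=" + working_dir] + list(cmd)

# e.g. git(*git_args("/var/testupdate/", "status"))
```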

The script parses the response from the git status command to determine if there are updates. If an update is available we will see:

On branch master
Your branch is behind 'origin/master' by 2 commits, and can be fast-forwarded.
  (use "git pull" to update your local branch)

nothing to commit, working directory clean
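The check itself could be isolated into a small pure function, which makes it easy to test against captured status output (a hypothetical refactor; note the "behind" message only appears once a fetch has run):

```python
def update_available(status_output):
    # After a fetch, git status reports "Your branch is behind
    # 'origin/master'..." when new commits are waiting to be applied.
    return "Your branch is behind" in status_output
```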

On my Pis the updateWorker then kills all the running scripts, which are automatically restarted by another background process; this should mean the scripts all end up running the up-to-date versions. So far it’s working well, but I’m not sure if I’m doing something crazy wrong!
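If you don’t have a separate supervisor process, another option is for a script to re-exec itself once an update lands. This is a sketch of that alternative, not part of the set-up described above:

```python
import os
import sys

def restart_argv():
    # Relaunch with the same interpreter and command-line arguments.
    return [sys.executable] + sys.argv

def restart():
    # Replace the current process with a fresh one so it picks up the
    # newly reset code from disk. os.execv never returns on success.
    os.execv(sys.executable, restart_argv())
```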
