Github triggered automated docker deployment - Spotify on RPI

In short: Automatically deploy an app into a container - let's build a Raspberry Pi Spotify player

In this example we show how to use GitHub and GitHub runners to deploy a Spotify player for Raspberry Pi in a docker container on a remote device. The player can be parametrized through the GitHub repository. The process starts by pushing a change to your GitHub repository. Then, using GitHub actions and runners, the container files are packed into a tarball and placed into your qbee.io file manager via the qbee API. From there you can distribute and run it on the group of devices you have specified with file distribution. Any further change retriggers the whole process.

In order to achieve all this we need to do the following:

  • install docker on the RPI (can be automated, but we do it manually here)
  • duplicate the docker container repository on GitHub so that the repo is private and parameters can be changed
  • create the github runner that creates the tarball and transfers the file to qbee
  • create the qbee.io file distribution that triggers the docker build on the remote device(s)
  • listen to Spotify on your Raspberry Pi (Spotify Premium needed)

The first thing we need to do is install docker on the Raspberry Pi. See below for how to do this, or see the original post here.

Install docker on a Raspberry Pi
    # Install Docker
    curl -sSL https://get.docker.com | sh

    # Add permission such that pi user can run docker
    sudo usermod -aG docker pi

    # Now you need to reboot

    # Test Docker installation

    docker run hello-world

    # IMPORTANT! Install proper dependencies
    sudo apt-get install -y libffi-dev libssl-dev
    sudo apt-get install -y python3 python3-pip
    sudo apt-get remove python-configparser

    # Install Docker Compose
    sudo pip3 -v install docker-compose            

On GitHub we need to find a good docker container image for Raspotify. We selected the one called flaviostutz/rpi-spotify since it allows setting parameters through a docker-compose.yml file, and the idea is to show that changes will be automatically deployed to the docker containers.

For this demo this GitHub image needs to be copied into a private repository (to be able to change it and make a GitHub runner action react to those changes). Go to the upper right corner of your GitHub account, click on the plus and select "Import repository". Then insert the link to the repository, https://github.com/flaviostutz/rpi-spotify, and make it private.

github-duplicate

Now we need to setup the automatic workflow:

  • use Github for version control management (any change in the docker container repository will trigger the github runner)
  • setup a Github runner (using Github actions) to create a compressed tar file of the docker container content once a push is triggered
  • copy the new raspotify-demo.tar.gz file to the qbee.io file manager via API calls
  • use the qbee.io file distribution to upload the new file to a list of devices and to run a command that untars it and runs `docker-compose up`

How to create a repository on Github

Check out the official documentation on how to create a repository on Github.

Once a repository is set up, we specify the Github secrets as shown in the following screenshot. We define the qbee login mail address as QM and the password as QP.

github-secrets

There we specify our qbee.io username and password, as we do not want them to be exposed.

Once that is done, we create a GitHub action that runs our setup script on the runners provided by GitHub (or you can set up your own runners). GitHub actions can be created as follows (we use "set up a workflow yourself"):

github-secrets

The script we used is the following:

Github runner file (action) master.yml
    name: Automated Docker distribution

    on:
      push:
        branches: [ master ]
      pull_request:
        branches: [ master ]

    jobs:
      build:
        runs-on: ubuntu-latest
        env:
            DOCKERNAME: raspotify-demo.tar.gz   

        steps:
        - uses: actions/checkout@v2

        - name: create tarball
          run: |
            mkdir ./tar
            tar --exclude='./.git' --exclude='./.github' --exclude='./tar' -czvf ./tar/$DOCKERNAME .

        - name: copy to qbee file manager
          run: |
            curl --digest -u ${{ secrets.QM }}:${{ secrets.QP }} -i -X "DELETE" -d "path=/$DOCKERNAME" -H "Content-type: application/x-www-form-urlencoded" https://www.app.qbee.io/api/v2/file
            curl --digest -u ${{ secrets.QM }}:${{ secrets.QP }} -i -X POST -H "Content-Type:multipart/form-data" -F "path=/" -F "file=@./tar/$DOCKERNAME" https://www.app.qbee.io/api/v2/file

Comments on the yaml file

  • this action is triggered on pushes to the master branch. It is possible to use different triggers such as release tags.
  • the runner is a virtual machine with an Ubuntu OS, as we specified ubuntu-latest
  • on that virtual machine (freshly created for every single run) the code from your repository is checked out using the actions/checkout@v2 action
  • then a zipped tarball is created. Make sure to exclude the .git and .github directories and also the folder that will contain the .tar.gz file.
  • the output is placed into the root directory of your qbee.io file manager (cf. file distribution via API)
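The tarball step can be tried out locally before wiring it into the action. A minimal sketch (the demo-repo directory and its contents are made up for illustration; the exclude flags match the workflow above):

```shell
# Simulate the repository contents the runner would check out
mkdir -p demo-repo/.git demo-repo/.github demo-repo/tar
echo "version: '3.3'" > demo-repo/docker-compose.yml

# Same exclusions as in master.yml: skip VCS metadata and the output folder
tar --exclude='./.git' --exclude='./.github' --exclude='./tar' \
    -czf demo-repo/tar/raspotify-demo.tar.gz -C demo-repo .

# List the archive: docker-compose.yml is included, .git and .github are not
tar -tzf demo-repo/tar/raspotify-demo.tar.gz
```

Excluding the output folder itself matters: otherwise the archive would try to include the (partially written) tarball on the next run.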

Note that your credentials aren't revealed, as we use GitHub secrets and their values are masked in the action output:

github-node-red-secrets

Finally, we distribute our files to the remote devices as usual with the file distribution.

As a "command to run" we use sudo -u pi tar -xvzf /home/pi/raspotify-demo.tar.gz -C /home/pi/ && sudo -u pi docker-compose up -d. The first command uncompresses the tar file. It needs to run as user pi, and since we are in root context we need to define the proper target directory with -C /home/pi/ (or the files will end up in root's home). The second command, sudo -u pi docker-compose up -d, starts the docker-compose process. Note that this can take some time the first time it runs.
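The effect of the -C flag can be checked with a small local experiment (dummy file names, no sudo or docker required):

```shell
# Build a dummy tarball, then extract it into an explicit target directory,
# mirroring the -C /home/pi/ usage in the command above
mkdir -p src target
echo "dummy compose file" > src/docker-compose.yml
tar -czf demo.tar.gz -C src .

# Without -C the contents would land in the current working directory;
# with -C they end up in ./target
tar -xzf demo.tar.gz -C target
ls target
```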

github-docker-file-to-run

When this is done we can use the Spotify app to discover the docker container running the Raspberry Pi Spotify client. It shows up as a playback device via Spotify Connect. Please note that you need a Spotify Premium subscription to test this. As a result we can stream music to the Raspberry Pi:

qbeeSpeaker-Spotify

Currently we have defined the docker-compose.yml file such that the device is called "qbeeSound" (see above):

docker-compose.yml
    version: "3.3"

    services:

      rpi-spotify:
        build: .
        image: flaviostutz/rpi-spotify:1.6.1
        network_mode: host
        devices:
          - "/dev/snd:/dev/snd"
        environment:
          - SPOTIFY_NAME=qbeeSound

When we change the name in the repository to "qbeeSPEAKER" (see below), this will trigger a new GitHub runner action, creating a new raspotify-demo.tar.gz file. This will be noticed the next time the qbee agent runs on a device; the file will then be expanded again and the docker container will be rebuilt.

docker-compose.yml
    version: "3.3"

    services:

      rpi-spotify:
        build: .
        image: flaviostutz/rpi-spotify:1.6.1
        network_mode: host
        devices:
          - "/dev/snd:/dev/snd"
        environment:
          - SPOTIFY_NAME=qbeeSPEAKER

This is the change, the output from the runner and the new container identifying as a new Spotify device:

docker-change

docker-compose

qbeeSpeaker2-Spotify

In the container there are many other environment variables that can be defined (even an equalizer). But in essence this is just an example showing that anything can be built, distributed and deployed with GitHub and qbee.io. In production, consider moving the commands to run into a separate script (which could even be fetched from the repository as well).

Automatic deployment

Using this workflow, every time you push changes to the repository the tarball is rebuilt and then replaced within the qbee.io file manager. If the file differs from the one previously placed in the file manager, the qbee.io file distribution is triggered along with the run command that you provided. This distributes the file to all devices in scope and rebuilds the docker container. The same technique can also be used to distribute Python applications or anything else.
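The "file differs" check boils down to comparing the new tarball with the previous one. How the qbee agent detects changes internally is an implementation detail, but the idea can be illustrated with checksums: different contents yield different digests, and a differing digest is what would signal a redeployment.

```shell
# Two tarballs built from different configurations
echo "SPOTIFY_NAME=qbeeSound"   > conf.txt && tar -czf v1.tar.gz conf.txt
echo "SPOTIFY_NAME=qbeeSPEAKER" > conf.txt && tar -czf v2.tar.gz conf.txt

# Differing checksums indicate the file has changed
sha256sum v1.tar.gz v2.tar.gz
```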

Hence, all your edge devices are updated by a simple git push :)