Hello nerds! I'm hosting a lot of things on my home lab using docker compose. I have a private repo on GitHub for the config files. This works fine, but every time I want to make a change I have to push the changes, then ssh to the lab, pull them, and run docker compose up. I'd like to automate that. Does anyone have a similar setup and know of a good tool? I know I could use watchtower to update existing images, but this is more for when I change a setting or add a new service.

I’ve considered roughly four approaches.

  1. A new container that mounts the whole project directory and the docker socket. It would register a webhook in GitHub to receive notifications when I push to the repo, then run git pull and docker compose up (see the sketch after this list). My worries here are the usual dind gotchas.

  2. Same as 1, but don't mount anything; instead, ssh from the container to the host and run the steps there. This avoids any dind issues, but I don't love giving the container an ssh key to the host.

  3. Run a service on the host outside of docker. This is probably the correct approach, but it's annoying since my host is a Synology NAS, which doesn't have systemd or anything like that afaik.

  4. Have a GitHub Action ssh to the machine and run the steps. Honestly the easiest way, but I'd prefer not to open ssh to the internet.
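
Roughly what I have in mind for option 1, as a sketch. The image name is hypothetical and all paths are placeholders; note that because the host's socket is mounted, the container drives the host's Docker daemon rather than running true dind:

```yaml
# Hypothetical sketch of option 1; image name and paths are placeholders.
services:
  deployer:
    image: example/webhook-deployer:latest         # hypothetical image
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock  # host Docker, not dind
      - /volume1/docker/homelab:/deploy            # compose project dir (placeholder)
    ports:
      - "9000:9000"                                # endpoint the GitHub webhook calls
    # On each push event, the handler would effectively run:
    #   cd /deploy && git pull --ff-only && docker compose up -d
```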

Any feedback or tips are much appreciated. I don't feel like any of my options are very good, and I suspect I'm missing something obvious.

  • Im_old@lemmy.world · 5 days ago

    Why not host your own git repo (e.g. Gitea) so you can do 2 or 4 without exposing services to the internet?
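
    Gitea itself is a single compose service, so it would slot into the existing stack; a minimal sketch using the official gitea/gitea image (data path and host ports are assumptions):

    ```yaml
    # Minimal Gitea sketch; volume path and host ports are placeholders.
    services:
      gitea:
        image: gitea/gitea:latest
        volumes:
          - ./gitea-data:/data   # repos and config live here
        ports:
          - "3000:3000"          # web UI
          - "2222:22"            # ssh, for git push
        restart: unless-stopped
    ```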

    • bjornsno@lemm.ee (OP) · 5 days ago

      I’d be a bit concerned with having the git repo also be hosted on the machine itself. If the drives break it’s all gone. I could of course have two remotes, but then pushing changes still becomes a multi-step procedure.

      • umami_wasabi@lemmy.ml · 5 days ago

        Back up, mate. Either locally or to something over the network. When it comes to data loss, it will find you eventually.

        • bjornsno@lemm.ee (OP) · 5 days ago

          I do have nightly off-site backups, that’s true. Still, having the git repo be on the same machine doesn’t seem right to me.

      • Lem453@lemmy.ca · 5 days ago

        I would strongly suggest a second device, like an RPi with Gitea. That’s what I have.

        I use Portainer to pull straight from git and deploy.

      • JASN_DE@lemmy.world · 5 days ago

        > I’d be a bit concerned with having the git repo also be hosted on the machine itself.

        Please tell me you have a tested backup solution/procedure in place.

  • Decronym@lemmy.decronym.xyz (bot) · 4 days ago

    Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:

    Fewer Letters   More Letters
    Git             Popular version control system, primarily for code
    IP              Internet Protocol
    SSH             Secure Shell for remote terminal access

    [Thread #827 for this sub, first seen 23rd Jun 2024, 10:55] [FAQ] [Full list] [Contact] [Source code]

  • umami_wasabi@lemmy.ml · 5 days ago

    A GH Action is all you need. If you’re really worried about opening ssh to the internet, use Tailscale or a Cloudflare Tunnel. Or use a firewall rule to block all traffic except from GH IP ranges. TBH, I have VPSes with SSH open to the whole world. Yes, they get many hits every day, but the hits don’t go anywhere beyond that.
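
    For the firewall route, GitHub publishes the address ranges its hosted runners use at the official meta endpoint; a sketch of pulling them for an allow-list (wiring them into your firewall is left as an exercise):

    ```sh
    # IP ranges used by GitHub-hosted Actions runners, one per line.
    curl -s https://api.github.com/meta | jq -r '.actions[]'
    ```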

  • butitsnotme@lemmy.world · 5 days ago

    For no. 1, that shouldn’t be dind; the container would be controlling the host Docker, wouldn’t it?

    If so, keep in mind that this is the same as giving root SSH access to the host machine.

    As far as security goes, anything that allows GitHub to cause your server to download (pull) and run a set of arbitrary Docker images with arbitrary configuration is remote code execution. It doesn’t really matter what you do to secure access to the machine if someone compromises your GitHub account.

    I would probably set up SSH with a key dedicated to GitHub, specifically for deploying. If SSH is configured to only allow keys for access, it’s not much of a security risk to open it up to the internet. I would then configure that key to only be able to run a single command, which I would make a very simple bash script that runs git fetch, then git verify-commit origin/main (or whatever branch you deploy), before checking out the latest commit on that branch.
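
    A minimal sketch of that script; the path, branch, and key are placeholders:

    ```sh
    #!/bin/sh
    # /usr/local/bin/deploy.sh -- hypothetical forced-command deploy script.
    # Locked to the deploy key via ~/.ssh/authorized_keys (one line), e.g.:
    #   command="/usr/local/bin/deploy.sh",restrict ssh-ed25519 AAAA... github-deploy
    set -eu
    cd /volume1/docker/homelab        # compose project dir (placeholder)
    git fetch origin
    git verify-commit origin/main     # refuse unsigned or untrusted commits
    git checkout --detach origin/main
    docker compose up -d --remove-orphans
    ```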

    You can sign commits fairly easily using SSH keys now, which, combined with the above, lets you store your data on GitHub without effectively granting them RCE on your host.
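
    For reference, the SSH-based signing setup is just git config (git 2.34+); key paths and email are placeholders:

    ```sh
    # On the machine you commit from: sign every commit with an SSH key.
    git config --global gpg.format ssh
    git config --global user.signingkey ~/.ssh/id_ed25519.pub
    git config --global commit.gpgsign true

    # On the server: list the keys git verify-commit should trust.
    echo "you@example.com $(cat ~/.ssh/id_ed25519.pub)" > ~/.ssh/allowed_signers
    git config --global gpg.ssh.allowedSignersFile ~/.ssh/allowed_signers
    ```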

  • witten@lemmy.world · 4 days ago

    I use Ansible to meet this need. Whenever I want to deploy to one or more remote hosts, I run Ansible locally and it connects via SSH to the remote host(s). There, it can run Docker Compose, configure services, lay down files on the host, restart things, etc.
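
    A minimal sketch of such a playbook; the host group, repo URL, and path are assumptions, and the compose task needs the community.docker collection installed:

    ```yaml
    # Sketch: pull the compose config and bring the stack up over SSH.
    - hosts: homelab
      tasks:
        - name: Pull the latest compose config
          ansible.builtin.git:
            repo: git@github.com:you/homelab.git   # placeholder repo
            dest: /opt/homelab                     # placeholder path
            version: main

        - name: Bring the stack up
          community.docker.docker_compose_v2:
            project_src: /opt/homelab
            state: present
    ```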

  • ChrislyBear@lemmy.world · 5 days ago

    Have a look at GitLab.

    I’m doing the same thing you are, but automatically. I have a repo per app and a few GitLab runners connected on my Raspis/servers. Every time I push a change, the shell runner runs the commands configured for the pipeline. I don’t have to lift a finger after changes.
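
    A sketch of what the pipeline looks like with a shell runner on the host (the runner tag is a placeholder):

    ```yaml
    # Hypothetical .gitlab-ci.yml; the shell runner clones the repo and
    # runs these commands directly on the host.
    deploy:
      stage: deploy
      tags:
        - homelab-shell          # placeholder runner tag
      rules:
        - if: '$CI_COMMIT_BRANCH == "main"'
      script:
        - docker compose pull
        - docker compose up -d --remove-orphans
    ```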

  • Deifyed@lemmy.ml · 5 days ago

    Just brainstorming here:

    You could expose a bare git repo on the server with a git hook that runs Docker compose up on push.
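
    A sketch of that hook; in a bare repo the usual choice is post-receive, and all paths and the branch are assumptions:

    ```sh
    #!/bin/sh
    # Hypothetical ~/homelab.git/hooks/post-receive (must be executable).
    # Checks the pushed config out into the compose directory and redeploys.
    set -eu
    WORKTREE=/volume1/docker/homelab   # compose project dir (placeholder)
    git --work-tree="$WORKTREE" --git-dir="$HOME/homelab.git" checkout -f main
    cd "$WORKTREE"
    docker compose up -d --remove-orphans
    ```

    You’d add the server as a second remote and push straight to it.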

    You could also have GitHub actions ssh in and run git pull && docker compose up on push to main.
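
    And a sketch of the Actions variant; the secrets, user, host, and path are all placeholders:

    ```yaml
    # Hypothetical .github/workflows/deploy.yml -- runs the same steps
    # you'd otherwise run by hand, over ssh.
    name: deploy
    on:
      push:
        branches: [main]
    jobs:
      deploy:
        runs-on: ubuntu-latest
        steps:
          - name: ssh in and redeploy
            env:
              SSH_KEY: ${{ secrets.DEPLOY_SSH_KEY }}   # placeholder secret
              HOST: ${{ secrets.DEPLOY_HOST }}         # placeholder secret
            run: |
              install -m 600 /dev/null key && echo "$SSH_KEY" > key
              ssh -i key -o StrictHostKeyChecking=accept-new "deploy@$HOST" \
                'cd /volume1/docker/homelab && git pull --ff-only && docker compose up -d'
    ```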

  • gencha@lemm.ee · 5 days ago

    I prefer having a convenient pull mechanism that I can trigger from a workstation in the lab network. I maintain the setup with Ansible.

    • bjornsno@lemm.ee (OP) · 5 days ago

      That would fill the same role as watchtower, I guess? I’ve previously looked at having Portainer manage the docker compose stack that it’s running inside, but at least back then it seemed to be a dead end and not really what Portainer is meant to do. I’m not interested in moving away from docker compose at this time.

      • realbadat@programming.dev · 5 days ago

        Dockge would be more appropriate for that.

        Watchtower has different functionality, mainly keeping existing containers up to date with new images.

        You want Jenkins, GH Actions, or even Ansible.

  • capc8m@lemmy.world · 5 days ago

    Why save things on GitHub? I used to save my configs directly on the server running docker. To change anything I had to ssh into it and do the stuff.