
Comments (7)

ptoulouse commented on May 22, 2024

The proposed setup puts everything in the same "top level" folder:

  1. The docker-compose file
  2. The initial configuration files (authelia, traefik2/rules)
  3. The data generated by the containers

In my own environment, I set it up as 2 folders:

  1. The docker-compose and initial config files are in a folder that can be kept on GitHub with a minimal .gitignore file. A few files need to be backed up separately if you are not using a private repo (e.g. .env).
  2. The data generated by the containers is in a separate folder and is backed up.
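Under that split, the tree might look something like this (folder names are illustrative, not from the thread; authelia and traefik2/rules are the config folders mentioned above):

```
~/compose-stack/           # tracked in git, minimal .gitignore
├── docker-compose.yml
├── .env                   # gitignored; backed up separately
├── authelia/
└── traefik2/rules/
~/docker-data/             # generated by the containers; backed up, not in git
```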

from docker-traefik.

fanuch commented on May 22, 2024

You're looking at it.

It's very common to store configuration files with git.
Since they are just text files that receive few changes, and you may need to roll back or keep them in multiple locations, git is perfect for storing configs.

There's GitHub, GitLab, or BitBucket, to name a few, or you can roll your own git server — though that's not recommended, since all of the mentioned services allow private repositories to house your configs.


stratosgear commented on May 22, 2024

I am not so sure that you can keep these folders in git. In my 16-18 hours of usage of a docker-compose setup very similar to the sample traefik2 one shown here, the /docker folder is already close to 520 MB of data!

I'm sure that some of the files are binary files too (images, thumbnails, etc.).

These are NOT good candidates for a git repo.

I was hoping for something along the lines of a Borg backup or similar, and I was wondering if someone had test-driven any dockerized solutions, like the other suite of applications shown in the repo.

Thanks, but stuffing all that in the git repo is not what I would consider a proper backup strategy.


fanuch commented on May 22, 2024

I could debate with you that git can handle binary files and images, and that there are heaps of git repos much larger than your 520 MB, but it looks like you don't want that.

Sure, use Borg or rsync. Both can do incremental backups and restores.


htpcBeginner commented on May 22, 2024

@stratosgear

I can consider adding a backup system.

But your comment on docker binaries etc. is easily addressed with .gitignore. My .gitignore ignores all the contents of the folder by default, and I explicitly specify which files I want published in my git repo.

But the downside is that you have to make sure you specify all the important files.
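That whitelist style of .gitignore might look like this — a sketch with hypothetical file names, except docker-compose-t2.yml and traefik2/rules, which come up elsewhere in this thread:

```gitignore
# Ignore everything by default...
*
# ...but keep traversing directories so whitelisted files can be found.
!*/
# Explicitly re-include only what should be published.
!.gitignore
!docker-compose-t2.yml
!traefik2/rules/*.yml
```

Note the `!*/` line: without it, `*` also ignores directories, and git never descends into them to see the re-included files.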


stratosgear commented on May 22, 2024

Sure, yes, I also intend to share my backup solution when I get around to implementing it (along with some other optimizations I have in mind).

The way I see it, all docker-compose and other config files are saved and source-controlled in a git repo (pretty much as this repo is set up), but everything under ~/docker is properly backed up by a separate system (hopefully something as easy as bringing up another docker instance to do the backup).

Of course, you have the issue of not being able to consistently back up a set of running docker instances, but you could just run docker-compose -f docker-compose-t2.yml stop && docker-compose -f docker-backup.yml run && docker-compose -f docker-compose-t2.yml up through a cron job...
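That stop → backup → start sequence could be a single crontab entry along these lines (the stack path, the docker-backup.yml file, and its backup service name are all assumptions):

```
# Illustrative crontab entry: stop the stack at 03:30, run a one-off
# backup container, then bring the stack back up detached.
30 3 * * * cd /home/user/docker && docker-compose -f docker-compose-t2.yml stop && docker-compose -f docker-backup.yml run --rm backup && docker-compose -f docker-compose-t2.yml up -d
```

The `--rm` flag removes the one-off backup container afterwards, and `-d` brings the stack back up detached so cron isn't left holding the process.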

Also, the .gitignore solution is not ideal because you never know what kind of binary files (or any other files you want to keep out of the git repo) a new container will generate. By the time you find out, they may already have been committed to the repo, and it becomes a never-ending battle to keep that .gitignore up to date.

[Edit]: I just saw that you have reversed the usage of .gitignore with the use of *, and that would indeed address the problem I mentioned above. Still, it would require constant meddling to be sure that you "save" all important settings, so I still consider this sub-optimal... :)

Thanks...


beloso commented on May 22, 2024

I've just set up the Duplicati service. It backs up folders to a myriad of cloud services.
I set mine up with Google Drive. It appears to be working.

I've used this image: https://github.com/linuxserver/docker-duplicati
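For reference, a minimal compose service for that linuxserver image might look like this — a sketch, with the port, paths, and user IDs being assumptions to adapt to your setup:

```yaml
  duplicati:
    image: linuxserver/duplicati
    container_name: duplicati
    ports:
      - "8200:8200"          # Duplicati web UI
    volumes:
      - ./duplicati/config:/config
      - ~/docker:/source:ro  # mount the data folder read-only for backup
    environment:
      - PUID=1000
      - PGID=1000
    restart: unless-stopped
```

Backup jobs (destination, schedule, encryption) are then configured through the web UI on port 8200.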

