110 Commits
0.2.0 ... 0.4.0

Author SHA1 Message Date
Einar Forselv
151f3cfeeb Update README.md 2019-12-08 15:13:06 +01:00
Einar Forselv
2848738789 Update README.md 2019-12-08 15:10:06 +01:00
Einar Forselv
cf402d77ed Simplify README 2019-12-08 14:57:17 +01:00
Einar Forselv
ae835f30d3 Make crontab configurable 2019-12-08 06:38:56 +01:00
Einar Forselv
1e21ff422f Detect if a repository should be initialized
This is better than trying to initialize it every time
2019-12-08 05:09:05 +01:00
Einar Forselv
6347529701 Alert when nothing is found to back up + check repo after backup 2019-12-08 04:52:30 +01:00
Einar Forselv
3dacc0bfab Update containers.py 2019-12-08 04:51:07 +01:00
Einar Forselv
fa14880742 Indicate we might want to run check with --with-cache 2019-12-08 04:50:33 +01:00
Einar Forselv
5eb773eb34 Fix broken backup process container test 2019-12-08 04:50:09 +01:00
Einar Forselv
e8123922df Backup process label is now unique for each project 2019-12-08 03:51:57 +01:00
Einar Forselv
be74715595 Already running backup process container is an error message 2019-12-08 03:33:11 +01:00
Einar Forselv
515702ae78 Truncate field sizes in discord webhook 2019-12-08 03:32:42 +01:00
Einar Forselv
ff49d9c018 Bump version 2019-12-08 03:32:14 +01:00
Einar Forselv
187787425a Sane default buffer size in backup_from_stdin 2019-12-08 01:16:10 +01:00
Einar Forselv
fa1c982bf5 pep8 2019-12-08 01:14:57 +01:00
Einar Forselv
2bbd329047 Alert when backup process is already running 2019-12-08 01:05:26 +01:00
Einar Forselv
8097ac79af Remove stale backup process containers 2019-12-08 00:57:23 +01:00
Einar Forselv
5082244949 Gather stale backup process containers 2019-12-08 00:32:39 +01:00
Einar Forselv
d9e5a62458 List all containers including non-running ones 2019-12-08 00:26:02 +01:00
Einar Forselv
6085f5fc03 README: Docker version note 2019-12-08 00:10:40 +01:00
Einar Forselv
d89ed781ef Comment on log streaming for docker ce 17 and 18 2019-12-08 00:08:41 +01:00
Einar Forselv
e2dec9ffa0 Move all labels into enums module
This way we have more control over them
2019-12-07 23:59:27 +01:00
Einar Forselv
2b3a702f21 Missing docstrings 2019-12-07 09:19:16 +01:00
Einar Forselv
3456e1a899 Update README.md 2019-12-07 09:15:34 +01:00
Einar Forselv
105cdbb65e Incorrect pip install in travis.yml 2019-12-07 09:12:01 +01:00
Einar Forselv
d671ffb626 Fix broken travis config 2019-12-07 09:05:48 +01:00
Einar Forselv
f988b42881 Create .travis.yml 2019-12-07 08:43:07 +01:00
Einar Forselv
5d653c2c3c Clean up dockerignore 2019-12-07 08:39:05 +01:00
Einar Forselv
c80b2774d4 Better handling of stdout/stderr logging 2019-12-06 17:19:51 +01:00
Einar Forselv
12da998538 Incorrect requirements dir after moving test package 2019-12-06 17:19:06 +01:00
Einar Forselv
d17d776339 More comments in dev compose file 2019-12-06 10:53:51 +01:00
Einar Forselv
758c3075f1 Bump version 2019-12-06 09:09:05 +01:00
Einar Forselv
9cad6a5c71 Update cli.py 2019-12-06 09:08:48 +01:00
Einar Forselv
4ebe16af14 Properly get the container exit code 2019-12-06 08:21:21 +01:00
Einar Forselv
fd87ddc388 More logging during backup 2019-12-06 08:21:06 +01:00
Einar Forselv
2cbc5aa6fa bump version 2019-12-06 07:36:30 +01:00
Einar Forselv
ffa2dfc119 rcb version 2019-12-06 07:35:14 +01:00
Einar Forselv
cfc92b2284 Bug: The log stream from docker can be str or bytes
We don't know why this is the case..
2019-12-06 07:30:39 +01:00
Einar Forselv
216202dec7 Bump version 2019-12-06 06:07:03 +01:00
Einar Forselv
fab988a05e status command should display repository 2019-12-05 12:57:15 +01:00
Einar Forselv
164834d3a9 bug: cleanup should return integer exit code 2019-12-05 12:55:13 +01:00
Einar Forselv
a0dfb04aa7 Run repo init in status command 2019-12-05 12:54:34 +01:00
Einar Forselv
7f588c57ab bug: forget and prune was never executed 2019-12-05 12:42:21 +01:00
Einar Forselv
e01f7c6cff Update README.md 2019-12-05 12:33:24 +01:00
Einar Forselv
102073cb70 Bug: Do not refer to old tag 2019-12-05 12:30:50 +01:00
Einar Forselv
e060c28c93 Update README.md 2019-12-05 11:26:11 +01:00
Einar Forselv
14903f3bbd Fully working tox run 2019-12-05 11:23:33 +01:00
Einar Forselv
96bd419a24 Move tests into src 2019-12-05 11:23:14 +01:00
Einar Forselv
75ab549370 re-add pytest.ini 2019-12-05 11:15:14 +01:00
Einar Forselv
6f06d25db5 pep8 2019-12-05 11:09:36 +01:00
Einar Forselv
0a9e5edfe4 Add basic tox setup 2019-12-05 11:08:40 +01:00
Einar Forselv
130be30268 Move package to src/ to properly separate what goes into docker image 2019-12-05 10:26:46 +01:00
Einar Forselv
0af9f2e8ee Catch exceptions in backup_runner 2019-12-05 10:16:34 +01:00
Einar Forselv
c59f022a55 Properly log exceptions 2019-12-05 10:15:49 +01:00
Einar Forselv
98fe448348 Delete mail.py 2019-12-05 10:15:29 +01:00
Einar Forselv
3708bb9100 re-add private alert creds 2019-12-05 10:15:09 +01:00
Einar Forselv
d7039cccf4 bump docker-py version 2019-12-05 09:59:52 +01:00
Einar Forselv
864c026402 backup_runner: stop container 2019-12-05 09:59:41 +01:00
Einar Forselv
fcd18ba1cb Update release.md 2019-12-05 09:59:24 +01:00
Einar Forselv
915695043c bump version 2019-12-05 02:48:24 +01:00
Einar Forselv
0a8bbc40c3 Doc banner 2019-12-05 02:31:20 +01:00
Einar Forselv
69b014e88e Update index.rst 2019-12-05 02:26:50 +01:00
Einar Forselv
51742efd33 sphinx: set master doc 2019-12-05 02:15:36 +01:00
Einar Forselv
c6b9f2dc1e Update setup.py 2019-12-05 02:04:36 +01:00
Einar Forselv
2216d76af5 Update release.md 2019-12-05 02:04:11 +01:00
Einar Forselv
3c891aa8b8 Update README.md 2019-12-05 01:58:50 +01:00
Einar Forselv
e3ab8e0e5a Set up docs 2019-12-05 01:58:39 +01:00
Einar Forselv
2c448bdcae Create LICENSE 2019-12-05 00:58:32 +01:00
Einar Forselv
b9d5233510 Do not expose db passwords when pinging 2019-12-05 00:38:58 +01:00
Einar Forselv
9dabf01051 Allow overriding container env vars 2019-12-05 00:38:09 +01:00
Einar Forselv
fdfb28fc47 Propagate log level to parent container 2019-12-05 00:37:13 +01:00
Einar Forselv
1978ee5946 Do not leak passwords in logs 2019-12-05 00:27:56 +01:00
Einar Forselv
38c59b2436 Use XDG_CACHE_HOME to control cache dir 2019-12-04 23:49:32 +01:00
Einar Forselv
4e480ed8e0 Finetune dockerignore 2019-12-04 23:46:56 +01:00
Einar Forselv
96beeab5bd Support forget / prune 2019-12-04 23:25:15 +01:00
Einar Forselv
7dd72ee5ce Remove debug print 2019-12-04 23:24:56 +01:00
Einar Forselv
ef07645664 Add common env vars to env file 2019-12-04 23:24:35 +01:00
Einar Forselv
9f33cbcc39 Remove old environment block 2019-12-04 23:23:53 +01:00
Einar Forselv
f29eab3249 Don't copy log files into image 2019-12-04 23:23:32 +01:00
Einar Forselv
2864145d56 Update README.md 2019-12-04 23:01:06 +01:00
Einar Forselv
6a4e87a2eb Tweak ignores 2019-12-04 23:01:01 +01:00
Einar Forselv
faa2d9ff7e Do not filter mounts if volume backup is not enabled 2019-12-04 22:25:09 +01:00
Einar Forselv
91901ee35c Broken tests due to incorrect labels 2019-12-04 22:17:57 +01:00
Einar Forselv
d3933f8913 Back up to /volumes and /databases 2019-12-04 22:17:42 +01:00
Einar Forselv
7f6b140a00 rcb cleanup 2019-12-04 22:03:49 +01:00
Einar Forselv
6bc88957e7 snapshots --last 2019-12-04 22:02:53 +01:00
Einar Forselv
eaf8b5cc78 Send alert from the main container 2019-12-04 21:24:10 +01:00
Einar Forselv
1ca678f6b4 Send alert when something went wrong during backup 2019-12-04 20:55:33 +01:00
Einar Forselv
d9a082d044 Generic alert send 2019-12-04 20:55:01 +01:00
Einar Forselv
850df45a69 Working discord webhook 2019-12-04 20:31:58 +01:00
Einar Forselv
96889b02a9 Update smtp.py 2019-12-04 20:31:41 +01:00
Einar Forselv
6a3b06f371 Debug log in alert module 2019-12-04 20:28:06 +01:00
Einar Forselv
1edd7ca771 alert test command 2019-12-04 19:36:32 +01:00
Einar Forselv
a4a8a2f462 Working mail alerts + alert system tweaks 2019-12-04 19:36:14 +01:00
Einar Forselv
26ea7a2a00 Shortcut property for getting project name 2019-12-04 19:34:50 +01:00
Einar Forselv
85e9efb769 Basic alert system setup 2019-12-04 03:58:27 +01:00
Einar Forselv
7ca7f56258 Sane cron default: Every day at 2am 2019-12-04 03:35:45 +01:00
Einar Forselv
9fad6f5f38 Note about Popen buffer size 2019-12-04 03:33:30 +01:00
Einar Forselv
f288f77aa4 rcb snapshots 2019-12-04 03:12:36 +01:00
Einar Forselv
f8a9f0e7e9 Support running commands capturing stdout 2019-12-04 03:12:13 +01:00
Einar Forselv
b8757929fa Reduce log format 2019-12-04 03:11:46 +01:00
Einar Forselv
bdf2ea5e41 More logging cleanup 2019-12-04 01:58:01 +01:00
Einar Forselv
00cf68fa3e Properly capture stdout and stderr in backup_from_stdin 2019-12-04 01:57:52 +01:00
Einar Forselv
fb3f5b38c3 Shorten log format 2019-12-04 01:30:05 +01:00
Einar Forselv
4ff0df1b35 Clean up logging 2019-12-04 01:12:26 +01:00
Einar Forselv
c78d208e66 Dockerfile: Remove unnecessary layer 2019-12-04 00:50:05 +01:00
Einar Forselv
f4c2cf9bb7 Update README.md 2019-12-04 00:36:28 +01:00
Einar Forselv
947a56b21e Configurable log level: ENV + cmd 2019-12-04 00:31:13 +01:00
Einar Forselv
4ad575cfe3 Update README.md 2019-12-03 10:19:12 +01:00
Einar Forselv
bd55a691e7 Create release.md 2019-12-03 10:19:09 +01:00
47 changed files with 1346 additions and 511 deletions


@@ -1,9 +0,0 @@
.venv/
.vscode/
.gitignore
Dockerfile
tests/
docker-compose.yaml
*.env
*.egg-info
__pycache__

9
.gitignore vendored

@@ -19,3 +19,12 @@ venv
/private/ /private/
restic_data/ restic_data/
restic_cache/ restic_cache/
alerts.env
# build
build/
docs/_build
dist
# tests
.tox

17
.travis.yml Normal file

@@ -0,0 +1,17 @@
language: python
sudo: false
matrix:
include:
python: 3.7
dist: bionic
sudo: true
install:
- pip install -U setuptools pip wheel
- pip install -r src/tests/requirements.txt
- pip install ./src
script:
- tox

21
LICENSE Normal file

@@ -0,0 +1,21 @@
MIT License
Copyright (c) 2019 Zetta.IO
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

209
README.md

@@ -1,139 +1,170 @@
# restic-compose-backup # restic-compose-backup
*WORK IN PROGRESS* ![docs](https://readthedocs.org/projects/restic-compose-backup/badge/?version=latest)
Backup using https://restic.net/ for a docker-compose setup. Backup using [restic] for a docker-compose setup.
Currently tested with docker-ce 17, 18 and 19.
Automatically detects and backs up volumes, mysql, mariadb and postgres databases in a docker-compose setup. * [restic-compose-backup Documentation](https://restic-compose-backup.readthedocs.io)
This includes both host mapped volumes and actual docker volumes. * [restic-compose-backup on Github](https://github.com/ZettaIO/restic-compose-backup)
* [restic-compose-backup on Docker Hub](https://hub.docker.com/r/zettaio/restic-compose-backup)
* Each service in the compose setup is configured with a label Features:
to enable backup of volumes or databases
* When backup starts a new instance of the container is created
mapping in all the needed volumes. It will copy networks etc
to ensure databases can be reached
* Volumes are mounted to `/backup/<service_name>/<path>`
in the backup process container. `/backup` is pushed into restic
* Databases are backed up from stdin / dumps
* Cron triggers backup
## Configuration * Backs up docker volumes or host binds
* Backs up postgres, mariadb and mysql databases
* Notifications over mail/smtp or Discord webhooks
Required env variables for restic: Please report issues on [github](https://github.com/ZettaIO/restic-compose-backup/issues).
## Install
```bash
docker pull zettaio/restic-compose-backup
```
## Configuration (env vars)
Minimum configuration
```bash ```bash
RESTIC_REPOSITORY RESTIC_REPOSITORY
RESTIC_PASSWORD RESTIC_PASSWORD
``` ```
Backend specific env vars : https://restic.readthedocs.io/en/stable/040_backup.html#environment-variables More config options can be found in the [documentation].
### Volumes Restic backend specific env vars : https://restic.readthedocs.io/en/stable/040_backup.html#environment-variables
## Compose Example
We simply control what should be backed up by adding
labels to our containers. More details are covered
in the [documentation].
restic-backup.env
```bash
RESTIC_REPOSITORY=<whatever backend restic supports>
RESTIC_PASSWORD=hopefullyasecturepw
# snapshot prune rules
RESTIC_KEEP_DAILY=7
RESTIC_KEEP_WEEKLY=4
RESTIC_KEEP_MONTHLY=12
RESTIC_KEEP_YEARLY=3
# Cron schedule. Run every day at 1am
CRON_SCHEDULE="0 1 * * *"
```
docker-compose.yaml
```yaml ```yaml
version: '3' version: '3'
services: services:
# The backup service # The backup service
backup: backup:
build: restic-compose-backup image: zettaio/restic-compose-backup:<version>
environment:
- RESTIC_REPOSITORY=<whatever restic supports>
- RESTIC_PASSWORD=hopefullyasecturepw
env_file: env_file:
- some_other_vars.env - restic-backup.env
volumes: volumes:
# We need to communicate with docker
- /var/run/docker.sock:/tmp/docker.sock:ro - /var/run/docker.sock:/tmp/docker.sock:ro
# Persistent storage of restic cache (greatly speeds up all restic operations)
example: - cache:/cache
web:
image: some_image image: some_image
# Enable volume backup with label
labels: labels:
# Enables backup of the volumes below
restic-compose-backup.volumes: true restic-compose-backup.volumes: true
# These volumes will be backed up
volumes: volumes:
# Docker volume
- media:/srv/media - media:/srv/media
# Host map
- /srv/files:/srv/files - /srv/files:/srv/files
volumes:
media:
```
A simple `include` and `exclude` filter is also available.
```yaml
example:
image: some_image
labels:
restic-compose-backup.volumes: true
restic-compose-backup.volumes.include: "files,data"
volumes:
# Source doesn't match include filter. No backup.
- media:/srv/media
# Matches include filter
- files:/srv/files
- /srv/data:/srv/data
volumes:
media:
files:
```
Exclude
```yaml
example:
image: some_image
labels:
restic-compose-backup.volumes: true
restic-compose-backup.volumes.exclude: "media"
volumes:
# Excluded by filter
- media:/srv/media
# Backed up
- files:/srv/files
- /srv/data:/srv/data
volumes:
media:
files:
```
### Databases
Will dump databases directly into restic through stdin.
They will appear in restic as a separate snapshot with
path `/backup/<service_name>/dump.sql` or similar.
```yaml
mariadb: mariadb:
image: mariadb:10 image: mariadb:10
labels: labels:
# Enables backup of this database
restic-compose-backup.mariadb: true restic-compose-backup.mariadb: true
``` env_file:
mariadb-credentials.env
```yaml volumes:
- mysqldata:/var/lib/mysql
mysql: mysql:
image: mysql:5 image: mysql:5
labels: labels:
# Enables backup of this database
restic-compose-backup.mysql: true restic-compose-backup.mysql: true
``` env_file:
mysql-credentials.env
volumes:
- mysqldata:/var/lib/mysql
```yaml
postgres: postgres:
image: postgres image: postgres
labels: labels:
# Enables backup of this database
restic-compose-backup.postgres: true restic-compose-backup.postgres: true
env_file:
postgres-credentials.env
volumes:
- pgdata:/var/lib/postgresql/data
volumes:
media:
mysqldata:
mariadbdata:
pgdata:
cache:
``` ```
## The `rcb` command
Everything is controlled using the `rcb` command.
After configuring backup with labels and restarting
the affected services, we can quickly view the
result using the `status` subcommand.
```bash
$ docker-compose run --rm backup rcb status
INFO: Status for compose project 'myproject'
INFO: Repository: '<restic repository>'
INFO: Backup currently running?: False
INFO: --------------- Detected Config ---------------
INFO: service: mysql
INFO: - mysql (is_ready=True)
INFO: service: mariadb
INFO: - mariadb (is_ready=True)
INFO: service: postgres
INFO: - postgres (is_ready=True)
INFO: service: web
INFO: - volume: media
INFO: - volume: /srv/files
```
The `status` subcommand lists what will be backed up and
even pings the database services to check their availability.
The `restic` command can also be used directly in the container.
More `rcb` commands can be found in the [documentation].
## Running Tests ## Running Tests
```bash
pip install -e ./src/
pip install -r src/tests/requirements.txt
tox
``` ```
python setup.py develop
pip install -r tests/requirements.txt ## Building Docs
pytest tests
```bash
pip install -r docs/requirements.txt
python src/setup.py build_sphinx
``` ```
## Contributing
Contributions are welcome regardless of experience level. Don't hesitate to submit issues or open partial or completed pull requests.
[restic]: https://restic.net/
[documentation]: https://restic-compose-backup.readthedocs.io


@@ -1 +0,0 @@
1 * * * * source /env.sh && rcb backup > /proc/1/fd/1


@@ -1,17 +1,19 @@
version: '3' version: '3'
services: services:
backup: backup:
build: . build: ./src
env_file: env_file:
- restic_compose_backup.env - restic_compose_backup.env
- alerts.env
volumes: volumes:
# Map in docker socket # Map in docker socket
- /var/run/docker.sock:/tmp/docker.sock:ro - /var/run/docker.sock:/tmp/docker.sock:ro
# Map backup database locally # Map local restic repository for dev
- ./restic_data:/restic_data - ./restic_data:/restic_data
- ./restic_cache:/restic_cache # Map restic cache
# Map in project source - ./restic_cache:/cache
- .:/restic-compose-backup # Map in project source in dev
- ./src:/restic-compose-backup
web: web:
image: nginx image: nginx
labels: labels:

20
docs/Makefile Normal file

@@ -0,0 +1,20 @@
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = .
BUILDDIR = _build
# Put it first so that "make" without argument is like "make help".
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.PHONY: help Makefile
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

56
docs/conf.py Normal file

@@ -0,0 +1,56 @@
# Configuration file for the Sphinx documentation builder.
#
# This file only contains a selection of the most common options. For a full
# list see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html
# -- Path setup --------------------------------------------------------------
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))
# -- Project information -----------------------------------------------------
project = 'restic-compose-backup'
copyright = '2019, Zetta.IO Technology AS'
author = 'Zetta.IO Technology AS'
# The full version, including alpha/beta/rc tags
release = '0.4.0'
# -- General configuration ---------------------------------------------------
master_doc = 'index'
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
]
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This pattern also affects html_static_path and html_extra_path.
exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
# -- Options for HTML output -------------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
html_theme = 'sphinx_rtd_theme'
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']

21
docs/index.rst Normal file

@@ -0,0 +1,21 @@
.. restic-compose-backup documentation master file, created by
sphinx-quickstart on Thu Dec 5 01:34:58 2019.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to restic-compose-backup's documentation!
=================================================
Simple backup with restic for small to medium docker-compose setups.
.. toctree::
:maxdepth: 2
:caption: Contents:
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

35
docs/make.bat Normal file

@@ -0,0 +1,35 @@
@ECHO OFF
pushd %~dp0
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=.
set BUILDDIR=_build
if "%1" == "" goto help
%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
echo.installed, then set the SPHINXBUILD environment variable to point
echo.to the full path of the 'sphinx-build' executable. Alternatively you
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.http://sphinx-doc.org/
exit /b 1
)
%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end
:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
:end
popd

2
docs/requirements.txt Normal file

@@ -0,0 +1,2 @@
sphinx
sphinx-rtd-theme

21
extras/release.md Normal file

@@ -0,0 +1,21 @@
# Making a release
- Update version in `setup.py`
- Update version in `docs/conf.py`
- Update version in `restic_compose_backup/__init__.py`
- Build and tag image
- push: `docker push zettaio/restic-compose-backup:<version>`
- Ensure RTD has new docs published
## Example
When releasing a bugfix version we need to update the
main image as well.
```bash
docker build src --tag zettaio/restic-compose-backup:0.3
docker build src --tag zettaio/restic-compose-backup:0.3.3
docker push zettaio/restic-compose-backup:0.3
docker push zettaio/restic-compose-backup:0.3.3
```


@@ -1,4 +1,4 @@
[pytest] [pytest]
testpaths = tests testpaths = src/tests
python_files=test*.py python_files=test*.py
addopts = -v --verbose addopts = -v --verbose


@@ -4,6 +4,22 @@ DOCKER_BASE_URL=unix://tmp/docker.sock
RESTIC_REPOSITORY=/restic_data RESTIC_REPOSITORY=/restic_data
RESTIC_PASSWORD=password RESTIC_PASSWORD=password
RESTIC_KEEP_DAILY=7
RESTIC_KEEP_WEEKLY=4
RESTIC_KEEP_MONTHLY=12
RESTIC_KEEP_YEARLY=3
LOG_LEVEL=info
CRON_SCHEDULE=10 2 * * *
# EMAIL_HOST=
# EMAIL_PORT=
# EMAIL_HOST_USER=
# EMAIL_HOST_PASSWORD=
# EMAIL_SEND_TO=
# DISCORD_WEBHOOK=u
# Various env vars for restic : https://restic.readthedocs.io/en/stable/040_backup.html#environment-variables # Various env vars for restic : https://restic.readthedocs.io/en/stable/040_backup.html#environment-variables
# RESTIC_REPOSITORY Location of repository (replaces -r) # RESTIC_REPOSITORY Location of repository (replaces -r)
# RESTIC_PASSWORD_FILE Location of password file (replaces --password-file) # RESTIC_PASSWORD_FILE Location of password file (replaces --password-file)


@@ -1,139 +0,0 @@
import argparse
import pprint
import logging
from restic_compose_backup import (
backup_runner,
log,
restic,
)
from restic_compose_backup.config import Config
from restic_compose_backup.containers import RunningContainers
logger = logging.getLogger(__name__)
def main():
"""CLI entrypoint"""
args = parse_args()
config = Config()
containers = RunningContainers()
if args.action == 'status':
status(config, containers)
elif args.action == 'backup':
backup(config, containers)
elif args.action == 'start-backup-process':
start_backup_process(config, containers)
def status(config, containers):
"""Outputs the backup config for the compose setup"""
logger.info("Backup config for compose project '%s'", containers.this_container.project_name)
logger.info("Current service: %s", containers.this_container.name)
# logger.info("Backup process: %s", containers.backup_process_container.name
# if containers.backup_process_container else 'Not Running')
logger.info("Backup running: %s", containers.backup_process_running)
backup_containers = containers.containers_for_backup()
for container in backup_containers:
logger.info('service: %s', container.service_name)
if container.volume_backup_enabled:
for mount in container.filter_mounts():
logger.info(' - volume: %s', mount.source)
if container.database_backup_enabled:
instance = container.instance
ping = instance.ping()
logger.info(' - %s (is_ready=%s)', instance.container_type, ping == 0)
if len(backup_containers) == 0:
logger.info("No containers in the project has 'restic-compose-backup.enabled' label")
def backup(config, containers):
"""Request a backup to start"""
# Make sure we don't spawn multiple backup processes
if containers.backup_process_running:
raise ValueError("Backup process already running")
logger.info("Initializing repository")
# TODO: Errors when repo already exists
restic.init_repo(config.repository)
logger.info("Starting backup container..")
# Map all volumes from the backup container into the backup process container
volumes = containers.this_container.volumes
# Map volumes from other containers we are backing up
mounts = containers.generate_backup_mounts('/backup')
volumes.update(mounts)
result = backup_runner.run(
image=containers.this_container.image,
command='restic-compose-backup start-backup-process',
volumes=volumes,
environment=containers.this_container.environment,
source_container_id=containers.this_container.id,
labels={
"restic-compose-backup.backup_process": 'True',
"com.docker.compose.project": containers.this_container.project_name,
},
)
logger.info('Backup container exit code: %s', result)
# TODO: Alert
def start_backup_process(config, containers):
"""The actual backup process running inside the spawned container"""
if (not containers.backup_process_container
or containers.this_container == containers.backup_process_container is False):
logger.error(
"Cannot run backup process in this container. Use backup command instead. "
"This will spawn a new container with the necessary mounts."
)
return
status(config, containers)
logger.info("start-backup-process")
# Back up volumes
try:
vol_result = restic.backup_files(config.repository, source='/backup')
logger.info('Volume backup exit code: %s', vol_result)
# TODO: Alert
except Exception as ex:
logger.error(ex)
# TODO: Alert
# back up databases
for container in containers.containers_for_backup():
if container.database_backup_enabled:
try:
instance = container.instance
logger.info('Backing up %s in service %s', instance.container_type, instance.service_name)
result = instance.backup()
logger.info('Exit code: %s', result)
# TODO: Alert
except Exception as ex:
logger.error(ex)
# TODO: Alert
def parse_args():
parser = argparse.ArgumentParser(prog='restic_compose_backup')
parser.add_argument(
'action',
choices=['status', 'backup', 'start-backup-process'],
)
return parser.parse_args()
if __name__ == '__main__':
main()


@@ -1,68 +0,0 @@
import logging
from typing import List
from subprocess import Popen, PIPE
logger = logging.getLogger(__name__)
def test():
return run_command(['ls', '/backup'])
def ping_mysql(host, port, username, password) -> int:
"""Check if the mysql is up and can be reached"""
return run([
'mysqladmin',
'ping',
'--host',
host,
'--port',
port,
'--user',
username,
f'--password={password}',
])
def ping_mariadb(host, port, username, password) -> int:
"""Check if the mariadb is up and can be reached"""
return run([
'mysqladmin',
'ping',
'--host',
host,
'--port',
port,
'--user',
username,
f'--password={password}',
])
def ping_postgres(host, port, username, password) -> int:
"""Check if postgres can be reached"""
return run([
"pg_isready",
f"--host={host}",
f"--port={port}",
f"--username={username}",
])
def run(cmd: List[str]) -> int:
"""Run a command with parameters"""
logger.info('cmd: %s', ' '.join(cmd))
child = Popen(cmd, stdout=PIPE, stderr=PIPE)
stdoutdata, stderrdata = child.communicate()
if stdoutdata:
logger.info(stdoutdata.decode().strip())
logger.info('-' * 28)
if stderrdata:
logger.info('%s STDERR %s', '-' * 10, '-' * 10)
logger.info(stderrdata.decode().strip())
logger.info('-' * 28)
logger.info("returncode %s", child.returncode)
return child.returncode


@@ -1,19 +0,0 @@
import os
class Config:
"""Bag for config values"""
def __init__(self, check=True):
self.repository = os.environ['RESTIC_REPOSITORY']
self.password = os.environ['RESTIC_PASSWORD']
self.docker_base_url = os.environ.get('DOCKER_BASE_URL') or "unix://tmp/docker.sock"
if check:
self.check()
def check(self):
if not self.repository:
raise ValueError("CONTAINER env var not set")
if not self.password:
raise ValueError("PASSWORD env var not set")


@@ -1,13 +0,0 @@
import logging
import os
import sys
logger = logging.getLogger('restic_compose_backup')
HOSTNAME = os.environ['HOSTNAME']
level = logging.INFO
logger.setLevel(level)
ch = logging.StreamHandler(stream=sys.stdout)
ch.setLevel(level)
ch.setFormatter(logging.Formatter(f'%(asctime)s - {HOSTNAME} - %(name)s - %(levelname)s - %(message)s'))
logger.addHandler(ch)


@@ -1,37 +0,0 @@
"""
"""
import smtplib
from email.mime.text import MIMEText
EMAIL_HOST = "smtp.gmail.com"
EMAIL_PORT = 465
EMAIL_HOST_USER = ""
EMAIL_HOST_PASSWORD = ""
EMAIL_SEND_TO = ['']
def main():
send_mail("Hello world!")
def send_mail(text):
msg = MIMEText(text)
msg['Subject'] = "Message from restic-compose-backup"
msg['From'] = EMAIL_HOST_USER
msg['To'] = ', '.join(EMAIL_SEND_TO)
try:
print("Connecting to {} port {}".format(EMAIL_HOST, EMAIL_PORT))
server = smtplib.SMTP_SSL(EMAIL_HOST, EMAIL_PORT)
server.ehlo()
server.login(EMAIL_HOST_USER, EMAIL_HOST_PASSWORD)
server.sendmail(EMAIL_HOST_USER, EMAIL_SEND_TO, msg.as_string())
print('Email Sent')
except Exception as e:
print(e)
finally:
server.close()
if __name__ == '__main__':
main()


@@ -1,72 +0,0 @@
"""
Restic commands
"""
import logging
from typing import List
from subprocess import Popen, PIPE
from restic_compose_backup import commands
logger = logging.getLogger(__name__)
def init_repo(repository: str):
"""
Attempt to initialize the repository.
Doing this after the repository is initialized
"""
return commands.run(restic(repository, [
"init",
]))
def backup_files(repository: str, source='/backup'):
return commands.run(restic(repository, [
"--verbose",
"backup",
source,
]))
def backup_from_stdin(repository: str, filename: str, source_command: List[str]):
"""
Backs up from stdin running the source_command passed in.
It will appear in restic with the filename (including path) passed in.
"""
dest_command = restic(repository, [
'backup',
'--stdin',
'--stdin-filename',
filename,
])
# pipe source command into dest command
source_process = Popen(source_command, stdout=PIPE)
dest_process = Popen(dest_command, stdin=source_process.stdout)
dest_process.communicate()
# Ensure both processes exited with code 0
source_exit, dest_exit = source_process.poll(), dest_process.poll()
return 0 if (source_exit == 0 and dest_exit == 0) else 1
def snapshots(repository: str):
return commands.run(restic(repository, [
"snapshots",
]))
def check(repository: str):
return commands.run(restic(repository, [
"check",
]))
def restic(repository: str, args: List[str]):
"""Generate restic command"""
return [
"restic",
"--cache-dir",
"/restic_cache",
"-r",
repository,
] + args

3
src/.dockerignore Normal file

@@ -0,0 +1,3 @@
tests/
__pycache__
.DS_Store


@@ -4,8 +4,8 @@ RUN apk update && apk add python3 dcron mariadb-client postgresql-client
ADD . /restic-compose-backup ADD . /restic-compose-backup
WORKDIR /restic-compose-backup WORKDIR /restic-compose-backup
RUN pip3 install -U pip setuptools RUN pip3 install -U pip setuptools && pip3 install -e .
RUN pip3 install -e . ENV XDG_CACHE_HOME=/cache
ENTRYPOINT [] ENTRYPOINT []
CMD ["./entrypoint.sh"] CMD ["./entrypoint.sh"]

2
src/crontab Normal file

@@ -0,0 +1,2 @@
10 2 * * * source /env.sh && rcb backup > /proc/1/fd/1


@@ -3,6 +3,9 @@
# Dump all env vars so we can source them in cron jobs # Dump all env vars so we can source them in cron jobs
printenv | sed 's/^\(.*\)$/export \1/g' > /env.sh printenv | sed 's/^\(.*\)$/export \1/g' > /env.sh
# Write crontab
rcb crontab > crontab
# start cron in the foreground # start cron in the foreground
crontab crontab crontab crontab
crond -f crond -f
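
The entrypoint above regenerates the crontab from the environment on every start (`rcb crontab > crontab`). The `cron` module itself is not part of this diff, but judging from the `Config` defaults shown further down, `generate_crontab` presumably just joins the schedule and the command; a minimal sketch under that assumption:

```python
# Hypothetical sketch of cron.generate_crontab (the real module is not shown
# in this diff). It is assumed to combine CRON_SCHEDULE and CRON_COMMAND from
# the Config object, producing a line like the src/crontab file above.
def generate_crontab(config) -> str:
    """Return a crontab entry, e.g. '10 2 * * * source /env.sh && rcb backup > /proc/1/fd/1'."""
    return f"{config.cron_schedule} {config.cron_command}\n"
```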


@@ -0,0 +1 @@
__version__ = '0.4.0'


@@ -0,0 +1,43 @@
import logging
from restic_compose_backup.alerts.smtp import SMTPAlert
from restic_compose_backup.alerts.discord import DiscordWebhookAlert
logger = logging.getLogger(__name__)
ALERT_INFO = 'INFO'
ALERT_ERROR = 'ERROR'
ALERT_TYPES = [ALERT_INFO, ALERT_ERROR]
BACKENDS = [SMTPAlert, DiscordWebhookAlert]
def send(subject: str = None, body: str = None, alert_type: str = 'INFO'):
"""Send alert to all configured backends"""
alert_classes = configured_alert_types()
for instance in alert_classes:
logger.info('Configured: %s', instance.name)
try:
instance.send(
subject=f'[{alert_type}] {subject}',
body=body,
)
except Exception as ex:
logger.error("Exception raised when sending alert [%s]: %s", instance.name, ex)
logger.exception(ex)
if len(alert_classes) == 0:
logger.info("No alerts configured")
def configured_alert_types():
"""Returns a list of configured alert class instances"""
logger.debug('Getting alert backends')
entires = []
for cls in BACKENDS:
instance = cls.create_from_env()
logger.debug("Alert backend '%s' configured: %s", cls.name, instance is not None)
if instance:
entires.append(instance)
return entires
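
For reference, `send()` above is what the CLI calls when something goes wrong during backup (and for `rcb alert`). A minimal call, with made-up subject and body, looks like this:

```python
# Example call into the alert API defined above; subject/body are placeholders.
from restic_compose_backup import alerts

alerts.send(
    subject="myproject: Backup process exited with non-zero code",
    body="restic returned exit code 1",
    alert_type="ERROR",
)
```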


@@ -0,0 +1,14 @@
class BaseAlert:
name = None
def create_from_env(self):
return None
@property
def properly_configured(self) -> bool:
return False
def send(self, subject: str = None, body: str = None, alert_type: str = None):
pass


@@ -0,0 +1,48 @@
import os
import logging
import requests
from restic_compose_backup.alerts.base import BaseAlert
logger = logging.getLogger(__name__)
class DiscordWebhookAlert(BaseAlert):
name = 'discord_webhook'
success_codes = [200]
def __init__(self, webhook_url):
self.url = webhook_url
@classmethod
def create_from_env(cls):
instance = cls(os.environ.get('DISCORD_WEBHOOK'))
if instance.properly_configured:
return instance
return None
@property
def properly_configured(self) -> bool:
return isinstance(self.url, str) and self.url.startswith("https://")
def send(self, subject: str = None, body: str = None, alert_type: str = None):
"""Send basic webhook request. Max embed size is 6000"""
logger.info("Triggering discord webhook")
# NOTE: The title size is 2048
# The max description size is 2048
# Total embed size limit is 6000 characters (per embed)
data = {
'embeds': [
{
'title': subject[-256:],
'description': body[-2048:] if body else "",
},
]
}
response = requests.post(self.url, params={'wait': True}, json=data)
if response.status_code not in self.success_codes:
logger.error("Discord webhook failed: %s: %s", response.status_code, response.content)
else:
logger.info('Discord webhook successful')


@@ -0,0 +1,56 @@
import os
import smtplib
import logging
from email.mime.text import MIMEText
from restic_compose_backup.alerts.base import BaseAlert
logger = logging.getLogger(__name__)
class SMTPAlert(BaseAlert):
name = 'smtp'
def __init__(self, host, port, user, password, to):
self.host = host
self.port = port
self.user = user
self.password = password
self.to = to
@classmethod
def create_from_env(cls):
instance = cls(
os.environ.get('EMAIL_HOST'),
os.environ.get('EMAIL_PORT'),
os.environ.get('EMAIL_HOST_USER'),
os.environ.get('EMAIL_HOST_PASSWORD'),
(os.environ.get('EMAIL_SEND_TO') or "").split(','),
)
if instance.properly_configured:
return instance
return None
@property
def properly_configured(self) -> bool:
return self.host and self.port and self.user and self.password and len(self.to) > 0
def send(self, subject: str = None, body: str = None, alert_type: str = 'INFO'):
# send_mail("Hello world!")
msg = MIMEText(body)
msg['Subject'] = f"[{alert_type}] {subject}"
msg['From'] = self.user
msg['To'] = ', '.join(self.to)
try:
logger.info("Connecting to %s port %s", self.host, self.port)
server = smtplib.SMTP_SSL(self.host, self.port)
server.ehlo()
server.login(self.user, self.password)
server.sendmail(self.user, self.to, msg.as_string())
logger.info('Email sent')
except Exception as ex:
logger.exception(ex)
finally:
server.close()


@@ -1,8 +1,7 @@
import logging import logging
import os import os
import docker
from restic_compose_backup.config import Config from restic_compose_backup import utils
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@@ -10,8 +9,7 @@ logger = logging.getLogger(__name__)
def run(image: str = None, command: str = None, volumes: dict = None, def run(image: str = None, command: str = None, volumes: dict = None,
environment: dict = None, labels: dict = None, source_container_id: str = None): environment: dict = None, labels: dict = None, source_container_id: str = None):
logger.info("Starting backup container") logger.info("Starting backup container")
config = Config() client = utils.docker_client()
client = docker.DockerClient(base_url=config.docker_base_url)
container = client.containers.run( container = client.containers.run(
image, image,
@@ -35,7 +33,13 @@ def run(image: str = None, command: str = None, volumes: dict = None,
line = "" line = ""
while True: while True:
try: try:
line += next(stream).decode() # Make log streaming work for docker ce 17 and 18.
# For some reason strings are returned instead of bytes.
data = next(stream)
if isinstance(data, bytes):
line += data.decode()
elif isinstance(data, str):
line += data
if line.endswith('\n'): if line.endswith('\n'):
break break
except StopIteration: except StopIteration:
@@ -51,7 +55,7 @@ def run(image: str = None, command: str = None, volumes: dict = None,
fd.write('\n') fd.write('\n')
logger.info(line) logger.info(line)
container.wait()
container.reload() container.reload()
logger.debug("Container ExitCode %s", container.attrs['State']['ExitCode']) logger.debug("Container ExitCode %s", container.attrs['State']['ExitCode'])
container.remove() container.remove()
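
The side-by-side hunk above is hard to read, so here is the same str/bytes workaround for docker-ce 17/18 log streams written out as a small standalone sketch (a paraphrase of the loop, not a copy of the file):

```python
def read_log_line(stream) -> str:
    """Accumulate one line from a docker log stream.

    docker-ce 17/18 sometimes yields str chunks instead of bytes, so both
    types are handled before appending to the line buffer.
    """
    line = ""
    while not line.endswith("\n"):
        try:
            data = next(stream)
        except StopIteration:
            break
        line += data.decode() if isinstance(data, bytes) else data
    return line
```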


@@ -0,0 +1,288 @@
import argparse
import os
import logging
from restic_compose_backup import (
alerts,
backup_runner,
log,
restic,
)
from restic_compose_backup.config import Config
from restic_compose_backup.containers import RunningContainers
from restic_compose_backup import cron, utils
logger = logging.getLogger(__name__)
def main():
"""CLI entrypoint"""
args = parse_args()
config = Config()
log.setup(level=args.log_level or config.log_level)
containers = RunningContainers()
# Ensure log level is propagated to parent container if overridden
if args.log_level:
containers.this_container.set_config_env('LOG_LEVEL', args.log_level)
if args.action == 'status':
status(config, containers)
elif args.action == 'snapshots':
snapshots(config, containers)
elif args.action == 'backup':
backup(config, containers)
elif args.action == 'start-backup-process':
start_backup_process(config, containers)
elif args.action == 'cleanup':
cleanup(config, containers)
elif args.action == 'alert':
alert(config, containers)
elif args.action == 'version':
import restic_compose_backup
print(restic_compose_backup.__version__)
elif args.action == "crontab":
crontab(config)
def status(config, containers):
"""Outputs the backup config for the compose setup"""
logger.info("Status for compose project '%s'", containers.project_name)
logger.info("Repository: '%s'", config.repository)
logger.info("Backup currently running?: %s", containers.backup_process_running)
if containers.stale_backup_process_containers:
utils.remove_containers(containers.stale_backup_process_containers)
# Check if repository is initialized with restic snapshots
if not restic.is_initialized(config.repository):
logger.info("Could not get repository info. Attempting to initialize it.")
result = restic.init_repo(config.repository)
if result == 0:
logger.info("Successfully initialized repository: %s", config.repository)
else:
logger.error("Failed to initialize repository")
logger.info("%s Detected Config %s", "-" * 25, "-" * 25)
# Start making snapshots
backup_containers = containers.containers_for_backup()
for container in backup_containers:
logger.info('service: %s', container.service_name)
if container.volume_backup_enabled:
for mount in container.filter_mounts():
logger.info(' - volume: %s', mount.source)
if container.database_backup_enabled:
instance = container.instance
ping = instance.ping()
logger.info(' - %s (is_ready=%s)', instance.container_type, ping == 0)
if ping != 0:
logger.error("Database '%s' in service %s cannot be reached",
instance.container_type, container.service_name)
if len(backup_containers) == 0:
logger.info("No containers in the project has 'restic-compose-backup.*' label")
logger.info("-" * 67)
def backup(config, containers):
"""Request a backup to start"""
# Make sure we don't spawn multiple backup processes
if containers.backup_process_running:
alerts.send(
subject="Backup process container already running",
body=(
"A backup process container is already running. \n"
f"Id: {containers.backup_process_container.id}\n"
f"Name: {containers.backup_process_container.name}\n"
),
alert_type='ERROR',
)
raise RuntimeError("Backup process already running")
# Map all volumes from the backup container into the backup process container
volumes = containers.this_container.volumes
# Map volumes from other containers we are backing up
mounts = containers.generate_backup_mounts('/volumes')
volumes.update(mounts)
try:
result = backup_runner.run(
image=containers.this_container.image,
command='restic-compose-backup start-backup-process',
volumes=volumes,
environment=containers.this_container.environment,
source_container_id=containers.this_container.id,
labels={
containers.backup_process_label: 'True',
"com.docker.compose.project": containers.project_name,
},
)
except Exception as ex:
logger.exception(ex)
alerts.send(
subject="Exception during backup",
body=str(ex),
alert_type='ERROR',
)
return
logger.info('Backup container exit code: %s', result)
# Alert the user if something went wrong
if result != 0:
alerts.send(
subject="Backup process exited with non-zero code",
body=open('backup.log').read(),
alert_type='ERROR',
)
def start_backup_process(config, containers):
"""The actual backup process running inside the spawned container"""
if (not containers.backup_process_container
or containers.this_container == containers.backup_process_container is False):
logger.error(
"Cannot run backup process in this container. Use backup command instead. "
"This will spawn a new container with the necessary mounts."
)
exit(1)
status(config, containers)
errors = False
# Did we actually get any volumes mounted?
try:
has_volumes = os.stat('/volumes') is not None
except FileNotFoundError:
logger.warning("Found no volumes to back up")
has_volumes = False
# Warn if there is nothing to do
if len(containers.containers_for_backup()) == 0 and not has_volumes:
logger.error("No containers for backup found")
exit(1)
if has_volumes:
try:
logger.info('Backing up volumes')
vol_result = restic.backup_files(config.repository, source='/volumes')
logger.debug('Volume backup exit code: %s', vol_result)
if vol_result != 0:
logger.error('Volume backup exited with non-zero code: %s', vol_result)
errors = True
except Exception as ex:
logger.error('Exception raised during volume backup')
logger.exception(ex)
errors = True
# back up databases
logger.info('Backing up databases')
for container in containers.containers_for_backup():
if container.database_backup_enabled:
try:
instance = container.instance
logger.info('Backing up %s in service %s', instance.container_type, instance.service_name)
result = instance.backup()
logger.debug('Exit code: %s', result)
if result != 0:
logger.error('Backup command exited with non-zero code: %s', result)
errors = True
except Exception as ex:
logger.exception(ex)
errors = True
if errors:
logger.error('Exit code: %s', errors)
exit(1)
# Only run cleanup if backup was successful
result = cleanup(config, container)
logger.debug('cleanup exit code: %s', result)
if result != 0:
logger.error('cleanup exit code: %s', result)
exit(1)
# Test the repository for errors
logger.info("Checking the repository for errors")
result = restic.check(config.repository)
if result != 0:
logger.error('Check exit code: %s', result)
exit(1)
logger.info('Backup completed')
def cleanup(config, containers):
"""Run forget / prune to minimize storage space"""
logger.info('Forget outdated snapshots')
forget_result = restic.forget(
config.repository,
config.keep_daily,
config.keep_weekly,
config.keep_monthly,
config.keep_yearly,
)
logger.info('Prune stale data freeing storage space')
prune_result = restic.prune(config.repository)
return forget_result and prune_result
def snapshots(config, containers):
"""Display restic snapshots"""
stdout, stderr = restic.snapshots(config.repository, last=True)
for line in stdout.decode().split('\n'):
print(line)
def alert(config, containers):
"""Test alerts"""
logger.info("Testing alerts")
alerts.send(
subject="{}: Test Alert".format(containers.project_name),
body="Test message",
)
def crontab(config):
"""Generate the crontab"""
print(cron.generate_crontab(config))
def parse_args():
parser = argparse.ArgumentParser(prog='restic_compose_backup')
parser.add_argument(
'action',
choices=[
'status',
'snapshots',
'backup',
'start-backup-process',
'alert',
'cleanup',
'version',
'crontab',
],
)
parser.add_argument(
'--log-level',
default=None,
choices=list(log.LOG_LEVELS.keys()),
help="Log level"
)
return parser.parse_args()
if __name__ == '__main__':
main()


@@ -0,0 +1,91 @@
import logging
from typing import List, Tuple
from subprocess import Popen, PIPE
logger = logging.getLogger(__name__)
def test():
return run(['ls', '/volumes'])
def ping_mysql(host, port, username) -> int:
"""Check if the mysql is up and can be reached"""
return run([
'mysqladmin',
'ping',
'--host',
host,
'--port',
port,
'--user',
username,
])
def ping_mariadb(host, port, username) -> int:
"""Check if the mariadb is up and can be reached"""
return run([
'mysqladmin',
'ping',
'--host',
host,
'--port',
port,
'--user',
username,
])
def ping_postgres(host, port, username, password) -> int:
"""Check if postgres can be reached"""
return run([
"pg_isready",
f"--host={host}",
f"--port={port}",
f"--username={username}",
])
def run(cmd: List[str]) -> int:
"""Run a command with parameters"""
logger.debug('cmd: %s', ' '.join(cmd))
child = Popen(cmd, stdout=PIPE, stderr=PIPE)
stdoutdata, stderrdata = child.communicate()
if stdoutdata.strip():
log_std('stdout', stdoutdata.decode(),
logging.DEBUG if child.returncode == 0 else logging.ERROR)
if stderrdata.strip():
log_std('stderr', stderrdata.decode(), logging.ERROR)
logger.debug("returncode %s", child.returncode)
return child.returncode
def run_capture_std(cmd: List[str]) -> Tuple[str, str]:
"""Run a command with parameters and return stdout, stderr"""
logger.debug('cmd: %s', ' '.join(cmd))
child = Popen(cmd, stdout=PIPE, stderr=PIPE)
return child.communicate()
def log_std(source: str, data: str, level: int):
if isinstance(data, bytes):
data = data.decode()
if not data.strip():
return
log_func = logger.debug if level == logging.DEBUG else logger.error
log_func('%s %s %s', '-' * 10, source, '-' * 10)
lines = data.split('\n')
if lines[-1] == '':
lines.pop()
for line in lines:
log_func(line)
log_func('-' * 28)


@@ -0,0 +1,34 @@
import os
class Config:
default_backup_command = "source /env.sh && rcb backup > /proc/1/fd/1"
default_crontab_schedule = "0 2 * * *"
"""Bag for config values"""
def __init__(self, check=True):
# Mandatory values
self.repository = os.environ.get('RESTIC_REPOSITORY')
self.password = os.environ.get('RESTIC_PASSWORD')
self.docker_base_url = os.environ.get('DOCKER_BASE_URL') or "unix://tmp/docker.sock"
self.cron_schedule = os.environ.get('CRON_SCHEDULE') or self.default_crontab_schedule
self.cron_command = os.environ.get('CRON_COMMAND') or self.default_backup_command
# Log
self.log_level = os.environ.get('LOG_LEVEL')
# forget / keep
self.keep_daily = os.environ.get('KEEP_DAILY') or "7"
self.keep_weekly = os.environ.get('KEEP_WEEKLY') or "4"
self.keep_monthly = os.environ.get('KEEP_MONTHLY') or "12"
self.keep_yearly = os.environ.get('KEEP_YEARLY') or "3"
if check:
self.check()
def check(self):
if not self.repository:
raise ValueError("RESTIC_REPOSITORY env var not set")
if not self.password:
raise ValueError("RESTIC_REPOSITORY env var not set")


@@ -2,7 +2,8 @@ import os
from pathlib import Path from pathlib import Path
from typing import List from typing import List
from restic_compose_backup import utils from restic_compose_backup import enums, utils
VOLUME_TYPE_BIND = "bind" VOLUME_TYPE_BIND = "bind"
VOLUME_TYPE_VOLUME = "volume" VOLUME_TYPE_VOLUME = "volume"
@@ -27,8 +28,8 @@ class Container:
if self._labels is None: if self._labels is None:
raise ValueError('Container meta missing Config->Labels') raise ValueError('Container meta missing Config->Labels')
self._include = self._parse_pattern(self.get_label('restic-compose-backup.volumes.include')) self._include = self._parse_pattern(self.get_label(enums.LABEL_VOLUMES_INCLUDE))
self._exclude = self._parse_pattern(self.get_label('restic-compose-backup.volumes.exclude')) self._exclude = self._parse_pattern(self.get_label(enums.LABEL_VOLUMES_EXCLUDE))
@property @property
def instance(self) -> 'Container': def instance(self) -> 'Container':
@@ -63,7 +64,10 @@ class Container:
@property @property
def environment(self) -> list: def environment(self) -> list:
"""All configured env vars for the container as a list""" """All configured env vars for the container as a list"""
return self.get_config('Env', default=[]) return self.get_config('Env')
def remove(self):
self._data.remove()
def get_config_env(self, name) -> str: def get_config_env(self, name) -> str:
"""Get a config environment variable by name""" """Get a config environment variable by name"""
@@ -71,6 +75,17 @@ class Container:
data = {i[0:i.find('=')]: i[i.find('=') + 1:] for i in self.environment} data = {i[0:i.find('=')]: i[i.find('=') + 1:] for i in self.environment}
return data.get(name) return data.get(name)
def set_config_env(self, name, value):
"""Set an environment variable"""
env = self.environment
new_value = f'{name}={value}'
for i, entry in enumerate(env):
if f'{name}=' in entry:
env[i] = new_value
break
else:
env.append(new_value)
@property @property
def volumes(self) -> dict: def volumes(self) -> dict:
""" """
@@ -96,7 +111,8 @@ class Container:
@property @property
def volume_backup_enabled(self) -> bool: def volume_backup_enabled(self) -> bool:
return utils.is_true(self.get_label('restic-compose-backup.volumes')) """bool: If the ``restic-compose-backup.volumes`` label is set"""
return utils.is_true(self.get_label(enums.LABEL_VOLUMES_ENABLED))
@property @property
def database_backup_enabled(self) -> bool: def database_backup_enabled(self) -> bool:
@@ -109,24 +125,27 @@ class Container:
@property @property
def mysql_backup_enabled(self) -> bool: def mysql_backup_enabled(self) -> bool:
return utils.is_true(self.get_label('restic-compose-backup.mysql')) """bool: If the ``restic-compose-backup.mysql`` label is set"""
return utils.is_true(self.get_label(enums.LABEL_MYSQL_ENABLED))
@property @property
def mariadb_backup_enabled(self) -> bool: def mariadb_backup_enabled(self) -> bool:
return utils.is_true(self.get_label('restic-compose-backup.mariadb')) """bool: If the ``restic-compose-backup.mariadb`` label is set"""
return utils.is_true(self.get_label(enums.LABEL_MARIADB_ENABLED))
@property @property
def postgresql_backup_enabled(self) -> bool: def postgresql_backup_enabled(self) -> bool:
return utils.is_true(self.get_label('restic-compose-backup.postgres')) """bool: If the ``restic-compose-backup.postgres`` label is set"""
return utils.is_true(self.get_label(enums.LABEL_POSTGRES_ENABLED))
@property @property
def is_backup_process_container(self) -> bool: def is_backup_process_container(self) -> bool:
"""Is this container the running backup process?""" """Is this container the running backup process?"""
-        return self.get_label('restic-compose-backup.backup_process') == 'True'
+        return self.get_label(self.backup_process_label) == 'True'

     @property
     def is_running(self) -> bool:
-        """Is the container running?"""
+        """bool: Is the container running?"""
         return self._state.get('Running', False)

     @property
@@ -139,6 +158,11 @@ class Container:
         """Name of the container/service"""
         return self.get_label('com.docker.compose.service', default='')

+    @property
+    def backup_process_label(self) -> str:
+        """str: The unique backup process label for this project"""
+        return f"{enums.LABEL_BACKUP_PROCESS}-{self.project_name}"
+
     @property
     def project_name(self) -> str:
         """Name of the compose setup"""
@@ -160,6 +184,10 @@ class Container:
     def filter_mounts(self):
         """Get all mounts for this container matching include/exclude filters"""
         filtered = []
+        if not self.volume_backup_enabled:
+            return filtered
+
         if self._include:
             for mount in self._mounts:
                 for pattern in self._include:
@@ -182,7 +210,7 @@ class Container:
         return filtered

-    def volumes_for_backup(self, source_prefix='/backup', mode='ro'):
+    def volumes_for_backup(self, source_prefix='/volumes', mode='ro'):
         """Get volumes configured for backup"""
         mounts = self.filter_mounts()
         volumes = {}
@@ -295,6 +323,7 @@ class RunningContainers:
         self.containers = []
         self.this_container = None
         self.backup_process_container = None
+        self.stale_backup_process_containers = []

         # Find the container we are running in.
         # If we don't have this information we cannot continue
@@ -305,10 +334,20 @@ class RunningContainers:
         if not self.this_container:
             raise ValueError("Cannot find metadata for backup container")

-        # Gather all containers in the current compose setup
+        # Gather all running containers in the current compose setup
         for container_data in all_containers:
             container = Container(container_data)

+            # Gather stale backup process containers
+            if (self.this_container.image == container.image
+                    and not container.is_running
+                    and container.is_backup_process_container):
+                self.stale_backup_process_containers.append(container)
+
+            # We only care about running containers after this point
+            if not container.is_running:
+                continue
+
             # Detect running backup process container
             if container.is_backup_process_container:
                 self.backup_process_container = container
@@ -319,6 +358,16 @@ class RunningContainers:
             if container.id != self.this_container.id:
                 self.containers.append(container)

+    @property
+    def project_name(self) -> str:
+        """str: Name of the compose project"""
+        return self.this_container.project_name
+
+    @property
+    def backup_process_label(self) -> str:
+        """str: The backup process label for this project"""
+        return self.this_container.backup_process_label
+
     @property
     def backup_process_running(self) -> bool:
         """Is the backup process container running?"""
@@ -328,7 +377,7 @@ class RunningContainers:
         """Obtain all containers with backup enabled"""
         return [container for container in self.containers if container.backup_enabled]

-    def generate_backup_mounts(self, dest_prefix='/backup') -> dict:
+    def generate_backup_mounts(self, dest_prefix='/volumes') -> dict:
         """Generate mounts for backup for the entire compose setup"""
         mounts = {}
         for container in self.containers_for_backup():
@@ -338,6 +387,7 @@ class RunningContainers:
         return mounts

     def get_service(self, name) -> Container:
+        """Container: Get a service by name"""
         for container in self.containers:
             if container.service_name == name:
                 return container
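A small illustration of the new per-project label (the project name below is hypothetical): the spawned backup process container is labelled with the base label from the enums module further down plus the compose project name, and is_backup_process_container simply compares that label's value to the string 'True'.

# Illustration only; 'myproject' stands in for the real compose project name.
LABEL_BACKUP_PROCESS = 'restic-compose-backup.process'   # value from the enums module

project_name = 'myproject'
backup_process_label = f"{LABEL_BACKUP_PROCESS}-{project_name}"

assert backup_process_label == 'restic-compose-backup.process-myproject'
# The backup process container carries {backup_process_label: 'True'},
# so is_backup_process_container reduces to get_label(...) == 'True'.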


@@ -22,11 +22,12 @@ class MariadbContainer(Container):
     def ping(self) -> bool:
         """Check the availability of the service"""
         creds = self.get_credentials()
-        return commands.ping_mysql(
+        with utils.environment('MYSQL_PWD', creds['password']):
+            return commands.ping_mariadb(
                 creds['host'],
                 creds['port'],
                 creds['username'],
-                creds['password'],
             )

     def dump_command(self) -> list:
@@ -37,15 +38,17 @@ class MariadbContainer(Container):
             f"--host={creds['host']}",
             f"--port={creds['port']}",
             f"--user={creds['username']}",
-            f"--password={creds['password']}",
             "--all-databases",
         ]

     def backup(self):
         config = Config()
+        creds = self.get_credentials()
+
+        with utils.environment('MYSQL_PWD', creds['password']):
             return restic.backup_from_stdin(
                 config.repository,
-                f'/backup/{self.service_name}/all_databases.sql',
+                f'/databases/{self.service_name}/all_databases.sql',
                 self.dump_command(),
             )
@@ -65,11 +68,12 @@ class MysqlContainer(Container):
     def ping(self) -> bool:
         """Check the availability of the service"""
         creds = self.get_credentials()
+        with utils.environment('MYSQL_PWD', creds['password']):
             return commands.ping_mysql(
                 creds['host'],
                 creds['port'],
                 creds['username'],
-                creds['password'],
             )

     def dump_command(self) -> list:
@@ -80,15 +84,17 @@ class MysqlContainer(Container):
             f"--host={creds['host']}",
             f"--port={creds['port']}",
             f"--user={creds['username']}",
-            f"--password={creds['password']}",
             "--all-databases",
         ]

     def backup(self):
         config = Config()
+        creds = self.get_credentials()
+
+        with utils.environment('MYSQL_PWD', creds['password']):
             return restic.backup_from_stdin(
                 config.repository,
-                f'/backup/{self.service_name}/all_databases.sql',
+                f'/databases/{self.service_name}/all_databases.sql',
                 self.dump_command(),
             )
@@ -135,6 +141,6 @@ class PostgresContainer(Container):
         with utils.environment('PGPASSWORD', creds['password']):
             return restic.backup_from_stdin(
                 config.repository,
-                f"/backup/{self.service_name}/{creds['database']}.sql",
+                f"/databases/{self.service_name}/{creds['database']}.sql",
                 self.dump_command(),
             )
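The utils.environment helper used above is not part of this diff. Below is a minimal sketch of how such a context manager is commonly written (an assumption, not necessarily the project's exact implementation); passing the password via MYSQL_PWD / PGPASSWORD keeps it off the dump command line, where it would otherwise be visible in the container's process list.

import os
from contextlib import contextmanager


@contextmanager
def environment(name: str, value: str):
    """Temporarily set an environment variable, restoring the previous state on exit."""
    old_value = os.environ.get(name)
    os.environ[name] = value
    try:
        # Child processes spawned inside this block (mysqldump, pg_dump, restic)
        # inherit the variable.
        yield
    finally:
        if old_value is None:
            del os.environ[name]
        else:
            os.environ[name] = old_value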


@@ -0,0 +1,69 @@
"""
# ┌───────────── minute (0 - 59)
# │ ┌───────────── hour (0 - 23)
# │ │ ┌───────────── day of the month (1 - 31)
# │ │ │ ┌───────────── month (1 - 12)
# │ │ │ │ ┌───────────── day of the week (0 - 6) (Sunday to Saturday;
# │ │ │ │ │ 7 is also Sunday on some systems)
# │ │ │ │ │
# │ │ │ │ │
# * * * * * command to execute
"""
QUOTE_CHARS = ['"', "'"]
def generate_crontab(config):
"""Generate a crontab entry for running backup job"""
command = config.cron_command.strip()
schedule = config.cron_schedule
if schedule:
schedule = schedule.strip()
schedule = strip_quotes(schedule)
if not validate_schedule(schedule):
schedule = config.default_crontab_schedule
else:
schedule = config.default_crontab_schedule
return f'{schedule} {command}\n'
def validate_schedule(schedule: str):
"""Validate crontab format"""
parts = schedule.split()
if len(parts) != 5:
return False
for p in parts:
if p != '*' and not p.isdigit():
return False
minute, hour, day, month, weekday = parts
try:
validate_field(minute, 0, 59)
validate_field(hour, 0, 23)
validate_field(day, 1, 31)
validate_field(month, 1, 12)
validate_field(weekday, 0, 6)
except ValueError:
return False
return True
def validate_field(value, min, max):
if value == '*':
return
i = int(value)
return min <= i <= max
def strip_quotes(value: str):
"""Strip enclosing single or double quotes if present"""
if value[0] in QUOTE_CHARS:
value = value[1:]
if value[-1] in QUOTE_CHARS:
value = value[:-1]
return value
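A quick usage sketch (not part of the diff); the config object is a hypothetical stand-in exposing only the attributes generate_crontab reads, and the module path is assumed:

from types import SimpleNamespace

from restic_compose_backup.cron import generate_crontab  # module path assumed

config = SimpleNamespace(
    cron_schedule='"10 2 * * *"',           # enclosing quotes are stripped
    cron_command='rcb backup',
    default_crontab_schedule='0 2 * * *',   # fallback for a missing or invalid schedule
)

print(generate_crontab(config), end='')
# -> 10 2 * * * rcb backup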


@@ -0,0 +1,11 @@
# Labels
LABEL_VOLUMES_ENABLED = 'restic-compose-backup.volumes'
LABEL_VOLUMES_INCLUDE = 'restic-compose-backup.volumes.include'
LABEL_VOLUMES_EXCLUDE = 'restic-compose-backup.volumes.exclude'
LABEL_MYSQL_ENABLED = 'restic-compose-backup.mysql'
LABEL_POSTGRES_ENABLED = 'restic-compose-backup.postgres'
LABEL_MARIADB_ENABLED = 'restic-compose-backup.mariadb'
LABEL_BACKUP_PROCESS = 'restic-compose-backup.process'
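For orientation (not part of the diff): on a running service these labels show up as plain strings in the container's Config.Labels, which is what Container.get_label reads. A hypothetical web service opting into volume backup might therefore report:

# Hypothetical Config.Labels mapping returned by the Docker API for a
# service opting into volume backups; label values are strings.
labels = {
    'com.docker.compose.project': 'myproject',
    'com.docker.compose.service': 'web',
    'restic-compose-backup.volumes': 'True',
    'restic-compose-backup.volumes.include': 'media',
}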


@@ -0,0 +1,28 @@
import logging
import os
import sys
logger = logging.getLogger('restic_compose_backup')
HOSTNAME = os.environ['HOSTNAME']
DEFAULT_LOG_LEVEL = logging.INFO
LOG_LEVELS = {
    'debug': logging.DEBUG,
    'info': logging.INFO,
    'warning': logging.WARNING,
    'error': logging.ERROR,
}


def setup(level: str = 'warning'):
    """Set up logging"""
    level = level or ""
    level = LOG_LEVELS.get(level.lower(), DEFAULT_LOG_LEVEL)
    logger.setLevel(level)

    ch = logging.StreamHandler(stream=sys.stdout)
    ch.setLevel(level)
    # ch.setFormatter(logging.Formatter(f'%(asctime)s - {HOSTNAME} - %(name)s - %(levelname)s - %(message)s'))
    # ch.setFormatter(logging.Formatter(f'%(asctime)s - {HOSTNAME} - %(levelname)s - %(message)s'))
    ch.setFormatter(logging.Formatter(f'%(asctime)s - %(levelname)s: %(message)s'))
    logger.addHandler(ch)
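A minimal usage sketch (module path assumed; note that HOSTNAME must be present in the environment before the module is imported):

import logging
import os

os.environ.setdefault('HOSTNAME', 'backup')   # required at import time

from restic_compose_backup import log         # module path assumed

log.setup(level='debug')                      # unknown or empty levels fall back to INFO
logging.getLogger('restic_compose_backup').info('starting backup run')
# 2019-12-08 04:00:00,000 - INFO: starting backup run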


@@ -0,0 +1,111 @@
"""
Restic commands
"""
import logging
from typing import List, Tuple
from subprocess import Popen, PIPE
from restic_compose_backup import commands
logger = logging.getLogger(__name__)
def init_repo(repository: str):
"""
Attempt to initialize the repository.
Doing this after the repository is initialized
"""
return commands.run(restic(repository, [
"init",
]))
def backup_files(repository: str, source='/volumes'):
return commands.run(restic(repository, [
"--verbose",
"backup",
source,
]))
def backup_from_stdin(repository: str, filename: str, source_command: List[str]):
"""
Backs up from stdin running the source_command passed in.
It will appear in restic with the filename (including path) passed in.
"""
dest_command = restic(repository, [
'backup',
'--stdin',
'--stdin-filename',
filename,
])
# pipe source command into dest command
source_process = Popen(source_command, stdout=PIPE, bufsize=65536)
dest_process = Popen(dest_command, stdin=source_process.stdout, stdout=PIPE, stderr=PIPE, bufsize=65536)
stdout, stderr = dest_process.communicate()
# Ensure both processes exited with code 0
source_exit, dest_exit = source_process.poll(), dest_process.poll()
exit_code = 0 if (source_exit == 0 and dest_exit == 0) else 1
if stdout:
commands.log_std('stdout', stdout, logging.DEBUG if exit_code == 0 else logging.ERROR)
if stderr:
commands.log_std('stderr', stderr, logging.ERROR)
return exit_code
def snapshots(repository: str, last=True) -> Tuple[str, str]:
"""Returns the stdout and stderr info"""
args = ["snapshots"]
if last:
args.append('--last')
return commands.run_capture_std(restic(repository, args))
def is_initialized(repository: str) -> bool:
"""
Checks if a repository is initialized using snapshots command.
Note that this cannot separate between uninitalized repo
and other errors, but this method is reccomended by the restic
community.
"""
return commands.run(restic(repository, ["snapshots", '--last'])) == 0
def forget(repository: str, daily: str, weekly: str, monthly: str, yearly: str):
return commands.run(restic(repository, [
'forget',
'--keep-daily',
daily,
'--keep-weekly',
weekly,
'--keep-monthly',
monthly,
'--keep-yearly',
yearly,
]))
def prune(repository: str):
return commands.run(restic(repository, [
'prune',
]))
def check(repository: str):
return commands.run(restic(repository, [
"check",
# "--with-cache",
]))
def restic(repository: str, args: List[str]):
"""Generate restic command"""
return [
"restic",
"-r",
repository,
] + args
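To make the repository check and the stdin pipeline concrete, a hedged usage sketch; the repository string and dump command are illustrative only, and in the real flow they come from Config and the database container classes above.

from restic_compose_backup import restic  # module path assumed

repository = 's3:s3.amazonaws.com/my-bucket/backups'    # illustrative
dump_command = ['mysqldump', '--host=mariadb', '--port=3306',
                '--user=root', '--all-databases']        # illustrative

# Initialize only when needed (restic init fails on an existing repository).
if not restic.is_initialized(repository):
    restic.init_repo(repository)

# mysqldump's stdout is piped into `restic backup --stdin`; the dump shows up
# in the snapshot under the given path.
exit_code = restic.backup_from_stdin(
    repository,
    '/databases/mydb/all_databases.sql',
    dump_command,
)
print('ok' if exit_code == 0 else 'backup failed')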


@@ -1,26 +1,45 @@
 import os
+import logging
+from typing import List
 from contextlib import contextmanager

 import docker

 from restic_compose_backup.config import Config

+logger = logging.getLogger(__name__)
+
 TRUE_VALUES = ['1', 'true', 'True', True, 1]


-def list_containers():
+def docker_client():
+    config = Config()
+    return docker.DockerClient(base_url=config.docker_base_url)
+
+
+def list_containers() -> List[dict]:
     """
     List all containers.

     Returns:
         List of raw container json data from the api
     """
-    config = Config()
-    client = docker.DockerClient(base_url=config.docker_base_url)
-    all_containers = client.containers.list()
+    client = docker_client()
+    all_containers = client.containers.list(all=True)
     client.close()
     return [c.attrs for c in all_containers]


+def remove_containers(containers: List['Container']):
+    client = docker_client()
+    logger.info('Attempting to delete stale backup process containers')
+    for container in containers:
+        logger.info(' -> deleting %s', container.name)
+        try:
+            c = client.containers.get(container.name)
+            c.remove()
+        except Exception as ex:
+            logger.exception(ex)
+
+
 def is_true(value):
     """
     Evaluates the truthfullness of a bool value in container labels

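Tying this back to the container scan above, a short sketch (module paths assumed; this only works inside the backup container, since RunningContainers needs to find its own metadata): the stale backup process containers gathered by RunningContainers can be handed straight to remove_containers.

from restic_compose_backup import utils            # module paths assumed
from restic_compose_backup.containers import RunningContainers

containers = RunningContainers()
if containers.stale_backup_process_containers:
    # Each entry is a Container wrapper; remove_containers looks the real
    # container up by name and removes it through the Docker API.
    utils.remove_containers(containers.stale_backup_process_containers)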

@@ -3,12 +3,12 @@ from setuptools import setup, find_namespace_packages
 setup(
     name="restic-compose-backup",
     url="https://github.com/ZettaIO/restic-compose-backup",
-    version="0.2.0",
+    version="0.4.0",
     author="Einar Forselv",
     author_email="eforselv@gmail.com",
     packages=find_namespace_packages(include=['restic_compose_backup']),
     install_requires=[
-        'docker==3.7.2',
+        'docker==4.1.*',
     ],
     entry_points={'console_scripts': [
         'restic-compose-backup = restic_compose_backup.cli:main',


@@ -1 +1,2 @@
 pytest==4.3.1
+tox


@@ -59,6 +59,7 @@ class ResticBackupTests(unittest.TestCase):
             {
                 'service': 'web',
                 'labels': {
+                    'restic-compose-backup.volumes': True,
                     'test': 'test',
                 },
                 'mounts': [{
@@ -111,7 +112,7 @@ class ResticBackupTests(unittest.TestCase):
         with mock.patch(list_containers_func, fixtures.containers(containers=containers)):
             cnt = RunningContainers()
             self.assertTrue(len(cnt.containers_for_backup()) == 2)
-            self.assertEqual(cnt.generate_backup_mounts(), {'test': {'bind': '/backup/web/test', 'mode': 'ro'}})
+            self.assertEqual(cnt.generate_backup_mounts(), {'test': {'bind': '/volumes/web/test', 'mode': 'ro'}})

     def test_include(self):
         containers = self.createContainers()
@@ -119,7 +120,8 @@ class ResticBackupTests(unittest.TestCase):
             {
                 'service': 'web',
                 'labels': {
-                    'restic-compose-backup.include': 'media',
+                    'restic-compose-backup.volumes': True,
+                    'restic-compose-backup.volumes.include': 'media',
                 },
                 'mounts': [
                     {
@@ -142,6 +144,7 @@ class ResticBackupTests(unittest.TestCase):
         self.assertNotEqual(web_service, None, msg="Web service not found")

         mounts = web_service.filter_mounts()
+        print(mounts)
         self.assertEqual(len(mounts), 1)
         self.assertEqual(mounts[0].source, '/srv/files/media')
@@ -151,7 +154,8 @@ class ResticBackupTests(unittest.TestCase):
             {
                 'service': 'web',
                 'labels': {
-                    'restic-compose-backup.exclude': 'stuff',
+                    'restic-compose-backup.volumes': True,
+                    'restic-compose-backup.volumes.exclude': 'stuff',
                 },
                 'mounts': [
                     {
@@ -187,7 +191,7 @@ class ResticBackupTests(unittest.TestCase):
             {
                 'service': 'backup_runner',
                 'labels': {
-                    'restic-compose-backup.backup_process': 'True',
+                    'restic-compose-backup.process-default': 'True',
                 },
             },
         ]

tox.ini (new file, 56 lines)

@@ -0,0 +1,56 @@
# Ensure that this file does not contain non-ascii characters
# as flake8 can fail to parse the file on OS X and Windows

[tox]
skipsdist = True
setupdir = {toxinidir}/src
envlist =
    py37
    pep8

[testenv]
usedevelop = True
basepython =
    py37: python3.7
deps =
    -r{toxinidir}/src/tests/requirements.txt
commands =
    ; coverage run --source=restic_compose_backup -m pytest tests/
    ; coverage report
    pytest

[testenv:pep8]
usedevelop = false
deps = flake8
basepython = python3.7
commands = flake8

[pytest]
norecursedirs = tests/* .venv/* .tox/* build/ docs/

[flake8]
# H405: multi line docstring summary not separated with an empty line
# D100: Missing docstring in public module
# D101: Missing docstring in public class
# D102: Missing docstring in public method
# D103: Missing docstring in public function
# D104: Missing docstring in public package
# D105: Missing docstring in magic method
# D200: One-line docstring should fit on one line with quotes
# D202: No blank lines allowed after function docstring
# D203: 1 blank required before class docstring.
# D204: 1 blank required after class docstring
# D205: Blank line required between one-line summary and description.
# D207: Docstring is under-indented
# D208: Docstring is over-indented
# D211: No blank lines allowed before class docstring
# D301: Use r""" if any backslashes in a docstring
# D400: First line should end with a period.
# D401: First line should be in imperative mood.
# *** E302 expected 2 blank lines, found 1
# *** W503 line break before binary operator
ignore = H405,D100,D101,D102,D103,D104,D105,D200,D202,D203,D204,D205,D211,D301,D400,D401,W503
show-source = True
max-line-length = 120
exclude = .tox,.venv*,tests,build,conf.py