
Web Server Backup Script (now with S3 sync)

This is a bash script for backing up multiple web sites and MySQL databases into a specified backups directory. It's a good idea to run it every night via cron.

Once configured (variables set within the script), it does this:

  • Creates a directory for your site (file) backups (if it doesn't exist)
  • Creates a directory for your MySQL dumps (if it doesn't exist)
  • Loops through all of your MySQL databases and dumps each one of them to a gzipped file
  • Deletes database dumps older than a specified number of days from the backup directory
  • Tars and gzips each folder within your sites directory (I keep my websites in /var/www/sites/)
  • Deletes site archives older than a specified number of days from the backup directory
  • Optionally syncs all backup files to Amazon S3 or a remote server, using s3sync.rb or rsync respectively
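
The archive-and-prune steps above can be sketched as a short, runnable fragment. All paths here are temporary stand-ins for the script's real settings, and the MySQL steps follow the same pattern, with mysqldump piped through gzip instead of tar:

```shell
#!/usr/bin/env bash
# Sketch of steps 5 and 6 above, using throwaway temp directories.
set -eu

SITES_DIR="$(mktemp -d)"           # stand-in for /var/www/sites/
SITES_BACKUP_DIR="$(mktemp -d)"    # stand-in for $BACKUP_DIR/sites/
KEEP_SITES=2
THE_DATE="$(date '+%Y-%m-%d')"

mkdir -p "$SITES_DIR/example-site"
echo "hello" > "$SITES_DIR/example-site/index.html"

# Tar and gzip each folder within the sites directory
for site in "$SITES_DIR"/*/; do
    name="$(basename "$site")"
    tar -czf "$SITES_BACKUP_DIR/$name.$THE_DATE.tar.gz" -C "$SITES_DIR" "$name"
done

# Delete archives older than KEEP_SITES days (the ones just created survive)
find "$SITES_BACKUP_DIR" -name '*.tar.gz' -mtime +"$KEEP_SITES" -delete
```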

BETA WARNING

This script works fine for me (using Ubuntu 8.04 on Slicehost), but servers vary greatly. USE THIS SCRIPT AT YOUR OWN RISK!! There is always risk involved with running a script. I AM NOT RESPONSIBLE FOR DAMAGE CAUSED BY THIS SCRIPT.

You may very well know more about bash scripting and archiving than I do. If you find any flaws with this script or have any recommendations as to how this script can be improved, please fork it and send me a pull request.

Installation

  • MOST IMPORTANTLY: Open the backup.sh file in a text editor and set the configuration variables at the top (see below).
  • Optionally, edit the tar command on line 91 to add some more --exclude options (e.g. --exclude="cache/*")
  • Place the backup.sh file somewhere on your server (something like /usr/local/web-server-backup).
  • Make sure the backup.sh script is owned by root: sudo chown -R 0:0 /usr/local/web-server-backup
  • Make sure the backup.sh script is executable by root: sudo chmod 744 /usr/local/web-server-backup/backup.sh
  • Set up your Amazon S3 account and bucket (or a remote account for rsync)
  • Set up s3sync (see below)
  • Preferably set up cron to run it every night (see below).
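
As an illustration of the optional tar edit mentioned above, here is roughly what an added `--exclude` option does, again using throwaway temp directories rather than real site paths:

```shell
# Hypothetical example of the --exclude="cache/*" edit; paths are stand-ins.
set -eu

SITES_DIR="$(mktemp -d)"
mkdir -p "$SITES_DIR/mysite/cache"
echo "<?php ?>" > "$SITES_DIR/mysite/index.php"
echo "tmp"      > "$SITES_DIR/mysite/cache/page.tmp"

ARCHIVE="$(mktemp -d)/mysite.tar.gz"
# The cache contents are left out of the resulting archive
tar -czf "$ARCHIVE" --exclude="cache/*" -C "$SITES_DIR" mysite
```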

Configuration

There are a bunch of variables that you can set to customize the way the script works. Some of them must be set before running the script!

NOTE: The BACKUP_DIR setting is preset to /backups/site-backups. If you want to use something like /var/site-backups, you'll need to create that directory first and make it writable by the user running the script.

General Settings:

  • BACKUP_DIR: The parent directory in which the backups will be placed. It's preset to: "/backups/site-backups"
  • KEEP_MYSQL: How many days' worth of MySQL dumps to keep. It's preset to: "14"
  • KEEP_SITES: How many days' worth of site tarballs to keep. It's preset to: "2"

MySQL Settings:

  • MYSQL_HOST: The MySQL hostname. It's preset to the standard: "localhost"
  • MYSQL_USER: The MySQL username. It's preset to the standard: "root"
  • MYSQL_PASS: The MySQL password. You'll need to set this yourself!
  • MYSQL_BACKUP_DIR: The directory in which the dumps will be placed. It's preset to: "$BACKUP_DIR/mysql/"

Web Site Settings:

  • SITES_DIR: This is the directory where you keep all of your web sites. It's preset to: "/var/www/sites/"
  • SITES_BACKUP_DIR: The directory in which the archived site files will be placed. It's preset to: "$BACKUP_DIR/sites/"
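
Put together, the settings above would look something like this at the top of backup.sh (the password value is a placeholder you must replace):

```shell
# General settings
BACKUP_DIR="/backups/site-backups"
KEEP_MYSQL="14"
KEEP_SITES="2"

# MySQL settings
MYSQL_HOST="localhost"
MYSQL_USER="root"
MYSQL_PASS="your-password-here"   # must be set before running!
MYSQL_BACKUP_DIR="$BACKUP_DIR/mysql/"

# Web site settings
SITES_DIR="/var/www/sites/"
SITES_BACKUP_DIR="$BACKUP_DIR/sites/"
```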

S3sync Settings (recommended):

You can sync to an Amazon S3 bucket, but you'll need to install s3sync.rb first. Get it from the s3sync project. It is a Ruby script, so you'll need to make sure you have Ruby installed.

The only thing tricky about getting s3sync.rb installed is the CA certificates. If you are on Ubuntu, you can install the ca-certificates package with apt-get or aptitude. You need those certificates to connect to S3 securely (SSL).

  • S3SYNC_PATH: Wherever you installed s3sync.rb. It's preset to: "/usr/local/s3sync/s3sync.rb"
  • S3_BUCKET: The name of the bucket to which you wish to sync.
  • AWS_ACCESS_KEY_ID: Log in to your Amazon AWS account to get this.
  • AWS_SECRET_ACCESS_KEY: Log in to your Amazon AWS account to get this.
  • USE_SSL: If this is set to "true", s3sync will use a secure connection to S3, but you'll need to set the SSL_CERT_DIR or SSL_CERT_FILE to make it work. See the s3sync README for more info.
  • SSL_CERT_DIR: Where your Cert Authority keys live. It's preset to: "/etc/ssl/certs"
  • SSL_CERT_FILE: If you have just one PEM file for CA verification, you can use this instead of SSL_CERT_DIR.
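
The script feeds these settings into an s3sync.rb call; a rough, unverified sketch of such an invocation looks like this (the "backups" bucket prefix is made up, and the exact flags are documented in the s3sync README):

```shell
# Illustrative only -- consult the s3sync README for the exact options.
export AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY
export SSL_CERT_DIR    # or SSL_CERT_FILE, when USE_SSL is "true"

# Recursively mirror the local backup directory into the bucket over SSL
"$S3SYNC_PATH" --ssl --recursive --delete "$BACKUP_DIR/" "$S3_BUCKET:backups"
```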

Rsync Settings (alternative to S3):

  • RSYNC: Whether or not you want to rsync the backups to another server (either "true" or "false"). It's preset to: "true"
  • RSYNC_USER: The user account name on the remote server. Please note that there is no password setting. It is recommended that you use an SSH key. You'll need to set this yourself!
  • RSYNC_SERVER: The server address of the remote server. You'll need to set this yourself! It's preset to: "other.server.com"
  • RSYNC_DIR: The directory on the remote server that will be synchronized with $BACKUP_DIR. It's preset to: "web_site_backups"
  • RSYNC_PORT: If you have set a custom SSH port on your remote server, you'll need to change this. It's preset to: "22"

Date format: (change if you want)

  • THE_DATE: The date that will be appended to filenames. It's preset to: "$(date '+%Y-%m-%d')"
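
For illustration, here is how that date stamp ends up in a backup filename (the site name is hypothetical):

```shell
set -eu
THE_DATE="$(date '+%Y-%m-%d')"
FILENAME="example-site.$THE_DATE.tar.gz"
echo "$FILENAME"
```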

Paths to commands: (probably won't need to change these)

  • MYSQL_PATH: Path to mysql. It's preset to: "$(which mysql)"
  • MYSQLDUMP_PATH: Path to mysqldump. It's preset to: "$(which mysqldump)"
  • FIND_PATH: Path to find. It's preset to: "$(which find)"
  • TAR_PATH: Path to tar. It's preset to: "$(which tar)"
  • RSYNC_PATH: Path to rsync. It's preset to: "$(which rsync)"

Running with cron (recommended)

Once you've tested the script, I recommend setting it up to be run every night with cron. Here's a sample cron config:

SHELL=/bin/bash
PATH=/sbin:/bin:/usr/sbin:/usr/bin:/usr/local/bin:/usr/local/sbin
[email protected]
HOME=/root

30 4 * * * root /usr/local/web-server-backup/backup.sh

That'll run the script (located in /usr/local) at 4:30 every morning and email the output to [email protected].

If you only want to receive emails about errors (cron mails whatever the script prints, so redirecting stdout to /dev/null leaves only stderr to be mailed), you can use:

30 4 * * * root /usr/local/web-server-backup/backup.sh > /dev/null

So, take the above example, change the email address, etc., save it to a text file, and place it in /etc/cron.d/. That should do it.