Automatically Back Up a Linux Server to Google Drive

This Bash script automatically backs up a Linux server hosting websites and MySQL/MariaDB databases to Google Drive. It assembles website files, database dumps, and a handful of Apache, PHP, MySQL/MariaDB, and Let's Encrypt configuration files, then uploads the package to Google Drive on a schedule as an encrypted archive. While the script references paths and files specific to an AWS Lightsail instance running Debian with a LAMP stack hosting WordPress, it can easily be adapted to any other Linux-based environment. You should also adjust the included and excluded files and paths to match your own archiving requirements. I fully understand that AWS has an automatic snapshot backup capability, but I want to use Google Drive storage since it is free; AWS snapshots are inexpensive, but they are not free.

Usage

If no options are provided when executing this script, then it will back up all MySQL databases accessible to the user specified in the .mysqldump.cnf file. No other files will be included in the backup in that case.

All files are packaged in a .tar.gz file and encrypted using AES-256 when the -e option is provided.

  • -h displays this help message and exits.
  • -f (full backup) includes both the MySQL dump file as well as other files specified in the script.
  • -e is used to encrypt the backup archive using AES-256. If this flag is set, then a password must be supplied in the .backup.cnf file.
  • -n x sets the number of files to retain on Google Drive in the upload directory (x is an integer value). The default is 30 files.
  • -d sets the parent folder where the backup file is uploaded on Google Drive. If the folder doesn’t exist, then the script will create it. The default folder name is backup.
  • -p sets the filename prefix of the backup file. The script will use the prefix followed by the date/time to set the full filename. The default filename prefix is backup.
  • -m is used to specify the database name to backup. The default is all databases.

Examples

This example uses all of the available options and creates a full encrypted backup of both MySQL and other server files. The backup file is named backup-example-com (with a date/time suffix) and uploaded to a folder named backup-full-example-com on Google Drive. If the specified folder contains more than 5 files, the script keeps the 5 most recent (based on a filename sort) and deletes the rest.

./backup.sh -e -f -n 5 -d backup-full-example-com -p backup-example-com -m db_example_com_wp_prd
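
As a second, lighter-weight illustration (hypothetical values), the following invocation creates an encrypted, database-only backup of a single database and otherwise relies on the defaults: 30 retained files, and a Google Drive folder and filename prefix both named backup.

./backup.sh -e -m db_example_com_wp_prd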

Installation

As a personal preference, I store user scripts and associated configuration files in a subdirectory of my home directory named scripts.

The following command creates the scripts subdirectory.

mkdir -m 700 ~/scripts

Next, create a file named .mysqldump.cnf. The backup script passes this file to the mysqldump command to supply the MySQL username and password used to dump the WordPress database.

touch ~/scripts/.mysqldump.cnf
chmod 600 ~/scripts/.mysqldump.cnf
vim ~/scripts/.mysqldump.cnf

In the vim editor, add a database username and password with access to dump all of the required tables and data for a backup. Single quotes around the password are needed if it contains any special characters.

[mysqldump]
user=username
password='password'
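
As an optional sanity check, you can run mysqldump by hand with this configuration file before relying on the script. The command below mirrors the flags used later in the script and discards the output, so it only confirms that the credentials work; adjust --all-databases to a specific database name if the user only has access to one.

mysqldump --defaults-extra-file="$HOME/scripts/.mysqldump.cnf" --no-tablespaces --all-databases > /dev/null && echo "mysqldump credentials OK"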

The following commands create the .backup.cnf file which contains a secret key used by the script to encrypt the backup file.

touch ~/scripts/.backup.cnf
chmod 600 ~/scripts/.backup.cnf
vim ~/scripts/.backup.cnf

In the vim editor, add a secret key/password to the file.

pass=password
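
If you would like a strong random value for this key, one option is to generate it with OpenSSL and paste the result into the file as the pass value.

openssl rand -base64 32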

Now we’ll create the actual backup script. Please be sure to modify any referenced paths and constants to your own environment.

touch ~/scripts/backup.sh
chmod 700 ~/scripts/backup.sh
vim ~/scripts/backup.sh

The directory and file structure is set up as follows.

~/ (user's home directory)
 ├── .config/
 │   └── gdrive/
 │       └── USERNAME_v2.json (Google Drive authentication token created in subsequent steps)
 ├── bin/
 │   └── gdrive_linux_amd64 (gdrive binary compiled in subsequent steps)
 └── scripts/
     ├── .backup.cnf
     ├── .mysqldump.cnf
     └── backup.sh

Source Code

The following is the full Bash script. Please modify any constants/paths for your own environment and requirements.

#!/bin/bash

# Helper: returns the value for the given key from a key=value configuration file.
getConfigVal() {
  echo "$(grep "${1}" "${2}" | cut -d '=' -f 2)"
}

dirname_temp_backup="automated_backup"
filename_prefix_backup="backup"
filename_suffix_backup="_$(date +'%Y_%m_%d_%H_%M')"
db_name="--all-databases"
gdrive_backup_dirname="backup"
num_files_to_retain=30
path_script="$(dirname "$(readlink -f "$0")")"
path_mysqldump="$(which mysqldump)"
path_gdrive="/home/admin/bin/gdrive_linux_amd64"
path_backup_cnf="${path_script}/.backup.cnf"
path_mysqldump_cnf="${path_script}/.mysqldump.cnf"
path_backup="${HOME}/${dirname_temp_backup}"

while getopts hefn:d:p:m: flag
do
  case "${flag}" in
    h)
      echo "Usage: backup.sh [OPTION]..."
      echo -e "Assembles files and database dumps in a .tar.gz and uploads the archive to Google Drive.\n"

      echo "Options:"
      echo -e "\t-h displays this help message and exits."
      echo -e "\t-f (full backup) includes both the MySQL dump file as well as other files specified in the script."
      echo -e "\t-e is used to encrypt the backup archive using AES-256. If this flag is set, then a password must be supplied in the .backup.cnf file."
      echo -e "\t-n x sets the number of files to retain on Google Drive in the upload directory (x is an integer value). The default is 30 files."
      echo -e "\t-d sets the parent folder where the backup file is uploaded on Google Drive. If the folder doesn’t exist, then the script will create it. The default folder name is backup."
      echo -e "\t-p sets the filename prefix of the backup file. The script will use the prefix followed by the date/time to set the full filename. The default filename prefix is backup."
      echo -e "\t-m is used to specify the database name to backup. The default is all databases."
      exit 1
      ;;
    e)
      bool_encrypt_backup=1
      encryption_key=$(getConfigVal "pass" "${path_backup_cnf}")

      if [ -z "${encryption_key}" ]
      then
        echo "ERROR: Encryption key not found."
        exit 1
      fi
      ;;
    f)
      bool_full_backup=1
      ;;
    n)
      if ! [[ "${OPTARG}" =~ ^[0-9]+$ ]]
      then
        echo "WARNING: Number of backups to retain must be an integer. Reverting to default value (${num_files_to_retain})."
      else
        num_files_to_retain="${OPTARG}"
      fi
      ;;
    d)
      gdrive_backup_dirname="${OPTARG}"
      ;;
    p)
      filename_prefix_backup="${OPTARG}"
      ;;
    m)
      db_name="--databases ${OPTARG}"
      ;;
  esac
done

if [ -z "${path_script}" ] || [ -z "${path_mysqldump}" ] || [ -z "${path_gdrive}" ] || [ -z "${path_backup_cnf}" ] || [ -z "${path_mysqldump_cnf}" ] || [ -z "${path_backup}" ]
then
  echo "ERROR: One or more required path variable(s) are undefined or missing."
  echo -e "\tPath to backup.sh: ${path_script}"
  echo -e "\tPath to mysqldump: ${path_mysqldump}"
  echo -e "\tPath to gdrive: ${path_gdrive}"
  echo -e "\tPath to .backup.cnf: ${path_backup_cnf}"
  echo -e "\tPath to .mysqldump.cnf: ${path_mysqldump_cnf}"
  echo -e "\tPath to write backup file: ${path_backup}"
  exit 1
fi

echo "Initiating backup with the following options:"
echo -e "\tBackup file prefix: ${filename_prefix_backup}"
echo -e "\tBackup file suffix: ${filename_suffix_backup}"
echo -e "\tDatabase name: ${db_name}"
echo -e "\tGoogle Drive Folder: ${gdrive_backup_dirname}"
echo -e "\tNumber of Files to Retain: ${num_files_to_retain}"

# Add one so that 'tail -n +N' in the cleanup step starts at the first file beyond the retention count.
num_files_to_retain=$((num_files_to_retain+1))

# Remove any leftover working directory and archives from a previous run.
sudo rm -rf "${path_backup}"
sudo rm -f "${HOME}/${filename_prefix_backup}"*.tar.gz

mkdir "${path_backup}"

if [ -n "${bool_full_backup}" ]
then
  # Copy website files (excluding cache directories), preserving directory structure with cp --parents.
  sudo find /www -type f -not -path '*/cache/*' -exec sudo cp --parents '{}' "${path_backup}" \;

  sudo find /etc/apache2/conf-available -type f -exec sudo cp --parents '{}' "${path_backup}" \;
  sudo find /etc/apache2/sites-available -type f -exec sudo cp --parents '{}' "${path_backup}" \;

  # Copy the Let's Encrypt SSL options file and the certificate archive (which includes private keys).
  sudo find /etc/letsencrypt/ -maxdepth 1 -name "options-ssl-apache.conf" -type f -exec sudo cp --parents '{}' "${path_backup}" \;
  sudo find /etc/letsencrypt/archive -type f -exec sudo cp --parents '{}' "${path_backup}" \;
fi

# Dump the selected database(s) using the credentials in .mysqldump.cnf (-c writes complete INSERT statements, -e uses extended INSERT syntax).
"${path_mysqldump}" --defaults-extra-file="${path_mysqldump_cnf}" ${db_name} --no-tablespaces -ce > "${path_backup}/mysqldump.sql"

touch "${HOME}/${filename_prefix_backup}.tar.gz"

chmod 600 "${HOME}/${filename_prefix_backup}.tar.gz"

sudo tar -czf "${HOME}/${filename_prefix_backup}.tar.gz" -C "${path_backup}" .

if [ -n "${bool_encrypt_backup}" ]
then
  touch "${HOME}/${filename_prefix_backup}_enc.tar.gz"

  chmod 600 "${HOME}/${filename_prefix_backup}_enc.tar.gz"

  # Encrypt the archive with AES-256-CBC using the key from .backup.cnf (-a produces base64 output).
  openssl enc -e -a -md sha512 -pbkdf2 -iter 100000 -salt -AES-256-CBC -pass "pass:${encryption_key}" -in "${HOME}/${filename_prefix_backup}.tar.gz" -out "${HOME}/${filename_prefix_backup}_enc.tar.gz"

  sudo rm -f "${HOME}/${filename_prefix_backup}.tar.gz"
  mv "${HOME}/${filename_prefix_backup}_enc.tar.gz" "${HOME}/${filename_prefix_backup}.tar.gz"
fi

mv "${HOME}/${filename_prefix_backup}.tar.gz" "${HOME}/${filename_prefix_backup}${filename_suffix_backup}.tar.gz"

# Look up the ID of the Google Drive folder matching the configured name, if it exists.
folder_id=`"${path_gdrive}" list -m 1000 --no-header -q "trashed = false and mimeType = 'application/vnd.google-apps.folder'" | grep -m 1 -w "${gdrive_backup_dirname}" | head -1 | cut -d ' ' -f1`

if [ -z "${folder_id}" ]
then
  "${path_gdrive}" mkdir "${gdrive_backup_dirname}"
  folder_id=`"${path_gdrive}" list -m 1000 --no-header -q "trashed = false and mimeType = 'application/vnd.google-apps.folder'" | grep -m 1 -w "${gdrive_backup_dirname}" | head -1 | cut -d ' ' -f1`
fi

"${path_gdrive}" upload "${HOME}/${filename_prefix_backup}${filename_suffix_backup}.tar.gz" --parent "${folder_id}"

sudo rm -rf "${path_backup}"
sudo rm -f "${HOME}/${filename_prefix_backup}"*.tar.gz

# List files in the backup folder sorted by filename (newest first) and collect the IDs of any files beyond the retention count.
expired_file_ids=`"${path_gdrive}" list -m 1000 --no-header -q "'${folder_id}' in parents and trashed = false and mimeType != 'application/vnd.google-apps.folder'" | sort -k 2r | tail -n +"${num_files_to_retain}" | cut -d ' ' -f1`

if [ -n "${expired_file_ids}" ]
then
  while read -r file_id; do
    "${path_gdrive}" delete "${file_id}"
  done <<< "${expired_file_ids}"
fi

echo -e "\nBackup complete"

Setting Up Google Drive CLI Client (gdrive)

The backup script depends on the freely available Google Drive CLI Client (gdrive) to communicate with Google Drive. There are several forks of the gdrive project on GitHub, but the version linked in this article has been updated to continue working after Google blocked the OAuth out-of-band (OOB) flow.

There are three high-level steps required to enable gdrive. The first step creates a new Google Cloud project and credentials to provide gdrive with access to the appropriate Google Drive location. The second step compiles gdrive with the OAuth credentials created in the first step. The third step authenticates gdrive as a third-party app with account access and creates the required authentication token.

Step 1 – Creating a Google Cloud Project and Credentials

Creating a Google Cloud Project

  1. Navigate to the Google Cloud Platform Console. Log in using the account associated with the Google Drive where the backup files will be stored.
  2. From the project drop-down list, create a new project.
Google Cloud Platform – Add New Project
  3. On the New Project screen, provide a unique Project name and select a Location. No organization is a valid Location selection.
Google Cloud Platform – New Project Creation
  4. Click the CREATE button. It may take several seconds for the project to be created.
Google Cloud Platform – Project Created Notification
  5. Once the project is created, verify that the project is selected and you are working in that specific project. If you are not working within that project, then select it from the project drop-down list.
Google Cloud Platform – Select Project

Enabling Google Drive API

  1. Navigate to the Google Drive API – Marketplace to enable the Google Drive API.
  2. Click the ENABLE button if it is not already enabled.
Google Drive API – Enabled

Creating an OAuth Consent Screen

Now that we have established a Google Cloud project, we can set up a corresponding OAuth consent screen.

  1. Navigate to the OAuth consent screen. Set the User Type field to External. Click the CREATE button.
Google Cloud Platform – OAuth Consent Screen
  2. The OAuth consent screen is displayed. In the App information section, complete the App name and User support email fields. The App name field does not need to match the project name.
Google Cloud Platform – OAuth Consent Screen – App Information
  3. All fields in the App domain and Authorized domains sections can remain blank.
Google Cloud Platform – OAuth Consent Screen – App Domain
  4. In the Developer contact information section, provide an email address. Click the SAVE AND CONTINUE button.
Google Cloud Platform – OAuth Consent Screen – Developer Contact Information
  5. The Scopes screen is displayed. All fields can remain blank. Click the SAVE AND CONTINUE button.
Google Cloud Platform – OAuth Consent Screen – Scopes
  6. The Test users screen is displayed. All fields can remain blank. Click the SAVE AND CONTINUE button.
Google Cloud Platform – OAuth Consent Screen – Test Users
  7. The Summary screen is displayed. Verify that the expected values are displayed. Click the BACK TO DASHBOARD button.
Google Cloud Platform – OAuth Consent Screen – Summary
  8. The dashboard displays a summary of the OAuth consent screen for the project. In the Publishing status section, the status is displayed as Testing. Click the PUBLISH APP button.
Google Cloud Platform – OAuth Consent Screen – Prepublish
  9. The Push to Production? screen is displayed. Click the CONFIRM button.
Google Cloud Platform – OAuth Consent Screen – Push to Production
  10. Returning to the dashboard screen, the Publishing status is now displayed as In production. A new Verification Status section is displayed with the status Verification not required.
Google Cloud Platform – OAuth Consent Screen – Published

Creating OAuth Client Credentials

  1. Navigate to the Credentials screen.
  2. Click CREATE CREDENTIALS.
Google Cloud Platform – Create Credentials
  3. Select OAuth client ID.
Google Cloud Platform – Create Credentials – OAuth Client ID
  4. The Create OAuth client ID screen is displayed. Set Application type to Desktop App and provide a Name for the OAuth 2.0 client. The name does not need to match any previously entered value.
  5. Click the CREATE button.
Google Cloud Platform – Create Credentials – OAuth Client Application Type
  6. The OAuth client created screen is displayed. Retrieve the Client ID and Client Secret values.
Google Cloud Platform – Create Credentials – OAuth Client Created

Step 2 – Compiling Google Drive CLI Client (gdrive)

The following instructions were tested on an OS Only Debian instance in AWS Lightsail. The commands may need to be adjusted for other Linux distributions or environments.

Downloading and Installing Go

  1. Navigate to Go Download and Install for Linux. Retrieve the URL to the latest stable version.
  2. Update and upgrade the server instance to ensure the latest packages are available.
sudo apt update && sudo apt upgrade -y
  3. Using cURL, retrieve the latest stable version of Go for Linux. Update the following command with the latest URL since version numbers change with each stable release. This guide uses version 1.19.3, which was the current stable version at the time of writing.
curl -OL https://golang.org/dl/go1.19.3.linux-amd64.tar.gz
  4. Verify the file authenticity and integrity by comparing the checksum from the following command with the SHA256 checksum listed on the official Go Downloads page.
sha256sum go1.19.3.linux-amd64.tar.gz

With this version of Go, the sha256sum command will output the following checksum.

74b9640724fd4e6bb0ed2a1bc44ae813a03f1e72a4c76253e2d5c015494430ba go1.19.3.linux-amd64.tar.gz
  5. Use the tar command to extract the Go archive to the /usr/local directory. Change the filename as needed.
sudo tar -C /usr/local -xvf go1.19.3.linux-amd64.tar.gz
  6. Execute go version to verify that the binary works as expected and the version matches the download.
/usr/local/go/bin/go version

The command displays the following output.

go version go1.19.3 linux/amd64
  7. Update the environment PATH to include Go by editing the user’s .profile file.
vim ~/.profile

Add the following line to the end of the .profile file. In vim, press i to edit the file, press Esc and enter :w to save the changes, and then enter :q to close the file.

export PATH=$PATH:/usr/local/go/bin
Go – Update Environment Path
  8. Returning to the command line, load the PATH changes by executing the source command.
source ~/.profile
  9. The Go binary is now executable without providing the full path. Verify the PATH changes work by executing the following command, which displays the Go version.
go version

Executing the command displays the expected output.

go version go1.19.3 linux/amd64
  10. Remove the downloaded Go archive as it is no longer required. Adjust the filename as needed.
rm go1.19.3.linux-amd64.tar.gz

Compiling the gdrive Binary

  1. Update and upgrade the server instance to ensure the latest packages are available.
sudo apt update && sudo apt upgrade -y
  2. Install the unzip utility if it is not already available on the instance.
sudo apt install unzip
  3. Download the Google Drive CLI Client (gdrive) source from GitHub.
sudo wget -c https://github.com/carstentrink/gdrive/archive/refs/heads/main.zip
  4. Unzip the gdrive package. Files are unzipped into the ~/gdrive-main/ directory.
unzip ~/main.zip
  5. Update the handlers_drive.go file with the OAuth Client ID (var clientId) and Client Secret (var clientSecret) values created earlier in the Google Cloud Platform project.
vim ~/gdrive-main/handlers_drive.go
Google Drive CLI Client – Update handlers_drive.go
  6. Change to the gdrive-main directory and compile gdrive by executing the following commands.
cd ~/gdrive-main
./compile
  7. Remove the downloaded gdrive source file as it is no longer required. Adjust the filename in the command as needed.
rm -f ~/main.zip
  8. Create a bin directory within the user’s home directory and move the compiled gdrive binary (gdrive_linux_amd64) into that directory.
mkdir ~/bin
mv ~/gdrive-main/bin/gdrive_linux_amd64 ~/bin/
  9. Remove the temporary files and directories created by the build process. Adjust or ignore these commands if these files and directories are otherwise used.
rm -rf ~/.cache
rm -rf ~/gdrive-main
sudo rm -rf ~/go
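
Before moving on, an optional sanity check is to confirm the compiled binary executes. The version subcommand follows the upstream gdrive CLI; if your fork behaves differently, running the binary without arguments should print its usage instead.

~/bin/gdrive_linux_amd64 version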

Step 3 – Installing and Authenticating Google Drive CLI Client (gdrive)

Congratulations on making it this far in the guide. Unfortunately, this is where it gets a little tricky if you are running a Linux instance that does not have access to a local browser. I’ll describe how I overcame that limitation after I cover the straightforward scenario where a browser is available on the localhost.

To grant gdrive access to your Google Drive, there is a web-based authentication process between the gdrive client application listening on a localhost port and Google Drive.

  1. From the command line, execute the gdrive binary gdrive_linux_amd64 with the about option. This command assumes the binary was moved to the ~/bin directory as part of the earlier steps.
~/bin/gdrive_linux_amd64 about

The command will display a message similar to the following.

Authentication needed
Go to the following url in your browser:
http://127.0.0.1:46729/authorize

Waiting for authentication response
Google Drive CLI Client – Authentication Needed
  2. Open a local browser and navigate to the specified address and port. The Google sign in screen is displayed. Verify that the app name provided in an earlier step for the gdrive client Google Cloud Project is displayed correctly. Sign in to the Google account where the gdrive client should have access to Google Drive.
Google Drive CLI Client – Google Sign In
  3. Google displays an intimidating red triangle with an exclamation mark. The warning states Google hasn’t verified this app. The lack of verification is already known because Google warned us as part of the OAuth consent screen creation. Click Advanced to continue.
Google Warning – Google Hasn’t Verified This App (Initial)
  4. An option to Go to gdrive (unsafe) is displayed. Click the link to continue.
Google Warning – Google Hasn’t Verified This App (Advanced)
  5. Another screen stating that gdrive is requesting access to your Google Account is displayed. Click Continue to proceed.
Google – App Wants Access to Your Google Account
  6. The authentication process is complete. The browser displays the following message.
Authentication response received. Check the terminal where gdrive is running
  7. Returning to the terminal, gdrive displays a message similar to the following indicating that it has access to Google Drive.
Authentication needed
Go to the following url in your browser:
http://127.0.0.1:46729/authorize

Waiting for authentication response
User:
Used:
Free:
Total:
Max upload size:

NOTE: As I mentioned earlier, authenticating gdrive with your Google Account is more complicated when a local browser is not available. I tried opening ports on the Debian instance, using an SSH tunnel from another machine with a browser, and even using cURL, but I could not get any of those options to work. Instead, I used a Windows machine to authenticate by installing Go for Windows, compiling the Google Drive CLI Client (gdrive) on the Windows machine, and authenticating on that same Windows machine using the gdrive Windows binary. If you need to follow this route, the authentication process creates a file named USERNAME_v2.json in the path C:\Users\<user>\AppData\Local\gdrive\. Once you have that file, simply copy the file or the contents to the Linux server in ~/.config/gdrive/USERNAME_v2.json. With the file in place on the Linux instance, executing the gdrive about command should display a confirmation that it has access to Google Drive.
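
For reference, a minimal sketch of that copy using scp (bundled with the Windows 10/11 OpenSSH client); the server address and user are placeholders for your own values. First create the target directory on the Linux server, then run the copy from the Windows machine (the remote path is relative to the user's home directory).

mkdir -p ~/.config/gdrive

scp "C:\Users\<user>\AppData\Local\gdrive\USERNAME_v2.json" admin@your-server.example.com:.config/gdrive/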

Scheduling the Backup Script

Now that the script and supporting files are set up, the following commands schedule the script in the user crontab. The example below runs a full encrypted backup twice per month, on the 1st and 15th at 1:00 AM, and a database-only encrypted backup every day at 2:00 AM. The full backup is uploaded to a Google Drive folder named backup_full-example-com and the five most recent backups are retained. The database-only backup is uploaded to a Google Drive folder named backup_db_only-example-com and the 30 most recent backups are retained. Since the script depends on other binaries, e.g. mysqldump and gdrive, make sure cron has access to the appropriate user path; if it doesn't, add a PATH entry to the crontab. Use echo $PATH to display the current user path and replace the example below with the appropriate path for your environment.

echo $PATH

crontab -e

PATH=$PATH:/home/admin/bin:/usr/bin

0 1 1,15 * * /home/admin/scripts/backup.sh -e -f -n 5 -d backup_full-example-com
0 2 * * * /home/admin/scripts/backup.sh -e -n 30 -d backup_db_only-example-com

crontab -l
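
To confirm the scheduled jobs actually ran, you can check the cron log. Depending on how logging is configured on your instance, one of the following commands should show recent cron activity.

grep CRON /var/log/syslog
journalctl -u cron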

Decrypting Backup Files

Let’s assume everything executes as expected and Google Drive is populated with backups from the script. Download a backup from Google Drive to your local machine. Since the file is encrypted, you won’t be able to open it directly. To decrypt the file, openssl is used with the -d flag and the same options that were used to encrypt it. Specify the filename of the downloaded encrypted file with the -in option and the filename of the resulting decrypted file with the -out option. When prompted, be sure to enter the same password used by the script to encrypt the file (found in the .backup.cnf file). If you are downloading/decrypting on a Windows machine, you will need to install OpenSSL.

openssl enc -d -a -md sha512 -pbkdf2 -iter 100000 -salt -AES-256-CBC -in "c:\backup_full-example-com_2022_12_15_01_00.tar.gz" -out "c:\decrypted_backup_full-example-com_2022_12_15_01_00.tar.gz"

The resulting decrypted file is now readable as a regular .tar.gz file.
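
From there, the archive can be extracted as usual on any machine with tar available, for example into a dedicated restore directory (the filename matches the example above).

mkdir restore
tar -xzf decrypted_backup_full-example-com_2022_12_15_01_00.tar.gz -C restore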
