Bash Script to Automatically Backup Amazon Lightsail WordPress Server to Google Drive

The intent of this Bash script is to back up the WordPress installation directory, the MySQL database, and a handful of Apache/PHP/MySQL configuration files to Google Drive on a regular basis in an encrypted archive. While this script references paths/files specific to a Bitnami-based Amazon Lightsail instance, it can be easily adapted to any other Linux-based environment. I fully understand that AWS has an automatic snapshot/backup capability, but I want to use Google Drive and I don’t want to incur any additional costs since AWS snapshots are not free.


If no options are specified when executing this script, it will back up all MySQL databases accessible to the user specified in the .mysqldump.cnf file. No other files will be included in the backup. All files are packaged in a tar.gz file and encrypted using AES-256.

  • -f (full backup) includes both the MySQL dump and the other files specified in the script.
  • -e is used to encrypt the backup archive. If this flag is set, then a password must be supplied in the .backup.cnf file.
  • -n x sets the number of files to retain on Google Drive in the upload directory (x is an integer value). The default is 30 files.
  • -d sets the parent folder where the backup file is uploaded on Google Drive. If the folder doesn’t exist, then the script will create it. The default folder name is “backup”.
  • -p sets the filename prefix of the backup file. The script will use the prefix followed by the date/time to set the full filename. The default filename prefix is “backup”.
  • -m is used to specify the database name to backup. The default is all databases.


This example uses all of the available options and creates a full, encrypted backup of both MySQL and the other files specified in the script. The backup file is named backup-example-com (with a date/time suffix) and uploaded to a folder named backup-full-example-com on Google Drive. If the specified folder contains more than 5 files, the script deletes the oldest files beyond the 5 most recent, based on a filename sort.

./ -e -f -n 5 -d backup-full-example-com -p backup-example-com -m db_example_com_wp_prd


As a personal preference, I store user scripts and associated configuration files in a subdirectory under home named “scripts”.

The following command creates the “scripts” subdirectory.

mkdir -m 700 ~/scripts

Next, create a file named .mysqldump.cnf. This file will be used by the mysqldump command in the backup script to pass the MySQL database user and password to dump the WordPress database.

touch ~/scripts/.mysqldump.cnf
chmod 600 ~/scripts/.mysqldump.cnf
vim ~/scripts/.mysqldump.cnf

In the vim editor, add a database username and password with access to dump all of the required tables and data for a backup. Single quotes surrounding the password are needed if it contains any special characters.
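For reference, a minimal .mysqldump.cnf might look like the following. The username and password shown here are placeholders; substitute your own credentials.

```
[mysqldump]
user=wp_backup_user
password='s3cret!pass'
```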


The following commands create the .backup.cnf file which contains a secret key used by the script to encrypt the backup file.

touch ~/scripts/.backup.cnf
chmod 600 ~/scripts/.backup.cnf
vim ~/scripts/.backup.cnf

In the vim editor, add a secret key/password to the file.
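The script looks this value up by the key name pass, so the file needs a single pass entry. The value below is a placeholder; a long random string can be generated with a command such as openssl rand -base64 32.

```
pass=replace-with-a-long-random-secret
```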


Now we’ll create the actual backup script. Please be sure to modify any referenced paths and constants to your own environment.

touch ~/scripts/
chmod 700 ~/scripts/
vim ~/scripts/

The following is the full Bash script. Please modify any constants/paths for your own environment and requirements.


#!/bin/bash

# Read the value for a given key from a key=value config file.
getConfigVal() {
  grep "${1}" "${2}" | cut -d '=' -f 2
}

# Defaults; overridden by the command line options below.
bool_encrypt_backup=""
bool_full_backup=""
num_files_to_retain=30
gdrive_backup_dirname="backup"
filename_prefix_backup="backup"
db_name="--all-databases"
encryption_key=""

filename_suffix_backup="_$(date +'%Y_%m_%d_%H_%M')"
path_script="$(dirname "$(readlink -f "$0")")"
path_mysqldump="$(which mysqldump)"
path_gdrive="$(which drive)"
path_backup_cnf="${path_script}/.backup.cnf"
path_mysqldump_cnf="${path_script}/.mysqldump.cnf"
path_backup="${path_script}/backup"

if [ -z "${path_script}" ] || [ -z "${path_mysqldump}" ] || [ -z "${path_gdrive}" ] || [ -z "${path_backup_cnf}" ] || [ -z "${path_mysqldump_cnf}" ] || [ -z "${path_backup}" ]; then
  echo "ERROR: One or more required path variable(s) are undefined or missing."
  exit 1
fi

while getopts efn:d:p:m: flag; do
  case "${flag}" in
    e)
      bool_encrypt_backup=1
      encryption_key=$(getConfigVal "pass" "${path_backup_cnf}")

      if [ -z "${encryption_key}" ]; then
        echo "ERROR: Encryption key not found."
        exit 1
      fi
      ;;
    f)
      bool_full_backup=1
      ;;
    n)
      if ! [[ "${OPTARG}" =~ ^[0-9]+$ ]]; then
        echo "WARNING: Number of backups to retain must be an integer. Reverting to default value (${num_files_to_retain})."
      else
        num_files_to_retain="${OPTARG}"
      fi
      ;;
    d)
      gdrive_backup_dirname="${OPTARG}"
      ;;
    p)
      filename_prefix_backup="${OPTARG}"
      ;;
    m)
      db_name="--databases ${OPTARG}"
      ;;
  esac
done


# Clean up any artifacts left over from prior runs.
sudo rm -rf "${path_backup}"
sudo rm -f "${HOME}/${filename_prefix_backup}"*.tar.gz

mkdir "${path_backup}"

# For a full backup (-f), stage the WordPress files (excluding caches) and
# the Apache/PHP/MySQL configuration files. Adjust paths for your environment.
if [ -n "${bool_full_backup}" ]; then
  find /www -type d -not -path '*/cache/*' -exec mkdir -p "${path_backup}"/{} \;
  find /www -type f -not -path '*/cache/*' -exec sudo cp '{}' "${path_backup}"/{} \;
  cp /opt/bitnami/apache2/conf/vhosts/example-com-https-vhost.conf "${path_backup}"
  cp /opt/bitnami/apache2/conf/vhosts/example-com-vhost.conf "${path_backup}"
  cp /opt/bitnami/apache/conf/httpd.conf "${path_backup}"
  cp /opt/bitnami/apache/conf/bitnami/bitnami-ssl.conf "${path_backup}"
  cp /opt/bitnami/php/etc/php.ini "${path_backup}"
  cp /opt/bitnami/mysql/conf/my.cnf "${path_backup}"
fi

# Dump the database(s) using the credentials in .mysqldump.cnf.
"${path_mysqldump}" --defaults-extra-file="${path_mysqldump_cnf}" ${db_name} --no-tablespaces -ce > "${path_backup}/mysqldump.sql"

touch "${HOME}/${filename_prefix_backup}.tar.gz"

chmod 600 "${HOME}/${filename_prefix_backup}.tar.gz"

sudo tar -czf "${HOME}/${filename_prefix_backup}.tar.gz" -C "${path_backup}" .

# Encrypt the archive (-e) with AES-256-CBC using the key from .backup.cnf.
if [ -n "${bool_encrypt_backup}" ]; then
  touch "${HOME}/${filename_prefix_backup}_enc.tar.gz"

  chmod 600 "${HOME}/${filename_prefix_backup}_enc.tar.gz"

  openssl enc -e -a -md sha512 -pbkdf2 -iter 100000 -salt -aes-256-cbc -pass "pass:${encryption_key}" -in "${HOME}/${filename_prefix_backup}.tar.gz" -out "${HOME}/${filename_prefix_backup}_enc.tar.gz"

  sudo rm "${HOME}/${filename_prefix_backup}.tar.gz"
  mv "${HOME}/${filename_prefix_backup}_enc.tar.gz" "${HOME}/${filename_prefix_backup}.tar.gz"
fi

mv "${HOME}/${filename_prefix_backup}.tar.gz" "${HOME}/${filename_prefix_backup}${filename_suffix_backup}.tar.gz"

# Look up the Google Drive folder ID; create the folder if it doesn't exist.
folder_id="$("${path_gdrive}" list -m 1000 -n -q 'trashed = false' | grep -m 1 "${gdrive_backup_dirname}" | head -1 | cut -d ' ' -f1)"

if [ -z "${folder_id}" ]; then
  "${path_gdrive}" folder -t "${gdrive_backup_dirname}"
  folder_id="$("${path_gdrive}" list -m 1000 -n -q 'trashed = false' | grep -m 1 "${gdrive_backup_dirname}" | head -1 | cut -d ' ' -f1)"
fi

"${path_gdrive}" upload --file "${HOME}/${filename_prefix_backup}${filename_suffix_backup}.tar.gz" --parent "${folder_id}"

sudo rm -rf "${path_backup}"
sudo rm "${HOME}/${filename_prefix_backup}"*.tar.gz

# Sort the folder contents by filename (newest first) and collect the IDs of
# files beyond the retention count. To keep N files, start at line N+1.
expired_file_ids="$("${path_gdrive}" list -m 1000 -n -q "'${folder_id}' in parents and trashed = false and mimeType != 'application/vnd.google-apps.folder'" | sort -k 2r | tail -n +"$((num_files_to_retain + 1))" | cut -d ' ' -f1)"

if [ -n "${expired_file_ids}" ]; then
  while read -r file_id; do
    "${path_gdrive}" delete --id "${file_id}"
  done <<< "${expired_file_ids}"
fi
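The retention logic (sort newest-first by filename, keep the first N, delete the rest) can be sanity-checked locally with sample "id filename" lines like those produced by the drive list command. Note that to retain N files, tail needs to start at line N+1; here N is 2:

```shell
# Three sample files; the sort puts the newest filename first, tail skips the
# two most recent, and cut emits the IDs of the expired files.
printf 'id1 backup_2021_01_01\nid2 backup_2021_02_01\nid3 backup_2021_03_01\n' \
  | sort -k 2r \
  | tail -n +3 \
  | cut -d ' ' -f1    # prints: id1
```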

In order for the script to communicate with Google Drive, it has a dependency on a freely available binary on GitHub. The following commands download the binary to a new subdirectory under home named “bin”.

cd ~
mkdir -m 700 ~/bin
cd ~/bin
wget -O drive
chmod 500 ~/bin/drive

Execute the binary once it is downloaded and follow the directions to grant it permission to Google Drive.


Go to the following link in your browser:

Enter verification code:


Now that the script and supporting files are set up, the following commands schedule the script in the user crontab. The example below runs a full encrypted backup once per month, on the 1st of the month at 1 AM, and a database-only encrypted backup every day at 2 AM. The full backup is uploaded to a Google Drive folder named “backup_full-example-com” and the five most recent backups are retained. The database-only backup is uploaded to a Google Drive folder named “backup_db_only-example-com” and the thirty most recent backups are retained. Since the script depends on other binaries, e.g. mysqldump and drive, please make sure cron has access to the appropriate user path. If it doesn’t have access to the appropriate path, then add it to the crontab. Use the command echo $PATH to display the current user path and replace the example below with the appropriate path for the environment.

echo $PATH

crontab -e


0 1 1 * * /home/bitnami/scripts/ -e -f -n 5 -d backup_full-example-com
0 2 * * * /home/bitnami/scripts/ -e -n 30 -d backup_db_only-example-com
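If cron’s default PATH is missing the directories where mysqldump and drive live, a PATH line can be added at the top of the crontab. The value below is only an example; use the actual output of echo $PATH for your environment.

```
PATH=/home/bitnami/bin:/usr/local/bin:/usr/bin:/bin
```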

crontab -l


Let’s assume everything is executing as expected and Google Drive is populated with backups from the script. Download a backup from Google Drive to your local machine. Since the file is encrypted, you won’t be able to open it directly. In order to decrypt the file, OpenSSL is used with the -d flag and the same options that were used to encrypt it. Specify the filename of the downloaded encrypted file with the -in option and the filename of the resulting decrypted file with the -out option. When prompted, please be sure to enter the same password used by the script to encrypt the file (found in the .backup.cnf file). If you are downloading/decrypting on a Windows machine, you will need to install OpenSSL.

openssl enc -d -a -md sha512 -pbkdf2 -iter 100000 -salt -aes-256-cbc -in "c:\backup_full-example-com_2021_03_23_01_00.tar.gz" -out "c:\decrypted_backup_full-example-com_2021_03_23_01_00.tar.gz"

The resulting decrypted file is now readable as a regular tar.gz file.
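As a quick way to confirm that the encryption and decryption options line up, the same OpenSSL parameters can be exercised end-to-end with a throwaway key and file. The filenames and the key here are arbitrary, not the ones used by the script:

```shell
# Create a small archive, encrypt it, decrypt it, and read the contents back.
echo "hello" > demo.txt
tar -czf demo.tar.gz demo.txt
openssl enc -e -a -md sha512 -pbkdf2 -iter 100000 -salt -aes-256-cbc \
  -pass pass:testkey -in demo.tar.gz -out demo_enc.tar.gz
openssl enc -d -a -md sha512 -pbkdf2 -iter 100000 -aes-256-cbc \
  -pass pass:testkey -in demo_enc.tar.gz -out demo_dec.tar.gz
tar -xzf demo_dec.tar.gz -O    # prints: hello
```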

Automate Amazon Lightsail Maintenance Activities Using Bash Script

With Amazon Lightsail, like most other virtual private servers (VPS), you are responsible for performing server maintenance activities, e.g. applying patches, once the instance is running. While it’s easy to perform this manually based on a calendar reminder, it’s also easy to forget to do it periodically. When added to the user crontab, the following bash script automatically performs OS patches on a schedule that you define. Please be aware that most Lightsail images are Bitnami-based and this method will not apply upgrades/patches to their packaged applications. Updating Apache, for example, requires moving to a new instance with the latest Bitnami image.

Since this example assumes a Lightsail environment based on a Bitnami stack (Debian), user and path specifics may need to change for your specific environment.

As a personal preference, I write user scripts to a “scripts” directory in my home directory. This first step creates a new “scripts” directory in my home directory with 700 permissions (read, write, execute for the owner). Then, I create a blank file to contain the script. I prefer to use vim to edit files, but please feel free to use your preferred editor.

mkdir -m 700 ~/scripts

touch ~/scripts/
chmod 700 ~/scripts/
vim ~/scripts/

The next step is to add the script to the blank file. The script performs only a few actions: “apt update” retrieves the latest list of available packages and versions, but it does not upgrade any packages; “apt upgrade” upgrades existing/installed packages to their latest versions. Finally, “apt autoclean” removes package files from the local repository that have been uninstalled or are no longer available for download.



sudo apt update && sudo apt upgrade -y
sudo apt autoclean

The last step is to schedule the job using cron. The next command allows you to edit the user crontab.

crontab -e

As an example, add the following line to the user crontab to schedule the script to run at 3:00 AM every Sunday.

0 3 * * 0 /home/bitnami/scripts/
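Because cron silently discards output by default, it can be helpful to redirect the script’s output to a log file. The log path below is just an example; adjust it for your environment.

```
0 3 * * 0 /home/bitnami/scripts/ >> /home/bitnami/scripts/maintenance.log 2>&1
```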

Exit the editor and your maintenance activities will be performed automatically as scheduled.

Obtaining the AWS Certified Cloud Practitioner (CCP) Certification

The AWS Certified Cloud Practitioner certification is a foundational (entry-level) examination to demonstrate a basic understanding of cloud concepts specific to the Amazon Web Services (AWS) environment. As of December 2020, it’s a 90 minute, 65 question exam (multiple choice, multiple answer) that can either be taken in-person at an exam center or proctored online. A passing score is a minimum of 700 out of 1000.

I didn’t feel this was a difficult exam, but it did require a decent amount of studying and memorization. Although I was familiar with general cloud concepts, I didn’t have specific knowledge of the AWS environment and its available products/services at the time I started studying. While the official exam guide recommends at least 6 months of experience with the AWS Cloud, I didn’t have any direct experience working in AWS. As with other exams, the amount of study time you will need to pass the exam will depend entirely on your background and prior experience. For reference, I spent approximately 25 hours studying.

Study Resources

The following are the study resources I found most valuable to pass the exam.

AWS Training

The AWS Training and Certification site has a substantial volume of free resources (videos, white papers, etc.) available to study for any of the AWS certifications.

AWS Cloud Practitioner Essentials (6 hours): Consists of mixed content study resources including video presentations, demonstrations, links to resources, and knowledge checks.

AWS White Papers (5 hours): There are many white papers available on the AWS Training and Certification site. The documents are fairly dense and I skimmed most of this list to get the important points. The white papers to focus on are:

  • Overview of Amazon Web Services
  • How AWS Pricing Works
  • Compare AWS Support Plans
  • AWS Well-Architected Framework
  • An Overview of the AWS Cloud Adoption Framework
  • AWS Migration Whitepaper


AWS Certified Cloud Practitioner Training 2020 (4 hours): This is a full course covering all topics necessary to pass the exam.

Tutorials Dojo

Tutorials Dojo Study Guide eBook – AWS Certified Cloud Practitioner (4 hours): This eBook condenses important points from other study materials into a single, more readable document.

AWS Certified Cloud Practitioner Practice Exams (6 hours): Practice exam questions that mimic the style and difficulty of the actual exam. The explanations of why an answer is correct/incorrect are very helpful to identify weak points in your studying. There are many options available to simulate timed exams or to focus on particular subject matter domains.