Enable Browser Caching Directives in Amazon Lightsail Apache Server

By default in an Amazon Lightsail LAMP image, the Apache HTTP server is not configured with browser caching directives enabled. Specifically, the mod_expires Apache module is not loaded. If you’re not familiar with browser caching directives, this module controls the Expires HTTP header and the max-age directive of the Cache-Control HTTP header in server responses. These headers tell the browser how long it may cache a response. A mostly static resource that rarely changes should be given a longer caching period than a resource that may change hourly. Caching reduces bandwidth consumption and server load while improving the user experience, since pages load faster when the browser isn’t repeatedly requesting the same files.

Log in to the Amazon Lightsail server and edit the httpd.conf file. In this example, the httpd.conf file is located in /opt/bitnami/apache/conf/, so you may need to adjust the path depending on your image.

sudo vim /opt/bitnami/apache/conf/httpd.conf

Find the mod_expires.so entry in the file, remove the comment character (#), and save the file. Restart Apache afterward so the module is loaded (e.g. sudo /opt/bitnami/ctlscript.sh restart apache).

LoadModule expires_module modules/mod_expires.so
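If you prefer to script the change, a one-line sed can remove the comment marker. This is a sketch that assumes the Bitnami path used above and the default commented form of the line; adjust both for your image.

```shell
# Path assumed from this guide; adjust for your image.
CONF=/opt/bitnami/apache/conf/httpd.conf
# Strip the leading '#' from the mod_expires LoadModule line in place.
sudo sed -i 's|^#\(LoadModule expires_module modules/mod_expires.so\)|\1|' "$CONF"
```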

In the .htaccess file for your site, you can add various Expires directives for relevant file types. This example provides a small sample of common file types and can be expanded to meet your particular needs. Please adjust the expiry times as appropriate for your site. A more detailed walkthrough of .htaccess files can be found at Hardening WordPress Security Using .htaccess.

<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresDefault "access plus 1 month"
  ExpiresByType text/html "access plus 0 seconds"
  ExpiresByType image/gif "access plus 1 year"
  ExpiresByType image/png "access plus 1 year"
  ExpiresByType image/jpg "access plus 1 year"
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType text/css "access plus 1 month"
  <IfModule mod_headers.c>
    Header append Cache-Control "public"
  </IfModule>
</IfModule>

As a demonstration of the directive in action, PNG image files are set to cache for one year from the time the file is accessed. In the following image, a file accessed on June 24, 2021 (date) expires from the cache one year later on June 24, 2022 (expires). This information can be viewed in most browsers by opening the developer tools (Inspect) and activating the Network monitor.

Response Headers: cache-control

The browser also shows that the file was transferred from the cache and no additional bandwidth was consumed for that resource.
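The same headers can be checked from the command line with curl. The URL below is a placeholder; point it at any static asset on your own site.

```shell
# Fetch only the response headers (-I) and filter for the caching directives.
# The URL is a placeholder -- substitute a static asset from your site.
curl -sI https://www.example.com/wp-content/uploads/image.png \
  | grep -iE '^(cache-control|expires):'
```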

File Cache Transfer Status

Obtaining the AWS Certified Cloud Practitioner (CCP) Certification

The AWS Certified Cloud Practitioner certification is a foundational (entry-level) examination that demonstrates a basic understanding of cloud concepts specific to the Amazon Web Services (AWS) environment. As of December 2020, it’s a 90-minute, 65-question exam (multiple choice and multiple answer) that can be taken either in person at an exam center or proctored online. A passing score is a minimum of 700 out of 1000.

I didn’t feel this was a difficult exam, but it did require a decent amount of studying and memorization. Although I was familiar with general cloud concepts, I didn’t have specific knowledge of the AWS environment and its available products/services at the time I started studying. While the official exam guide recommends at least 6 months of experience with the AWS Cloud, I didn’t have any direct experience working in AWS. As with other exams, the amount of study time you will need to pass the exam will depend entirely on your background and prior experience. For reference, I spent approximately 25 hours studying.

Study Resources

The following are the study resources I found most valuable to pass the exam.

AWS Training

The AWS Training and Certification site has a substantial volume of free resources (videos, white papers, etc.) available to study for any of the AWS certifications.

AWS Cloud Practitioner Essentials (6 hours): Consists of mixed content study resources including video presentations, demonstrations, links to resources, and knowledge checks.

AWS White Papers (5 hours): There are many white papers available on the AWS Training and Certification site. The documents are fairly dense and I skimmed most of this list to get the important points. The white papers to focus on are:

  • Overview of Amazon Web Services
  • How AWS Pricing Works
  • Compare AWS Support Plans
  • AWS Well-Architected Framework
  • An Overview of the AWS Cloud Adoption Framework
  • AWS Migration Whitepaper


AWS Certified Cloud Practitioner Training 2020 (4 hours): This is a full course covering all topics necessary to pass the exam.

Tutorials Dojo

Tutorials Dojo Study Guide eBook – AWS Certified Cloud Practitioner (4 hours): This eBook condenses important points from other study materials into a single, more readable document.

AWS Certified Cloud Practitioner Practice Exams (6 hours): Practice exam questions that mimic the style and difficulty of the actual exam. The explanations of why an answer is correct/incorrect are very helpful to identify weak points in your studying. There are many options available to simulate timed exams or to focus on particular subject matter domains.

How to Migrate an Existing WordPress Site to Amazon Lightsail LAMP

When I first started www.dalesandro.net, I found a relatively inexpensive shared hosting provider. The site remained in that environment for years. I never had any significant technical issues with the provider and the cost had remained flat. Keep in mind that this is an extremely low traffic site running a straightforward WordPress installation.

After receiving a renewal notice from the shared hosting provider with a substantial price increase, I decided to research other available options. Around the same time, I obtained the AWS Certified Cloud Practitioner (AWS CCP) certification where I learned of an offering called Amazon Lightsail. At its core, Lightsail allows you to instantiate one or more virtual private servers – a virtual machine, storage, bandwidth, and a static IP – for a (somewhat) predictable monthly price.

Within Amazon Lightsail, there are a number of options to establish an instance, e.g. selecting the operating system, application stack, and server size, etc. While you can select a Linux/Unix instance with WordPress already installed, I chose the LAMP (PHP 7) blueprint because I wanted to migrate an existing WordPress installation on my own and avoid any potential issues with a pre-installed package.

Step 1 – AWS Basics

If you don’t already have an AWS account, the first step is to create an account. Navigate to AWS and follow the on-screen instructions. I won’t walk through this part since it is a straightforward process to create an account. Also, I suggest researching AWS pricing to better understand how actions or selections impact the monthly bill. While Lightsail instances are considered fixed-price per month, if products/services are activated or the bandwidth allotment is exceeded, then the monthly bill will fluctuate. Depending on the cost trigger, the pricing change could be substantial.

Step 2 – Migration Preparation

In my opinion, this is the most difficult and time-consuming part of the process. Since I have an existing site with a shared hosting provider, I need to move the WordPress files as well as the database. The shared host uses cPanel, so there is a capability to download a copy of both the account directory as well as a dump of any MySQL databases. These backups are available in “Account Backups” within cPanel.

In my case, the challenge is that the directory structure on my shared host doesn’t match the structure that I want to use in Lightsail. After downloading both the account directory and MySQL backups, I make copies of the files in order to have a clean backup and a working copy. I then restructure the files manually, i.e. moving directories and changing paths in the database data. There are plugins available for WordPress to handle this search-and-replace activity, however, I am not familiar with them and I am comfortable making the changes myself. Also, since the site is mostly static, I work through the changes manually and at my own pace without worrying much about losing comment submissions in the meantime.

As far as the directory structure changes, I am moving from having WordPress installed within a sub-directory of my account to having WordPress installed in a primary directory. I have to update all sub-directory path references in files such as wp-config.php and .htaccess as well as the WordPress database tables. For my site, the majority of this search-and-replace activity is in the MySQL dump. Multiple references to both the sub-directory and the underlying file path on the shared host are in the options table as well as various plug-in tables. I find many path references included in serialized data within the MySQL dump so I take a risk updating it manually under the assumption it could be re-serialized by simply re-saving the settings once the site was up-and-running.
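As a sketch of that search-and-replace, a sed pass over the dump handles plain-text path references; the old and new paths here are hypothetical. Keep in mind that serialized PHP data embeds string lengths, so any replacement that changes a path’s length inside serialized values still needs to be corrected by re-saving the affected settings, as noted above.

```shell
# Hypothetical paths -- substitute your shared host's path and the new DocumentRoot.
# This only fixes plain-text occurrences; serialized values need re-saving afterward.
sed -i 's|/home/olduser/public_html/blog|/www/example-com/public_html|g' wrdprss.sql
```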

Take time and extra care as you work through this process.

At the end of this manual process, I create a new archive containing all of the WordPress files called wordpress_migration.tar.gz. Consolidating all of the necessary files makes the upload in Step 11 easier/faster. The archive is structured so that all of the WordPress files are under a parent directory named migration/public_html. I store the revised MySQL dump file named wrdprss.sql from the existing site into the migration directory (not the migration/public_html) in the archive.


 migration/
 ├── wrdprss.sql  <-- MySQL dump file
 └── public_html/
     ├── index.php
     ├── license.txt
     ├── readme.html
     ├── wp-activate.php
     ├── wp-admin/
     ├── wp-blog-header.php
     ├── wp-comments-post.php
     ├── wp-config.php
     ├── wp-content/
     ├── wp-cron.php
     ├── wp-includes/    
     ├── wp-links-opml.php
     ├── wp-load.php
     ├── wp-login.php
     ├── wp-mail.php
     ├── wp-settings.php
     ├── wp-signup.php
     ├── wp-trackback.php
     └── xmlrpc.php
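The archive itself can be produced with tar, run from the directory that contains migration/; the layout above is assumed.

```shell
# Run from the directory containing migration/ so that paths inside
# the archive begin with migration/.
tar -czvf wordpress_migration.tar.gz migration
```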

Step 3 – Testing the Migration Files

For some of the available blueprints, Amazon Lightsail uses stack installers/images developed by Bitnami. I download the Bitnami LAMP Virtual Machine (VM) and run it locally using VirtualBox. This allows me to test my migration steps as well as my migration archive/files before incurring costs on AWS. Follow Step 7 through Step 15 as a local test prior to performing the migration on the production Lightsail instance.

Step 4 – Create an Amazon Lightsail Instance

Login to the AWS Management Console and search for Lightsail. Follow the on-screen instructions to create a new Lightsail instance. I instantiate a Lightsail instance using the LAMP blueprint (Linux / Apache / MySQL / PHP).

Step 5 – Create and Attach a Static IP

Follow these instructions to create a static IP and attach it to the Lightsail instance.

Step 6 – Use SSH to Connect to the Instance

Once the Lightsail instance is running, I need to connect to it in order to start performing the migration. A connection can be made directly through the browser using the “Connect to your instance” option (follow these instructions). The other option is to use PuTTY and a private key. I prefer PuTTY since it feels more stable than the browser-based connection. Follow these instructions to make a PuTTY-based connection to your instance.

Step 7 – Update and Upgrade the Instance

At the command prompt, I execute the following command to update and upgrade the instance packages/software. This is a maintenance step that should be performed regularly as this will apply any available patches. With Lightsail, you are responsible for the ongoing maintenance activities associated with your server.

sudo apt update && sudo apt upgrade

Step 8 – Create a New DocumentRoot Directory

The DocumentRoot directory is where the WordPress application files will eventually be stored and served. While the Bitnami installation of Apache defaults to /opt/bitnami/apache/htdocs, I prefer to create a separate directory structure outside of the server path. You may use the default path or change this folder structure to match your own requirements.

The mkdir -p option creates each parent/child directory in the specified path instead of executing individual mkdir commands for each sub-directory. Replace example-com as appropriate.

sudo mkdir -p /www/example-com/public_html

Step 9 – Create Apache vhost Files

Virtual host (vhost) files configure Apache to serve files from the new DocumentRoot. For this step, I’ve modified the instructions from the Bitnami “Create a custom PHP application” guide. Depending on which Bitnami image was used to create the Lightsail instance, the following command reports either Approach A or Approach B.

test ! -f "/opt/bitnami/common/bin/openssl" && echo "Approach A: Using system packages." || echo "Approach B: Self-contained installation."

In my example, Approach A is returned so the following command sequence is appropriate for that path. These commands create copies of the existing vhost sample files for both http and https. vim is then used to edit both files to reflect the new DocumentRoot from Step 8. Replace/rename example-com as appropriate.

cd /opt/bitnami/apache2/conf/vhosts
cp sample-vhost.conf.disabled example-com-vhost.conf.disabled
vim example-com-vhost.conf.disabled
mv example-com-vhost.conf.disabled example-com-vhost.conf

Modify the DocumentRoot and Directory sections to /www/example-com/public_html following the sample file format.
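For reference, a minimal http vhost along the lines of the Bitnami sample might look like the following. The exact directives vary by image, so treat this as a sketch rather than a drop-in file.

```apache
<VirtualHost 127.0.0.1:80 _default_:80>
  ServerAlias *
  DocumentRoot /www/example-com/public_html
  <Directory "/www/example-com/public_html">
    Options -Indexes +FollowSymLinks -MultiViews
    AllowOverride All
    Require all granted
  </Directory>
</VirtualHost>
```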

If you’re not familiar with editing in vim, please spend a few minutes learning the basic keyboard commands. The file opens in command mode. To make changes, press i to enter insert mode. After completing the revisions, press Escape to return to command mode. Enter :w to write (save) the file and :q to quit; :wq does both in one step.

Make the same changes to the https version of the .conf file.

cp sample-https-vhost.conf.disabled example-com-https-vhost.conf.disabled
vim example-com-https-vhost.conf.disabled
mv example-com-https-vhost.conf.disabled example-com-https-vhost.conf

Restart Apache.

sudo /opt/bitnami/ctlscript.sh restart apache

Step 10 – Enable and Start SSH (Optional)

This step may not be necessary. On the actual Lightsail instance, SSH is already enabled and running so these next commands are not needed. However, on my local VirtualBox instance of the Bitnami LAMP VM, SSH is not running.

To verify SSH is enabled and running, execute the following command. sshd is listed in the output if it is running.

ps aux | grep sshd

If it is enabled and running, execute the following command to find the port number SSH is listening on.

netstat -plant | grep :22

If SSH is not running, execute the following commands to enable and start SSH. These commands follow the Bitnami “Enable or Disable the SSH Server” guide.

sudo rm -f /etc/ssh/sshd_not_to_be_run
sudo systemctl enable ssh
sudo systemctl start ssh

Step 11 – Use PuTTY SFTP to Upload Migration Archive

I use PuTTY PSFTP to transfer the wordpress_migration.tar.gz file (created in Step 2) from my local machine to the Lightsail instance. Replace <static ip> with the actual static IP address and replace the path and filename as appropriate. This sequence of commands transfers the file to the Lightsail user account’s home directory on the instance.

open <static ip>
put c:\local\path\to\wordpress_migration.tar.gz

Step 12 – Stop and Disable SSH (Optional)

Like Step 10, this step may not be necessary. If SSH was not running in Step 10 and you want to revert to that state, i.e. stop and disable SSH, then execute the following commands. Otherwise, skip this step.

sudo systemctl stop ssh
sudo systemctl disable ssh

Step 13 – Unpack the WordPress Files

In this step, the archive file containing all of the WordPress files from the existing site is unpacked, and its contents are moved into the DocumentRoot defined in the vhost files. Recall from Step 2 that the parent directory migration/public_html is used in the archive structure. Since public_html is also used in the DocumentRoot directory structure, this makes the file movement easier. Adjust the following sequence of commands to match the directory structure defined in the migration file and DocumentRoot.

These commands unpack the migration archive into the home directory. In my example, this creates a folder structure of ~/migration/public_html containing all of the WordPress files from my existing site. Once the files are unpacked, the contents of ~/migration/public_html are moved to /www/example-com/public_html. Then the appropriate permissions are set to give Apache access to the directories and files and wp-config.php is secured. Apache is running as daemon in this example, but it may vary from instance to instance.

cd ~
tar -xzvf wordpress_migration.tar.gz
cd migration
sudo mv ./public_html /www/example-com
sudo chown -R daemon:daemon /www/example-com/public_html
sudo find /www/example-com/public_html -type d -exec chmod 755 {} \;
sudo find /www/example-com/public_html -type f -exec chmod 644 {} \;
sudo chmod 600 /www/example-com/public_html/wp-config.php
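A quick way to confirm the ownership and mode changes took effect is stat; the path assumes the DocumentRoot used above.

```shell
# Print mode, owner, and name: expect 600 for wp-config.php,
# 644 for other files, and 755 for directories.
stat -c '%a %U %n' /www/example-com/public_html/wp-config.php
```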

Step 14 – Create a New Database and Import Existing SQL Data

If the suggested directory structure from Step 2 was used, then the MySQL export file named wrdprss.sql from the existing site will be located in the migration directory when the archive is unpacked. The following sequence of commands creates a new database and imports the existing data. This sequence also assumes WordPress is not using root to access the database. Please make sure to use the user/password from the existing site to re-create the MySQL user/service account to access the database.

If a brand new user/password is created in this step, then wp-config.php must be updated to reflect the new user/service account. After the site is up-and-running, the service account password should be changed in case it was compromised in transit during the migration.
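For reference, these are the wp-config.php settings that must agree with the account created below. The values shown are the example names used in this guide, not real credentials.

```php
// Values must match the database and user created in this step.
define( 'DB_NAME', 'db_example_com_wp_prd' );
define( 'DB_USER', 'u_example_com_wp_prd' );
define( 'DB_PASSWORD', '<password>' );   // placeholder -- use the real password
define( 'DB_HOST', 'localhost' );
```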

The output from the command “cat ~/bitnami_credentials” is the root password. Use the password when prompted after executing “mysql -u root -p”. Again, adjust the sequence to reflect the database name, user, password, and MySQL dump filename and path.

cat ~/bitnami_credentials
mysql -u root -p
CREATE DATABASE db_example_com_wp_prd;
CREATE USER 'u_example_com_wp_prd'@'localhost' IDENTIFIED BY '<password>';
GRANT ALL PRIVILEGES ON db_example_com_wp_prd.* TO 'u_example_com_wp_prd'@'localhost';
USE db_example_com_wp_prd;
SOURCE ~/migration/wrdprss.sql

Step 15 – Access the Site Using Static IP Address

Navigate to the site using the AWS static IP address to verify the site is accessible and the migration was successful. The browser may display a warning if the site is redirected to https since new certificates have not yet been issued for this domain/IP combination.

Step 16 – Update DNS A Resource Record to AWS Static IP

Update the A resource record for the domain to use the AWS static IP. The DNS update needs to propagate prior to issuing new certificates in Step 17. Instructions to update the resource record are specific to your DNS provider.

Step 17 – Install SSL Certificates

If the site allows/forces https, then use the bncert-tool to create certificates through Let’s Encrypt. Execute the command and follow the on-screen instructions.

sudo /opt/bitnami/bncert-tool

Step 18 – Access the Site Using Domain Name

Navigate to the site using the domain name (instead of the AWS static IP address) to verify the site is up-and-running as expected.

Step 19 – Cleanup

If everything is working as expected, the final step is to remove the migration files from the home directory. This sequence deletes the migration archive as well as the unpacked migration directory.

cd ~
rm -f wordpress_migration.tar.gz
rm -rf migration

Further Reading

At this point, I hope your site has been migrated successfully to Amazon Lightsail. While the above steps will get a site up and running, they do not cover other aspects of managing your own server, such as ongoing maintenance and security.

For further reading, please consider:

Hardening WordPress Security Using .htaccess

Enable Browser Caching Directives in Amazon Lightsail Apache Server

Automate Amazon Lightsail Maintenance Activities Using Bash Script

Bash Script to Automatically Backup Amazon Lightsail WordPress Server to Google Drive