Backing up your website to a cloud storage service is an ideal solution today, mainly because it can be completely free. Google Drive, for example, gives you 15 GB of free storage.
For WordPress websites, you can use the UpdraftPlus plugin to back up to Google Drive. If you are running a VPS, you also have the option of backing up the whole VPS to Google Drive.
The solution here is to use Rclone.
Rclone is a tool for synchronizing data between servers and cloud storage services. It supports many providers: Amazon S3, Amazon Drive, Google Drive, Google Cloud Storage, OpenStack Swift / Rackspace Cloud Files / Memset Memstore, Dropbox, Microsoft OneDrive, and more.
In this article, I will show you how to use Rclone to back up a VPS set up with EasyEngine. If that does not match your setup, you can find plenty of other Rclone tutorials via Google. This article targets a VPS running Ubuntu with EasyEngine, but the commands also work on CentOS.
Step 1: Install Rclone
Here I will download the latest version of Rclone at the time of writing (v1.37), then extract it and copy the binary to the /usr/sbin/ directory.
For 64-bit Ubuntu, run the following commands:
```shell
cd /root/
wget http://downloads.rclone.org/rclone-v1.37-linux-amd64.zip
unzip rclone-v1.37-linux-amd64.zip
cp rclone-v*-linux-amd64/rclone /usr/sbin/
rm -rf rclone-*
```
For 32-bit Ubuntu, run the following commands:
```shell
cd /root/
wget http://downloads.rclone.org/rclone-v1.37-linux-386.zip
unzip rclone-v1.37-linux-386.zip
cp rclone-v*-linux-386/rclone /usr/sbin/
rm -rf rclone-*
```
You can now use the rclone command in the terminal. See the official Rclone documentation for the full list of commands.
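As a side note, the two install variants above differ only in the architecture suffix of the download. If you provision more than one machine, you can derive the suffix from uname -m. A minimal sketch (the rclone_arch helper name is my own, and only the two architectures covered above are handled):

```shell
# Map a kernel architecture string (as reported by uname -m) to
# Rclone's download suffix. Hedged sketch; extend the cases as needed.
rclone_arch() {
  case "$1" in
    x86_64) echo "amd64" ;;
    i386|i586|i686) echo "386" ;;
    *) echo "unknown"; return 1 ;;
  esac
}

rclone_arch x86_64   # prints: amd64
```

You could then plug the result into the download URL, e.g. rclone-v1.37-linux-$(rclone_arch "$(uname -m)").zip.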
Step 2: Backup VPS to Google Drive
Connect to Rclone with Google Drive
You only need to do this step once; on subsequent backups you can skip it.
Run the following command:
```shell
rclone config
```
You will get the message no remotes found. Type n to create a new remote:

Next, enter a name for the remote. You will use this name to refer to the Google Drive connection in the backup script below.

Next, choose the storage service you want. Type 8 to select Google Drive, then press Enter.

Leave Client Id and Client Secret blank by pressing Enter at each prompt.
At the Use auto config? prompt, type n and press Enter.

Now copy the long authorization URL shown in the terminal and paste it into your browser.

In the browser, click the Allow button to allow Rclone to access your Google Drive account:

Google will generate a code. Copy it, paste it into the terminal, and press Enter.
If asked Configure this as a team drive?, type n and press Enter. Then type y to confirm that everything is OK.

Finally type q to exit Rclone’s config:

Now test it with the following command (replace remote_name with the remote name you created when connecting to Google Drive):

```shell
rclone lsd remote_name:
```

If successful, the command will list the folders in your Google Drive.
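If you later script your backups, it can be handy to verify that the remote exists before running anything. Rclone v1.37 keeps remotes in an INI-style config file (~/.rclone.conf). The has_remote helper below is my own sketch, demonstrated against a sample file rather than a real config:

```shell
# Check whether a named remote appears as a section header in an
# rclone-style config file. Sketch only; verify the path on your system.
has_remote() {
  grep -q "^\[$1\]$" "$2"
}

# Demo against a throwaway sample config:
printf '[a2hostingvps]\ntype = drive\n' > /tmp/sample.conf
has_remote a2hostingvps /tmp/sample.conf && echo "remote found"
rm -f /tmp/sample.conf
```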
Script to backup VPS and upload to Google Drive
Create a script file named backup.sh in the /root/ directory:
```shell
nano /root/backup.sh
```
If nano is not installed, use the following command to install it: apt-get install nano
Copy the entire script below into the backup.sh file. A few things you need to change:
SERVER_NAME: the name of the folder on Google Drive that will hold the backups.
MYSQL_PASSWORD: replace it with your own MySQL root password. On EasyEngine you can look it up with: cat /etc/mysql/conf.d/my.cnf
a2hostingvps:$SERVER_NAME: replace a2hostingvps with the name of the remote you created in the Google Drive connection step.
Note: the script uses the zip command for compression. Install zip before running the script: apt-get install zip
```shell
#!/bin/bash
SERVER_NAME=A2HOSTING_BACKUP
TIMESTAMP=$(date +"%F")
BACKUP_DIR="/root/backup/$TIMESTAMP"
MYSQL_USER="root"
MYSQL_PASSWORD="ZjaJhbBI"
MYSQL=/usr/bin/mysql
MYSQLDUMP=/usr/bin/mysqldump
SECONDS=0

mkdir -p "$BACKUP_DIR/mysql"

databases=`$MYSQL --user=$MYSQL_USER -p$MYSQL_PASSWORD -e "SHOW DATABASES;" | grep -Ev "(Database|information_schema|performance_schema|mysql)"`

echo "Starting Backup Database";
for db in $databases; do
  $MYSQLDUMP --force --opt --user=$MYSQL_USER -p$MYSQL_PASSWORD --databases $db | gzip > "$BACKUP_DIR/mysql/$db.sql.gz"
done
echo "Finished";
echo "";

echo "Starting Backup Website";
# Loop through the web root
for D in /var/www/*; do
  if [ -d "${D}" ]; then # If a directory
    domain=${D##*/} # Domain name
    echo "- "$domain;
    zip -r $BACKUP_DIR/$domain.zip /var/www/$domain/htdocs/ -q -x /var/www/$domain/htdocs/wp-content/cache/**\* # Exclude cache
  fi
done
echo "Finished";
echo "";

echo "Starting Backup Nginx Configuration";
cp -r /etc/nginx/ $BACKUP_DIR/nginx/
echo "Finished";
echo "";

size=$(du -sh $BACKUP_DIR | awk '{ print $1}')

echo "Starting Uploading Backup";
/usr/sbin/rclone --transfers=1 move $BACKUP_DIR "a2hostingvps:$SERVER_NAME/$TIMESTAMP" >> /var/log/rclone.log 2>&1
# Clean up
rm -rf $BACKUP_DIR
/usr/sbin/rclone -q --min-age 2d delete "a2hostingvps:$SERVER_NAME" # Remove all backups older than 2 days
/usr/sbin/rclone -q --min-age 2d rmdirs "a2hostingvps:$SERVER_NAME" # Remove all empty folders older than 2 days
echo "Finished";
echo "";

duration=$SECONDS
echo "Total $size, $(($duration / 60)) minutes and $(($duration % 60)) seconds elapsed."
```
One point to note in the rclone move command is the extra --transfers=1 parameter. It tells rclone to move files one at a time; without it, rclone moves several files concurrently. My VPS has only 512 MB of RAM, so moving many large files at once caused the rclone process to be killed midway.
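One more thing you might tidy up: MYSQL_PASSWORD is hard-coded in the script. If you prefer, you can pull it out of EasyEngine's my.cnf at run time. The mycnf_password helper below is my own sketch, demonstrated against a sample file; double-check the key name in your real /etc/mysql/conf.d/my.cnf before relying on it:

```shell
# Extract the password value from a my.cnf-style file.
# Hedged sketch; assumes a line of the form: password = VALUE
mycnf_password() {
  grep -m1 '^password' "$1" | sed 's/^password[ =]*//' | tr -d '"'
}

# Demo against a throwaway sample file:
printf '[client]\nuser = root\npassword = ZjaJhbBI\n' > /tmp/my.cnf.sample
mycnf_password /tmp/my.cnf.sample   # prints: ZjaJhbBI
rm -f /tmp/my.cnf.sample
```

In the script you would then write MYSQL_PASSWORD=$(mycnf_password /etc/mysql/conf.d/my.cnf) instead of pasting the password in.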
Grant execute permission to the backup.sh file:
```shell
chmod +x /root/backup.sh
```
Now just run the following command to perform a backup:
```shell
/root/backup.sh
```
To check whether the backup really succeeded, open Google Drive and verify that the files are there.
Create cron job to automatically run backup
Run the following command to open the cron job:
```shell
EDITOR=nano crontab -e
```
Paste the following line into the crontab:
```shell
0 2 * * * /root/backup.sh > /dev/null 2>&1
```
This line means: every day at 2 AM, the VPS will be backed up and the backup saved to Google Drive.
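If daily at 2 AM doesn't suit you, the same five-field pattern (minute, hour, day of month, month, day of week) covers other schedules. A few hedged examples, adjust to taste:

```shell
# crontab schedule examples for the same backup script
0 2 * * *    /root/backup.sh > /dev/null 2>&1   # every day at 02:00
0 2 * * 0    /root/backup.sh > /dev/null 2>&1   # every Sunday at 02:00
0 */6 * * *  /root/backup.sh > /dev/null 2>&1   # every 6 hours
```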
Step 3: Download backup from Google Drive and restore website
In case you want to restore the website, you do the following:
First, download the entire backup to the /root folder (remember to replace remote_name with your remote name and directory_name with the name of your backup folder on Google Drive):
```shell
rclone copy "remote_name:/directory_name/2017-09-16" /root/
```
Below is an example of the backup files on my Google Drive:

The file khamphaso.com.zip contains the source code of the website; the mysql directory contains the compressed SQL dumps in .gz format.
Once the backup has been downloaded to /root, first extract the source code with the unzip command:
```shell
unzip /root/khamphaso.com.zip
```
Then copy it over the website folder, overwriting the existing files:
```shell
cp -rf /root/var/www/khamphaso.com/ /var/www/khamphaso.com
```
Next, restore the database. First, extract the SQL file with the following command:
```shell
gunzip /root/mysql/khamphaso_com.sql.gz
```
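Note that gunzip works in place: it replaces the .gz file with the uncompressed file next to it. A quick self-contained demonstration on a throwaway file:

```shell
# Round-trip a small file through gzip/gunzip to show the in-place behavior.
echo "CREATE TABLE demo (id INT);" > /tmp/demo.sql
gzip -f /tmp/demo.sql        # produces /tmp/demo.sql.gz and removes the original
gunzip -f /tmp/demo.sql.gz   # restores /tmp/demo.sql and removes the .gz
cat /tmp/demo.sql
rm -f /tmp/demo.sql
```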
Navigate to the root of your website and use the WP CLI import command (this was installed when you installed EasyEngine as per my instructions):
```shell
ee site cd khamphaso.com
cd htdocs
wp db import /root/mysql/khamphaso_com.sql --allow-root
```
That's it. You have successfully restored the website.
Epilogue
Through this article, you have learned a way to back up your website on a VPS running EasyEngine.
If you have any problems, leave a comment below.

