Backup Strategies

Backup to Amazon S3 Bucket

If you have your own AWS account, you can send files and backups directly to an S3 bucket.

Install the AWS CLI inside your Stratus instance using the bundled installer, following the "Install the AWS CLI without Sudo (Linux, macOS, or Unix)" method:

$ curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"
$ unzip awscli-bundle.zip
$ ./awscli-bundle/install -b ~/bin/aws

Run export to make the ~/bin path available in your current session:

export PATH=~/bin:$PATH

Add the same line to the ~/.bash_profile file so the path is available every time you log in. By default the ~/.bash_profile file does not exist, so you may need to create it.
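
For example, you can append the line in one step (this creates the file if it does not already exist):

echo 'export PATH=~/bin:$PATH' >> ~/.bash_profile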

Run aws configure and follow the prompts, entering the appropriate region and access keys for your AWS account.
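
The prompts look similar to the following (the key values shown here are placeholder examples):

$ aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: us-east-1
Default output format [None]: json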

Now you can run commands like aws s3 ls to list your existing buckets and manage other AWS resources. In this example, we will create a new bucket named my-magento-backup:

magemojo@svc-magentoperformance-io:~$ aws s3 mb s3://my-magento-backup
make_bucket: my-magento-backup
magemojo@svc-magentoperformance-io:~$ aws s3 ls
2019-03-19 13:23:50 my-magento-backup

Now we can start uploading a backup. As a reminder, you can create backups in Magento 2 with the built-in system; here we'll simply create a database dump and tar up the web root. If you have a very large site, creating the archive can take a long time.
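
If you prefer the built-in system, a minimal sketch (assuming the setup:backup CLI command is available in your Magento version; it writes archives under var/backups by default) looks like the following. In this guide we take the manual dump-and-tar route instead.

$ php bin/magento setup:backup --code --media --db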

#from document root
$ n98-magerun2 db:dump s3_backup.sql
$ tar -zcvf s3_backup_3_19_2019.tar.gz /srv/public_html/

Now that we have an archive, we can upload it to the S3 bucket.

magemojo@svc-magentoperformance-io:~$ aws s3 ls
2019-03-19 13:23:50 my-magento-backup
magemojo@svc-magentoperformance-io:~$ aws s3 cp s3_backup_3_19_2019.tar.gz s3://my-magento-backup
upload: ./s3_backup_3_19_2019.tar.gz to s3://my-magento-backup/s3_backup_3_19_2019.tar.gz

You will see the upload progress in the terminal. Once uploaded, you should also see the archive within the bucket in the AWS Console web interface.
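
You can also confirm it from the command line by listing the bucket's contents, for example:

$ aws s3 ls s3://my-magento-backup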

Well done! You can learn more about S3 commands in Amazon's AWS CLI documentation. From here you can put together a script that creates a backup and uploads it to S3, then set a cron job to run it automatically. A script similar to the one below will do the job:

#!/bin/bash
#S3 Backup Example

date="$(date "+%m-%d-%Y")"
bucket_name="my-magento-backup"
magento_root="/srv/public_html"

#create database dump
echo "Creating database dump..."
/usr/local/bin/n98-magerun2 --root-dir="$magento_root" db:dump "db-backup-$date.sql"

#create archive of webroot, excluding var
echo "Creating tar archive of files and database dump..."
tar --exclude="$magento_root/var/*" -zcf "$date-backup.tar.gz" "$magento_root/"

#upload to s3
echo "Uploading to S3..."
aws s3 cp "$date-backup.tar.gz" "s3://$bucket_name"

#clean up
echo "Removing local files and cleaning up..."
rm "$date-backup.tar.gz"
rm "$magento_root/db-backup-$date.sql"

echo "Done!"

The above would output:

magemojo@svc-magentoperformance-io:~$ ./backup.sh
Creating database dump...


  Dump MySQL Database


Start dumping database db_cded1u2ypqu to file db-backup-03-19-2019.sql
Finished
Creating tar archive of files and database dump...
tar: Removing leading `/' from member names
Uploading to S3...
upload: ./03-19-2019-backup.tar.gz to s3://my-magento-backup/03-19-2019-backup.tar.gz
Removing local files and cleaning up...
Done!
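
For example, assuming the script is saved as /srv/backup.sh and made executable (chmod +x), a crontab entry added with crontab -e could run it nightly:

#run the S3 backup script every day at 2:00 AM and log the output
0 2 * * * /srv/backup.sh >> /srv/backup.log 2>&1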

Backup to Dropbox

The official Dropbox CLI utility is not currently supported on Stratus, but you can use a third-party script to push files to a Dropbox folder with the proper access tokens. See https://github.com/andreafabrizi/Dropbox-Uploader for more details.
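
As a rough sketch (the file names here are only examples, and the project README covers generating and configuring the access token), downloading the script and pushing an archive looks like this:

$ curl -O https://raw.githubusercontent.com/andreafabrizi/Dropbox-Uploader/master/dropbox_uploader.sh
$ chmod +x dropbox_uploader.sh
$ ./dropbox_uploader.sh upload s3_backup_3_19_2019.tar.gz /backups/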

Backup to Google Cloud

To back up your Stratus instance to Google Cloud, first create an account if you do not yet have one. Next, create a project from the Google Cloud Console. In this case, I'll be using my-backups-256118.

We will be using gsutil for this tutorial. gsutil is a Python application that lets you access Google Cloud Storage from the command line.

We will start by downloading the files from Google and extracting them locally:

mark@svc-m2-markmuyskens-com:~$ wget https://storage.googleapis.com/pub/gsutil.tar.gz
mark@svc-m2-markmuyskens-com:~$ tar -zxvf gsutil.tar.gz

Next, we will configure gsutil to connect to Google:

mark@svc-m2-markmuyskens-com:~$ ./gsutil/gsutil config
This command will create a boto config file at /srv/.boto containing
your credentials, based on your responses to the following questions.
Please navigate your browser to the following URL:
https://accounts.google.com/o/oauth2/auth?scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Faccounts.reauth&redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&response_type=code&client_id=909320924072.apps.googleusercontent.com&access_type=offline
In your browser you should see a page that requests you to authorize access to Google Cloud Platform APIs and Services on your behalf. After you approve, an authorization code will be displayed.

Enter the authorization code: [REDACTED]

Please navigate your browser to https://cloud.google.com/console#/project,
then find the project you will use, and copy the Project ID string from the
second column. Older projects do not have Project ID strings. For such projects,
 click the project and then copy the Project Number listed under that project.

What is your project-id? my-backups-256118

gsutil developers rely on user feedback to make improvements to the
tool. Would you like to send anonymous usage statistics to help
improve gsutil? [y/N] y

Boto config file "/srv/.boto" created. If you need to use a proxy to
access the Internet please see the instructions in that file.
mark@svc-m2-markmuyskens-com:~$

Next, you will want to create a bucket. I will be using “mark-stratus-backups” as my bucket name. Bucket names must be globally unique, as they share a single namespace across Google Cloud Storage. We are also using standard storage in this example; a list of storage classes can be found in Google's Cloud Storage documentation.

mark@svc-m2-markmuyskens-com:~$ ./gsutil/gsutil mb -c standard -l US -p my-backups-256118 gs://mark-stratus-backups
Creating gs://mark-stratus-backups/...
mark@svc-m2-markmuyskens-com:~$

We can then create a manual backup by switching to the document root, dumping a copy of the database, and then creating an archive:

mark@svc-m2-markmuyskens-com:~$ cd public_html/
mark@svc-m2-markmuyskens-com:~/public_html$ n98-magerun2 db:dump backup.sql
mark@svc-m2-markmuyskens-com:~/public_html$ tar -zcvf backup_10_16_2019.tar.gz /srv/public_html/

Once archived, the file can be copied to Google. Running the ls command afterward will show that the file has been copied successfully to the bucket:

mark@svc-m2-markmuyskens-com:~$ ./gsutil/gsutil cp /srv/public_html/backup_10_16_2019.tar.gz gs://mark-stratus-backups
Copying file:///srv/public_html/backup_10_16_2019.tar.gz [Content-Type=application/x-tar]...
\ [1 files][126.1 MiB/126.1 MiB]                                                
Operation completed over 1 objects/126.1 MiB.                                    
mark@svc-m2-markmuyskens-com:~$ ./gsutil/gsutil ls -l gs://mark-stratus-backups
 132217623  2019-10-16T18:52:52Z  gs://mark-stratus-backups/backup_10_16_2019.tar.gz
TOTAL: 1 objects, 132217623 bytes (126.09 MiB)
mark@svc-m2-markmuyskens-com:~$

We can also see this in the Google Cloud Console.


That's it! From here you can put together a script that creates a backup and uploads it to Google, then set a cron job to run it automatically. A script similar to the one below will do the job:

#!/bin/bash
#Google Cloud Backup Example

date="$(date "+%m-%d-%Y")"
bucket_name="mark-stratus-backups"
magento_root="/srv/public_html"

#create database dump
echo "Creating database dump..."
/usr/local/bin/n98-magerun2 --root-dir="$magento_root" db:dump "db-backup-$date.sql"

#create archive of webroot, excluding var
echo "Creating tar archive of files and database dump..."
tar --exclude="$magento_root/var/*" -zcf "$date-backup.tar.gz" "$magento_root/"

#upload to Google
echo "Uploading to Google..."
./gsutil/gsutil cp "$date-backup.tar.gz" "gs://$bucket_name"

#clean up
echo "Removing local files and cleaning up..."
rm "$date-backup.tar.gz"
rm "$magento_root/db-backup-$date.sql"

echo "Done!"
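
As with the S3 example, you could save this as /srv/gcloud-backup.sh (an example path), make it executable, and schedule it with cron, for instance weekly. Note the script calls gsutil through a relative path, so it assumes cron runs it from the home directory where gsutil was extracted.

#run the Google Cloud backup script every Sunday at 3:00 AM and log the output
0 3 * * 0 /srv/gcloud-backup.sh >> /srv/gcloud-backup.log 2>&1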