While backups are often one of the most overlooked planks in a comprehensive data security plan, they are probably among the most important things one can do for data security. A backup works as an insurance policy against data loss, which can be caused by a myriad of things ranging from accidental deletion, to drive failure, to ransomware attacks.
A good backup strategy usually will not co-locate the backup data with the original, for reasons such as fire and theft. In the past, data was backed up to removable media and stored offsite in a place such as a safe deposit box. Nowadays, with high-speed internet and readily available cloud-based storage, backing up over the internet to the cloud is a practical option.
One such cloud storage service is Azure Blob Storage. Originally, Azure had only one general-purpose tier for blob storage. When Azure introduced storage tiers for Azure Storage accounts, it opened up blob storage to a whole new set of use cases. The three storage tiers are hot, cool, and archive. Hot storage is intended for applications that need data to be readily available and that will read and write it fairly often. Archive storage is intended for long-term archival of data; the data is not stored in a readily available state, so recovering it requires a “hydration” process that can take several hours. Cool storage sits between hot and archive, offering a lower-cost option that is immediately available but not intended for frequent access. In most regions, cool storage is about $0.01 per GB per month, which means one terabyte costs roughly $10 a month. Azure does not charge for writing to cool storage, but it does charge for reading from it. Given that the intent here is a backup, you need only read from it in the event of data loss.
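At that rate the arithmetic is simple; as a quick sketch, the snippet below estimates the monthly cost for a given backup size (the $0.01/GB figure is the rate quoted above and varies by region):

```shell
# Estimate the monthly cool-tier storage cost for a backup.
# RATE is the assumed cool-tier price per GB per month ($0.01);
# actual pricing varies by region.
SIZE_GB=1024   # one terabyte expressed in GB
RATE=0.01
awk -v gb="$SIZE_GB" -v rate="$RATE" \
    'BEGIN { printf "estimated cost: $%.2f/month\n", gb * rate }'
```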
Azure Storage is only half the equation. To get data onto Azure Storage, you need a utility/agent that will move data from your local computer to the storage account and this is where RClone comes in. RClone is a command line utility that performs one-way syncs between your local data and the cloud. When it runs, it looks for changes on the local file system, then uploads those changes to the storage account. Anything unchanged is left alone. The initial upload will obviously take some time, but once it’s finished only changes are sent up.
To be clear, Azure does have a backup as a service offering, which can be used for more robust backups and schemes. However if you’re looking for a simple solution, this little “hack” might just be for you.
Setting up a storage account in the Azure Portal is easy.
Select Create a resource ► Storage ► Storage account – blob, file, table, and queue. This will open the blade to configure the storage account.
Use the following settings in the blade: give the account a unique DNS name (you’ll need this name later when configuring Rclone) and set the access tier to Cool.
Once you’ve filled out the form, click Create. It usually takes less than a minute to provision.
Run rclone config. Type n for New remote and press Enter.

No remotes found - make a new one
n) New remote
s) Set configuration password
q) Quit config
n/s/q> n

At the name prompt, enter azure and press Enter.

name> azure
At the Storage prompt, enter 15 for Microsoft Azure Blob Storage and press Enter.

1 / Alias for a existing remote
   \ "alias"
2 / Amazon Drive
   \ "amazon cloud drive"
...
15 / Microsoft Azure Blob Storage
   \ "azureblob"
...
23 / Yandex Disk
   \ "yandex"
24 / http Connection
   \ "http"
Storage> 15
At the Storage Account Name prompt, type in the same name you gave in the first setting (the unique DNS Name) when you configured the storage account above, then press Enter.

account> blaizebackup

At the Storage Account Key prompt, paste in one of the account’s access keys (found under Access keys in the storage account blade) and press Enter.

key> tj/+mJVQ...==
At Endpoint for the service - leave blank normally, just press Enter unless you are using Azure Gov Cloud or something other than the standard Azure offering. Rclone then displays the remote’s configuration for review; type y, then press Enter.
Remote config
--------------------
[azure]
type = azureblob
account = blaizebackup
key = tj/+...==
endpoint =
--------------------
y) Yes this is OK
e) Edit this remote
d) Delete this remote
y/e/d> y
Rclone shows the list of configured remotes. Type q to quit.
Current remotes:
Name Type
==== ====
azure azureblob
e) Edit existing remote
n) New remote
d) Delete remote
r) Rename remote
c) Copy remote
s) Set configuration password
q) Quit config
e/n/d/r/c/s/q> q
Next, create a container to hold the backups with rclone mkdir remote:container, where remote is the name of the remote you created with rclone config and container is the name of the blob container you’ll create on Azure. For example:
rclone mkdir azure:backup
Now, Rclone is configured to talk to Azure and use it for backups.
Rclone will sync a local directory with the remote container, storing all the files in the local directory in the container. Rclone uses the syntax rclone sync source destination, where source is the local folder and destination is the container on Azure you just created.
rclone sync /path/to/my/backup/directory azure:backup
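Because the sync is one-way, files deleted locally will also be deleted from the container, so it can be worth previewing a run before committing to it. Rclone supports a --dry-run flag that reports what would change without transferring anything; here is a minimal sketch (the wrapper function name and paths are illustrative assumptions):

```shell
# preview_backup: show what rclone sync would copy or delete,
# without actually transferring anything (--dry-run).
# Usage: preview_backup <local-directory> <remote:container>
preview_backup() {
    rclone sync --dry-run "$1" "$2"
}

# After reviewing the output, run the real sync, e.g.:
# rclone sync /path/to/my/backup/directory azure:backup
```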
Scheduling is important to automating backups. How you do this depends on your platform: Windows can use Task Scheduler, while macOS and Linux can use cron.
Before scheduling a job, make sure you have done your initial upload and it has completed.
Create a batch file named backup.bat somewhere on your computer and paste in the command you used in the section on Syncing a Directory. It will look something like the following. Specify the full path to rclone.exe, and don’t forget to save the file.
C:\full\path\to\rclone.exe sync "C:\path\to\my\backup\directory" azure:backup
Windows provides schtasks to schedule a job. This utility takes a number of parameters: /RU and /RP supply the account to run as and its password, /SC sets the schedule (DAILY), /TN names the task (Backup), /TR points to the backup.bat file you just created, and /ST sets the start time.

schtasks /Create /RU username /RP "password" /SC DAILY /TN Backup /TR C:\path\to\backup.bat /ST 01:05
If you want to back up multiple directories, simply add multiple containers using rclone mkdir
and add a new line for each directory in the batch file for the source and corresponding destination container.
Create a shell script named backup.sh somewhere on your computer, and paste in the command you used in the section on Syncing a Directory. It will look something like the following. Specify the full path to the rclone executable, and don’t forget to save the file.
#!/bin/sh
/full/path/to/rclone sync /path/to/my/backup/directory azure:backup
Make the script executable with chmod:
chmod +x backup.sh
Edit the root crontab with sudo crontab -e. A crontab entry consists of five time fields (minute, hour, day of month, month, and day of week) followed by the command to run, where * denotes every value for that field. To make backup.sh run daily at 1:05 AM, use an entry that looks like this:
5 1 * * * /full/path/to/backup.sh
If you want to back up multiple directories, simply add multiple containers using rclone mkdir
and add a new line for each directory in the script for the source and corresponding destination container.
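As a sketch, a multi-directory backup.sh would carry one sync line per directory/container pair (the documents directory and its container below are illustrative assumptions, and each container must first be created with rclone mkdir):

```shell
# backup_all: sync each local directory to its own blob container,
# one rclone sync line per directory/container pair.
backup_all() {
    rclone sync /path/to/my/backup/directory azure:backup
    rclone sync /path/to/my/documents azure:documents
}
```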
This simple utility offers a nice way to back up local data to Azure and will work for a lot of simple and even some more complex use cases. Here are a few do’s and don’ts.
Dos
Don’ts
Happy Backing Up!