I have been using Microsoft’s Windows Home Server (WHS) for many years, mostly as a local backup for my pictures, but also to run an offsite backup to Amazon AWS Glacier. As support for Windows Server 2008 R2, on which WHS is based, is almost at its end, I started to look for alternatives. All I really needed was some sort of storage with a way to do online backups, and a full-blown server felt like overkill. I eventually decided to splash out on a Synology NAS, which seemed like the device that would provide everything I needed.
It took a while to get the system working the way I wanted, especially getting all the web services protected by the Sophos UTM’s web server protection, but that is probably a topic for a different post. Once I was happy with the general workings, I started to look at options for offsite backup. Amazon Glacier had worked pretty well on my WHS, especially using CloudBerry’s plugin for WHS. It was really cheap for the amount of data I needed to protect, and provided the option of client-side encryption.
The natural step was to look at the Glacier backup application from the Synology store. But once I installed it, I found it provides no client-side encryption. Some research showed there might be workarounds, but I really wanted a simple system, with as few moving parts as possible.
Then I remembered a recent article about improvements in Azure Archive storage, and I thought I would give it a try. So I logged in to the Azure portal and went to set up a storage account, but hey, there is no option to select an Archive storage tier…
For testing purposes, I went ahead and set up a storage account with the Cool tier option.
Once I had the account set up, I created a container which would become the target for the Synology Hyper Backup.
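For reference, the same storage account and container can also be created from the command line instead of the portal. This is a sketch using the Azure CLI; the resource group, account and container names here are placeholders of my own, not anything Synology or Azure prescribes:

```shell
# Placeholders: mybackuprg, mybackupstore, synologybackup.
# Create a general-purpose v2 storage account with Cool as the default access tier
# (Archive cannot be set as an account-level default, which is why the portal
# offers no Archive option here).
az storage account create \
    --name mybackupstore \
    --resource-group mybackuprg \
    --kind StorageV2 \
    --sku Standard_LRS \
    --access-tier Cool

# Create the container that Hyper Backup will write into.
az storage container create \
    --name synologybackup \
    --account-name mybackupstore
```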
With the Azure side of things in place, I logged in to my Synology’s DSM and started configuring Hyper Backup. Most settings are common across backup destination types, with the exception of the backup destination setting itself, where I had to provide the storage account name and access key. Once these were provided, I could select the container name from the list for the Azure backup destination. The wizard also asked for a backup directory.
Once the backup destination was configured, the wizard went through the remaining settings for schedule, selection of folders to be protected, backup encryption and rotation. Finishing the wizard, I had the option to run the backup, which I did. When the backup finished, I went back to the Azure portal to see how the files were stored in the container. There were a number of files that I guess were indexes and other miscellaneous files, and then there was a folder called Pool, where the actual backup data had been saved as 50MB chunks (buckets). Now that I understood how the files are stored in the Azure container, I looked at how to make use of the Archive tier. My first thought was that PowerShell would most likely provide some means of achieving this, but then I found that the tooling needed is already built in: it’s called Blob Lifecycle Management.
I created a new rule that would move files not modified in the last 3 days to Archive storage.
In the second step, I set the filter to apply to files in the Pool folder only.
Now I had my new lifecycle management rule in place.
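For anyone who prefers defining the rule as code rather than clicking through the portal, the finished policy corresponds roughly to the following JSON (the rule name, container name and backup directory are placeholders; `prefixMatch` takes the path starting with the container name, and `tierToArchive` with `daysAfterModificationGreaterThan` is the part that does the work):

```json
{
  "rules": [
    {
      "name": "archive-hyperbackup-pool",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "synologybackup/mybackupdir/Pool" ]
        },
        "actions": {
          "baseBlob": {
            "tierToArchive": { "daysAfterModificationGreaterThan": 3 }
          }
        }
      }
    }
  ]
}
```

The prefix filter keeps the index and other miscellaneous files in the Cool tier, which matters because Hyper Backup needs to read those on every run, while the Pool chunks can sit offline in Archive until a restore is needed.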
All I had to do was wait a few days and check the result of the lifecycle management rule again. Looking at the files in the container, I could now see that the older files had indeed been moved to the Archive tier, whereas files newer than 3 days remained in the Cool storage tier.
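The per-blob tier can also be checked without the portal. One way is a quick Azure CLI query (account and container names are again my placeholders), which lists each blob alongside its current access tier:

```shell
# List blob names with their current access tier (Cool or Archive).
az storage blob list \
    --container-name synologybackup \
    --account-name mybackupstore \
    --query "[].{name:name, tier:properties.blobTier}" \
    --output table
```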