Ransomware Sucks
Move your backups off the server

Bad actors full of malware headed your way…
Ransomware – Get Ahead of It!
I got a very proactive call from one of my clients recently.
An “industry adjacent” firm got hit hard by ransomware and had no offsite backups, which impacted that firm as well as others upstream and downstream from them.
My client, being the very smart guy that he is, wanted to ensure we have local backups as well as offsite copies. He understands the value of keeping backups in a location that ransomware cannot reach via a compromised account.
I’ve been building and testing the simplest possible thing I can come up with to copy his backups to Azure Blob Storage.
So far the process (broadly) looks like this:
- Create Azure resource group
- Create Azure storage account
- Create blob container
- Get access key
- Install/copy AzCopy to the backup server
- Install/upgrade PowerShell version?
- Install Az.Storage PowerShell module?
- Set up SQL Agent job:
  - Copy files (all databases, or a subset?) (command line)
  - Purge old copies (how many days?) (PowerShell .ps1 file)
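The Azure-side steps above can be sketched with the Azure CLI. This is illustrative only; the resource group, storage account, container names, and region are all assumptions, not values from the post:

```shell
# Sketch of the Azure setup steps, assuming the Azure CLI (az) is
# installed and you are already logged in. All names are hypothetical.
az group create --name rg-sqlbackups --location southcentralus

az storage account create \
  --name sqlbackupstore001 \
  --resource-group rg-sqlbackups \
  --sku Standard_LRS \
  --kind StorageV2

az storage container create \
  --name sqlbackups \
  --account-name sqlbackupstore001

# Grab an access key for AzCopy and the purge script
az storage account keys list \
  --resource-group rg-sqlbackups \
  --account-name sqlbackupstore001 \
  --query "[0].value" -o tsv
```

These commands require an Azure subscription, so treat them as a checklist rather than something to paste blindly.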
I’ve set the PowerShell cleanup retention to be 1 hour longer than the local backup server’s retention so I don’t accidentally copy a file that just got purged locally. These backups are huge. I’m testing with just the StackOverflow database over a 100 Mb/sec upload link.
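The offset logic is simple arithmetic; a minimal sketch, assuming a hypothetical 3-day local retention (the post doesn’t state the actual value):

```shell
# Cloud purge window = local retention + 1 hour, so the Azure copy of a
# file always outlives the local copy and we never re-upload a file that
# was just purged locally. The 3-day figure is an assumption.
LOCAL_RETENTION_HOURS=$((3 * 24))
CLOUD_PURGE_HOURS=$((LOCAL_RETENTION_HOURS + 1))
echo "Purge Azure copies after ${CLOUD_PURGE_HOURS} hours"
```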
This is the code I’m using; I’ve changed the keys and storage container names to keep shenanigans from happening 😊
Copy files to cold storage:
azcopy.exe copy "G:\SQLBackups\PRECISION7770-1\*" "https://redacted.blob.core.windows.net/sqlbackups/?sv=2023-01-03&se=2024-09-07T18%c&sp=rwl&sig=00iknET1uxOyJu4USnFxIkDgwYD2t50%3D" --overwrite=false --from-to=LocalBlob --blob-type Detect --follow-symlinks --check-length=true --put-md5 --block-blob-tier=Cold --disable-auto-decoding=false --recursive
Powershell code to purge:
# Script to delete backup files older than the retention period
# Set variables
$container = "sqlbackups"
$StorageAccountName = "redacted-files"
$StorageAccountKey = "L15vU7QpFlUwiasDRujidcIIKZESRbl/+AStA5jktw=="
$context = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
$retentiondate = Get-Date

# Read the list of blobs in the container
$filelist = Get-AzStorageBlob -Container $container -Context $context

foreach ($file in $filelist | Where-Object { $_.LastModified.DateTime -lt $retentiondate.ToUniversalTime().AddDays(-4) })
{
    $removefile = $file.Name
    if ($null -ne $removefile)
    {
        Write-Host "Removing file $removefile"
        # Tip: add -WhatIf here to dry-run the purge before trusting it
        Remove-AzStorageBlob -Blob $removefile -Container $container -Context $context
    }
}
So far, so good…what have I left out?
(I know that backing up straight to Azure is simple, but that doesn’t meet the local backups requirement)
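For reference, the “backing up straight to Azure” path is T-SQL BACKUP TO URL. A sketch via sqlcmd, assuming a SQL Server credential for the storage URL already exists; the server, database, and container names are hypothetical:

```shell
# Back up directly to Azure blob storage (the approach the post rules
# out because it skips the local copy). Requires a reachable SQL Server
# instance and an existing credential for the storage account URL.
sqlcmd -S localhost -E -Q "BACKUP DATABASE [StackOverflow] TO URL = N'https://redacted.blob.core.windows.net/sqlbackups/StackOverflow.bak' WITH COMPRESSION;"
```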
Get a FREE SQL Server Health Check from Dallas DBAs
Interesting Stuff I Read This Week
- Transaction Commit latency acceleration using Storage Class Memory in Windows Server 2016/SQL Server 2016 SP1 - Microsoft Community Hub – I didn’t know this existed until this week!
- Major Canada freight railroads come to halt without new labor contracts | AP News – this is going to affect a lot of folks on both sides of the border, depending on how long it lasts
SQL tidBITs:
Did you know you can set individual steps of a SQL Agent job to retry on failure ‘x’ number of times with ‘y’ minutes between retries? It’s in the Advanced tab of the job step:
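If you’d rather script that setting than click through SSMS, msdb’s sp_update_jobstep exposes the same knobs as @retry_attempts and @retry_interval (minutes). A sketch via sqlcmd; the job name, step number, and connection flags are assumptions:

```shell
# Set step 1 of a hypothetical job "Copy Backups to Azure" to retry
# 3 times with 5 minutes between attempts. Requires sqlcmd and a
# reachable SQL Server instance.
sqlcmd -S localhost -E -d msdb -Q "EXEC dbo.sp_update_jobstep @job_name = N'Copy Backups to Azure', @step_id = 1, @retry_attempts = 3, @retry_interval = 5;"
```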

Share with your friends…7,343 subscribers and counting! How long until they all get to our new home on beehiiv?