Archiving the Archives...
Hello!
As you can probably tell I haven't written in a while. There are many reasons for this, ranging from competing obligations to faith, family, and community, to being involved in technical work I really can't talk much about in detail.
Honestly this makes me sad. I've been blogging and journaling since high school, when Xanga (yes, Xanga) was a thing. Writing has always helped solidify concepts and ideas in my head that would otherwise be lost to those frequent neurological reboot cycles often referred to as sleep. After a while, the delay between having an idea and writing it down turns into justifications: I'm too busy to write anymore, or the idea is too mundane to bother with. And so the cycle continues.
But not today! (oh...it's tonight...or rather early tomorrow morning...anyway).
In lieu of grander writing projects, today's gem is a script I wrote while working for a college many years ago; I reference it here in my GitHub. It's a file-archiving script that encrypts data both 'over the wire' and at rest, meaning the data is secured before, during, and after transmission.
Here's the scenario:
DataServer (DS) contains data it needs to share with the AWS EC2 instance, DR. Since DR lives outside our domain, there is concern over it being compromised along with the data backups. To mitigate, all the data lives in an encrypted share on DR which stays locked unless data is actively being uploaded. In this way if the machine is ever popped, the threat actor will only have an encrypted blob of data...assuming said actor isn't waiting around to see what happens.
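For background, the encrypted share on DR is a one-time setup. A minimal sketch of how such a volume could be created with LUKS, assuming a dedicated block device at /dev/xvdf (a placeholder; yours will differ):

```shell
# One-time provisioning on DR (run as root). /dev/xvdf is a placeholder
# for whatever block device backs the encrypted share.
cryptsetup luksFormat /dev/xvdf --key-file=disk_secret_key  # write the LUKS header
cryptsetup luksOpen /dev/xvdf decryptfs --key-file=disk_secret_key
mkfs.ext4 /dev/mapper/decryptfs   # filesystem inside the encrypted mapping
mkdir -p /decryptfs               # mount point used throughout this post
```

These commands are destructive to the target device and require root; they're shown only to make the later unlock-and-mount steps concrete.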
Whenever data needs to be moved, DS reaches out to DR over SSH (an encrypted, key-authenticated channel) and passes over the secret file used to decrypt the secure volume:
#!/bin/bash
#
# push local key to remote machine
#
scp disk_secret_key youruser@ec2-DR-west-2.compute.amazonaws.com:/home/youruser;
...
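The key file itself only needs to be generated once, on DS. A minimal sketch; the 4096-byte size is an arbitrary choice, since any file with sufficient entropy works as a LUKS key:

```shell
# Generate a 4096-byte random key file (one-time, on DS).
dd if=/dev/urandom of=disk_secret_key bs=512 count=8 2>/dev/null
chmod 600 disk_secret_key   # owner-only permissions; this file IS the key
```

Guard this file carefully; anyone holding it can unlock the volume.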
The following series of commands will unlock and mount the secure volume, followed by the deletion of the key file.
...
#
# remote to DR, unlock and mount the encrypted volume, then remove the key
# (/dev/xvdf is a placeholder; use the actual encrypted block device on DR)
#
ssh -t youruser@ec2-DR-west-2.compute.amazonaws.com '\
sudo mv disk_secret_key /root/;\
sudo chown root:root /root/disk_secret_key;\
sudo cryptsetup -v luksOpen /dev/xvdf decryptfs --key-file=/root/disk_secret_key;\
sudo mount /dev/mapper/decryptfs /decryptfs;\
sudo chown -R youruser:youruser /decryptfs;\
sudo rm -f /root/disk_secret_key';
...
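Before pushing data, it's worth confirming the mapping actually came up; otherwise the copy would land in the bare, unencrypted mount-point directory. `mountpoint(1)` from util-linux makes a handy guard (in the article's setup you'd check /decryptfs over the same SSH connection):

```shell
# mountpoint exits 0 only when its argument is an active mount point,
# so this guards against writing data onto the unencrypted directory.
check_mounted() {
    mountpoint -q "$1"
}
check_mounted / && echo "root is mounted"          # / is always a mount point
check_mounted /no/such/path || echo "not mounted"  # nonexistent path fails
```

A script could bail out here instead of echoing, refusing to transfer anything until the volume is really mounted.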
Once the volume is unlocked and mounted, DS can push the data to it over another encrypted channel. In this case we'll use SCP.
...
#
# Push data to remote encrypted directory
#
scp -r /BACKUPDATA youruser@ec2-DR-west-2.compute.amazonaws.com:/decryptfs;
...
Finally, once the transfer completes, we can lock the secure volume back down. I added an email notification to mine to alert me whenever a dump was made.
#
# Close remote encrypted volume
#
ssh -t youruser@ec2-DR-west-2.compute.amazonaws.com '\
sudo umount /decryptfs;\
sudo cryptsetup -v luksClose decryptfs;'
#
# OPTIONAL: Email results...assumes a working mail setup...
#
RESULT=$?;   # exit status of the last command run (the ssh above)
if [ $RESULT -eq 0 ]; then
mail -s "Data backup to encrypted system completed successfully" youruser@email.domain < /dev/null
else
mail -s "Data backup to encrypted system failed. See logs." youruser@email.domain < /dev/null
fi
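One caveat: `RESULT=$?` captures only the exit status of the last command (the ssh that closed the volume), so a failed transfer earlier in the run could still report success. A pattern that accumulates failures across steps, sketched here with stand-in functions in place of the real scp and ssh calls:

```shell
# Accumulate failures from several steps, then send one notification.
# step_one/step_two are stand-ins for the real scp and ssh commands.
overall=0
step_one() { true; }     # pretend the transfer succeeded
step_two() { false; }    # pretend closing the volume failed
step_one || overall=1
step_two || overall=1
if [ "$overall" -eq 0 ]; then
    echo "backup completed"
else
    echo "backup failed"   # this branch runs in this sketch
fi
```

Swapping the real commands in for the stand-ins gives a notification that reflects the whole run rather than just the final step.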
And that's it! With an SSH client and some familiarity with setting up encrypted filesystems on Linux, you too can be archiving random files to a remote server you control but don't trust...for whatever reason.
Cheers and happy archiving!
Sources:
LUKS Encrypted Partitions