HomeLab: Archiving systems for external disks

krycor

Honorary Master
Joined
Aug 4, 2005
Messages
18,546
So long story short..

I have a MicroServer running Plex with internal software RAID 5, 6-ish TB usable (3x3TB). Cool..

But I have a few external disks from the past: 2x2TB (old-school enclosures with SATA/IDE disks inside and their own power supply) and my 1x2TB notebook portable drive in a case as a backup.

So I'm wondering what a cost-effective backup solution using the spare disks would look like, and also whether there is a *nix application I can use to "age out" files that haven't been accessed.

So I'm considering doing something like this..

Externally powered disk = longer-term backup. Every month the plug activates (yay for Sonoff switches), the drive pops up on the server and auto-mounts, and that kicks off whatever script I write to back up everything needed.
Easy!
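
Something like the sketch below is what I have in mind for that script — assuming the disk's filesystem UUID is known, there's an fstab entry so a plain mount /mnt/archive works, and rsync is installed (the UUID and paths are placeholders):

Code:
#!/usr/bin/env python3
"""Monthly backup sketch: wait for the powered-up disk, mount it, rsync, unmount.

Assumptions (placeholders): the external disk's filesystem UUID, an fstab
entry so `mount /mnt/archive` works on its own, and rsync installed.
"""
import subprocess
import sys
import time
from pathlib import Path

DISK_UUID = "xxxx-xxxx"            # placeholder: UUID of the external disk's filesystem
MOUNT_POINT = Path("/mnt/archive") # placeholder mount point (needs an fstab entry)
SOURCES = ["/data/storage/"]       # what to back up
WAIT_SECS = 600                    # how long to wait for the Sonoff-powered disk to appear

def disk_present() -> bool:
    # /dev/disk/by-uuid/<UUID> shows up once the kernel has seen the disk
    return Path("/dev/disk/by-uuid/" + DISK_UUID).exists()

def main() -> int:
    deadline = time.time() + WAIT_SECS
    while not disk_present():
        if time.time() > deadline:
            print("external disk never appeared, giving up", file=sys.stderr)
            return 1
        time.sleep(10)

    subprocess.run(["mount", str(MOUNT_POINT)], check=True)  # relies on the fstab entry
    try:
        for src in SOURCES:
            # -a preserves times/permissions, --delete mirrors deletions onto the backup
            subprocess.run(
                ["rsync", "-a", "--delete", src, str(MOUNT_POINT / "monthly")],
                check=True,
            )
    finally:
        subprocess.run(["umount", str(MOUNT_POINT)], check=True)
    return 0

if __name__ == "__main__":
    sys.exit(main())

Cron (or a udev rule) can fire it on the same schedule as the Sonoff plug.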

Now for the more challenging one.. a USB-powered external disk sits connected 24/7. Stuff on the internal array, media files (self-generated content too), backup files etc. gets pushed onto this disk, as the content is rarely accessed once written.

I want to move this data to the disk while faking the original location.. or I just place files there directly (the easy, lazy way). If I could do it so the files actually move off the RAID, I reckon I'd make the best use of the RAID.

The question is just whether there is some sort of file system that does this, or whether I have to track access manually.. particularly for media files. For our own media content the decision is easy..
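
If it turns out I do have to track access manually, a small script would do — a rough sketch below, assuming atime is usable (i.e. the array isn't mounted noatime); it just lists files not read in 90 days as candidates for the slow disk (paths are examples):

Code:
#!/usr/bin/env python3
"""List files that haven't been read in 90 days (candidates for the slow disk).

Relies on atime, so it assumes the filesystem is not mounted noatime.
The path and cutoff are examples only.
"""
import time
from pathlib import Path

ROOT = Path("/data/storage")  # example path
DAYS = 90

cutoff = time.time() - DAYS * 86400
for path in ROOT.rglob("*"):
    if path.is_file() and path.stat().st_atime < cutoff:
        print(path)

Basically the same thing as find /data/storage -type f -atime +90, just easier to bolt move logic onto later.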

PS: All our content, i.e. media, files, work etc., is presently backed up to iCloud and replicated to Google Drive. So this is just a local caching backup, and it's for the inevitable video content that makes iCloud-to-Google as a backup costly (more and more vids are shot in 4K with an iPhone or GoPro etc., and that adds up fast).
 
Last edited:

Dairyfarmer

Executive Member
Joined
Apr 17, 2016
Messages
6,213
Kicks off whatever script I write to back up everything needed.
Robocopy is what you want to use for that. The log option lets you output file and folder names to a text file.

Robocopy code in a .cmd file
Code:
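rem Copies this month's auto_*.cab backups into a per-month folder and appends to one log file (/log+:).
rem %date:~3,2% is assumed to give the two-digit month, i.e. a dd/mm/yyyy %date% format;
rem check `echo %date%` first, as the offset differs on other locales.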
robocopy "c:\Alpro\Backup" "G:\Backups\Month End Backups\2020\%date:~3,2%\Alpro backups" auto_2020-%date:~3,2%-*.cab /log+:"G:\Backups\Logs\Log.txt" /v

Log file
Code:
   ROBOCOPY     ::     Robust File Copy for Windows                          
-------------------------------------------------------------------------------

  Started : 11 July 2020 01:00:04
   Source : c:\Alpro\Backup\
     Dest : G:\Backups\Month End Backups\2020\07\Alpro backups\

    Files : auto_2020-07-*.cab
     
  Options : /V /DCOPY:DA /COPY:DAT /R:1000000 /W:30

------------------------------------------------------------------------------

                      16    c:\Alpro\Backup\
              same           6.9 m    auto_2020-07-01 00.03.cab
              same           6.8 m    auto_2020-07-02 00.03.cab
              same           6.8 m    auto_2020-07-03 00.03.cab
              same           6.8 m    auto_2020-07-04 00.03.cab
              same           6.8 m    auto_2020-07-05 00.03.cab
              same           6.8 m    auto_2020-07-06 00.03.cab
              same           6.7 m    auto_2020-07-06 09.33.cab
              same           6.7 m    auto_2020-07-07 00.04.cab
              same           6.7 m    auto_2020-07-07 04.04.cab
              same           6.7 m    auto_2020-07-08 00.04.cab
              same           6.7 m    auto_2020-07-08 04.04.cab
              same           6.8 m    auto_2020-07-09 00.04.cab
              same           6.8 m    auto_2020-07-09 04.04.cab
              same           6.8 m    auto_2020-07-10 00.04.cab
        New File             6.8 m    auto_2020-07-10 04.04.cab
 

InvisibleJim

Expert Member
Joined
Mar 9, 2011
Messages
2,927
Maybe have a look at Duplicati 2, which supports quite a few cloud storage providers including Azure, AWS, Dropbox, OneDrive, Google etc. It also has a command-line interface, so scripting an incremental backup to run when you connect the drive would be possible.

I was looking to see if there was any info on backing up to the Azure archive tier (roughly R35/TB/month) or Amazon Glacier. So far I found a workaround in this article from 2013 for Glacier, and something similar should work in Azure if you have a script to periodically set the files to the archive tier. If I come across anything that works a bit more out of the box I will post here.
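
Something like the sketch below could wrap the CLI to run when the drive is connected — note the duplicati-cli name and the backup <target-url> <source> / --passphrase invocation are my assumptions, so check them against the Duplicati 2 docs first:

Code:
#!/usr/bin/env python3
"""Run a Duplicati 2 backup when the external drive is mounted.

Sketch only: the duplicati-cli command name and its arguments are assumptions
to verify against the Duplicati documentation before use.
"""
import subprocess
import sys
from pathlib import Path

MOUNT_POINT = Path("/mnt/usb-backup")  # example mount point for the connected drive
SOURCE = "/data/storage"               # example source to back up

if not MOUNT_POINT.is_mount():
    sys.exit("backup drive is not mounted, skipping")

subprocess.run(
    [
        "duplicati-cli", "backup",
        "file://" + str(MOUNT_POINT / "duplicati"),  # local target; a cloud-storage URL could go here instead
        SOURCE,
        "--passphrase=changeme",                     # placeholder passphrase
    ],
    check=True,
)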
 

Dairyfarmer

Executive Member
Joined
Apr 17, 2016
Messages
6,213
@Dairyfarmer Alpro lol. I have to support it here as well. And dimssa.
I learned to use Alpro in 20 minutes. There is not much to learn. The DeLaval agents ask me for help now.

I also use FarmKeeper. It is still around and you can still activate it even if you can no longer buy licenses.
 
Last edited:

Dairyfarmer

Executive Member
Joined
Apr 17, 2016
Messages
6,213
@kolaval
You wouldn't know if there is an ISO code for the feeding WAIT (Step Up/Down Delay)? For feed 1 there is 300197 (Ration), 300198 (Target) and 300202 (Target Days), but to put in a Wait I still have to do it manually.
 

kolaval

Executive Member
Joined
May 13, 2011
Messages
8,985
@kolaval
You wouldn't know if there is an ISO code for the feeding WAIT (Step Up/Down Delay)? For feed 1 there is 300197 (Ration), 300198 (Target) and 300202 (Target Days), but to put in a Wait I still have to do it manually.
Nope sorry, I don't work with it, I just have to keep everything running.
I'm not farming, artificial insemination is not for me.
:sneaky:
 

krycor

Honorary Master
Joined
Aug 4, 2005
Messages
18,546
So that software is for backup.. but is there an app that automatically manages slow and fast disks dynamically?

Eg

/data/storage/..

/data/slow-disk/storage/..
/data/fast-disk/storage/..


So you access everything via the first location.. but there are actually two places where the data is stored, and the application moves or shuffles the data between them based on access frequency?

So new data is always pushed to the fast disk, and when it hasn't been accessed in 90 days it gets pushed to a 'slower' disk which is still accessible, but the link to the data remains the same.
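
Roughly the behaviour I mean, as a sketch — assuming both tiers are plain directories like the layout above, atime is usable, and a symlink left behind counts as 'the link stays the same' (paths and the 90-day cutoff are placeholders):

Code:
#!/usr/bin/env python3
"""Demote files not accessed in 90 days from the fast disk to the slow disk.

The file is moved to the same relative path on the slow disk and a symlink is
left behind, so the original path keeps working. Paths and the cutoff are
placeholders; assumes the fast disk is not mounted noatime.
"""
import shutil
import time
from pathlib import Path

FAST = Path("/data/fast-disk/storage")
SLOW = Path("/data/slow-disk/storage")
DAYS = 90

cutoff = time.time() - DAYS * 86400

for path in FAST.rglob("*"):
    if not path.is_file() or path.is_symlink():
        continue                       # skip directories and already-demoted files
    if path.stat().st_atime >= cutoff:
        continue                       # read recently enough, keep it on the fast disk
    dest = SLOW / path.relative_to(FAST)
    dest.parent.mkdir(parents=True, exist_ok=True)
    shutil.move(str(path), str(dest))  # copies across filesystems, then removes the original
    path.symlink_to(dest)              # original path still resolves, now to the slow disk

Promoting files back on access would need the same idea in reverse, or some union-style mount that presents both directories as one tree.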
 

SauRoNZA

Honorary Master
Joined
Jul 6, 2010
Messages
47,848
Take the drives out of the casings and put them inside the Microserver.

You can hook up 6 with a little effort and little cost.
 