I've been trying to back up my data for far too long and have struggled far too much trying to get Duplicati to work reliably.

I have about 3.5TB worth of data to back up, and an 8TB drive I'm trying to back it up to. The source data lives on my QNAP NAS, and the 8TB drive is hooked up to a Raspberry Pi at a friend's house offsite. Duplicati (running in a Linux container) was pulling in the source data via a Samba share, then uploading to the 8TB drive over SSH.

Problem is, while in theory everything is peachy, Duplicati just seems to fall flat on its face trying to handle this amount of data. Often the backup would stop at "Starting backup", or during some sort of file verification. Point is, only once in a while would the backup actually start successfully. If it's this finicky just to back up, I fear what will happen when I need to restore from it.

I used to use Crashplan with the computer-to-computer backup option, and it worked perfectly. Since that is no longer a viable option, I've been trying to replace it with Duplicati, but it's just not reliable. Ideally I'd like something like Crashplan/Duplicati where I can access the settings via a web interface and pick out just a few files I need to restore if required.

Duplicati has a decent GUI, an excellent feature set, active development, and is totally open; it's almost everything one would want from such a tool. Except that I suspect it's tested on (if not built for) very small (for us) datasets, probably some tens of gigs, maybe a few hundred GBs of documents one might have on a laptop with a "regular" (think not crazy expensive, although prices are falling) SSD. Nobody tells you that TBs or tens of TBs aren't supported, or that you should at least increase the blocksize if you want the SQLite database to still work at all, never mind a restore of a few TBs taking weeks. Probably because of the tiny datasets (and maybe because it's used mostly with cloud storage and not with great uplinks), nobody cares about performance.

I'm definitely open to any and all suggestions right now, because I have no backups.
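For anyone hitting the same wall with multi-TB sources, the blocksize tweak mentioned above is set through Duplicati's advanced options. The sketch below is illustrative, not the poster's actual command: the hostname, paths, passphrase, and sizes are made-up placeholders, and the exact defaults depend on your Duplicati 2.x version.

```shell
# Illustrative sketch only; host, paths, passphrase, and sizes are placeholders.
# --blocksize is the deduplication block size: every block gets a row in the
# local SQLite database, so a larger value keeps that database manageable for
# multi-TB backups (it cannot be changed after the first backup run).
# --dblock-size is the size of the volumes uploaded to the remote target.
duplicati-cli backup \
  "ssh://friends-pi.example.com//mnt/8tb/backup" \
  "/mnt/nas-share/" \
  --blocksize=1MB \
  --dblock-size=200MB \
  --passphrase="changeme"
```

Note that `--blocksize` must be chosen before the first backup; starting over with a bigger value is the only way to apply it to an existing backup set.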