Comments
I don't store anything there.
You could always encrypt your backups to feel safer about storing them there.
Only my gpg keys know.
Yes, why not?
Encrypted (gpg) tar backups, and you can access the ftp service from anywhere to recover them.
Note: whatever you plan to use it for, there's a limit on the number of files you can store (1,000, if I recall). It's intended for large, infrequent backup archives.
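The gpg + tar + FTP approach described above could be sketched roughly like this. The recipient address, source directory, and FTP credentials are placeholders, not anything from this thread:

```shell
#!/bin/sh
# Sketch of an encrypted tar backup pushed to an FTP space.
# RECIPIENT, SRC_DIR and the FTP details are hypothetical placeholders.
RECIPIENT="you@example.com"          # any public key in your keyring
SRC_DIR="/var/backups/mail"
OUT="backup-$(date +%Y%m%d).tar.gz.gpg"

make_backup() {
    # tar streams the archive to stdout; gpg encrypts it to the
    # recipient's public key, so no secret lives on the backup box
    tar czf - "$SRC_DIR" | gpg --encrypt --recipient "$RECIPIENT" --output "$OUT"
}

upload_backup() {
    # curl can upload straight to the FTP space
    curl -T "$OUT" --user backupuser:secret "ftp://ftp.example.com/backups/"
}
```

Drop a call to both functions at the end and run it from cron daily; the 1,000-file cap means you'd also want a step that deletes archives older than your retention window.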
@cochon @GM2015 , thanks - I need to read up on gpg backups.
File limit shouldn't be an issue. It's daily mail / DNS backups and I don't need more than 3 months.
I found their ftp space timed out too often with FileZilla from our residential connection in the UK. Connecting from their network or from Delimiter Atlanta gave no timeouts. I'd suggest using a SOCKS proxy (ssh port forwarding or danted) for manual file downloads.
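The ssh-port-forwarding variant of that suggestion might look something like this; the VPS hostname, FTP host, and port are made up for illustration:

```shell
#!/bin/sh
# Sketch: open a local SOCKS listener over ssh, then pull files through it
# when the direct residential route keeps timing out. Hosts are hypothetical.
start_proxy() {
    # -f: background after auth, -N: no remote command,
    # -D 1080: local SOCKS5 listener on port 1080
    ssh -f -N -D 1080 user@your-vps.example.com
}

fetch_via_proxy() {
    # socks5h:// makes curl resolve DNS through the tunnel as well
    curl --proxy socks5h://127.0.0.1:1080 -O "ftp://ftp.example.com/backups/$1"
}
```

FileZilla can use the same listener: set a SOCKS5 proxy of 127.0.0.1:1080 in its connection settings.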
I googled these when I started with gpg:
You need to know that encrypting backups with your gpg public key (which can be shared anyway) is good enough.
http://beta.pastee.com/api/get/4dv54/raw
http://beta.pastee.com/api/get/b45sf/raw
http://beta.pastee.com/api/get/s2jze/raw
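The point about the public key being enough comes down to the asymmetry: anything can encrypt with the public key, but only the private key decrypts. A minimal sketch, with key id and filenames as placeholders:

```shell
#!/bin/sh
# Sketch of the public/private split. Key id and file names are placeholders.
encrypt() {
    # needs only the public key, so it's safe to run on the backup box
    gpg --encrypt --recipient you@example.com --output "$1.gpg" "$1"
}

restore() {
    # needs the private key (and its passphrase), so run this at home,
    # never on the remote box
    gpg --decrypt "$1" | tar xzf -
}
```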
This is something I wrote to back up from Windows with Cygwin, but I failed to get cron working on Windows. Or you can just write a new one from scratch to fit your scripting style.
Anyway, this has rsync and ftp upload automated, plus curl upload to WebDAV storage like ownCloud. I got the curl-to-WebDAV sample from the 1TB free NL storage thread.
http://beta.pastee.com/292sq
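The pieces mentioned (rsync, ftp upload, curl to WebDAV) could be stitched together roughly like this; all hosts, paths, and credentials below are invented for the sketch, not taken from the pasted script:

```shell
#!/bin/sh
# Sketch of a combined backup script: rsync a local mirror, then push the
# encrypted archive by FTP and by WebDAV (ownCloud). Everything is a placeholder.
ARCHIVE="backup-$(date +%Y%m%d).tar.gz.gpg"

mirror() {
    # keep a local mirror of the data before archiving it
    rsync -az --delete /var/backups/ /srv/mirror/
}

ftp_upload() {
    curl -T "$ARCHIVE" --user backupuser:secret "ftp://ftp.example.com/backups/"
}

webdav_upload() {
    # ownCloud exposes WebDAV under remote.php/webdav; curl -T issues a PUT
    curl -T "$ARCHIVE" --user clouduser:secret \
        "https://cloud.example.com/remote.php/webdav/$ARCHIVE"
}
```

On Linux a crontab entry runs it nightly; on Windows, since cron under Cygwin was the sticking point, Task Scheduler calling `bash script.sh` is the usual workaround.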