Automating downloading / syncing. Help?!

eastonch Member
edited July 2012 in General

So anyway... I'm thinking about making a small script that will periodically download files from this location:

which has files on: However, when I try to access this area anonymously through an FTP client, I can't view or download anything, but if I connect directly through it via then I can.

My question is: is there a way for me to pick up every single file name from and then run a command to "wget" them all and place them on my server, but only download 'new' ones?

The reason for this is that when a new beta patch is released, DayZ (a mod) tends to update to support it, which means popular servers are running the latest patches and the beta patch site becomes incredibly slow under load. If I could run a script that checks every 6 hours and wgets any new patch, that would save me time and let it be downloaded from a more "unused" location, and it also allows me to advertise my "unofficial" mirror online.

Thanks, Chris.

Initial thoughts were to just wget recursively using a lengthy command I found, but that doesn't seem to work, since it can't get a directory listing of; I've contacted the devs to see if there's a way I can do this as a secondary mirror for myself, but as of yet I've heard nothing back.

Security Consultant
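
A minimal sketch of the periodic-check idea described above. The URL and mirror directory here are placeholders (the real beta-patch location isn't shown in the thread), and the script only prints the wget command as a dry run:

```shell
#!/bin/sh
# Sketch of a mirror script -- SOURCE_URL and MIRROR_DIR are placeholders,
# not the actual beta-patch site from the thread.
MIRROR_DIR="${MIRROR_DIR:-/tmp/dayz-mirror}"
SOURCE_URL="${SOURCE_URL:-http://example.com/beta/}"

mkdir -p "$MIRROR_DIR"

# Flags:
#   -r            recurse through the listing page
#   --follow-ftp  follow FTP links found in the HTML
#   -np           never ascend to the parent directory
#   -nd           flatten everything into MIRROR_DIR
#   -nc           no-clobber: skip files already present (the "only new ones" part)
CMD="wget -r -np -nd -nc --follow-ftp -P $MIRROR_DIR $SOURCE_URL"
echo "$CMD"   # dry run; replace the echo with eval "$CMD" to actually download
```

To run it every 6 hours, a crontab entry along these lines would do (the script path is hypothetical): `0 */6 * * * /usr/local/bin/mirror-dayz.sh`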


  • wget -r --follow-ftp

    will grab all the files linked to on that page, though I have no idea how to get timestamping working, since you can't access the FTP root. Could try messing with this?

    "We are in a prison drama. This is like The Shawshank Redemption, only with more tunneling through shit and no fucking redemption."
  • Wow, that's actually working. Thanks a lot, mate! What would be the command to grab the .log files too?


  • Taz Disabled
    edited July 2012


    Time is good and also bad. Life is short and that is sad. Dont worry be happy thats my style. No matter what happens i won't lose my smile!

  • @ihatetonyy it won't allow me to use the -N feature to check whether it's newer or not on my end. Is this something that will work after the first download?


  • Throw in -A.log (I think, anyway) to get the logs as well. -N won't work for FTP downloads unless wget can access the FTP listing, it seems. If nothing changes in the old files, you could try -nc to just have it skip downloading existing files?

  • Ah, that looks like fun; so it just matches the filenames?

    Nothing will be changing, since they're just builds, not edited builds.

    Also, I tried to get --timestamping working, but that's obviously irrelevant if I'm just using -nc.

    :'] I'll try and let you know. @ihatetonyy

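
On the "so it just matches the filenames?" question: yes. wget's `-A`/`--accept` list does plain filename matching — entries with no wildcard characters are treated as suffixes, and entries containing `*` or `?` as shell-style patterns. A rough illustration of that rule in plain shell (the `matches_accept_list` function is mine, not part of wget):

```shell
# Rough illustration of how an accept list like -A ".zip,.log" is applied:
# entries with no wildcard are suffix matches, entries with * or ? are globs.
matches_accept_list() {
    name=$1; shift
    for pat in "$@"; do
        case "$pat" in
            *\**|*\?*) case "$name" in $pat)    return 0 ;; esac ;;  # glob match
            *)         case "$name" in *"$pat") return 0 ;; esac ;;  # suffix match
        esac
    done
    return 1
}

matches_accept_list "dayz_build.log" ".zip" ".log" && echo "dayz_build.log: accepted"
matches_accept_list "readme.txt"     ".zip" ".log" || echo "readme.txt: rejected"
```

So `-A.log` on its own grabs only the logs; combining the suggestions from this thread into one accept list, something like `-A ".zip,.log"` (exact suffixes depend on what the build files are actually named) would pull builds and logs together.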
