Automating downloading / syncing. Help?!

eastonch Member
edited July 2012 in General

So anyway... I'm thinking about making a small script that will periodically download files from this location: http://www.arma2.com/beta-patch.php/

which has files on: ftp://downloads.bistudio.com/arma2.com/update/beta/ However, when I try to access that directory anonymously through an FTP client, I can't view or download anything; but if I connect directly to a file, e.g. ftp://downloads.bistudio.com/arma2.com/update/beta/ARMA2_OA_Build_95208.zip, then I can.

My question is, is there a way for me to pick up every single file name from arma2.com/beta-patch.php and then run a command to wget them all onto my server, but only download 'new' ones?

The reason for this is that when a new beta patch is released, DayZ (a mod) tends to update to support it, which means popular servers run the latest patches and the beta-patch site becomes incredibly slow under load. If I could run a script that checks every 6 hours and wgets any new patch, I could download it from a less-used location, and it would also let me advertise my "unofficial" mirror online.

Thanks, Chris.

Initial thoughts were to just wget recursively using a lengthy command I found, but that doesn't seem to work, because it can't get a directory listing of ftp://downloads.bistudio.com/arma2.com/update/beta/. I've contacted the devs to see if there's a way I can set up a secondary mirror for myself, but as of yet I've heard nothing back.
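A minimal sketch of the kind of job described above, assuming the beta-patch page links the archives directly. The mirror directory is a placeholder, and the script only prints the wget command rather than running it:

```shell
#!/bin/sh
# Sketch of the mirror job described above. MIRROR_DIR is a placeholder,
# not a path from the thread; the wget flags are standard GNU wget options.
MIRROR_DIR="/var/www/arma2-mirror"
PAGE_URL="http://www.arma2.com/beta-patch.php"

# -r -l1 : follow links found on the page, one level deep
# -nd    : don't recreate the remote directory tree locally
# -nc    : no-clobber, skip files already downloaded (the "only new ones" part)
# -A     : accept only .zip archives
# --follow-ftp : follow links that point at FTP URLs, like the build zips
echo wget -r -l1 -nd -nc -A .zip --follow-ftp -P "$MIRROR_DIR" "$PAGE_URL"
```

The `echo` is a dry-run guard so the sketch prints the command instead of hitting the network; drop it (and create `$MIRROR_DIR` first with `mkdir -p`) to fetch for real.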

Systems Administrator | IWFHosting

Comments expressed are solely my own opinion and not of that of the companies, unless stated.

Comments

  • wget -r -A.zip --follow-ftp http://www.arma2.com/beta-patch.php/
    

    will grab all the files linked to on that page, though I have no idea how to get timestamping working, since you can't access the FTP root. Could try messing with this?

    "We are in a prison drama. This is like The Shawshank Redemption, only with more tunneling through shit and no fucking redemption."
  • wow, that's actually working. Thanks a lot, mate! What would be the command to grab the .logs too?


  • Taz Disabled
    edited July 2012

    Nvm.

    Time is good and also bad. Life is short and that is sad. Dont worry be happy thats my style. No matter what happens i won't lose my smile!

  • @ihatetony it won't let me use the -N flag to check whether a file is newer on my end. Is this something that will work after the first download?


  • Throw in -A.log (I think, anyway) to get the logs as well. -N won't work for FTP downloads unless wget can access the FTP listing, it seems. If nothing changes in the old files, you could try -nc to just have it skip downloading existing files?

  • Ah; that looks like fun, so it just matches the filenames?

    Nothing will be changing, since they're just builds, not edited builds.

    Also, I tried to get --timestamping working, but that's obviously irrelevant if I'm just using -nc.

    :'] I'll try and let you know. @ihatetonyy

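Putting the thread's suggestions together as the kind of scheduled job the original poster wanted: a crontab entry running every 6 hours. The schedule and destination path are examples, and -N is left out since wget can't read the FTP listing to compare timestamps:

```shell
# Example crontab line (path and schedule are placeholders): every 6 hours,
# quietly fetch any newly linked .zip or .log files, skipping files that
# have already been mirrored (-nc).
0 */6 * * * wget -q -r -l1 -nd -nc -A .zip,.log --follow-ftp -P /var/www/arma2-mirror http://www.arma2.com/beta-patch.php
```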
