Sort of Server Deployment - Automatic Updates of Web Server Folder/Files?
I've been trying to come up with ways in which a web server could be set up to retrieve the latest version of a website from a remote source. The idea is that multiple servers could grab their relevant files from their remote directory (or a single archive file) when updated.
I was thinking of going with SSH, but firewalls might not allow access at some server locations.
So now I'm thinking of something like a script using wget -N to fetch the latest tar file, then removing the old files and replacing them with the new ones in the folder. Git would be a possibility too.
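Something like this rough sketch is the kind of thing I mean (the URL and paths are just placeholders) - run it from cron, and wget -N only re-downloads the tarball when the remote copy is newer:

```
#!/bin/sh
# Sketch of the wget -N idea -- URL and paths are placeholders.
set -e

CACHE=/var/cache/site-update
WEBROOT=/var/www/site
URL=http://files.example.com/site.tar.gz  # hypothetical remote source

mkdir -p "$CACHE"
cd "$CACHE"

# -N (timestamping): only download if the remote file is newer than our copy.
wget -N "$URL"

# Deploy only when the tarball is newer than the last-deployed marker.
if [ site.tar.gz -nt .last-deployed ]; then
    rm -rf new && mkdir new
    tar -xzf site.tar.gz -C new
    rm -rf "$WEBROOT.old"
    mv "$WEBROOT" "$WEBROOT.old" 2>/dev/null || true
    mv new "$WEBROOT"   # swap the new files into place
    touch .last-deployed
fi
```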
Maybe Ansible?
I know that there are loads of deployment solutions but I can't find an open-source one that can be set up to just update a website folder at regular intervals.
Does anyone have any advice?
Thanks for reading
Comments
https://www.lowendtalk.com/discussion/70175/caddy-0-8-a-web-server-with-automatic-https-via-let-s-encrypt#latest
Use Git to store the site; Caddy will pull it from Git for you.
That's in no way related to the question.
facepalm
Take a look at syncthing
syncthing looks like it might do the job, thanks. I'll need to read up on it as I'm still not entirely sure exactly what it does, but as long as it can run headless on the 'local' web server side, I'd be happy with it.
Git is a very good solution; I don't see what any open-source tool can do here that Git can't. A simple auto-deploy script should do it. An SSH key pair in GitHub and an auto-pull every minute or five via cron beats going through custom deployment application development (reinventing the wheel).
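For example, a single crontab entry on each web server is enough (the repo path and branch here are assumptions):

```
# Pull every 5 minutes; --ff-only refuses to merge if histories have diverged.
*/5 * * * * cd /var/www/site && git pull --ff-only origin master >> /var/log/site-pull.log 2>&1
```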
BitTorrent Sync can be a better option. It gives you more control over choosing which servers to include during deployment.
You could use a git repo for this and just set up a few hooks; that's what I'm doing for my cluster.
I've started working with Ansible, using this playbook: https://github.com/ansible/ansible-examples/tree/master/lamp_haproxy
Basically, it rolls the update out to one server at a time, moving on only when the previous one is done, so if you have at least 2 servers you won't have any downtime.
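For anyone who hasn't seen a rolling update before, this is roughly what the playbook does, sketched as a plain shell loop (hostnames and paths are made up) - Ansible handles it for you via the play's serial setting:

```
#!/bin/sh
# Conceptual sketch of a rolling update -- hostnames and paths are hypothetical.
# Updating one server at a time means the rest keep serving traffic.
for host in web1.example.com web2.example.com; do
    ssh "$host" 'cd /var/www/site && git pull --ff-only' || exit 1
    # Basic health check before moving on to the next server.
    curl -fsS "http://$host/" > /dev/null || exit 1
done
```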
If I couldn't use SSH due to firewall rules etc., and would prefer the 'local' web servers to initiate contact with the host server containing the files (they might never have connected before), ideally over HTTP for firewall reasons, is there anything made to work like that?
Perhaps using wget or similar?
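One idea: poll a small version file over plain HTTP and only fetch the archive when it changes (the host and file names here are made up):

```
#!/bin/sh
# Poll over plain HTTP from the web server's side; no SSH needed.
# files.example.com and the file names are hypothetical.
set -e
cd /var/cache/site-update

wget -q -O version.new http://files.example.com/version.txt

# Only download and unpack the archive when the version file has changed.
if ! cmp -s version.new version.current; then
    wget -q -O site.tar.gz http://files.example.com/site.tar.gz
    tar -xzf site.tar.gz -C /var/www/site
    mv version.new version.current
fi
```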
Why not use rsync?
I use Fabric to run commands on multiple machines; it's a Python library and very straightforward to use.
The servers authenticate through SSH keys and you could just use the put command in your fabfile.py to replace the files.
I thought rsync only worked over SSH. If it can operate differently, that's interesting.
The worry is that SSH might be blocked. I have to assume I won't be able to change ports etc. at the firewall.
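For what it's worth, rsync doesn't have to run over SSH: it also has a daemon mode, where the source machine runs rsyncd on TCP port 873 and clients pull with rsync:// URLs, no SSH involved (port 873 would need to be open instead). A minimal sketch with made-up host and module names:

```
# Source side: /etc/rsyncd.conf exposing /srv/site read-only as module "site",
# then run:  rsync --daemon   (listens on TCP 873)
#   [site]
#       path = /srv/site
#       read only = yes

# Each web server pulls changes, deleting files that were removed upstream:
rsync -az --delete rsync://files.example.com/site/ /var/www/site/
```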
Lots of devops tools do this using Git. DeployHQ is good and free. There's also Codeship and some others.
Have you considered using something like a docker image?
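If the servers can reach a registry over HTTPS (port 443, so usually firewall-friendly), a cron job that pulls an image containing the site would do it - a rough sketch, with a hypothetical image name:

```
#!/bin/sh
# Recreate the web container only when a newer image was actually pulled.
# example.com/site:latest is a hypothetical image name.
before=$(docker inspect --format '{{.Id}}' example.com/site:latest 2>/dev/null || true)
docker pull example.com/site:latest
after=$(docker inspect --format '{{.Id}}' example.com/site:latest)

if [ "$before" != "$after" ]; then
    docker rm -f site 2>/dev/null || true
    docker run -d --name site -p 80:80 example.com/site:latest
fi
```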
I think @ALinuxNinja was referring to the ability for Caddy (the web server) to automatically pull the latest changes after a git push: https://caddyserver.com/docs/git
Leaving the .git folder in the web root is a very bad idea.
Try using a deploy tool like Fabric or Capistrano.
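One common way around it is a bare repo outside the webroot with a detached work tree, so the deployed directory never contains .git at all (paths here are hypothetical); the same two deploy commands also work nicely inside a post-receive hook:

```
# One-time setup: a bare repo outside the webroot.
git clone --bare https://example.com/site.git /srv/site.git

# Deploy/update: fetch, then check the files out into the webroot.
# No .git directory ever ends up under /var/www/site.
git --git-dir=/srv/site.git fetch origin master:master
git --git-dir=/srv/site.git --work-tree=/var/www/site checkout -f master
```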