How do you deploy to your web document root?
Hi, I'm configuring a Debian 6 PHP/MySQL installation to host my various PHP projects and homepage, and I'm now deciding how to remotely access the server's document root. Do you use old-style FTP? Samba? Or some new fancy thing like Dropbox?
At work I use Samba, and it's very comfortable from Windows: the server document root is mounted like any other drive, so I can tinker around from the Windows file manager.
Comments
I use sftp
I usually use rsync when on a Linux desktop, and CoreFTP + SFTP when on a Windows desktop. The advantage of the former is that the transfer is compressed and generally faster, especially with lots of small files. As far as I know, using rsync is roughly similar, in speed and in how it works, to tar.gz-ing a directory, uploading it, and unpacking it on the remote server.
rsync also.
Or finish the project then tar.gz it up and move it all at once... :-)
SCP (WinSCP) via my designated user account created via virtualmin. Pretty much the only way I do it.
Straight up FTP baby!
SFTP plugin for Notepad++
vsftpd set up with SSL has some session-reuse key bug or something, so disabling that makes it really slow to use (at least with CoreFTP).
I just zip -r -P, then wget to another location and unzip. You don't need any fancy sftp at all.
Actually, another setup I made for a specific project was just pushing the code to a git repository and having a bash script on the server that pulled from the git repo and then rsynced the needed files (locally!) to the document root, excluding config files etc.
Simply doing
ssh [email protected] deploy
would be enough to run the deployment script that did the above.

Either Dropbox (drag → copy link → wget) or scp, depending on OS (pscp is not always handy on Windows), or direct vi/nano.
fish from Midnight Commander most of the time, as I am constantly changing something. When I only transfer a picture or a little movie, simple scp is OK.
I highly recommend Midnight Commander as it is both powerful and user-friendly, especially for newbies. Back in the day it was easy to show it to people starting on Linux, as Norton Commander was still in use by Windows people; today...
@Maounique Can you transfer multiple files at once using Midnight Commander? Uploading several thousand files one by one is painfully slow, and no, zip, tar, etc. are usually not an option.
Ehm, why not an option? mc can do that too.
I need to transfer many files between Server A and Server B. Zip is not an option because I don't have access to Server B to unzip the files later (only FTP).
I tried to use mc on Server A, but it only sends the files one by one (I couldn't find a way to make multiple connections to Server B and send more than one file in parallel, and sending files one by one is slow, as FTP sucks).
Right now I download the files to my home machine and use FileZilla, which can make ~30 connections at once, making it about 30 times faster, but it would be more convenient to do it directly from Server A.
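If lftp happens to be installed (or installable) on Server A, its `mirror` command can push over plain FTP with several parallel connections, similar to what FileZilla does. The host, credentials, and paths below are placeholders.

```shell
# Hypothetical helper around lftp's parallel mirror: --reverse pushes
# local -> remote, --parallel transfers several files at once.
# Host, user, password, and paths are placeholders.
parallel_ftp_push() {
    host=$1; user=$2; pass=$3; src=$4; dst=$5
    lftp -u "$user,$pass" \
         -e "mirror --reverse --parallel=10 $src $dst; quit" "$host"
}
# Example (would actually connect):
#   parallel_ftp_push ftp.serverb.example ftpuser secret /home/me/files /public_html
```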
Assuming you are coding a custom app, git pull from your code repo.
@vedran if you only have FTP... well, then FTP it is; to use fish you need some SSH access. I thought you already had that, sorry.
You can use Dropbox! It is super easy to set up, and deployment is instant.
svn export when I'm doing things "right"; SFTP when I'm not.

Share a folder (not your webroot) via protocol of choice (say /home/share) and let that rsync with your webroot (and make separate backups of your webroot).
I SFTP to the server, and these days I don't really update my websites (I'll get round to doing that some day), so I just archive 'em, rsync or scp them to the new server, and voilà!
On this note, never ever git push to your production server. It's too easy to mess something up by picking the wrong remote (or by unexpected behaviour from Git), and having to actually log on to the production server to run the command puts you in the right 'context': you're aware that the changes you make apply to that production server rather than your local machine.
Coda for Mac does it all.
Capistrano is great for deployments
Sorry to revive this thread; just to let you know what I finally came up with. After posting this thread I realized I wanted to be able to deploy from my workplace, which is behind a proxy that only lets through ports 80 and 443.
I already had dropbear listening on 443, so I set up WebDAV on nginx with basic authentication over SSL (SSL served on port 80, just on a different subdomain, e.g. dav.myserver.com). It's working really well so far, and on the client side I use the www.bitkinex.com WebDAV client on Windows, which is freeware and a very nice piece of software.
I was able to set up full WebDAV support on nginx thanks to webdav-nginx-play-nice (check out the comments too).
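For anyone curious, the nginx side of such a setup might look roughly like the sketch below. The server name, certificate paths, and htpasswd file are assumptions; also note that nginx's built-in dav module only covers PUT/DELETE/MKCOL/COPY/MOVE, so full WebDAV (PROPFIND, etc.) still needs the extra module or workarounds the link above describes.

```nginx
# Sketch of WebDAV over SSL on a nonstandard port, as described above.
# Names and paths are placeholders.
server {
    listen 80 ssl;                      # SSL served on port 80, per the post
    server_name dav.myserver.com;

    ssl_certificate     /etc/nginx/ssl/dav.crt;
    ssl_certificate_key /etc/nginx/ssl/dav.key;

    location / {
        root /var/www;

        # built-in ngx_http_dav_module methods
        dav_methods PUT DELETE MKCOL COPY MOVE;
        create_full_put_path on;
        dav_access user:rw group:rw all:r;

        auth_basic           "Restricted";
        auth_basic_user_file /etc/nginx/htpasswd;
    }
}
```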
As a note, my server is a Debian 6 setup with Tiger's VPS scripts, which are also very nice.
TL;DR: I was able to serve web pages and deploy them to the server over the same single port (80), thanks to the awesome WebDAV protocol that's partially built into nginx.