Comments
Congrats to Abhay Bhushan on FTP's 50th anniversary.
PASV mode?
I discontinued my FTP site in 2011, and instead directed readers to download from Microsoft SkyDrive.
Too many complaints about firewall problems, not working over IPv6, files corrupted due to downloading in ASCII mode…
Meanwhile, premium providers still use FTP on their DirectAdmin shared hosting plans, but it's running over TLS now.
Using FTP every single day, just because it works, without issues. Though never for sensitive or critical data.
Flashbacks to how I used to maintain sites in the early 2000s... Uploading files using WS_FTP, with no encryption of course
SourceForge used to have a weird publish flow where you'd upload files via FTP, then select them in a web UI to associate them with your project. It was useful because there wasn't really any support for resuming or showing progress for HTTP uploads back then.
I don't think I've used FTP in 15 years or more... SFTP (via SSH) has pretty much replaced it. FTPS (FTP over SSL) is OK, but it still has issues with firewalls.
Good ol' times...
I get an "access denied" error on your image, so have a screenshot I found online:
It is/was a similar screenshot I stole from another website... so nothing fancy.
May I ask which country you're from? My poor man's image hosting runs on a shared webhost and they thought it would be a good idea to implement some geo-blocking to "protect their customers" as they said. I guess I have to move things...
I'm from Australia but I'm currently living in the USA.
On Linux, back in the day, I was a fan of the NcFTP client, but nowadays I use lftp. My first graphical ftp client was gFTP, which I found super cool and intuitive.
I still operate an anonymous FTP server, so FTP is still very much alive for me.
FTP sends your username and password in cleartext. It's not secure.
My Rclone sync backup is using FTP, so far no problem found.
There's no good reason to use FTP in 2021.
Switch to SFTP.
FTP has the worst design... You connect on port 21, then there's a RETURN connection back to you on port 20. It blows up with NAT or firewalls unless you run an FTP proxy on your firewall. PASV mode is worse... The connection is made in the correct direction now, but it's made to and from completely random ports, so there's no way to write an ACL that allows just PASV FTP without opening up almost all outgoing connections.
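To make the "random ports" point concrete: in PASV mode the server picks an arbitrary high port and encodes it in its 227 reply, which the client has to parse. A minimal sketch of that parsing (not tied to any particular FTP client):

```python
import re

def parse_pasv_reply(reply: str) -> tuple[str, int]:
    """Parse a '227 Entering Passive Mode (h1,h2,h3,h4,p1,p2)' reply.

    The data port is p1*256 + p2 -- effectively a random high port
    chosen by the server, which is why a static firewall ACL can't
    pin PASV FTP down without opening a huge range.
    """
    m = re.search(r"\((\d+),(\d+),(\d+),(\d+),(\d+),(\d+)\)", reply)
    if not m:
        raise ValueError("not a PASV 227 reply")
    h1, h2, h3, h4, p1, p2 = map(int, m.groups())
    return f"{h1}.{h2}.{h3}.{h4}", p1 * 256 + p2

# e.g. parse_pasv_reply("227 Entering Passive Mode (192,168,1,2,195,80)")
# -> ("192.168.1.2", 50000)
```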
HTTP is better in almost every way, EXCEPT nobody ever had the good sense to make a command-line HTTP file-transfer client. Without that, FTP remains relevant. It's just so much easier to browse an FTP site, pick out some files you want, and download them.
SFTP has the overhead (and lag) of encryption, not to mention protocol mismatches, and just various interoperability problems. Great for security but terrible for performance when you don't need it, like downloading public files.
FTP is the best option for a lot of stuff. All the HTTP based Android file sharing apps I see have file size limits or a terrible, slow interface. Several good FTP based ones.
Which by the way is exactly the same with HTTP and many other protocols.
That's why people invented TLS, which (thanks to the layered OSI model) works perfectly fine with many insecure protocols like HTTP, SMTP and FTP. A L7 protocol doesn't need to provide encryption in its own design as long as it's compatible with TLS underneath.
But of course as with the other protocols as well, one should take care to always use the version ending with an S.
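As a sketch of how little the application protocol changes when you add the "S": Python's stdlib exposes explicit FTPS (AUTH TLS) as a drop-in subclass of its plain FTP client. The host and credentials below are placeholders, not a real server:

```python
import ssl
from ftplib import FTP_TLS

def secure_listing(host: str, user: str, password: str) -> list[str]:
    """List a directory over explicit FTPS: the same FTP commands,
    just wrapped in TLS -- the 'version ending with an S'."""
    ctx = ssl.create_default_context()
    with FTP_TLS(host, context=ctx) as ftps:
        ftps.login(user, password)
        ftps.prot_p()  # encrypt the data channel too, not just the control channel
        return ftps.nlst()
```

Note that `prot_p()` matters: without it only the control connection (commands, passwords) is encrypted, while file transfers still go in the clear.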
Thanks @brueggus and @Daniel15 for the sweet memories of publishing files on Sourceforge with WS_FTP.
Similar to how FreeBSD finally dropped kernel support for 3C509 NICs a few days ago in the new v13 release...
FTP was invented before firewalls.
NAT is evil.
PASV mode can be made firewall-compatible by setting a narrow range of ports.
I set up vsftpd for a computer networking class project at the University of Arizona.
I asked for 100 ports to be opened in the department firewall between the student computer lab and the server, and used 10 of them for PASV mode.
https://github.com/yoursunny/VNL/blob/7753882157f75aacbdb6cf02431a7624bff4228f/guest-apps/vsftpd.conf
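The relevant knobs are vsftpd's PASV port-range options. A hypothetical fragment (the port numbers are made up, not the ones from the linked config):

```ini
# Pin PASV data connections to a small, known range so the firewall
# only needs these ports open (plus 21 for the control connection).
pasv_enable=YES
pasv_min_port=50000
pasv_max_port=50009
# If the server sits behind NAT, also advertise the public address:
# pasv_address=203.0.113.10
```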
WebDAV and rclone mount.
Nowadays, everything needs encryption, including public websites serving static files.
It's for privacy, so that your neighbor doesn't know which software package you are downloading.
Android is terrible.
Switch to the PinePhone, a real Linux machine on which you can install normal packages.
Or just use Termux.
With PinePhone, I can:
You can't do most of those in the Termux app.
Used FTP since the 90s and still today!
Recently built a house surveillance camera system that uses lightweight FTP to upload realtime video to a storage server in a datacenter. Cheaper, higher video quality, and longer retention than 24/7 video hosting services (Nest Cam, Ring cloud, Arlo cloud, etc.).
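A minimal sketch of what such a camera-to-datacenter uploader could look like -- the host, credentials, and directory layout here are my assumptions, not the poster's actual setup:

```python
from datetime import datetime
from ftplib import FTP

def clip_remote_path(camera: str, ts: datetime) -> str:
    """One directory per camera per day (an assumed layout)."""
    return f"/{camera}/{ts:%Y-%m-%d}/{ts:%H%M%S}.mp4"

def upload_clip(host: str, user: str, password: str,
                camera: str, local_file: str, ts: datetime) -> None:
    """Push one recorded clip to the storage server over plain FTP.

    Assumes the per-day directory already exists on the server;
    a real system would create it first (ftp.mkd) or handle the error.
    """
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(local_file, "rb") as f:
            ftp.storbinary(f"STOR {clip_remote_path(camera, ts)}", f)
```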
I just read that Mozilla are removing the FTP client from Firefox. I'm surprised it was still there... I don't think any other browsers still have FTP clients built-in! https://www.zdnet.com/google-amp/article/mozilla-to-start-disabling-ftp-next-week-with-removal-set-for-firefox-90/
It was invented before NAT was commonplace, back when every device on the internet had a public IP address. Also, it wasn't really much of a problem for firewalls; you just had to open port 20.
I wish that IPv6 had become widespread much earlier so NAT could have been totally avoided... There are a number of pain points with NAT.
Encryption doesn't have much overhead on modern processors with AES-NI extensions (which means it's hardware-accelerated).
Another dick move by Mozilloogle Megacorp. Oh, well. Just quit updating, and you get to keep any required protocol (and let's butfuc all the paranoids out there).
The problem (and the bottleneck) isn't the data encryption. It's the session establishment (key exchange, etc.).
It's fine on local or private networks and saves the overhead.
You'd have to be connecting to a 486 or do a massive amount of transfers before SFTP overhead would be an issue.
I mean, yes, rsh and rexec and telnet and root accounts without passwords are fine on private networks as well, but why?
You'd think that, but no, it's more hassle and slower. The application is only about 50 text files, so it's a matter of seconds, which I could live with, but it was definitely slower. You're adding protocol overhead, not just per-packet processing.
Well, because it's supported and it works. I'm working with carrier-grade radios used by cellular operators (i.e., $10k a link) that have telnet (VLAN access only), as does nearly all carrier gear. SSH is available with a license upgrade, but it doesn't work with plink (some sort of interactive requirement), so telnet is preferred for automated testing.
Embedded devices don't always have factory-installed unique SSH keys that persist across reboots and firmware upgrades, so fingerprints change, which ends up training users to auto-accept any security prompt, or breaks things with unexpected fingerprint prompts.
Tl;dr: automation, QA, and testing, for starters. Tons of other reasons. Factory testing and configuration, etc. SSH adds nothing but hassle in cases where you have private access.