
50 years of FTP

valk Member

The original specification for the File Transfer Protocol was written by Abhay Bhushan and published as RFC 114 on 16 April 1971.

FTP is sucks when firewall :trollface:

Comments

  • Congrats to Abhay Bhushan on FTP's 50th anniversary.

    Thanked by: valk
  • @valk said: FTP is sucks when firewall

    PASV mode?

  • yoursunny Member, IPv6 Advocate

    I discontinued my FTP site in 2011, and instead directed readers to download from Microsoft SkyDrive.
    Too many complaints about firewall problems, not working over IPv6, files corrupted due to downloading in ASCII mode…

    Meanwhile, premium providers are still using FTP on their DirectAdmin shared hosting plans, but it's running over TLS now.

    Thanked by: coreflux
  • Using FTP every single day, just because it works without issues. Though never for sensitive or critical data.

    Thanked by: brueggus, valk, chrisp
  • Daniel15 Veteran
    edited April 2021

    Flashbacks to how I used to maintain sites in the early 2000s... Uploading files using WS_FTP, with no encryption of course :smiley:

    SourceForge used to have a weird publish flow where you'd upload files via FTP, then select them in a web UI to associate them with your project. It was useful because there wasn't really any support for resuming or showing progress for HTTP uploads back then.

    I don't think I've used FTP for 15 years or more... SFTP (via SSH) has pretty much replaced it. FTPS (FTP over SSL) is OK, but it still has issues with firewalls.

    Thanked by: angstrom, yoursunny, valk
  • brueggus Member, IPv6 Advocate

    @Daniel15 said: Flashbacks to how I used to maintain sites in the early 2000s... Uploading files using WS_FTP, with no encryption of course

    Good ol' times...

  • @brueggus said:

    @Daniel15 said: Flashbacks to how I used to maintain sites in the early 2000s... Uploading files using WS_FTP, with no encryption of course

    Good ol' times...

    I get an "access denied" error on your image, so have a screenshot I found online:

  • brueggus Member, IPv6 Advocate

    @Daniel15 said: I get an "access denied" error on your image, so have a screenshot I found online:

    It is/was a similar screenshot I stole from another website... so nothing fancy. :|

    May I ask which country you're from? My poor man's image hosting runs on a shared webhost, and they thought it would be a good idea to implement some geo-blocking to "protect their customers", as they put it. I guess I have to move things...

    Thanked by: mike1s
  • @brueggus said: May I ask which country you're from?

    I'm from Australia but I'm currently living in the USA.

    Thanked by: brueggus
  • angstrom Moderator

    @Daniel15 said: Flashbacks to how I used to maintain sites in the early 2000s... Uploading files using WS_FTP, with no encryption of course

    On Linux, back in the day, I was a fan of the NcFTP client, but nowadays I use lftp. My first graphical FTP client was gFTP, which I found super cool and intuitive. :)

    I still operate an anonymous FTP server, so FTP is still very much alive for me. :)

    Thanked by: skorous, Shot2, valk
  • FTP sends your username and password in cleartext. It's not secure.
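    To illustrate the point: the FTP control channel is plain text, so the USER and PASS commands (and therefore the credentials) are readable by anyone on the path. A minimal sketch with Python's socket module; the hostname is a placeholder:

    ```python
    import socket

    # Connect to the FTP control channel (ftp.example.com is a placeholder).
    with socket.create_connection(("ftp.example.com", 21), timeout=10) as s:
        print(s.recv(1024).decode())              # "220 ..." server banner
        s.sendall(b"USER anonymous\r\n")          # username crosses the wire as-is
        print(s.recv(1024).decode())              # "331 Please specify the password."
        s.sendall(b"PASS guest@example.com\r\n")  # ...and so does the password
        print(s.recv(1024).decode())
        s.sendall(b"QUIT\r\n")
    ```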

    Thanked by: yoursunny, valk
  • Fritz Veteran
    edited April 2021

    My Rclone sync backup uses FTP; so far, no problems found.

  • raindog308 Administrator, Veteran

    @Fritz said: My Rclone sync backup uses FTP; so far, no problems found.

    There's no good reason to use FTP in 2021.

    Switch to SFTP.

    Thanked by: jsg, valk
  • rcxb Member

    FTP has the worst design... You connect on port 21, then there's a RETURN connection back to you on port 20. Blows up with NAT or firewall, without an FTP proxy on your firewall. PASV mode is worse... The connection is made in the correct direction now, but it's made to & from completely random ports, no way to write an ACL to allow just PASV FTP without opening up almost all outgoing connections.

    HTTP is better in almost every way, EXCEPT nobody ever had the good sense to make a command-line HTTP file-transfer client. Without that, FTP remains relevant. It's just so much easier to browse an FTP site, pick out some files you want, and download them.

    SFTP has the overhead (and lag) of encryption, not to mention protocol mismatches, and just various interoperability problems. Great for security but terrible for performance when you don't need it, like downloading public files.

    FTP is the best option for a lot of stuff. All the HTTP based Android file sharing apps I see have file size limits or a terrible, slow interface. Several good FTP based ones.
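    For reference, the active/PASV distinction @rcxb describes maps directly onto ftplib in Python's standard library. A minimal sketch; host and filename are placeholders:

    ```python
    from ftplib import FTP

    ftp = FTP("ftp.example.com", timeout=30)  # control connection to port 21
    ftp.login()                               # anonymous login

    ftp.set_pasv(True)       # PASV: the CLIENT opens the data connection,
                             # to a server-chosen (semi-random) high port
    ftp.retrlines("LIST")    # directory listing travels over the data channel

    ftp.set_pasv(False)      # active mode: the SERVER connects back to the
                             # client from port 20 -- the part NAT/firewalls break
    with open("file.bin", "wb") as f:
        ftp.retrbinary("RETR file.bin", f.write)  # binary, not ASCII mode

    ftp.quit()
    ```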

  • dfroe Member, Host Rep

    @chihcherng said:
    FTP sends your username and password in cleartext. It's not secure.

    Which, by the way, is exactly the same with HTTP and many other protocols.
    That's why people invented TLS, which (thanks to the OSI layer model) works perfectly fine with many (insecure) protocols like HTTP, SMTP and FTP as well. It's not required for the design of an L7 protocol to provide encryption, as long as it is compatible with TLS on L5.
    But of course, as with the other protocols, one should take care to always use the version ending with an S. :)

    Thanks @brueggus and @Daniel15 for the sweet memories of publishing files on SourceForge with WS_FTP. :)
    Similarly, FreeBSD finally dropped kernel support for 3C509 NICs a few days ago in the new v13 release.
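    A minimal sketch of "the version ending with an S" for FTP, using ftplib's explicit-TLS client (host and credentials are placeholders):

    ```python
    from ftplib import FTP_TLS

    ftps = FTP_TLS("ftp.example.com")
    ftps.login("user", "password")  # AUTH TLS is negotiated before login by default
    ftps.prot_p()                   # switch the data channel to encrypted as well
    ftps.retrlines("LIST")
    ftps.quit()
    ```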

  • yoursunny Member, IPv6 Advocate

    @rcxb said:
    FTP has the worst design... You connect on port 21, then there's a RETURN connection back to you on port 20. Blows up with NAT or firewall, without an FTP proxy on your firewall.

    FTP was invented before firewalls.
    NAT is evil.

    PASV mode is worse... The connection is made in the correct direction now, but it's made to & from completely random ports, no way to write an ACL to allow just PASV FTP without opening up almost all outgoing connections.

    PASV mode can be made firewall-compatible by setting a narrow range of ports.

    I set up vsftpd for a computer networking class project at the University of Arizona.
    I asked for 100 ports to be opened in the department firewall between the student computer lab and the server, and used 10 of them for PASV mode.
    https://github.com/yoursunny/VNL/blob/7753882157f75aacbdb6cf02431a7624bff4228f/guest-apps/vsftpd.conf
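    The relevant vsftpd.conf directives look roughly like this (the port range here is illustrative; match it to your firewall ACL):

    ```
    # Pin PASV data connections to a small, firewall-friendly range.
    pasv_enable=YES
    pasv_min_port=50000
    pasv_max_port=50009
    ```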

    HTTP is better in almost every way, EXCEPT nobody ever had the good sense to make a command-line HTTP file-transfer client. Without that, FTP remains relevant. It's just so much easier to browse an FTP site, pick out some files you want, and download them.

    WebDAV and rclone mount.

    SFTP has the overhead (and lag) of encryption, not to mention protocol mismatches, and just various interoperability problems. Great for security but terrible for performance when you don't need it, like downloading public files.

    Nowadays, everything needs encryption, including public websites serving static files.
    It's for privacy, so that your neighbor doesn't know which software package you are downloading.

    FTP is the best option for a lot of stuff. All the HTTP based Android file sharing apps I see have file size limits or a terrible, slow interface. Several good FTP based ones.

    Android is terrible.
    Switch to the PinePhone, a real Linux machine on which you can install normal packages.

    Thanked by: Pixels, valk
  • @yoursunny said: Android is terrible. Switch to the PinePhone, a real Linux machine on which you can install normal packages.

    Or just use Termux.

  • yoursunny Member, IPv6 Advocate

    @stevewatson301 said:

    @yoursunny said: Android is terrible. Switch to the PinePhone, a real Linux machine on which you can install normal packages.

    Or just use Termux.

    With the PinePhone, I can:

    • change the kernel
    • change the filesystem
    • enable full disk encryption
    • add a USB floppy drive
    • capture and transmit raw Ethernet frames
    • launch a WiFi deauth attack (for educational purposes, on my own networks)
    • send AT commands to the LTE modem
    • access I²C peripherals

    You can't do most of those in the Termux app.
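    As an aside, "AT commands to the LTE modem" is ordinary serial I/O once the modem exposes a TTY. A rough sketch with pyserial; the device path and baud rate are assumptions:

    ```python
    import serial  # pyserial

    # Open the modem's serial device and query signal quality.
    with serial.Serial("/dev/ttyUSB2", 115200, timeout=2) as modem:
        modem.write(b"AT+CSQ\r\n")
        print(modem.read(256).decode(errors="replace"))  # e.g. "+CSQ: 23,99 ... OK"
    ```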

  • jon617 Veteran
    edited April 2021

    Used FTP since the 90s and still today!

    Recently built a house surveillance camera system that uses lightweight FTP to upload real-time video to a storage server in a datacenter. Cheaper, higher video quality, and longer retention than 24/7 video hosting services (Nest Cam, Ring cloud, Arlo cloud, etc.).
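    A sketch of what such a lightweight upload loop can look like with ftplib; host, credentials, and filenames are placeholders, not the poster's actual setup:

    ```python
    from ftplib import FTP

    def upload_clip(filename: str) -> None:
        # One binary STOR per recorded clip; FTP doubles as a context manager.
        with FTP("storage.example.com") as ftp, open(filename, "rb") as clip:
            ftp.login("camera", "secret")
            ftp.storbinary(f"STOR {filename}", clip)

    upload_clip("cam1-20210416-1200.mp4")
    ```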

    Thanked by: that_guy, valk
  • I just read that Mozilla are removing the FTP client from Firefox. I'm surprised it was still there... I don't think any other browsers still have FTP clients built-in! https://www.zdnet.com/google-amp/article/mozilla-to-start-disabling-ftp-next-week-with-removal-set-for-firefox-90/

  • @rcxb said: FTP has the worst design... You connect on port 21, then there's a RETURN connection back to you on port 20. Blows up with NAT or firewall,

    It was invented before NAT was commonplace, back when every device on the internet had a public IP address. Also, it wasn't really much of a problem for firewalls; you just had to open port 20.

    I wish that IPv6 had become widespread much earlier so NAT could have been totally avoided... There are a number of pain points with NAT.

    @rcxb said: SFTP has the overhead (and lag) of encryption

    Encryption doesn't have much overhead on modern processors with AES-NI extensions (which means it's hardware-accelerated).
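    Easy to sanity-check on your own hardware; a rough sketch with the cryptography package (throughput varies by CPU, but machines with AES-NI typically manage hundreds of MB/s or more):

    ```python
    import os, time
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)
    data = os.urandom(64 * 1024 * 1024)   # 64 MiB of random test data
    nonce = os.urandom(12)

    start = time.perf_counter()
    AESGCM(key).encrypt(nonce, data, None)
    elapsed = time.perf_counter() - start
    print(f"AES-256-GCM: {len(data) / elapsed / 1e6:.0f} MB/s")
    ```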

  • Shot2 Member

    @Daniel15 said:
    I just read that Mozilla are removing the FTP client from Firefox. I'm surprised it was still there... I don't think any other browsers still have FTP clients built-in! https://www.zdnet.com/google-amp/article/mozilla-to-start-disabling-ftp-next-week-with-removal-set-for-firefox-90/

    Another dick move by Mozilloogle Megacorp. Oh, well. Just quit updating, and you get to keep any required protocol (and let's butfuc all the paranoids out there).

    Thanked by: jsg
  • jsg Member, Resident Benchmarker

    @Daniel15 said:
    Encryption doesn't have much overhead on modern processors with AES-NI extensions (which means it's hardware-accelerated).

    The problem (and the brake) isn't the data encryption. It's the session establishment (KEX, etc.).
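    That split is easy to measure; a sketch using paramiko that times session establishment separately from the transfer (host, credentials, and paths are placeholders):

    ```python
    import time
    import paramiko

    t0 = time.perf_counter()
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # demo only
    client.connect("sftp.example.com", username="user", password="secret")
    sftp = client.open_sftp()
    t1 = time.perf_counter()                  # KEX + auth finished here

    sftp.get("/remote/file.bin", "file.bin")  # bulk transfer
    t2 = time.perf_counter()

    print(f"setup {t1 - t0:.2f}s, transfer {t2 - t1:.2f}s")
    client.close()
    ```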

  • @raindog308 said:

    @Fritz said: My Rclone sync backup uses FTP; so far, no problems found.

    There's no good reason to use FTP in 2021.

    Switch to SFTP.

    It's fine on local or private networks and saves the overhead.

  • raindog308 Administrator, Veteran

    @TimboJones said: It's fine on local or private networks and saves the overhead.

    You'd have to be connecting to a 486 or do a massive amount of transfers before SFTP overhead would be an issue.

    I mean, yes, rsh and rexec and telnet and root accounts without passwords are fine on private networks as well, but why?

    Thanked by: valk
  • TimboJones Member
    edited April 2021

    @raindog308 said:

    @TimboJones said: It's fine on local or private networks and saves the overhead.

    You'd have to be connecting to a 486 or do a massive amount of transfers before SFTP overhead would be an issue.

    You'd think that, but no, it's more hassle and slower. The application is like 50 text files, so it's only a matter of seconds and I could live with it, but it was definitely slower. You're adding protocol overhead, not just per-packet processing.

    I mean, yes, rsh and rexec and telnet and root accounts without passwords are fine on private networks as well, but why?

    Well, because it's supported and works. I'm working with carrier-grade radios used by cellular operators (i.e., $10k a link) that have telnet (VLAN access only), as does nearly all carrier gear. SSH is available with a license upgrade, but it doesn't work with plink (some sort of interactive requirement), so telnet is preferred for automated testing.

    Embedded devices don't always have factory-installed unique SSH keys that persist across boots and firmware upgrades, so fingerprints change, which ends up making users auto-accept any security prompts anyway, or breaks things with unexpected fingerprint prompts.

    tl;dr: automation, QA, and testing, for starters. Tons of other reasons: factory testing and configuration, etc. SSH adds nothing but hassle in cases where you have private access.

    Thanked by: valk