• LCrawford (unregistered)

    FTP is the Frist Transfer Protocol - the workhorse was out there transferring files long before you whippersnappers started with your reliable and well-defined SCP stuff.

  • bvs23bkv33 (unregistered)

    Orpheus was uploading his frist album using FTP

  • Little Bobby Tables (unregistered)

    Haven't used it for a long time, getting nigh on 20 years now, but when we did use it, we were just delighted to get it working at all. It did the job and transferred our files. And when we had perfected our DCL script that ran our entire process automatically, and had done the job of reporting back to us if the FTP call failed, we slapped each other on the back and declared a job well done.

    To the best of my knowledge, 10 years later that DCL script was still going strong, having worked tirelessly and trouble-free ever since it was completed and deployed. However much our sales team implored the customer to upgrade to something new, he steadfastly refused, because the system he had in place was working perfectly for what he wanted it to do.

  • dpm (unregistered) in reply to Little Bobby Tables

    our DCL script

    Oh my goodness. Another VAX/VMS user?

  • Pabz (unregistered)

    FtpWebRequest? I consider that to be a very strange name for a class, don't know about anyone else.

  • Church (unregistered) in reply to Pabz

    It is a bit odd. It should really just be WebRequest, as the ftp part is unnecessary. It's been a while since I've seen a web client library that couldn't also do ftp stuff. However, if this code never does any web stuff and only does ftp, I could see that as a compelling reason to call out its actual use in the code.

  • Brian Boorman (google) in reply to Church

    I think the opposite. FTP has nothing to do with the Web. Web == http(s)

  • WTFguy (unregistered)

    @Church

    This looks like C#. Assuming so, the real WTF is that the .NET class hierarchy v1.0 was originally built using the noun "Web" to mean "anything URI-based". See https://docs.microsoft.com/en-us/dotnet/api/system.net. Which includes ftp://, file://, and any other future or homebrew protocol scheme you care to define. Plus oh yeah, http(s), the only protocol that's actually webby.

    Oops. I agree there is a dearth of other good terms. But naming a collective after its most common member is a fundamental obstruction to clear thinking and clear communication.
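
    A minimal C# sketch of that scheme-based dispatch (host and paths here are made up for illustration):

        using System;
        using System.Net;

        class SchemeDemo
        {
            static void Main()
            {
                // The WebRequest factory dispatches on the URI scheme, which is
                // why "Web" in this hierarchy really means "anything URI-based".
                var ftp  = WebRequest.Create("ftp://example.com/pub/readme.txt");  // FtpWebRequest
                var file = WebRequest.Create("file:///tmp/readme.txt");            // FileWebRequest
                var http = WebRequest.Create("https://example.com/");              // HttpWebRequest
                Console.WriteLine(ftp.GetType());  // prints System.Net.FtpWebRequest
            }
        }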

  • WTFguy (unregistered)

    It seems Brian types faster than I do.

  • Ross Presser (unregistered)

    NetRequest seems like it might have been a better idea.

  • Conradus (unregistered) in reply to Brian Boorman

    http/https isn't the only protocol on the web, it's just the one protocol that's pretty much web-exclusive. Your browser can handle ftp:// URLs just fine.

  • (nodebb) in reply to dpm

    DCL was also available on RSX-11 (running on the PDP-11); it was a major change from MCR.

  • cschneid (unregistered)

    It's often unclear what people mean when they write "mainframe." Some people mean !Linux && !Windows. Every supported version of z/OS on an IBM Z mainframe (or its predecessors) has an SFTP implementation based on OpenSSH included in the box. It's been that way for at least a decade. As noted, maybe the problem is that no one knows how to set it up. Which would be interesting. And sad.

  • (nodebb)

    FTP was great. You could achieve godlike status when people had ftp problems simply by repeating what they did but typing the bin command first.

    Then somebody invented firewalls and much wailing and gnashing of teeth ensued.
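
    For reference, the FtpWebRequest equivalent of typing "bin" first is the UseBinary flag; a minimal sketch (host and file are made up):

        using System;
        using System.Net;

        class BinaryDemo
        {
            static void Main()
            {
                var req = (FtpWebRequest)WebRequest.Create("ftp://example.com/pub/tool.zip");
                req.Method = WebRequestMethods.Ftp.DownloadFile;
                // UseBinary sends TYPE I instead of TYPE A, so the payload isn't
                // mangled by ASCII line-ending translation -- the "bin" fix.
                req.UseBinary = true;
                using (var resp = (FtpWebResponse)req.GetResponse())
                    Console.WriteLine(resp.StatusDescription);
            }
        }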

  • Kashim (unregistered)

    On our FIRST DAY of Net+ class, my professor specifically covered the fact that FTP passwords are transmitted in clear text by having the entire class open up Wireshark and sniff the password off the line.

    Day 1: "Just so you know, this is why you don't FTP. We are teaching you to be Net Admins, and if we hear about a security breach at your company caused by FTP, and we get wind that it was one of our graduates, we will find out who it is, and we WILL revoke your degree."
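
    For the curious, what that capture shows on the control connection is roughly this (user name and password invented for illustration):

        USER alice
        331 Password required for alice.
        PASS hunter2
        230 User alice logged in.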

  • Cynical Techie (unregistered) in reply to Jeremy Pereira

    And on that day, your ability to attain godlike status when people had FTP problems shifted over to being able to repeat what they did after typing the passive command first.
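
    In FtpWebRequest terms that fix is the UsePassive flag; a minimal sketch (host made up):

        using System;
        using System.Net;

        class PassiveDemo
        {
            static void Main()
            {
                var req = (FtpWebRequest)WebRequest.Create("ftp://example.com/pub/");
                req.Method = WebRequestMethods.Ftp.ListDirectory;
                // PASV makes the client open the data connection outbound, which
                // firewalls allow; active mode (PORT) requires the server to
                // connect back in to the client, which they usually block.
                req.UsePassive = true;
                using (var resp = (FtpWebResponse)req.GetResponse())
                    Console.WriteLine(resp.StatusDescription);
            }
        }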

  • Jaime (unregistered) in reply to Kashim

    FTP has more problems than that. Insecure logon credential handling doesn't affect use cases like distributing files to the general public via anonymous FTP. The biggest problem with FTP is that its command/response protocol was designed to be read and understood directly by a human, so it is highly inconsistent, and key parts of it (directory listings in particular) were never formally specified. Any FTP library is pretty much 98% vendor-specific workarounds.
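
    For instance, two servers might answer the same LIST request in entirely different shapes (illustrative lines, not captured from any specific server):

        -rw-r--r--   1 ftp   ftp      4096 Jan 01  2019 processed     (UNIX-style listing)
        01-01-19  03:15PM       <DIR>          processed              (DOS/IIS-style listing)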

  • giammin (unregistered)

    you could just list the parent directory and see whether that directory shows up in the listing
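
    A minimal C# sketch of that approach, assuming a made-up host and an "outbound/processed" directory; note that NLST (ListDirectory) returns bare names on most servers, but some return full paths, which is exactly the kind of inconsistency mentioned elsewhere in this thread:

        using System;
        using System.IO;
        using System.Linq;
        using System.Net;

        class ListCheck
        {
            static void Main()
            {
                // List the parent directory and look for the entry by name.
                var req = (FtpWebRequest)WebRequest.Create("ftp://example.com/outbound/");
                req.Method = WebRequestMethods.Ftp.ListDirectory;  // NLST
                using (var resp = (FtpWebResponse)req.GetResponse())
                using (var reader = new StreamReader(resp.GetResponseStream()))
                {
                    var entries = reader.ReadToEnd()
                        .Split(new[] { '\r', '\n' }, StringSplitOptions.RemoveEmptyEntries);
                    Console.WriteLine(entries.Contains("processed")
                        ? "directory is there" : "directory not found");
                }
            }
        }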

  • P (unregistered)

    lol, but then what'd be the alternative?

    SSH? that's opening up a lot more, especially infinitely more ways for people to tinker with it and mess up even harder, not to mention now you have a whole OS instead of just a file system exposed to users.

    HTTP? SMB? those protocols are even harder to work with, and give fewer guarantees.

    SMTP? an enterprise CMS like WordPress? an RDBMS? no thanks, now you're just insane.

    see, this is TRWTF: you can shit on FTP all you want, but there are no better alternatives. and before you ask, no, torrents and magnet links aren't the solution either.

  • ConcernedCitizen (unregistered) in reply to giammin

    According to the article, there's no guarantee of a uniform response that you could check for the requested directory.

  • Jordan (unregistered)

    FTP has its problems, but Lothar's problem isn't one of them. That's just inexcusably poor error handling on the server side. The server knows it had an error, so there's no excuse for a 2xx response code. At some level, the server knows whether it had a memory allocation error or a permissions error... it's just losing that information somewhere in its guts.

    Karl's problem is a real FTP problem. It derives from the fact that FTP was always intended for human use, and in several ways is poor for automated use. Start with the fact that it has no way to retrieve metadata about a file. Why not? Files might or might not have owners, and might or might not have permissions, and if they do have permissions there's no telling what their semantics are. I'm not even sure you could get "obvious" things like the length because of the file system semantics that might be in play. There might or might not be directory-like constructs, and they might or might not be hierarchical. Remember that all the world is not Windows and UNIX, especially in the 1980s.

    There's nothing wrong with FTP's command structure. The syntax of commands and responses is clearly defined. The fact that it's also human readable is great; it makes diagnosis easy and lets you experiment without having to write a client. Nobody ever expected humans to use it directly; it was always expected that you would have a client program.

    The semantics are often problematic... and most of that comes from it being unobvious how to represent an arbitrary file system (again, not just UNIX) across a generic transport. A non-UNIX server fakes stuff up so that it's sort of compatible with a client that expects UNIX, and another client runs into quirks in that facade and compensates for them, and so on.

    The transfer scheme, using a separate connection, is a problem but derives from a laudable goal: using system A to move a file directly from B to C.
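
    A minimal C# sketch of surfacing whatever the server does report (host and path made up): a well-behaved server's 4xx/5xx reply arrives as a WebException, while a bogus 2xx "error" like the one in the article sails straight through:

        using System;
        using System.Net;

        class StatusDemo
        {
            static void Main()
            {
                var req = (FtpWebRequest)WebRequest.Create("ftp://example.com/outbound/data.csv");
                req.Method = WebRequestMethods.Ftp.DownloadFile;
                try
                {
                    using (var resp = (FtpWebResponse)req.GetResponse())
                        Console.WriteLine($"{(int)resp.StatusCode} {resp.StatusDescription}");
                }
                catch (WebException ex) when (ex.Response is FtpWebResponse err)
                {
                    // Real failures (550 and friends) land here with the server's text.
                    Console.WriteLine($"{(int)err.StatusCode} {err.StatusDescription}");
                }
            }
        }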

  • ZB (unregistered) in reply to Jordan

    FTP was always intended for human use

    Nobody ever expected humans to use it directly

    What.

  • Fizzlecist (unregistered) in reply to giammin

    There's no guarantee the parent directory exists either, which brings you back to the original problem

  • mihi (unregistered) in reply to Pabz

    Hey, even Netscape Navigator (and I believe NCSA Mosaic too) supported FTP, so FTP can surely be considered part of the Web...

  • Foo AKA Fooo (unregistered) in reply to Kashim

    Good professor, but I hope he added that it's just the same in HTTP (without the S)!
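
    For illustration, plain-HTTP Basic auth puts the credentials on the wire in a trivially reversible encoding (invented credentials; the token is just base64 of "alice:hunter2", not encryption):

        GET /private/report.txt HTTP/1.1
        Host: example.com
        Authorization: Basic YWxpY2U6aHVudGVyMg==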

  • (nodebb) in reply to mihi

    It would be more accurate to consider web browsers as capable of accessing parts of the Internet outside the web. The web being that bit covered by the http(s) protocol. This was back when televisions still had 4:3 aspect ratios and were designed to function as particle accelerators.

    Web ⊂ Internet

  • Diane B (unregistered)

    That last sentence was so funny it gave me a good hard chuckle

  • Olivier (unregistered) in reply to Fizzlecist

    Yes there is: the root directory you are pointing at when you log in. Then you can descend from there.

  • Zirias (unregistered)

    The real WTF here is the existence of a "DirectoryExists" function at all. What will you ever do with that information? If you use it to "know" that later actions requiring the directory will succeed, you're doing it wrong: anyone might remove or rename the directory after you checked. If not, DirectoryExists() is just utterly useless information. For some kind of user interface you WILL run a "list" command, but then you obviously show the results to the user.
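
    A minimal C# sketch of the check-free alternative (host and directory made up): attempt the operation and handle the failure, so there is no window between check and use:

        using System;
        using System.Net;

        class TryActDemo
        {
            static void Main()
            {
                var req = (FtpWebRequest)WebRequest.Create("ftp://example.com/outbound/processed");
                req.Method = WebRequestMethods.Ftp.MakeDirectory;  // MKD
                try
                {
                    using (var resp = (FtpWebResponse)req.GetResponse())
                        Console.WriteLine($"created: {resp.StatusDescription}");
                }
                catch (WebException ex) when (ex.Response is FtpWebResponse err)
                {
                    // e.g. 550: already exists or no permission -- react here,
                    // instead of trusting a stale DirectoryExists() answer.
                    Console.WriteLine($"not created: {err.StatusDescription}");
                }
            }
        }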

  • The Middle Man (unregistered)

    TRWTF is not knowing the tool you use. Know thy FTP.

  • nasch (unregistered)

    My browser can open local files too, does that mean those are part of the web?

  • Johnny (unregistered) in reply to Zirias

    Do you always assume your end-user will be actively trying to sabotage your programs?

  • Jaime (unregistered) in reply to Johnny

    Yes

  • Zirias (unregistered) in reply to Johnny

    That isn't the point. Do you actually assume you're the only one using an FTP service? Or a filesystem? Apart from that: yes, you always assume users will behave in any way remotely imaginable. Also see e.g. https://stackoverflow.com/a/52761509

  • tlhonmey (unregistered)

    Saying that you should never use FTP or similar because it's got terrible security is like saying that you should have bank-vault grade locks on all the internal doors in your house. Yes it would technically be less vulnerable to intrusion, but it comes with a whole host of other issues. Know your threat model.
