When is it useful to use FTP?

In my experience, I see many architecture diagrams that make extensive use of FTP as a medium for linking architectural components.

As someone who doesn’t make architectural decisions but does look at architecture diagrams, can someone explain how important FTP is, where it is suitable, and when transferring data as files is a good idea?

I understand that there are often legacy systems that simply have to work this way - although any historical background would be interesting as well.

I can see the attraction of transferring files (especially if files are what you need to transfer) due to simplicity and familiarity, and I wonder if the reasoning goes beyond that.

Edit: thanks to those who point out that SFTP is preferable; however, my question is wider than a recommendation on which file transfer protocol to use. Sorry for the confusion.

+6
architecture ftp file-transfer
7 answers

When is it useful to use FTP?

Before the invention of SFTP.


To address your edit (i.e. the wider question):

It all comes down to the intended use. Look at your situation and determine:

  • What data am I moving?
  • What format is it generated in initially? (PDF on disk, text output from web server scripts, etc.)
  • How is the data used?
  • When is the data consumed? (Instantly, scheduled batch jobs?)
  • What communication medium connects the data generator and the data consumer?

For example:

A process creates PDF documents, writing them to a local RAID array. Another PC is dedicated to printing all the PDF files created across many servers on the local gigabit LAN, via a cron job scheduled to run at midnight.

Given that the data is likely too large for all of it to sit in RAM on the print server, it makes sense to use SFTP to transfer the PDF files so that they can be picked up from disk as they are printed.

Another example:

A machine must grab a large number of small files from another machine, parse them, and save the results in a database. In this case, using SFTP to move them from one disk to another, only to immediately read them back and insert them into the database, is simply silly. There is no reason the small files cannot stay in RAM until they are parsed and pushed into the database, so SFTP is probably not the best solution here.
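A minimal sketch of the in-memory approach this answer prefers, assuming the small payloads arrive over the network already in RAM (the file names, payload format, and table schema below are made up for illustration):

```python
import sqlite3

def ingest(records, conn):
    """Parse small in-memory payloads and insert the results directly
    into the database -- no intermediate copy to disk via SFTP."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS results (name TEXT, value INTEGER)")
    for name, payload in records:
        # Hypothetical "parse" step: each file's content is just an integer here.
        cur.execute("INSERT INTO results VALUES (?, ?)", (name, int(payload)))
    conn.commit()

# The files never touch the local disk; they go straight from RAM to the DB.
conn = sqlite3.connect(":memory:")
ingest([("a.txt", "1"), ("b.txt", "2")], conn)
print(conn.execute("SELECT COUNT(*) FROM results").fetchone()[0])
```

The point of the sketch is the data path, not the parsing: each record goes network → RAM → database, skipping the pointless disk round-trip.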

+8

Some legacy systems use folders to transfer data as XML or CSV, etc., in which case the files must be written to disk. If you integrate with another system outside the network / over the Internet, it makes sense to make them available on an FTP site. Newer systems may use web services or other on-the-wire technologies to avoid the round-trip to disk. That said, if the files are very large, FTP may well be the best solution.

In some industries, such as printing, large PDF files are routed through various workflow stages where they are processed and transformed. In the printing industry the use of watched folders (and, in turn, FTP) is common; they are usually called "hot folders".
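A hot folder is just a directory that a process polls, handling whatever lands in it and passing results to the next stage. A minimal sketch, with temporary directories standing in for the shared FTP folders and the real print step replaced by a file move:

```python
import shutil
import tempfile
from pathlib import Path

def process_hot_folder(inbox: Path, outbox: Path) -> list:
    """Pick up every PDF dropped into the hot folder and move it to the
    next stage of the workflow, returning the names handled this pass."""
    handled = []
    for pdf in sorted(inbox.glob("*.pdf")):
        # A real system would rasterize or print here; we just move the file on.
        shutil.move(str(pdf), str(outbox / pdf.name))
        handled.append(pdf.name)
    return handled

# Demo: simulate an upstream system dropping a job into the watched folder.
root = Path(tempfile.mkdtemp())
inbox, outbox = root / "inbox", root / "done"
inbox.mkdir(); outbox.mkdir()
(inbox / "job1.pdf").write_bytes(b"%PDF-1.4 dummy")
print(process_hot_folder(inbox, outbox))
```

In production this loop would run on a timer or a filesystem watcher, and the inbox would typically be the landing directory of an FTP account.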

+3

If you need to send a physical letter to the most remote backwater, it is hard to beat the 2000-year-old postal service. If you need to send a file to the most backwater equipment, it is hard to beat the 40-year-old Postel service.

+3

If security doesn't matter, then FTP can be useful.

However, given modern options, I would probably never use it, preferring instead SFTP / SCP / rsync or HTTP (possibly with WebDAV). For one thing, all of these protocols have secure variants (HTTP at least over SSL). They are also simpler protocols. FTP has the nasty property that the actual data is transmitted over a separate connection from the control commands, which makes it hard to get through a firewall. Worse, in non-passive mode that data connection goes from server to client, which makes firewalling an outright nightmare. If you need interactive access it can be handy, but HTTP client programs and libraries are readily available, so I would just use HTTP these days.

+3

FTP is an easy, cross-platform way to transfer files, if you have a reliable connection and you absolutely don't need any security (don't be fooled by the password prompt - there is no real security there).

Many times, people really did need security but used FTP anyway because they simply assumed it was secure. It is better to use SFTP (I like the OpenSSH implementation) or to transfer data via a secure web service.

Of course, implementing SFTP properly means developers need to correctly generate, store, and exchange their keys, and understand how trust works. That is often too much effort, so people tend to take the easy route and use FTP instead. Sad, if you ask me.

+1

File-based sharing (e.g. via FTP, SFTP, SCP ...) is good for

  • large data transfers
  • batch processing scripts
  • asynchronous communication

There is nothing wrong with using files. This is a well-understood mature technology, easy to use, easy to track and debug.
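One reason file-based exchange debugs so well is that each hand-off is a single visible artifact. The classic trick is to write to a temporary name and rename into place, so a consumer polling the folder never picks up a half-written file. A sketch (the folder layout and JSON payload are made up for illustration):

```python
import json
import os
import tempfile
from pathlib import Path

def drop_file(folder: Path, name: str, payload: dict) -> Path:
    """Producer side of a file-based exchange: write to a temp name,
    then rename into place so the consumer never sees a partial file."""
    folder.mkdir(parents=True, exist_ok=True)
    fd, tmp = tempfile.mkstemp(dir=folder, suffix=".tmp")
    with os.fdopen(fd, "w") as f:
        json.dump(payload, f)
    final = folder / name
    os.replace(tmp, final)  # atomic replace on both POSIX and Windows
    return final

# Demo: drop one batch file into a local exchange directory.
exchange = Path(tempfile.mkdtemp()) / "exchange"
path = drop_file(exchange, "batch-0001.json", {"rows": 3})
print(json.loads(path.read_text()))
```

The same idea applies over FTP/SFTP: upload under a temporary name, then rename, so scheduled consumers only ever see complete files.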

+1

I assume that security and disconnected networks or network segments may come into play. I have had various projects where someone needed to import data from another system, and FTP was an easy, acceptable way to get the data through a firewall. Typically you can schedule it to run automatically, and most network security people will be fine with opening FTP ports.

-1
