UGTS Document #54 - Last Modified: 8/29/2015 3:23 PM
FTP Server Best Practices

FTP was standardized in its current form in 1985 (RFC 959) and as such is a very old protocol, lacking both security and firewall friendliness. The default implementation of FTP transmits usernames and passwords for login in plaintext, and uses two separate TCP sockets: one for control and one for data.

When you couple this weak security with Active Directory authentication, FTP sites become favorite attack points for hackers, because hackers can brute-force an FTP site to lock out accounts or guess passwords with impunity. If access is gained, the site can be used to store illegal content, or the credentials can be reused on other services (such as VPN and Remote Desktop) to log in to them.

FTPS and SFTP are two separate and entirely different solutions to the plaintext transmission problem. FTPS takes an ordinary FTP conversation and encrypts it with TLS, much like HTTPS is encrypted compared to HTTP. SFTP is entirely different: despite the name, it is not FTP at all, but the SSH File Transfer Protocol, a separate protocol that runs over an SSH session. Very few Windows FTP servers support SFTP because Windows doesn't have native support for SSH like Unix does, but most support FTPS.
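To make the FTPS side concrete, Python's standard library ftplib supports explicit FTPS directly. A minimal sketch (the host and credentials here are placeholders, not real values):

```python
from ftplib import FTP_TLS

def ftps_list(host, user, password):
    """Connect with explicit FTPS, encrypt the data channel, and list files."""
    ftps = FTP_TLS(host)        # control connection opens like plain FTP
    ftps.login(user, password)  # login() negotiates AUTH TLS first, so the
                                # credentials are never sent in plaintext
    ftps.prot_p()               # switch the data channel to TLS as well
    names = ftps.nlst()         # directory listing over the encrypted data socket
    ftps.quit()
    return names
```

Without the prot_p() call only the control channel is encrypted; file contents and listings would still cross the wire in the clear.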

The most commonly used free FTP servers on Windows are FileZilla and IIS. The IIS FTP service is not worth using unless you have FTP 7.5 on IIS 7 or later (Windows 7, or Windows Server 2008 and higher), because earlier versions do not support FTPS. FileZilla, on the other hand, supports FTPS and can be installed anywhere.

To mitigate risk to accounts (lockouts and password cracking), it is also recommended that FTP sites use locally defined accounts separate from Active Directory, or that they use a small set of explicitly defined Windows accounts intended for use only with FTP. Under no circumstances should administrative Windows accounts be used on an FTP server.

Making FTP available across a firewall also presents a problem in that two TCP connections are required - the initial control connection, and a separate data connection. The control connection can be passed through a firewall without too much trouble, but the data connection is a real problem to set up. The reason for this is that FTP only defines two ways to make the data connection, and neither is NAT-friendly. The two ways are:

  • Active Mode - the FTP client tells the server, via the PORT command, which address and port to connect to, and the FTP server makes an outbound connection back to the client on that port (from its own data port, by default one lower than its control port). If the client is behind a firewall (almost always true, or the server and client would be on the same network, and you'd just use file sharing rather than FTP), the active connection will fail, stopped at the client's firewall.
  • Passive Mode - the FTP server opens a data socket from a pool of ports, and passes the IP address and port number back to the client. The client then connects to the data port at the address supplied. If the server is behind a firewall (again, almost always true - if client and server were on the same network then you wouldn't be torturing yourself with FTP), the IP address passed back to the client will be an internal address which is not reachable from the client because it has not been NATted.
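The passive mode breakage is easy to see in the wire format, where a 227 reply encodes the address and port as six decimal numbers. A minimal parser (the reply string below is a made-up example) shows how a client ends up with an unreachable address:

```python
import re

def parse_pasv(reply):
    """Extract (ip, port) from a 227 PASV reply such as
    '227 Entering Passive Mode (192,168,1,10,217,48)'."""
    h1, h2, h3, h4, p1, p2 = map(
        int,
        re.search(r"\((\d+),(\d+),(\d+),(\d+),(\d+),(\d+)\)", reply).groups())
    return "%d.%d.%d.%d" % (h1, h2, h3, h4), p1 * 256 + p2

ip, port = parse_pasv("227 Entering Passive Mode (192,168,1,10,217,48)")
# ip is 192.168.1.10 - an RFC 1918 private address the client cannot reach -
# and port is 217*256 + 48 = 55600, a port the firewall has not opened.
```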

The problem with Active Mode is usually insurmountable. Passive mode would have been a simple fix, except that the server sends back its internal IP address instead of leaving the client to simply reuse the address it already connected to. Because of this, passive mode out of the box is also a no-go. To make passive mode work, one of two solutions is usually used:
  • Packet Inspection - the firewall watches for FTP control connections, intercepts the PASV response, rewrites the IP address to apply NAT, and dynamically opens the port that the client will use to connect to the server. Cisco firewalls support this. Unfortunately, this only works for unencrypted FTP: FTPS and SFTP are encrypted, and the firewall can't read the conversation to fix it unless it also knows the private key used for the encryption.
  • Fixed Data Port Pool and External IP - if the server sends back the external IP address rather than the internal one, the FTP client won't get confused. Both FileZilla and IIS FTP 7.5 support this. As of March 2012, FileZilla has an edge over IIS in that it can work even if your FTP server has a dynamic external IP address. Also, if the data ports are drawn from a well-defined range, and the firewall forwards every port in that range to the FTP server, then the connection can be made.
Packet inspection is much more difficult to set up securely than a fixed data port pool. Packet inspection can be made to work, but since best practices dictate that the data ports should be well defined anyway (to avoid conflicts with other published services), you may as well use the fixed data port pool. Also, packet inspection will often only work when the published port is the default port 21, and you sometimes want a non-standard port just to hide your FTP site from scripted attacks and worms.
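The server-side fix is mechanical: pick the data port from a fixed, firewall-forwarded range and encode the external address into the 227 reply. A sketch of what FileZilla or IIS effectively does when you configure an external IP (the address and port range here are hypothetical examples):

```python
def format_pasv_reply(external_ip, data_port):
    """Build a 227 reply advertising the external (NATted) address."""
    h1, h2, h3, h4 = external_ip.split(".")
    return "227 Entering Passive Mode (%s,%s,%s,%s,%d,%d)" % (
        h1, h2, h3, h4, data_port // 256, data_port % 256)

# Data ports restricted to a fixed pool that the firewall forwards to the server
# (hypothetical range - match whatever range your firewall actually publishes).
PASSIVE_PORTS = range(60000, 60100)

reply = format_pasv_reply("203.0.113.10", 60001)
# -> "227 Entering Passive Mode (203,0,113,10,234,97)"
```

The client now connects to 203.0.113.10:60001, which the firewall forwards straight to the server's internal address - no packet inspection required, and it works even when the conversation is encrypted with FTPS.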

So there you have it: 6 patches and 27 years later, FTP can still be made to work reliably and securely. All you have to do is:

  • Use FTPS (rather than plain FTP)
  • Use Passive Mode (rather than active mode)
  • Adjust the Passive Mode IP Address (rather than publish the local IP address)
  • Explicitly Define a Range of Data Ports (rather than let the server pick a random data port)
  • Publish the Data Port Range to the FTP Server (rather than publish only the control port)
  • Lock Down FTP Accounts (use non-AD accounts or greatly restrict which accounts can be used with FTP. Disregard this if your site is read-only with anonymous access)

Note also that at some point in the future, if IPv6 ever takes hold, classic FTP will need attention yet again, because it embeds IPv4 addresses in the conversation. (RFC 2428 already defines the extended EPRT and EPSV commands to address this, though client and server support is uneven.)
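For completeness, the extended passive reply of RFC 2428 sidesteps both the NAT and the IPv4 problem by omitting the address entirely - the client simply reuses the address of the control connection. A minimal parser for the reply format (the reply string below is a made-up example):

```python
import re

def parse_epsv(reply):
    """Extract the data port from a 229 EPSV reply such as
    '229 Entering Extended Passive Mode (|||60001|)'.
    No IP address is embedded; the client reuses the address
    it already used for the control connection."""
    return int(re.search(r"\(\|\|\|(\d+)\|\)", reply).group(1))

port = parse_epsv("229 Entering Extended Passive Mode (|||60001|)")
```

This is exactly the design passive mode should have had from the start: the server advertises only a port, and the addressing problem disappears.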