The Hidden Costs of FTP
The World Wide Web, with its network-centric architecture, has spontaneously achieved what years of standards efforts failed to do: liberating front ends from back ends and content from infrastructure. Any type of client can access any server platform, giving specialized systems a new lease on life.
But the Web’s rapid expansion and unprecedented heterogeneity are exposing weaknesses in networking technologies, such as the File Transfer Protocol (FTP), that were created in another age.
While some information is published on Web servers, most corporate data still exists in a variety of proprietary formats on different legacy systems. Companies need a secure, reliable and uniform way for people to access this data via the Internet and move the files from one environment to another. FTP is ubiquitous and has been a faithful file-transfer workhorse for more than two decades, but it is getting increasingly expensive and even risky to use.
And FTP is far from free: the purchase price of the software is negligible compared with the cost of implementing and maintaining it and training people to use it. In today’s far-flung multiplatform networks, with more and more interbusiness data exchange, FTP is an administrative nightmare.
Most enterprise networks evolved gradually over the years into a geographically dispersed patchwork of different computer platforms and protocols. While FTP is a standard, there are different implementations for every environment and each has a different interface for users to learn. FTP users must know and navigate through complicated file paths and directory structures to get to the information they need. Upgrading server environments usually means changing FTP implementations, rewriting scripts and retraining users. Similarly, employees who are transferred to a new department or location may find themselves in a different network environment with a new FTP interface to learn.
FTP is also a TCP/IP-only solution. Other file-transfer methods must be used on subnets that are running SNA, IPX/SPX, NetBIOS or other non-TCP/IP protocols. Consequently, some companies have deployed so many different file-transfer products that whole teams of people are dedicated to managing them. This is a luxury that few can afford as downsized IS staffs struggle to cope with the fastest technological transformation in history.
FTP is particularly inefficient when it comes to moving multi-megabyte files, and it leaves users free to initiate such transfers and choke the corporate intranet. The user doesn’t get a logical view of the file structures and can’t browse through files and opt to download just part of them. Consequently, unnecessary data gets sent across the network, where it can get in the way of files that serve as input to applications. The file-transfer process needs to offer reliability and guaranteed delivery to these applications and integrate with host-based schedulers. FTP can’t offer this functionality unless programmers wrap scripts around it.
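The kind of wrapper script the paragraph above alludes to can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation: `send` is a hypothetical callable standing in for an FTP put that returns the checksum the remote side computed, and the retry-until-verified loop is the "guaranteed delivery" logic that FTP itself lacks.

```python
import hashlib


def transfer_with_retry(send, payload: bytes, retries: int = 3) -> bool:
    """Retry a transfer until the receiver's checksum matches ours.

    `send` is a placeholder for an FTP upload step; it returns the
    checksum computed on the remote side, or raises OSError on a
    transient network failure.
    """
    expected = hashlib.md5(payload).hexdigest()
    for _attempt in range(retries):
        try:
            if send(payload) == expected:
                return True  # delivery verified end to end
        except OSError:
            pass  # transient failure: fall through and retry
    return False  # give up after `retries` attempts
```

Every application that needs reliable delivery over plain FTP ends up re-implementing some variant of this loop, which is exactly the duplicated effort an infrastructure-level solution avoids.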
Security is increasingly important as companies get involved in business-to-business data exchange and electronic commerce. FTP has no built-in security, and its vulnerabilities are frequently exploited by hackers. The standard FTP file transfer includes the user ID and password, which are sent in the clear over the network. The files can be encrypted first, but this adds extra steps that users often decide to skip. And when the encryption process is used, it creates a second (encrypted) copy of the file that users frequently forget to remove. Since FTP security is dependent upon cumbersome scripts or add-ons, companies are often forced to implement some very inconvenient and labor-intensive procedures to protect their data.
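To make the exposure concrete, here is a sketch of what an FTP login actually puts on the wire. The credentials are hypothetical; the point is that the control channel carries the USER and PASS commands as plain ASCII, so anyone sniffing that network segment reads the password verbatim.

```python
def ftp_login_commands(user: str, password: str) -> list[str]:
    """Build the exact command lines an FTP client sends to log in.

    Each command travels over the control connection unencrypted,
    terminated by CRLF as the protocol requires.
    """
    return [f"USER {user}\r\n", f"PASS {password}\r\n"]


# Hypothetical credentials, shown exactly as a packet capture would.
for command in ftp_login_commands("alice", "s3cret"):
    print(command.strip())
```

Running this prints `USER alice` followed by `PASS s3cret`, which is precisely what a network analyzer on any intermediate segment would display.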
To eliminate these problems, today's businesses need an enhanced file-transfer solution that looks the same on all platforms, runs across all network environments, and can be managed from a single point.
This is a tall order, but one that can be filled by building middleware that exploits the OSI model, particularly the Transport Layer Interface (TLI). TLI sits between an application and the transport layer and lets the application operate without knowing anything about the underlying protocol stack. It abstracts all the details and heterogeneity of the transport, network, data-link and physical layers, enabling a file-transfer application to run transparently across IP, SNA, LU 6.2, LU 2.0, SPX, IPX and NetBIOS environments.
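The structure of such a transport abstraction can be sketched as follows. This is an illustrative skeleton, not TLI's actual C API: the file-transfer logic codes to a small interface, and protocol-specific bindings (TCP/IP, SNA, IPX and so on) are plugged in underneath. The loopback binding shown here is a stand-in so the sketch is self-contained.

```python
from abc import ABC, abstractmethod


class Transport(ABC):
    """TLI-style abstraction: the application above never sees
    which protocol stack carries its bytes."""

    @abstractmethod
    def send(self, data: bytes) -> None: ...

    @abstractmethod
    def receive(self) -> bytes: ...


class LoopbackTransport(Transport):
    """Stand-in for a real TCP/IP or SNA binding; queues locally."""

    def __init__(self) -> None:
        self._queue: list[bytes] = []

    def send(self, data: bytes) -> None:
        self._queue.append(data)

    def receive(self) -> bytes:
        return self._queue.pop(0)


def transfer_file(contents: bytes, transport: Transport) -> bytes:
    # Identical application code regardless of the transport plugged in.
    transport.send(contents)
    return transport.receive()
```

Swapping `LoopbackTransport` for an SNA or IPX binding changes nothing above this interface, which is the portability property the middleware approach depends on.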
With this type of enterprise-wide file-transfer infrastructure in place, users with the appropriate rights can access files anywhere on the corporate network without any regard to the various communications protocols encountered along the way. Transferring files to and from remote systems is as simple as using the Windows Explorer to move files from one folder to another on the Windows desktop. All the files show up on a single logical drive, no matter where they are actually located.
This infrastructure approach gives enterprise-wide file-transfer applications a single point of administration. It is much easier for administrators to accommodate mobile and remote users, or to delegate and distribute administrative responsibilities. Administrators don’t have to maintain separate sets of user IDs and passwords for UNIX, NT and other systems, and there is less chance that a defunct account will be overlooked and remain active.
Staff resources that would be tied up supervising different platforms in an FTP solution can focus instead on providing connectivity and meeting distributed computing needs. Centralized control also lets administrators do some bandwidth management by establishing policies for the movement of files around the network. They are aided by built-in security that complements existing LAN and host security on the corporate network and integrates with leading security products. Support for multi-level security is also essential as enterprise networks evolve into intranets and extranets that weave in and out of the Internet.
Clearly, value-added file-transfer products provide all the services of FTP plus a lot more. They can run across multiple protocols and enable companies to migrate to TCP/IP gradually, at their own convenience. They also provide a flexible infrastructure that can be re-used by future applications in a rapidly changing business environment.
ABOUT THE AUTHOR:
André Lévesque is President & Chief Technology Officer of Micro Tempus, Incorporated in Montreal. His 20 years of experience and technical knowledge in the computing industry have been a significant factor in the success of Tempus Connectivity Solutions (TCS), the company’s signature product.