In-Depth
Your Security Umbrella: Integrating Encryption, Authentication and Access Control
Green screens may not be as attractive as browser interfaces, but for many companies they can be more effective. Web-to-host technologies are not going to indiscriminately replace all PC-to-host solutions anytime soon. [See "Thin-Client vs. Fat-Client in the Host Connectivity World" in the December 1998 issue of ESJ.] The two approaches are often complementary rather than exclusive solutions, but Web-to-host thin clients are gaining momentum, particularly for building new intranets and extranets, because of the lower cost of ownership and simplified management capabilities. As more outside users, such as partners, suppliers and customers, are given access to inside host-based applications, security is more critical than ever.
Diminishing are the days of secure terminals hardwired to mainframes. Most have evolved into PC-to-host software on PCs that typically reside within a LAN (and are still popular today), which evolved further into systems that now include remote dial-up capabilities. Most users of these systems tend to be trusted employees. New Web-to-host clients expose host applications even more, with the introduction of more "casual" users, third-party users and more dynamic content, so more powerful security and management is required. The trend toward greater local and remote interconnectedness (with both thick and thin clients) is driving the development of more sophisticated security and management solutions.
The Need for Adaptability
Just as predictions of the death of mainframes have not yet been realized, PC-to-host fat clients still have a healthy shelf life. GUI front-ends are not always an improvement over native interfaces, particularly for massive data entry or complex application navigation. Until integration tools mature and "full-featured" Web-to-host software is available, PC-to-host systems will persist. Even if a company plans to re-engineer its host applications to be Web-based, the transition takes time. Most companies move toward the Web in small steps. At no point should a company be without security and management to control access to its host applications.
And what if, as is often the case, the Web-to-host applications are added after the security for the regular PC-to-host emulation is already in place? That is a potential problem: a redo of all of the security, or a completely new solution, may be required. Doing some homework can really pay off when finding a security solution that can satisfy the needs of both types of host access applications and that doesn't require significant modifications when the new Web-to-host approach is rolled out. With a large set of users, both approaches will likely reside together and have to be supported together.
There is no single best approach to securing access to host applications; rather, there are recommended practices depending on intended use. Anytime outside users are being given access to business-critical information, whether using traditional thick clients like TN3270 and Telnet or new thin clients that use Java or ActiveX, companies need tightly integrated encryption, user-based authentication and easy-to-manage, granular access control.
Encryption. Any business-critical connection between a user and host application, particularly over the Internet, absolutely needs to be encrypted. But how much is enough? The higher the encryption strength, the better, but there is a performance penalty to be paid at both ends, for the encryption and decryption (see sidebar). Sometimes it is better to offer a variety of methods and strengths so that sensitive data can use the highest level of encryption and more public information can be passed along with minimal encryption. Some method of compression, for performance reasons, is also recommended.
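As a rough illustration of offering multiple strengths, the following Python sketch (not from the article, and assuming a modern TLS stack rather than the DES/RC4 suites of the period) applies a stricter cipher policy to sensitive sessions and a default policy to more public traffic:

```python
import ssl

def make_context(sensitivity: str) -> ssl.SSLContext:
    """Return a client-side TLS context whose cipher policy matches data sensitivity."""
    ctx = ssl.create_default_context()
    if sensitivity == "high":
        # Business-critical traffic: restrict to strong, forward-secret AEAD suites.
        ctx.set_ciphers("ECDHE+AESGCM")
    else:
        # More public information: accept the library's default suite list.
        ctx.set_ciphers("DEFAULT")
    return ctx

# Example use (hypothetical host name):
# tls = make_context("high").wrap_socket(raw_socket, server_hostname="host.example.com")
```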
Authentication. There are a wide variety of authentication methods, including username/passwords, digital certificates, token cards and biometrics. For extranet use it is essential to identify not just the machine at the other end, but the individual user. Large token implementations are becoming increasingly common. They can be either hard (like an ATM card) or soft (a software version of an ATM card), and they are considered two-factor authentication because they require something the user has (the token itself) and something the user knows (the secret passcode). Smartcards, which are like token cards but are scanned with a digital reader, as well as digital certificates like X.509 certificates, are gaining popularity in Europe and Asia, though they have yet to make a big push in the United States. What is important to note is that security systems that can support multiple methods of authentication have an advantage, especially for extranet use, where each partner may be using different methods. When making systems decisions there is an important trade-off to consider: every additional method must be supported by what is an often overtaxed IT staff.
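To make the two-factor idea concrete, here is a minimal, hypothetical sketch (not any vendor's product and not a production implementation) of a soft token: a time-based code derived from a per-user secret ("something you have") checked alongside a memorized PIN ("something you know"):

```python
import hashlib
import hmac
import struct
import time

def token_code(secret: bytes, interval: int = 60) -> str:
    """Derive a six-digit, time-based code from the soft token's secret key."""
    counter = struct.pack(">Q", int(time.time()) // interval)
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    return f"{int.from_bytes(digest[-4:], 'big') % 1_000_000:06d}"

def two_factor_ok(pin: str, stored_pin: str, code: str, secret: bytes) -> bool:
    """Accept the login only if both the memorized PIN and the token code match."""
    return (hmac.compare_digest(pin, stored_pin)
            and hmac.compare_digest(code, token_code(secret)))
```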
Access Control. Once a user has been authenticated, it is necessary to ensure that access control, or authorization, is applied. This is probably the single most important step in securing access to host applications.
Think of each layer as analogous to building security at the FBI: Only users with the right keys and proper credentials can get to high-security areas of the building. Some visitors have permission to view and check out almost any document in the building, whereas other visitors have very restricted access to information and are able to view only limited files while on the premises. With an extranet, companies need to have the ability to grant different permissions to different individuals, and those user profiles must be easy to manage or else errors could lead to security breaches.
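A hedged sketch of that idea in Python (the profile fields, user names and host names are hypothetical, not any particular product's schema): each authenticated user carries a profile, and every requested host session is checked against it, with denial as the default:

```python
from dataclasses import dataclass, field

@dataclass
class Profile:
    """Per-user permissions: which hosts and services this person may reach."""
    allowed_hosts: set[str] = field(default_factory=set)
    allowed_services: set[str] = field(default_factory=set)

profiles = {
    "partner_clerk": Profile({"as400.example.com"}, {"tn5250"}),
    "employee_admin": Profile({"as400.example.com", "unix1.example.com"},
                              {"tn5250", "telnet", "ftp"}),
}

def authorized(user: str, host: str, service: str) -> bool:
    """Deny by default; permit only what the user's profile explicitly grants."""
    p = profiles.get(user)
    return p is not None and host in p.allowed_hosts and service in p.allowed_services
```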
The best security solutions for companies that want to give a diverse user base access to host applications are those that seamlessly integrate encryption, authentication and access control under one easy-to-use, flexible management umbrella.
Web-to-Host Security Issues
Web-to-host architectures vary from one vendor implementation to another. Many of the same providers that offer PC-to-host applications now offer Web-to-host applications as well. These implementations are either Java or ActiveX browser-based incarnations of terminal emulation. They are targeted at a cross-section of users that includes some of the former thick-client users who want a common UI and the ability to manage what have traditionally been client-resident applications centrally. An even larger set is new casual users who want and need host/server access only occasionally. GartnerGroup predicts that this latter segment of occasional users will increase by an order of magnitude by 2002, reflective of the growing extranet market.
Most Web-to-host architectures are based upon the client connecting to a server that serves up these Java or ActiveX applications or applets dynamically. The difference is in how many servers are involved and how the connections are made. For instance, with a two-tier architecture, two servers are involved: the Web server that provides the applets and the target host/server. Three-tier architectures involve, not surprisingly, three servers, with the intermediary server providing the connectivity to the target. In some cases, the client connects to a Web server that delivers the application to the requesting client; then the connection is made directly to the host without the Web server having to act as the intermediary. The benefit of this configuration, among other things, is fewer points of failure.
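As a rough sketch of the three-tier pattern (hypothetical addresses and ports; real products wrap security and session management around this), the intermediary server simply relays bytes between the downloaded applet and the target host/server. It handles a single session for brevity:

```python
import socket
import threading

def pipe(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes one way until the source side closes."""
    while data := src.recv(4096):
        dst.sendall(data)

def relay_once(listen_port: int, target_host: str, target_port: int) -> None:
    """Accept one applet connection and relay it to the target host/server."""
    with socket.socket() as srv:
        srv.bind(("", listen_port))
        srv.listen()
        client, _ = srv.accept()
        with client, socket.create_connection((target_host, target_port)) as host:
            # Relay host-to-client in the background while relaying client-to-host here.
            threading.Thread(target=pipe, args=(host, client), daemon=True).start()
            pipe(client, host)

# relay_once(8023, "as400.example.com", 23)   # e.g., TN5250 traffic to the AS/400
```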
By design, Web-to-host applications are less expensive and easier to deploy and manage than their predecessors, but their dynamic nature is more difficult to secure. Consider the following:
A user wants to use an application on the corporate AS/400 server, so the Java TN5250 applet is dynamically downloaded to the user (through the firewall). The user then wants to connect to that AS/400 and does so, over the Internet and back through the firewall, via a different port.
The same user also wants to get to an application on a UNIX server on the same network, and down comes another applet (Telnet), through the same or another firewall, through yet another port. The user connects in, back through the firewall (you're right, another port), and while connected decides to download a file with FTP and, you guessed it, makes more trips over the Internet and through the firewall, which now has several ports open, each of which is a potential security breach.
In Java parlance, these open ports in the firewall are referred to as "back connects." In the security industry, these are known as holes. Several Web-to-host vendors recognize this security threat and have tested and/or integrated security products to help overcome this problem. All that use a browser-based model have been helped by the most popular browsers supporting HTTPS, thus providing encryption for HTTP traffic (use of the HTTPS protocol does require a secure server to handle the request). A couple of the vendors have taken it upon themselves to include some security as a part of their Web-to-host offerings.
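For the browser-based delivery step, the equivalent in a short Python sketch (URL and file name are hypothetical) is simply fetching the applet over HTTPS, so the download is encrypted and the serving Web server's certificate is verified:

```python
import ssl
import urllib.request

APPLET_URL = "https://webserver.example.com/applets/tn5250.jar"  # hypothetical

ctx = ssl.create_default_context()  # verifies the Web server's certificate
with urllib.request.urlopen(APPLET_URL, context=ctx) as resp:
    applet = resp.read()            # the applet arrives encrypted in transit
```

Note that this protects only the applet download; the subsequent host sessions through the other firewall ports still need their own encryption and access control.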
It is important to realize that non-Web-to-host Java and ActiveX applications suffer from this same back-connect security dilemma if the connection is being made from the outside and must traverse a firewall.
Whether providing direct access to host-based legacy applications or building new Web applications that reside on these hosts/servers, giving key business partners, suppliers and customers (in addition to employees) secure and well-managed access to host resources is a powerful and potentially risky proposition. Companies that feel overwhelmed by the options for host access should concentrate on the security and management aspects of providing access to a variety of applications and take their time choosing the right combination of tools for secure, easily manageable host connectivity. In many cases users will need to support both Web-to-host and PC-to-host concurrently, as well as custom applications and, increasingly, object-oriented applications. Security systems that can provide centralized management and the flexibility to accommodate the needs of a diverse user base while securing and helping to manage these applications are steps ahead of other approaches.
About the Authors:
Rob Spence is Director of Product Marketing for Aventail Corporation (Seattle, Wash.), and Tracy Shuford is Internet Marketing Manager for Aventail. They can be reached at (206) 215-1111.
Encryption: Essential But Not Enough
The strength of any connectivity security solution relates directly to how well the following security components are integrated:
- Strong encryption, such as DES, 3DES and RC4
- User-based authentication, such as token cards, CHAP, CRAM, digital certificates and biometrics
- Granular access control, with parameters such as source, destination, user identity, group affiliation, time of day, authentication type, etc.
When data is being transferred over a public network (particularly the Internet), encryption is essential for protecting data from everyone but the intended recipient. When that "intended recipient" is actually a particular LAN, such as that of a branch office, as opposed to an individual, then companies can use routers and/or firewalls to encrypt traffic between the two perimeters. The assumption in that scenario is that all users sitting behind the firewall or router are equally trusted and/or share the same set of privileges.
More often, companies want to grant only limited network access to particular individuals. That means the people sitting at the machines on the other end must be strongly identified through at least two-factor authentication. In most cases, those users should have access only to specific resources, not the entire LAN. Simply making permit/deny decisions based upon a client IP address does not suffice when the desktop belongs to a third-party or untrusted user. In secure remote access and extranet scenarios, the trust relationships model the real world in that they are between people, not machines. Companies need to be able to strongly identify users and limit their connectivity to a specific set of network resources.
Access control should be flexible enough to be implemented using a variety of parameters, such as the time of day, the service port, the subnetwork, the type of authentication used, etc. For example, if "User X" authenticates using SecurID tokens, his or her access to corporate resources may be quite extensive. However, if "User Y" authenticates using a reusable user ID and password pair, s/he may be restricted to accessing only certain applications that are not mission critical. Being able to apply access control to a user based on the user's ciphersuite support and encryption level can also be useful. As a general rule of thumb, however, all mission-critical data should be strongly encrypted. The longer the key length of the cryptographic algorithm, the stronger the encryption. Because encryption can affect performance and because the legality of exporting strong encryption is still cloudy in some areas, being able to turn off encryption when it is not required is sometimes desirable.
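A minimal sketch of that kind of rule (the rule fields, destinations and policy values are hypothetical): the access decision factors in the authentication method and the time of day, so a SecurID-authenticated user gets broader access than one who presented only a reusable password:

```python
from datetime import datetime

RULES = [
    # (authentication method, destination prefix, allowed hours) -- hypothetical policy
    ("securid",  "",         range(0, 24)),   # token users: any destination, any hour
    ("password", "public.",  range(8, 18)),   # password users: non-critical hosts, business hours
]

def allowed(auth_method: str, destination: str, when: datetime | None = None) -> bool:
    """Permit the connection only if some rule matches method, destination and hour."""
    hour = (when or datetime.now()).hour
    return any(auth_method == method and destination.startswith(prefix) and hour in hours
               for method, prefix, hours in RULES)

# allowed("password", "public.reports.example.com")  -> True during business hours
# allowed("password", "finance.example.com")         -> False
```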
One of the most talked-about aspects of secure internetworking right now relates to public key infrastructures, which provide authentication and encryption but not access control. A public key infrastructure, or PKI, lays the groundwork for establishing trust relationships among organizations and individuals. It involves public/private key cryptography, digital certificates, and standards like S/MIME (Secure Multipurpose Internet Mail Extensions) and SSL (Secure Sockets Layer). Companies can generate and register their own key pairs or have a third-party vendor such as VeriSign or Entrust do it for them. There are many competing standards in this arena, and there is little clarity yet on how all the variants are going to be integrated.
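Those trust relationships rest on public/private key cryptography. As a hedged sketch (using the third-party Python `cryptography` package, not any specific PKI product), one party signs with its private key, and anyone holding the matching public key, typically distributed inside a digital certificate, can verify the signature:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Generate a key pair; in a real PKI the public key would be bound to an
# identity by a certificate authority such as VeriSign or Entrust.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"order 7421: ship 500 units"   # hypothetical business message
signature = private_key.sign(message, padding.PKCS1v15(), hashes.SHA256())

# Verification raises InvalidSignature if the message or signature was altered.
public_key.verify(signature, message, padding.PKCS1v15(), hashes.SHA256())
```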
Look for connectivity solutions that tie encryption, authentication, and access control tightly together, providing the framework for a variety of security and management schemes. Other features to consider for internetworking solutions include seamless firewall traversal; low-impact deployment; directory and application integration; network interoperability; simple product distribution and updating; usage tracking; and a highly scalable, client-to-host/server, standards-based architecture. Overall, the key to any successful security plan is to know who the users are and be able to define which resources they have access to.
- R.S.