How Leveraging Application Delivery Can Optimize Your Data Center Infrastructure

Using a hosting provider and new ADC technology can help small and mid-size businesses compete with their larger rivals.

By Peter Melerud

Although the Internet has largely leveled the playing field, allowing SMBs (small and mid-size businesses) to compete with much larger enterprises, one area in which the enterprise has continued to enjoy a healthy advantage is Web application infrastructure.

Large enterprises have long benefited from the tools, talent, and flexibility that their healthy budgets can afford, while SMBs are typically far more resource constrained. However, SMBs have long looked to hosting providers as technology partners, gaining many of the benefits of an enterprise deployment at a much lower cost. In addition, a new generation of tools, previously out of reach because of price, has become available to SMBs.

When large enterprises or SMBs embark on deploying a Web application, they are faced with the choice of either hosting their Web infrastructure internally or housing the infrastructure at a hosting provider. While large enterprises may opt to maintain their own data centers, SMBs find it far more cost-effective to utilize the resources and talent of a hosting provider that specializes in Web application deployment.

There are two primary benefits that SMBs enjoy in utilizing a hosting provider: facilities and talent. On the facilities side, the hosting provider will have the connectivity, power, rack space, cooling, and other necessities of a modern data center that would be either prohibitively expensive or logistically impossible for an SMB to provide.

Security is an added benefit of a hosting provider. Part of a data-center-grade installation includes a secured environment, including cameras, mantraps, key card access, and security guards. This is typically not possible (or very expensive) if an SMB hosts in its own facilities.

In particular, connectivity is a major hurdle for an SMB to provide in a home-grown data center. Most SMBs have T1 connectivity, DSL lines, or cable modems for Internet connectivity. Scaling beyond a T1 line can be time consuming, logistically difficult, and expensive at an office location. DSL and cable modem lines, while great for outbound connectivity, are typically poor choices (and often contractually prohibited) for hosting Web connectivity. They don't scale well, and they typically do not have SLAs (service level agreements) appropriate for a hosted site. By deploying infrastructure at a hosting provider's location, the SMB gains access to what is almost always higher quality bandwidth. The hosting provider will typically have multiple high-speed connections (OC-3 or better) and multiple network providers with BGP peering, which allows a link to one provider to fail without losing the site's connectivity. A locally hosted site typically will not have any link redundancy.

The other primary value-add that hosting providers bring is talent in the form of network administration, server system administration, and 24/7 operations. With network administration, the hosting provider will have the 24/7 highly specialized engineering talent to make sure the network runs smoothly. These benefits are available for a fraction of the price of hiring and retaining a similar talent pool that would likely be underutilized.

Web applications are deployed on several types of servers: Web servers, application servers, database servers, and the other systems that typically make up a Web application. For the SMB, the choice is the age-old question: buy or lease? By buying (often called simply co-location), the SMB provides the servers and the hosting provider provides the bandwidth, rack space, and environment. For the SMB, this involves a hefty initial capital outlay, as well as the responsibility for any hardware problems.

As a result, most SMBs tend to lease servers from the hosting provider. Referred to as dedicated servers or leased servers, this arrangement involves a significantly lower initial capital outlay and fixed maintenance costs. The hosting provider is usually responsible for any physical issues with the equipment.

With dedicated leased servers, another value-add that a hosting provider can deliver is system administration. Often called managed hosting, or managed servers, the hosting provider will take care of most of the server system administration such as hardware maintenance, operating system updates, and security patches, leaving the SMB to concentrate on its core competency: developing, deploying, and maintaining Web applications.

Of course, the SMB needs to be aware of inherent risks in this approach. The hosting provider could become insolvent, and the SMB would find its site without a home and its Web application and data stranded in limbo. Thus, the SMB must make sure that the Web application and corresponding data are backed up to its own office or another third-party facility. From a security standpoint, encrypted drives are also advisable, to prevent confiscated or repossessed equipment from becoming the source of a data leak. This would be true whether the site is hosted internally or at a hosting provider.

Given the choice between hosting locally and using a hosting provider, most SMBs use a hosting provider. Imagine if an SMB hosted its entire site (Web apps, e-mail, etc.) from a cable modem, and that link went down. How long would it be before it came back up? I've had my cable modem out for days; the providers typically aren't sympathetic, and there is typically no associated SLA. Or imagine if the facility was broken into. Some office facilities enjoy relatively good security (access control, video surveillance), but many do not: only a locked door with a consumer-grade lock prevents criminals from walking away with servers (and data).

Another tool that hosting providers can use with SMBs to level the Web deployment playing field is the ADC (Application Delivery Controller). These devices are the descendants of the ubiquitous load balancer, a device that splits up Web and other traffic among a group of servers, but ADCs do much more. The ADC might best be described as the Web infrastructure's Swiss-army knife.

The ADC Advantage

In addition to equitable load distribution, ADCs can perform Secure Sockets Layer (SSL) termination/acceleration (removing the CPU burden of SSL operations from the servers), content switching (dividing up traffic based on Layer 7 characteristics), server health checking, caching, compression, and intrusion prevention system (IPS)/Web application firewall functions.

These are great features, but until recently they have been affordable only to large enterprises. However, a group of vendors has emerged to bring this advanced technology to SMBs. Hosting providers are using these new products to make SMB deployments rock-solid and affordable.

To demonstrate how these products are often used by an SMB in a hosted environment, take the following example. An SMB is deploying a Web application with 10 servers. In addition to the servers, rack space, and network bandwidth that the SMB would get from the hosting provider, the hosting provider deploys an ADC in front of the servers. More than just basic round-robin load balancing, the ADC uses various server performance metrics, such as CPU utilization, to determine how the load is equitably distributed.
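The metric-aware distribution described above can be sketched in a few lines. This is an illustrative model only (the server names and the `cpu` field are hypothetical); a real ADC continuously polls the backends for live utilization figures rather than reading a static list:

```python
# Minimal sketch of metric-aware load distribution. A real ADC would
# refresh these CPU figures continuously from agents on the servers.
def pick_server(servers):
    """Choose the backend currently reporting the lowest CPU utilization."""
    return min(servers, key=lambda s: s["cpu"])

# Hypothetical snapshot of three of the ten backends.
servers = [
    {"name": "web1", "cpu": 72},
    {"name": "web2", "cpu": 31},
    {"name": "web3", "cpu": 55},
]

print(pick_server(servers)["name"])  # → web2 (least loaded)
```

Contrast this with plain round-robin, which would send every third request to the busiest server regardless of its load.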

Because this application, like most Web applications, is stateful -- that is, the application requires that a particular user's subsequent requests return to the same server -- the ADC can be configured for Layer 7 persistence (which uses the HTTP headers to tie a user to a specific server). The ADC would also perform health checks on these servers, testing the Web application to ensure that the servers are responding correctly.
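Layer 7 persistence can be sketched as a lookup table keyed on a value from the HTTP headers, such as a session cookie. This is a simplified model with hypothetical names (`JSESSIONID`, the backend list); a real ADC does this in its data path and also expires stale entries:

```python
# Sketch of Layer 7 (cookie-based) persistence: the first request from a
# session is load-balanced, and later requests with the same cookie are
# pinned to the same backend.
servers = ["web1", "web2", "web3"]
persistence_table = {}  # cookie value -> backend

def route(headers):
    session = headers.get("Cookie", "")
    if session in persistence_table:
        return persistence_table[session]  # returning user: same server
    # Stand-in for the real load-balancing decision.
    backend = servers[len(persistence_table) % len(servers)]
    if session:
        persistence_table[session] = backend
    return backend

first = route({"Cookie": "JSESSIONID=abc123"})
again = route({"Cookie": "JSESSIONID=abc123"})
print(first == again)  # → True: subsequent requests hit the same server
```

A health check in the same spirit would periodically issue a real HTTP request to each backend and remove from rotation any server whose response is wrong or missing.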

Because the site utilizes SSL for secure transmission, the ADC would be configured to terminate the SSL sessions from the users and pass the traffic directly to the servers in plain text. This would allow the ADC to handle persistence for SSL based on the HTTP headers, and it would relieve the servers from performing the CPU-intensive SSL operations. If SSL operations were left for the servers to perform, the number of servers required to handle the same amount of traffic could double because of the excessive CPU overhead of SSL.

By activating caching, further workload is taken off the servers: as the ADC's cache is populated by initial requests, cacheable objects are served from the ADC without subsequent requests ever reaching a server.

With compression, certain types of objects (such as HTML pages) would be compressed before being sent to the user. A smaller file means download times are faster and less bandwidth is used, significantly boosting the speed of a Web site for users with limited bandwidth. The objects are decompressed transparently by the user's browser, a feature supported by virtually all modern Web browsers (Microsoft IE, Firefox, Safari, Opera, etc.).
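The negotiation works through the client's Accept-Encoding request header: the ADC compresses only when the browser advertises support. A minimal sketch using only the standard library:

```python
# Sketch of ADC compression: gzip the response body when the client's
# Accept-Encoding header says it can handle gzip; otherwise send as-is.
import gzip

def maybe_compress(body, headers):
    if "gzip" in headers.get("Accept-Encoding", ""):
        return gzip.compress(body), "gzip"
    return body, "identity"

html = b"<html>" + b"hello world " * 200 + b"</html>"
compressed, encoding = maybe_compress(html, {"Accept-Encoding": "gzip, deflate"})

# Repetitive HTML compresses well, and the browser can restore it exactly.
print(encoding, len(compressed) < len(html))
assert gzip.decompress(compressed) == html
```

Binary formats that are already compressed (JPEG images, ZIP archives) gain little from this, which is why ADCs typically apply compression only to text-like content types.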

As is true for larger, enterprise-targeted ADCs, all of these functions can be performed by one ADC device (or, more commonly, a pair of devices in redundant configuration), greatly simplifying the administration, and making it easier for a hosting provider to deliver these services to an SMB.

By using a hosting provider and new ADC technology specifically developed for the SMB market segment, SMBs have a much more level playing field in terms of cost and operational capacity to compete with much larger enterprises. SMBs can deploy Web infrastructure with the service and reliability that would normally be associated with much greater budgets.

Peter Melerud is the vice president of product management for KEMP Technologies.
