ANALYSIS: Client/Server Performance Tuning

By Chris Gloede

Client/server-based applications are difficult to deploy and even more difficult to tune. Sophisticated solutions demand sophisticated tuning techniques, and with so many variables involved, choosing a starting place can be difficult at best.

After initially deploying a client/server solution, and before tweaking any settings to improve performance, it makes sense to examine the application and its architecture. Just because an application is labeled 'client/server,' you should not assume it will behave like any other client/server solution. Architecturally, there are two- and three-tier deployments, and data may be handled in entirely different ways. Before embarking on any performance analysis, it pays to have a clear understanding of the basic architecture of the product(s) you will be working with.

The next step is to analyze the application's network traffic. How often are requests transmitted across your network, and how large are the requests and the subsequent data returns? Sometimes requests are few in number but return large amounts of data; sometimes they are small but very frequent; and sometimes they fall somewhere in between. You need to know which pattern applies, and when, so that you can determine the appropriate way to tune your network.
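As a rough illustration of that kind of traffic characterization, the sketch below reduces a set of captured request records to the numbers that matter for tuning: request rate and average request and response sizes. The record format and the sample data are assumptions for the example; in practice the records would come from a packet capture or an application log, not from anything shown here.

```python
# Sketch: characterizing an application's traffic pattern from captured
# request records. The Request structure and the sample data are
# hypothetical; real numbers would come from a capture or a log.

from dataclasses import dataclass

@dataclass
class Request:
    timestamp: float      # seconds since start of capture
    bytes_sent: int       # size of the request
    bytes_received: int   # size of the data returned

def summarize(requests: list[Request]) -> dict:
    """Reduce a capture to request rate and average request/response sizes."""
    duration = max(r.timestamp for r in requests) - min(r.timestamp for r in requests)
    total_out = sum(r.bytes_sent for r in requests)
    total_in = sum(r.bytes_received for r in requests)
    return {
        "requests_per_second": len(requests) / duration if duration else float(len(requests)),
        "avg_request_bytes": total_out / len(requests),
        "avg_response_bytes": total_in / len(requests),
    }

# Hypothetical capture: many small requests with modest replies,
# arriving about ten times per second.
sample = [Request(t * 0.1, 220, 1_400) for t in range(600)]
print(summarize(sample))
```

A profile like this is what tells you whether you are tuning for a few large transfers or a steady stream of small ones.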

Recently, I came across a tool that helps me do just that. It helps me analyze the traffic from a client, not simply in terms of numbers of bytes, although that is certainly important, but in terms of how the information is actually sent. It helps me separate database time from bandwidth constraints and from latency issues. This tool is cool!
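To make that separation concrete, here is a back-of-the-envelope decomposition of a single transaction's response time into network latency, wire-transfer time, and server/database time. This is an illustrative sketch, not Application Expert's method, and the measurements fed into it (total time, payload size, round trip, bandwidth) are assumed values.

```python
# Sketch: splitting one transaction's response time into network latency,
# wire-transfer time, and whatever remains (server/database time).
# All inputs are assumed measurements, not output from any product.

def decompose(total_seconds: float, payload_bytes: int,
              round_trip_seconds: float, bandwidth_bytes_per_sec: float) -> dict:
    latency = round_trip_seconds                            # fixed network delay
    transfer = payload_bytes / bandwidth_bytes_per_sec      # time spent on the wire
    server = max(total_seconds - latency - transfer, 0.0)   # remainder: database/app time
    return {"latency_s": latency, "transfer_s": transfer, "server_s": server}

# Example: a 2.0 s transaction moving 128 KB over a 1.5 Mbit/s (~187 KB/s)
# link with a 60 ms round trip. Most of the time turns out to be server time.
print(decompose(2.0, 128 * 1024, 0.060, 187_000))
```

Even this crude arithmetic shows why the distinction matters: buying more bandwidth does nothing for a transaction whose time is dominated by the database.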

Application Expert from Optimal Networks Corp. (Mountain View, Calif., www.Optimal.Com) is a tool that gives a company deploying client/server solutions the knowledge necessary to achieve maximum throughput in the shortest time with the least wasted effort.

Application Expert helps you visually review application behavior in a distributed, multi-tier network and see how each tier is affected by a given application. It operates across LAN, WAN and the Web, so it can be used in any number of ways.

It performs predictive modeling, so if you are testing with a single workstation you can model fifty workstations and examine where the network bottlenecks will appear. You can also perform response-time analysis, use the Time Plot tool to see when an application is most chatty, determine tier load balancing, and so on.
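To show the flavor of that kind of projection, the sketch below scales a single-workstation measurement linearly to N workstations and checks whether a shared link saturates. This is a naive linear model of my own, not Application Expert's prediction algorithm, and the measured rate and link speed are illustrative assumptions.

```python
# Sketch: naively scaling one workstation's measured traffic up to N
# workstations to see whether a shared link becomes the bottleneck.
# Numbers are illustrative assumptions, not vendor figures.

def projected_utilization(per_client_bytes_per_sec: float,
                          n_clients: int,
                          link_bytes_per_sec: float) -> float:
    """Fraction of the shared link consumed if every client behaves like
    the single client that was measured."""
    return (per_client_bytes_per_sec * n_clients) / link_bytes_per_sec

# One measured client pushes ~15 KB/s; the site shares a 1.5 Mbit/s line.
measured = 15_000
link = 1_500_000 / 8   # bits per second -> bytes per second
for clients in (1, 10, 50):
    util = projected_utilization(measured, clients, link)
    print(f"{clients:>2} clients -> {util:5.0%} of the link")
```

At fifty clients the simple model already demands several times the link's capacity, which is exactly the kind of bottleneck you want to find before rollout rather than after.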

On the chance you have not picked up on this, I think this is a great tool. When you look at a bar graph depicting three versions of the same application, with each version's base timeline, data transmission time, and latency time clearly charted against one another, the value becomes clear. No more wasted time or money fixing a problem that isn't really a problem.

I usually try not to single out a product as a highlight like this, but I am impressed with this one and have seen its benefits. I spend a great deal of time deploying complex multi-tier client/server solutions and have not, until now, had access to a tool that lets me see so clearly what is going where. The improved decision-making process, faster time to decision, and better choices for improving network performance make this tool worth its weight in gold.

Client/server solutions, especially multi-tiered ones, are extremely difficult to deploy and just as difficult to tune. Any tool that helps you manage this better only adds value to the applications you spend so much money on. Tools like Optimal's Application Expert are a great place to start.

A veteran of the IBM midrange arena since 1983, Chris Gloede is executive VP for Business Solutions Group in Wayne, Pa. cgloede@thebsg.com.
