In-Depth

Princeton Softech Goes Beyond Robust Database Archiving

Archive does more than just back up data. On closer inspection, our commentator discovers how it can enhance DB migration, data transfer, consolidation, and even disaster recovery

As a rule, I tend to be circumspect about products that try to close the barn door after the cows have already escaped. Initially, this was my view of Princeton Softech’s wares, which I saw as a set of archive utilities for databases. Frankly, I took the position that any respectable DBA should have built the appropriate archiving capabilities into his or her relational database system from the get-go. If they were doing their job properly, they shouldn’t need to bolt on a third-party archiving product after the fact.

As a result of this bias, I tended to ignore press releases coming out of companies like Princeton Softech. In fact, I began to find them a bit annoying when they started couching their value proposition in terms of regulatory compliance and information lifecycle management—two much-abused terms in contemporary storage marketing parlance.

It took my Managing Partner for South America, Oscar Ernst, and, more importantly, his wife (a certified database expert working for a major on-line check processing firm) both kicking me in the shins about their visit to Princeton Softech to change my mind about the company. What they had seen at the company's Princeton, NJ headquarters had blown them away, they told me, and I would be doing my readers a disservice if I didn't take a closer look.

Jim Lee, VP of Product Management, was a gracious facilitator. We spent several hours going through his product slide show, then a demonstration of the software itself. In that brief period, I was impressed by both the man and the product.

First, let me explain that I was wrong. Princeton Softech’s Archive—flavors are available for nearly every form of RDBMS and DB-based application suite (think Siebel, Oracle Applications, and PeopleSoft)—is not simply an archiving tool. Sure, it performs this function, and very well, from what I could see. You can use the product to free up storage space by identifying and moving older data from the database to second-tier or to offline storage. This is a key sales point for the vendor and an important consideration for many shops I have visited, where up to 80 percent of the data stored in databases on their most expensive disk is either old and stale or ready to be retired to less-expensive platforms better suited to its infrequent and read-only access patterns.

While important, I would have expected my DBAs to build such functionality directly into our databases without the need for external products. Unfortunately, in many shops this has not been done, either because the need wasn't perceived or because of time constraints when the database was first rolled out. So the need for a toolset such as the one Princeton Softech offers is clearly out there.
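The kind of age-based archiving a DBA might otherwise build by hand can be sketched in a few lines. This is a minimal illustration, not Princeton Softech's actual mechanism: it uses Python's sqlite3 with a hypothetical `orders` table, copying rows older than a retention cutoff to a second-tier archive database and purging them from the live one.

```python
import sqlite3
from datetime import datetime, timedelta

# Hypothetical schema: an "orders" table with ISO-format dates. Rows older
# than the cutoff are copied to the archive database, then purged.
CUTOFF_DAYS = 365 * 2  # illustrative policy: retire data older than two years

live = sqlite3.connect(":memory:")      # stand-in for the production database
archive = sqlite3.connect(":memory:")   # stand-in for second-tier storage

for db in (live, archive):
    db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, order_date TEXT, amount REAL)")

# Seed one stale row and one current row.
live.execute("INSERT INTO orders VALUES (1, '2001-03-15', 99.50)")
live.execute("INSERT INTO orders VALUES (2, ?, 10.00)",
             (datetime.now().strftime("%Y-%m-%d"),))

cutoff = (datetime.now() - timedelta(days=CUTOFF_DAYS)).strftime("%Y-%m-%d")

# Move qualifying rows: copy to the archive, then delete from the live DB.
stale = live.execute(
    "SELECT id, order_date, amount FROM orders WHERE order_date < ?",
    (cutoff,)).fetchall()
archive.executemany("INSERT INTO orders VALUES (?, ?, ?)", stale)
live.execute("DELETE FROM orders WHERE order_date < ?", (cutoff,))
live.commit(); archive.commit()

archived_count = len(stale)
live_count = live.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
```

The real work in products like Archive is, of course, in preserving referential integrity across related tables when rows move, which this sketch glosses over.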

What surprised me, though, was the enormous value the product brings to other scenarios I hadn't considered. One is migration. More than a few companies are migrating data from one vendor's RDBMS to a competing vendor's product. Suppose you have outgrown SQL Server and are moving to Oracle or DB2. Do you think Microsoft will provide you with tools to facilitate that migration? I doubt it.

Another scenario arises from mergers and acquisitions, among both database users and database vendors. How do you take a database from an acquired company and transform it with any efficiency so that it fits the RDBMS product and the schema you are using for your primary database? Volumes could be (and have been) written on the hair-pulling trial-and-error process of database consolidation. So onerous is the task in the absence of friendly tools like those from Princeton Softech, and so dreaded the prospect of taking on such a Sisyphean effort, that it is not uncommon to walk into companies that have settled on a strategy of replicating and passing data between whatever DBs already exist, even if they number 20 or more. Maintaining EDI, XML, and spit-and-baling-wire scripts for data interchange represents an enormous tax on efficiency and a cost accelerator of the first magnitude.
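Absent such a tool, the manual alternative is usually ad hoc extract-transform-load scripting. A bare-bones sketch of that chore, with an entirely hypothetical column mapping between the acquired company's schema and the target schema, dumping rows to a portable CSV that the target RDBMS's bulk loader could ingest:

```python
import csv
import io
import sqlite3

# Hypothetical mapping: source column names -> target schema column names.
COLUMN_MAP = {"cust_no": "customer_id", "cust_nm": "customer_name"}

# Stand-in for the acquired company's database.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE customers (cust_no INTEGER, cust_nm TEXT)")
src.execute("INSERT INTO customers VALUES (7, 'Acme Corp')")

# Extract under the target schema's column names into a portable CSV buffer
# that a bulk loader (SQL*Loader, bcp, LOAD DATA, etc.) could consume.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(COLUMN_MAP.values())  # header row in the target's terms
for row in src.execute(f"SELECT {', '.join(COLUMN_MAP)} FROM customers"):
    writer.writerow(row)

csv_text = buf.getvalue()
```

Multiply this by hundreds of tables, type mismatches, and encoding quirks, and the appeal of a purpose-built consolidation tool becomes obvious.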

Princeton Softech’s tools for facilitating DB consolidation are powerful and easy to use. In fact, the company is getting dangerously close to being able to represent their product as business-user friendly. If they just add some graphical elements, doubtless scheduled for subsequent releases, even a hack like me who does not eat, sleep, and dream columns, rows, and relationships may be able to use this product successfully.

Other benefits: extracting data to create a data mart or warehouse on the fly; quickly setting up policy criteria that let you copy or segregate data needed to answer legal subpoenas within tight time constraints; or reworking the physical layout of data to optimize disk resources and speed up DB-centered applications. The product gives you the essential tools to do all of these things.

For those who are migrating between different storage hardware platforms, this product can be a boon you might not have considered. A common gotcha in platform migration is the lack of an intermediary storage platform to serve as a temporary container for data as it is transformed from the layout used on your old gear into the layout preferred by the new gear. War stories abound about vendors who didn't tell you that migrating your data to their next-generation frame would require you to lease another box, adding unanticipated costs of hundreds of thousands of dollars per month to the migration project. Using Princeton Softech's toolset, you can eliminate the extra box, extra cost, and extra pain.

A final important takeaway from my conversations with Jim is the potential utility of this product for disaster recovery. More and more, companies are concerned about the efficacy of their DR strategies as the volume of data to be restored increases. Even with the best automation available, you can only restore from tape at a rate of 1 to 2 TB per hour. I have always advised clients that they don't need to restore 100 percent of their operational data in an emergency, at least not within the first 24 to 48 hours following the disaster. In fact, if 80 percent or more of the data in the DBMS is worthy of retirement, you may not need to restore it at all during recovery operations. That would make the restoration of a multi-TB database far less daunting and would dramatically reduce the cost of the stand-by equipment and capacity that recovery planning requires.
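The arithmetic behind that claim is easy to check. A quick sketch using illustrative numbers: a hypothetical 10 TB database, the midpoint of the 1 to 2 TB per hour tape restore rate, and the 80 percent retirement figure cited above.

```python
# Illustrative restore-time arithmetic; all figures are assumptions from the text.
DB_SIZE_TB = 10.0              # hypothetical multi-TB database
RESTORE_RATE_TB_PER_HR = 1.5   # midpoint of the 1-2 TB/hour tape restore rate
RETIRABLE_FRACTION = 0.80      # share of data old enough to skip during recovery

full_restore_hours = DB_SIZE_TB / RESTORE_RATE_TB_PER_HR
trimmed_restore_hours = (DB_SIZE_TB * (1 - RETIRABLE_FRACTION)) / RESTORE_RATE_TB_PER_HR

print(f"Full restore:   {full_restore_hours:.1f} hours")    # ~6.7 hours
print(f"Must-have only: {trimmed_restore_hours:.1f} hours") # ~1.3 hours
```

A restore window shrinking from most of a business day to about an hour is the difference between a DR plan that looks good on paper and one that actually meets its recovery-time objective.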

With the Princeton Softech offering, you could conceivably build a recovery dataset consisting only of the must-have data from your database that is a fraction of the size of your total dataset. That allows you to recover data more quickly and get back to work more rapidly in the wake of an outage. As a plus, Princeton Softech enables you to mask the actual data in your database from the view of workers in your own shop and at your hot site vendor facility, thereby preserving privacy and adding greater security to your recovery operations.
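To make the masking idea concrete, here is a minimal sketch of deterministic data masking, not Princeton Softech's actual algorithm, applied to a hypothetical SSN-style field. The same input always yields the same mask, which preserves referential consistency across tables, while the real digits stay hidden from staff at the hot site.

```python
import hashlib

# Hypothetical per-project secret; in practice this would be managed and rotated.
SECRET_SALT = b"rotate-this-per-engagement"

def mask_digits(value: str) -> str:
    """Substitute every digit with one derived from a salted hash of the value.

    Separators pass through unchanged so the field's format survives,
    letting applications and test harnesses run against masked data.
    """
    digest = hashlib.sha256(SECRET_SALT + value.encode("utf-8")).hexdigest()
    out, i = [], 0
    for ch in value:
        if ch.isdigit():
            out.append(str(int(digest[i], 16) % 10))  # hash-derived digit
            i += 1
        else:
            out.append(ch)  # keep dashes, spaces, etc.
    return "".join(out)

masked = mask_digits("123-45-6789")
```

Because the substitution is one-way, a masked recovery dataset can be handed to a hot-site vendor without exposing the underlying values.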

From where I’m standing, this is a great product that delivers real value. I would like to see more graphical representation of DB structures in the user interface and support for MySQL, which is now on the preferred DBMS list at SAP and elsewhere. That said, Princeton Softech’s offering is world class and worth a look.

I would be interested in hearing from users of the product about their experiences. Write to me at jtoigo@toigopartners.com.

One final word: after seeing the Princeton Softech wares up close and personal, I have decided to add Jim Lee to the line-up at our forthcoming Data Management and Compliance Summit at Networld+Interop in Las Vegas, NV on May 3 through 5. I invite all readers to attend this important event and to chat with Jim about his product. I will post a URL shortly with additional info about the event.

About the Author

Jon William Toigo is chairman of The Data Management Institute, the CEO of data management consulting and research firm Toigo Partners International, as well as a contributing editor to Enterprise Systems and its Storage Strategies columnist. Mr. Toigo is the author of 14 books, including Disaster Recovery Planning, 3rd Edition, and The Holy Grail of Network Storage Management, both from Prentice Hall.