Q&A: How to Successfully Migrate Legacy Data
Updating applications with current technology means IT must make sure no data is left behind.
Updating applications with current technology means IT must make sure that data makes the move smoothly and efficiently as well. To learn more about the dangers and what's involved in upgrading data along with applications, we turned to the president of Southern Technology Group, Bob McCoy. He explains the pitfalls and best practices for keeping data intact when upgrading legacy systems.
Enterprise Strategies: Managing legacy systems and data is nothing new for IT. Why is such management harder today than it was, say, five years ago?
Bob McCoy: There are more technology sources, such as MO [magneto-optical] drives and jukeboxes, being retired because no one makes or maintains them any longer. These technologies were the backbone of the archival world and were promised a growth path. They are also very proprietary.
What are the biggest mistakes IT makes when trying to upgrade legacy systems?
In most companies today, money (cost) is always a driving force, and if a conversion needs to be done and wasn't budgeted, it can be very problematic, so the first impulse is to do it internally. Part of this is also respect for one's organization and job. The boss might think, "If I can't do this task, why am I drawing a salary?" when, in truth, conversions can be quite complex and no professional should feel ashamed about seeking help to do the job right the first time.
What often ends up happening is that IT expends resources at the cost of something else that needs to be done (we all know there are never enough IT people around), can't complete the job, and still has to go outside. If IT feels it can do this work, it should have a history of completing such work. Archived data is often too precious to experiment with, regardless of how good you believe you are.
As I mentioned, resources are often an issue and priorities are always a moving target. If a company is going to do this itself, it has to ask: at what cost to other projects? It may well be that the conversion is more important than those other projects, but if it takes IT three times as long, with a lower probability of success because of a lack of experience, the conversion will suffer, as will the other projects that were bumped. What are the real productivity savings? They're not always easy to define, but the question certainly merits collecting the thoughts and input of various sources.
How does IT transition to a new system? Must it retain the legacy info for compliance, security, competitive, historical or some other reason? If so, is it feasible to limp along on the legacy system as a new system is built?
The answer to this generally lies in the legal department. Different documents have different compliance requirements: HR has one set, AP/AR another. Compliance is a strong driving force, but historical value can be equally compelling.
IT has to work closely with each department and determine what its requirement is. In some cases, limping along is a legitimate option. As you build a strategy to implement a state-of-the-art system, it's important not to repeat past decisions, although, in fairness to those who made them, those choices may have been all that was available at the time.
One element everyone is getting much more savvy about is open architecture, which means using standards such as SQL, .tiff, or .pdf. Each has pros and cons, but all lend themselves to a much easier migration on the next go-round. I think we are all becoming aware that no technology lasts forever, so standards are an important part of IT's tool kit. If limping along is not feasible, then IT must migrate from an out-of-date and often proprietary system to one using popular standards such as those I've mentioned.
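To make the idea concrete, here is a minimal sketch (ours, not McCoy's) of batch-converting legacy images to an open format with Python's Pillow library. The directory names, and the assumption that the legacy system can export a format Pillow reads (BMP here), are hypothetical.

```python
# A minimal sketch of converting exported legacy images to TIFF.
# Assumes the legacy system can export BMP; all paths are hypothetical.
from pathlib import Path

from PIL import Image

SRC = Path("legacy_export")   # hypothetical export directory
DST = Path("archive_tiff")
DST.mkdir(exist_ok=True)

for src in sorted(SRC.glob("*.bmp")):
    with Image.open(src) as img:
        # Group 4 compression suits bitonal scans; LZW is a safe
        # lossless fallback for grayscale or color pages.
        compression = "group4" if img.mode == "1" else "tiff_lzw"
        img.save(DST / (src.stem + ".tif"), format="TIFF",
                 compression=compression)
```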
How much must IT understand about the structure behind the data to be successful?
I think today almost everyone in IT understands the Microsoft products. They may not be embraced by all, but in the archival world, where data does not get accessed often and can be easily forgotten until urgently needed, it is important that known standards be used: standards that a lot of folks have written code around and for which open source solutions are available.
I am not touting MS products one way or the other, but they have done a great job over the last 20 years building and growing the relational DB market with SQL. Keep in mind that archiving requires two components: the image and the index structure. A complete system cannot live without both, so having .tiff images with a little-known or proprietary data structure puts IT right back where it started. Any company that understands as much as possible about the information it manages generally has a leg up on its competition.
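As an illustration of the two components McCoy describes, here is a minimal sketch of an open, queryable index kept alongside the .tiff images, using Python's built-in SQLite. The table and column names are illustrative, not from any particular product.

```python
# A minimal sketch of an open index structure paired with image
# files on disk. Schema and sample values are hypothetical.
import sqlite3

con = sqlite3.connect("archive_index.db")
con.execute("""
    CREATE TABLE IF NOT EXISTS documents (
        doc_id     INTEGER PRIMARY KEY,
        image_path TEXT NOT NULL,   -- e.g., archive_tiff/0001.tif
        doc_type   TEXT NOT NULL,   -- e.g., 'invoice', 'HR file'
        created_on TEXT NOT NULL,   -- ISO-8601 date
        retention  TEXT             -- compliance retention class
    )
""")
con.execute(
    "INSERT INTO documents (image_path, doc_type, created_on, retention) "
    "VALUES (?, ?, ?, ?)",
    ("archive_tiff/0001.tif", "invoice", "2006-03-15", "AP-7yr"),
)
con.commit()
con.close()
```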
IT cannot afford to simply hand this task off to someone else; on the other hand, the management that funds IT has to understand that keeping up takes resources, and sometimes those resources are consultants.
What should IT do about data that cannot be converted, such as annotations, workflows, or images?
Annotations and images can almost always be converted. It often depends on what price you're willing to pay. There are rare occasions when the image format is so proprietary that it cannot be converted except by those who hold the keys to the kingdom. In that case, it's a business decision, and shame on them if they fall into the same pit again. This is where your corporate legal department can give some gentle guidance and make sure that you don't find yourself in the same position again.
Workflows are a different issue. Everyone's workflows are different, and there is no simple way to convert them because they depend more on code sequences than on data structure. That being said, there are very good workflow products, such as the Microsoft-based command set that Laserfiche uses, which is very graphical -- so a person can take a flow diagram (which one hopes is available from the old workflow) and reprogram the workflow in fairly short order. This can be done by a third party, but my recommendation is that it would be best in the long term to do this internally if possible, so future modifications and management of the workflow are much easier.
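One way to keep a rebuilt workflow maintainable in-house is to express the flow diagram as data rather than as buried code sequences. The sketch below is a generic illustration, not the Laserfiche command set; the states and actions are hypothetical.

```python
# A minimal sketch of a workflow captured as a state/transition
# table instead of hard-coded logic. States and actions are
# hypothetical examples.
WORKFLOW = {
    "received": {"approve": "review", "reject": "rejected"},
    "review":   {"approve": "archived", "reject": "rejected"},
    "archived": {},
    "rejected": {},
}

def step(state: str, action: str) -> str:
    """Return the next state, or raise if the action is invalid."""
    try:
        return WORKFLOW[state][action]
    except KeyError:
        raise ValueError(f"{action!r} is not allowed from {state!r}")

# Example: a document moves received -> review -> archived.
state = "received"
for action in ("approve", "approve"):
    state = step(state, action)
print(state)  # archived
```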
Keep in mind that employees leave, just as vendors go out of business, so IT and management have to treat this knowledge as an important asset and act accordingly.
If the effort is outside IT's expertise, what are some of the best ways to find a reseller/consultant to perform the work? How does IT recognize an expert, and how does it know the estimated price is fair and the project plan realistic?
I think that references are a good start. Frank discussion between IT and department heads is important. Sometimes the choices are pretty limited as to who can do the job, so try to check them out from several angles. Does the software supplier recommend this individual or company? How many of these conversions have they done? Do they speak the right language?
Have a programmer speak with them to determine whether they really understand the internal structures or are just using high-level words. Another way (and one that we almost always implement before we even accept the job) is to do a free sample. Not only does this build confidence with the client, but it tells us what we are getting into. Clients do not intentionally mislead us, but they do not always understand the nuances we have to deal with, so the free sample is a good way to uncover many of these ahead of time.
The price for a conversion is based more on what the data is worth to you than on the marketplace. First, there is no real marketplace for these services, so judgments have to be made on what the value proposition is and how comfortable you are with the competency of the company doing the work. Low cost is not always the best criterion. If I were in IT, I would ask at least three different people, from different departments and skill levels in my company, to interview the companies, then sit down together to compare notes. Ask yourselves whether you are all comfortable that this company can take your valuable data and preserve it.
It’s the same thing as “Beam me up, Scotty.” When your data gets in the transporter, it has to come back together again properly or it’s of zero value.
Is there a rule of thumb about how resource intensive such a project will be? Can you recommend a formula for estimating how much time a conversion will take?
Our pricing model is based on the total GB of images and data structure. We haven't changed the model in years, and it has proven 90 percent reliable. We get into trouble when the customer doesn't give us complete information.
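McCoy's actual rates aren't published, but a volume-based model of the kind he describes might look like the following hedged sketch; the per-GB rate and setup overhead are invented for illustration.

```python
# A hedged sketch of a GB-based time estimate. The rate and setup
# overhead are hypothetical, not McCoy's actual model.
def estimate_hours(image_gb: float, index_gb: float,
                   hours_per_gb: float = 0.5,
                   setup_hours: float = 40.0) -> float:
    """Rough conversion-time estimate from data volume alone."""
    return setup_hours + (image_gb + index_gb) * hours_per_gb

print(estimate_hours(image_gb=800, index_gb=20))  # 450.0
```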
What is the state of automated tools to help with data conversions?
This is a very customized business. Doing the same type of conversion over and over lends itself to using similar tools, but each system has its differences, so every conversion has to be set up manually to be sure the data maps over correctly.
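That manual setup usually amounts to a field map reviewed by hand for each system. A minimal sketch, with hypothetical legacy and target field names:

```python
# A minimal sketch of a hand-built field map from a legacy index
# to the new schema. All field names are hypothetical.
FIELD_MAP = {
    "DOC_NO":   "doc_id",      # legacy field -> new index column
    "DOCTYPE":  "doc_type",
    "SCANDATE": "created_on",
}

def map_record(legacy: dict) -> dict:
    """Translate one legacy index record into the new schema."""
    missing = [k for k in FIELD_MAP if k not in legacy]
    if missing:
        raise ValueError(f"fields absent from legacy record: {missing}")
    return {new: legacy[old] for old, new in FIELD_MAP.items()}

print(map_record({"DOC_NO": 17, "DOCTYPE": "invoice",
                  "SCANDATE": "2006-03-15"}))
```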
What consolidation is occurring in the industry? Are new document management systems becoming available as old ones are discontinued, or are options declining?
I think options are broadening. New products are always coming on the market, but there are some pretty good tried-and-true oldies, too. Laserfiche, for instance, has been around since 1985, but over that period its user interface, feature set, support options, and training options have grown exponentially, as have those of several other vendors. Others, such as Filenet, have bought up companies such as OTG, which means they have a broader offering. There are a number of price/performance options available to a company. Just perform a Google search on "document or records management" and see what turns up.
Archiving is an important part of compliance today and will become more important in the future, so the market will drive these opportunities. Another option: companies that provide EMR (electronic medical record) systems are integrating a type of document management into their solutions. These are not as robust as some of the standalone records management (RM) systems, but in some cases they're adequate. Where they are not, look for RM solutions that will easily integrate with the application you need to archive. Does it have a comprehensive SDK/API that works with several languages? Remember, one size does not fit all, and no single application can solve all your company's requirements.