Issues and Challenges Facing Legacy Systems
When Legacy Migration Makes Sense
Legacy migration is best suited for companies looking to implement a new business model, such as an Internet-based procurement or other B2B system, on either of the two major platforms: J2EE from Sun Microsystems and its partners, or Microsoft's .NET. Both emerging development/deployment environments support XML and SOAP, the standards used in exposing and consuming Web services across heterogeneous platforms. Another justification for embarking on a complex migration project is the increasing expense and difficulty of maintaining and modifying the old system.
The first step in the migration process is the analysis and assessment of the legacy system. Typically, this includes taking stock of all application artifacts, such as source code, copybooks, and Job Control Language (JCL). A complete database analysis is also necessary, covering tables, views, indexes, stored procedures, and triggers, as well as data profiling.
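The inventory step described above can be sketched as a simple script that walks a source tree and tallies artifacts by type. The extension-to-artifact mapping below is an assumption for illustration; real legacy codebases follow their own naming conventions.

```python
import os
from collections import Counter

# Hypothetical mapping from file extension to artifact type; adjust it
# to match the conventions of the legacy codebase being assessed.
ARTIFACT_TYPES = {
    ".cbl": "COBOL source",
    ".cpy": "copybook",
    ".jcl": "JCL job",
    ".sql": "database script",
}

def inventory(root):
    """Walk a source tree and count legacy artifacts by type."""
    counts = Counter()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower()
            if ext in ARTIFACT_TYPES:
                counts[ARTIFACT_TYPES[ext]] += 1
    return counts
```

An inventory like this is only the starting point; commercial assessment tools also parse the artifacts to extract dependencies and metrics.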
Database vendors, such as Oracle and IBM, provide tools that help automate the database migration, which is separate from the application migration. All source database schema elements must be mapped to the target database. Depending on the complexity of the system, from 80% to 90% of the migration process can be automated. However, there will always be stored procedures and triggers that an automated parser cannot decipher, requiring manual tweaking.
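The mapping step can be sketched as below: automatable elements are translated from a type dictionary, while anything the dictionary cannot handle is set aside for manual work, mirroring the 10% to 20% of a migration that tools leave behind. The type names are illustrative, not tied to any particular DBMS pair.

```python
# Illustrative type dictionary from a hypothetical source DBMS to a
# target DBMS; real migration tools ship far more complete mappings.
TYPE_MAP = {
    "CHAR": "CHAR",
    "VARCHAR2": "VARCHAR",
    "NUMBER": "NUMERIC",
    "DATE": "TIMESTAMP",
}

def map_columns(columns):
    """Translate (name, source_type) pairs to the target database,
    collecting unmapped types for manual review."""
    mapped, manual = [], []
    for name, src_type in columns:
        if src_type in TYPE_MAP:
            mapped.append((name, TYPE_MAP[src_type]))
        else:
            manual.append((name, src_type))
    return mapped, manual
```

The same split-into-automatic-and-manual pattern applies to stored procedures and triggers, where the "manual" bucket tends to be much larger.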
Database migration can add considerable time to completing a project. For example, Mercy Ships, a Christian charity organization headquartered near Tyler, Texas, migrated its 4GL Informix application on a SCO Unix server to Informix's Java-based Cloudscape database running on Linux. The project was necessary to reduce maintenance costs of the system used to track contributors and donations. In addition, the new system gave Mercy Ships a modern development platform for modifying and adding services.
Using an automated migration tool, Mercy Ships ported its 80,000-line application, called PartnerShip, in less than a month. But the total project, including setting up seven locations in Europe and the U.S. with databases and writing Java servlets for maintenance and replication, took seven months. Had everything stayed on the same database, the project would have been finished in about a month.
Legacy Application Migration
In the early stages of the migration process, core business logic must be identified and mapped out to show the interrelationships of the code performing the application's business function. Program-affinity analysis can be performed to produce call maps and process flow diagrams, which contain program-to-program call/link relationships. These maps and diagrams make it possible to visually identify connected clusters of programs, which are good indicators of related business activity. Companies providing tools to help in the analysis and assessment process include MigraTEC, Netron, Semantic Designs, and McCabe and Associates.
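The clustering idea behind program-affinity analysis can be illustrated with a few lines of code: treat program-to-program call/link pairs as an undirected graph and find its connected components. The program names below are made up for the example.

```python
from collections import defaultdict

def program_clusters(calls):
    """Group programs into connected clusters from program-to-program
    call/link pairs, as a program-affinity analysis would."""
    graph = defaultdict(set)
    for caller, callee in calls:
        graph[caller].add(callee)
        graph[callee].add(caller)
    seen, clusters = set(), []
    for start in graph:
        if start in seen:
            continue
        stack, cluster = [start], set()
        while stack:
            prog = stack.pop()
            if prog in cluster:
                continue
            cluster.add(prog)
            stack.extend(graph[prog] - cluster)
        seen |= cluster
        clusters.append(cluster)
    return clusters
```

Each cluster returned is a candidate unit of related business activity; commercial tools add visual call maps and process-flow diagrams on top of this kind of analysis.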
Once core business logic is identified and mapped, it can be broken up into standalone components deployable in client/server and Internet-based environments. This process creates collections of programs that perform a specific business function. In addition, the components have clearly defined APIs and can be accessed through modern, industry-standard protocols. Components can remain on the mainframe as COBOL, PL/I, or Natural programs, or be redeployed into modern, distributed environments, such as Java 2 or .NET.
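The shape of such a component can be sketched as a thin facade with a clearly defined API that hides whether the logic behind it is migrated code or a call back to the mainframe. The class, the record layout, and the field positions here are all hypothetical.

```python
class CustomerComponent:
    """A standalone component carved out of legacy code, exposing a
    clearly defined API. The backend may be migrated logic or a
    bridge to a mainframe transaction; callers cannot tell."""

    def __init__(self, backend):
        # backend: any callable returning a legacy-style fixed-width
        # record for a customer id (an assumption for this sketch).
        self._backend = backend

    def get_balance(self, customer_id):
        record = self._backend(customer_id)
        # Hypothetical layout: balance in cents at columns 20-29.
        return float(record[20:30]) / 100
```

Because the API is stable, the backend can later move off the mainframe without changing any caller, which is the point of componentization.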
As part of the transformation process, a special class of components that exist in every system needs to be identified. These components perform common system utility functions such as error reporting, transaction logging, and date-calculation routines, and usually work at a lower level of abstraction than business components. To avoid processing redundancy and to ensure consistency in system behavior, these components need to be standardized into a system-wide reusable utility library.
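A system-wide utility library of the kind described above might look like the sketch below: one error-reporting path and one date-calculation routine that every component shares, instead of many ad-hoc copies. The function names and error-code format are illustrative assumptions.

```python
import datetime
import logging

# A shared utility library consolidating low-level routines that
# otherwise recur throughout a legacy codebase.
log = logging.getLogger("syslib")

def report_error(code, detail):
    """Single, consistent error-reporting path for every component."""
    log.error("E%04d: %s", code, detail)
    return f"E{code:04d}"

def days_between(start, end):
    """One shared date-calculation routine (ISO dates assumed)."""
    fmt = "%Y-%m-%d"
    d1 = datetime.datetime.strptime(start, fmt).date()
    d2 = datetime.datetime.strptime(end, fmt).date()
    return (d2 - d1).days
```

Centralizing these routines removes processing redundancy and guarantees that, for example, every component computes date spans the same way.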
When selecting a migration tool, organizations need to consider the quality of the generated code. Tools that map every component in the legacy language to a code equivalent in the target language can be a major time saver. Developers who are experts in the legacy language will find it easier to understand the generated code if it comprises representations of the legacy code's language and structure.
In addition, organizations may find it more convenient to break up the conversion process into two steps. The first is the translation of existing code, data migration, and associated testing; the second is the addition of new functionality. Before making any changes to program logic and structure, organizations should first test the end-result of the migration process for functional equivalence with the original legacy application.
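Functional-equivalence testing of step one can be sketched as a harness that replays the same recorded inputs through the legacy and migrated systems and reports any divergence; only an empty report clears the way for step two, the new functionality. This harness is a minimal illustration, not a particular vendor's tool.

```python
def check_equivalence(legacy_fn, migrated_fn, test_inputs):
    """Replay recorded inputs through both systems and collect any
    (input, legacy_output, migrated_output) mismatches."""
    mismatches = []
    for args in test_inputs:
        old, new = legacy_fn(*args), migrated_fn(*args)
        if old != new:
            mismatches.append((args, old, new))
    return mismatches
```

In practice the two callables would wrap whole batch runs or transactions rather than single functions, but the discipline is the same: prove equivalence first, then change the logic.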
The Central Bank of Costa Rica is an example of an organization that chose migration to leverage the advantages of Microsoft's new .NET platform. BCCR had 1.3 million lines of Visual Basic 6.0 code in its automated payment system and its real-time gross settlement and clearinghouse service for inter-bank operations. The system serves more than 65 financial customers and processes about $500 million worth of transactions a day.
The project, scheduled for completion by the end of the year, will give BCCR a platform in which XML batch files can be validated faster, leading to fewer delays for customers. In addition, .NET applications provide support for the Windows messaging service, which will enable the Central Bank to send notifications to customers about information available on the bank's extranet, or when checks and fund transfers have cleared. (For more detail, see http://www.microsoft.com/presspass/features/2002/sep02/09-16CentralBank.asp.)
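The batch-validation idea can be illustrated with a short sketch that checks each payment record in an XML batch for required fields and flags the bad ones, so a whole batch need not be rejected for one faulty entry. The element and field names are invented for the example; the bank's actual schemas differ.

```python
import xml.etree.ElementTree as ET

# Hypothetical payment-batch layout used only for this illustration.
REQUIRED = ("debtor", "creditor", "amount")

def validate_batch(xml_text):
    """Return the indexes of <payment> records missing a required
    field within a <batch> document."""
    root = ET.fromstring(xml_text)
    bad = []
    for i, payment in enumerate(root.findall("payment")):
        if any(payment.find(field) is None for field in REQUIRED):
            bad.append(i)
    return bad
```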
Despite the challenges, legacy modernization is crucial for organizations spending too much to maintain the business value of their outdated information systems. Also driving the need for change is the industry's movement toward new Internet-based platforms, such as .NET and J2EE. These new computing paradigms leverage a component-based, distributed computing model capable of automating business processes internally or with partners via Web services. Adopting newer computing systems can cut operating costs and make it easier to adapt an IS to market changes or competitive pressure.
A variety of options exist, including replacing the system with a packaged application or non-intrusive measures such as screen scraping and code wrapping. Each of the approaches makes sense under certain circumstances. The latter methods provide quick and inexpensive access to legacy functionality, while the former can eliminate legacy applications in which the code quality is too poor to migrate. But for those companies looking to preserve and extend the functionality of their older system on a modern platform, legacy transformation using today's migration tools may be the most cost-effective approach.
Information from the Following Articles and White Papers Was Used in This Article
"Legacy Transformation," Declan Good, Club de Investigación Tecnológica, 2002
"The New Way of Doing Migration," Maynor Alvarado A, ArtinSoft, May 2002
"New Value for the Next Generation: Legacy Modernization," Hurwitz Group Inc., February 2002
"Leveraging Legacy Systems in Modern Architectures," Len Erlikh, Relativity Technologies Inc., 2001
"Chickens and Turkeys Migrate, But Not Necessarily In IT," Ben Wilson, ANUBEX, February 2002
About the Author, Dr. Federico Zoufaly
Dr. Zoufaly holds a Ph.D. in Computer Science from the University of Florida and is currently executive vice president in charge of operations at ArtinSoft. He has conducted extensive research into alternative ways of upgrading legacy applications and regularly presents on the subject at major technology events in North America and Europe. His most recent presentations include the VSLive! Orlando 2002 conference, VSLive! San Francisco 2001, and the Microsoft Professional Developers Conference (PDC) 2001.
Dr. Zoufaly, a faculty member of both the ITCR (Costa Rican Institute of Technology) and the University of Florida Computer Science Departments, has participated actively in several local electronics and computer research projects. He is a founding member of the Costa Rican Association of Electronic Engineers and has served on its Board of Directors since 1992, holding its presidency on two occasions. He is also a member of the Association for Computing Machinery and the IEEE, as well as an active member of the College of Technology Engineers.
Dr. Zoufaly can be reached at firstname.lastname@example.org.