Issues and Challenges Facing Legacy Systems

Maintaining and upgrading legacy systems is one of the most difficult challenges CIOs face today. Constant technological change often erodes the business value of legacy systems, which have been developed over the years through huge investments. CIOs struggle with the problem of modernizing these systems while keeping their functionality intact. Despite their obsolescence, legacy systems continue to provide a competitive advantage by supporting unique business processes and holding invaluable knowledge and historical data.

Despite the availability of more cost-effective technology, about 80% of IT systems are running on legacy platforms. International Data Corp. estimates that 200 billion lines of legacy code are still in use today on more than 10,000 large mainframe sites. The difficulty in accessing legacy applications is reflected in a December 2001 study by the Hurwitz Group that found only 10% of enterprises have fully integrated their most mission-critical business processes.

Driving the need for change is the cost versus the business value of legacy systems, which according to some industry polls consume as much as 85-90% of an IT budget in operation and maintenance. Monolithic legacy architectures are the antithesis of modern distributed and layered architectures. Legacy systems execute business policies and decisions that are hardwired into rigid, predefined process flows, making integration with customer relationship management (CRM) software and Internet-based business applications torturous and sometimes impossible. In addition, IT departments find it increasingly difficult to hire developers qualified to work on applications written in languages that modern platforms have left behind.

Several options exist for modernizing legacy systems, defined as any monolithic information system that’s too difficult and expensive to modify to meet new and constantly changing business requirements. Techniques range from quick fixes such as screen scraping and legacy wrapping to permanent, but more complex, solutions such as automated migration or replacing the system with a packaged product.

A Brief History

Debate on legacy modernization can be traced back more than a decade, to when reengineering experts argued over whether it was best to migrate a large, mission-critical information system piecemeal or all at once.

Rewriting a legacy system from scratch can create a functionally equivalent information system based on modern software techniques and hardware. But the high risk of failure associated with any large software project lessens the chances of success. Researchers from the pioneering 1991 DARWIN project at the University of California, Berkeley, listed several factors working against the so-called “Cold Turkey” approach:

  • Management rarely approves a major expenditure if the only result is lower maintenance costs, instead of additional business functionality.
  • Development of such massive systems takes years, so business processes not in the original plan will have to be added along the way to keep pace with the changing business climate, increasing the risk of failure.
  • Documentation for the old system is frequently inadequate.
  • Like most large projects, the development process will take longer than planned, testing management’s patience.
  • And finally, there’s a tendency for large projects to end up costing much more than anticipated.

DARWIN advocated the incremental approach, popularly referred to as “Chicken Little,” because it split a large project into manageable pieces. An organization could focus on reaching specific milestones throughout the long-term project, and management could see progress as each piece was deployed on the target system. Industry experts challenged this model several years later, arguing that the need for the legacy and target systems to interoperate via data gateways during migration added complexity to an already complex process, and that the gateways themselves posed a significant technical challenge.

Many migration projects failed because of the lack of mature automated migration tools to ease the complexity and technical challenges. That started to change in the mid-1990s with the availability of tools from companies such as Anubex, ArtinSoft, FreeSoft, and Relativity Technologies. These tools not only convert legacy code into modern languages, but, in doing so, also provide access to an array of commercially available components that provide sophisticated functionality and reduce development costs. They help break up a legacy system’s business knowledge into components accessible through modern industry-standard protocols, a component being a collection of objects that perform specific business services and have clearly defined application-programming interfaces (APIs).

Choosing A Modernization Approach

The Internet is often the driving force behind legacy modernization today. The Web can save an organization time and money by delivering to customers and partners business processes and information locked within a legacy system. The approach used in accessing back-office functionality will depend on how much of the system needs to be Internet-enabled.

Screen scrapers, often called “frontware,” are an option when the intent is to deliver Web access on the current legacy platform. These non-intrusive tools add a graphical user interface to character-based mainframe and minicomputer applications. Screen scrapers run on a personal computer, which acts as a terminal to the mainframe or minicomputer via 3270 or 5250 emulation. Popular screen scrapers include Star:Flashpoint, Mozart, and ESL. This technique provides Internet access to legacy applications without making any changes to the underlying platform. Because they’re non-intrusive, screen scrapers can be deployed in days and sometimes hours. However, scalability can be an issue because most legacy systems cannot handle nearly as many users as modern Internet-based platforms.
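To make the technique concrete, here is a minimal, hypothetical sketch of what the scraping layer does: the emulator delivers each 24x80 host screen as a flat character buffer, and the frontware pulls named fields out of fixed row/column positions so they can be presented through a modern interface. The screen layout and field positions below are invented for illustration.

```java
import java.util.Arrays;

// Hypothetical screen-scraping sketch: fields are read from fixed
// positions in a captured 24x80 host screen buffer.
public class ScreenScraperSketch {
    private static final int COLS = 80;

    /** Extracts the text at a fixed row/column position on the host screen. */
    static String field(char[] screen, int row, int col, int length) {
        return new String(screen, row * COLS + col, length).trim();
    }

    public static void main(String[] args) {
        // Simulate a captured green-screen buffer (normally supplied by the
        // 3270/5250 emulation layer).
        char[] screen = new char[24 * COLS];
        Arrays.fill(screen, ' ');
        String line = "CUST-NO: 000123   NAME: ACME FREIGHT";
        line.getChars(0, line.length(), screen, 2 * COLS + 1);

        // Map fixed screen positions to named fields for a Web front end.
        System.out.println("customerNumber = " + field(screen, 2, 10, 6));
        System.out.println("customerName   = " + field(screen, 2, 25, 12));
    }
}
```

The fragility of the approach is also visible here: if the host screen layout shifts by even one column, the scraper breaks, which is one reason frontware is considered a quick fix rather than a permanent solution.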

Legacy wrapping is a second non-intrusive approach. The technique builds callable APIs around legacy transactions, providing an integration point with other systems. Wrapping does not provide a way to fundamentally change the hardwired structure of the legacy system, but it is often used as an integration method with Enterprise Application Integration (EAI) frameworks provided by companies such as SeeBeyond Technology, Tibco, Vitria, and WebMethods.
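As an illustration of the wrapping pattern, below is a minimal sketch in which a hypothetical ACCTINQ account-inquiry transaction is exposed through a callable Java API; the legacy code itself is untouched. The transaction name, record layouts, and stubbed transport are all invented.

```java
// Hypothetical legacy-wrapping sketch: a thin Java layer marshals requests
// into the fixed-format record a host transaction expects and parses the
// fixed-format reply, exposing the transaction as a callable API.
public class LegacyWrapperSketch {

    /** The callable API that other systems integrate against. */
    interface AccountInquiryService {
        long balanceCents(String accountNumber);
    }

    static class WrappedAccountInquiry implements AccountInquiryService {
        public long balanceCents(String accountNumber) {
            // Marshal the request into the record layout the host expects.
            String request = String.format("ACCTINQ %-10s", accountNumber);
            // Send it through whatever connector is available (gateway,
            // message queue, terminal emulation), then parse the reply.
            String reply = sendToHost(request);
            return Long.parseLong(reply.substring(0, 12).trim());
        }

        private String sendToHost(String request) {
            return "      104250";  // transport stubbed out for the sketch
        }
    }

    public static void main(String[] args) {
        AccountInquiryService service = new WrappedAccountInquiry();
        System.out.println("balance (cents) = " + service.balanceCents("000123"));
    }
}
```

The value of the pattern is the API boundary: an EAI framework integrates against AccountInquiryService without knowing anything about terminal data streams or copybook layouts.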

EAI moves away from rigid application-to-application connectivity to more loosely connected message- or event-based approaches. The middleware also includes data translation and transformation, rules- and content-based routing, and connectors (often called adapters) to packaged applications. Vendors generally offer one of three system-wide integration architectures: hub-and-spoke, publish and subscribe, or business process automation. XML-based EAI tools are considered the state of the art in loosely coupled modern architectures.
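To illustrate the loosely coupled, publish-and-subscribe style in miniature, here is a hedged sketch of a toy in-memory broker: publishers send to a named topic and hold no direct reference to subscribers. Topic names and messages are invented, and a real EAI product layers on the translation, routing, and adapters described above.

```java
import java.util.*;
import java.util.function.Consumer;

// Toy publish-and-subscribe broker: applications are coupled only to
// topic names, never directly to one another.
public class MiniMessageBroker {
    private final Map<String, List<Consumer<String>>> subscribers = new HashMap<>();

    public void subscribe(String topic, Consumer<String> handler) {
        subscribers.computeIfAbsent(topic, t -> new ArrayList<>()).add(handler);
    }

    public void publish(String topic, String message) {
        for (Consumer<String> handler :
                subscribers.getOrDefault(topic, Collections.emptyList())) {
            handler.accept(message);
        }
    }

    public static void main(String[] args) {
        MiniMessageBroker broker = new MiniMessageBroker();
        // The CRM and billing systems each react to order events without
        // knowing who produced them.
        broker.subscribe("orders.created", m -> System.out.println("CRM saw: " + m));
        broker.subscribe("orders.created", m -> System.out.println("Billing saw: " + m));
        broker.publish("orders.created", "<order id=\"42\" total=\"19.95\"/>");
    }
}
```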

EAI vendors advocate wrapping as a way to tap legacy data while avoiding the misery of trying to modify the underlying platform. This approach also enables integration vendors to focus on the communications and connectivity aspects of their solutions, while avoiding the complexity of legacy systems. Like screen scraping, wrapping techniques are applicable in situations where there’s no need to change business functionality in the existing platform. However, none of these approaches addresses the high cost of maintaining a legacy system or the difficulty of finding IT professionals willing to work on obsolete technology.

Another option is replacing an older information system with modern, packaged software and hardware from any one of a variety of ERP vendors, including Lawson Software, Manugistics, PeopleSoft, Oracle, and SAP. This approach makes sense when the code quality of the original system is so poor that it can’t be reused. However, deploying a modern ERP system is not a panacea. An organization either has to customize the software or conform to its business processes. The first option is necessary if the original system was custom-made and provided a critical business advantage. Over the last couple of years, major ERP vendors have added tools to help adapt their applications to a customer’s specific needs. However, customization still carries enormous risks that the system won’t be able to duplicate a unique set of business processes.

In addition, a packaged system requires retraining of end users whose productivity will slow as they adjust to a new way of doing their jobs. IT staff also will need training on the new system. Finally, ERP applications carry hefty licensing fees that remain throughout the life of the software.

When Legacy Migration Makes Sense

Legacy migration is best suited for companies looking to implement a new business model, such as an Internet-based procurement or other B2B system on either of the two major platforms, J2EE from Sun Microsystems and partners or Microsoft’s .NET. Both emerging development/deployment environments support XML and SOAP, standards used in exporting and consuming Web services across heterogeneous platforms. Another justification for embarking on a complex migration project would be the increasing expense and difficulty of maintaining and modifying the old system.
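For readers unfamiliar with the wire format, the sketch below prints the kind of SOAP 1.1 request envelope such platforms exchange across heterogeneous systems. The service name, namespace, and account field are invented; only the envelope structure follows the SOAP 1.1 convention.

```java
// Hypothetical SOAP 1.1 request envelope for an account-balance Web
// service; in practice this XML would be POSTed to the service endpoint.
public class SoapEnvelopeSketch {
    public static void main(String[] args) {
        String envelope =
            "<?xml version=\"1.0\"?>\n" +
            "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">\n" +
            "  <soap:Body>\n" +
            "    <GetBalance xmlns=\"urn:example:accounts\">\n" +
            "      <AccountNumber>000123</AccountNumber>\n" +
            "    </GetBalance>\n" +
            "  </soap:Body>\n" +
            "</soap:Envelope>";
        System.out.println(envelope);
    }
}
```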

The first step in the migration process is the analysis and assessment of the legacy system. Typically, this includes taking stock of all application artifacts, such as source code, copybooks, and Job Control Language. A complete database analysis is also necessary, covering tables, views, indexes, stored procedures, and triggers, along with data profiling.

Database vendors, such as Oracle and IBM, provide tools that help automate the database migration, which is separate from the application migration. All source database schema elements must be mapped to the target database. Depending on the complexity of the system, from 80% to 90% of the migration process can be automated. However, there will always be stored procedures and triggers that are indecipherable to an automated parser and require manual tweaking.
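As a small illustration of the mapping step, here is a hedged sketch of how a migration tool might translate source column types to target equivalents, flagging anything it cannot map for the manual tweaking described above. The type pairs are illustrative, not drawn from any vendor’s tool.

```java
import java.util.Map;

// Hypothetical source-to-target column type mapping; unmapped types are
// flagged for manual conversion rather than silently guessed.
public class ColumnTypeMapper {
    private static final Map<String, String> TYPE_MAP = Map.of(
            "CHAR",     "CHAR",
            "VARCHAR2", "VARCHAR",
            "NUMBER",   "NUMERIC",
            "DATE",     "TIMESTAMP");

    /** Returns the target type, or null when manual review is required. */
    static String mapType(String sourceType) {
        return TYPE_MAP.get(sourceType.toUpperCase());
    }

    public static void main(String[] args) {
        for (String t : new String[] {"VARCHAR2", "NUMBER", "LONG RAW"}) {
            String mapped = mapType(t);
            System.out.println(t + " -> "
                    + (mapped != null ? mapped : "UNMAPPED: flag for manual conversion"));
        }
    }
}
```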

Database migration can add considerable time to completing a project. For example, Mercy Ships, a Christian charity organization headquartered near Tyler, Texas, migrated its 4GL Informix application on a SCO Unix server to Informix’s Java-based Cloudscape database running on Linux. The project was necessary to reduce maintenance costs of the system used to track contributors and donations. In addition, the new system gave Mercy Ships a modern development platform for modifying and adding services.

Using an automated migration tool, Mercy Ships ported its 80,000-line application, called PartnerShip, in less than a month. But the total project, including setting up seven locations in Europe and the U.S. with databases and writing Java servlets for maintenance and replication, took seven months. If everything had stayed on the same database, then the project would have been finished in about a month.

Legacy Application Migration

In the early stages of the migration process, core business logic must be identified and mapped out to show the interrelationships of the code performing the application’s business function. Program-affinity analysis can be performed to produce call maps and process flow diagrams, which contain program-to-program call/link relationships. These maps and diagrams make it possible to visually identify connected clusters of programs, which are good indicators of related business activity. Companies providing tools to help in the analysis and assessment process include MigraTEC, Netron, Semantic Designs, and McCabe and Associates.
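The sketch below shows the idea behind affinity analysis, assuming the call relationships have already been extracted from source code: programs form an undirected graph, and connected clusters fall out of a simple traversal. The program names are invented.

```java
import java.util.*;

// Hypothetical program-affinity sketch: build an undirected graph from
// program-to-program call relationships, then report connected clusters.
public class CallGraphClusters {
    public static void main(String[] args) {
        Map<String, List<String>> calls = Map.of(
                "ORDENTRY", List.of("PRICECAL", "STOCKCHK"),
                "INVOICE",  List.of("PRICECAL"),
                "PAYROLL",  List.of("TAXCALC"));

        // Make the call graph undirected so caller and callee cluster together.
        Map<String, Set<String>> adj = new HashMap<>();
        calls.forEach((caller, callees) -> callees.forEach(callee -> {
            adj.computeIfAbsent(caller, k -> new HashSet<>()).add(callee);
            adj.computeIfAbsent(callee, k -> new HashSet<>()).add(caller);
        }));

        // Depth-first flood fill: each connected cluster is a candidate
        // grouping of related business activity.
        Set<String> seen = new HashSet<>();
        for (String start : adj.keySet()) {
            if (!seen.add(start)) continue;
            Deque<String> stack = new ArrayDeque<>(List.of(start));
            Set<String> cluster = new TreeSet<>();
            while (!stack.isEmpty()) {
                String program = stack.pop();
                cluster.add(program);
                for (String next : adj.getOrDefault(program, Set.of()))
                    if (seen.add(next)) stack.push(next);
            }
            System.out.println("Cluster: " + cluster);
        }
    }
}
```

Here the order-entry, invoicing, and pricing programs fall into one cluster and payroll into another, mirroring the kind of grouping a call map makes visible.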

Once core business logic is identified and mapped, it can be broken up into standalone components deployable on client/server and Internet-based environments. This process creates collections of programs that perform a specific business function. In addition, the components have clearly defined APIs and can be accessed through modern, industry-standard protocols. Components can remain on the mainframe as COBOL, PL/I, or Natural programs, or be redeployed into modern, distributed environments such as Java 2 or .NET.

As part of the transformation process, a special class of components that exist in every system needs to be identified. These components perform common system utility functions such as error reporting, transaction logging, and date-calculation routines, and usually work at a lower level of abstraction than business components. To avoid processing redundancy and to ensure consistency in system behavior, these components need to be standardized into a system-wide reusable utility library.
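As a small example of the pattern, the sketch below standardizes a date-calculation routine in one shared utility class rather than leaving per-program copies scattered through the system; the class and method names are invented.

```java
import java.time.LocalDate;
import java.time.temporal.ChronoUnit;

// Hypothetical system-wide utility library: one shared implementation of
// a date-difference routine replaces redundant per-program versions.
public final class SystemUtils {
    private SystemUtils() {}  // static utility class, never instantiated

    /** Every component calls this one routine, ensuring consistent behavior. */
    public static long daysBetween(LocalDate from, LocalDate to) {
        return ChronoUnit.DAYS.between(from, to);
    }

    public static void main(String[] args) {
        System.out.println(daysBetween(LocalDate.of(2002, 1, 1),
                                       LocalDate.of(2002, 12, 31)));  // prints 364
    }
}
```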

When selecting a migration tool, organizations need to consider the quality of the generated code. Tools that map every component in the legacy language to a code equivalent in the target language can be a major time saver. Developers who are experts in the legacy language will find it easier to understand the generated code if it comprises representations of the legacy code’s language and structure.

In addition, organizations may find it more convenient to break up the conversion process into two steps. The first is the translation of existing code, data migration, and associated testing; the second is the addition of new functionality. Before making any changes to program logic and structure, organizations should first test the end-result of the migration process for functional equivalence with the original legacy application.
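A hedged sketch of such a functional-equivalence check appears below: the same inputs run through stubs standing in for the legacy and migrated versions, and outputs are compared before any new functionality is added. The payroll example and both stubs are invented; the deliberate integer-rounding difference between them shows exactly the kind of discrepancy this step is meant to catch.

```java
import java.util.List;

// Hypothetical functional-equivalence harness: identical inputs are fed
// to the legacy and migrated implementations and outputs are compared.
public class EquivalenceHarness {
    interface PayrollCalc { long netPayCents(long grossCents); }

    public static void main(String[] args) {
        // Stubs standing in for the two systems; their integer arithmetic
        // rounds differently, so small inputs expose a mismatch.
        PayrollCalc legacy   = gross -> gross - (gross * 20 / 100);
        PayrollCalc migrated = gross -> gross * 80 / 100;

        for (long gross : List.of(100_000L, 250_050L, 3L)) {
            long a = legacy.netPayCents(gross);
            long b = migrated.netPayCents(gross);
            System.out.printf("gross=%d legacy=%d migrated=%d %s%n",
                    gross, a, b, a == b ? "OK" : "MISMATCH");
        }
    }
}
```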

The Central Bank of Costa Rica (BCCR) is an example of an organization that chose migration to leverage the advantages of Microsoft’s new .NET platform. BCCR had 1.3 million lines of Visual Basic 6.0 code in its automated payment system and its real-time gross settlement and clearinghouse service for inter-banking operations. The system serves more than 65 financial customers and processes about $500 million worth of transactions a day.

The project, scheduled for completion by the end of the year, will give BCCR a platform in which XML batch files can be validated faster, leading to fewer delays for customers. In addition, .NET applications provide support for the Windows messaging service, which will enable the Central Bank to send notifications to customers about information available on the bank’s extranet, or when checks and fund transfers have cleared. (For more detail, see http://www.microsoft.com/presspass/features/2002/sep02/09-16CentralBank.asp.)

Conclusion

Despite the challenges, legacy modernization is crucial for organizations spending too much to maintain the business value of their outdated information systems. Also driving the need for change is the industry’s movement toward new Internet-based platforms, such as .NET and J2EE. These new computing paradigms leverage a component-based, distributed computing model capable of automating business processes internally or with partners via Web services. Adopting newer computing systems can cut operating costs and make it easier to adapt an IS to market changes or competitive pressure.

A variety of options exist, including replacing the system with a packaged application or non-intrusive measures such as screen scraping and code wrapping. Each of the approaches makes sense under certain circumstances. The latter methods provide quick and inexpensive access to legacy functionality, while the former can eliminate legacy applications in which the code quality is too poor to migrate. But for those companies looking to preserve and extend the functionality of their older system on a modern platform, legacy transformation using today’s migration tools may be the most cost-effective approach.

Information from the Following Articles and White Papers Was Used in This Article

“Legacy Transformation,” Declan Good, Club de Investigación Tecnológica, 2002

“The New Way of Doing Migration,” Maynor Alvarado A., ArtinSoft, May 2002

“New Value for the Next Generation: Legacy Modernization,” Hurwitz Group Inc., February 2002

“Leveraging Legacy Systems in Modern Architectures,” Len Erlikh, Relativity Technologies Inc., 2001

“Chickens and Turkeys Migrate, But Not Necessarily in IT,” Ben Wilson, Anubex, February 2002

About the Author, Dr. Federico Zoufaly

Dr. Zoufaly holds a Ph.D. in Computer Science from the University of Florida and is currently an executive vice president in charge of operations at ArtinSoft. He has done extensive research into alternative ways of upgrading legacy applications and is a regular presenter on the subject at major technology events in North America and Europe. His most recent presentations include the VSLive! Orlando 2002 Conference, VSLive! San Francisco 2001, and the Microsoft Professional Developers Conference PDC 2001.

Dr. Zoufaly, a faculty member of both the ITCR (Costa Rican Institute of Technology) and the University of Florida Computer Science Departments, has participated actively in several local electronics and computing research projects. He is a founding member of the Costa Rican Association of Electronic Engineers and has served on its Board of Directors since 1992, holding its presidency on two occasions. He is also a member of the Association for Computing Machinery and the IEEE, as well as an active member of the College of Technology Engineers.

Dr. Zoufaly can be reached at fzoufaly@artinsoft.com.
