As dedicated surfers, we know there's an enormous amount of information on the Internet. However, as developers, we also know it's not easy to harness that power in our applications, because the data is in a human-friendly but somewhat computer-hostile format: HTML.
XML, the eXtensible Markup Language, promised to change all this. When XML appeared, there was talk of more-intelligent searches, thanks to better-structured content. Several months down the road, it's time to stop and review the progress so far, as well as the road ahead.
One word of warning first: XML was only finalized in February 1998. So, at the time of this writing, none of its companion standards (XSL, XLink, DOM, RDF, SMIL and so on) are final. The situation is a familiar one for Internet developers: There is a frustrating gap between where we are and where we're heading. In this article, we will try to draw a clear line between what is available today and what remains a forward-looking statement.
A look at three-tier
Before going any further, let's briefly review three-tier architectures in general. These architectures implement a form of client/server in which the application is divided into three layers (see Figure 1):
- the data layer
- the logic layer, or middle-tier
- the presentation layer, or client.
Figure 1: Three-tier architecture
The logic layer differentiates three-tier from “traditional” two-tier client/server architectures. In practice, it means that the fat clients from traditional client/server have been broken into two pieces:
- a thin client, usually a Web browser for display and data entry
- the application logic, running on a server.
This architecture has some unique advantages — for example, because the client is simple and logic is centralized, deployment is made easier.
Why XML?
Three-tier client/server is not new; several tools, such as Netscape Application Server, support this architecture. These applications are typically built on top of middleware such as CORBA, DCOM or RPC. However, this middleware was designed primarily for distributed applications deployed within a single organization. Traditional middleware is very efficient for such intra-organization applications; unfortunately, it falls short when the application must span several organizations.
Indeed, traditional middleware enforces a very strong coupling between the various tiers. Such a strong coupling is difficult to achieve when the tiers are run by different organizations. It requires a very significant standardization effort, which is not always acceptable.
For multi-organization applications, there is a need for a different approach to middleware, with a different trade-off in terms of performance and flexibility: slightly less efficient but dramatically more flexible. For these applications, XML appears to be a viable alternative.
One example
Imagine we want to build a worldwide investment system. Such an application would integrate data from stock exchanges (NYSE, the Belgian stock exchange or other bourses), mutual fund managers (Fidelity, Schwab and so forth) and more from all over the world. The information is readily available on the Internet; in fact, there's an enormous amount of investor information online. However, the information is fragmented: Some data is on one Web site, other data on another. Furthermore, the information is available in a format intended for human beings, HTML, so it's difficult to compare and aggregate data from different sources automatically.
With HTML technology, the normal approach is hyperlinking. In other words, one would create a Web site which links to the other sites, à la Yahoo!.
However, if the same data were available in XML (acting as a data layer), then an investment application could munch it all and compile useful reports in XML (the logic layer) for the end user (through the presentation layer). This is the vision behind three-tier XML distributed applications.
Obviously the same vision holds for non-financial applications, such as travel (pulling together weather, flight and sightseeing data), electronic shopping (aggregating pricing, shipping and payment data) or even an intranet (combining internal production data with a supplier's stock and pricing).
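To make the vision concrete, suppose each source published a small XML quote feed; the logic layer could then merge them with very little code. Here is a minimal sketch in JavaScript (the feeds, element names and function names are invented for illustration; a real application would use a proper XML parser rather than a regular expression):

```javascript
// Toy XML quote feeds, as two hypothetical sources might publish them.
// The <quote> element and its attributes are invented for illustration.
const nyseFeed = '<quotes><quote symbol="T" price="42.50"/></quotes>';
const brusselsFeed = '<quotes><quote symbol="PETR" price="1310.00"/></quotes>';

// Naive extraction: pull symbol/price attribute pairs out of a feed.
// A production system would use a real XML parser instead of a regex.
function parseQuotes(xml) {
  const quotes = [];
  const pattern = /<quote symbol="([^"]+)" price="([^"]+)"\/>/g;
  let match;
  while ((match = pattern.exec(xml)) !== null) {
    quotes.push({ symbol: match[1], price: parseFloat(match[2]) });
  }
  return quotes;
}

// The logic layer: aggregate quotes from every source into one report.
function aggregate(feeds) {
  return feeds.flatMap(parseQuotes);
}

console.log(aggregate([nyseFeed, brusselsFeed]));
```

The point is not the parsing technique but the architecture: because every source speaks the same self-describing format, the aggregation step is trivial, which is exactly what HTML screen-scraping cannot offer.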
Some new tools
Tool vendors and think tanks (such as the XML/EDI Group) are lining up behind the idea of Web-wide distributed applications. We have chosen, somewhat arbitrarily, two technologies as illustrations:
- WDDX from Allaire
- WIDL from webMethods.
WDDX, the Web Distributed Data eXchange, is an XML-based format published by Allaire for serializing application data structures so they can be exchanged between environments such as ColdFusion, Java and Perl.

WIDL, the Web Interface Definition Language, is another proposal, put forward by webMethods. WIDL automates Web site navigation, which makes it possible to turn existing Web sites into XML data sources. WIDL is implemented in webMethods' B2B Integration Server, which acts as a middle tier in front of XML sources and converted HTML sources.
Likewise, major vendors including Oracle, SAP, IBM and Microsoft have announced plans to integrate XML into their transaction-based solutions. For example, SAP has announced it will release an XML API for its flagship product, R/3.
And existing ones
As noted previously, many of the most powerful XML tools are still announcements. However, even with today's tools it is possible to build XML-enabled Web sites.
Listing 1: Hacking SSJS into producing XML
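The listing itself is not reproduced here, but the trick is simple: a server-side script can "produce XML" by overriding the default text/html content type and writing the markup by hand. The following is a rough stand-in written in plain JavaScript rather than Netscape's SSJS (the helper names and the quote data are invented for illustration, not SSJS APIs):

```javascript
// Escape the characters XML reserves, so arbitrary data
// can be embedded safely in attribute values and text.
function escapeXml(text) {
  return String(text)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}

// Build an XML document by hand, one element per quote.
function buildQuoteXml(quotes) {
  const rows = quotes
    .map(q => `  <quote symbol="${escapeXml(q.symbol)}" price="${q.price}"/>`)
    .join('\n');
  return `<?xml version="1.0"?>\n<quotes>\n${rows}\n</quotes>`;
}

// In SSJS one would write() this string to the client after
// setting the content type to text/xml; here we just print it.
console.log(buildQuoteXml([{ symbol: 'AT&T', price: 42.5 }]));
```

Nothing here depends on an XML toolkit, which is precisely why the approach works with today's servers: any environment that can emit text can emit well-formed XML.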
Conclusion
XML is unlikely to replace traditional middleware such as CORBA or DCOM for intranets, but it could enable a new class of distributed applications that span organizations. In fact, thanks to XML, one day our applications may be able to tap the largest online database: the Web.
Special thanks to XML/EDI Group members, in particular Martin Bryan and David Webber, for many enlightening discussions of XML.
Benoît Marchal runs his own consulting company, Pineapplesoft sprl. His interests include distributed applications, object-oriented programming and design, system programming, hand-helds, and computer languages (notably including Java). He also enjoys teaching and writing. He can be contacted at email@example.com.