Technology of the Year 2006


In 2005, a number of technologies continued to be important. Several, however, rose to the forefront of the news and of what developers were doing. Of these, four were recognized as the most important (and possibly most visible) of 2005, and they became the finalists for the Technology of the Year recognition: AJAX, RSS, SOA, and virtualization.

You might ask the question, where is blah? Of course, “blah” might be blogging, Web services, business intelligence, grid computing, multi-core processing, file sharing, or any of hundreds of other technologies that were present in 2005. Although some of these technologies were mentioned and nominated for the recognition, none of them received the same type of push as the four that made it into the finalist category.

And the Winner Is…

Prior to the start of the voting, it was anyone’s guess which technology would win as the Technology of the Year. All four of the finalists were very visible and very important in 2005 and will continue to be so in 2006. Early projections had RSS taking the lead spot. Enterprise developers were promoting SOA for the winning recognition. Virtualization is a technology that many developers have come to love as well. It was, however, AJAX that took the top honor of Technology of the Year.


AJAX

AJAX is not only tough on grease, but it also can be used for creating smoothly operating Web sites! AJAX, as named by Jesse James Garrett of Adaptive Path, stands for Asynchronous JavaScript and XML. It is an acronym that describes technologies that have been around for many years. Both XML and JavaScript have become commonplace in developing Web pages and Web sites.

In addition to XML and JavaScript, AJAX takes advantage of other technologies. This includes DHTML, which is primarily JavaScript combined with Cascading Style Sheets (CSS). Additionally, it takes advantage of HTTP requests and responses issued in the background, typically through the browser’s XMLHttpRequest object.

AJAX takes all of these technologies and twists them to a new level, providing sites that look and feel like standard applications through rich, interactive content. This can include features such as displaying additional data, zooming in or out of a picture or map, dragging and dropping content, and more. All of this happens without the normal delays of noticeable round trips back to the server.

Figure 1: Using AJAX for a color dialog

Sites appear to be more interactive by obtaining information in advance of needing it. Additionally, information can be pulled from the server as a background process before it is needed. The end result is that a site can seem to be more responsive.
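The callback-driven pattern behind this is small. The sketch below shows the shape of an AJAX response handler; the request object here is a tiny mock standing in for the browser’s XMLHttpRequest object (its field names match the real one), so the flow can be seen without a browser or server. The data value and the updatePage callback are illustrative assumptions, not part of any particular site.

```javascript
// A sketch of the AJAX callback pattern. In a browser, "req" would be
// an XMLHttpRequest; here a tiny mock stands in so the flow is visible
// without a server. The field names match the real object.
function watchRequest(req, updatePage) {
  req.onreadystatechange = function () {
    // readyState 4 = request complete; status 200 = HTTP OK
    if (req.readyState === 4 && req.status === 200) {
      updatePage(req.responseText); // refresh one region, not the whole page
    }
  };
}

// Mock request object standing in for XMLHttpRequest.
var mockReq = { readyState: 0, status: 0, responseText: "", onreadystatechange: null };

var shown = null;
watchRequest(mockReq, function (text) { shown = text; });

// Simulate the server's reply arriving in the background.
mockReq.readyState = 4;
mockReq.status = 200;
mockReq.responseText = "<zoomLevel>12</zoomLevel>";
mockReq.onreadystatechange();

console.log(shown); // the page region would now show the fetched data
```

In real code, the request is opened with `req.open("GET", url, true)` and sent asynchronously; the page keeps responding to the user while the response is in flight, which is what makes AJAX sites feel like desktop applications.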

In a recent article, Andre Charland lists his top ten reasons why AJAX is here to stay. The best way to understand the power of AJAX, and why it has caught the attention of Web developers, is to look at sites that are already using it.

The ability to zoom, to drag and drop, and to display additional data “on the fly” makes such sites far more responsive to what a user wants to experience. As AJAX evolves, it is expected that even more robust applications will be created, including applications for functionality such as word processing, spreadsheets, and presentations.

AJAX is not an “all powerful” tool, nor is it the right technology to use for everything. Even so, it is being used to bring Web sites to the next level of interactivity. As such, it is not a surprise to see that it dominated the voting to become the Technology of the Year for 2006.

If you’d like to learn more about AJAX, you can check out Jesse James Garrett’s article, which includes a Q&A regarding some of the history.


RSS

Although AJAX won the Technology of the Year, the three other finalists deserve recognition as well. One technology that might be better known than AJAX is RSS.

RSS, in its simplest definition, is a format for syndicating content from the Web. It is actually a standardized usage of XML. This standardization, or specification, allows anyone to pull information from an RSS file because they know exactly what markup to expect.

RSS has a history going back to 1997, when Dave Winer created the scriptingNews format. In 1999, Netscape designed RSS 0.90, which supported that format. Over the years, RSS evolved through several versions, including 0.91 and 0.92; the most recent release, RSS 2.0, arrived in 2002. The RSS 2.0 specification was released to the public by Harvard under a Creative Commons license, which allows the work to be used both commercially and non-commercially.

As mentioned, RSS is simply an XML format that can be used to mark up content that is to be shared with others (syndicated). The format requires tags identifying the channel or site: a title, a link to the channel or site, and a description of it. A number of other tidbits can be included to describe the feed, including language, copyright, managing editor information, publication date, category, a time to live (ttl), and a rating.

Within the channel, a number of items can be specified. Whereas each item has to have at least a title or description, it may also include a link to the item, an author, a category, comments, a guid, a pubDate, a source, and more.
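Put together, a minimal feed satisfying these rules might look like the following. The channel and item values here are invented for illustration; only the element names come from the RSS 2.0 specification.

```xml
<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <!-- The three required channel elements -->
    <title>Example Developer News</title>
    <link>http://www.example.com/</link>
    <description>Hypothetical feed illustrating the RSS 2.0 layout.</description>
    <!-- A few of the optional channel elements -->
    <language>en-us</language>
    <ttl>60</ttl>
    <item>
      <!-- Each item needs at least a title or a description -->
      <title>AJAX named Technology of the Year</title>
      <link>http://www.example.com/ajax</link>
      <pubDate>Mon, 16 Jan 2006 09:00:00 GMT</pubDate>
      <guid>http://www.example.com/ajax</guid>
    </item>
  </channel>
</rss>
```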

Each of these items has specific tags and rules for their use, as detailed in the specification. The end result, however, is a standardized XML file layout that can be easily used by others. It is the flourishing of RSS aggregators, and the extension of programs such as Firefox and Microsoft Internet Explorer, that has helped to make RSS even more popular.
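That predictability is what makes aggregators easy to write. As a sketch (a naive string scan where a real aggregator would use a proper XML parser, with an invented two-item feed), pulling the item titles out of an RSS 2.0 document takes only a few lines of JavaScript:

```javascript
// Extract the title of each <item> from an RSS 2.0 document.
// Naive regular-expression scan for illustration only; real code
// should use an XML parser.
function extractItemTitles(rss) {
  var titles = [];
  var re = /<item>[\s\S]*?<title>([\s\S]*?)<\/title>/g;
  var m;
  while ((m = re.exec(rss)) !== null) {
    titles.push(m[1]); // captured text between an item's title tags
  }
  return titles;
}

// A tiny invented feed to run the function against.
var feed =
  "<rss version=\"2.0\"><channel>" +
  "<title>Example Feed</title>" +
  "<item><title>First post</title></item>" +
  "<item><title>Second post</title></item>" +
  "</channel></rss>";

console.log(extractItemTitles(feed)); // [ 'First post', 'Second post' ]
```

Because every feed follows the same layout, this same code works against any RSS 2.0 source, which is exactly the point of the standardization.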

One technology that was nominated for the recognition but didn’t make the finalist list was blogging. As blogs have become popular, it is no surprise that serious blogs now incorporate RSS as well. This allows people to use an RSS aggregator to have the information from the blog come to them, rather than having to visit the Web page where the blog resides. Figure 2 shows the Firefox browser with the Sage add-in for viewing RSS feeds. As you can see, the information from any feed is easily formatted and displayed because a single standard is being used.

Figure 2: Sage RSS Feed add-in for Firefox brings RSS content to you.

Additionally, RSS feeds can be incorporated into applications and Web sites. Ultimately, RSS has provided a new, standardized method for disseminating information. More importantly, it has made it easy for anyone to tap into that information and use it in a variety of creative ways.


SOA

Another concept that has risen to the top over the past few years is SOA, which stands for Service-Oriented Architecture. Like RSS and AJAX, the core of SOA is not a bunch of new technologies, but rather, a more focused use of existing technologies. Also like other technologies, what SOA means depends on whom you ask.

In general, service-oriented architecture focuses on how you design and architect solutions more so than the implementation of a given solution. More importantly, it focuses on breaking your business functions into individual components or services that can be used for one or more applications. By breaking business functions into services, the theory is that these services then can be easily used as needed in one or more applications.

In the past, it has been common to build big applications that have all the functionality built in. With SOA, rather than building in functionality that may be needed or used elsewhere, that functionality is built separately as a unique service that can be called upon and that may then return results to the application. More importantly, the service is built by abstracting the business logic and creating a formal process for interacting with any implementation of that logic. This formal process should contain information on the endpoints and interface for the service. Once the service is built and tested, it then is available to be used by any or all applications without the need to test its internals again. Thus, it is reusable.
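In miniature, the idea looks like the following sketch. The service name, contract, and business rule are all hypothetical; the point is that the application codes against the contract (inputs and outputs) rather than against any particular implementation, so the implementation can later be replaced without touching its callers.

```javascript
// Hypothetical service contract: checkCredit(customerId) returns
// { approved: boolean, limit: number }. Only this contract is shared
// with callers; the business logic behind it stays hidden.

// One implementation of the contract (it could later be swapped for a
// Web Service or other remote endpoint without changing any caller).
function localCreditService(customerId) {
  // Invented business rule, purely for illustration.
  return { approved: customerId !== "blocked", limit: 5000 };
}

// An application that depends only on the contract, not the implementation.
function placeOrder(checkCredit, customerId, amount) {
  var credit = checkCredit(customerId);
  return credit.approved && amount <= credit.limit;
}

console.log(placeOrder(localCreditService, "acme", 1200));  // true
console.log(placeOrder(localCreditService, "blocked", 10)); // false
```

Because only the contract is shared, the same placeOrder code works unchanged whether the credit check runs locally or as a remote service, which is the reuse SOA promises.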

“Service-Oriented Architecture (SOA) represents a collection of best practices, principles, and patterns related to service-aware, enterprise-level, distributed computing.” – OASIS Web site

SOA shines brightest in distributed applications, and also when you are integrating among multiple organizations. It works best when the focus is on the architecture rather than the implementation. Although SOA is often confused with Web services, they are not the same: Web services are one type of implementation that can be used within an SOA-based solution, but they are not required.

Over the past year, SOA has been solidifying in regard to how it should be applied and used. Additionally, standards groups have taken the initiative to nail down some of the details of what “SOA” means. OASIS (Organization for the Advancement of Structured Information Standards) has a number of committees working to define a reference model for SOA. These technical committees work on individual pieces of SOA, with an overall focus on workflows, transaction coordination, orchestration, collaboration, loose coupling, business process modeling, and other concepts that support agile computing. You can find out more about the different OASIS committees on the OASIS Web site.

The importance of SOA is reflected in the fact that every major company within the software industry is talking about it and publishing information regarding it. It is no surprise that SOA was the runner-up in the Technology of the Year 2006 award.


Virtualization

Virtualization was also a finalist for the Technology of the Year. Like the other technologies in this category, virtualization is not new; rather, it is simply gaining notice because of what it offers and because computer hardware is now more than capable of supporting it effectively.

Virtualization is also a term that can be defined in numerous ways. Ultimately, however, it refers to the simulation or emulation of processes or systems, and it is used in a number of ways. For the average developer, virtualization can be used to run several different operating systems on a single machine.

For a network or system administrator, virtualization can be used to create backups or fail-safe versions of running systems that can be launched on backup hardware to keep systems running. Intel makes the following comment regarding the value of virtualization [1]:

“Virtualization reduces server proliferation, simplifies server management, and significantly improves server utilization, network agility, and network reliability. It does this by consolidating multiple applications onto fewer enterprise-level servers.”

During his presentation on the release of Microsoft Windows Server 2003 R2, Microsoft’s Bob Muglia stated, “Virtualization enables the consolidation of systems, thus reducing costs and getting more management control over your systems.” He went on to state that “We [Microsoft] see virtualization becoming ubiquitous as a part of your infrastructure over the next three to five years.”

One of the clearest definitions of virtualization comes from VMware, a company that makes a leading virtualization product. It states that “Virtualization is an abstraction layer that decouples the physical hardware from the operating system to deliver greater IT resource utilization and flexibility.” [2]

There are numerous reasons for the growing popularity of virtualization. For example, legacy systems may not run on current hardware. By setting up a virtual environment on the hardware, the applications can be “tricked” into working properly.

Additionally, today’s hardware keeps getting faster, and in many cases a machine’s resources are not being utilized fully. Virtualization can help change this in several ways. One way is to create virtual systems on the machine, each running separate processes. Alternatively, the unused resources of a machine can be linked into a grid system so that its “unused cycles” are given to a distributed grid application. Together, the machines in the grid virtualize a single application that is actually running across numerous machines.

There are two uses of virtualization that are helping the average developer. The first is the ability to create a virtual environment on their system to install new or beta software. This virtual environment allows them to test the software in an isolated sandbox, yet still on their primary machine. If the software ends up being unstable, the entire virtual machine can simply be removed without having an impact on the regular operations of the system. With the amount of beta software and pre-released software in the industry, this has helped numerous developers avoid the need for a second machine or the need to risk affecting their primary system’s operations.

The second use of virtualization that is helping the average developer is the ability to create virtual environments that simulate other systems and system setups. No longer does a developer need to have multiple systems to run different operating systems or configurations to test their applications. Rather, they can create virtual versions of the different configurations that they want to test, and they can do it all on a single (or at least fewer) system(s).

These uses of virtualization by developers are among the many things that have led Microsoft to adjust its licensing. In many cases, this means changing from the concept of “installing” a product on a machine to “running” on a machine. In other words, rather than charging you for each copy of a product installed on a machine, whether in a virtual machine or on the primary operating system, Microsoft is charging only for the copies running at a given time. For products such as Windows Server 2003 R2, it goes as far as to state that the licensing includes four virtual instances.

If a technology can cause a company to change their licensing agreement, it is not surprising that it is having an impact on the industry.

In summary, virtualization at its simplest is being used to make multiple systems look and operate like a single system, as well as to make a single system look and operate like multiple systems.

In Conclusion

AJAX deserves congratulations on being named the Technology of the Year 2006. SOA, RSS, and virtualization also deserve recognition as the finalists in this category. If these technologies have not had an impact on what you are doing today, there is a very good chance they will in the near future.

In the meantime, it will be interesting to see which technologies move into the spotlight over the next twelve months. Podcasting and SaaS (Software as a Service) are among the topics getting attention today. Only time will tell whether they will overshadow this year’s list of technologies.

The entire list of Product of the Year 2006 winners is available on the site.

# # #


  1. Intel, “Virtualization Improves IT Efficiency, Reliability”
  2. VMware, “What Is Virtualization?”
