Capital Budgeting: IT Project Portfolio Optimization Redux

This is the second in a series of articles that examines decision-making methods for the selection or rejection of individual projects throughout the project portfolio management process. These methods determine whether or not a given project (either proposed or in process) should be included in your next capital budget.

Capital budgeting is a multi-faceted activity that includes:

  • The formulation and articulation of long-term goals
  • Searching for new and profitable uses for investment funds
  • The proactive preparation of technical, marketing (both external and internal), and financial forecasts and estimates
  • The preparation of appropriation and control budgets and their integration into the firm’s information system
  • The economic evaluation of alternative projects
  • The post-audit of the performance of past projects

Although all of these capital budgeting activities are important, this series narrowly examines only the following, albeit very powerful, methodologies for selecting or rejecting projects:

  • Optimization using deterministic input variables (Previous article)
  • Optimization using stochastic input variables (This article)
  • Decision-tree analysis, with and without real options (Next article)
  • Other (See References 12, 13, and 14)

These articles will focus on the sorting of projects based on their risk-adjusted returns calculated as their net present value (NPV), which is based on cash. However, other bases such as economic value added (EVA), which is based on profit, are commonly employed. For a discussion of the rudiments of NPV, see Appendix A of the previous article—Reference 15 below.
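Because NPV is the sorting criterion throughout this series, a minimal sketch of the calculation may help. The function name and the cash-flow figures below are my own illustration, not taken from the referenced appendix:

```python
# Minimal NPV sketch. cash_flows[0] occurs at time 0 (typically the
# negative initial outlay); later flows are discounted at a constant rate.
def npv(rate, cash_flows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# A project costing 100 that returns 60 at the end of each of two years,
# discounted at 10%:
project = [-100, 60, 60]
print(round(npv(0.10, project), 2))  # → 4.13
```

A positive NPV means the project returns more, in present-value terms, than it absorbs; at a zero discount rate the same project would simply net 20.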

My rationale for preferring NPV over EVA: The refrain “Cash is a fact, but profit is an opinion.” reflects the thought that the profit and loss statement (also known as the income statement) is subject to much interpretation using the GAAP (Generally Accepted Accounting Principles—a widely accepted set of rules, conventions, standards, and procedures for reporting financial information, as established by the Financial Accounting Standards Board), whereas cash is tangible and well understood without ambiguity.

Furthermore, these articles put financial performance at the top of the value chain. (Although your company may insist on profitability (financial viability), a regulator may insist on its social or economic viability.) Insofar as projects return more in financial resources than they absorb, they will be selected for inclusion in the next budget. But, as discussed in the previous article, when the benefits of diversification are also considered, this financial-performance rule for selection may be waived.

Why Organizations Undertake [Different Kinds of] Projects

First, take a look at the fundamental question of why organizations undertake projects and, then, the different kinds of projects that they might want to take on.

When investors buy securities, they expect to earn, in the long run, rates of return that will compensate them for time and risk: not more and not less. This is not the case, however, with investments in real assets such as new computer hardware or software, where each investment is unique in terms of such factors as timing, the way it fits into the overall operation of the firm, the firm’s operating expertise, and the marketing network available. But, above all, unlike investments in the security market (where a known set of securities is available to all), investments in real assets are the result of conditions or goals that are unique to a particular firm. Thus, a firm can develop investment proposals whose costs are below—even substantially below—the present value of their expected future cash flows. Such investments give investors expectations of compensation that exceeds the market’s required rate of return. Consequently, they increase demand for the firm’s stock and bid up the price. Clearly, the search for, selection, and adoption of such investments is an important objective of the firm because they help to increase the value of the stock.

Classification by type of cash flow

Projects’ forecasted cash flows are examined and classified using criteria such as whether they are short- or long-lived and whether or not they are expected to change sign more than once. Cash flow estimation is the most critical, and most difficult, part of the capital budgeting process. Cash flow components sometimes must be forecasted many years into the future, and estimation errors are bound to occur. However, large firms evaluate and accept many projects every year, and as long as the cash flow estimates are unbiased and the errors are random, the estimation errors will tend to cancel each other out. That is, some projects will have NPV estimates that are too high and some will have estimates that are too low, but the average realized NPV on all projects accepted should be relatively close to the aggregate NPV estimate. Later, I’ll comment on the fact that these forecasts are often biased and offer some suggestions on how to deal with this problem.
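The cancellation argument above can be checked with a quick simulation. The project count and error distribution below are hypothetical, chosen only to make the point visible:

```python
import random

# If estimation errors are unbiased and random, the portfolio's total
# estimated NPV stays close to its total true NPV even though individual
# estimates can be badly off. All numbers here are hypothetical.
random.seed(5)
true_npvs = [random.uniform(1, 20) for _ in range(200)]          # 200 projects
estimates = [t + random.normalvariate(0, 5) for t in true_npvs]  # noisy forecasts

gap_each = max(abs(e - t) for e, t in zip(estimates, true_npvs))
gap_total = abs(sum(estimates) - sum(true_npvs)) / len(true_npvs)
print(f"worst single-project error {gap_each:.1f}, average error {gap_total:.2f}")
```

The worst single-project error is large, but the per-project average error across the whole portfolio is small; this is exactly the cancellation that fails when forecasts are biased rather than merely noisy.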

Classification by size

A company might classify projects by size, as follows:

  • Routine proposals involving small investments
  • Capital projects that require no more than a certain amount of cash outlay
  • Major proposals

The size of the proposed project usually determines not only the depth of analysis to which the project is subjected but also the level within the company at which the project is approved or disapproved. For example, the company might establish the following approval authority:

  • Plant manager: Investments up to $10,000
  • Division manager: Investments up to $50,000
  • Vice President: Investments up to $150,000
  • Executive Vice President: Investments up to $500,000
  • Executive Committee: Investments over $500,000

It is unlikely that the plant manager would use sophisticated capital budgeting techniques when making investments of under $10,000; but such techniques are warranted when large investment proposals are analyzed.

Classification by purpose

When an investment proposal is advanced, the individual or group advancing the proposal usually submits an appropriation request that provides relevant information about the proposal. In addition to the dollar amount, the proposal usually specifies the function or purpose of the investment. These might be categorized in the following way:

  • Upgrading an existing software application
  • Developing a new software application
  • Expanding existing products in existing markets
  • Expanding existing products into new markets
  • Others

Whereas the benefits and costs of routine upgrading of existing software applications are only roughly estimated, very careful analysis is required when it comes to such projects as expansion to new markets, developing a new software application, and the like. In short, the larger the proposed investment and the less routine its nature, the more formal and rigorous the analysis should be.

Relationship between investment projects

The classification of proposals by size and purpose helps determine who in the firm will make the decision to accept or reject the proposal and what resources will be allocated to the analysis. But, there is one more classification important to the evaluation of the project’s profitability. (Profitability can come from increased revenues from external product sales or decreased operations costs from internal use … say, of newly developed software.) You can distinguish among

  • Independent projects
  • Dependent projects
  • Mutually exclusive projects

This classification is important when several investment proposals are being evaluated simultaneously.

Independent, dependent, and mutually exclusive projects

Two projects are said to be economically independent if the acceptance or rejection of one has no effect on the cash flows of the other. If the cash flows from an investment are affected by the decision to accept or reject another investment, the first investment is said to be economically dependent on the second. And, two investments are said to be mutually exclusive if the potential benefits derived from one investment will completely disappear if the other investment is accepted. The acceptance of one of two (or more) mutually exclusive projects results in the rejection of all of the others.

Two projects could be mutually exclusive because of limited capital or manpower or both: the two resources simulated in the optimization study discussed below.

Sometimes, the amount of cash available for investment by the firm in any one period of time is limited by management; in other words, there’s capital rationing. In those cases, alternative investments compete for funds.

Other classifications

Clearly, many other classifications are possible. Some firms assign a priority rating to alternative proposals, classifying projects as urgent, required, desirable, and so on. Others classify investment alternatives by the location of the projects within the firm, division, or even department. The various schemes are not mutually exclusive; a firm can, and many do, use all of the above-mentioned classifications at one stage or another of its budgeting process.

I’ve taken the time to discuss these rather mundane matters before advancing to a state-of-the-art, computer-assisted capital budgeting study to emphasize that the high-tech solution that follows does not take place in a vacuum!

Simulation and Optimization

Computer simulation is a tool that can be used to incorporate uncertainty (risk) explicitly into spreadsheet models. A simulation model is the same as a regular spreadsheet model except that some cells include random quantities. Each time the spreadsheet recalculates, new values of the random quantities occur, and these typically lead to different bottom-line results. By forcing the spreadsheet to recalculate many times, a business manager is able to discover the results most likely to occur, those that are least likely to occur, and best-case and worst-case results. This is just as if he or she ran hundreds or thousands of what-if analyses on the spreadsheet, all in one sitting.
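A tiny sketch of this idea, with one uncertain “cell” and dependent formula cells, may make it concrete. All of the figures below are hypothetical:

```python
import random

# A simulation model is a regular spreadsheet model in which some cells
# contain random quantities. Recalculating many times yields a distribution
# of bottom-line results rather than a single number.
random.seed(42)

def recalc():
    unit_sales = random.normalvariate(1000, 150)  # uncertain input "cell"
    revenue = unit_sales * 25.0                   # dependent formula cells
    costs = 18000 + unit_sales * 4.0
    return revenue - costs                        # bottom-line result

results = [recalc() for _ in range(10_000)]       # thousands of what-if runs
results.sort()
mean = sum(results) / len(results)
worst, best = results[0], results[-1]
print(f"mean {mean:,.0f}  worst {worst:,.0f}  best {best:,.0f}")
```

Sorting the results also lets you read off percentiles directly, which is how best-case and worst-case outcomes are usually reported.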

Computer optimization uses simulation, sometimes called Monte Carlo simulation, to do a risk analysis on each possible solution generated during an optimization.

In the previous article, an optimization was conducted on an economic model that didn’t incorporate uncertainty (risk) in the definition of the input variables (mean and standard deviation in that case). In this article, uncertainty (risk) will be included in the definition of the input variables (manpower and capital usage).

It’s important to note that simulation and optimization are sometimes applied only when large projects are being evaluated and after considerable effort has been made to estimate future cash-flows.

Capital budget (portfolio) optimization enables the decision maker to set goals and constraints for the project portfolio at the same time. Optimization can then use a variety of what-if scenarios to focus on what the portfolio delivers rather than on whether individual projects are good or bad. By asking managers to describe the outputs they want from a portfolio, the debate centers on trade-offs among strategic priorities rather than on individual projects. This avoids many of the conflicts with sponsors and project managers inherent in other approaches and results in decisions that build confidence among company stakeholders.

Figure 1 illustrates a project selection scenario that’s subject to resource limitation and uncertain usage. This screen shot shows the Palisade RISKOptimizer optimization/simulation tool (a Microsoft Excel spreadsheet add-in) maximizing the NPV contributed by the selected projects. Ten projects (A, B, C, D, E, F, G, H, I, and J) are under consideration. Each project uses an uncertain amount of manpower (in thousands of hours) and capital (in millions of dollars). The simulation/optimization maximizes the mean NPV subject to the constraint that there is at most a 5% chance of using more capital or labor than is available. 70,000 man-hours and $85 million are available. Adjustable cells (C3 to C12) will be “1” for those projects selected or “0” for those not selected by the optimization process.

The optimization selected Projects B, C, D, and I, which earn an NPV of $107 million. There is no chance of a manpower shortage, but a 4% chance of a capital shortage. An authoritative discussion of this and other optimizations under uncertainty can be found in Reference 1. In addition, an overview of how the optimization engine is implemented is available in Appendix B below.

Figure 1: Capital budgeting with resource limitation and uncertain usage.
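The logic of the Figure 1 model can be sketched in a few lines. Because there are only ten yes/no decisions (1,024 combinations), this sketch searches them exhaustively rather than with a genetic algorithm; the NPVs and usage distributions below are hypothetical, not the article’s data:

```python
import itertools
import random

# Choose a 0/1 vector over ten projects to maximize total NPV subject to
# chance constraints: at most a 5% chance of using more than 70k man-hours
# or more than $85M of capital. All figures are hypothetical.
random.seed(1)

npvs = [12, 24, 30, 28, 10, 15, 9, 14, 25, 11]            # $M per project
hours = [(9, 2), (8, 1.5), (10, 2), (7, 1), (6, 1.5),     # (mean, sd) k hours
         (8, 2), (5, 1), (7, 1.5), (9, 2), (6, 1)]
capital = [(20, 4), (18, 3), (22, 4), (15, 2), (12, 3),   # (mean, sd) $M
           (16, 3), (10, 2), (14, 3), (19, 4), (13, 2)]

def shortage_prob(picks, n_sims=200):
    """Estimate, by simulation, the chance each resource limit is exceeded."""
    over_h = over_c = 0
    for _ in range(n_sims):
        h = sum(random.gauss(*hours[i]) for i in picks)
        c = sum(random.gauss(*capital[i]) for i in picks)
        over_h += h > 70
        over_c += c > 85
    return over_h / n_sims, over_c / n_sims

best = None
for bits in itertools.product([0, 1], repeat=10):          # only 1,024 candidates
    picks = [i for i, b in enumerate(bits) if b]
    p_h, p_c = shortage_prob(picks)
    if p_h <= 0.05 and p_c <= 0.05:                         # chance constraints
        value = sum(npvs[i] for i in picks)
        if best is None or value > best[0]:
            best = (value, picks)

print(best)
```

With more projects the search space explodes (2^n combinations), which is why tools such as RISKOptimizer use a directed search instead of enumeration.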

One shortcoming in the model shown in Figure 1 is the tacit assumption that each unit of manpower is fungible: That is, in one hour, day, or week, Nate, your most-junior programmer, will be as productive as Jane, your most-senior programmer. But, that’s generally not the case. So, to introduce these differences into your model, you could create a standard person and assign a standard-person coefficient to each programmer. Thus, Nate’s might be 0.9 and Jane’s might be 2.1. Then, the values for manpower in the spreadsheet model could be entered in standard-person units.
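The standard-person adjustment is simple arithmetic; a sketch, using the illustrative coefficients above:

```python
# Convert raw programmer hours into standard-person units using the
# productivity coefficients from the text (illustrative values, not
# measured productivity data).
coefficients = {"Nate": 0.9, "Jane": 2.1}   # productivity vs. a standard person

def standard_person_hours(raw_hours):
    """Sum each programmer's raw hours weighted by their coefficient."""
    return sum(coefficients[name] * h for name, h in raw_hours.items())

# 100 raw hours from each programmer:
print(standard_person_hours({"Nate": 100, "Jane": 100}))  # 90 + 210 = 300.0
```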

Alternatively, pay could be used to differentiate one programmer from another. Where this unit of measure reflects a constant, corporate-wide cost per unit of productivity, pay needs no adjustment before it is entered into the spreadsheet. Here, $1000 would presumably buy the project more than twice as many hours of Nate’s time as Jane’s. There are, of course, other refinements that could be made to this model. But, it’s here simply to introduce the process of optimization of an uncertain system. By the time you’ve read this entire series of articles on Capital Budgeting, you should be able to identify some of the other assumptions made above.

Probability and Statistics

In general, risk or uncertainty refers to the probability that some unfavorable event will occur. A probability distribution is defined as a set of possible outcomes, with a probability of occurrence attached to each outcome. The values for resource (Manpower and Capital) usage in Figure 1 appear to be constants. But, under the hood, they are probability distributions such as those shown in Figure 2.

Figure 2: Pop-up Define Probability Distribution windows

Figure 3: Specify correlations between your inputs via correlation matrices.

During a simulation, it is important to account for any correlation between input variables. Correlation occurs when the sampling of two or more input distributions is related. But, the random numbers generated by the pop-up windows (probability distributions) in Figure 2 are probabilistically independent. This means, for example, that if a random value in one cell is much larger than its mean, the random values in other cells are completely unaffected. They are no more likely to be abnormally large or small than if the first value had been average or less than average. Sometimes, however, this independence is unrealistic, and you want the random numbers to be correlated in some way. If they are positively correlated, large numbers tend to go with large numbers, and small with small. If they are negatively correlated, large tend to go with small and small with large.

The dialog window shown in Figure 3 allows you to specify correlations between your inputs through correlation matrices. This limited illustration specifies a positive correlation of .8 between capital usage and manpower usage for Project A. This might be the case for a project in which the cost of human resources consumes a very large part of the investment.
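Correlated sampling can be sketched with the standard two-variable trick: for standard normals z₁ and z₂, the pair z₁ and ρ·z₁ + √(1−ρ²)·z₂ has correlation ρ. The means and standard deviations below are hypothetical, chosen to echo the Project A illustration:

```python
import math
import random

# Sample two positively correlated inputs (rho = 0.8), as the correlation
# matrix in Figure 3 does, then rescale each to its own (hypothetical)
# mean and standard deviation.
random.seed(7)
rho = 0.8

def correlated_pair():
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    z2 = rho * z1 + math.sqrt(1 - rho ** 2) * z2   # induce the correlation
    capital = 20 + 4 * z1      # capital usage: mean 20, sd 4  ($M)
    manpower = 9 + 2 * z2      # manpower usage: mean 9, sd 2  (k hours)
    return capital, manpower

samples = [correlated_pair() for _ in range(20_000)]
xs, ys = zip(*samples)
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
cov = sum((x - mx) * (y - my) for x, y in samples) / len(samples)
sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / len(xs))
sy = math.sqrt(sum((y - my) ** 2 for y in ys) / len(ys))
print(f"sample correlation = {cov / (sx * sy):.2f}")   # close to 0.80
```

For more than two inputs, the same idea generalizes via a Cholesky factorization of the full correlation matrix, which is effectively what tools like the one in Figure 3 do under the hood.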

Figure 4: Specifying the goals, variables, and limits used by the optimizer.

Figure 4 shows the main dialog box by which you link the RISKOptimizer optimizer to the Excel spreadsheet. The dropdown list in Figure 4 is shown open in Figure 5.

Figure 5: Selecting the statistic to maximize or minimize

To select the statistic for the target cell which you wish to minimize, maximize, or set to a specific value, simply select the desired statistic from the displayed dropdown list. If you want to select a Percentile or Target for the target cell’s distribution, simply select Percentile(x) or Target(x), where “x” is an actual value.

Politics, Biases, and Other Vulnerabilities

For internal political reasons, management on occasion will be prepared to rearrange project priorities. In a sense, the capital budgeting decision represents a resolution of conflict (hopefully not too unfriendly) among colleagues in the same organization. In reality, the management of each division or department is constantly trying to outmaneuver its internal competition within the firm to win a bigger slice of the available funds. In such a situation, political and prestige factors sometimes may overrule rational decision-making, but in the longer run the race is not always to the swift, and the ability to present cogent arguments based on careful and accurate forecasting of the relevant cash flows is crucial.

At the same time that cash flow estimation is the most critical part of the capital budgeting process, it’s also the most difficult. Unfortunately, several studies indicate that capital budgeting cash flow forecasts are not unbiased—rather, managers tend to be overly optimistic in their forecasts, and as a result, revenues tend to be overstated and costs tend to be understated. The end result is an upward bias in net operating cash flows and thus an upward bias in estimated NPVs.

Forecasting errors result from our personal biases: For example, supporting evidence bias is our tendency to want to confirm what we already suspect and look for facts that support it. We avoid asking tough questions and discount new information that might challenge our preconception. Suppose, for example, you are considering an investment to automate some internal business function. Your first inclination is to call an acquaintance who has been boasting about the good results his or her organization obtained from doing the same. What response do you expect other than, “It’s the right choice”?

Despite an inclination to look for supporting evidence, it is usually much more informative to seek out contradictory evidence. Confirming evidence often fails to discriminate among possibilities well. To illustrate, in one study students were given the initial sequence of numbers 2, 4, 6 and told to determine the rule that generated the numbers. To check hypotheses, they could choose a possible next number and ask whether that number was consistent with the rule. Most students asked whether a next number “8” would be consistent with the rule. When told it was, they expressed confidence that the rule was, “The numbers increase by 2.” Actually, the rule was, “Any increasing sequence.” A better test would have been to check whether a next number incompatible with their hypothesis (for example, “7”) was consistent with the unknown rule.

Sometimes, poor forecasting occurs because divisional (or departmental) managers are rewarded on the basis of the size of their divisions (or departments); hence, they are motivated to maximize the number of projects accepted rather than the profitability of the projects. Even when this is not the case, managers often become emotionally attached to their projects and are therefore unable to objectively assess a project’s potential negative factors.

A first step in recognizing poor cash flow estimation, especially for projects that are estimated to be highly profitable, is to ask this question: What is the underlying cause of this project’s profitability? If the firm has some inherent advantage, such as patent protection, unique expertise, or even a well-regarded brand name, projects that utilize such an advantage may truly be extraordinarily profitable. However, in the longer run, profitability will probably be eroded by competition until the returns on projects within an industry are close to the normal return, which is the cost of capital.

If there is reason to believe that this situation exists, and if division (department) managers cannot identify any unique factors that could support a project’s estimated high profitability, senior management should be concerned about the possibility of estimation bias. Recognizing that biases may exist, senior managers at many firms develop data on the forecast accuracies of their divisional (departmental) managers, and then consider this information in the capital budgeting decision process. Some companies lower the cash flow estimates of managers whose track records suggest that their forecasts are too rosy, while other companies increase the cost of capital, or hurdle rate, applied to such project submissions. I’ll have more to say on the cost of capital in the next article of this series.

Conclusion

The computational side of complex project portfolio selection is greatly aided by the availability of relatively low-cost, high-speed computers on which you can model the complex equations of modern economic theory. However, this high technology is only one link in a chain of activities, all of which must be done right before you can deliver a beneficial capital budget. So, before you do anything else, you need to address the issues outlined in the “Politics, Biases, and Other Vulnerabilities” section. Otherwise, the good names of Mathematics, Economics, and the like may be besmirched by the bad practice of more “Garbage in, Garbage out.”

Appendix A: Other Methods

A concise treatment of the following, commonly-used, cash- and profit-basis methods for project and portfolio selection in the capital budgeting process can be found in Reference 12.

  • Cash flow
  • Discounted cash flow
  • Discount rate
  • Present value
  • Net present value
  • Profitability index

Some of the methods described in this reference (payback, for example) are trivial and oversimplify the real world. But their use may be appropriate for small projects because such methods are not expensive to carry out. They’re often applied to individual projects, not portfolios.

Appendix B: Everything You Wanted to Know About How RISKOptimizer Works Under the Hood, but Were Afraid to Ask

Most real-world problems have several solutions, and some may have an infinite number of solutions. The purpose of optimization is to find the best possible solution among the many potential solutions for a given problem in terms of some effectiveness or performance criterion. A problem that admits of only one solution does not have to be optimized. Optimization can be accomplished by many strategies ranging from quite sophisticated analytical and numerical mathematical procedures to the intelligent application of simple arithmetic. Nowadays, the former can be used by the neophyte thanks to the powerful computational engines that run “under the hood” in commercial-off-the-shelf tools such as RISKOptimizer.

Regardless of which optimization method is used, the model should reasonably represent the significant features of the real process if the extremum sought is to be meaningful. Even when this qualification is met, typical difficulties encountered with the model are:

  • The criterion to be optimized may be insensitive to changes in the independent (decision) variables, so that no clear-cut extremum can be located.
  • The criterion to be optimized or one or more of the constraints may become unbounded in the range of the search for the extremum. The way to overcome this difficulty is to suitably restrict the range of the independent variables by adding constraints to the problem or to reformulate the mathematical model.
  • There may be poor scaling among the variables. Scaling difficulties can occur, for instance, when one of the terms in the criterion is of a much different order of magnitude than another in view of the significant figures in each term. Then, the criterion is insensitive to changes in the values of the variables in the small term. It is not easy to rescale in a mathematical model.
  • There may be interaction among the variables in a poorly designed mathematical model. Parameter interaction can be illustrated by examining an extremely simple criterion in which two parameters are multiplied by each other:

        y = 2·x₁·x₂ + 10

    The individual values of x₁ and x₂ can range over any series of values for a given value of the product x₁x₂. Scaling is more difficult if interaction exists.

Figure 6: Hill-climbing optimization technique.

RISKOptimizer and Evolver (the Excel add-in discussed in the previous article and the source of the genetic algorithm-based optimizer used in RISKOptimizer) do not use hill climbing as shown in Figure 6. Rather, they use a stochastic, directed search technique, dubbed a genetic algorithm, as shown in Figure 7. This lets RISKOptimizer jump around in the solution space of a problem, examining many combinations of input values without getting stuck in local optima. In addition, RISKOptimizer lets good scenarios communicate with each other to gain valuable information as to what the overall solution landscape looks like, and then uses that information to better guess which scenarios are likely to be successful.

Figure 7: RISKOptimizer generates many possible scenarios, and then refines the search based on the feedback it receives.
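A toy genetic algorithm conveys the flavor of this directed, population-based search. This is a pedagogical sketch, not RISKOptimizer’s actual engine, and the knapsack-style values, weights, and cap are hypothetical:

```python
import random

# Evolve 0/1 "project picks": fitness is total value of picked items,
# heavily penalized when total weight exceeds the cap (a hard constraint).
random.seed(3)
values  = [12, 24, 30, 28, 10, 15, 9, 14, 25, 11]
weights = [20, 18, 22, 15, 12, 16, 10, 14, 19, 13]
CAP = 85

def fitness(bits):
    v = sum(b * x for b, x in zip(bits, values))
    w = sum(b * x for b, x in zip(bits, weights))
    return v - 1000 * max(0, w - CAP)          # infeasible picks score badly

def crossover(a, b):                            # one-point crossover
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(bits, p=0.05):                       # flip each bit with prob p
    return [1 - b if random.random() < p else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(10)] for _ in range(30)]
for _ in range(100):                            # generations
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                          # keep the fittest ("elitism")
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(20)]
    pop = parents + children

best = max(pop, key=fitness)
print(best, fitness(best))
```

Crossover lets good partial solutions “communicate,” while mutation keeps the search from getting stuck in local optima, which is precisely the contrast with hill climbing drawn in Figure 6.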

Traditional Excel-based optimization problems analyzed using Evolver consist of:

  • An output or “target” cell that you want to minimize or maximize
  • A set of input or “adjustable cells” whose values you control
  • A set of constraints that need to be met, often specified using expressions such as COSTS<100 or A11>=0

During an Evolver optimization, the adjustable cells are changed across allowable ranges you specify. For each possible set of adjustable cell values, the model is recalculated, and a new value for the target cell is generated. When the optimization is complete, an optimal solution (or combination of adjustable cell values) is found. This solution is the combination of adjustable cell values which yields the best (in other words, minimum or maximum) value for the target cell while satisfying the constraints you’ve entered.

When a model has uncertain elements, however, Evolver cannot generate optimal solutions. In the past, many optimization models just ignored uncertainty, making models unrealistic but optimizable.

With RISKOptimizer, the uncertainty present in a model may be included and reliable optimal solutions that take that uncertainty into account can be generated. RISKOptimizer uses simulation (from @RISK, Palisade Corporation’s risk analysis and simulation add-in for Excel) to deal with the uncertainty present in the model and uses genetic algorithms (from Evolver) to generate possible values for the adjustable cells. The result of simulation optimization is the combination of values for the adjustable cells that minimizes or maximizes a statistic for the simulation results for the target cell. You may, for example, want to find the combination of adjustable cell values that maximizes the mean of the target cell’s probability distribution, or minimizes the standard deviation.

For modeling uncertainty, RISKOptimizer allows you to describe the possible values for any spreadsheet element using any of the probability distribution functions available in @RISK—two of which are shown in Figure 2. A value of 10, for example, in a spreadsheet cell could be replaced with the @RISK function =RiskNormal(10,2). This would specify that the possible values for the cell are described by a probability distribution with a mean of 10 and a standard deviation of 2. As in @RISK, probability distributions can be correlated by using @RISK functions such as RiskCorrmat and DepC or the dialog shown in Figure 3.

When optimizing, RISKOptimizer runs a full simulation for each possible trial solution that is generated by its optimizer. In each iteration of a trial solution’s simulation, probability distribution functions in the spreadsheet are sampled and a new value for the target cell is generated. At the end of a simulation, the result for the trial solution is the statistic for the distribution of the target cell which you want to minimize or maximize. This value then is returned to the optimizer and used by the genetic algorithms to generate new and better trial solutions. For each new trial solution, another simulation is run and another value for the target statistic is generated. See Figure 8.

As in traditional optimizers, constraints that need to be met can be entered in RISKOptimizer. Constraints can be checked either at each iteration of a simulation (an iteration constraint) or at the end of each simulation (a simulation constraint). Iteration constraints are typically traditional Evolver-style constraints, such as A11>1000. Simulation constraints are constraints that reference a statistic on the distribution of simulation results for any cell in your model you specify. A typical simulation constraint could be “Mean of A11>1000”; that is, the mean of the distribution of simulation results for cell A11 must be greater than 1000. As in Evolver, constraints can be hard or soft, and a violated hard constraint causes a trial solution to be rejected.

Because large numbers of simulations are run by RISKOptimizer, two important techniques are used to minimize runtimes and generate optimal solutions as quickly as possible. First, RISKOptimizer uses convergence monitoring to determine when a sufficient number of iterations have been run (but not too many). This ensures that the resulting statistic from the target cell’s probability distribution is stable, and that any statistics from output distributions referenced in constraints are stable. Second, RISKOptimizer uses Evolver’s genetic operators to generate trial solutions that move toward an optimal solution as quickly as possible. RISKOptimizer comes with a set of simulation statistics functions that can be used to return simulation results directly to your spreadsheet. The function RiskMean(cell reference), for example, returns the mean of the simulated distribution for the entered cell directly to a worksheet cell or formula. In addition, any model built in RISKOptimizer can be directly simulated in @RISK when you want detailed graphics and statistics on the best model solution found by RISKOptimizer. Because RISKOptimizer’s simulation is based on @RISK, no changes to a RISKOptimizer model are required to simulate it in @RISK!

Figure 8: Overview of simulation/optimization with uncertainty
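The convergence-monitoring idea can be sketched simply: sample in batches and stop once the running mean of the target statistic stops moving by more than a tolerance. The model, batch size, and threshold below are hypothetical; RISKOptimizer’s actual stopping test may differ:

```python
import random

# Stop a simulation early once its target statistic is stable, rather than
# always running a fixed (possibly excessive) number of iterations.
random.seed(11)

def sample():
    return random.normalvariate(100, 15)        # uncertain target cell

def simulate_until_stable(tol=0.05, batch=500, max_iters=50_000):
    total, n, prev_mean = 0.0, 0, None
    while n < max_iters:
        for _ in range(batch):
            total += sample()
        n += batch
        mean = total / n
        if prev_mean is not None and abs(mean - prev_mean) < tol:
            return mean, n                      # mean has stabilized: stop
        prev_mean = mean
    return total / n, n

mean, iters = simulate_until_stable()
print(f"converged mean = {mean:.1f} after {iters} iterations")
```

Because a full simulation runs inside every trial solution of the optimizer, shaving unneeded iterations here multiplies into a large overall speedup.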

Appendix C: Software Developer’s Kit

The RISKOptimizer Developer’s Kit (RODK) is Palisade’s optimization and simulation programming toolkit. The RODK allows you to build genetic algorithm optimization models that also incorporate Monte Carlo simulation using Windows and .NET programming languages.

This software developer’s kit offers access to all of the optimization and simulation technology available in RISKOptimizer.

Because “A picture is worth a thousand words,” take a look at the user interface shown in Figure 9 of a running RODK-built application. The objective of this desktop app is to maximize the mean return of a portfolio, while keeping the standard deviation of the portfolio below 0.1.

Figure 9: Desktop optimization app (.NET framework, C# language) developed using RODK.

For many users, the spreadsheet is the preferred modeling environment. However, many times an application written in a standard programming language needs optimization capabilities, and will have uncertain factors for which simulation can account. The application will have its own user interface, variables, and model calculations. In addition, a developer may want to allow users to access the application over a corporate network or over the Internet.

The RODK allows custom applications such as these to run [genetic algorithm] optimizations to maximize or minimize a particular output, while simulating the different possible scenarios of each trial solution. Applications written with the RODK often will run faster and can contain larger numbers of adjustable cells and distribution functions when compared with models written in spreadsheet versions of RISKOptimizer. This is because RODK applications are compiled and do not contain additional routines executed during spreadsheet recalculation.

References

  1. Winston, W. Decision Making Under Uncertainty with RISKOptimizer, Palisade (2000)
  2. Levine, H. Project Portfolio Management, Wiley (2005)
  3. Bonham, S., IT Project Portfolio Management, Artech House (2005)
  4. Gido, J., Clements, J. Successful Project Management 3rd Ed., Thomson (2006)
  5. Ross, S. et al. Corporate Finance, McGraw-Hill Irwin (2005)
  6. Shapiro, A. Capital Budgeting and Investment Analysis, Prentice Hall (2005)
  7. Knight, G. Analyzing Business Data with Excel, O’Reilly (2006)
  8. Carlberg, C. Business Analysis with Microsoft Excel 2nd Ed., Que (2002)
  9. Moskowitz, K., Kern, H. Managing IT as an Investment, Prentice Hall (2003)
  10. Snedaker, S. How to Cheat at IT Project Management, Syngress (2005)
  11. Jayaswal, S., Shetty, Y. Practical .NET for Financial Markets, Apress (2006)
  12. http://cbdd.wsu.edu/kewlcontent/cdoutput/TR505r/page15.htm
  13. http://cbdd.wsu.edu/kewlcontent/cdoutput/TR505r/page16.htm
  14. http://cbdd.wsu.edu/kewlcontent/cdoutput/TR505r/page17.htm
  15. http://www.developer.com/mgmt/article.php/3595036

About the Author

Marcia Gulesian has served as Software Developer, Project Manager, CTO, and CIO over an eighteen-year career. She is the author of well over 100 feature articles on Information Technology, its economics, and its management. You can e-mail Marcia at marcia.gulesian@verizon.net.

© 2006 Marcia Gulesian
